Alan Thorn is a London-based game developer, freelance programmer, and author of Game Development with Unity and Blender and UDK Game Development.
Community review (Heleen Durston, Mar 20): Packt Publishing released a new book on Unity scripting in February of this year.
That being said, the first two chapters of the book are C# refreshers and cover the basics of creating a project.
The book then reviews C# all the way through classes and polymorphism. Chapter three covers Singletons, which are ways of making an object persist throughout all of the levels and parts of your game. This chapter also discusses many other facets of game objects, including when to update and change them. Chapter four discusses event-driven programming and how to use this method for optimizing your game.
Essentially, the updating of game objects is removed from the Update function of your class and placed into new classes that are driven by the events of the game. This seems to be a useful technique for all but the smallest of games. Chapter five is all about cameras: using multiple ones, their placement, and working with line-of-sight issues. MonoDevelop has useful features that Unity programmers can use to build extra functionality into their games.
Chapter seven demonstrates how to create artificial intelligence in Unity. The whole idea of artificial intelligence is to have enemies appear to make intelligent decisions based on what the player does. Chapter eight discusses ways of adding functionality to the Unity editor. Chapter nine covers Unity 2D objects and textures. Specifically, setting up a skybox with rotating clouds is explained through a provided example. Examples are also given for setting up an asset database and setting up a scrolling texture for your game.
Chapter ten discusses setting up source control, specifically Git, for your game. Creating save game files and resource files is also explained. These features are important to most games, so it is wonderful that this is explained here. It is a great reference book that explains how to code in Unity many different aspects of game design that most game creators will need at some point in their game development career.
Chapter 3 talks about the viewports and the content browser in good detail. You build a playable level with lights, sounds, and basic animation. I will update this review as I go through the book. Again, everything and its use was laid out understandably and in good detail.
They are in black and white, and not very high quality. It even lists different game development tools that can be used as alternatives. I felt the project at the end of the chapter could have been more elaborate, using some more tools, but it gives you a good general idea about how to create an outline of the level you want. The main focus is on the viewports, the content browser (which adds items to the level), and the toolbox (which adds geometry, switches the editing tools you use, etc.).
So to stay organized while project managing and coding, keep the following principles in mind: Every Unity project relies on assets. These include meshes, textures, audio files, animations, materials, scripts, scenes, and more.
Doing that will lead to confusion in the long term. Organization applies not just to assets in the Project panel, but to GameObjects in the scene, too. When building scenes with lots of objects, take a quick pause and scan through the Hierarchy panel and look at the names of your objects.
Ask yourself: Are these names meaningful? One way to reach practical judgments about this is to see if you can guess what the object is, what it does, and where it is in the scene, purely from the object name alone, without looking in the Scene tab or Game tab at all. If you encounter names, such as Cube01 or Obj1, and cannot reasonably determine what the objects do, then consider renaming your objects.
But it can ultimately save you hours of time when selecting objects. GameObjects live in the game world. Prefabs are collections of game objects configured together into a standalone template, which is reusable as though they were one complete entity. Take it a step further and use asset labeling, object tags, and layers. For example, consider labeling environment meshes under Architecture, and character meshes under Character, and so on.
Labeling an asset or more than one is simple. Click a tag from the menu to apply an existing one, or else use the Edit Box to type in a completely new tag for the selected assets.
On selection, the Project panel is immediately filtered to show only matching results. These remain unchanged regardless of labeling. Labels are Unity-specific metadata attached to the file for your convenience when searching through assets. Tags are special identifiers that you may attach to objects in your scenes, to group them and search them, and to perform other kinds of operations to them directly in code. To create a new tag and assign it to an object, select any object in the scene and click the Tag drop-down list at the top-left corner of the Object Inspector.
To assign a tag to one or more objects, select all relevant objects and choose their tag from the Tag list in the Object Inspector. You can also compare a tag with a String value using the GameObject.CompareTag function. See http: Layers are functionally similar to tags in that they mark or group GameObjects together by a specific criterion.
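The CompareTag check mentioned above might look like the following in practice. This is a minimal sketch: the tag name "Enemy" and the class name are illustrative, not from the book.

```csharp
using UnityEngine;

// Illustrative component: reacts only to objects carrying a given tag.
public class TagChecker : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // CompareTag is preferred over comparing the tag string with ==,
        // since it avoids allocating a new string on every comparison.
        if (other.gameObject.CompareTag("Enemy")) // "Enemy" is an example tag
        {
            Debug.Log("An enemy entered the trigger.");
        }
    }
}
```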
In contrast, layers are typically (though not always) used in conjunction with cameras. Layers are further considered in Chapter 8. The final organization tip concerns code comments. My advice is: always comment your code. Developers raise various objections to commenting; the following lists two of them, and I provide a reasoned response to each to illustrate the importance of commenting.
I know what this code does. And besides, nobody else is probably ever going to read my code anyway. The core assumption here is that code comments are exclusively for the benefit of other people. But this is not true. Code comments can help you write down and clarify your own ideas. They can help you remember what your own code is doing when you return to it weeks or months later.
So use code comments as an aid to memory. Code comments are pointless. They just get in the way and make things even more confusing.
Far from being helpful, they can steer us in the wrong direction. It just reminds us to be careful and concise in our commenting, keeping them relevant and informative. So keep comments as short as possible and stick to the point. Commenting can be extended into your very coding style. By using meaningful function, variable, and class names, you can make your code a lot clearer and easier to work with.
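To illustrate how naming itself can act as commenting, here is a short sketch contrasting terse, opaque code with a self-documenting version. All names here are invented for illustration only.

```csharp
using UnityEngine;

// Opaque version (avoid):
//   float h = 100f; float d = 3f;
//   void Update() { h -= d * Time.deltaTime; }

// Self-documenting version: the names and comments explain intent.
public class Health : MonoBehaviour
{
    public float maxHealth = 100f;      // upper bound for health
    public float drainPerSecond = 3f;   // e.g. poison damage over time

    private float currentHealth;

    void Start()
    {
        currentHealth = maxHealth;
    }

    void Update()
    {
        // Drain health gradually; deltaTime keeps it frame-rate independent.
        currentHealth -= drainPerSecond * Time.deltaTime;
    }
}
```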
Tip 4: But there are times when this behavior can be problematic. In itself, doing that will not generally cause any major issues.
Many developers have experienced slow-downs and crashes in such circumstances. Tip 5: Among all these file formats, however, two main types may be identified: Proprietary and Exported. The ultimate purpose of both methods is to serialize or output meshes to a persistent file, which can be opened and read by many applications. However, despite the common aim, there are significant differences between the Proprietary and Exported files, which have implications when working with Unity. The upshot is generally this: Why should this advice be followed?
Blender is a free 3D modeling application that can be downloaded from www. Thus for Unity, Proprietary meshes create a dependency on their modeling software; Unity needs that software to import the mesh successfully. In doing this, however, Unity asks you no questions and provides no options.
It simply creates an FBX version with default settings applied. The result is that manually exported FBX files are typically cleaner and more efficient, because they feature only the data you truly need. In contrast, the Unity-generated FBX files from proprietary files generally include plenty of data that you never wanted exported anyway, such as lights, dummy objects, meshes and faces you forgot to delete, and so on.
The term unstable is used here in a narrow but important sense. This is due to possible changes or updates made to the FBX Exporter. In short, importing a Proprietary mesh with one version of the 3D software installed will not necessarily produce the same results when a different version is installed.
Exported meshes for Unity can be found online at http: Tip 6 concerns scene lighting. Every Unity scene receives a base or default illumination known as Ambient Light. It represents a non-shadow-casting light that is projected outward from the scene origin in all directions infinitely, and it affects every mesh surface with equal intensity.
Specifically, child objects inherit the transformations of their parents. This feature is useful for making objects move and interact together believably. This hierarchical relationship can be put to good use in many ways, and not just at runtime; it can help at design time, too. For example, you can call BroadcastMessage on the Root object to send an event or notification to every object in the scene with just one line of code. Tip 8 concerns your time: put a value on it, give it respect, and invest it wisely when developing.
One way you respect your time and effort is by making regular backups of your data to prevent repeating work in the event of data loss. This is to protect you against unforeseen events, such as computer failures, data corruption, virus attacks, and other accidents.
For Unity projects, making a backup is really as simple as making a copy of your project folder, and then archiving it onto a separate storage device, such as an external hard drive or cloud-based storage, or both. For example, if the original files are at your office, then keep the backup at home. At the end of each work day or work session, ask yourself: How much work and time would I have to reinvest simply to catch up to where I was?
It surely goes without saying. If this is how you feel, then splendid! However, despite the lip service often given to the importance of backups, I often find people never making them and then later regretting that decision. So the importance of making regular backups cannot be overstated.
Tip 9 concerns renaming. When a scene contains many objects, it can be tedious and time-consuming to name each of these objects individually. Unfortunately, Unity (at the time of writing) has no out-of-the-box functionality to automate this process. In this section, therefore, I want to introduce you to a custom-made Batch Rename tool, which is an editor extension that plugs into the Unity interface and offers simple renaming functionality for multiple objects.
The source code for this tool is listed in Listing for your viewing, and is also included in the Project Files inside the Chapter01 folder. The Batch Rename tool is simply provided to help improve your general workflow. More details on the implementation of this tool, and others, are covered in the Apress book Learn Unity for 2D Game Development, available at www.
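The original listing did not survive extraction, so the following is a hedged reconstruction of a batch rename editor extension of the kind described. The overall shape (a ScriptableWizard with a menu item) is a common way to build such a tool; the exact member names and menu path here are assumptions and may differ from the book's listing.

```csharp
using UnityEngine;
using UnityEditor;

// Hypothetical reconstruction of a Batch Rename editor wizard.
// Place in an Editor folder so Unity compiles it as an editor extension.
public class BatchRename : ScriptableWizard
{
    public string baseName = "MyObject_"; // prefix applied to every object
    public int startNumber = 0;           // first numeric suffix
    public int increment = 1;             // step between suffixes

    [MenuItem("Edit/Batch Rename...")]    // menu path is an assumption
    static void CreateWizard()
    {
        ScriptableWizard.DisplayWizard("Batch Rename", typeof(BatchRename), "Rename");
    }

    void OnWizardCreate()
    {
        int counter = startNumber;

        // Rename every object currently selected in the Hierarchy panel.
        foreach (Object obj in Selection.objects)
        {
            obj.name = baseName + counter;
            counter += increment;
        }
    }
}
```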
Then drag and drop the BatchRename script file into the project. Once BatchRename is added, select the objects to rename in the Hierarchy panel. Once the naming options are specified, click the Rename button to complete the operation and rename the objects. You now have a Batch Rename tool. Empty GameObjects serve a similar function to Dummies or Dummy objects in 3D software. An Empty is simply a GameObject that has no renderable components. Their lack of visibility makes them great for marking respawn points in the scene, for acting as pivot points (a point around which other objects revolve), or for marking out regions in the level.
However, despite their usefulness in-game, Empty objects come with a drawback for the developer when working with them in the Scene Editor: having no renderable components, they are invisible in the viewport, and must be selected from the Hierarchy panel. That helps you to know where the object is, but this can be a tedious workflow. Tip: Use the Stats Panel. When play-testing and debugging your games inside the editor, be sure to make the Stats panel your friend; it is also known as the Rendering Statistics window.
It features lots of helpful information, updated in real time while your game is playing. The Stats panel appears in the top-right corner of the Game tab and offers an overview of how your game is performing in terms of frame rate and resource usage, among others. For this reason, always be sure to test and benchmark game performance on your target hardware; that is, on the minimum specification for which your game is intended.
FPS (frames per second). This shows the number of frames that your game is actually rendering to the screen each second. Generally, the higher and more consistent this number, the better. There is no definitive answer as to what this number should be; it will vary over time. The more important question is: does your game look and perform as intended on your target hardware?
And perhaps the Stats panel can help you diagnose what it is. That being said, the FPS should not usually be less than 15 frames per second. Draw Calls. This refers to the total number of times per frame that the Unity engine calls on lower-level rendering functionality to display your scene to the screen. The higher this value, the more complex and expensive your scene is to render; for this reason, lower values are generally preferred. There are two easy ways to reduce draw calls.
Each unique material in your scene will cost an additional draw call. For this reason, if you merge textures together into larger atlas textures, and also share and reuse materials across multiple meshes, then you can significantly reduce draw calls and improve performance.
Saved by Batching indicates the number of batching operations Unity managed to perform on your objects to reduce the number of draw calls. Typically, each Saved by Batching operation saves at least one additional draw call. In most cases, the higher this value, the better. Tris is the total number of triangles being rendered in the current frame, after culling and clipping have been applied. Most contemporary graphics hardware on desktop computers and consoles is adept at processing triangles quickly, meaning high tri-counts can, in principle, be achieved.
The same cannot always be said of mobiles, however. VRAM usage tells us how much video memory on the graphics hardware is being used for processing and texture storage.
Tip: Test at Multiple Resolutions and Aspect Ratios. Achieving this is easier now than it ever has been. Switch to the Game tab, and then click the Aspect drop-down box in the top-left corner. And finally, in preparation for that work, it listed a range of practical and relevant guidelines and workflows for using Unity in real-world projects.
In short, by now you should be able to do the following. Building the project involves a wide breadth of steps; specifically, creating a new Unity project, importing and configuring assets, building Prefabs and levels from modular environment pieces, building lighting and lightmapping, and configuring a NavMesh for pathfinding. This project is also included in the book companion files, in case you want to skip this chapter and concentrate just on C# coding, which begins in the next chapter.
Step 1 concerns folders. Create your project folders first so you can quickly arrange and categorize the assets you import right from the outset.
For this project, the following folders will be required. Step 2: Importing Textures and Meshes. Importing meshes and textures is an interrelated process, because meshes typically rely on textures. Consequently, you can make importing run smoother and easier if you import textures before meshes. By importing in this order, Unity detects which textures and materials to auto-assign onto your meshes at import time.
This means your meshes will automatically show their texture in the preview pane from the Object Inspector, and even on their thumbnails inside the Project panel.
The latter method is preferable when importing multiple assets together. All meshes and objects inside the game will reference the appropriate texture areas inside the atlas, as opposed to referencing separate files. Doing this allows us to share a single material, or the fewest number of materials, across all objects, leading to improved rendering performance.
Remember from Chapter 1, which discussed the Rendering Statistics window, that each unique material rendered to the display increases the draw calls. Step 3 concerns mesh scale. Specifically, Unity applies a default mesh scale factor of 0.01.
The result is that every imported FBX, by default, will appear in your scene 100 times smaller than its original size. The value 1 preserves the original size. In any case, you can easily change the scale factor for a mesh from the Object Inspector. But typing this in manually for every mesh can be tedious. Instead, we can code an editor extension to automate the process, forcing Unity to apply a scale factor of 1.
Due to floating-point inaccuracy that can result from arithmetical operations, avoid making your meshes very small or very large. Create a new C# source file inside the Editor folder of the project.
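An editor script of the kind described can be sketched as follows. OnPreprocessModel and globalScale are real Unity Editor APIs; the class name MeshPostprocessor is my own choice and may differ from the book's listing.

```csharp
using UnityEngine;
using UnityEditor;

// Must live inside an Editor folder. Unity runs it automatically
// for every model that is imported into the project.
public class MeshPostprocessor : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        // Force a scale factor of 1 instead of Unity's default of 0.01,
        // so imported meshes appear at their original size.
        ModelImporter importer = (ModelImporter)assetImporter;
        importer.globalScale = 1f;
    }
}
```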
Nowhere else is acceptable. Important lines are highlighted in bold. The OnPreprocessModel method in Listing will be called once automatically by the Unity Editor for each imported mesh. On each execution, the mesh scale factor, represented by the globalScale property, will be set to 1. Some meshes may appear rotated by 90 degrees in the preview pane, but this is not a problem. They will appear at their correct orientation when added to the scene. Step 4 concerns the default material Unity generates during importing; this material is assigned automatically onto all imported meshes.
Although we want to keep the material itself, the folder organization is not neatly compatible with our own system and folder structure. So if you accidentally delete something from your project, you can easily restore it again.
Unless you empty the Recycle Bin or Trash! Secondly, imported meshes lack collision information by default. There are several solutions for fixing this. One is to enable collider generation for the mesh asset at import time; doing this autogenerates a Mesh Collider component, with appropriate collision data based on the mesh geometry, and attaches it to the mesh.
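Collision components can also be attached per object in code. The following is a minimal sketch, with an invented class name, showing the same idea programmatically rather than through the import settings.

```csharp
using UnityEngine;

// Illustrative component: ensures this object has a Mesh Collider
// built from its own mesh geometry.
public class ColliderSetup : MonoBehaviour
{
    void Awake()
    {
        // Only add a collider if one is not already present.
        if (GetComponent<MeshCollider>() == null)
        {
            MeshCollider col = gameObject.AddComponent<MeshCollider>();

            // Use the mesh already assigned to this object's MeshFilter.
            col.sharedMesh = GetComponent<MeshFilter>().sharedMesh;
        }
    }
}
```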
This collision data typically produces accurate results, leading to high-quality collisions, but it can prove expensive for complex meshes. Consequently, the second method, approximating a mesh with simpler collider primitives, is often a preferred alternative. But our game, with its low-poly meshes, may safely use the first method.
Like regular UVs, lightmapping UVs are a set of mapping coordinates for meshes. Standard UVs define how regular textures, like Diffuse and Bump textures, are projected onto the mesh surface in three dimensions.
In contrast, lightmap UVs help Unity and the Beast lightmapper understand how to project baked lighting (such as indirect illumination) from lightmap textures onto the mesh surface. There are occasions when this choice may not be troublesome. To achieve lightmapping, however, a mesh needs a second UV channel, and there are two main options available for creating this channel. The first is to create the lightmap UVs yourself in your 3D modeling software; the second is to have Unity generate a second lightmap UV channel automatically.
This latter approach is achieved by selecting all appropriate meshes in the Project panel, and then by enabling the Generate Lightmap UVs check box from the Object Inspector. If your meshes are organic, curved, spherical, and smooth, then better lightmap UVs can usually be generated with higher values for the Hard Angle setting.
For CMOD, the default settings will be suitable. Step 5 concerns the atlas texture. Its dimensions have been chosen for two reasons. Nearly every texture, except a GUI texture or dedicated sprite, should be a power-of-2 size, for both performance and compatibility reasons. However, the texture need not always be square. By making textures the largest size possible, therefore, we can always downsize, if required, through all the possible sizes that Unity supports.
This is always preferable to making textures smaller than needed, because upsizing always incurs quality loss and blur, due to resampling. In general, make textures exactly the size you need, and no smaller or larger. This is because all resizing involves resampling.
Of course, this still involves implicit resampling and quality loss as described earlier, but the resized versions are always generated from the original imported texture and not from any other resized versions we may have generated previously. This means any quality loss incurred through resizing is not accumulative, even if you resize multiple times in succession.
Sometimes the default settings will be just what you need, in which case no further tweaking is necessary. In our case, however, the default settings are probably very different from what we need. How so? And what should be changed? The following points answer our two questions.
The default Max Size might be suitable for legacy hardware and some mobile builds. Raising Max Size beyond the texture's original dimensions does not upscale it; rather, it restores the original texture without downsizing it.
Switching between these tabs means we can specify different texture sizes on a per-build basis, allowing Unity at compile time to automatically select and build the appropriate texture for our target platform. The Texture type refers to general textures applied to most types of 3D meshes, including environment meshes and animated meshes. In many cases, this setting will be acceptable.
That is, as 2D images constantly aligned upright and facing the camera. This means the atlas will be multipurpose. Go ahead and copy over those settings on your system. First, Alpha Is Transparency is enabled. Transparency is important for sprites. Second, Sprite Mode has been set to Multiple, because the texture contains multiple sprite characters and animations.
Third, Pixels to Units is set to a value suitable for this project; for your own projects, this value will likely require tweaking. Fourth, Wrap Mode has been set to Clamp, to prevent distortion or tiling artifacts from appearing around the edges of sprites. Step 6: Building Sprites. Unity 4.3 introduced dedicated 2D sprite functionality. As mentioned, CMOD will make use of some of these features for creating billboard sprites in the level.
These are produced using the Sprite Editor. In short, the Sprite Editor allows us to mark rectangular regions (UV rectangles) inside an existing atlas texture to use as sprites. All of them must be marked by clicking and dragging a Sprite Selection rectangle around them. The resulting sprites are grouped together under the atlas texture asset in the Project panel.
In doing this, CMOD now has all required sprites. You can even drag and drop sprites from the Project panel and into the scene, via the Hierarchy panel, to instantiate a sprite in the scene.
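Sprites can also be instantiated from script rather than by drag and drop. The following is a minimal sketch; the asset path "Sprites/enemy_idle_0" and the class name are hypothetical, and the sprite is assumed to live under a Resources folder.

```csharp
using UnityEngine;

// Illustrative component: builds a sprite object in the scene at runtime.
public class SpriteSpawner : MonoBehaviour
{
    void Start()
    {
        // Load a sprite asset; the path is a hypothetical example
        // relative to a Resources folder in the project.
        Sprite sprite = Resources.Load<Sprite>("Sprites/enemy_idle_0");

        // Create a GameObject and render the sprite through it.
        GameObject obj = new GameObject("EnemySprite");
        SpriteRenderer renderer = obj.AddComponent<SpriteRenderer>();
        renderer.sprite = sprite;
    }
}
```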
Step 7 concerns audio. The book companion files for CMOD feature only a few audio sound effects, created using the sound-generator tool SFXR, which can be downloaded for free from www. SFXR is software for procedurally generating the kinds of sound effects commonly used in old-school video games, such as the original Super Mario Bros. These include Explosion.
Go ahead and import all these sound effects into the Unity project, using the conventional drag-and-drop method. The audio assets for this book were generated using the free program SFXR. Every audio file for this project shares an important characteristic that requires us to adjust the default import settings applied to it.
Specifically, every audio file will be 2D and not 3D. That is, none of our sounds are located at any specific 3D position in the scene. Rather, the sounds should simply play in all speakers at a single and constant volume.
Once completed, all audio assets are successfully imported and configured, ready for use in the scene. Create Prefabs. Importing assets is primarily about collating together and preparing the raw materials on which our game will be founded and assembled.
Once importing is completed, the next stage of development is to create abstracted assets. That is, to use the raw and imported assets (the asset files) to create further or more complex assets inside the Unity Editor. One such asset is the Prefab. In short, the Prefab allows us to drop a collection of assets, like meshes and scripts, into the scene to compose a more complex entity or thing.
Based on that, we may create an asset, which can thereafter be treated as a complete and separate whole. They save us from building and rebuilding similar objects across different scenes, and even within the same scene.
Specifically, these pieces include individual items of furniture, such as file cabinets and desks, and architecture and props like corner sections, crossroad sections, T-junctions, and door sections. These pieces are designed to be instantiated in the scene, where they may be combined and recombined into unique arrangements to form more complete and seamless environments.
This building-block method of level creation is often called the modular method because each environment piece is seen as a module in a larger set. Of course, you can alternatively build a single and huge environment inside your modeling software of choice, and then import that into Unity as a final and unchangeable mesh. But doing that comes at the cost of flexibility and versatility, as well as performance.
By importing separate and reusable environment pieces instead, you can recombine them together into a potentially infinite number of environment combinations. This allows you to assemble many different levels from the same, basic mesh ingredients.
Plus, it works better with Occlusion Culling. It can save you lots of time. Importing environment pieces for the modular building method typically leads us into using Prefabs. Why is this? From the Hierarchy panel, as well as the Project panel, you can see that this larger arrangement is formed from a total of eight smaller mesh instances, using four different mesh assets.
Together these form a complete corner section for a corridor. However, a scene will typically have many corridors with turns and corners and T-sections, as well as other similar architectural configurations that repeat themselves over and over again. By using Prefabs, we can dramatically reduce our workload in this context. This can make Prefabs an invaluable tool for making modular environments. Vertex snapping lets you align meshes together, exactly at the seams. To access Vertex Snapping, activate the Translate tool (W key) and then hold down the V key to align two meshes together at their vertices.
To access Grid Snapping, which moves objects along the grid in discrete increments, hold down the Ctrl (Cmd) key while translating objects. Prefabs are especially useful tools when creating modular environments. By default, a Prefab is created as an empty container object, and has no association with any other assets. To build this association, a parent-child relationship should first be established between all meshes that you want to include in the Prefab.
Doing this establishes the essential connection between the meshes in the scene and the Prefab asset. Use this technique to identify all other relevant architectural configurations that could be abstracted into reusable Prefab objects. The earlier you do this in your workflow, the easier your level-building experience will be in the long term.
Both Mesh assets and Prefabs will inevitably form the raw materials and ingredients from which scene geometry is made. Doing this is not essential, of course, as you can effectively position your meshes anywhere. But starting at the world origin and building outward adds a certain degree of neatness and cleanliness to your mesh positions and scene limits.
This might at first seem a trivial preoccupation, but investing time upfront for good organization and a desire for tidiness like this helps make development smoother.
Ideally, development should smoothly flow along from beginning to end, each step logically and reasonably following the previous. Doing this allows you to foresee and correct for structural or logistical problems that could arise from specific scene arrangements, such as: Will that enemy be able to fit through that walkway?
And so on. This allows you to move the entire scene by transforming one object, should you want or need to. Dynamic objects include those such as the player character, enemies, weapons, vehicles, doors, particle systems, and many more.
Static objects include walls, floors, ceilings, tables, chairs, stairs, windows, mountains, hills, rocks, and more. Typically static objects account for the majority of scene objects and dynamic objects for the minority this is not true of every game, but probably true for most games, and certainly most FPS games. This distinction is an important one for games and for performance. It influences processes as diverse as lightmapping and navigation meshes.
In short, if an object is static and never moves, then we can mark it as such directly from the Unity Editor by using the Static check box available for meshes in the Object Inspector.
Enabling this for static objects achieves a range of performance benefits. For example, only static objects can be lightmapped, and only static objects can be baked into navigation meshes. Click Yes to apply.
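For bulk workflows, the static flag can also be set from an editor script rather than object by object. The following is a hedged sketch: the menu path and class name are invented, while Selection.gameObjects and GameObject.isStatic are real Unity Editor APIs.

```csharp
using UnityEngine;
using UnityEditor;

// Editor utility: marks every selected object, and all of its
// children, as Static in one operation.
public class StaticMarker
{
    [MenuItem("Tools/Mark Selection Static")] // menu path is an assumption
    static void MarkStatic()
    {
        foreach (GameObject obj in Selection.gameObjects)
        {
            // GetComponentsInChildren includes the object itself.
            foreach (Transform child in obj.GetComponentsInChildren<Transform>())
            {
                child.gameObject.isStatic = true;
            }
        }
    }
}
```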
The next step is lightmapping. For me, lightmapping is one of the most fascinating ideas in game development. It was introduced as a limited solution to an intractable problem, one which even today has no all-encompassing solution: calculating realistic scene lighting in real time. Lightmapping is one of the solutions to this problem. It relies on a lightmapper, a tool that casts rays of light into the scene, outward from all light sources, and then traces how those rays bounce and react to scene geometry.
The purpose of this is to assess how bright or dark and which color the impacted surfaces should be. This process can be time-consuming in terms of hours or even days , but it lets developers precalculate the effects of scene lighting at design time using the Unity Editor, and to bake the results of that process into dedicated textures, known as lightmaps. The lightmaps contain information about shadows, color bleeding, indirect illumination, and more.
Unity then automatically blends the lightmaps onto the scene geometry at runtime, on top of the regular textures and materials, to make the geometry appear illuminated by the lights. The famous author Arthur C. Clarke observed that any sufficiently advanced technology is indistinguishable from magic.
However, lightmapping has its limitations. The solutions to this problem take various forms; Unity offers Dynamic Lighting and Light Probes (a special kind of semidynamic lighting). Light probes will not be considered further in this book.
Dynamic lighting is simply the process of calculating lighting for moving objects in real time. This means moving objects will be rendered less realistically than static ones.
For those interested in light probes, more information can be found in the official Unity documentation at http: Lights can be added to the scene using the main menu. But one general rule to follow when lighting is: less is more. Cut back on excess and get the best results possible from the fewest number of lights.
Directional Lights simulate bright light sources at a distant location, casting infinite rays of light in a single direction. And Area Lights can generally be excluded, because we must illuminate both static and dynamic objects, and Area Lights pertain only to lightmapped objects. This leaves us with a realistic choice between two light types (we can use a mixture, too); specifically, the Point Light and the Spotlight.
That is, Spotlights are typically more expensive than Point Lights. All scene lights are grouped as children beneath a single, parent GameObject. You may like this arrangement, too. From the Lightmapping window, open the Bake dialog. The points that follow explain some of the options available and the reasons behind my decisions.
In Unity 4, both dual and directional lightmapping produce two sets of lightmaps. Dual lightmapping produces a Near map and a Far map, and directional lightmapping produces a Color map and a Scale map. The usage of these maps varies. The point of both of these modes is to increase realism, giving you a balance between prebaked lightmaps and per-pixel dynamic lighting, with effects such as specularity and normal maps.
Single mode, in contrast, produces only one lightmap set, as opposed to multiple. This map (or set of maps) features all the direct and indirect illumination for the scene.