- Unity 5.x Shaders and Effects Cookbook
- Alan Zucconi Kenneth Lammers
Adding a texture to a shader
Textures can bring our shaders to life very quickly, helping us achieve very realistic effects. In order to use textures effectively, we need to understand how a 2D image is mapped to a 3D model. This process is called texture mapping, and it requires some work to be done on both the shader and the 3D model that we want to use. Models, in fact, are made out of triangles; each vertex can store data that shaders can access. One of the most important pieces of information stored in vertices is the UV data. It consists of two coordinates, U and V, ranging from 0 to 1. They represent the XY position of the pixel in the 2D image that will be mapped to the vertices. UV data is present only for vertices; when the inner points of a triangle have to be texture-mapped, the GPU interpolates the closest UV values to find the right pixel in the texture to be used. The following image shows you how a 2D texture is mapped to a triangle from a 3D model:
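The interpolation step described above can be sketched as a small Cg-style function. This is purely illustrative (the GPU performs this step automatically, with perspective correction); the function name interpolateUV and the explicit barycentric weights are assumptions made for the sake of the example:

```cg
// Illustrative only: how a UV coordinate is interpolated across a
// triangle. uv0, uv1, and uv2 are the UVs stored in the three
// vertices; w holds the barycentric weights of the inner point,
// which always sum to 1.
float2 interpolateUV (float2 uv0, float2 uv1, float2 uv2, float3 w)
{
    return w.x * uv0 + w.y * uv1 + w.z * uv2;
}
```

A point in the exact center of the triangle has weights (1/3, 1/3, 1/3), so its UV is simply the average of the three vertex UVs.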

The UV data is stored in the 3D model and requires a modeling software to be edited. Some models lack the UV component, hence they cannot support texture mapping. The Stanford bunny, for example, was not originally provided with one.
Getting ready
For this recipe, you'll need a 3D model with UV data and its texture. Both need to be imported into Unity before starting. You can do this simply by dragging them into the editor. As the Standard Shader supports texture mapping by default, we'll use it and then explain in detail how it works.
How to do it...
Adding a texture to your model using the Standard Shader is incredibly simple, as follows:
- Create a new Standard Shader called TexturedShader.
- Create a new material called TexturedMaterial.
- Assign the shader to the material by dragging the shader onto the material.
- After selecting the material, drag your texture to the empty rectangle called Albedo (RGB). If you have followed all these steps correctly, your material Inspector tab should look like this:
The Standard Shader knows how to map a 2D image to a 3D model using its UV data.
How it works…
When the Standard Shader is used from the inspector of a material, the process behind texture mapping is completely transparent to developers. If we want to understand how it works, it's necessary to take a closer look at TexturedShader. From the Properties section, we can see that the Albedo (RGB) texture is actually referred to in the code as _MainTex:

```cg
_MainTex ("Albedo (RGB)", 2D) = "white" {}
```
In the CGPROGRAM section, this texture is defined as a sampler2D, the standard type for 2D textures:

```cg
sampler2D _MainTex;
```
The next lines show a struct called Input. This is the input parameter for the surface function, and it contains a packed array called uv_MainTex:

```cg
struct Input {
    float2 uv_MainTex;
};
```
Every time the surf() surface function is called, the Input structure will contain the UV of _MainTex for the specific point of the 3D model that needs to be rendered. The Standard Shader recognizes that the name uv_MainTex refers to _MainTex and initializes it automatically. If you are interested in understanding how the UV is actually mapped from a 3D space to a 2D texture, you can check Chapter 3, Understanding Lighting Models.
Finally, the UV data is used to sample the texture in the first line of the surface function:

```cg
fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
```

This is done using the tex2D() function of Cg; it takes a texture and a UV coordinate and returns the color of the pixel at that position.
Note
The U and V coordinates go from 0 to 1, where (0,0) and (1,1) correspond to two opposite corners. Different implementations associate UV with different corners; if your texture happens to appear reversed, try inverting the V component.
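If you do need to invert the V component, you can do it just before sampling. The local variable name uv below is illustrative, not part of the generated shader:

```cg
// Flip the V coordinate before sampling, for textures that
// appear vertically reversed (illustrative sketch)
float2 uv = float2(IN.uv_MainTex.x, 1.0 - IN.uv_MainTex.y);
fixed4 c = tex2D (_MainTex, uv) * _Color;
```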
There's more...
When you import a texture into Unity, you are setting up some of the properties that sampler2D will use. The most important is the Filter Mode, which determines how colors are interpolated when the texture is sampled. It is very unlikely that the UV data will point exactly to the center of a pixel; in all the other cases, you might want to interpolate between the closest pixels to get a more uniform color. The following is a screenshot of the Inspector tab of an example texture:

For most applications, Bilinear provides an inexpensive yet effective way to smooth the texture. If you are creating a 2D game, however, Bilinear might produce blurred tiles. In this case, you can use Point to remove any interpolation from the texture sampling.
When a texture is seen from a steep angle, texture sampling is likely to produce visually unpleasant artifacts. You can reduce them by setting Aniso Level to a higher value. This is particularly useful for floor and ceiling textures, where glitches can break the illusion of continuity.
See also
If you would like to know more about the inner workings of how textures are mapped to a 3D surface, you can read the information available at http://http.developer.nvidia.com/CgTutorial/cg_tutorial_chapter03.html.
For a complete list of the options available when importing a 2D texture, you can refer to the following website: