General 3D Content Creation
Managing 3D assets at scale, so that they can be used in many different contexts without problems or manual conversion effort, requires fairly strict best practices. This document outlines a set of basic standards, practices, and details that help achieve higher-quality results efficiently. We recommend that all users of the platform follow these guidelines.
The ThreeKit platform has import support for several common 3D asset formats: FBX, OBJ, and GLTF/GLB.
In the typical workflow, the artist exports their 3D models from their 3D software of choice as FBX.
Currently, FBX import supports only mesh, material, and embedded texture data; animation, skeleton, and blendShape information is ignored. On import, the FBX will generate a 3D asset on the platform. It will also generate Material assets for the associated materials and maintain material assignments. Finally, it will create Texture assets for embedded textures and maintain their association inside the materials.
For FBX imports, the material properties will not fully match the properties of the material inside the 3D software from which it was exported; the imported material mainly acts as a starting point or placeholder. If you are using a PBR material inside your 3D software, the best way to export it is GLTF.
The GLTF import works similarly to the FBX import, except that it also retains the full PBR material properties stored inside the GLTF.
The implementation teams may also require the original 3D files from clients (3ds Max, Maya, Blender, ZBrush, etc.) in addition to FBX exports. In many cases the FBX export alone is sufficient; this will be clarified at the initial call with the ThreeKit internal art team for a particular project.
For material work, it may also be required to provide the original assets in PSD or Substance format in addition to the exported JPG or PNG maps.
We recommend that you avoid the following geometry issues:
- Isolated vertices - inefficient and can cause issues
- Coincident vertices - mesh smoothing operations may produce undesirable results
- Coincident/coplanar faces - lead to z-fighting and subdivision issues
- Coincident edges (unwelded seams)
- Inverted or inconsistent face normals - flipped normals from mirroring are a typical example, and they can cause a number of issues
We prefer quad geometry when possible because it is easier to edit and subdivide. The platform itself places no limitations on triangles or n-gons; the main concern is that they cause subdivision artefacts, just as they would in any other 3D software.
Triangulated mesh imports are necessary in some instances; these are general real-time considerations. An example would be applications where normal maps are required on folds and creases, or where the exact triangulation is very important. A typical case is clothing, where the geometry was either sculpted or generated by a tool like Marvelous Designer. Baking normals for such meshes is highly sensitive to the mesh triangulation. Our platform has its own triangulation algorithm that is likely to generate different results than the software that baked the normals, which will cause the normal maps to apply incorrectly.
Another example where triangulation is necessary is with animated meshes deformed by a skeleton chain. In some instances, the software needs to decide where to split a quad into triangles as it deforms. If the quad was not pre-triangulated, the internal edge may visibly flip during the animation.
We prefer models to have as few polygons as possible without sacrificing important details or clean, non-faceted silhouettes. This is a hard judgment call.
When choosing between adding extra detail through normal maps versus geometry, note that in some cases the normal map requires a significantly larger file size than the additional polygons would; this should also factor into the decision. The intended target platform for the project is another consideration: for certain applications like Google or Facebook embeds, the triangle count is very limited (around 10k).
We generally aim to stay within 100k triangles on screen at any one time. Most current devices can render millions of triangles smoothly, while performance can be seriously impaired on older devices. With millions of triangles, the main problem becomes the resulting download size: the larger the triangle count, the larger the asset file. In the end, visual requirements, performance, and file size are all deciding factors.
For the Virtual Photographer renders using Vray there is no hard limit on polygon count. However, for ease of operation it is advisable to work with medium-resolution meshes that get subdivided at render time; displacement maps can then add further detail. The platform can easily support millions of triangles in the viewport as long as you have a dedicated video card such as an NVIDIA GeForce or AMD Radeon. A bigger hit to performance while working with these large assets is actually the download size, as assets have to be loaded from the cloud whenever you open them.
Generally, we prefer geometry that is compatible with the standard Catmull-Clark subdivision surface operator, which respects material IDs and smoothing groups/normals.
All objects should be created such that they are in real-world scale. This allows for multiple objects to be imported into the same context without scaling issues. It also improves the ease of accurate lighting because it enables lighting based on physical quantities.
Objects should be placed such that their natural base is located at the origin (0,0,0).
Each object should be oriented so that its natural front faces the front direction in your tool.
We allow for an internal object hierarchy. It is best if it is logically grouped.
Node names should be in English and meaningful. Names like Box001, Box002, or Plane003 are not acceptable, as they are meaningless.
Meshes that share the same material and do not need to be separately configurable should be combined into a single mesh.
Aim for a sufficient but not excessive number of nodes, roughly 5 to 40.
For WebGL purposes, all objects should have their UVs unwrapped to use the texture space as efficiently as possible. Avoid wasting texture space, as it forces you to use higher resolution textures which require more memory and file size.
For objects that are supposed to share the same material, such as different pillows, sofas, and armchairs, make sure to have consistent scaling for the UV shells/islands. This ensures that one material can map the same way across the different objects. The easiest way to achieve this is to use the Texel density feature inside Maya.
For AR purposes, please keep in mind that ARKit on iOS currently has a limitation that prevents it from reading more than one UV channel. This means that if your product uses one UV channel for tiling a fabric material and a second UV channel for a baked AO map or normal map, only the first UV channel will be read by ARKit.
Additional UV channels increase the file size of the geometry, which can be significant for larger triangle counts. If you do not need the additional UV channels, avoid exporting them.
Our system automatically converts textures internally to power-of-2 dimensions, such as 4096x1024, 1024x1024, 1024x512, or 128x512. To avoid quality loss from rescaling, it is best to create your textures at power-of-2 sizes in the first place.
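As a sketch, the power-of-2 check and rounding the importer performs can be approximated like this (the exact resampling behavior of the platform is not specified here):

```python
import math

def is_power_of_two(n: int) -> bool:
    """True when n is a positive power of two (1, 2, 4, 8, ...)."""
    return n > 0 and (n & (n - 1)) == 0

def nearest_power_of_two(n: int) -> int:
    """Round a texture dimension to the closest power of two."""
    if n <= 1:
        return 1
    lower = 2 ** math.floor(math.log2(n))
    upper = lower * 2
    return lower if (n - lower) <= (upper - n) else upper

# A 1000x700 texture would need rescaling on import:
print(is_power_of_two(1024))       # True
print(nearest_power_of_two(1000))  # 1024
print(nearest_power_of_two(700))   # 512
```

Authoring directly at sizes like 1024x1024 avoids this rescaling entirely.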
We recommend JPG and PNG texture formats. For WebGL purposes, PNG should only be used for textures that require an alpha channel as it does not compress as well as JPG. For the environment maps, we support both HDR and EXR textures.
Real-time applications generally work best with texture sizes less than 3MB in total per model unless the model is uniquely complex.
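A simple pre-upload check against this budget might look like the following sketch (the 3MB figure is the guideline above, not a hard platform limit):

```python
from pathlib import Path

# Guideline from above: roughly 3MB of textures in total per model.
TEXTURE_BUDGET_BYTES = 3 * 1024 * 1024

def texture_budget_ok(texture_paths: list) -> bool:
    """Check that a model's textures together stay under the budget."""
    total = sum(Path(p).stat().st_size for p in texture_paths)
    return total <= TEXTURE_BUDGET_BYTES
```

Running such a check before export makes it easy to catch oversized texture sets early.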
Mobile video memory can also be a limiting factor to texture size. Smartphones have shared video memory with the system memory, and textures get unpacked fully uncompressed into video memory. Thus, a 4k texture may be 1MB on disk, but it will end up being 64MB in video RAM for an RGBA type of texture. Care must be taken for mobile devices that textures and geometry do not overload the video memory. It is recommended that for mobile outputs we limit to 2k textures and less, unless specific testing shows that 4k textures will work on the targeted devices.
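The arithmetic behind that figure is easy to verify: an uncompressed RGBA texture at 8 bits per channel occupies width x height x 4 bytes in video memory, regardless of how small it was on disk.

```python
def vram_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Uncompressed size of a texture in video memory (RGBA8 = 4 bytes/pixel)."""
    return width * height * bytes_per_pixel

# A 4k RGBA texture that is ~1MB as a JPG on disk still unpacks to 64MB:
mb = vram_bytes(4096, 4096) / (1024 * 1024)
print(f"{mb:.0f} MB")  # 64 MB
```

By the same arithmetic, a 2k RGBA texture takes 16MB, which is why limiting mobile outputs to 2k and below makes such a difference.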
Procedural textures cannot be saved correctly to formats like FBX or GLTF, and thus cannot be transferred to real-time applications; they must be baked to a texture file before use in WebGL.
Virtual Photographer with Vray does offer support for a number of procedural textures, as Vray can export them to vrscenes.
To automate texture import, the platform supports the upload of pre-configured texture files. This is done through a text file containing a JSON configuration. The JSON file must have the same base name as the texture it affects, with the extension .pbrtex.
For example, MyTexture_diffuse.jpg would be paired with MyTexture_diffuse.pbrtex.
Upon import, the contents of the JSON file will be automatically applied to the texture asset properties.
Currently, these JSON files can be generated automatically through a variety of scripting means on Windows, macOS, or Linux.
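As an illustration, a small Python script can emit the sidecar files. The property names used here are hypothetical placeholders, not the actual .pbrtex schema; consult the platform documentation for the real keys:

```python
import json
from pathlib import Path

# Hypothetical property names for illustration only -- the actual
# .pbrtex schema is defined by the ThreeKit platform.
ROUGHNESS_SETTINGS = {"colorSpace": "linear", "invert": False}

def write_pbrtex(texture_path: str, settings: dict) -> Path:
    """Write a .pbrtex JSON sidecar next to the texture it configures.

    The sidecar must share the texture's base name, e.g.
    MyTexture_roughness.jpg -> MyTexture_roughness.pbrtex
    """
    sidecar = Path(texture_path).with_suffix(".pbrtex")
    sidecar.write_text(json.dumps(settings, indent=2))
    return sidecar

write_pbrtex("MyTexture_roughness.jpg", ROUGHNESS_SETTINGS)
```

Looping such a function over a texture folder lets you configure an entire texture set in one pass.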
The upload process for these textures requires that they are zipped together, with the archive's extension changed from .zip to .pbrzip.
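A minimal sketch of building such an archive with Python's standard zipfile module (the file names here are placeholders):

```python
import zipfile
from pathlib import Path

def make_pbrzip(files: list, archive_name: str) -> Path:
    """Bundle textures and their .pbrtex sidecars into a .pbrzip upload.

    The archive is an ordinary zip whose extension is .pbrzip
    instead of .zip, as the platform's upload process expects.
    """
    out = Path(archive_name).with_suffix(".pbrzip")
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, arcname=Path(f).name)
    return out

# Example usage (placeholder file names):
# make_pbrzip(["MyTexture_diffuse.jpg", "MyTexture_diffuse.pbrtex"], "MyTextures")
```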
To further automate texture association with materials, metadata can be used to identify the correct textures to be loaded. Once metadata has been assigned to the texture assets, a template material can load the appropriate textures by using a query rule (currently available only through custom code).
Materials should be set up as ONE physical material per mesh. Although we offer support for Multi-ID materials (multiple materials assigned to different polygons on the same mesh), it is generally not a recommended workflow.
WebGL materials are typically created directly on the ThreeKit platform. They could also be imported directly from other tools (such as Substance) through the use of the glTF format.
Many of the material attributes include both a factor as well as a map asset. These two work in conjunction with each other, where the factor acts as a multiplier on top of the texture values.
For example, if a texture is assigned to the Roughness Image Asset slot and the Roughness Factor is set to 0.25, the resulting roughness will be only 25% of the texture values. To use the texture values directly, set the Roughness Factor to 1.
The same applies to a colour map, like the Base Image Asset. If a texture is chosen there, the Base Color will act as a multiplier on top of the texture, tinting the end result.
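The factor/map combination can be sketched as a simple per-channel multiplication (a simplification of the actual shader, for illustration only):

```python
def apply_factor(texture_value: float, factor: float) -> float:
    """A factor multiplies the sampled texture value, as in the
    factor + map slots of a PBR material (simplified sketch)."""
    return texture_value * factor

# A roughness texel of 0.8 with a Roughness Factor of 0.25:
print(apply_factor(0.8, 0.25))  # 0.2

# A Base Color of (1.0, 0.5, 0.5) tints a white texel (1,1,1) pink:
texel = (1.0, 1.0, 1.0)
base_color = (1.0, 0.5, 0.5)
print(tuple(t * c for t, c in zip(texel, base_color)))  # (1.0, 0.5, 0.5)
```

This is why a factor of 1 (or a white Base Color) passes the texture through unchanged.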
Any of the attributes here can be customized with rules under the Logic section by using the Set Property Action.
Within the ThreeKit PBR material there are some additional features and operators that extend the typical properties of a real-time PBR material beyond what is currently supported by the glTF format. One such example is the Gem operator, which replaces the underlying material with a Gem-only shader with some unique attributes.
The Tiling Override operator is also extremely useful for adjusting the tiling of any of the textures mapped to the material’s properties. This allows the user to apply an overall tiling setting to a number of maps for a particular material, ignoring the existing tiling information stored inside the individual textures.
In a scenario where a model needs to use a generic material that is being shared by other objects, such as a tiling fabric texture, the model may also need to apply its own unique normal map or AO map to the material. The ThreeKit platform supplies functionality for this purpose using a PolyMesh Operator called Map Override. This operator has to be added directly to the mesh. The maps specified in that operator will then get automatically assigned to any material that is applied to this mesh.