Anyone attempting to make a game in C++ without an engine or similar technology is going to run into the issue of integrating assets. For most data types the solution is to use standard file formats; images and sounds are great examples where that works. However, 3D file formats are a different beast.
I've seen plenty of debate on the file format of choice, and there are even a few projects trying to create a game dev standard format. I'm not here to tell you the right way to do it, since this will always depend on your project's needs, but instead to convey what we have found works for us.
We originally started our 3D asset integration with the Open Asset Import Library. This project loads a multitude of formats into a common data structure. The integration is very simple and works as expected. However, as we began to have larger requirements such as custom texture targets, animations, and timeline data, this project fell short. This is likely a drawback of attempting to find common ground between so many different formats.
Instead of dealing with the headache of other formats or libraries, we decided it was best for us to just write our own exporter. Doing so gives us full control of the data that we need and lets us put it into a format that is easy for the engine to consume. It also assured us that as our engine grew and we needed to evolve the asset pipeline, we had full control over that process instead of being limited by a third-party format.
Below is our current script for exporting both models and animations from Blender 2.78 into our own format. Keep in mind that this was written for our own project and does what we require; it's not a generic solution. However, it's a great starting point for anyone looking to write their own.
Inside you will find three files.
- __init__.py - Registers the exporter with Blender
- exporter.py - Class which performs the export
- file_writer.py - Helper class to write to our binary format
The export process is fairly simple and self-explanatory. Most of the work is in knowing where Blender hides the information you are looking for. The rest of the code just puts this into the format we want.
To run this script, copy the "export" folder to the "blender/scripts/addons" directory and activate it through the addons tab in Blender's settings.
However, because of Python's module caching, if you plan to modify and iterate on the script I would suggest combining the exporter and file_writer scripts and running them from Blender's built-in text editor. Also, to see any print statements you will need to run the Blender executable from a command prompt.
What Is Exported
At the moment the exporter handles the scene structure, meshes, materials, and animations.
Nodes are stored either inside the root chunk or inside another node chunk. This describes the scene hierarchy.
Meshes are stored inside the root chunk and are referenced from the nodes which use them. Although a Blender mesh might have multiple materials, we break it apart into several single-material meshes, and nodes keep a reference to each. We also convert all meshes to triangles to avoid doing this in engine.
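The material split can be illustrated with a small, self-contained sketch. The real exporter pulls this data out of bpy mesh structures, so the function name and the data shapes here are hypothetical:

```python
from collections import defaultdict

def split_by_material(triangles):
    """Group (vertex_indices, material_index) triangles into one list per
    material, mirroring how a multi-material Blender mesh becomes several
    single-material meshes on export. Illustrative only; the real exporter
    reads this data from bpy mesh structures."""
    meshes = defaultdict(list)
    for verts, mat_index in triangles:
        meshes[mat_index].append(verts)
    return dict(meshes)

# A quad already triangulated into two faces of material 0, plus one
# face of material 1, yields two single-material meshes.
tris = [((0, 1, 2), 0), ((0, 2, 3), 0), ((4, 5, 6), 1)]
meshes = split_by_material(tris)
```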
Materials are also stored inside the root chunk and referenced by meshes. These are fairly simple right now as our requirements are low. We export (color, metal, rough, normal) as defined by the texture "Influence" settings inside of Blender.
Animations are also stored inside the root chunk. Right now we only have one animation which is the global scene animation. The frame data is currently baked out for each frame, and we only store the key frames which actually affect the values of an object. Though this process of culling can make the export take a few seconds to complete, it is well worth it in file size.
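The culling step can be sketched independently of Blender. Assuming the baked data is a list of (frame, value) pairs (a simplification of the real channel data), a key survives only if its value differs from a neighbor's:

```python
def cull_keys(baked, epsilon=1e-6):
    """Drop baked keyframes whose value matches both neighbors, keeping
    the first and last frames so playback endpoints survive. `baked` is
    a list of (frame, value) pairs; an illustrative sketch, not the
    exporter's actual culling code."""
    if len(baked) <= 2:
        return list(baked)
    kept = [baked[0]]
    for prev, cur, nxt in zip(baked, baked[1:], baked[2:]):
        # Keep the key if the value changes on either side of it.
        if abs(cur[1] - prev[1]) > epsilon or abs(nxt[1] - cur[1]) > epsilon:
            kept.append(cur)
    kept.append(baked[-1])
    return kept

# Three identical frames followed by a jump collapse to four keys:
# both endpoints plus the frames bracketing the change.
baked = [(0, 1.0), (1, 1.0), (2, 1.0), (3, 2.0), (4, 2.0)]
culled = cull_keys(baked)
```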
Baking the animation was a point of decision for me, as we have the capability to simulate the Blender curves in engine and the baked data isn't always small. The deciding factor was that baking allowed for a much simpler integration on the engine side and let us use many Blender features, such as IK, without having to re-implement them in our engine.
As a side note: Within our engine we treat bones just like any other node. Blender, however, treats bones as a special object inside of an armature node. As you will see we have a few replicated functions in the script because of this fact.
The Binary Format
I love using XML, but for this project we decided on a binary format to store our data. We went with a chunked format similar to the Interchange File Format.
The data is broken into chunks: a 4-byte tag, followed by 4 bytes describing the length of the chunk, followed by the chunk data. Chunks can contain either raw data or additional chunks, allowing for a hierarchy.
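A minimal sketch of how such chunks might be serialized. The helper names (`chunk`, `f32x3`) are illustrative, not the actual file_writer.py API:

```python
import struct

def chunk(tag, payload):
    """Serialize one chunk: a 4-byte ASCII tag, a little-endian uint32
    length, then the payload bytes (raw data or nested chunks)."""
    assert len(tag) == 4
    return tag.encode("ascii") + struct.pack("<I", len(payload)) + payload

def f32x3(x, y, z):
    """Pack a 3-float vector, little-endian."""
    return struct.pack("<3f", x, y, z)

# A tiny NODE chunk containing a NAME chunk and an ORIG chunk.
# Nesting is just concatenating child chunks into the parent payload.
node = chunk("NODE",
             chunk("NAME", "root".encode("utf-8")) +
             chunk("ORIG", f32x3(0.0, 0.0, 0.0)))
```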
With the final data layout we decided to separate as much data as possible into individual tags. We wanted to avoid ending up with old files that were no longer compatible as the format grew. It's a small trade-off in memory, but I believe it's well worth it for the length of our project.
Since we are exclusively working on x86 computers, there wasn't much need to worry about byte endianness. We both read and write in little endian, which we consider the standard for our file format. The one special case is chunk tags, which we swapped to big endian. As the tags are 4 bytes represented by 4 characters, swapping the bytes makes them human-readable in a text editor if we need to inspect a binary file. Although we can't really read the values, it is still useful to inspect the layout of a file manually.
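The tag trick can be seen in a couple of lines of Python; this is just an illustration of the byte order, not code from the exporter:

```python
import struct

# The tag "VERT" stored as its four ASCII bytes reads left-to-right in a
# hex dump. Interpreted as a little-endian uint32 those bytes come out
# "reversed", so from the engine's little-endian point of view the tags
# are effectively big-endian integers.
tag = b"VERT"
as_big_endian = struct.unpack(">I", tag)[0]
```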
Below is a rough description of the final layout of the binary chunks and tags.
ROOT - root container
  VERS - model filetype version number
  NODE - A node in the scene (Multiple)
    NAME - string name
    ORIG - 3 float vector
    ROTA - 4 float quaternion
    SCAL - 3 float vector
    MESH - Index of mesh
    SKEL - Skeleton chunk
      SIZE - Number of bones
      BONE - String of the child bone's node name (Multiple)
    REST - Rest chunk
      ORIG - Rest origin
      ROTA - Rest rotation
      SCAL - Rest scale
    NODE... - (Multiple, Nested)
  MESH - Mesh chunk (Multiple)
    MATR - Index of material
    VERT - Vert chunk
      SIZE - Number of vertices
      CHAN - Channel of data
        TYPE - String representing the channel type ("position", "normal", "uv", "bone_weight")
        DATA - Raw vertex data
    FACE - Face chunk (Right now, always triangles)
      SIZE - Number of faces
      DATA - Raw face data
    SKEL - Name of the skeleton that drives the mesh
  MATR - Material (Multiple)
    NAME - string name
    TEXR - Texture chunk (Multiple)
      FILE - file path relative to model file
      TYPE - string describing channel ("color", "metal", "rough", "normal")
  ANIM - Animations chunk (Multiple)
    NAME - Animation name
    START - Start frame
    STOP - Stop frame
    TMSC - Timescale, multiplier for keyframes to seconds
    LAYR - layer
      NODE - Name of node affected
      CHAN - keyframe channel
        TYPE - target channel ("position_baked", "rotation_baked", "scale_baked")
        KYNM - number of keys
        KEYS - key data
          VALU - key value
          SPAN - span type
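Engine-side, a loader only needs to walk this layout recursively. Our real loader is C++, so the following walker is purely illustrative, assuming a hypothetical `iter_chunks` helper name:

```python
import struct

def iter_chunks(data):
    """Yield (tag, payload) pairs from a buffer of sibling chunks:
    each is a 4-byte ASCII tag, a little-endian uint32 length, then
    the payload. Nested chunks are parsed by calling this again on a
    container chunk's payload."""
    offset = 0
    while offset < len(data):
        tag = data[offset:offset + 4].decode("ascii")
        (size,) = struct.unpack_from("<I", data, offset + 4)
        yield tag, data[offset + 8:offset + 8 + size]
        offset += 8 + size

# Two sibling chunks: a NAME holding a string and a VERS holding a uint32.
buf = (b"NAME" + struct.pack("<I", 4) + b"root" +
       b"VERS" + struct.pack("<I", 4) + struct.pack("<I", 1))
chunks = dict(iter_chunks(buf))
```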
So far this solution has worked great for us, and we would recommend it to others trying to solve the 3D model integration problem. A time investment is required to learn how to properly export the data you want, but I believe it's well worth it: you get exactly the data you need, and the exporter can grow as your project does.
Hopefully you have found this post useful in your own efforts. If you have any questions about our implementation feel free to e-mail me directly.