14. Asset Pipelines and Formats

14.1. The Scene Pipeline

The scene pipeline supports the export of scenes from 3D modeling packages via Collada.

For a given Collada scene the dae2json tool will export the following elements:

  • Shapes
  • Materials
  • Effects
  • Images
  • Physics models
  • Physics materials
  • Physics bodies
  • Animations
  • Lights
  • Cameras
  • Node hierarchy

Materials, effects and lights can be overloaded at this stage or later at runtime. The dae2json tool supports passing include files that provide material, effect and light overloads. Overloads from the include files will be matched by name and will replace the elements from the Collada scene.

This is an example of how a JSON include file would look:

{
  "version": 1,
  "lights":
  {
    "point_light_1":
    {
      "type": "point",
      "color": [1, 1, 1],
      "material": "squarelight",
      "halfextents": [10, 20, 10]
    },
    "spot_light_1":
    {
      "type": "spot",
      "color": [0.89, 1, 0.99],
      "material": "conelight",
      "right": [0, 6, 26],
      "up": [21, 0, 0],
      "target": [0, -36, 9],
      "shadows": true
    },
    "directional_light":
    {
      "type": "directional",
      "color": [0.2, 0.2, 0.8],
      "direction": [0, -1, 0],
      "global": true
    },
    "red_ambient":
    {
      "type": "ambient",
      "color": [0.1, 0, 0],
      "global": true
    }
  },
  "effects":
  {
    "lambert-fx":
    {
      "type": "lambert",
      "meta":
      {
        "normals": true
      }
    }
  },
  "materials":
  {
    "squarelight":
    {
      "parameters":
      {
        "lightfalloff": "squarelight.dds",
        "lightprojection": "squarelight.dds"
      }
    },
    "conelight":
    {
      "parameters":
      {
        "lightfalloff": "white",
        "lightprojection": "roundlight.jpg"
      }
    },
    "rocket_colored":
    {
      "effect": "lambert-fx",
      "parameters":
      {
        "color": [1, 0.2, 0, 0.5],
        "diffuse": "textures/rocket.dds"
      },
      "meta":
      {
        "collisionFilter": [ "ALL" ],
        "noshadows": true,
        "transparent": true,
        "materialcolor": true
      }
    }
  }
}

These include files can be generated by hand or exported from other source assets. A Collada scene file can be converted by passing as many of them as needed as parameters to the tool.

The dae2json tool automatically converts Collada scenes to use 1 unit per meter and a Y vector that points upwards; some modeling packages use different conventions.

For details on generating your own JSON files see the Turbulenz Engine JSON formats documentation.

14.1.1. Workflow

  • Model scene in 3D modeling package
  • Export scenes to Collada (one or more Collada scene files)
  • Create include files with an editor supporting JSON or JavaScript
  • Execute dae2json on each Collada file, include files if required
  • Load converted scenes into user application

14.1.2. Collada feature limitations

Certain features in the Collada specification are unsupported or have limited support; these limitations are documented below.

  • instance_node with internal url - An instance_node in a Collada scene will be correctly inserted into the converted scene but will have been de-instanced. As a result any operations made to the source node at runtime will not be propagated automatically to its instances.
  • instance_node with external url - References to other Collada scene files will be inserted into the converted scene and the whole external hierarchy will be de-instanced at runtime. This may not be the required result if the reference was not linking to the root node of the external scene.

14.2. The Animation Pipeline

The animation pipeline supports the export of keyframed animation for hierarchies of nodes from 3D modeling packages via Collada.

For a given Collada scene the dae2json tool will export one or more animation clips along with the geometry and nodes that the animation targets. It is also possible to run the tool with a switch to export only the animation clips from the file; this is useful when the animations are modeled in multiple scenes within the modeling package and hence exported to multiple Collada scenes.

Animation clips can be exported to Collada scenes by setting them up with tools such as the Trax editor in Maya. Where multiple animation clips are present in the Collada scene, multiple animations will be exported to the output scene with matching names. Where no clips are present in the Collada scene, all the animations in the scene will be grouped into a single animation named “default” (this name can be overridden via a tool parameter).

14.2.1. Workflow

  • Model skinned/rigid objects in 3D modeling package
  • Create one or more animations in that scene or clones of the scene
  • Export animation scenes to Collada (one or more Collada scene files)
  • Execute dae2json on each Collada file (using the animation-only flag if there are many scenes)
  • Load converted scenes into user application

14.3. The Shader Pipeline

We support conversion of the CgFX shader file format to our internal format. For the broadest compatibility we recommend targeting the OpenGL ES 2.0 feature set in order to be compatible with WebGL and our compatibility mode.

We provide a tool for converting CgFX shaders to the Turbulenz Engine shader format.

For a given CgFX file the cgfx2json tool will create a JSON file containing a shader definition:

  • Techniques
  • Program code
  • Render states
  • Sampler states
  • Global shader parameters
  • Input semantics for vertex programs

Shader parameter semantics are ignored by the cgfx2json tool; parameters are matched at runtime by variable name.

It is recommended that the CgFX file compiles program code either into GLSL profiles or into ‘latest’.

For more information about the CgFX file format please read the NVidia tutorial.

For more information about JSON please visit json.org.

14.3.1. Workflows

Loading a shader definition file

  1. Create CgFX file with shader editor.
  2. Execute cgfx2json on each CgFX file to generate a JSON file containing the shader definition.
  3. Load contents of JSON file at runtime as a JSON string.
  4. Execute JSON.parse on the JSON string to create a JavaScript object.
  5. Create runtime Shader object by passing the JavaScript object to GraphicsDevice.createShader.
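The loading steps above can be sketched in JavaScript. The request function and the GraphicsDevice are passed in as parameters here because this is an illustrative sketch, not a specific engine API; only JSON.parse and createShader come from the steps above.

```javascript
// Sketch of steps 3-5: fetch the JSON file, parse it, create the Shader.
// requestFn and graphicsDevice are assumed parameters, not engine globals.
function loadShader(requestFn, graphicsDevice, shaderPath, onload) {
    requestFn(shaderPath, function (jsonString) {
        // Step 4: parse the JSON string into a JavaScript object
        var shaderDefinition = JSON.parse(jsonString);
        // Step 5: create the runtime Shader object
        onload(graphicsDevice.createShader(shaderDefinition));
    });
}
```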

Inlining a shader definition

Steps 1 and 2 are the same as in the loading case.

  3. Copy and paste the contents of the JSON file into your JavaScript code, assigning it to a variable.
  4. Create a runtime Shader object by passing the variable to GraphicsDevice.createShader.
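A minimal sketch of the inlined approach; the definition contents below are abbreviated placeholders rather than a real shader.

```javascript
// The JSON file contents pasted into code and assigned to a variable
// (the object body here is an abbreviated, hypothetical placeholder).
var shaderDefinition = {
    "version": 1,
    "name": "simple.cgfx",
    "techniques": {}
};

// The variable is passed straight to createShader; graphicsDevice is
// assumed to be an existing GraphicsDevice object.
function createInlinedShader(graphicsDevice) {
    return graphicsDevice.createShader(shaderDefinition);
}
```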

This workflow is less flexible than loading the shader definition file at runtime but it avoids the added latency of requesting the file. The CPU cost of parsing the JSON string from the JSON file to create a JavaScript object is about the same as the cost of parsing and executing the JavaScript code that contains the shader definition.

14.4. The Turbulenz Engine Asset Formats

We provide a set of tools for our JSON format.

A JSON file is an object which can in turn contain more objects.

Objects are defined in a similar format as in JavaScript:

{
  "objectName": {
    "objectProperty1": "String",
    "objectProperty2": ["Array1", "Array2"],
    "anotherObject": {
      "anotherObjectProperty": 5
    }
  }
}

In our JSON format we have 2 object types: objects and collections.

Objects have well defined property names. For example, the “geometries” object will always have “inputs”, “sources” and “surfaces” properties. Any other properties on the “geometries” object would be ignored.

An object is a collection if it does not have well defined property names. Generally each property of a collection refers to an object and the property name is used as the name of the object. For example:

"roomItems": {
  "rug": {
    "color": [1, 0, 0]
  },
  "table": {
    "color": [0, 1, 0]
  },
  "chair": {
    "color": [0, 0, 1]
  }
}

Here roomItems is a collection which contains three objects: rug, table and chair.

Note

We will refer to the objects in the JSON file as JSON objects to avoid confusion with their similarly named JavaScript object equivalents.

All of the matrices in a Turbulenz JSON file are 4 rows of 3 columns and should be given as a row major order array of 12 numbers.

The top level object accepts the following properties:

  • “geometries”
  • “skeletons”
  • “images”
  • “lights”
  • “materials”
  • “effects”
  • “nodes”
  • “animations”
  • “physicsmaterials”
  • “physicsmodels”
  • “physicsnodes”

14.4.1. JSON geometries

The JSON geometries object is a collection of JSON geometry objects. Each JSON geometry object is used to create a Geometry object in the scene.

Here is an example of the JSON geometries object:

"geometries": {
  "floor": {
    "inputs": {
      "NORMAL": {
        "offset": 0,
        "source": "normal"
      },
      "POSITION": {
        "offset": 0,
        "source": "position"
      },
      "TEXCOORD0": {
        "offset": 1,
        "source": "texturemap"
      }
    },
    "sources": {
      "texturemap": {
        "data": [0, 1, 1, 1, 0, 0, 1, 0],
        "max": [1, 1],
        "min": [0, 0],
        "stride": 2
      },
      "normal": {
        "data": [0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0],
        "max": [0, 1, 0],
        "min": [0, 1, 0],
        "stride": 3
      },
      "position": {
        "data": [-10, 0, 10,
               10,  0, 10,
               -10, 0, -10,
               10,  0, -10],
        "max": [10,  0, 10],
        "min": [-10, 0, -10],
        "stride": 3
      }
    },
    "surfaces": {
      "phong_floorSG": {
        "numPrimitives": 2,
        "triangles": [1, 3, 2, 0, 0, 2, 1, 3, 3, 1, 2, 0]
      }
    },
    "meta": {
        "graphics": true
    }
  }
}

Each JSON geometry object contains the following:

inputs

Has some of the following properties (up to 16) describing the inputs for semantic types:

ATTR0,  POSITION0,       POSITION
ATTR1,  BLENDWEIGHT0,    BLENDWEIGHT
ATTR2,  NORMAL0,         NORMAL
ATTR3,  COLOR0,          COLOR
ATTR4,  COLOR1,          SPECULAR
ATTR5,  FOGCOORD,        TESSFACTOR
ATTR6,  PSIZE0,          PSIZE
ATTR7,  BLENDINDICES0,   BLENDINDICES
ATTR8,  TEXCOORD0,       TEXCOORD
ATTR9,  TEXCOORD1
ATTR10, TEXCOORD2
ATTR11, TEXCOORD3
ATTR12, TEXCOORD4
ATTR13, TEXCOORD5
ATTR14, TEXCOORD6,       TANGENT0,         TANGENT
ATTR15, TEXCOORD7,       BINORMAL0,        BINORMAL

The semantics on the same line are equivalent, so each geometry can have only one semantic from each line. These are the semantics supported by CgFX; see the “(GP4GP) Semantics” section of http://developer.download.nvidia.com/cg/Cg_3.0/Cg-3.0_July2010_ReferenceManual.pdf.

Each semantic property object contains:

  • “offset” - The offset of this input in the surface definitions. So for the example above we have the surface indices list (split into triangles and vertices):

    "triangles": [((1, 3), (2, 0), (0, 2)), ((1, 3), (3, 1), (2, 0))]

    So the offset gives the position within a vertex. Here NORMAL and POSITION have offset 0 and TEXCOORD0 has offset 1.

    So the first vertex for this surface has:

    • NORMAL with the value at the sources “normal” data index 1.
    • POSITION with the value at the sources “position” data index 1.
    • TEXCOORD0 with the value at the sources “texturemap” data index 3.

    If TEXCOORD0 had offset 0, NORMAL had offset 1 and POSITION had offset 2 then the surface indices list would be:

    "triangles": [((3, 1, 1), (0, 2, 2), (2, 0, 0)), ((3, 1, 1), (1, 3, 3), (0, 2, 2))]
  • “source” - A string reference of the source for this input.

sources

A collection of sources for the inputs. Each source object contains:

  • “data” - An array of data values (integers or floats). Its length must be a multiple of stride.
  • “max” - An array (integers or floats) of length equal to stride, containing the maximum value of each component over all the data.
  • “min” - An array (integers or floats) of length equal to stride, containing the minimum value of each component over all the data.
  • “stride” - The length of one element of data. For example, an array of 3D vertices would have stride 3.
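As an illustrative sketch (not part of any tool), the “max” and “min” arrays for a source can be computed from its data and stride like this:

```javascript
// Compute per-component "max" and "min" arrays for a source, as
// described above: each array has length equal to stride.
function computeSourceBounds(data, stride) {
    var max = data.slice(0, stride);  // start from the first element
    var min = data.slice(0, stride);
    for (var i = stride; i < data.length; i += stride) {
        for (var c = 0; c < stride; c++) {
            if (data[i + c] > max[c]) { max[c] = data[i + c]; }
            if (data[i + c] < min[c]) { min[c] = data[i + c]; }
        }
    }
    return { max: max, min: min };
}
```

Running this over the “position” data from the floor example above reproduces its stated bounds.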
surfaces

A collection of JSON surface objects. Each JSON surface object links together the inputs for a geometry in order to create a surface.

Each JSON surface object contains:

  • “numPrimitives” - The number of primitives that make up this surface.

  • “lines”, “triangles” - An array of vertex indices which connect the input objects. This array is of length numPrimitives * primitiveSize * (maxOffsets + 1). Here, maxOffsets is the maximum offset value of all of the inputs for this geometry and primitiveSize is the number of vertices for the selected primitive. If you take the triangles array from the example above:

    "triangles": [1, 3, 2, 0, 0, 2, 1, 3, 3, 1, 2, 0]
    
                 [(1, 3, 2, 0, 0, 2), (1, 3, 3, 1, 2, 0)]
                 [((1, 3), (2, 0), (0, 2)), ((1, 3), (3, 1), (2, 0))]

    The second line here shows how the array is grouped into triangles. The third line shows how the triangles are grouped into vertices.

    In this example both the inputs POSITION and NORMAL have offset 0 and so they both share the same indices. The input TEXCOORD0 has offset 1. This means that each vertex is made up of 2 (maxOffsets + 1) indices. The first value is the index in the POSITION and NORMAL inputs. The second value is the index in the TEXCOORD0 input.
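The index layout described above can be illustrated with a small sketch that expands a surface's index list into per-vertex attribute values. The function and variable names here are invented for illustration; they are not engine API.

```javascript
// Expand a surface's index list into per-vertex attribute values,
// assuming the inputs/sources/surfaces layout shown above.
function expandVertices(geometry, surfaceName) {
    var inputs = geometry.inputs;
    var sources = geometry.sources;
    var indices = geometry.surfaces[surfaceName].triangles;
    // Each vertex uses (maxOffset + 1) indices from the array
    var maxOffset = 0;
    for (var name in inputs) {
        if (inputs[name].offset > maxOffset) { maxOffset = inputs[name].offset; }
    }
    var indicesPerVertex = maxOffset + 1;
    var vertices = [];
    for (var v = 0; v < indices.length; v += indicesPerVertex) {
        var vertex = {};
        for (var semantic in inputs) {
            var input = inputs[semantic];
            var source = sources[input.source];
            var index = indices[v + input.offset];
            // Pull one stride-sized element out of the source data
            vertex[semantic] = source.data.slice(index * source.stride,
                                                 (index + 1) * source.stride);
        }
        vertices.push(vertex);
    }
    return vertices;
}
```

Applied to the floor geometry above, the first vertex comes out with POSITION [10, 0, 10] and TEXCOORD0 [1, 0], matching the walkthrough.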

meta

Meta data for the JSON geometry object.

  • “graphics” - Specifies that the geometry is intended to be interpreted as a graphical geometry, not as a “physics collision mesh”, for example.

14.4.2. JSON images

The JSON images object is a collection of file references. Each file reference is a string containing the relative path to the image.

Effects and materials use the images object as a reference for image files. For example:

"effects": {
  "duck": {
    "parameters": {
      "diffuse": "duckImage"
    },
    "type": "blinn"
  },
  "crate": {
    "parameters": {
      "diffuse": "textures/crate.png"
    },
    "type": "blinn"
  }
},
"images": {
  "duckImage": "textures/duck.png"
}

Then at load time the duck effect diffuse string would be replaced with “textures/duck.png”.

Effects and materials can also reference a file directly (the example crate effect does this). Direct referencing should be used when an image is only used a few times or only by unrelated effects or materials.

The images object prevents image sources from being duplicated and makes maintenance easier.
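The substitution described above can be sketched as follows. This is an illustrative helper, not the engine's actual loader code:

```javascript
// Replace image-name references in effect parameters with the file
// paths from the "images" collection; direct file references are
// left untouched.
function resolveImageReferences(effects, images) {
    for (var e in effects) {
        var params = effects[e].parameters;
        for (var p in params) {
            var value = params[p];
            if (typeof value === "string" && images.hasOwnProperty(value)) {
                params[p] = images[value];  // e.g. "duckImage" -> "textures/duck.png"
            }
        }
    }
}
```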

14.4.3. JSON lights

The JSON lights object is a collection of JSON light objects. Each JSON light is used to create a Light object in the scene.

A JSON light is a flexible object allowing light objects to contain the parameters required by any custom renderer. This means that JSON light objects can have any properties on them. The light object’s prototype is set to its JSON light object, allowing access to any custom properties on the JSON light object.

For possible JSON light properties see the documentation for the light.create function, with the exception of the following:

type (defaults to point)

A string with one of the following values:

  • “directional”
  • “spot”
  • “ambient”
  • “point”

halfExtents
An array of 3 numbers.

See the Light object documentation for more information on these properties.

14.4.4. JSON materials & effects

The JSON materials object is a collection of JSON material objects. The JSON effects object is a collection of JSON effect objects. Each JSON material object is used to create a Material object in the scene. JSON effect objects are shared between materials with similar effects to reduce duplication of data.

The JSON material objects have the following properties:

effect, parameters

These two parameters are both used in the construction of material.techniqueParameters.

Initially, the effect property string is checked for a reference to a JSON effect. If it is a reference then the JSON effect's parameters are used to populate the techniqueParameters. Then the JSON material's parameters properties, if defined, overwrite entries in techniqueParameters.

This is best explained with an example:

"effects": {
  "colouredMaterial": {
    "parameters": {
      "ambient": [0, 0, 0, 1],
      "diffuse": "grey.png"
    },
    "type": "phong"
  }
},
"materials": {
  "grey-material": {
    "effect": "colouredMaterial"
  },
  "yellow-material": {
    "effect": "colouredMaterial",
    "parameters": {
      "diffuse": "yellow.png"
    }
  },
  "green-material": {
    "effect": "blinn",
    "parameters": {
      "diffuse": "green.png"
    }
  }
}

The first two materials use the same effect. However, the “yellow” material overwrites the diffuse texture set by the effect. Both materials will have effect type “phong”, while the “green” material has effect type “blinn”. The material.techniqueParameters objects for each material will therefore be as follows:

grey: {
  techniqueParameters: {
    ambient: [0, 0, 0, 1],
    diffuse: "grey.png"
  }
}

yellow: {
  techniqueParameters: {
    ambient: [0, 0, 0, 1],
    diffuse: "yellow.png"
  }
}

green: {
  techniqueParameters: {
    diffuse: "green.png"
  }
}

This example is not in JSON format since it is showing the values of the JavaScript objects after they have been loaded.

Any properties on the parameters objects with string values are assumed to be file references. See the JSON images object for more information on file references.

If the effect property is not a reference then it is taken as the material's effect type. For supported effects see the rendering documentation.
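The resolution order described above can be sketched with an illustrative helper (not engine code; the function name is invented):

```javascript
// Build techniqueParameters for a material: start from the referenced
// JSON effect's parameters, then let the material's own parameters
// overwrite them.
function buildTechniqueParameters(material, effects) {
    var techniqueParameters = {};
    var p;
    var effect = effects[material.effect];  // undefined if not a reference
    if (effect && effect.parameters) {
        for (p in effect.parameters) {
            if (effect.parameters.hasOwnProperty(p)) {
                techniqueParameters[p] = effect.parameters[p];
            }
        }
    }
    if (material.parameters) {
        for (p in material.parameters) {
            if (material.parameters.hasOwnProperty(p)) {
                techniqueParameters[p] = material.parameters[p];
            }
        }
    }
    return techniqueParameters;
}
```

Applied to the example above, the “yellow” material ends up with the effect's ambient value and its own diffuse texture.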

meta

The meta object contains possible extra information needed by the renderers. See the rendering documentation for valid values.

  • DefaultRendering meta.
  • ForwardRendering meta.
  • DeferredRendering meta.

The JSON effect has the following parameters:

parameters, effectType
See the documentation for the effect property on the JSON material object.

14.4.5. JSON nodes

The JSON nodes object is a collection of JSON node objects. Each JSON node object is used to create a SceneNode object in the scene. Since nodes are referenced by their paths in the node hierarchy it is possible for two nodes to have the same name. However, two sibling nodes should not have the same name since they would then have the same path (this also applies to root nodes).

The JSON node objects have the following properties:

geometryinstances

This object is a collection of JSON geometryinstance objects. Each JSON geometryinstance object is used to create a GeometryInstance object in the scene.

The JSON geometryinstance objects have the following properties:

  • “geometry” - A string reference to a JSON geometry object.
  • “material” - A string reference to a JSON material object.
  • “surface” - A string reference to a JSON surface object. The referenced surface must be inside the JSON geometry object referenced by the geometry property.
  • “skinning” - Set to true if the geometry is skinned.
dynamic, disabled, kinematic
For setting these properties on the SceneNode object.
nodes

A JSON nodes object for the child nodes. For example:

"nodes": {
  "character": {
    "dynamic": true,
    "nodes": {
      "root": {
        "dynamic": true,
        "nodes": {
          "chest": {
            "dynamic": true
          },
          "head": {
            "dynamic": true
          },
          "legs": {
            "dynamic": true,
            "nodes":
            {
              "leftLeg": {
                "dynamic": true
              },
              "rightLeg": {
                "dynamic": true
              }
            }
          }
        }
      }
    }
  }
}
camera

This object will be copied onto the SceneNode object's camera property. You can access this object later with:

var cameraNode = scene.findNode("cameraNode");
var camera = cameraNode.camera;
lightinstances

A JSON lightinstances object is a collection of JSON lightinstance objects. Each JSON lightinstance object is used to create a LightInstance object on the scene node.

The JSON node object can also have the following properties:

inplace (defaults to false)
A boolean flag. See the reference property.
reference

A string reference to another turbulenz JSON file object. Currently, the hash character, “#”, is not allowed in file references and any reference containing a hash will be ignored.

If the inplace flag is set to true then the external reference is loaded into the top level object. Be careful about name clashes when using this flag. If the flag is false, the external reference's JSON node objects are loaded in as this JSON node object's children (added to its nodes property).

matrix (defaults to identity matrix)
An array of 12 values giving the local transformation matrix (4 by 3) of the node.

Here is an example of a JSON nodes object which represents a collection containing a camera node in the scene:

"nodes": {
  "cameraNode": {
    "geometryinstances": {
      "instance0": {
        "geometry": "geometry-camera",
        "material": "material-camera",
        "surface": "geometry-camera-surface0",
        "skinning": false
      }
    },
    "camera": {
      "comment0": "You can put any custom properties in here.",
      "comment1": "They will be copied onto scene nodes camera property.",
      "comment2": "For example:",
      "cameraOffset": [0.1, 0.5, 0]
    },
    "matrix": [1, 0, 0,
               0, 1, 0,
               0, 0, 1,
               -5, 4, 2],
    "dynamic": false,
    "disabled": false,
    "kinematic": false,
    "lightinstances": "light-camera"
  }
}

14.4.6. JSON skeletons

The JSON skeletons object is a collection of JSON skeleton objects. Each JSON skeleton object has the following properties:

numNodes
The number of bones in the skeleton. Each of the following properties should be arrays of numNodes length.
invBoneLTMs

An array of bone inverse local transform matrices (4 by 3). This can be computed by the following method:

  1. Set invBoneLTMs equal to the inverse of the current bone’s bind pose transform.
  2. Go up the tree by following the index in the parents array.
  3. Multiply invBoneLTMs by the inverse of this bone’s bind pose transform.
  4. Repeat steps 2 and 3 until you hit -1 in the parents array.
names
An array of strings giving the name of each bone in the skeleton.
bindPoses
An array of bone bind pose transform matrices (4 by 3). These transforms define the transforms of the bones when the mesh was attached. Each bone bind pose transform matrix gives the transformation into bone space from the parent bone space.
parents
An array of indices giving the index of the parent of each bone in the skeleton. This must be ordered such that a bone's parent is declared before its children (starting at index 0). The root has parent index -1 (and must be the first in the array).

Each index in the 4 arrays represents a bone in the skeleton. Here is an example of a JSON skeletons object which represents a collection containing a basic human skeleton:

"skeletons": {
  "basicHuman": {
    "numNodes": 10,
    "names": ["head",
              "chest",
              "upperRightLeg",
              "lowerRightLeg",
              "upperRightArm",
              "lowerRightArm",
              "upperLeftLeg",
              "lowerLeftLeg",
              "upperLeftArm",
              "lowerLeftArm"],
    "parents": [-1, 0, 1, 2, 1, 4, 1, 6, 1, 8],
    "bindPoses": [[1, 0, 0,
                   0, 1, 0,
                   0, 0, 1,
                   0, 0, 0],
                  [1, 0, 0,
                   0, 1, 0,
                   0, 0, 1,
                   0, -5, 0],
                   ... 8 more bind pose matrices],
    "invBoneLTMs": [[1, 0, 0,
                     0, 1, 0,
                     0, 0, 1,
                     0, 0, 0],
                    [1, 0, 0,
                     0, 1, 0,
                     0, 0, 1,
                     0, 5, 0],
                    ... 8 more inverse local transform matrices]
  }
}
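The invBoneLTMs construction described above can be sketched with plain 4x3 matrix helpers. The helper names are invented and the multiplication order assumes the row-vector convention implied by the row-major 12-number layout; treat this as an illustrative sketch rather than the tool's actual code.

```javascript
// 4x3 row-major matrices stored as 12 numbers:
// [r00..r22 rotation rows, then tx, ty, tz].
function m43Mul(a, b) {  // compose: apply a, then b (row-vector convention)
    var r = new Array(12);
    for (var row = 0; row < 4; row++) {
        for (var col = 0; col < 3; col++) {
            var v = a[row * 3] * b[col] +
                    a[row * 3 + 1] * b[3 + col] +
                    a[row * 3 + 2] * b[6 + col];
            if (row === 3) { v += b[9 + col]; }  // add b's translation
            r[row * 3 + col] = v;
        }
    }
    return r;
}

function m43Inverse(m) {
    // Invert the 3x3 part via cofactors, then the translation
    var a = m[0], b = m[1], c = m[2],
        d = m[3], e = m[4], f = m[5],
        g = m[6], h = m[7], i = m[8];
    var det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g);
    var A = (e * i - f * h) / det, B = (c * h - b * i) / det, C = (b * f - c * e) / det;
    var D = (f * g - d * i) / det, E = (a * i - c * g) / det, F = (c * d - a * f) / det;
    var G = (d * h - e * g) / det, H = (b * g - a * h) / det, I = (a * e - b * d) / det;
    var tx = m[9], ty = m[10], tz = m[11];
    return [A, B, C, D, E, F, G, H, I,
            -(tx * A + ty * D + tz * G),
            -(tx * B + ty * E + tz * H),
            -(tx * C + ty * F + tz * I)];
}

// Steps 1-4 from the text above, for every bone in the skeleton.
function buildInvBoneLTMs(bindPoses, parents) {
    var invLTMs = [];
    for (var n = 0; n < parents.length; n++) {
        var acc = m43Inverse(bindPoses[n]);      // step 1
        var p = parents[n];
        while (p !== -1) {                       // steps 2-4
            acc = m43Mul(m43Inverse(bindPoses[p]), acc);
            p = parents[p];
        }
        invLTMs.push(acc);
    }
    return invLTMs;
}
```

Fed the two bind poses shown in the example (identity and a translation of [0, -5, 0]), this reproduces the example's invBoneLTMs values.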

14.4.7. JSON animations

The JSON animations object is a collection of JSON animation objects.

Each JSON animation object can have the following properties:

bounds

An array of objects giving the axis aligned bounding box of the mesh at a set of keyframes of the animation. Each object in the array has the following properties:

  • “center” - The center of the AABB for the mesh.
  • “halfExtent” - The half extents of the AABB for the mesh.
  • “time” - The keyframe time.
channels

The channels that this animation affects. Supported properties are:

  • “rotation”
  • “translation”
  • “scale”
hierarchy

This property is similar to a JSON skeleton object without the binding information. It takes the following properties:

  • “numNodes” - The number of bones in the animation hierarchy. Each of the following properties should be arrays of numNodes length.
  • “names” - An array of strings giving the name of each bone in the animation hierarchy.
  • “parents” - An array of indices giving the index of the parent of each bone in the animation hierarchy. This must be ordered such that a bone's parent is declared before its children (starting at index 0). The root has parent index -1 (and must be the first in the array).

This hierarchy need not be the same as the skeleton that the geometry uses. However, the input to a GPUSkinController object must have the same skeleton as the geometry.

length
The length of the animation in seconds.
numNodes
The length of the nodeData array. This should be equal to the numNodes property on the hierarchy object.
nodeData

An array of nodeData JSON objects. This array gives the inputs for each bone's animation and its length equals the numNodes property on the hierarchy object. Each nodeData JSON object can have a baseframe object property, a keyframes object property, or both.

A baseframe should be provided for channels on the bone that do not change during the animation. If a keyframe object attempts to use a channel defined by the baseframe then the keyframe object’s values for that channel will be ignored. If a baseframe is provided for each channel then the bone’s transform will not change during the animation.

A keyframes object should be provided when the bone is animated. The keyframes object is an array of keyframe objects. Each keyframe object gives the transform of the bone at a certain time, to be interpolated by an InterpolatorController object.

Both baseframe and keyframe objects can have the following properties, which together form a transform for each bone:

  • “rotation” - A normalized quaternion giving the rotation as an array of 4 numbers.
  • “translation” - A vector giving the translation as an array of 3 numbers.
  • “scale” - A vector giving the scale as an array of 3 numbers.

Custom channel properties that are set on the channels object may also appear (a custom channel's value must be an array of numbers). In addition, each keyframe object requires:

  • “time” - The time in seconds from the start of the animation until this keyframe. It must be greater than the time of the keyframe that precedes it and less than the animation's length property. “time” values do not have to be at uniform intervals. If the first keyframe's time is not 0 then the InterpolatorController will use the values from the first keyframe while it waits for the animation to begin. Likewise, if the last keyframe's time is less than the animation's length property, the values from the last keyframe are held until the animation ends.

The keyframes array must contain at least two keyframe objects: a start and an end.
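The baseframe/keyframes behavior described above can be sketched for a single channel. This is a simplified illustration: the real InterpolatorController may behave differently, and quaternion rotation channels need spherical rather than linear interpolation.

```javascript
// Sample one channel from a nodeData entry at time t: a baseframe value
// wins if it defines the channel; otherwise keyframes are linearly
// interpolated, holding the first/last keyframe outside their range.
function sampleChannel(nodeData, channel, time) {
    if (nodeData.baseframe && nodeData.baseframe[channel]) {
        return nodeData.baseframe[channel];  // static channel
    }
    var keys = nodeData.keyframes.filter(function (k) {
        return k[channel] !== undefined;
    });
    if (time <= keys[0].time) { return keys[0][channel]; }  // hold first
    var last = keys[keys.length - 1];
    if (time >= last.time) { return last[channel]; }        // hold last
    for (var i = 1; i < keys.length; i++) {
        if (time <= keys[i].time) {
            var a = keys[i - 1], b = keys[i];
            var s = (time - a.time) / (b.time - a.time);
            return a[channel].map(function (av, j) {
                return av + s * (b[channel][j] - av);
            });
        }
    }
}
```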

meta
Custom information for the animation controllers.

Here is an example of a JSON animations object which represents a collection containing a robot arm animation:

"animations": {
  "robotArmPickUp": {
    "hierarchy": {
      "numNodes": 5,
      "names": ["base",
                "upperArm",
                "lowerArm",
                "leftClaw",
                "rightClaw"],
      "parents": [-1, 0, 1, 2, 2]
      },
    "numNodes": 5,
    "length": 2.5,
    "channels": {
        "rotation": true,
        "scale": true
    },
    "bounds": [
      {
        "center": [3, 3, 0],
        "halfExtent": [3, 3, 1],
        "time": 0
      },
      {
        "center": [4, 4, 4],
        "halfExtent": [4, 4, 4],
        "time": 1.0
      },
      {
        "center": [0, 4, 4],
        "halfExtent": [1, 4, 4],
        "time": 2
      }
    ],
    "nodeData": [
      {
        "keyframes": [
          {
            "rotation": [0, 0, 0, 1],
            "scale": [1, 1, 1],
            "time": 0
          },
          {
            "rotation": [0, 0.706, 0, 0.707],
            "scale": [1, 1, 1],
            "time": 1
          },
          {
            "rotation": [0, 1, 0, 0],
            "scale": [1, 1, 1],
            "time": 2
          }
        ]
      },
      {
        "baseframe":
        {
          "rotation": [0, 0, 0, 1],
          "scale": [1, 1, 1]
        }
      },
      {
        "keyframes": [
          {
            "rotation": [0, 0, 0, 1],
            "scale": [1, 1, 1],
            "time": 0.5
          },
          {
            "rotation": [0, 0, 0, 1],
            "scale": [1, 1.25, 1],
            "time": 1.5
          }
        ]
      },
      {
        "baseframe": {
            "scale": [1, 1, 1]
        },
        "keyframes": [
          {
            "rotation": [0, 0, 0, 1],
            "time": 1
          },
          {
            "rotation": [1, 0, 0, 1.57],
            "time": 2.5
          }
        ]
      },
      {
        "baseframe": {
            "scale": [1, 1, 1]
        },
        "keyframes": [
          {
            "rotation": [0, 0, 0, 1],
            "time": 1
          },
          {
            "rotation": [1, 0, 0, -1.57],
            "time": 2.5
          }
        ]
      }
    ]
  }
}

14.4.8. JSON physicsmaterials

The JSON physicsmaterials object is a collection of JSON physicsmaterial objects.

A JSON physicsmaterial object has the following properties.

dynamic_friction
The friction value of the material.
restitution
The coefficient of restitution. Must be a value between 0 and 1 inclusive.
collisionFilter

An array of strings of the following types:

  • “ALL”
  • “DYNAMIC”
  • “CHARACTER”
  • “PROJECTILE”
  • “STATIC”
  • “KINEMATIC”

For more information see PhysicsDevice filters.

14.4.9. JSON physicsmodels

The JSON physicsmodels object is a collection of JSON physicsmodel objects. Each JSON physicsmodel object is used to create a collision Shape together with either a RigidBody or a CollisionObject (see the dynamic property below).

Here is an example of a JSON physicsmodels object:

"physicsmodels": {
  "cone": {
    "dynamic": true,
    "mass": 1,
    "material": "Cone-PhysicsMaterial",
    "height": 1,
    "radius": 1,
    "shape": "cone"
  },
  "cube": {
    "dynamic": true,
    "mass": 1,
    "material": "Cube-PhysicsMaterial",
    "halfExtents": [1, 3, 0.5],
    "shape": "box"
  },
  "sphere": {
    "dynamic": true,
    "mass": 1,
    "material": "Sphere-PhysicsMaterial",
    "radius": 1,
    "shape": "sphere"
  },
  "mesh": {
    "dynamic": true,
    "mass": 1,
    "material": "Convexhull-PhysicsMaterial",
    "geometry": "phong_floorSG",
    "shape": "mesh"
  }
}

A JSON physicsmodel object has the following properties.

halfExtents
The half extents of the box Shape (see example above). Not used for any other shapes.
height, radius
Used to define the following Shapes: sphere, cone, capsule, cylinder.
geometry
A string reference to a JSON geometry object. Used to define the convex hull and mesh Shapes.
material
A string reference to a JSON physicsmaterial object.
shape

A string naming the model's collision shape; possible values are:

  • “box”
  • “sphere”
  • “cone”
  • “capsule”
  • “cylinder”
  • “convexhull”
  • “mesh”

For more information see the PhysicsDevice object.

kinematic
If true, sets the node to be kinematic.
dynamic
If true, a RigidBody object will be created. If false, a CollisionObject object will be created.
mass, inertia, velocity, angularvelocity
See PhysicsDevice.createRigidBody function parameters.
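The dynamic flag's behavior can be sketched as follows. This is a hedged illustration: the factory functions are passed in as assumed callbacks rather than being a specific PhysicsDevice API, and the parameter names simply mirror the properties listed above.

```javascript
// Choose between a rigid body and a collision object for a JSON
// physicsmodel; createRigidBody and createCollisionObject are assumed
// factory callbacks, not a documented engine signature.
function createPhysicsObject(model, shape, createRigidBody, createCollisionObject) {
    if (model.dynamic) {
        // dynamic: true -> a RigidBody with the listed body parameters
        return createRigidBody({
            shape: shape,
            mass: model.mass,
            inertia: model.inertia,
            velocity: model.velocity,
            angularvelocity: model.angularvelocity
        });
    }
    // dynamic: false -> a static CollisionObject
    return createCollisionObject({ shape: shape });
}
```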

14.4.10. JSON physicsnodes

The JSON physicsnodes object is a collection of JSON physicsnode objects. Each JSON physicsnode object links a JSON node to a JSON physicsmodel.

A JSON physicsnode object has the following properties.

body
A JSON physicsmodel object.
target
A JSON node object.
