What can you do?
Here is a list of existing, in-development and planned product features and functionality.
If you do not see something that is important to you, why not Contact Us and tell us about it?
Organisation of previs work and data in projects, including all resources that comprise the scene, such as 3D models, textures, animations and shots.
- Projects contain basic information such as the project name, customer, assigned persons, creation and edit dates, and deadline or publishing dates. All data related to the project, such as 3D models, the scene graph, textures, animation data, shots, rendered videos, images, lists and metadata, is stored within the project.
Projects are organised into two sub-structures:
- Scenes contain all 3D-related information, such as the positions of all 3D models stored in a scene graph, textures assigned to objects, positions of lights and cameras, as well as animation data.
- Shots define a view onto a scene. They are represented as 2D views and have a camera from the scene assigned that defines this specific viewpoint.
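The project, scene and shot structure described above can be sketched as a simple data model. This is illustrative only; the class and field names below are assumptions, not the product's actual schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scene:
    # 3D-related information: scene graph, lights, cameras (simplified)
    name: str
    scene_graph: list = field(default_factory=list)
    lights: list = field(default_factory=list)
    cameras: list = field(default_factory=list)

@dataclass
class Shot:
    # A 2D view onto a scene, defined by one of the scene's cameras
    name: str
    scene: Scene
    camera: str  # name of a camera belonging to the scene

@dataclass
class Project:
    # Basic project information plus all contained scenes and shots
    name: str
    customer: str
    scenes: List[Scene] = field(default_factory=list)
    shots: List[Shot] = field(default_factory=list)
```

The key point is the split: scenes hold the 3D world, while shots only reference a scene plus one of its cameras, so many shots can reuse the same set.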
Import & Export
For integration into production pipelines, standard input and output file format support is offered.
Import:
- 3D models – OBJ, STL and FBX
- Images and textures – BMP, TIF, TGA, JPG, PNG and PSD
- Sound – MP3, OGG and WAV
- Characters and rigged objects – Unity prefab objects
- Video (green screen and CGI) – MOV, MPG, MPEG, MP4, AVI and ASF
Export:
- Video – MOV, MPEG, MP4 and AVI
- Images – JPG and PNG
- FirstStage assets (animations, motion capture and visual effects, as well as the created set) – OBJ, STL and FBX
- Lists (data about the project or scene, timing information, asset dimensions, etc.) – CSV
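Because list exports are plain CSV, they can be consumed by any standard tooling. A hypothetical example (the column names below are assumptions, not the actual export schema):

```python
import csv
import io

# Hypothetical exported asset list; the columns are assumptions
exported = """asset,width_m,height_m,depth_m,start_s,end_s
Chair_01,0.5,0.9,0.5,0.0,12.4
Crane_Cam,2.1,3.0,1.2,4.0,30.0
"""

# Parse with Python's standard csv module
rows = list(csv.DictReader(io.StringIO(exported)))
for row in rows:
    print(row["asset"], row["start_s"], row["end_s"])
```

The same file opens directly in spreadsheet applications or pipeline scripts, which is the point of choosing CSV for list data.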
Sketching or Modelling
- Create and model 2D and 3D objects for outlining and dressing scenes, directly in 3D VR.
- 3D modelling by sketching shapes and manipulating vertices, edges and faces.
- Paint and texture created models using a variety of tools.
- Sketch 2D images as assets, or attach them to assets, to quickly visualise and communicate ideas.
Assets and Layout
Create 3D scenes using assets, models, animations, effects, etc. Import, select and interact with pre-made 3D objects and animations from a database. Assets can be instantiated, and their position, rotation and scale defined.
- Search and import assets from asset libraries via metadata to find the assets needed.
- Placeholder assets are defined by a descriptive name, which allows simple sketched or modelled assets to be auto-replaced with higher-detail alternatives at any time.
- Protect assets
- Group and/or parent assets to form a new asset that can be positioned in the scene. All grouped assets inherit the parameters of the group.
- Snap placement assists accurate placement and alignment.
- Rotate, scale and move assets using the Gizmo tool.
Sketches of shots can be used as a background for the layout phase to build a set according to the drafts.
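The grouping behaviour above follows the usual scene-graph convention: a child's world transform is the group's transform applied to the child's local transform. A minimal sketch of that idea, restricted to translation and uniform scale (rotation omitted for brevity; this is an illustration, not the product's implementation):

```python
# Minimal scene-graph parenting sketch: children inherit the group's
# translation and uniform scale (rotation omitted for brevity).

def world_position(group_pos, group_scale, local_pos):
    """World position of a child = group position + scaled local offset."""
    return tuple(g + group_scale * l for g, l in zip(group_pos, local_pos))

# A group at (10, 0, 5) scaled by 2 moves a child at local (1, 0, 0)
# to world position (12, 0, 5)
print(world_position((10, 0, 5), 2.0, (1, 0, 0)))
```

Moving, rotating or scaling the group therefore repositions every member at once, which is what makes grouped assets behave as a single new asset.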
Posing & Animation
The software provides functionality to apply imported animations, or to create animations using a variety of methods in VR.
- Steering animations can be applied to a rigged puppet, by directly moving the ground pivot to set a path.
- Rigid animations can be applied to any object or puppet, by directly moving the centre of mass pivot to set a path.
- Poses can be set on rigged puppets by directly manipulating keyframe handles, or selected from a predefined library of poses.
- Motion Capture can be recorded from data captured via any supported input device. Use on-set functionality to simply calibrate the user, then record and save human performances as animations.
- Upper body capture using the Vive Headset and Controllers/Index only
- Full body capture using the Vive Headset, Controllers/Index and Trackers on feet and hips, or the Rokoko SmartSuite Pro4.
- Facial capture using the iPhone X
Animations can be edited using any of the methods, as preferred.
Animations can be applied to rigged characters, assets, lights and cameras.
Animations can be created by the user or chosen from a predefined set of animations from a library.
All animations can be arranged on local scene or shot timelines.
The software provides an overview of all shots. Shot order can be changed and individual shots can be set active or inactive. New shots can be added by selecting or adding a camera to the scene. Shots can be defined by sketches or images.
Instantiate and interact with multiple virtual cameras in a scene for shot creation, set their position and rotation.
- Lines of sight can be tested via the 3D scene.
- 2D interface or gesture-driven VR control provides direct manipulation of the camera viewfinder and parameter modification.
- Internal parameters: aperture, focal length and depth of field.
- Lenses can be modified.
- Rigged cameras can be imported to support real physical camera setups, such as camera cranes.
- Shoulder and steady cams are supported.
- Exporting of physical camera information as a list.
- Plan shots by defining camera image previews as shots.
- Project pre-sketched shots into the scene for accurate shot set up.
The software supports camera animation over time using key frame animation and performance recording.
- Camera position and rotation
- Internal parameters
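Keyframe animation of these parameters amounts to interpolating between keys over time. A sketch using simple linear interpolation of a single keyframed value, such as one axis of the camera position (the key format is an assumption; the product's actual interpolation is not specified):

```python
import bisect

def interpolate(keys, t):
    """Linearly interpolate a keyframed scalar.

    keys: sorted list of (time, value) pairs; t: query time.
    Values are clamped outside the keyed range.
    """
    times = [k[0] for k in keys]
    if t <= times[0]:
        return keys[0][1]
    if t >= times[-1]:
        return keys[-1][1]
    i = bisect.bisect_right(times, t)
    t0, v0 = keys[i - 1]
    t1, v1 = keys[i]
    alpha = (t - t0) / (t1 - t0)
    return v0 + alpha * (v1 - v0)

# Camera x-position keyed at t=0s and t=2s
keys = [(0.0, 0.0), (2.0, 4.0)]
print(interpolate(keys, 1.0))  # halfway between the keys: 2.0
```

Performance recording is the complementary approach: instead of a few hand-set keys, the camera's pose is sampled continuously while the user moves it.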
To account for the real cameras used by camera operators later in production, a physical interface with tracked heavy cameras can be used during performance recordings.
The software provides a set of predefined lights that match real-life light sources; lights can also be imported from open light libraries.
- Position, rotation and scale.
- Brightness, opening angle and colour parameters can be set.
- Key-frame animate light movement.
Lighting presets like sunset, dawn, indoor and projections are supported.
Import, create and edit visual effects in a scene.
The software uses real-time physics simulation to create visual effects by setting and editing simulation parameters. It captures the path and timing of a human performance and translates them to simulation parameters.
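One way to translate a captured performance path into simulation parameters, in the spirit of the description above, is to estimate an initial velocity from the first recorded samples. This is a sketch under that assumption; the product's actual method is not specified:

```python
def initial_velocity(samples):
    """Estimate an initial velocity vector from (time, position) samples
    by finite differences over the first sample interval."""
    (t0, p0), (t1, p1) = samples[0], samples[1]
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))

# Two samples 0.1 s apart, moving 0.5 m along x
print(initial_velocity([(0.0, (0.0, 0.0, 0.0)), (0.1, (0.5, 0.0, 0.0))]))
```

The estimated velocity could then seed a physics simulation so that, for example, a thrown object follows the timing the performer demonstrated.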
The software provides intuitive GUIs (graphical user interfaces) on desktop and tablet, and motion-tracking NUIs (natural user interfaces) in VR, to present normally complex features simply. For example, the timeline is represented in both 2D and 3D, and physics simulations can be created for visual effects.
- Get around the set using NUI gestures to move forwards, backwards or sideways, up and down, or to scale yourself.
- Desktop – Currently Windows only. Support for Mac to follow post-Beta launch.
- Tablet – iOS to follow post-Beta launch. Other OS TBD.
- Virtual Reality – Currently HTC Vive Room Scale VR, including support for:
- Vive Headset
- Vive Controllers
- Vive Base Stations
- Vive Tracker
- Valve Index Controllers (Knuckles)
- Vive Deluxe Audio Strap
- Vive Wireless Adapter
- Rokoko SmartSuite Pro4
The software supports a number of ways to collaborate:
- Saving and sharing of projects within registered teams.
- Cloud collaboration allows for unlimited users to simultaneously work or observe and communicate in VR.
- Shared communications: attach text, sound or video notes to all objects, scenes and shots, with highlighting of new or changed notes.