Projection Mapping


Typically, a content creation house will generate media and a matching 3D model either from a 3D CAD package or by laser-scanning the physical object.

The media and 3D model are then transferred to a Delta Media Server configured in Mesh Mode, where the scene is maintained in full 3D and completed with camera viewpoints that match the physical projector positions relative to the real 3D object.

By configuring a virtual scene within Delta to match the physical scene, each projector “sees” the correct view of the object, placing media on the correct portion of the model and allowing multiple projectors to blend together to cover the whole 3D surface.
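In graphics terms, matching a virtual camera to a physical projector comes down to a view matrix built from the projector's position and a projection matrix built from its lens field of view. A minimal sketch of that idea, assuming a simple pinhole model (the function names and the 30-degree lens value are illustrative, not Delta's API):

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed view matrix: the virtual camera sits at the
    physical projector position and looks at the object."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

def perspective(fov_y_deg, aspect, near=0.1, far=100.0):
    """Projection matrix whose vertical field of view matches the
    projector's lens throw."""
    f = 1.0 / np.tan(np.radians(fov_y_deg) / 2.0)
    m = np.zeros((4, 4))
    m[0, 0] = f / aspect
    m[1, 1] = f
    m[2, 2] = (far + near) / (near - far)
    m[2, 3] = 2 * far * near / (near - far)
    m[3, 2] = -1.0
    return m

# Hypothetical projector 2 m back from the object, 16:10 chip, 30-degree lens:
vp = perspective(30.0, 16 / 10) @ look_at((0.0, 0.0, 2.0), (0.0, 0.0, 0.0))
```

With one such matrix per channel, each projector renders the model from exactly the viewpoint its physical counterpart occupies, which is what allows the blended channels to tile the 3D surface seamlessly.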

Semi-Auto 3D Calibration


3D Auto-Cal is built into Delta Mesh Mode as one of the key stages in the suite of alignment tools, helping to minimise the time required onsite.

Since the aim in Mesh Mode is to build a 3D virtual set that matches the real 3D scene, you could measure each projector's position, angle and lens characteristics and type them directly into Delta. This is a recognised technique and works well if the measurements are reasonably accurate.

However, time pressure and the realities of the onsite environment can make that accuracy difficult to achieve: the projectors may be physically unreachable, and the lens zoom and offset hard to measure in situ.

Delta’s 3D Auto-Cal is a mechanism whereby the operator places at least 6 markers on the virtual 3D model within Delta, then moves each marker projected on the real 3D object until it sits on the corresponding 3D point. Once this is achieved, Delta calculates the projector position and lens characteristics which match the 6 points and configures that channel automatically. Repeating this for each projector saves a lot of time compared to manually working out each projector position, and the resultant view can still be adjusted if required.
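Recovering a projector's pose and lens from marked point pairs is the classic camera-resectioning problem. Delta's actual solver is not documented here, but a minimal Direct Linear Transform (DLT) sketch shows why six 3D-to-2D correspondences are enough to determine the 3×4 projection matrix (11 unknowns up to scale, two equations per point):

```python
import numpy as np

def calibrate_dlt(world_pts, image_pts):
    """Direct Linear Transform: recover the 3x4 projection matrix that
    maps the 3D marker positions to their observed 2D positions.
    Needs at least 6 correspondences (12 equations for 11 unknowns)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        # Each correspondence contributes two linear equations in P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(A))
    return vt[-1].reshape(3, 4)   # null-space vector = flattened P

def project(P, pt3):
    """Apply the recovered projection to a 3D point."""
    x = P @ np.append(np.asarray(pt3, dtype=float), 1.0)
    return x[:2] / x[2]
```

The recovered matrix can then be decomposed into position, orientation and lens parameters, which is the per-channel configuration the calibration step produces automatically.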

Once the correct camera view from each projector is achieved, it is still possible that the geometric alignment is not perfect due to differences in the 3D virtual model compared to the real thing, and this is where Delta’s Mesh Editing is used.

Mesh Editing

The 3D model and media from the content creator typically consist of an OBJ format file containing the 3D mesh and UV texture coordinates, accompanied by a movie (TGA sequence preferred) or still imagery (any format).
Once imported to Delta in Mesh Mode, the 3D model is viewed from matching projector positions, but there may still be differences from the real 3D object to the virtual model.

Delta’s Mesh Editor allows the operator to view the 3D model from any viewpoint, select one or more vertices in the model and move them in X, Y or Z to better match the observed object. This alignment can be carried out across all projector channels at once (World coordinates) or per channel to achieve the correct view from each projector. Once the alignment is complete, simply saving the normal Delta show writes all adjustments to companion files alongside the original OBJ, leaving the original untouched. The editing is completely non-destructive, so you can reset single or multiple points at any time, even after saving.

Other features in the editor include the ability to hide single or multiple faces in the model, apply clipping planes for easier selection, alter the movement step size to match the model dimensions, and “bake” the complete adjusted model to a new OBJ file for round-trip editing with the content house.

Real-Time Tracking

Using third-party camera-based tracking systems, Delta can be configured to receive a real-time data stream of multiple tracked points placed on 3D points of the physical object.

This data is then used to move the corresponding 3D object inside Delta in sympathy with the physical model, providing a tracked match between the virtual and physical worlds. The tracking data can also be extended to projector positions, so that both the 3D object and the projectors can move within the scene while the content remains matched to the object.
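A sketch of the receiving side, assuming the tracking system streams one position plus unit quaternion per rigid body per frame (a common convention for such systems; Delta's actual protocol is not shown here):

```python
import numpy as np

def pose_to_matrix(pos, quat):
    """Turn one tracked rigid-body sample (position + unit quaternion,
    w-x-y-z order assumed) into a 4x4 transform for the virtual object."""
    w, x, y, z = quat
    rot = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    m = np.eye(4)
    m[:3, :3] = rot
    m[:3, 3] = pos
    return m

def track_object(vertices, pos, quat):
    """Move the virtual mesh in sympathy with the physical one by
    applying the latest tracked pose to every vertex."""
    m = pose_to_matrix(pos, quat)
    homo = np.hstack([np.asarray(vertices, dtype=float),
                      np.ones((len(vertices), 1))])
    return (homo @ m.T)[:, :3]
```

Re-applying this transform every frame keeps the projected content locked to the object as it moves; tracking the projectors simply adds the same kind of per-frame update to each channel's camera pose.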

Content placed on the object therefore stays on the object as it moves, opening creative possibilities for theatre productions and live events.

This technology also appears in museums, theme park rides and visitor attractions to entertain customers with unique and surprising audio visual experiences.