When it comes to science fiction, having shiny new technology kind of goes without saying. So when we set out to make Project Mammoth, we knew we couldn’t overlook the tech. Of course, with budget limitations, much of that tech would need to be created in post production.
Modern devices gave us a launching point for much of the tech used throughout the show and comic. For the “nodes” found above the left eyebrow and temple of all Domer characters, we drew inspiration from Google Glass, mixed with an almost cyborg-like neural interface that allows for hands-free interaction with most technology. From sliding doors and holographic displays to hover cars and hologames, the world of the Domers is one where it truly is “Mind over Matter.”
While a node-free interface is possible in the tech of Project Mammoth, and is even used on occasion to avoid leaving a “neural fingerprint”, the hands-free aspect gave us the freedom to build a highly interactive setting. That freedom grew increasingly necessary given the limitations of our budget and the fact that the show is shot entirely on green screen, with 3D elements making up over 90% of the props and furniture.
The challenge came in showing the viewer that a character is interacting with the tech. For this, we add a visual cue in post, in the form of flashing lights. While the nodes worn by the actors are physical props, the lights coming from them are added in post production. In scenes where a higher level of security is required for access, we show this by having multiple nodes flash.
Other physical props that require additions in post are the handheld holos used throughout the pilot. To create the visuals seen on the holos, numerous layers of animation with varying opacities and blend modes are composited in After Effects. Extensive planar tracking then locks the final composition to the prop in the shot, creating the handheld holoscreen.
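The actual layer stack lives in After Effects, but the underlying math of stacking animation layers with per-layer blend modes and opacities can be sketched in a few lines. This is a rough illustration in Python/NumPy, not AE’s internals; the function names and the two blend modes shown are our own choices:

```python
import numpy as np

def blend_screen(base, layer):
    """Screen blend: brightens the base, handy for glow/hologram layers."""
    return 1.0 - (1.0 - base) * (1.0 - layer)

def blend_add(base, layer):
    """Additive blend, clipped to the displayable range."""
    return np.clip(base + layer, 0.0, 1.0)

def composite(base, layers):
    """Stack (layer, blend_mode, opacity) tuples over a base image,
    bottom to top, the way an AE layer stack resolves."""
    out = base.astype(float)
    for layer, mode, opacity in layers:
        blended = mode(out, layer.astype(float))
        # Opacity is a straight mix between the untouched and blended result.
        out = out * (1.0 - opacity) + blended * opacity
    return out
```

For example, a half-bright glow layer screened over black at 50% opacity resolves to a uniform 0.25, which is the kind of faint underlayer the holo graphics are built from.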
Another physical prop that requires work in post to bring to life is the plasma rifle. While it’s one of the easier effects to add, without it the gunfire scenes would be pretty dull. Perhaps the most difficult part of the effect is giving perspective to the plasma itself. When a bolt fires side to side across the screen, it’s not much of a concern. But when the plasma fires from background to foreground, for example, the head and tail of the bolt need to be varied in size to account for the perspective change.
Fortunately, this is not too difficult to pull off in the plugin I use to create the plasma – Video Copilot’s new AE plugin Saber. Accounting for the perspective change, I change the width of the head and tail as it moves from the start to end point, giving the illusion of the shot moving closer to the camera, as pictured below.
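Saber handles this with its start and end size controls, but the geometry behind the trick is just pinhole projection: apparent width falls off as one over distance from the camera. A minimal sketch of how the head and tail widths could be computed along the bolt’s path (the focal constant, the 20% tail lag, and all the numbers here are assumptions for illustration):

```python
def screen_width(world_width, depth, focal=1.0):
    """Pinhole projection: on-screen width shrinks as 1/depth."""
    return focal * world_width / depth

def bolt_widths(world_width, depth_start, depth_end, t):
    """Widths for a bolt whose head is at parameter t (0..1) along the path,
    with the tail trailing 20% of the path behind (an assumed lag)."""
    tail_t = max(t - 0.2, 0.0)
    head_depth = depth_start + (depth_end - depth_start) * t
    tail_depth = depth_start + (depth_end - depth_start) * tail_t
    return (screen_width(world_width, head_depth),
            screen_width(world_width, tail_depth))
```

A bolt fired from ten units out toward a point two units from the camera ends up with a head noticeably wider than its tail, which is exactly the cue that sells the shot as moving toward the viewer.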
In the end, it’s the blending of the physical props with the CG elements created in post that ultimately brings the technology to life.