Our object tracking software integration allows users to reach new levels of freedom while interacting with virtual elements on set
OSLO, NORWAY — 23 July 2020: The Future Group, creators of live photo-realistic virtual production system Pixotope®, today unveils its latest Version 1.3 software featuring a wide range of advances that significantly improve how virtual environments interact with real-world elements.
Pixotope enables the production of mixed-reality (MR) content by bringing together physical components such as presenters, actors, props and free-moving cameras, with virtually created assets such as scenes, graphics, animated characters, or any other computer-generated elements. Pixotope forms the central production hub when creating mixed-reality content for broadcast and live events, with Version 1.3 offering new object tracking, powerful lighting integration, enhanced colour management and more.
The Future Group’s Chief Creative Officer Øystein Larsen explains, “The success of a mixed reality scene depends upon the relationship and interactivity between real and virtual components. Part of this success depends on technical accuracy, such as matching lighting, replicating freely moving cameras and having seamless keying. But there is also an emotional aspect which flows from enabling presenters and actors to freely express themselves through unrestricted movement and interacting with virtual objects as they would real objects. Version 1.3 of Pixotope provides large gains in both these areas.”
A major advance in Pixotope Version 1.3 is the ability to easily integrate data from real-time object tracking systems. Pixotope can read the position of moving tracking locators in the real-world environment and attach digitally created objects to them, so that those objects follow the tracked motion. This in turn enables presenters to freely pick up and rotate graphics or any other virtually generated asset, opening limitless creative possibilities. From showing a 3D model in the palm of their hand, to controlling any aspect of a virtual scene with their own physical movement, presenters and actors become free to interact with the virtual world around them.
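The attachment mechanism described above can be pictured as a per-frame copy of the tracked locator's transform onto the virtual asset, plus a fixed offset. The following is a minimal sketch of that idea; the `Transform` and `attach` names are illustrative assumptions, not Pixotope's actual API:

```python
from dataclasses import dataclass

@dataclass
class Transform:
    """Position (x, y, z) and yaw rotation of one object in scene space."""
    x: float
    y: float
    z: float
    yaw: float

def attach(tracker: Transform, offset: Transform) -> Transform:
    """Make a virtual object follow a tracked locator: copy the locator's
    transform each frame and apply a fixed offset (e.g. so a 3D model
    floats a few centimetres above the presenter's palm)."""
    return Transform(
        tracker.x + offset.x,
        tracker.y + offset.y,
        tracker.z + offset.z,
        tracker.yaw + offset.yaw,
    )

# Each frame, the virtual object simply inherits the tracked motion:
palm = Transform(1.2, 0.9, 0.4, 30.0)               # locator sample this frame
hologram = attach(palm, Transform(0, 0.05, 0, 0))   # hover 5 cm above the palm
```

Because the offset is applied in the locator's frame of reference, rotating the tracker rotates the attached graphic with it, which is what lets a presenter turn a virtual object in their hand.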
Another benefit of Object Tracking is that presenters themselves can be tracked, so that Pixotope “knows” where they are in the scene. A challenge with conventional virtual studios is that presenters must be mindful of where they stand and when: they cannot walk in front of graphics that have been composited over the frame. With a presenter’s position and orientation available through Pixotope’s Object Tracking interface, they are free to walk in front of, behind, or even through virtual objects, because Pixotope knows where the presenter is with respect to every other generated item in three-dimensional space.
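Conceptually, this occlusion handling comes down to a per-pixel depth comparison between the tracked presenter and the virtual object. The sketch below shows the principle only, under the assumption of a single depth value per pixel; it is not Pixotope's renderer:

```python
def composite_pixel(camera_rgb, virtual_rgb, presenter_depth, virtual_depth):
    """Depth-aware composite for one pixel: draw the virtual object only
    where it is nearer the camera than the tracked presenter; otherwise
    the live camera pixel (the presenter) wins and occludes the graphic."""
    return virtual_rgb if virtual_depth < presenter_depth else camera_rgb

# Presenter 2 m from camera, virtual object 3 m away:
# the presenter is nearer, so they appear in front of the graphic.
composite_pixel((200, 180, 160), (0, 0, 255),
                presenter_depth=2.0, virtual_depth=3.0)
```

Because the comparison is done per pixel, a presenter can be in front of one virtual object and behind another in the same frame.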
Also new in Pixotope Version 1.3 is the ability to control physical lights using DMX512 over the Art-Net distribution protocol. This enables Pixotope to synchronise any DMX-controllable feature of physical studio lights with the digital lights used to illuminate virtual scenes. Lights can then be driven either via pre-set animation, or by using the new Slider widget available for user-created Pixotope control panels. Such panels can be accessed via a web browser on any authorised device and operated either by a technician or a presenter.
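Art-Net carries DMX512 channel levels as UDP datagrams, conventionally on port 6454. The sketch below builds a standard ArtDMX packet from scratch to show what travels on the wire; the node IP address is an example, and this is an illustration of the protocol rather than of Pixotope's own implementation:

```python
import socket
import struct

def artdmx_packet(universe: int, channels: bytes) -> bytes:
    """Build an ArtDMX (OpDmx) packet carrying up to 512 DMX512
    channel levels for one universe (port-address)."""
    if not 2 <= len(channels) <= 512 or len(channels) % 2:
        raise ValueError("DMX payload must be an even length of 2-512 bytes")
    return (
        b"Art-Net\x00"                      # fixed 8-byte packet ID
        + struct.pack("<H", 0x5000)         # OpCode: OpDmx, little-endian
        + struct.pack(">H", 14)             # protocol revision, big-endian
        + bytes([0, 0])                     # sequence (0 = disabled), physical port
        + struct.pack("<H", universe)       # 15-bit port-address, little-endian
        + struct.pack(">H", len(channels))  # data length, big-endian
        + channels                          # the DMX channel levels themselves
    )

# Fade channel 1 of universe 0 to full, e.g. driven by a control-panel slider:
packet = artdmx_packet(0, bytes([255, 0]))
try:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, ("192.168.0.50", 6454))  # example node address
except OSError:
    pass  # no Art-Net node reachable in this sketch
```

Any fixture parameter a lighting desk can address via DMX (intensity, colour, pan/tilt) is just another channel byte in this payload, which is what lets the virtual scene's lighting state drive the physical rig.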
Pixotope Version 1.3 also further improves the results of chroma keying (such as for green screen studios) with new features to help extract greater detail, like fine strands of hair and shadows, as well as new algorithms to process key edges to sub-pixel accuracy, improve colour picking and automate the reduction of background screen colour spill.
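Pixotope's keying algorithms are proprietary, but the spill-reduction problem it addresses is easy to illustrate with one classical despill heuristic: green bounce light shows up as a green channel that exceeds both red and blue, so it can be clamped to the larger of the two. This is a textbook sketch, not Pixotope's method:

```python
def suppress_green_spill(r: int, g: int, b: int) -> tuple[int, int, int]:
    """Classical green despill for one pixel: clamp the green channel
    to max(red, blue), removing the green cast that screen bounce
    light adds to foreground subjects."""
    return r, min(g, max(r, b)), b

# A skin-tone pixel contaminated by green bounce light:
suppress_green_spill(120, 200, 110)  # -> (120, 120, 110)
```

Neutral pixels, where green does not dominate, pass through unchanged, so the correction only touches spill-affected areas.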
Colour Management has been extended to Pixotope’s Editor Viewport to ensure that artists working in any practical colour space, including HDR, can have complete confidence in the colour fidelity of the images they are creating.
Pixotope is natively integrated with Unreal Engine, and in Pixotope 1.3 all the latest features of Unreal Engine 4.24 are available to users. Benefits include layer-based terrain workflows for the creation of adaptable landscapes, dynamic physically accurate skies that can be linked to actual time of day, improved rendition of character hair and fur, as well as more efficient global illumination that helps to create photo-real imagery.
The Future Group’s CEO Marcus Blom Brodersen added, “The advances within Pixotope Version 1.3 deliver another step-change for producers of mixed-reality content. The extremely high-quality images Pixotope produces, together with the creative and physical freedom it allows those both in front of and behind the camera, enable our customers to make ever-more exciting and attention-grabbing productions.”
About The Future Group
The Future Group is an international award-winning software company dedicated to developing the next generation of visual storytelling tools and experiences. The company is headquartered in Oslo, Norway, with regional offices in Croatia, Spain, the UK, and the USA. The company’s flagship product, Pixotope®, enables content creators and broadcasters to produce best-in-class mixed-reality content for television, online streaming, and film. Technology from The Future Group has powered mixed-reality experiences at events such as the Super Bowl, the Eurovision Song Contest, League of Legends Championship broadcasts and The Weather Channel mixed-reality experiences.
For more information, visit: www.futureuniverse.com