Solutions providers Avid, Ross and Zero Density explain how gaming software Unreal Engine has redefined studio broadcasts.
From having a fleet of sports cars in the studio to placing the presenter at the heart of a raging tornado, studios are not restricted by space or geography.
The video game Unreal was released by Epic Games to rave reviews in 1998. Reviewers described the first-person shooter as "reinvigorating a tired genre". Debuting along with it was Unreal Engine 1, which in the words of Epic Games founder Tim Sweeney was to "build up a base of code that could be extended and improved through many generations of games".
Today, its fourth iteration is reinvigorating sports broadcasts, weather forecasts and election analyses.
Avid added support for the Epic Unreal Engine in September 2017 and announced it for the first time at IBC2017, reveals Ray Thompson, Director of Broadcast and Media Solutions Marketing at Avid, explaining that it was adopted to enhance the photorealism of rendered graphics.
"Designed originally for use by game companies to render graphics in real time, the Unreal Engine is now being used to render virtual sets and augmented reality with the same level of realism, which gives broadcasters a tremendous amount of flexibility when branding and changing the look and feel of a broadcast," he says.
Canadian firm Ross Video has been active in virtual studio and AR technologies for a number of years. Director of Virtual Solutions Gideon Ferber recalls: "We became aware that a couple of the gaming engine companies were making their engines available to third-party developers, and we could see a nice synergy here: a chance to bring the intense realism of gaming graphics into the world of broadcast. The desire to pursue that goal led us to work with Future Group [the Norwegian company that created the world's first mixed-reality game show, Lost in Time] and the development of the Frontier product. Frontier was launched at NAB 2017 and we just announced version 2.0 at IBC in September this year."
Describing the early forays into the virtual studio as less than pleasing, Ferber elaborates: "Historically, it was a challenge to make virtual studios look realistic. Certain materials, wood and metal for example, were difficult to render, and viewers wanted to see realistic shadows on the floor when objects or people moved around in the studio. Previous technologies did cause some surfaces and textures to look too shiny or plasticky."
"Incorporating Unreal into Frontier has given us photo-realistic results that are very natural, and that's a big step forward. We can now render graphics in a much more believable manner and create incredibly lifelike effects. In one of our demos, we show rain hitting glass windows outside a virtual studio, and it's really amazing to watch the droplets run, merge together and separate again."
One of the early adopters of Unreal Engine was Turkish firm Zero Density. "We incorporated Unreal Engine technology first in 2014, following a vigorous search for the right game engine that fits our vision and product," says Aydemir Sahin, VP of Product and Support.
Building on the photo-realistic rendering capabilities of Unreal Engine, Zero Density added built-in keying technology and compositing tools; broadcasters such as Fox Sports, Ziggo Sport and Canadian broadcaster TFO have adopted the solution for live production.
Conceding that almost every company offering virtual studio solutions in this industry has now integrated Unreal into its products, Ferber of Ross Video highlights the aspects that differentiate their solution: "At Ross, we believe that where the engines are all the same, configuration and control become critically important, and that's where we really add value. We have a product for virtual studio control called UX that's highly flexible and customisable, with an easy-to-use, intuitive touchscreen graphical user interface. It comes installed on a touchscreen PC and can handle camera calibration, scene manipulation, media replacement, event triggering, animation control and robotic camera movement control."
Highlighting the uniqueness Avid brings, Thompson says: "Avid runs Unreal on its Maestro | Engine hardware (as well as our legacy HDVG hardware), with the ability to run the Maestro | Render Engine at the same time as the Epic Unreal Engine, which means you can deliver a virtual set and data-driven AR graphics simultaneously, in real time. It's unique to the Avid implementation of Unreal."
Underscoring early adoption, Sahin of Zero Density says: "Zero Density's Reality Engine has been on air since 2016. We have developed a real-time node-based compositor on top of Unreal Engine, which enables post-production-style visual effects in the domain of live video production. The main differentiation is that our solution is not an integration with legacy systems, which is the most popular way of using UE in the industry. We provide an Unreal Engine-native solution and have developed a built-in keying technology called Reality Keyer, which is the world's first real-time image-based keyer."
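To illustrate the general idea behind an image-based keyer (as opposed to a single-colour chroma key), here is a minimal per-pixel sketch: the matte is derived from each pixel's distance to a pre-captured "clean plate" of the empty studio screen, then used to blend the live frame over the rendered virtual background. The function names and thresholds are illustrative assumptions, not Zero Density's actual Reality Keyer implementation.

```python
import math

def key_alpha(pixel, plate_pixel, lo=10.0, hi=60.0):
    """Alpha for one RGB pixel: 0.0 near the clean-plate colour (background),
    1.0 far from it (foreground talent), with a soft ramp between the
    lo/hi colour-distance thresholds for smooth edges."""
    diff = math.dist(pixel, plate_pixel)  # Euclidean distance in RGB space
    return min(max((diff - lo) / (hi - lo), 0.0), 1.0)

def composite_pixel(fg, bg, alpha):
    """Blend a live-frame pixel over a rendered-background pixel by the matte."""
    return tuple(round(f * alpha + b * (1.0 - alpha)) for f, b in zip(fg, bg))

# A skin-toned pixel keys as foreground against a green clean plate;
# an unchanged green-screen pixel keys as background.
talent = key_alpha((200, 120, 100), (0, 200, 0))   # 1.0 (keep live pixel)
screen = key_alpha((0, 200, 0), (0, 200, 0))       # 0.0 (show virtual set)
```

A production keyer does this per pixel on the GPU and adds spill suppression and temporal filtering, but the clean-plate comparison is what makes the key "image-based" rather than tied to one global key colour.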
One reason Zero Density opted for UE and not rival game engine Unity is related to the path it adopted. Sahin explains: "They are both great game engines. UE was a better fit for us because source code access was fundamental for our development, as we needed to change the core of the engine code. Moreover, Unreal Editor is the best editing tool, allowing users to utilise any compatible content as a new virtual environment with the full feature set of the Reality Engine, in only a matter of minutes."
When solutions providers integrate graphics engines into broadcast systems, it is a collaborative process to ensure that aspects such as camera trackers work with the systems in place. We ask solutions providers how simple it is for the crew to use their systems.
Sahin of Zero Density clarifies that there are multiple user profiles. "As we are Unreal Engine-native solution providers, content developers must be knowledgeable on how to utilise Unreal Engine. On the other hand, the end user (TV channels, for instance) does not need specific UE skills to execute productions." On the need for training, Thompson of Avid says: "Users need to learn how to use Maestro | Designer in order to author the graphics, or work with a graphics production company to produce them. In either case, once the elements are created, users can run Unreal Engine on the same hardware platform (Maestro | Engine) as the Maestro render engine, which supports video and graphics elements all delivered in real time."
While reiterating that the design team needs to learn Unreal, Ferber of Ross highlights the many user-friendly options for operators: "From an operator perspective, Frontier (which obviously includes Unreal) can be controlled by something called UX that can manage every element of preproduction and calibration, as well as on-air production. That said, it's worth mentioning that users can also manage on-air production using an open-platform control solution we offer called DashBoard. When broadcasters decide to move to a virtual studio set-up from a physical studio, we're obviously on hand to help with training and commissioning, and we'll very often help run the first few productions to make sure customers are happy and comfortable with the system."
On future trends, Thompson of Avid highlights cloud and 5G as drivers: "Cloud-based end-to-end workflows can take advantage of enhanced rendering techniques to enable news, sports, weather and live production in a single- or multi-cloud environment. As 5G becomes more widespread and devices that support 5G become more widely available, the increasing use of AR in news, sports, weather and live entertainment will immerse viewers in any environment, to provide a new level of storytelling not yet realised on traditional OTA and even digital platforms."
Urging the broadcast industry to ride the wave of innovation, Sahin of Zero Density says: "There is enough evidence in the industry that game engines have started replacing graphics-related engines. The vast potential of utilising this ecosystem of game engines will revolutionise broadcasting."
While he sees technology becoming more powerful and graphics even better, Ferber of Ross sounds a cautionary note: "Good content has to serve the consumer. I think well-designed virtual sets, AR elements and onscreen graphics can be powerful complements to the storytelling, but they shouldn't dominate or distract from the core content, and they rarely work well if the core content is weak. Complexity for the sake of it is never helpful."