BroadcastPro ME asks industry professionals how they foresee AR changing the face of broadcasting in various scenarios, and how their own solutions can enhance the viewing experience.
Technical Solutions Director, SmartStage
Traditional AR can add value wherever content needs to be brought to life in a more dynamic, detailed or immersive way. AR allows us to build digital scenery and props that we might not be able to build physically – either financially or practically. These AR elements can change dynamically – driven by external data or interactivity – which makes them ideal for visualising sports, weather, financial or other data in a more engaging way alongside the presenters, talent or guests.
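The idea of an AR element "driven by external data" can be sketched in a few lines. The class and method names below are hypothetical, not any vendor's API: an on-screen graphic is bound to a template, and each new value from the data feed simply re-renders it.

```python
class ARElement:
    """A virtual scene element whose displayed text is driven by external data.

    Hypothetical sketch: a real AR system would push the rendered result
    into the graphics engine each frame; here we just return the string.
    """

    def __init__(self, name, template):
        self.name = name
        self.template = template  # e.g. "{home} {home_goals} - {away_goals} {away}"
        self.rendered = ""

    def update(self, **data):
        # Re-render the element with the latest values from the feed.
        self.rendered = self.template.format(**data)
        return self.rendered


scoreboard = ARElement("scoreboard", "{home} {home_goals} - {away_goals} {away}")
print(scoreboard.update(home="Lions", away="Hawks", home_goals=2, away_goals=1))
# → Lions 2 - 1 Hawks
```

The same binding pattern applies whether the feed is a sports data service, a weather API or a financial ticker – the AR element itself stays unchanged while the data drives it.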
What’s more exciting for us is when AR is utilised as part of a full xR/virtual production and can generate a virtual set extension as well as foreground props. This allows us to create infinite digital environments and place the presenters and guests inside those environments – allowing them to see and interact with their surroundings and take storytelling or subject explanation to another level.
We consult, design and supply turnkey AR, xR and studio set technology solutions for a range of markets including broadcast, education, and corporate. We work with broadcasters to understand their studio, show or OB requirements and build technical solutions for either a physical set, AR overlay, hybrid or fully xR environment. We specialise in media servers, virtual production, xR and tracking systems and help clients to get the most out of these technologies. Our SmartStage solution is a complete turnkey xR solution that uses LED video walls to replace the traditional green screen element of a virtual set and extends the canvas with a unique virtual set extension – immersing the presenter in the content and allowing real interaction and engagement with their surroundings.
There are myriad AR and graphics systems out there – some of which bolt on to existing broadcast workflows and some of which provide a unified toolset – but all are limited by the creativity and capability of the content production team. The big challenge is that different systems may require different skill sets or knowledge of different real-time rendering engines and workflows, and it is often hard for broadcasters to train and retain staff with sufficient levels of these skills to get the most out of their AR system. Building multi-camera systems and dealing with the challenges around latency can also be troublesome for broadcasters. Using AR also means introducing camera tracking systems, which bring their own challenges – particularly around lens calibration and maintenance.
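To make the lens calibration challenge concrete, here is a minimal sketch of the radial (Brown–Conrady) distortion model that calibration typically estimates. The coefficient values are illustrative assumptions; production systems estimate these per lens and per zoom/focus position, which is what makes calibration and maintenance laborious.

```python
def distort(x, y, k1, k2):
    """Apply radial (Brown-Conrady) distortion to normalised image coords.

    k1, k2 are the radial distortion coefficients a calibration
    process would estimate for a given lens state.
    """
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor


def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the distortion by fixed-point iteration (no closed form exists)."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y


# Round trip with illustrative coefficients (mild barrel distortion).
xd, yd = distort(0.3, 0.2, k1=-0.1, k2=0.01)
x, y = undistort(xd, yd, k1=-0.1, k2=0.01)
```

If the estimated coefficients drift (lens swap, temperature, wear), rendered AR graphics visibly slide against the real image – hence the ongoing maintenance burden the paragraph above describes.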
As requirements for 4K and even 8K broadcasts evolve, AR systems need to keep up, which means more graphics render power, increased costs and more complex signal management. Systems are therefore constantly evolving, so broadcasters need to adapt, continue to learn and invest, and may find it hard to get ROI on any purchased system.
What helps broadcasters through these challenges are systems that automate much of the calibration process, are render-engine agnostic and can scale to suit future standards or resolutions – though we acknowledge that choosing between systems can be a minefield for broadcasters.
I firmly believe that AR will revolutionise the way we consume and interact with content and that, at some point, consumer wearables will be so discreet and high quality that we’ll rely on AR/virtual displays around us as much as we do our laptops, desktop monitors or home televisions. AR will become synonymous with content consumption, and broadcasters will need to do more with AR to keep consumers engaged – both on their programme output and as an OTT/second-screen experience.
xR and virtual production will become cheaper as the technology matures and becomes more mainstream, meaning we’ll see more content producers embracing that workflow and driving it forward. Camera, presenter and prop tracking will get easier with more automated calibration and cheaper devices.
Perhaps the biggest change we’ll see is in graphics render power and how that power is scaled/deployed. We are eagerly awaiting the arrival of Unreal 5 and the next generation of even more powerful GPUs, which will see the next jump in content realism and resolution.
We’re starting to see systems that allow this render power to scale across machines and I think over the next few years, we’ll see this develop exponentially. It will allow us to create more believable and interactive AR experiences for the consumer powered by distributed/cloud rendering or locally through the consumers’ own devices.
Commercial Director, Mo-Sys
AR graphics are already used extensively in news, sports and weather storytelling. Children’s TV and mainstream drama – content that can be sold multiple times over to other broadcasters – is probably where AR graphics provide the greatest return, enabling content that would otherwise be impossible, or cost-prohibitive, to make.
Mo-Sys manufactures StarTracker, a precision camera and lens tracking system, which is used to blend AR graphics with the real world. The system is also used for virtual studios, mixed reality, and extended reality. Mo-Sys also manufactures an Unreal Engine plugin called VP Pro, which enables all types of virtual production, and a range of encoded remote camera heads for wider, more varied AR shots.
AR graphics primarily require all the cameras used to be tracked, and all lenses on those cameras to have been profiled. Once this is done, one can choose which virtual studio software to use to synchronise and composite the AR graphics with the real world: either a traditional virtual studio software package with automation and playout triggering, or, where this isn’t required, an Unreal Engine plugin.
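The core of this workflow – using tracked camera data to place a virtual element so it lines up with the real world – comes down to projecting a 3D anchor point through the camera model. The sketch below is a deliberately simplified pinhole projection with an identity camera rotation (real tracking data includes full orientation per frame, and the focal length comes from the lens profile); all values here are illustrative assumptions.

```python
def project(point_world, cam_pos, f, cx, cy):
    """Project a 3D world point to pixel coordinates.

    Simplified pinhole model: the camera sits at cam_pos looking down +Z
    with identity rotation. f is the focal length in pixels (from the
    lens profile); (cx, cy) is the principal point. A real tracking
    system also supplies the camera's rotation each frame.
    """
    # Transform the world point into camera coordinates.
    X = point_world[0] - cam_pos[0]
    Y = point_world[1] - cam_pos[1]
    Z = point_world[2] - cam_pos[2]
    if Z <= 0:
        return None  # anchor is behind the camera; don't draw it
    # Pinhole projection.
    u = cx + f * X / Z
    v = cy + f * Y / Z
    return (u, v)


# A virtual prop anchored 5 m in front of a camera at the origin,
# rendered for a 1920x1080 frame (principal point at centre).
pixel = project((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), f=1000.0, cx=960.0, cy=540.0)
# → (960.0, 540.0): the prop sits exactly at frame centre
```

Every frame, the tracking system updates `cam_pos` (and the rotation omitted here), and the renderer re-projects the AR graphics – which is why accurate tracking and lens profiling come before any choice of graphics software.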
The biggest decision to make is whether the graphics operations team should be experts in Unreal, or experts in a traditional broadcast virtual studio software. This will determine the type of software that can be used to deliver the AR graphics. Choosing the camera, lens tracking system, and the camera grip comes after.
In terms of where AR is headed, greater photo-realism using technologies such as ray tracing is the obvious one. We will also begin to see more photo-realistic avatars, human or otherwise, driven by actors in motion capture suits with facial expression headsets, interacting with real actors.
The aim of broadcasters deploying AR is to create highly immersive content that’s visually appealing, which is ‘sticky’ in terms of viewer numbers and viewer duration, whilst also providing differentiation from the competition. The longer-term goal is for broadcasters to use AR to create increasingly sophisticated photo-realistic content that wouldn’t otherwise be possible.