Making the internet an integral part of an efficient production workflow is challenging but can be done, says Trevor Francis
The internet is revolutionising the way we consume TV and other entertainment. We can now watch or hear almost anything, anytime we want, wherever we are. We're not even confined to our homes and offices; we can watch our choice of TV on hand-held devices even when we're on the move. Advanced compression techniques, cheap storage, colossal bandwidth and some amazing consumer gadgets place the whole world, literally, in the palms of our hands.
That's the output, or publishing, side of the broadcast industry. They've been forced to adapt to the new order, of course; the only alternative was a slow death as the funding followed the consumers to their computers, phones and hand-held players. So, with revenues squeezed, the broadcasters have been looking to the web to bring some benefits to the creators of content, not just the consumers. Could some of this technology cut the costs of producing television and offset the reduction in income? Yes, it can. But there are some considerable problems to solve.
The trouble for broadcasters is that they're being pulled in two directions at once. Even before they started fighting for survival on the net, they'd opened another front: High Definition (HD). And from the success of HD, another challenge is emerging: stereoscopic 3D.
HD and 3D are drawing us back to our TV screens with sparkling, immersive pictures and high-quality surround sound. This may be great news, but, even with the latest compression techniques, there's a lot of data to capture, store and move.
The movement of data, especially in the quantities demanded by today's high-quality productions, is a problem. It is often time-consuming and expensive, doubly so if the producers, editors, colourists and other professionals are kept waiting while files transfer from one place to another.
There's a clue to a better solution in the experience, usually unnoticed, of the internet viewer. He or she opens a browser, finds the TV show or the clip they want to see, then hits play. Where is the data coming from? The viewer probably doesn't care and really doesn't need to know. It would take some internet detective work to discover exactly where the data was originating for that particular stream.
Ultimately, it started on a server run by the content publisher, and that could be anywhere in the world. The consumption and storage of data need not be co-located, so if it works for TV viewers, why not for producers too? Well, they need to see and hear the original material and then store and review their work as they edit, composite, mix, correct and finish their piece. This is two-way movement of data with constant, random access: two major problems which are difficult enough when the storage is close, and magnified massively when the internet is the conduit to distant storage.
Solutions to these problems are now emerging based on a unique and complex synergy of identity, file transformation and a virtual filing system. Working together, these techniques will support fully featured TV postproduction for SD, HD or 3D wherever the content is stored and wherever the operator is working, connected only by the internet.
The first part of the solution is to generate a universal mechanism to describe content at the most granular level. This can be achieved by applying a property called Identity, and doing so at the level of video frames and audio samples. We are all very familiar with Identity at the human level: our parents give us names when we're born, and these define us within our families and schools, but they're not fit as a unique definition of each of us in the wider context of the whole human population.
Mobile telephone numbers, however, have achieved this: we can provide everyone with a unique reference which avoids any confusion. If we do the same for media files, we can identify and track the movement and usage of media with extreme accuracy. A convenient mechanism is to use the international standard for the generation and registration of UUIDs (Universally Unique Identifiers) for each media file, plus an added offset count for each frame or audio sample contained.
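As a minimal sketch of that mechanism, the following Python fragment mints a standard version-4 UUID for a media file and combines it with a frame offset to identify a single frame. The `<UUID>:<offset>` string form and the function names are illustrative, not part of any published scheme:

```python
import uuid

def media_uuid() -> uuid.UUID:
    """Mint a random (version-4) UUID for a newly ingested media file."""
    return uuid.uuid4()

def frame_identity(file_id: uuid.UUID, frame_offset: int) -> str:
    """Identify one video frame (or audio sample) as <file UUID>:<offset>."""
    if frame_offset < 0:
        raise ValueError("offset must be non-negative")
    return f"{file_id}:{frame_offset}"

clip = media_uuid()
first = frame_identity(clip, 0)      # the file's first frame
minute = frame_identity(clip, 1499)  # one minute in, at 25 fps
```

Because the UUID is globally unique and the offset is exact, two facilities can refer to precisely the same frame without ever exchanging the media itself.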
A second ingredient in this solution is a virtual filing system. This sits above the more familiar physical filing system, which stores and manages individual files on a disk. For each media file, we may require more than one form, suited to the purposes of editing, shared access in a workplace or viewing at a distance. We often use proxies to facilitate some of these forms of access. A virtual filing system, together with an identity scheme, is able to present each piece of media according to its description, its metadata, while concealing the various formats of the file itself. So, I may locate a file by searching across its various descriptors, but it will be delivered in the format required either by the task I'm performing or by my location: for example, full-quality HD if I'm colour-correcting in a production suite, a 5 Mb/s H.264 proxy at a producer's desk, or a 300 kb/s QuickTime file if I'm reviewing some work on a mobile phone.
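A toy sketch of that idea, assuming nothing beyond the article's description: users search by metadata descriptors, and the system picks the rendition from the caller's context. The class, the context names and the rendition table are all hypothetical:

```python
# Illustrative mapping from a working context to the delivery format the
# article describes; a real system would derive this from task and location.
RENDITIONS = {
    "grading-suite": "full-quality HD master",
    "producer-desk": "5 Mb/s H.264 proxy",
    "mobile-review": "300 kb/s QuickTime proxy",
}

class VirtualFileSystem:
    """Presents media by its metadata, concealing the physical formats."""

    def __init__(self):
        self.catalogue = {}  # identity -> metadata dict

    def register(self, identity, **metadata):
        self.catalogue[identity] = metadata

    def search(self, **query):
        """Return identities whose metadata matches every query descriptor."""
        return [i for i, m in self.catalogue.items()
                if all(m.get(k) == v for k, v in query.items())]

    def resolve(self, identity, context):
        """Deliver the rendition suited to the caller's task or location."""
        return identity, RENDITIONS[context]
```

The point of the design is that the caller only ever names the identity and a context; which physical file (or freshly made proxy) satisfies the request is the filing system's business.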
If we are to provide workflows across the internet, with its unpredictable bandwidth and latency, we need some more help. Adaptive streaming technologies, such as Microsoft's Smooth Streaming, can do this for us. These require access to multiple file formats in ascending qualities, with the choice determined by the instantaneous performance of the connection. It is possible, of course, to create, store and manage many copies of each original media file, but this is expensive and builds a brittle system: it may be difficult to extend the range of file sizes or to adopt a new form of coding without a complete rebuild. A more intelligent approach is to build a live transformation engine which can render any file type, size or wrapper on demand.
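The two halves of that argument can be sketched together: a rung picker that chooses the highest quality the measured connection can sustain, and an engine that renders a rendition only when first asked for it, rather than pre-building every copy. The bitrate ladder, the headroom factor and the string stand-in for a transcode are all illustrative assumptions:

```python
LADDER_KBPS = [300, 1200, 2500, 5000, 8000]  # ascending qualities

def pick_rung(measured_kbps: float, headroom: float = 0.8) -> int:
    """Highest ladder rung that fits within a safety margin of bandwidth."""
    budget = measured_kbps * headroom
    fitting = [r for r in LADDER_KBPS if r <= budget]
    return fitting[-1] if fitting else LADDER_KBPS[0]  # floor at lowest rung

class TransformationEngine:
    """Render a requested rendition on demand and cache it, instead of
    storing and managing every copy of every original up front."""

    def __init__(self):
        self.cache = {}

    def render(self, source_id: str, kbps: int) -> str:
        key = (source_id, kbps)
        if key not in self.cache:
            # Stand-in for a real transcode of source_id at kbps.
            self.cache[key] = f"{source_id}@{kbps}kbps"
        return self.cache[key]
```

Adding a new rung or a new codec then means extending the ladder or the render step, not rebuilding a library of pre-made files.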
To summarise, we've described a solution in which the geographies of content and user have become irrelevant. We can access content using standard internet connections, wherever we are. We have a scheme to create the required files on the fly, and a virtual filing system to manage media under an identity scheme, irrespective of the file type in use. This allows us to track media usage and, critically, to build frame-accurate edits ready for output, from any location. We have a virtuous circle that uses identity to recognise which sections of content we've changed and which we haven't. In many applications where remote access is desirable (news gathering and major sports events, for example), editing may be as simple as shot selection or even metadata editing (logging). In all of these, the identity scheme allows us to complete the tasks without ever moving the original content at all. With connections that may often be weak or fragile, the best solution is the one that imposes the smallest possible load while offering the fullest potential for adding value. We can empower people to work where they are, without forcing them to move to where the content is stored, and make the internet a practical part of an efficient production process.
Trevor Francis is worldwide marketing manager for Quantel.