There’s no need to explain why demand for live streaming and remote production has grown so dramatically over the past 12 months. The former is relatively straightforward, even for the most technically challenged of us: at its most basic, anyone can be live streaming from their smartphone to the wider world within a few seconds. At the other end of the scale, incredibly powerful 4K, 10-bit HDR, HEVC live streaming encoders such as the Teradek Prism deliver the highest feasible quality for high-end, mission-critical productions. In short, there’s a way to live stream to suit every level of user and budget. But what about the latter? How can you run a live, multi-presenter production where participants are spread across different locations, even continents, all remotely? And how can you do that and still make it look professional, engaging, and dynamic?
This solution is built around Epiphan’s Pearl production systems. If you’re not familiar with them, they are essentially all-in-one production units that accept multiple input sources, switch between them (model dependent), record, and live stream. There are three models in the lineup, in descending order of capability (and cost): the Pearl 2, Pearl Mini, and Pearl Nano.
The Pearl 2 is at the heart of this setup, as it’s the most powerful unit of the three: it supports up to six full 1080p HD inputs, plus both NDI and SRT (find out a little more on SRT here), which is going to be key to this production. The Pearl 2 acts as our main switcher, so we’ll feed all of our video sources into it, use it to switch between sources, live stream, and record to its internal storage. A big selling point of the Pearl systems is that not only can they be fully controlled over the local network from a web browser, but with the addition of Epiphan’s Cloud platform they can also be fully controlled remotely over the internet. So all that’s really needed is for the Pearl 2 to be powered on and plugged into a network with a reliable, fast internet connection. The operator then has the choice of running the production locally or from anywhere else in the world. In the video below you can see how the control panels of the Pearl units are accessed through Epiphan Cloud.
To bring the remote presenters into the Pearl 2 we’ll be using SRT (Secure Reliable Transport), which offers low-latency, secure, high-quality video transmission over the internet. The Pearl Nano is an ideal unit for each remote presenter to have at their location, simply due to its cost and size, though a Pearl Mini or Pearl 2 would work in exactly the same way for this use case. Each presenter simply feeds their camera and microphone into the Pearl Nano, then connects the Nano to a network with internet connectivity. As soon as the Nano is online, the operator (who’s controlling the Pearl 2) will see the device in their Epiphan Cloud account. From there they can access the full controls of the Nano and initiate an SRT connection between the Pearl Nano and the Pearl 2. This process is then replicated for the other two remote presenters.
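Under the hood, an SRT link pairs a listener (here the Pearl 2, waiting on a known port) with a caller (each Nano, dialling in). The Pearl interfaces handle this for you, but the connection model can be sketched as the URLs each end would use. The hostnames, port, and passphrase below are illustrative placeholders, not values from this production:

```python
# Minimal sketch of the SRT caller/listener model linking each remote
# Pearl Nano (caller) to the central Pearl 2 (listener).
# All hostnames, ports, and the passphrase are hypothetical examples.

def srt_url(host: str, port: int, mode: str, passphrase: str = "") -> str:
    """Build an SRT URL in the common query-string format."""
    params = [f"mode={mode}"]
    if passphrase:
        # When a passphrase is set, SRT encrypts the stream end to end
        params.append(f"passphrase={passphrase}")
    return f"srt://{host}:{port}?" + "&".join(params)

# The Pearl 2 listens on one port; each remote Nano calls in to it.
listener = srt_url("0.0.0.0", 9000, "listener")
caller = srt_url("pearl2.example.com", 9000, "caller", passphrase="s3cret-key")

print(listener)  # srt://0.0.0.0:9000?mode=listener
print(caller)    # srt://pearl2.example.com:9000?mode=caller&passphrase=s3cret-key
```

Because the Nanos initiate the connection as callers, only the Pearl 2’s site needs a port reachable from the internet; the presenters’ home networks need no special configuration.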
So now we essentially have a one-way connection for all three of our remote presenters: each of their camera and audio feeds is transmitted into the Pearl 2 unit using SRT. From here the production operator can create custom layouts, switch between them, record the individual presenter streams along with the main output, and of course live stream directly to the platform (or platforms) of choice, all from the Pearl 2.
Of course, the presenters will want to interact with each other. We could set up an additional SRT stream for each presenter, output from the Pearl 2 back to their respective Pearl Nano; this return stream would show the other presenters and be output via the HDMI port on their Nano. However, given that this return feed doesn’t need to be high quality (it will never be on air), it makes sense to separate this function from the Pearl systems entirely. Doing so eases the network bandwidth demands at each presenter location and at the Pearl 2, and it reduces CPU load on all of the Pearls, which is always beneficial. The Pearl 2 can handle up to six SRT sources, so by offloading the return streams we leave processing headroom to easily expand the production and accommodate more presenters if needed.
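The bandwidth argument is easy to put numbers on. As a rough sketch, assuming illustrative bitrates (roughly 6 Mbps per 1080p contribution feed, 8 Mbps for the programme output, neither stated in this article) plus some headroom for SRT retransmissions:

```python
# Rough bandwidth budget for the Pearl 2 site. The bitrates and the
# ~25% SRT retransmission headroom are illustrative assumptions,
# not measured figures from this production.

PRESENTER_BITRATE_MBPS = 6.0   # each incoming 1080p SRT contribution feed
PROGRAM_BITRATE_MBPS = 8.0     # outgoing live stream to the platform
SRT_HEADROOM = 1.25            # allowance for retransmissions on lossy links

def site_bandwidth(num_presenters: int) -> tuple:
    """Return (downlink, uplink) in Mbps at the Pearl 2 site."""
    downlink = num_presenters * PRESENTER_BITRATE_MBPS * SRT_HEADROOM
    uplink = PROGRAM_BITRATE_MBPS * SRT_HEADROOM
    return downlink, uplink

down, up = site_bandwidth(3)
print(f"downlink ~ {down:.1f} Mbps, uplink ~ {up:.1f} Mbps")
```

Adding a return SRT feed per presenter would multiply the uplink requirement by the presenter count, which is exactly the load the video-conference approach below avoids.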
The most straightforward way of letting the presenters interact with each other is to use a video conferencing platform such as Zoom, Teams, or Skype. The production operator creates a meeting that all the presenters join from their computers, tablets, or even smartphones. Video compression on VC platforms is very efficient (connection reliability and latency are prioritised over image quality), so it barely eats into network bandwidth. These platforms are also something pretty much everyone is used to, especially over the past year. And since the production operator is part of the VC call, they also have direct comms to the presenters with no risk of it going out on the live production.
So what about graphics? The Pearl 2 natively supports image files that can be loaded into the system and applied to any of the custom layouts. However, if you want something a little more dynamic, the best way is to use one of the many live titling/graphics packages that can output via NDI. This would run on a separate computer on the same local network as the Pearl 2.
One of the big benefits of using NDI here is that it can carry an alpha channel. There’s no need to find an appropriate video I/O card for the computer, or to run two SDI cables for separate key and fill signals: it’s all done over the network. You can apply the NDI video source on top of all of the shots created in the Pearl 2, and the graphics (such as lower thirds) will only become visible once they are triggered from the graphics software. Plenty of software packages support NDI output with alpha, such as NewBlue’s Titler Pro, or even editing packages such as Adobe After Effects and Premiere Pro. Programmes like these let you use advanced motion graphics (as opposed to just still images), along with the ability to play in video files.
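To see what the alpha channel is doing, here is a minimal per-pixel sketch of the standard “over” compositing operation that the key (alpha) and fill (colour) signals drive. The pixel values are illustrative, and this simplifies to a single channel with straight alpha:

```python
# Sketch of alpha compositing: NDI carries fill (colour) and key (alpha)
# together, and the switcher lays the graphic over the programme feed.
# Single channel, straight alpha, values 0.0-1.0; pixel values are examples.

def over(fill: float, key: float, background: float) -> float:
    """Composite one channel of a graphic over the background video.

    fill       -- graphic colour value
    key        -- alpha: 0.0 = fully transparent, 1.0 = fully opaque
    background -- programme video value underneath
    """
    return fill * key + background * (1.0 - key)

# Key at zero: the graphic is invisible and the programme shows through.
print(over(1.0, 0.0, 0.25))  # 0.25
# Half-transparent white lower third over a dark background.
print(over(1.0, 0.5, 0.2))   # 0.6
```

This is why a lower third stays invisible until triggered: until then its key is zero everywhere, so the Pearl 2’s output is the untouched programme feed.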
This would mean the operator needs to be physically at the computer running the graphics that feed into the Pearl 2. However, this can be done remotely too; it really comes down to the chosen graphics package. For example, a popular solution is Singular Live, a cloud-based service. A computer is still required locally to output the NDI feed into the Pearl, but Singular Live can be fully controlled and triggered over the web.
11 May 2021