Powers Global Simulcast


Mixing, Performance Systems, Recording, Sound Design

We recently put on the (Pulitzer finalist!) Death and the Powers in Dallas, TX and broadcast the last show to 9 venues in the US and continental Europe. We created a relatively complex broadcast performance system that also connected to people’s phones.

I also mixed the performances. The audio system required careful thought: we added a second Studer Vista 1 console with 128 inputs and outputs to the existing Vista 5. We had to make sure we had the channels required to mix the broadcast at the second console, and that word clock was correctly distributed to every console, stage box, and computer in the system. We also built a record system that could be switched onto the inputs of the broadcast console to implement a virtual soundcheck, so the broadcast mix could be rehearsed. This was achieved with a Direct Out Technologies router.

Here’s an interview that PBS did and a piece on our group that was part of the coverage of the project. I’ve included a few diagrams of the various systems below. You can also read the NIME paper that Peter, Elly, and I originally wrote on the project; it has a good description of the basic plot and the various components of the production.

I wrote the app control infrastructure in NodeJS, load balanced by HAProxy, running on a quad-core i7 machine. We were able to sustain loads of over 14,000 concurrent websocket connections, sending real-time audio and performance data at about 10 fps. This could easily have been extended by adding additional network IO and backends to the system. We ran a pair of VMs on the i7 to handle everything. Our assets were served by another VM hosted by Necsys, who were able to provide 5 Gbit (!) of internet connectivity to the machine. All the static assets were served by Nginx, and we used Cloudflare to cache everything as well. The design of our asset caching system meant that external caches never needed to be invalidated; new copies of assets were versioned by filename.
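To give a rough idea of the broadcast loop (a minimal sketch, not our actual production code), here’s a single Node.js process using the `ws` package that accepts websocket connections and pushes a state payload to every client at 10 Hz. The `cue` and `levels` fields are placeholders for the real cue and audio data:

```js
// Minimal sketch of a 10 Hz websocket broadcaster using the `ws` npm package.
// In production, several processes like this sat behind HAProxy.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

// Placeholder show state; the real system carried cue and audio data.
let state = { cue: 0, levels: [] };

wss.on('connection', (socket) => {
  // Send the current state immediately so late joiners sync up.
  socket.send(JSON.stringify(state));
});

// Push updates to every connected client at roughly 10 frames per second.
setInterval(() => {
  const payload = JSON.stringify(state);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(payload);
    }
  });
}, 100);
```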

I wrote some basic monitoring utilities (JS/HTML5) to keep track of the data the devices would report back: content version, currently running cue, OS version, etc… Since it turns out the DOM gets sluggish when trying to draw and update a gajillion little divs, some optimization had to take place to make it usable. Brian ultimately monitored the whole thing, and the network connection in the hall, during the show. He ended up writing an OpenGL monitor that was a bit more responsive.
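The flavor of optimization involved is sketched below (illustrative, not the original code): rather than touching the DOM once per incoming report, updates are buffered and flushed in a single requestAnimationFrame pass, reusing existing divs instead of recreating them. The report field names and the `devices` container id are made up:

```js
// Illustrative sketch: batch thousands of device-status updates into one
// DOM pass per animation frame instead of mutating divs on every message.
const pending = new Map();        // deviceId -> latest report
const tiles = new Map();          // deviceId -> its <div>
const container = document.getElementById('devices');

function onDeviceReport(report) { // called once per websocket message
  pending.set(report.id, report); // only the newest report matters
}

function flush() {
  for (const [id, report] of pending) {
    let tile = tiles.get(id);
    if (!tile) {                  // create each tile exactly once
      tile = document.createElement('div');
      tiles.set(id, tile);
      container.appendChild(tile);
    }
    // One textContent write per device, no innerHTML re-parsing.
    tile.textContent = `${id} v${report.contentVersion} cue ${report.cue}`;
  }
  pending.clear();
  requestAnimationFrame(flush);
}
requestAnimationFrame(flush);
```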

How the remote broadcast worked.

We also created a system for taking the cameras used in the broadcast and creating a stylistic mashup of all the various angles. It ultimately took four HD-SDI inputs and processed them in real time with looks that Peter designed. The system synchronized with the rest of the production using triggers sent from the orchestra. We were able to do this relatively inexpensively by teaming up a very cheap Blackmagic HD-SDI quad card with a gaming GPU. We originally borrowed an NVIDIA K6000 (a $6k graphics card??!) but it didn’t actually accelerate the things we needed (Quartz on OS X), so we went with a GTX 770, which has the same number of CUDA cores but costs $400. The result was basically zero-latency 4K real-time video effects. It was pretty snazzy.
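The post doesn’t specify what protocol the orchestra triggers used, but as a sketch, here’s how a Node.js process might listen for cue triggers over OSC using the `osc` package. The port number and the `/powers/cue` address are invented for illustration:

```js
// Hypothetical sketch of receiving show triggers over OSC with the `osc`
// npm package; the actual trigger transport isn't described in the post.
const osc = require('osc');

const port = new osc.UDPPort({
  localAddress: '0.0.0.0',
  localPort: 57121,                    // assumed port
});

port.on('message', (msg) => {
  if (msg.address === '/powers/cue') { // invented address for illustration
    const cueNumber = msg.args[0];
    console.log('advance video look to cue', cueNumber);
    // ...switch the real-time video effect here...
  }
});

port.open();
```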

Audio

A snippet of the audio system is shown below. We had 66 channels of surround on 5 levels of the hall, as well as a 64-channel front fill WFS array. A big challenge here was the fact that our rental vendor provided consoles with two different types of fiber, and it was a puzzle to make everything work with the cable and expansion cards we had. The system is primarily MADI based; the WFS array uses CobraNet, and some of the playback and effects machines connect via ADAT. The Vista series does not clock over MADI, so special care had to be taken to ensure the same word clock made its way to each of the systems.
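For readers unfamiliar with wave field synthesis: the textbook idea behind a front fill array like ours is that each speaker plays the same signal with a per-speaker delay and gain derived from its distance to a virtual source. The sketch below is just that textbook approximation, with assumed speaker spacing, not how our array was actually configured:

```js
// Textbook-style sketch of simple WFS driving delays, not the actual rig:
// each speaker reproduces a virtual point source by delaying the signal
// according to its distance from that source.
const SPEED_OF_SOUND = 343; // m/s

// 64 speakers spaced 10 cm apart along the stage lip (assumed geometry).
const speakers = Array.from({ length: 64 }, (_, i) => ({ x: i * 0.1, y: 0 }));

function wfsParams(source) { // source: { x, y } behind the array
  return speakers.map((spk) => {
    const r = Math.hypot(spk.x - source.x, spk.y - source.y);
    return {
      delaySeconds: r / SPEED_OF_SOUND,       // arrival-time offset
      gain: 1 / Math.sqrt(Math.max(r, 0.1)),  // ~1/sqrt(r) rolloff
    };
  });
}

console.log(wfsParams({ x: 3.2, y: -2 }).slice(0, 3));
```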

These are the major components of the audio system.

The diagram above shows the traditional show signal flow. The two blue arrows are where the system links out to the broadcast. Because we were limited in IO type and count on the Vista 1, the record system had to be done in an interesting way. The diagram below shows the signal flow that accomplishes splits of the live mics and of the digital effects coming from the orchestra, while also providing reverbs and master bus compression running on the record machine. We used a cool little Direct Out split converter that functioned as a port-level MADI router. It provided the capability to directly replace the Vista 1’s inputs from the hall with returns from the computer (and SM/MM/BNC conversion). If the Vista 1 had more MADI IO, we could have achieved this using the secondary channel inputs, but this hack worked just as well. The only complication was that the FX returns had to be sent from the computer all the way back to the Vista 5 and then routed back down the multi-mode MADI to get into the Vista 1 in “live” mode.

The system has two redundant clock sources in “live” mode: the primary is an AES line running from the Vista 5 core, and the backup comes from the computer, which gets a direct split off the stage box in “live” mode. The outputs from the computer take advantage of the fact that the RME will send its outputs to both its BNC and optical MADI outputs.
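To make the mode switching concrete, here’s a hedged sketch of the routing matrix as two presets: in “live” mode the Vista 1 hears the stage box (with the record machine taking a split), while in virtual soundcheck mode the same console ports are fed from the computer’s playback. The port names are invented, and this is not the Direct Out configuration format:

```js
// Illustrative model of the port-level MADI routing, with invented names;
// this is not the actual Direct Out router's configuration syntax.
const presets = {
  live: {
    vista1In: 'stageBoxSplit',    // console hears the live mics
    recorderIn: 'stageBoxSplit',  // record machine captures the same split
  },
  virtualSoundcheck: {
    vista1In: 'computerPlayback', // console inputs replaced by the recording
    recorderIn: 'none',
  },
};

function applyPreset(name) {
  for (const [dest, src] of Object.entries(presets[name])) {
    console.log(`route ${src} -> ${dest}`);
    // ...issue the corresponding routing command to the MADI router...
  }
}

applyPreset('virtualSoundcheck');
```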


The hall was a joy to work in; it sounded phenomenal and was very well thought out. Lots of cable mouse holes and trays to get anything anywhere. The whole building was wired for Cat5, coax, and fiber. Sections of the seating were on lifts to facilitate the perfect mix position and several sizes of orchestra pit. You can fit a 737 on the stage. Yes. That’s right, a 737.