Cutting the Cord - Follow Up
February 4, 2013
As is always the case with content published on this site, the first version is certainly not the last. Through valued reader feedback, these articles are amended and updated over time. Chris MacKarell at Arri CSC Digital, whose feedback has routinely been beneficial to this site, raised some valid points on my post on Wireless HD-SDI that I'd like to include. The main concern is that because a wireless HD video signal has to be converted to analog for transmission and then back to digital upon reception, the resulting signal can never really be "identical" to a cable-bound one, no matter how high quality the system is.
How true!
From Chris -
...as far as wireless monitoring is concerned, here are some questions you didn't touch on which certainly affect the quality of the image on the monitor:
At the Transmit end, precisely how is the deserializing and digital-to-analogue conversion of the original HD-SDI signal performed?
How is the reverse, the analogue-to-digital conversion and serializing of the signal at the Receive end performed?
How high a fidelity are these DAC and ADC functions, and what quantizing steps are taking place to perform them?
How are overly high peak-to-average power ratios in the modulated signal dealt with? (Note that this is usually done by clipping the transmitted sine wave.)
How is intermodulation distortion, i.e. non-linearity in the signal chain, handled?
If COFDM is used, how is this managed?
Of course, there are industry standards (e.g. IEEE 802.20), but were they designed for the kind of critical application the DIT requires?
The key question is, can you trust what you see on your monitor at all times?
I suspect that for many, any questions at all around the image processing techniques applied, and therefore the integrity of the subsequently transmitted image, are enough by themselves to militate against that signal's utility for serious critical imaging work.
And since all this processing will always be a necessary precursor of image transmission and reception, the wireless utility you seek for on-set monitoring applications may perhaps never be approached.
In short, whatever future technology developments occur, the underlying principles governing wireless transmission will not change. Wireless may therefore never yield the kind of critically accurate signal you require in your day-to-day work.
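Chris's question about quantizing steps is worth making concrete. Below is a minimal Python sketch, my own illustration rather than anything from his note, of the error a digital-to-analog-to-digital round trip bakes into a signal at different bit depths:

```python
# Hypothetical demo of quantization error in a D/A/D round trip. The bit
# depths are illustrative; real converters add further error sources
# (clock jitter, non-linearity) on top of this floor.
import numpy as np

def quantize(signal, bits):
    """Snap a [0, 1] signal onto a uniform grid of 2**bits levels."""
    levels = 2 ** bits
    return np.round(signal * (levels - 1)) / (levels - 1)

# A smooth luma ramp standing in for one line of video.
ramp = np.linspace(0.0, 1.0, 10_000)

for bits in (8, 10, 12):
    err = ramp - quantize(ramp, bits)
    snr_db = 10 * np.log10(np.mean(ramp ** 2) / np.mean(err ** 2))
    print(f"{bits}-bit round trip: SNR ~ {snr_db:.1f} dB")
```

Whatever the converter's quality, that quantization error is introduced on every trip through the analog domain and is never recovered.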
All of the above points will certainly affect image quality and are worth being aware of. As Chris pointed out, "the underlying principles governing wireless transmission will not change," so no matter how good these systems are, just how cautious should we be in using them? Should we even be using them at all?
Unfortunately, we often don't have much choice in the matter, and if the shot calls for wireless, then it has to be wireless. In my opinion, if you know the nuances of your wireless system and have spent some time testing and evaluating the image it produces, you can come to trust it. My favorite thing to say on set when things become questionable is, "the scopes don't lie." The waveform and vectorscope are probably the only things on the entire set that are truly objective. They reveal problems now that will certainly be problems later and basically tell you everything you need to know about the video images you're working with. In the case of wireless video, the scopes can tell you just how well your system is putting that signal back together upon reception.
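For anyone curious what a waveform monitor is actually computing, here's a rough software sketch of the idea. It's purely illustrative: the filename is a placeholder, and it assumes the OpenCV and NumPy libraries:

```python
# Hypothetical "software waveform": for each image column, the spread of luma
# values. A hardware waveform monitor draws this spread as a vertical trace.
import cv2
import numpy as np

frame = cv2.imread("frame_grab.png")                  # placeholder filename
y = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)[..., 0]  # luma plane, 0-255

col_min = y.min(axis=0)  # bottom of the trace for each column
col_max = y.max(axis=0)  # top of the trace for each column
spread = col_max.astype(int) - col_min.astype(int)

print(f"overall luma range: {y.min()}-{y.max()}")
print(f"widest column spread: {spread.max()}")
```

The numbers are objective in exactly the sense described above: they come straight from the pixel values, with no opinion attached.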
Here's a great way to test this: using scopes, compare the exact same image, one coming to you over the air and the other through a cable. Use one of the camera's HD-SDI outputs to feed the wireless Tx and then run a hardline to you from the camera's other output. Now switch between the two and study the waveform and vector. Are you seeing any shift in chroma in the wireless image compared to the cable? What about contrast? Does the highlight and shadow information in the wireless waveform sit at the same place as in the cabled image? What about midtones? Are they more compressed in the wireless image?
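If you can capture a still frame from each path, a short script can put numbers on those same questions. Again, this is a hypothetical sketch: the filenames are placeholders, and it assumes OpenCV and NumPy:

```python
# Hypothetical A/B check: the same frame grabbed over the hardline and over
# the wireless link, compared for luma placement and chroma balance.
import cv2
import numpy as np

frames = {
    "cable": cv2.imread("cable_grab.png"),        # hardline reference
    "wireless": cv2.imread("wireless_grab.png"),  # same frame via the Tx/Rx
}

for name, img in frames.items():
    ycc = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb).astype(np.float64)
    y, cr, cb = ycc[..., 0], ycc[..., 1], ycc[..., 2]
    # Shadow, midtone, and highlight placement, like reading a waveform.
    lo, mid, hi = np.percentile(y, (1, 50, 99))
    print(f"{name}: Y 1/50/99% = {lo:.1f}/{mid:.1f}/{hi:.1f}, "
          f"mean Cr/Cb = {cr.mean():.1f}/{cb.mean():.1f}")
```

Matching shadow, midtone, and highlight percentiles between the two paths answers the contrast and compression questions above; a drift in the mean Cr/Cb answers the chroma one.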
Really, if you're going to use this video signal for anything other than basic monitoring, any differences between the wireless and cabled images should be minimal. In the case of the Boxx Meridian, in my experience very little, if any, shift in chroma or contrast is evident in the wireless image. As the signal degrades, the image becomes noisier and blockier but largely maintains its correct luma and chroma information. This is why we spend so much time testing gear. You're going to have to use it, and often in very compromised situations. The more you know about its strengths and weaknesses, the more confident you can be in its operation.
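To put a single number on that degradation, one option, assuming the same pair of frame grabs as above, is PSNR against the hardline reference:

```python
# Hypothetical degradation check: PSNR of the wireless grab against the
# hardline reference. Higher is closer; noise and blockiness drag it down.
import cv2
import numpy as np

cable = cv2.imread("cable_grab.png").astype(np.float64)
wireless = cv2.imread("wireless_grab.png").astype(np.float64)

mse = np.mean((cable - wireless) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
print(f"PSNR vs. hardline: {psnr:.1f} dB")
```

Logged over a range of distances or antenna placements during prep, a number like this is exactly the kind of strengths-and-weaknesses data that builds confidence in a system.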
I encourage anyone who follows this site to contribute to the knowledge base, and feel free to pick apart anything I've written here with a fine-toothed comb. I'm a technician and not an academic, so many of the topics I address here are much more focused on practical application, or the "end-user experience," and not the hard science driving it.