The Society of Motion Picture and Television Engineers (SMPTE) is an important organization that has been involved in the standardization of virtually every video and film standard – including compression, interconnection, test patterns, file formats, and so on. It’s safe to say that any cinematographer or broadcast engineer would know them by name.
Every two years, the Australian chapter of SMPTE organizes a conference and exhibition event in Sydney. This year’s tagline was “Persistence of Vision”, celebrating 100 years of SMPTE.
While I am not actually part of the industry, as an engineer and a hobbyist with interests in RF, networking, electronics, computing, video and satellite, this kind of trade show provides an opportunity to see what the present issues and solutions are. It also serves as a good opportunity to talk to the technical staff within the companies on exhibition, to ask those difficult questions that only they can answer, or just to chat about things in general.
One of my friends had attended in the past. I was lucky enough to have caught a radio ad about it on the opening day, so I was able to register on-site and attend on Wednesday 15th July.
This year, the show was held at the Hordern Pavilion and Royal Hall of Industries, near Fox Studios in Moore Park. This was just “up the road” from UNSW, so I could easily squeeze it in before my afternoon meetings at the uni. When I first heard the radio ad for SMPTE15, I was on my way to uni, and online registration closed while I was midway through it. Unfortunately, that meant I had to register on-site, with pen and paper. There weren’t any fees for the exhibition, even when registering in person, which was a pleasant surprise.
Upon entry, we were given an SMPTE showbag, in blue and white, and thus began four hours of walking around and talking to more people than I could have ever imagined. It wasn’t possible, nor in my interest, to visit every stand – so I only stopped to talk at stands which appealed to me. As a result, I will summarize some of the things I noticed at the show, with vendors named at my discretion. Unfortunately, I don’t have a habit of taking pictures at trade shows, so it’s mostly going to be a block of prose. As I’m not quite in the area, I apologize in advance for any slip-ups in terminology.
Cameras and Optics
Cameras and lenses are the basic workhorses of the industry, so it wasn’t surprising to see some very powerful displays from several vendors on the floor. Fuji dominated the front desk at the Hordern Pavilion, commanding an unmissable location showing off an array of glass. Canon also had a large stall inside the Hordern Pavilion, with a large screening area where people were invited to sit down and watch some sample footage, followed by a briefing from the director about certain aspects of the shots, all the while being filmed from the side with some rather bulky professional video lenses (like the sort you’d see on the side of a sporting field). Panasonic were showing off some of their range of cameras, from “handycam”-style prosumer products to more professional devices with 4k resolution capability.
Inside the Royal Hall of Industries, GoPro had a major stand, with a very large number of screens showing sample footage from various cameras, including the use of 720p 240fps modes for smooth slow motion, and their latest “ice cube” sized Session. They also held a draw every day at 3pm for the chance to win a camera, but let’s just say, I’m beyond competitions of that sort and giveaway novelties. Some of the footage was a little underwhelming to the trained eye, with compression artifacts from their H.264 encoding, especially during noisy scenes involving water. Blackmagic Design, an Australian company, seems to have hit its stride, with a large display area and its products clearly categorized and delineated, from cameras to adapters, hubs and switches, capture devices, and software. Sony were also present, running occasional “mini seminars” to explain the features of their most recent models. There were some other stands with very hefty aerial-photography-style lenses on show.
Camera Accessories, Audio
Where there are cameras, there are bound to be accessories. There were many lighting solutions on display, some using fluorescent lamps, with many utilizing LEDs of mixed colour temperature to achieve higher CRI. There were also camera cranes, tripods and heads.
One particular exhibitor showed off a card-copying solution: a standalone device that could accommodate two 2.5″ drives, and virtually any sort of memory card via adapters (SxS, Redmag, etc). An external USB 3.0 drive can be accommodated too, and files can be cloned to all drives simultaneously. Video can be reviewed in-unit, and the target filesystem is exFAT. The vendor claimed that it was “corruption-proof” even on removal of power, unlike a computer, although the interrupted copy has to be retried. I think that is a little exaggerated, as corruption can still happen through loss of data in the target drive’s buffer, or through a write to an SSD failing mid-way, corrupting the mapping table or filesystem area. There are things that can be done to minimise the chance of corruption, but not eliminate it entirely – sometimes the timing just has to be exactly wrong for it to happen.
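If you don’t want to take such claims on faith, the defensive workflow is to verify every clone against the source by checksum before wiping the card. A minimal sketch of the idea (my own hypothetical helper names, hashing in-memory buffers for illustration – real use would open the files on each drive):

```python
import hashlib
import io

def sha256_of(stream, chunk_size=1 << 20):
    # Hash a file-like object in chunks so large card images fit in memory.
    h = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

def verify_clones(source_bytes, clone_bytes_list):
    # True only if every clone hashes identically to the source card.
    ref = sha256_of(io.BytesIO(source_bytes))
    return all(sha256_of(io.BytesIO(c)) == ref for c in clone_bytes_list)

# Simulated card image and two clones, one silently corrupted by a bit flip.
card = bytes(range(256)) * 1000
good = bytes(card)
bad = bytearray(card)
bad[12345] ^= 0xFF

print(verify_clones(card, [good]))              # True
print(verify_clones(card, [good, bytes(bad)]))  # False
```

Even a single flipped bit is caught, which is more than any “corruption-proof” marketing claim guarantees.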
Of course, where there’s video, there’s audio. Sennheiser had a fairly impressive display of their microphones. Kayell Australia was also exhibiting Tascam audio recording equipment – one tiny unit which records inline with your lapel mic before the signal goes through the wireless link, and another which is a standalone recorder similar to a Zoom Handy Recorder, but now with added Wi-Fi goodness for remote control, file downloading and sharing.
Live Production

Not really my area of interest, but there were some rather interesting displays. Ross Video was showing off a virtualized “news room” which involved a camera on a jig, just one chair, and a green screen. As the camera moved around, the image projected onto the green screen moved to create the illusion of a large studio with multiple screens and immediate presentation graphics overlay. Pretty cool! Avid Technology and Grass Valley, relatively big names within the industry, were showing off some of their live-video processing capabilities.
Reception and Monitoring
There were several companies involved in providing broadcast receiver equipment and monitoring equipment. One of them was AVW, where I had a great time speaking to the two Business Development staff there, and had the chance to admire their equipment on show, which included receivers, monitors, modulators, and IP-based transmission equipment.
Another was the Logitek stand, where a remote FM-monitoring station was on show, displaying decoded RDS labels and logged historical information on the modulation levels of the multiplexes, pilot tones, RDS signal, etc. This kind of system would be nice for compliance reasons – it was interesting to see that the stereo pilot tone level was low for many of the Sydney stations. Could they be putting more power into their programme signal rather than the pilot? Or is this a way to make sure receivers drop out of stereo mode as soon as there’s any sign of signal weakness?
Test equipment is vital when it comes to troubleshooting cabling and signalling issues. TekInsite Video Technologies had a few Tektronix based units, one with generation and analysis ability to look at SDI/ASI signals, plot eye diagrams, decode TSes, list PIDs, show the actual video image, etc. Two other models of waveform analysers were also on show.
The other major test equipment vendor was Rohde & Schwarz, which was demonstrating a fairly hefty software-defined broadcast signal generator, which was decked out with generation abilities for DVB-T*, DVB-S*, CMMB, DTMB, etc. Virtually every digital TV broadcast format was supported for generation, with additional features in simulating path losses and distortion – multipath is catered for, along with time-varying fading channel models.
They had an analyzer as well, which was happily showing a DVB-T2 256QAM signal, and was showing MER information amongst others. I had a nice chat to the representative, who informed me that they were also doing nifty things like “demodulation as a service” where they’d have a cloud demodulator being fed raw I/Q data from front-ends which are phase locked (say, by GPS) so the signals could be combined at an RF level rather than post-demodulation. This gives you a nice SNR advantage, and multiple-path protection against sun-outages, rather than the older “voting receiver” style combining where you pick good packets from two separate demodulators.
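As a back-of-the-envelope illustration (my own toy model, not their implementation): averaging N phase-locked I/Q streams leaves the common signal intact while the independent receiver noise averages down, for roughly an N-fold SNR gain – about 6dB for four front-ends:

```python
import numpy as np

rng = np.random.default_rng(0)
n_rx, n_samples = 4, 100_000  # four phase-locked front-ends

# The same complex baseband tone arrives at every front-end...
signal = np.exp(1j * 2 * np.pi * 0.01 * np.arange(n_samples))
# ...but each front-end contributes its own independent noise (unit power).
noise = (rng.standard_normal((n_rx, n_samples)) +
         1j * rng.standard_normal((n_rx, n_samples))) / np.sqrt(2)

def snr_db(rx):
    # SNR of a received stream, measured against the known clean signal.
    err = rx - signal
    return 10 * np.log10(np.mean(np.abs(signal) ** 2) / np.mean(np.abs(err) ** 2))

single = signal + noise[0]               # one front-end on its own
combined = signal + noise.mean(axis=0)   # coherent average of all four
print(round(snr_db(combined) - snr_db(single), 1))  # ~6 dB gain
```

A voting receiver, by contrast, only ever gets the SNR of its single best demodulator.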
Drones and Aerial Photography
It seems that drones are one of the hottest things to hit the industry, with this year’s SMPTE featuring a special side-pavilion dedicated to drone demonstrations (which I missed). Quite interestingly, one of the stands was for a company which specializes in training for UAV flights, with them and others recounting stories of irresponsible drone usage, especially by wedding photographers who fly too close to people, and hobbyists who don’t respect the airspace around aircraft.
Admittedly, my interest in drones is limited, not having owned one or had much of an inclination to buy or play with one. They are getting more and more affordable, and their usage is raising many eyebrows – this seems to be one of the emerging ways to undercut what would traditionally have been an expensive aerial photography session. Unfortunately, the rules around drone usage are still somewhat in development – if we followed the FAA’s prohibition on commercial drone usage, we could see the market collapse.
This brings us to aerial photography, where the market still seems to be healthy. There was even a helicopter in the hall! Many of the aerial photography exhibits showed serious muscle, with big lenses on motorized gimbal controllers. Some were a little more lightweight, with heavier-duty custom drones being operated as a service.
4K Resolution Demos
It seemed that 4k resolution wasn’t quite as heavily promoted as I might have expected, but there were a few demos being run with different types of technology. I visited the Ericsson stand first, where they were running a 4k demonstration based around their own encoders and decoders configured for L-band loop-back. Their demonstration involved frame-synching four decoders of full-HD imagery and tiling the outputs to produce 4k; each of the streams was about 30Mbit/s H.264. I noticed some juddering at times, and a visible seam initially, which disappeared (conveniently) when the exhibitor came around for a chat. It was great to see the Ericsson gear in the flesh. They also had a DVB-T2 display on show – seeing as we have only just managed the transition to DVB-T, I don’t think T2 is likely to play a large role in the immediate future. The features of better error correction coding, higher constellation density and H.264 coding support do bring viewer benefits, but we head back to the age-old issue of new set-top boxes or new TVs. In the UK, T2 runs in parallel to DVB-T as a transition strategy … I wonder when we will see that here. Maybe not soon.
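The tiling step itself is trivially sketched (numpy arrays standing in for decoded frames – a toy illustration, not Ericsson’s implementation): four frame-synchronised full-HD quadrants are stitched into one UHD raster, which is exactly why any sync slip between decoders shows up as a seam:

```python
import numpy as np

HD = (1080, 1920)  # rows, cols of one full-HD quadrant

def tile_quadrants(tl, tr, bl, br):
    # Stitch four frame-synchronised 1080p quadrants into one 2160p frame.
    # Any frame-sync error between the four decoders appears as a seam.
    top = np.hstack((tl, tr))
    bottom = np.hstack((bl, br))
    return np.vstack((top, bottom))

quads = [np.full(HD, i, dtype=np.uint8) for i in range(4)]
uhd = tile_quadrants(*quads)
print(uhd.shape)  # (2160, 3840)
```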
Another 4k demo was shown at Logitek’s stand, where Ateme’s Titan HEVC encoding solution was being demonstrated with a sample clip playing at a silky smooth 59.94fps using only about 23Mbit/s. I was suitably impressed, and aware of Ateme back from the H.264 days, when their encoder was leveraged by Nero in their Nero Digital package.
Finally, I got around to the Techtel stand where Harmonic was on display with their 4k HEVC live encoding solution as well. Their demo clip did show some artifacts, when viewed critically, but I was assured that their product excelled in comparison to others which were not on show, and that the software file encoder could be downloaded and evaluated for suitability – so I guess it’s not a bad idea to try before you buy.
Interestingly, amongst all the vendors, Samsung seemed to be the brand of choice when it came to 4k TVs, and I got to see my first curved TV at SMPTE – vastly underwhelming. The interest in HEVC at this stage seems somewhat lukewarm, although a change is “impending”. It’s just a matter of time before everyone’s computational ability catches up to the needs of HEVC.
Satellite Distribution and Production Services
The outdoor area between the two halls was where the satellite uplink/distribution and production services exhibitors were on show. It was also where a sausage sizzle, run by Fuji, was stationed – although I didn’t end up getting one.
One of the happy moments was to see Astralinks, a satellite contribution feed uplink operator, in the flesh. Most of the feed uplinks around here are done by Astralinks and Globecast and I’m well familiar with them from my satellite feed chasing. I had actually seen their decked out ute at the Royal Easter Show, earlier this year – the same ute was on display.
This image was taken as they were packing up, but they had a remote RF link to a camera somewhere at the time. The gear in the truck includes some well recognizable Blackmagic Design screens, smart video hub, an Agilent Technologies spectrum analyzer, a Sony monitor, some mobile-phone audio channels, Ericsson encoders, and a Holkirk antenna controller.
They also had their flyaway pack on show as well, with a solid-panellized style dish. We also got to see the RF amps on the ground, which had hefty heatsink fins and forced ventilation requirements. They had handheld radio communications, including return audio over the satellite multiplex where phone communications weren’t available.
I didn’t end up visiting any of the other vans on show though.
Internet Distribution (IPTV, IP Backhaul/Contribution)
In our haste to simplify the world, IP “everything” seems to be the way forward. As a result, several companies were there exhibiting their IP-optimized solutions. I won’t name the companies, although I am definitely somewhat critical of their operation, as I am quite familiar with networking protocols.
One company offered a licensed product which could be used for file transfers or for transparent “pipe” style operation, using a proprietary UDP-based protocol that claims to speed up transmission and ensure perfect streaming quality. Their demonstration involved taking an RTMP stream (TCP), transcoding it, and transmitting it both using their technology and in plain UDP to show the resultant mess.
This, in itself, is a pretty misleading demonstration, as nobody would sensibly send a high-bitrate UDP stream through the internet and expect it to survive – UDP as a protocol is not designed for reliable delivery. Instead, it would be better to send it as a TCP stream, with adequate buffering to ensure it gets there in time (most of the time). I do this a lot, and I can tell you it works.
Every video you watch from YouTube – and indeed the source clip for this demonstration (using RTMP) – comes in via TCP over the wide internet, so it’s clear that streaming over the internet doesn’t “need” this as much as they claim. The technology has prohibitive licensing fees which don’t allow it to be deployed to end users – it’s really only supported between your own peers or to CDNs.
Of course, they also claimed that TCP was not usable and was too slow. They alluded to the round trip time of intercontinental links being >250ms, and that this would make transferring a 10Gb file from the US over regular TCP “impractical”.
This, too, is somewhat misleading, as I regularly transfer 13Gb backups of my website from the US over the internet at around 10MB/s over the GbE link at my desk at the university. It’s not impractical, merely suboptimal at most.
The round trip time has less of an impact on TCP than they suggest – they claimed that before the next packet can be sent, the acknowledgement has to be received. This is wrong. TCP has a receive window arrangement, which scales to keep enough packets in flight to compensate for the delay; packets can remain unacknowledged and in flight until the window is full.
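The amount that must be in flight is just the bandwidth-delay product, which is what the window (with RFC 1323 window scaling) has to cover. A quick calculation, taking their 250ms intercontinental RTT:

```python
def required_window_bytes(throughput_bps, rtt_s):
    # Bandwidth-delay product: bytes that must be unacknowledged and in
    # flight to keep the pipe full at the target throughput.
    return throughput_bps / 8 * rtt_s

# Sustaining 10 MB/s (80 Mbit/s) over a 250 ms RTT path:
print(required_window_bytes(80e6, 0.250) / 1e6)  # 2.5 (MB of window)

# Without window scaling, the classic 64 KiB window cap limits throughput:
print(round(65535 / 0.250 / 1e6, 2))  # ~0.26 MB/s, regardless of link speed
```

So a 2.5MB window – entirely routine with window scaling enabled – is all it takes; the round trip time alone does not make TCP “impractical”.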
Loss is where the schemes differ – a TCP loss that cannot be recovered by fast retransmit forces a return to slow start, and even recoverable losses halve the congestion window. The result is a sawtooth-style throughput profile, as the speed ramps up until loss occurs. This is sub-optimal link utilization, and is really where another protocol could potentially improve the result.
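A toy additive-increase/multiplicative-decrease simulation (a sketch of the mechanism, not any real TCP stack) reproduces the sawtooth and shows the utilisation left on the table:

```python
def aimd_trace(rounds, capacity_segments):
    # Toy AIMD: grow the congestion window by one segment per RTT; when the
    # path's capacity is exceeded, a loss halves the window. This produces
    # the classic sawtooth throughput profile.
    cwnd, trace = 1.0, []
    for _ in range(rounds):
        trace.append(cwnd)
        if cwnd > capacity_segments:
            cwnd /= 2   # multiplicative decrease on loss
        else:
            cwnd += 1   # additive increase per RTT
    return trace

trace = aimd_trace(200, capacity_segments=50)
# Long-run utilisation sits well below 100% of the link's capacity:
print(round(sum(trace) / len(trace) / 50, 2))
```

The window repeatedly climbs past capacity, gets halved, and climbs again – it is this gap between the sawtooth and the ceiling that alternative protocols try to reclaim.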
When pushed, they admitted their UDP-based scheme has its own “proprietary” integrity checking and retransmission algorithms. They claim that upon sensing loss, their product does not slow down its transmission rate and just keeps blasting away.
This is where things get bad – a protocol that does not throttle its transmissions upon loss has the potential to do really nasty things. It can end up overloading the queues of routers along the path, resulting in more losses for everyone. It can also make retransmissions less efficient, since the lost packets will be scattered rather than contiguous.
The worst part is that it results in unfair utilization of what is potentially shared infrastructure. Of course, under net neutrality we shouldn’t be discriminating between traffic, but having such schemes in play can severely degrade the experience for everyone if enough people play “rough”.
They claim that this isn’t a big issue, as they’re using UDP and it can be dropped at any stage with little impact to others.
This might be true, but only when it comes to TCP traffic. Other UDP traffic is handled with similar priority, so it is likely to be degraded too. Such protocols can be “bad neighbours”, and it is possible to tweak or modify existing protocols to behave that way as well. In some sense, this protocol has similarities with uTP in BitTorrent – UDP-based, with self-managed retransmissions and integrity checking (although uTP is deliberately designed to yield to TCP) – or even Skype: video and audio transmission with redundancy, forward error correction and error concealment.
One thing they don’t detail is the transmission efficiency – i.e. the number of bits put onto the wire versus payload delivered. A non-throttling algorithm is likely to transmit many packets that end up dropped, just to maintain its advantage. In efficiency terms, then, TCP-based systems hold a theoretical advantage over this particular system.
Another company offered a VPN-style appliance which forms a link between the units over the public internet using parallel TCP connections to enhance the speed.
This particular solution is sort of the counter-part to the above system. By using TCP instead of UDP, they’re hoping for a gain over UDP traffic and less retransmission handling has to be done in custom code.
But again, its success comes only because others fail – it is a bad neighbour as well. By opening multiple connections, they are trying to game TCP’s fairness: TCP generally distributes bandwidth evenly amongst connections, so if you hold many connections in the pool, you get a larger slice of the bandwidth.
Over the public internet, where congested links are becoming more and more normal, having a larger slice of the bandwidth means someone else (it could even be your end customers) have less. It also means that if everyone employs the same tactics, nobody benefits.
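The arithmetic behind that gaming, under the idealised assumption that a congested link splits its capacity evenly per flow:

```python
def share_of_link(my_flows, other_flows):
    # Idealised TCP fairness: a congested link is split evenly per flow,
    # so opening parallel connections buys a larger aggregate share.
    return my_flows / (my_flows + other_flows)

# One ordinary single-connection user against an appliance holding a pool
# of 10 parallel connections on the same congested link:
print(f"{share_of_link(1, 10):.0%}")   # 9% for the single-flow user
print(f"{share_of_link(10, 1):.0%}")   # 91% for the appliance
```

And if both sides deploy the same appliance, the shares even out again – everyone has paid for the hardware and nobody is better off.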
Some of these solutions can be relatively expensive. From what I can tell, they probably work just fine as well. That’s probably the limit of care for some of the purchasers.
But what’s not clear is how they coexist with other users – it seems that is not high on their priorities. It’s also not clear how the performance will evolve over time: the internet is dynamic, and changes to router behaviour, routes and usage patterns could degrade performance below usable levels. Over private networks these solutions might not be necessary, but they are probably fine where you can accept disruption to other services on that network, especially where links are reaching maximum utilization.
But it also shows just why circuit switched networks with dedicated paths for video are important even to this day. Certain uses of video which require near-zero latency cannot tolerate buffering, and thus cannot tolerate any disruption in transmission. Everything has to follow a strict timing schedule, to ensure the picture remains intact. Where the path is dedicated, this is easy to ensure, and results in the best quality. You can think of this as the difference between trying to run a call over Skype on 3G mobile data (packet-switched connection) versus actually calling someone directly using the mobile network in circuit-switched mode.
I think this also clearly illustrates how having more bandwidth to ensure less or no contention is the only universal cure to congestion, although tweaking your TCP stacks and configuring QoS end to end (where you can), can go a long way even without needing such solutions.
Storage and Archival
There were a few stands displaying storage solutions. Some of them were hard-drive based “pods” connected in a SAN-style architecture, in an expandable format. Some featured more clever file transfer abilities with optimized speed. Others implemented tiered storage solutions spread across multiple media types, where LTO-6 tape, with its 2.5TB native capacity, seems to be the flavour of choice. It’s interesting to see that tape is not dead just yet, although its market is probably shrinking somewhat.
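For a sense of scale, using LTO-6’s published native capacity and an approximate sustained drive rate of 160MB/s (the rate figure is my assumption):

```python
def hours_to_fill(capacity_bytes, rate_bytes_per_s):
    # Time to stream a full tape's worth of data at the drive's native rate.
    return capacity_bytes / rate_bytes_per_s / 3600

# LTO-6: 2.5 TB native at ~160 MB/s sustained:
print(round(hours_to_fill(2.5e12, 160e6), 1))  # 4.3 (hours per tape)
```

Several hours per cartridge is exactly why tape sits at the cold tier of these systems, behind disk.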
KVM Australia were showing some of their newer IP-KVMs and USB-over-IP products, as well as KVM over UTP Cat 5 (i.e. non-IP) products. It was interesting to hear that some of the products can use UDP multicast for multi-receiver applications, and that saturating a GbE connection is a real issue with even a single sender, owing to the high video quality the devices carry – meaning network provisioning of 10GbE, 40GbE and 100GbE between switches is a real need.
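A quick back-of-the-envelope shows why even one sender is a problem – assuming (my figures, not the vendor’s) an uncompressed 1080p60 stream at 4:2:2 with 10-bit samples, i.e. 20 bits per pixel:

```python
def video_bitrate_gbps(width, height, fps, bits_per_pixel):
    # Raw (uncompressed) video bit rate, before any packetisation overhead.
    return width * height * fps * bits_per_pixel / 1e9

# Uncompressed 1080p60 at 4:2:2, 10 bits per sample (20 bits/pixel):
print(round(video_bitrate_gbps(1920, 1080, 60, 20), 2))  # 2.49 Gbit/s
```

At nearly 2.5Gbit/s for a single stream, one high-quality sender alone overruns a GbE link, let alone several multicast receivers sharing inter-switch trunks.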
On the whole, SMPTE15 was an extremely enjoyable event. Anyone with any interest or involvement in broadcast, distribution, archival, compression, monitoring, testing, film-making at any scale, buying camera equipment, audio production or drone piloting would have found something “up their alley”. It was a good opportunity to see the latest products from a large number of exhibitors in a small space, with all of the vendors more than happy to talk at length about the technical aspects of their products.
It was also an opportunity for those within the industry to meet and greet and network with other people, or just to have a friendly catch-up between friends. The atmosphere was very inclusive, with people from different parts of the production chain all mingling together.
From what I could see, SMPTE15 was a big success and I definitely would have regretted not going. I look forward to seeing what SMPTE17 has to bring – maybe 4k and HEVC will be even more strongly represented than this year.