The annual industry pilgrimage to NAB in Las Vegas has concluded, and as always there was plenty to see and discuss.
The headline act this year was 4K UltraHD, with most stands claiming some affiliation with the next-generation spatial and temporal resolution that also dominated the same venue at CES in January. Most of the tangible products sit at the two ends of the chain – cameras and TV screens – with a number of HEVC/H.265 implementations optimizing the compression in between.
However, it’s not mission accomplished by any means: the equipment and infrastructure that sits between these two endpoints, particularly in playout, usually had a ‘coming soon’ label next to it. This is not surprising, as the overhead of transporting and processing uncompressed 4K video is significant – which leads, in a roundabout way, to the next popular topic at this year’s event: the cloud.
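To put that overhead in perspective, here is a back-of-envelope calculation in Python. The format parameters (3840×2160 at 60 frames per second with 10-bit 4:2:2 sampling) are illustrative assumptions rather than figures from the show floor:

```python
# Rough bandwidth for a single uncompressed UHD stream.
# Assumed format: 3840x2160, 60 fps, 10-bit 4:2:2 (20 bits per pixel).
width, height = 3840, 2160
frames_per_second = 60
bits_per_pixel = 20  # luma (10 bits) + 4:2:2 subsampled chroma (10 bits average)

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"{bits_per_second / 1e9:.2f} Gb/s")  # ~9.95 Gb/s per stream
```

Around 10 Gb/s for a single stream, before audio and ancillary data – several times the 1.485 Gb/s of an HD-SDI feed, which is why moving uncompressed 4K around a facility is non-trivial.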
The cloud loomed large in most product categories that sit between those cameras and screens. It was often used as a catch-all term to describe either subscription-based licensing for as-a-service software or the migration of hardware-based products to software-only equivalents (not necessarily both). Both of these trends – the ambiguous use of the term ‘cloud’ and the hardware-to-software evolution taking place in broadcasting – were discussed in the previous series of blogs on Broadcasting in the Cloud, and it was encouraging to see almost every vendor announce (or pre-announce) a roadmap in this area.
The consensus view of ‘playout in the cloud’ revolves around software-based components running on virtual machines – either on standard IT hardware in a datacenter or as machine images running in a public cloud (such as Amazon AWS, Microsoft Azure, Google GCE and so on) – all connected over IP networks. There is less consensus on the readiness and suitability of such products for real production use.
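As a deliberately simplified illustration of the ‘machine image in a public cloud’ model, the sketch below launches a hypothetical playout node on Amazon EC2 using the boto3 SDK. The AMI ID is a placeholder and the instance type is an assumption; a real deployment would involve networking, storage and orchestration well beyond this:

```python
import boto3

# Connect to EC2 in a chosen region (assumption: us-east-1).
ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one instance from a (hypothetical) vendor-supplied playout machine image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="c4.4xlarge",        # assumed CPU-heavy type for video processing
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched playout node {instance_id}")
```

The point is less the specific API than the operational shift it represents: capacity that used to be a rack of bespoke hardware becomes something a script can create, scale and tear down.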
We are in the midst of a major technology shift in broadcasting, and vendors must choose between introducing products early, perhaps with more limited features, to gain a lead on the market, or delaying in order to offer more sophisticated products and risking arriving late to the party. That decision will be largely influenced by each vendor’s current market position and the existing customers and products they need to transition.
The key enabling technologies – pure software implementations running on hypervisors, connected over IP – require significant investment and engineering effort to be fully realized in new products. Which brings us to the final trend at this year’s event: industry acquisitions and consolidation.
In a blog post before last year’s show I asked whether software was eating broadcasting technology. It is, of course, and this has led to broadcast vendors eating each other. The rate of consolidation is accelerating, with a number of high-profile acquisitions announced before and during the show. The rationale is obvious – the cost of significant R&D, of the kind now required, is more viable in larger entities (ditto go-to-market and other shared costs) – so we should expect more such transactions to come. A number of the established names at last year’s show won’t make it to next year’s, but the pace of innovation seems likely to continue, which is good for all of us.
Steve Plunkett, Chief Technology Officer.