I am sure that, like me, you have been to a restaurant in a group where everyone orders from the set menu EXCEPT for that one person who orders the exotic, freshly prepared fugu, which requires an extra 30 minutes of preparation from a licensed fugu chef so that the customers don’t die eating it. Restaurant etiquette means that our main courses are served at the same time, forcing everyone to spend a long time hungry, waiting for the special case. And if you split the bill equally, the special case is subsidised by the people who wanted the set meal. Does this model relate to the media industry? Is there a cost for being special? How can we reduce that cost? What gets done with the cost savings? How can you help?

Fortunately, those five questions lead into five reasons why delivery standards might be a good idea.

1. The set meal is more efficient than the à la carte menu

I must confess that when I write this blog while hungry, there will be a lot of food analogies. I’m quite simple really. In the “set meal” case, you can see how it’s easier for the kitchen to make a large volume of the most common meal and to deliver it more quickly and accurately than a large number of individual dishes. In the file delivery world, the same is true. By restricting the number of choices to a common subset that meets a general business need, it is a lot easier to test the implementations from multiple vendors and to ensure that interoperability is maximised for minimum cost. In a world where every customer can choose a different mix of codecs, audio layouts, subtitle and caption formats, you quickly end up with an untestable mess. In that chaotic world, you will also get a lot of rejects. It always surprises me how few companies have any way of measuring the cost of those rejects, even though they are known to cause pain in the workflow. A standardised, business-oriented delivery specification should help to reduce all of these problems.
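
To make that concrete, here is a minimal sketch in Python of why a restricted subset is easier to test: acceptance becomes a simple membership check that every vendor can implement and verify the same way. The allowed codec, audio and caption values below are purely hypothetical placeholders, not taken from any real delivery specification.

```python
# A minimal sketch of checking a delivered file against a restricted delivery
# specification. The allowed values are hypothetical placeholders, and the
# "incoming" dict stands in for the output of whatever media-inspection tool
# you already use.

ALLOWED = {
    "video_codec": {"AVC-Intra 100", "IMX 50"},
    "audio_layout": {"stereo pair + M&E"},
    "caption_format": {"EBU-TT", "SCC"},
}

def check_delivery(params):
    """Return a list of reasons to reject the file; an empty list means accept."""
    rejects = []
    for field, allowed_values in ALLOWED.items():
        value = params.get(field)
        if value not in allowed_values:
            rejects.append(f"{field} = {value!r} is not in the delivery spec")
    return rejects

# A file that picked an exotic mix of options is caught before it reaches playout.
incoming = {
    "video_codec": "ProRes 4444",
    "audio_layout": "stereo pair + M&E",
    "caption_format": "SCC",
}
for reason in check_delivery(incoming):
    print("REJECT:", reason)
```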

2. Is there a cost for being special?

I often hear the statement: “It’s only an internal format; we don’t need to use a standard.” The justification is often that the company can react more quickly and cheaply. Unfortunately, every decision has a lifespan. These short-term special decisions often start with a single vendor implementing the special internal format. Time passes, then a second vendor implements it, then a third. Ultimately, the cost of custom-engineering the special internal format is paid three or four times over with different vendors. Finally, the original equipment will reach end of life and the whole archive will have to be migrated. This is often the most costly part of the life cycle, as the obsolete special internal format is carefully converted into something new and, hopefully, more interchangeable. Is there a cost of being special? Oh yes, and it is often paid over and over again.
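
As a back-of-the-envelope illustration (every figure below is invented for the example, not measured data), the “over and over again” arithmetic looks something like this:

```python
# Back-of-the-envelope lifecycle cost of a "special" internal format.
# All figures are hypothetical illustrations, not measured data.

integration_cost = 50_000      # cost for one vendor to implement the special format
vendors = 4                    # the format gets re-implemented by each new vendor
migration_multiplier = 3       # end-of-life migration often costs several integrations

special_format_total = integration_cost * vendors + integration_cost * migration_multiplier
standard_format_total = integration_cost   # roughly one integration against a shared standard

print(f"special format lifetime cost:  {special_format_total:,}")
print(f"standard format lifetime cost: {standard_format_total:,}")
```

Even if you argue about the exact numbers, the shape of the result is the point: the special format’s cost recurs with every new vendor and then again at migration time.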

3. How can we reduce costs?

The usual way to reduce costs is to increase automation and to increase “lights out” operation. In the file delivery world, this means automation of transcode AND metadata handling AND QC AND workflow. At Dalet and AmberFin, all of these skills are well understood and mastered. The cost savings come about when the number of variables in the system is reduced and the reliability increases. Limiting the choices of metadata, QC metrics, transcode options and workflow branches increases the likelihood of success. Learning from the experience of the Digital Production Partnership in the UK, it seems that tailoring a specific set of QC tests to a standardised delivery specification with standardised metadata will increase efficiency and reduce costs. The Joint Task Force on File Formats and Media Interoperability is building on the UK’s experience to create an American standard that will continue to deliver these savings.
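
As a sketch of what a “lights out” decision point might look like (the three tests and their thresholds below are invented for illustration, not drawn from any published QC plan): because the delivery specification is standardised, the QC plan is a short, fixed list, and only failures need to reach an operator.

```python
# A minimal "lights out" gate. The QC plan, test names and thresholds are
# hypothetical; in a real system each test would call your transcode,
# metadata and QC tooling rather than a lambda over a dict.
from typing import Callable

QC_PLAN: list[tuple[str, Callable[[dict], bool]]] = [
    ("codec in spec",        lambda f: f.get("video_codec") == "AVC-Intra 100"),
    ("loudness within spec", lambda f: -24.0 <= f.get("loudness_lufs", 0.0) <= -22.0),
    ("captions present",     lambda f: bool(f.get("caption_format"))),
]

def gate(file_metadata: dict) -> str:
    """Run the fixed QC plan; only failures need a human operator."""
    failures = [name for name, test in QC_PLAN if not test(file_metadata)]
    if failures:
        return f"route to operator review: {failures}"
    return "route to automated transcode and delivery"  # the lights-out path

print(gate({"video_codec": "AVC-Intra 100",
            "loudness_lufs": -23.0,
            "caption_format": "EBU-TT"}))
```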

4. What gets done with the cost savings?

The nice thing about the open standards approach is that the savings are shared between the vendors who make the software (they don’t have to spend as much money testing special formats) and the owners of that software (who spend less time and effort on onboarding, interoperability testing and regression testing when they upgrade software versions).

5. How can you help?

The easiest way is to add your user requirements to the requirements list of the Joint Task Force on File Formats and Media Interoperability. These user requirements will be used to prioritise the standardisation work and help deliver a technical solution to a commercial problem.

For an overview of some of the thinking behind the technology, you could check out my NAB2014 video on the subject, or the presentation given by Clyde Smith of Fox.

Until next time. 
