Best practices for applying 8-bit effects in Premiere Pro

Summary: Always apply your 8-bit effects as the last ones in the pipeline. 

A few years ago Karl Soule wrote a short explanation of how the 8-bit and 32-bit modes work in Premiere Pro. It’s an excellent overview, although a bit convoluted for my taste (says who), and it does not sufficiently answer the question of when to use or not to use 8-bit effects, and what the gains and losses of introducing them into the pipeline are. Shying away from an Unsharp Mask is not necessarily an ideal solution in the long run, so I decided to run a few tests of my own. I created a simple project file, which you can download and peruse: 8-bit vs 32-bit Project file.

In essence, the 32-bit mode affects two things:

  1. Banding.
  2. Dynamic range and clipping.

To test both, in a Full HD sequence with Max Bit Depth enabled, I created a black video and applied the Ramp plugin to create a horizontal gradient from black to white, so that I could see how the processing would affect the smoothness and clipping of the footage.

Settings for a simple horizontal ramp to test it.

Next I applied the following effects:

  1. Fast Color Corrector with Input White reduced to 215 (clipping the whites).
  2. RGB Curves with a master curve that roughly mirrors a 2.2 gamma.
  3. RGB Curves with a master curve that roughly reverses curve 2.
  4. Fast Color Corrector with Output White reduced to 215 (bringing white back to the legal range).
  5. Offset, an 8-bit filter that does nothing here, but breaks the 32-bit pipeline.

Settings for the first Fast Color Corrector.

Master curve in the first RGB Curves effect

Master curve in the second RGB Curves effect.

Settings for the second Fast Color Corrector.

To assess the results, I advise opening a large Reference Monitor window in YC Waveform mode; looking at the Program Monitor is not always the best way to spot problems in the video. You should see a diagonal line running through the whole scope, like this (note that the resolution of Premiere’s scopes is pretty low):

This is how the ramp looks without any effects applied.

Now perform the following operations, and observe the results:

  1. Turn off effect number 5 (Offset), and notice a bit more smoothness in the curve; the difference is rather subtle, though, and hardly likely to influence your final delivery.
  2. Turn Offset back on, and move it before the second FCC. You should immediately see the clipping in the highlights. This is the result of the 8-bit filter clipping the values at 255,255,255 (white) and disposing of all the super-white information that the first FCC introduced.
  3. Move Offset between the RGB Curves effects (between 2 and 3). Apart from the clipping, you will also notice banding and posterization, both in the Program Monitor and in the waveform as little flat steps. This is the result of the 8-bit filter rounding the values to the nearest integer (123.4 becomes 123), leaving later modifications less data to work with (see the simulation sketch after this list).
  4. Move Offset before the first FCC. Most likely you will not see anything very different from when the Offset filter was applied last. However, if your footage had super-white values, they would be clipped and irrecoverable. Similarly, if you had 10-bit footage, it would be converted to 8-bit, and all the extra latitude and information would be lost as well.
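If you prefer numbers to screenshots, the clipping and rounding behaviour described in steps 2 and 3 is easy to simulate outside Premiere. Below is a minimal Python sketch of my own (the ramp resolution and the names are assumptions made for illustration, not anything Adobe ships) that pushes a ramp through the four 32-bit operations and inserts an 8-bit quantization step at a chosen position:

import numpy as np

# The four 32-bit operations from the effect stack above, on 0.0-1.0 values:
ops = [
    lambda x: x * (255.0 / 215.0),                         # 1. FCC, Input White 215 (creates super-whites)
    lambda x: np.power(np.clip(x, 0.0, None), 1.0 / 2.2),  # 2. curve approximating gamma 2.2
    lambda x: np.power(x, 2.2),                            # 3. the reverse curve
    lambda x: x * (215.0 / 255.0),                         # 4. FCC, Output White 215 (back to legal)
]

def quantize_8bit(x):
    # What an 8-bit effect does to the data: clip to 0-255 and round to integers.
    return np.round(np.clip(x, 0.0, 1.0) * 255.0) / 255.0

def run(offset_pos):
    # offset_pos is where the 8-bit step sits: 0 = first, 4 = last.
    x = np.linspace(0.0, 1.0, 1024)  # the horizontal ramp
    for i, op in enumerate(ops):
        if i == offset_pos:
            x = quantize_8bit(x)
        x = op(x)
    if offset_pos == len(ops):
        x = quantize_8bit(x)
    return x

last = run(4)  # Offset last: even 1/255 steps, no clipping
mid = run(2)   # Offset between the curves: clipped and unevenly stepped (banding)
fcc2 = run(3)  # Offset before the second FCC: super-whites clipped

print("max step, Offset last:", np.diff(last).max())  # ~0.0039 (1/255), even steps
print("max step, Offset mid: ", np.diff(mid).max())   # noticeably larger: posterization
print("peak, Offset last:", last.max())               # 1.0, no clipping
print("peak, before FCC2:", fcc2.max())               # ~0.843, highlights clipped

The uneven step sizes in the mid-chain run are exactly the little flat steps you see in the waveform, and the lowered peak is the clipped super-white information.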

The effect of the first Fast Color Corrector on the ramp. Notice the clipping.

The effect of the second Fast Color Corrector on the ramp. Notice the lack of clipping.

The effect of the first RGB Curves on the ramp.

The effect of the second RGB Curves on the ramp.

All four 32-bit effects on.

All filters on with Offset in the last place. Notice slightly less smoothness in the scopes.

All filters on with Offset before the second FCC. The 8-bit filter clips the highlights and makes the data irrecoverable.

All filters on with Offset between the RGB Curves. Notice the irregularity in the curve, showing rounding errors and compounding problems.

All filters on with Offset after the first FCC. The curve is not as irregular, but the clipping is still present.

All filters on with Offset in the first place. Notice the similarity to Offset in the last place. The 32-bit pipeline is not broken afterwards, although the footage is converted to 8-bit at the very start of the chain.

This simple experiment allows us to establish the following best practices for applying 8-bit effects in Premiere Pro:

  1. Try to apply 8-bit effects last in the chain, especially with footage that is 10-bit or better.
  2. Always perform your technical grade and make your image legal (i.e. not clipping) before you apply any 8-bit effect in the pipeline.
  3. Ideally, do all your color correction before any 8-bit effect is applied.
  4. Avoid heavy color correction or image processing after an 8-bit effect has been inserted; it invites banding and posterization, especially with 10-bit or RAW footage.

And that’s it. I hope this sheds some light on the mysteries of Premiere Pro’s 32-bit pipeline, and that your footage will always look great from now on.

Adobe Anywhere didn’t spring out of nowhere

Yesterday a few pieces of the puzzle came together in my head, and I realized that Adobe Anywhere was by no means conceived as a brand-new solution; it is in fact the result of many years of research and development converging across a few interesting technologies.

A couple of years ago I saw a demonstration of Flash files being rendered remotely, with the resulting picture streamed to a mobile device. For a long time I thought nothing of it, because Flash has always been on the periphery of my interests. But yesterday I suddenly saw how relevant this demonstration was. I believe it was a demo of Adobe Flash Media Server, and it was supposed to show a way to let users with underpowered devices enjoy more advanced content without taxing their resources too much, and possibly to stream content to iOS devices that could not run Flash. Granted, the device had to be able to play streamed video, but it didn’t have to render anything. All processing was done on the server.

Can you see the parallels already?

Recently Adobe Flash Media Server – which Adobe acquired with Flash when it bought Macromedia in 2005 – changed its name to Adobe Media Server, proudly offering “Broadcast quality streaming” and a few other functionalities no longer limited to serving Flash. The road from Adobe Media Server to Adobe Anywhere Server does not seem very long. All you need is a customized Premiere Pro frameserver and project version control, which itself is perhaps based on the phased-out Version Cue. Or not. Either way, the required backbone technologies seem to have been here for a while.

Mercury Streaming Engine backbone

What follows are a few technical tidbits that came with this realization and a few hours of research. Those of you not interested in this kind of nerdy detail can skip to the next section.

To deliver the video at astonishing speed, Adobe Anywhere most likely uses a protocol called RTMFP (Real Time Media Flow Protocol), which has its roots in the research done on the MFP protocol by Amicima, a company Adobe acquired back in 2006. RTMFP, as opposed to most other streaming protocols, is UDP-based, which means that much less time and bandwidth are spent on maintaining the communication, but also that no inherent part of the protocol is dedicated to finding out whether all data has arrived. However, some of the magic of RTMFP makes this UDP-based protocol not only reliable, but also capable of clever congestion control and “absolute” security, while bypassing most NAT and firewall issues.

Adobe submitted the specification of RTMFP to the Internet Engineering Task Force (IETF) in December 2012, and it is available online in the IETF drafts repository.

More in-depth information about RTMFP can be found in two MAX presentations from Adobe. One of them is no longer available through the Adobe website, but you can still access Google’s cached version: MAX 2008 Develop; the other, from MAX 2011 Develop, is still available on the site. Note that both are mostly Flash-specific, although the first one has a great explanation of what the protocol is and what it does.

It is still unclear what type of compression is used to deliver the footage. I highly doubt it is an inter-frame codec, because the overhead of compressing a group of frames would introduce a noticeable lag. Most likely it is some kind of intra-frame compressor, perhaps a Scalable Video Coding version of H.264, or JPEG 2000 in its Motion JPEG 2000 variant, changing the quality setting depending on the available bandwidth. The latter is perhaps not as efficient as the former, but even at full HD, a 1920×1080 JPEG 2000 frame at a quite decent 50% quality is only 126 kB, a 960×540 frame only 75 kB, and if you lower the quality to a still viewable 30%, you can get down to 30 kB per frame, which requires about 6 Mbps to display 25 frames per second in real time, essentially giving you a seamless experience over a wireless connection. And who knows, perhaps even some version of H.265 is being employed experimentally.
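The back-of-the-envelope math behind that bandwidth figure, as a quick sketch (the frame size and frame rate are simply the numbers assumed in the paragraph above):

frame_kb = 30  # one 1920x1080 JPEG 2000 frame at ~30% quality, in kilobytes
fps = 25
mbps = frame_kb * fps * 8 / 1000.0  # kB per frame -> megabits per second
print(round(mbps, 1))  # 6.0 Mbps, comfortably within Wi-Fi territory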

Audio is most likely delivered via the Speex codec, which is optimized for use in UDP transmission and live conferencing.

Ramifications and speculations

There are of course several performance questions, some of which I have already raised. Are you really getting the frame rate your sequence is in (1080p60, for example), or is there a temporal reduction to 24 or 25 frames as well, or to any number, depending on the available bandwidth? And how is the quality of the picture displayed on a broadcast monitor next to my edit station affected? Yes, I know, Anywhere is supposed to be for lightweight remote editing. But is it really, once you have the hardware structure in place?

When it comes to the server, if I had to guess today, a relatively fast SAN and an equivalent of an HP Z820 with several nVidia GPUs or Tesla cards would be enough to take care of a facility hosting about half a dozen editors. Not an inexpensive machine, although if you factor in the lower cost of the editing workstations, it does not seem so scary. The downside is that such editing workstations would only be feasible for editing in Premiere Pro, and most likely little else. No horsepower for After Effects or SpeedGrade. Which brings me to further questions: how are Dynamic Link and linked AE comps faring under Anywhere? How are rendering and resource allocation resolved? Can you chain multiple servers or defer jobs from one machine to another?

Come to think of it, in an environment using only Adobe tools, Anywhere over local Ethernet might actually be more effective than having all the edit stations pull the required media from the SAN itself, because it greatly reduces the bandwidth necessary for a smooth editing experience. The only big pipe required runs between the storage and the server. And this is a boon to any facility, because the backbone – be it fiber, 10-gigabit Ethernet, or PCI Express – still remains one of the serious costs of installing the service. I might even go further and suggest abandoning the SAN protocol altogether when only Adobe tools are used, thus skipping the SAN overhead, both in network access and in price, although I believe that in these days of affordable software from various developers it would be a pretty uncommon workflow.

In the end I must admit that all of this is just an educated guess, but I think we shall soon see how right or wrong I was. Since Al Mooney has already shown a custom build of the next version of Premiere Pro running Adobe Anywhere, it is almost certain that the next release will have Anywhere as one of its major selling points.

Continuing Premiere’s FCP XML export woes…

If you are familiar with my recent complaint about the issues with FCP XML export, let me throw two more problems into the mix. They might not be as important as the drop and non-drop frame issue, because no data is lost in the process and they are fixable, but they can be quite an annoyance.

After fixing the problem with exporting zero-duration markers, we find out two things:

  1. Although subclips are correctly exported to FCP XML, for some reason Premiere creates additional markers in the main clip which span the whole duration of each subclip…
  2. …and in each subclip, each marker’s in point is wrongly calculated, so the markers shift position, often beyond the subclip’s borders.

Example

There’s a clip with 1 marker in it, and 1 subclip. The subclip starts at frame 228 and ends at frame 645 of the main clip. The marker starts at frame 355 and ends at frame 528. After export, you will get:

  • 2 markers on the main clip:
    • 355-528
    • 228-645
  • 2 markers on the subclip:
    • 355-300 (sic!)
    • 228-417

And what you should get:

  • 1 marker on the main clip: 355-528
  • 1 marker on the subclip: 127-300

I think you have already figured out what’s going on: the subclip’s offset is not subtracted from the marker’s in point. As I said, hardly a big problem, merely an annoyance. But it requires a tool to fix if your workflow depends on these markers: the subclip-spanning markers should be removed, and the real markers’ in points in subclips recalculated.
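To make the recalculation concrete, here is a minimal Python sketch, under the assumptions the example suggests: markers arrive as (in, out) frame pairs, the out point is already rebased to subclip time (as the exported values indicate), and the spurious spanning marker can be recognized by its extent. The function name and data layout are mine, for illustration only:

def fix_subclip_markers(markers, subclip_in, subclip_out):
    duration = subclip_out - subclip_in
    fixed = []
    for m_in, m_out in markers:
        # Drop the extra marker spanning the whole subclip: its in point is
        # the un-rebased subclip start, its out point the subclip duration.
        if m_in == subclip_in and m_out == duration:
            continue
        # The out point is already in subclip time; only the in point still
        # carries the master clip offset, so subtract it.
        fixed.append((m_in - subclip_in, m_out))
    return fixed

# The example above: subclip 228-645, exported subclip markers 355-300 and 228-417.
print(fix_subclip_markers([(355, 300), (228, 417)], 228, 645))  # [(127, 300)]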

Depending on how you look at it, the fact that the very same errors are present in FCP XML exports from Prelude can be either disheartening – because you’d think such basic functionality should have been caught at the production stage – or encouraging – because if they fix it in Premiere, they will also fix it in Prelude.

On a positive note, I’ve been in contact with Jesse Zibble from Adobe about XML export in general, and I know they have been working on these issues. But judging from the release cycle of previous bug fixes, I don’t think this is going to be amended in the current release.

Those of you not using Creative Cloud yet, feel free to express your disappointment now. And if any of you need a tool to fix these markers, drop me an email.

Premiere Pro doesn’t export markers of 0 duration to FCP XML

While doing research for a commission that I recently received, I found out that Premiere Pro CS6 does not export markers of 0 duration to FCP XML. This came as a bit of a surprise, and it also turned out to be a major flaw for the software that I am supposed to develop.

I had to find a way to automatically convert all such markers into ones with a specified duration. Fortunately, as I have written many times, Premiere Pro’s project file is itself XML. Of course, as was kindly pointed out to me, it’s pretty complicated in comparison to the exchange standard promoted by Apple, but it is still possible to dabble in it and, if one knows what one is doing, to fix a thing or two.

Marker duration proved to be a relatively uncomplicated fix.

In the project file, markers are wrapped in the <DVAMarker> tag. What sits inside is an object written in JavaScript Object Notation (JSON). I’m not going to elaborate on it here; either you know what it is, or you most likely wouldn’t care. Suffice it to say that a typical 0-duration marker looks like this:

<DVAMarker>{"DVAMarker": {"mMarkerID": "3cd853f0-c855-46de-925c-f89998aade87", "mStartTime": {"ticks": 6238632960000}, "mType": "Comment"}}</DVAMarker>

and a typical 1-frame marker looks like this:

<DVAMarker>{"DVAMarker": {"mComment": "kjhkjhkjhj", "mCuePointType": "Event", "mDuration": {"ticks": 10160640000}, "mMarkerID": "7583ba75-81f5-4ef2-a810-399786f3a75d", "mStartTime": {"ticks": 4049893512000}, "mType": "Comment"}}</DVAMarker>

As you can see, the mDuration property is missing from the 0-duration marker; the 1-frame marker is also labeled as "Event" in the mCuePointType property. It turns out that it is enough to insert the following string:

"mDuration": {"ticks": 10160640000},

right after the second opening curly brace to create a proper 1-frame marker that does get exported. You can do it yourself in your favourite text editor, and the corrected marker will then look like this:

<DVAMarker>{"DVAMarker": {"mDuration": {"ticks": 10160640000}, "mMarkerID": "3cd853f0-c855-46de-925c-f89998aade87", "mStartTime": {"ticks": 6238632960000}, "mType": "Comment"}}</DVAMarker>
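For the impatient, here is a minimal Python sketch of how the batch version of this fix might look. It assumes the project XML has already been extracted to a plain text file (the .prproj is gzip-compressed XML), and the file names are placeholders; this is my illustration, not the tool mentioned below:

import json
import re

# 10160640000 ticks is one frame at 25 fps (Premiere counts time in
# 254016000000 ticks per second); adjust for your frame rate.
ONE_FRAME_TICKS = 10160640000

def add_duration(match):
    marker = json.loads(match.group(1))
    if "mDuration" not in marker["DVAMarker"]:
        # Give the 0-duration marker a 1-frame duration so it survives
        # the FCP XML export.
        marker["DVAMarker"]["mDuration"] = {"ticks": ONE_FRAME_TICKS}
    # json.dumps may order the keys differently than Premiere does,
    # but the order does not matter to a JSON parser.
    return "<DVAMarker>" + json.dumps(marker) + "</DVAMarker>"

with open("project.xml", encoding="utf-8") as f:  # placeholder file name
    xml = f.read()

xml = re.sub(r"<DVAMarker>(\{.*?\})</DVAMarker>", add_duration, xml, flags=re.DOTALL)

with open("project_fixed.xml", "w", encoding="utf-8") as f:
    f.write(xml)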

Granted, it’s a bit tedious to do this by hand for hundreds of markers (as my client’s project required), and unless Adobe decides to fix it in the near future (I have already filed a bug report), or Josh from reTooled.net releases his own tool first, some time at the beginning of next year I might have a piece of software that automatically converts 0-duration markers to 1-frame ones, so that they get exported to FCP XML. I understand that this is a pretty rare problem, but perhaps there are a few of you who could benefit from this solution.

The main bugger? Most likely it will be Windows-only, unless there is specific interest in a Mac version of something like this.