The Day I Dreamt About Has Come


My current day job requires producing about 50-60 minutes of content per month for what roughly qualifies as a mostly unscripted reality show. Within 30 days we have to prepare the final edit, the sound mix (including voice-over and music), motion graphics, any potential VFX shots (mostly cleanups), and finally a rudimentary grade and DVD authoring. Time is precious and short, and sometimes changes have to be introduced a day before delivery, or even on the mastering day.


Adobe Media Encoder Grows Up


Adobe revealed today that Adobe Media Encoder will receive a few very worthwhile updates, including long-awaited GPU acceleration, direct implementation of the Premiere Pro engine, and a number of new features. Two are of special interest to me: the so-called mini-pipe – adding new effects such as an image overlay, timecode burn-in, or a given SpeedGrade look – and the ability to import FCP XML sequences without the need to pass them through Adobe Premiere. Both have potentially important workflow implications in the long run.

Warning about using RGB and Luma Curves in Premiere Pro CC


Update: The 7.1 version due out in October is going to fix the issue altogether. Great job, Adobe!

Update: The 7.0.1 patch for Premiere Pro CC fixes some of the issues mentioned below, although unfortunately not all of them.

To my great chagrin, Premiere Pro CC changed the way curves operate. The curves, both RGB and Luma, now clip superwhites and superblacks, and there is no interpolation once the curve hits 0 or the white level (255 or 1.0). In CS6, the curves followed the general slope beyond these points, and it was possible to recover some of the “overshot” material. Now, if you stick to curves, all clipped data is lost.


Curve interpolation in Premiere Pro CS6 allowed you to recover superwhites or superblacks and correct the contrast in a single pass.


Premiere Pro CC clamps all superwhites and superblacks, and the detail cannot be recovered with RGB or Luma Curves.

This is completely new and unexpected, and if you ever use curves, it changes your workflow dramatically, even if you don’t know it yet.
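To make the difference concrete, here is a toy sketch in C (my own illustration, not Premiere’s actual curve math) comparing CC-style clamping with CS6-style endpoint extrapolation for a gain-lowering curve:

```c
#include <stdio.h>

/* Toy illustration only -- not Premiere's code. Values are normalized
 * floats, where 1.0 is reference white. */
static float curve(float x) { return 0.8f * x; }  /* hypothetical gain-lowering curve */

/* CC 7.0 behavior: input is clamped to [0,1] first, so all superwhites
 * collapse to the same output value. */
static float eval_clamped(float x) {
    if (x < 0.0f) x = 0.0f;
    if (x > 1.0f) x = 1.0f;
    return curve(x);
}

/* CS6 behavior: beyond the endpoints the curve continues along its end
 * slope, so distinct superwhite values remain distinct and recoverable. */
static float eval_extrapolated(float x) {
    const float slope = 0.8f;        /* slope of our toy curve at x = 1 */
    if (x > 1.0f) return curve(1.0f) + slope * (x - 1.0f);
    if (x < 0.0f) return slope * x;  /* same idea at the black end */
    return curve(x);
}

int main(void) {
    const float overshot[] = { 1.0f, 1.1f, 1.2f };  /* superwhite inputs */
    for (int i = 0; i < 3; i++)
        printf("in=%.2f  CC-style=%.3f  CS6-style=%.3f\n",
               overshot[i], eval_clamped(overshot[i]),
               eval_extrapolated(overshot[i]));
    return 0;
}
```

With clamping, every overshot value collapses to the same output (0.800 here); with extrapolation, the gradation above 1.0 survives, and a gain-lowering curve can pull it back into range.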

It means that you must remove the superwhites and superblacks from the clip before you use RGB Curves. If, like me, you used curves to apply the basic correction and contrast in one go, you cannot do that anymore. You first have to make the signal “legal” – reduce the superwhites and raise the superblacks, for example with the Fast Color Corrector – so that the values fit within the range that curves operate on: an RGB scale that is not overshot in either direction, even if you are working in floating point (Max Bit Depth).


In Premiere Pro CC you need to use the Fast Color Corrector or the Three-Way Color Corrector to bring the superwhites back into the RGB scale. Only then can you apply curves and be sure that you are not losing data.

It also means that there is no real backwards compatibility for projects that used curves. Your colors will not be the same if you had any superwhites in the project. I highly advise you to finish your current projects in CS6, and only then create new ones in CC, being mindful of the need to use the Fast Color Corrector before applying curves.

Adobe is aware of this issue, and hopefully a fix will come soon, but while using the CC 7.0.0 version of Premiere you need to keep this very real problem in mind.

Apple’s move to FirePro GPUs in the new Mac Pro


Perhaps one of the biggest surprises during the recent sneak peek of the new Mac Pro was the inclusion of AMD FirePro GPUs. While at first it might look like Apple again showing the middle finger to all CUDA users, and perhaps to Adobe or Blackmagic, “it ain’t necessarily so”.

Personally, I always liked AMD and ATI for their affordability, power efficiency, and sensible performance. Even though they usually lagged a bit behind nVidia and Intel, it was good to have competition that would keep the big players in check. In fact, the Pentium 4 fiasco was the moment when I really hoped that AMD would become the leading player in the CPU game. I have this personal love of underdogs. Alas, it was not meant to be.

The trouble began when Adobe started to use nVidia’s CUDA technology for acceleration, and when the performance of Intel’s new Core i series left AMD far behind. Essentially, if one wanted to use Adobe software, the choice was gone: it was an Intel CPU and some kind of CUDA GPU. Adobe officially pushed heavily for Quadro solutions, which were way overpriced and, in terms of performance, always behind the latest GTX series. Personally, I never bought into the Quadro hype, because the benefits were not there.

Apple, on the other hand, stuck to ATI cards, giving users a very limited choice if they wanted to benefit from CUDA: the 8800 GT, GTX 285, or Quadro 4000 – all terribly outdated or pricey. Now, this could really have been considered the middle finger to Adobe, considering that FCP and Motion are competing products, and we know that Adobe engineers were not happy about the turn of events. Of course, it was also Apple’s way of promoting their own standard – OpenCL – which came about partly as competition to nVidia’s CUDA. So the situation was a bit complicated, especially for Mac users.

Granted, nVidia had been collaborating with Apple, AMD, IBM, and Intel on OpenCL since its inception, as part of the Khronos Group, so OpenCL support soon became standard on nVidia GPUs. Also, while CUDA is proprietary and optimized for the Fermi/Kepler architecture, OpenCL is open and able to utilize any device that supports its extensions. In OpenCL, even CPUs can be put to work using the same code that programs GPUs. Of course, with only 4 to 12 cores in a single CPU, compared to about a thousand in a decent GPU, the CPU’s contribution tends to be negligible, but it is there. On equal hardware OpenCL’s performance lagged until very recently, but last year it bridged the gap, and the two now seem to operate at parity.
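As a minimal sketch of that device flexibility – these are standard OpenCL API calls, with error handling omitted (on a Mac the header is <OpenCL/opencl.h>):

```c
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    char name[256];

    clGetPlatformIDs(1, &platform, NULL);

    /* Ask for a GPU; swap in CL_DEVICE_TYPE_CPU and the very same
     * kernels would run on the host processor's cores instead. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL)
            != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);

    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("Running on: %s\n", name);
    return 0;
}
```

The kernels themselves don’t change – only the device they are dispatched to.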

Besides, Adobe has supported OpenCL in Premiere since CS6. Blackmagic claims that Resolve 10 will also have OpenCL support, and they seem to be pretty happy about it. The end user should not experience any problems, perhaps with the sole exception of the ray-traced 3D engine in After Effects, which requires CUDA for accelerated processing. But this will most likely change in some future version as well.

I should indeed be cheering for OpenCL finally having taken off. After all, it’s superior to CUDA in all but performance, and Adobe users should in fact be happy that a new, open, and less expensive alternative to nVidia/CUDA has been created – especially considering that the hardware is not equal, and the recent AMD W-series GPUs seem to fare pretty well against their Quadro equivalents. I might consider them myself in my future upgrades. My lack of enthusiasm stems perhaps from the fact that I wanted to add GPU acceleration to my plug-ins, and invested a bit in researching the CUDA engine, not OpenCL.

After giving it some thought, I agree with Philip Hodgetts that there is no point in panicking, and that CUDA is a solution that will either go away in the future or be relegated to some obscure niche in certain specialized applications. OpenCL is indeed the future. So at least in this regard, Apple’s bet is absolutely spot on.

And the new Mac Pro? Well… it’s a completely different story.

Label users, beware!

Beware of a nasty bug in CS6:

If you import footage in such a way that a bin or a number of bins is created upon import, and you label your clips on the timeline, you might be in for a nasty surprise at some point.

If you move your files out of the bins, or sometimes even simply select them via the innocent “Reveal in Project” command or so much as click on them, you are going to lose the timeline labels for the affected footage in the given folder. *Poof*, gone. Just like that. Without any possibility of undoing it. If you have not saved your project for some time, you face either losing the labels or losing the changes since your last save, because I tell you, these labels are not coming back!

Now, if you are lucky enough to have a saved project with your labels still applied, don’t breathe a sigh of relief yet, because they are not safe! First, follow these steps to the letter:

  1. Save the broken project under a different name.
  2. If your Premiere window is maximized, click the maximize icon and set the window size as you like, just so that it is not maximized.
  3. Open the old project.

This bug propagates across projects! If you have Premiere open in a maximized window – as most of us usually do – then the moment you open your project, the labels will evaporate as well. But if for some reason your window is not maximized (it can be larger than one screen, but it must not be in the maximized state), you are safe for the time being. The labels will remain on the timeline until you trigger the bug again by selecting or moving your files.

I have no idea why, but having your window non-maximized gives you a chance to save your work. This is the weirdest bug I have ever seen, and it is a nasty one.

The solution is to either import media in such a way that no folder is created, or to move the files to another folder immediately after import. This does nothing to save your existing projects, but it will protect you in the future.

Adobe Anywhere – Currently An Enterprise Solution Only


At NAB we saw a few reveals from Adobe, among them the premiere of Adobe Anywhere. I have speculated extensively on Anywhere in the past, and I was perhaps a bit too optimistic in my assessment of the required hardware and bandwidth, motivated mostly by the hope that we would be able to install it in our small facility as well. Alas, it’s not going to happen.

As of now, Anywhere requires at least 4 servers to run: one collaboration hub and 3 Mercury Streaming Engines. Karl Soule explained that this is the required minimal structure, because the MSE machines also take care of rendering. This hardware should cover the needs of 6-8 editors, and supposedly scales well by adding additional machines. It’s certainly not inexpensive (servers start at $5,000 but most likely reach $15,000 to $20,000 apiece), and the cost is further increased by Windows Server 2008 Enterprise Edition (about $2,300 per license) and by each MSE requiring at least one Tesla K10 processing unit at $3,000 apiece. A back-of-the-envelope total for the minimal setup – four servers, four licenses, three Teslas – easily lands in the $80,000-$100,000 range before software and installation.

I was not mistaken, though, about replacing expensive SAN licenses with something a bit more affordable. The two currently recommended systems (Harmonic MediaGrid and the Isilon X400 series) sport their own file systems, which cover most of the SAN benefits without incurring the overhead. Plus, they work via Ethernet, lowering the price of the backbone architecture even further. However, don’t get your hopes up – these solutions are still pretty expensive, running into hundreds of thousands of dollars.

Obviously, Anywhere is not a plug-and-play solution. It requires tailoring to the specific workflow and infrastructure of one’s facility, and Adobe has its own service people who will install and configure it. Judging by the fact that the cost of software and installation is also not publicly available, it is safe to assume that it ventures into “if you have to ask, you can’t afford it” territory.

My bandwidth estimate was also too optimistic. The suggested pipe for a seamless experience seems to be 25-40 Mbps, which is not insignificant, and in fact might be the biggest limiting factor to the actual spread of Anywhere. While it’s easily achievable locally, it is far beyond standard 3G data rates (about 2 Mbps), requiring LTE or HSPA+ connections that are not always available; it is slightly beyond the Wi-Fi 802.11a and g standards, requiring at least 802.11n with multiple antennas; and it is at the edge of what the most recent ADSL modems can provide (40 Mbps in ideal conditions). So perhaps Bob Zelin’s dream of remote editing will still be limited by the last-mile infrastructure, at least for a time.

In the end, the message is pretty clear: right now Adobe Anywhere is aimed at enterprise players like CNN and large post houses, who can afford the necessary equipment or perhaps can fit it into an already existing hardware structure. Certainly, the benefits are great, but the little folk can only hope that at some point these solutions will trickle down.

NAB Special – KEM roll and a multicam trick in Premiere Pro next


No typical tutorial this week, but a small demonstration of the KEM roll (sequence editing) feature in the next version of Premiere Pro, and a little multicam trick that can help you “unnest” your clips from a sequence. I hope you’ll enjoy it, even if the software is not available yet.

The brighter side of Anywhere


Today another interesting thought about Adobe Anywhere struck me. Essentially, part of the idea of Anywhere is to completely separate the UI from the renderer. The concept in itself is nothing new – all 3D applications use it. However, as far as I know, this is the first time it has been applied to an NLE.

The obvious implication is the possibility of network rendering – using more than a single machine for hefty tasks. Of course, there are a lot of caveats to multi-machine rendering; we already know that not all problems scale well, and sometimes the overhead of distributing processes is higher than the gain in speed. But in general, more means better.

The less obvious implication is the possibility of running various types of background processing, like rendering or caching parts of a sequence that are not currently being worked on, when resources allow. This means quicker final render times. It also makes it possible to run multiple processes at the same time, or even multiple parts of the rendering engine at the same time, allowing for better utilization of the server’s computational power.

However, the most overlooked implication is the fact that the renderer is UI-agnostic. It doesn’t care what kind of client gets connected; it only cares that the communication is correct. And this has potentially huge implications for the future of the applications in Adobe’s video production line.

Pipeline

The Anywhere renderer most likely already consists of several separate modules – After Effects and Premiere, possibly even Lumetri (SpeedGrade) – which can be chained in any sequence. For example, the client sends a request to render a frame at half resolution consisting of the following stack: a P2 clip on V1, a Red One clip with the Ultra keyer on V2, a Dynamic Linked AE clip with lower thirds on V3, and a color correction via Lumetri on top of it all.

Adobe Anywhere has the Premiere renderer open the sequence; it finds the dynamically linked clip overlaid on the source material and asks After Effects to render it, compositing V1 and V2 in the meantime; once AE is done, it composites the results together. Then it sends the whole thing to Lumetri for color correction. Each stage is most likely cached, so that when the color correction or the keyer is changed, AE does not have to re-render its part. Finally the frame is rendered and sent out to the client.
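Purely as an illustration of the idea – this is my speculation about the design, not Adobe’s actual architecture or API – such a renderer boils down to an ordered chain of interchangeable stages:

```c
#include <stdio.h>

/* Speculative sketch only. A "frame" reduced to one pixel, and three
 * dummy stages standing in for the Premiere compositor, After Effects,
 * and Lumetri; per-stage caching is left out for brevity. */
typedef struct { float r, g, b; } Pixel;
typedef Pixel (*RenderStage)(Pixel in);

static Pixel premiere_composite(Pixel p) { p.r += 0.1f; return p; }
static Pixel ae_lower_third(Pixel p)     { p.g += 0.1f; return p; }
static Pixel lumetri_grade(Pixel p)      { p.b += 0.1f; return p; }

int main(void) {
    /* The client "request" is just an ordered list of stages. */
    RenderStage chain[] = { premiere_composite, ae_lower_third, lumetri_grade };
    Pixel frame = { 0.5f, 0.5f, 0.5f };
    for (int i = 0; i < 3; i++)
        frame = chain[i](frame);
    printf("r=%.2f g=%.2f b=%.2f\n", frame.r, frame.g, frame.b);
    return 0;
}
```

Reorder the entries in `chain` and you have a different pipeline; add a cache in front of each stage and you have the behavior described above.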

Running locally

However, it’s all client-server architecture, you might say. What about those of us who use the programs on a single machine? Anywhere will not have any impact here… or will it?

There is nothing stopping Adobe from installing both the server and the client software on the same machine. After all, the communication does not much care about the physical location of client and server; it cares about the channels, and whether the messages are being heard. Perhaps the hardware might be a little problematic, considering that both the UI and the renderer most likely use GPU acceleration. But other than that, all Adobe needs to do is distribute the Adobe Anywhere render engine as part of any local application. It would perhaps be custom-configured for local usage, to streamline some tasks, but it would be the same Anywhere.

And that, dear readers, could be huge.

Implications

Separating the UI and the renderer is a brilliant move. In the long run it allows Adobe to alter the client without rewriting, or even incorporating, the renderer code in the application. For all Anywhere cares, the UI could be as simple as an HTML5 application that sends and receives the proper messages. Need I say more? Let your imagination run wild.

Regular readers can perhaps already see where I am going with this. Newcomers are encouraged to read about my vision of an Adobe conforming tool, and – why not – Stu Maschwitz’s proposition of merging Premiere Pro and After Effects, or my hopes of seeing a Smoke-like ubertool from Adobe. Any such application could access Anywhere’s backend and be optimized to suit specific needs, giving birth to a range of specialized tools. Tools that are easily written, quick to update, perhaps even accessible via mobile devices. And in time they might also work locally, on a single powerful machine. Or on a number of them. Wherever you prefer. Anywhere.

A frame too far…


In late September 1944, Field Marshal B. L. Montgomery, a very bold and talented British commander, led an ambitious offensive whose objective was to force an entry into Germany over the Rhine. He aimed to capture a series of bridges with the help of paratroopers, who would have to defend them until the main forces arrived.

He and Premiere Pro have a few things in common: they are both audacious and tend to overreach. Monty’s boldness and wits won him a few battles, especially during his campaigns in North Africa. In this case, however, his arrogance went a bit too far. Similarly, Premiere Pro has its Arnhem moments.

Premiere has always included the current frame in the in/out timeline selection, but until the latest release, it did not bother me much. CS6 introduced a plethora of new features that made me shift from a mouse-and-keyboard-driven workflow to a more keyboard-oriented one – mostly due to the new trimming interface and the unpredictability of the ripple tool – making the problem more pronounced.


The joys of old

It used to be that the arrow tool (V) allowed me to perform about 80-90% of operations with the mouse in my right hand and my left hand on the keyboard, close to the Ctrl key (that’s Command for you, Mac people). If I wanted to trim, the arrow tool would intelligently turn into the trim cursor as it approached an edit point. If I needed a ripple trim, I would press Ctrl, and I would always get the ripple trim tool for this operation; let go of the modifier, and I was back with the arrow. If I wanted to adjust audio levels, the arrow tool would let me raise and lower the value in the timeline, while Ctrl would add keyframes and allow me to manipulate them. The only actual tools I used were rolling trim (N), slip (Y), and slide (U). Rarely rate stretch (X), as Premiere’s handling of speed changes for interlaced footage is pretty uninspiring, and from time to time the track selection tool (A). I don’t remember ever needing the tool palette, and found myself constantly switching it off to save some screen real estate.

Easy and fast. Combine that with a few shortcuts to add default transitions, and it turns out that using mouse and keyboard was the most efficient way to go. The simplicity, ease, and flexibility of timeline manipulation in Premiere was amazing. For anyone using this method, opening legacy Final Cut Pro was sometimes pretty annoying. And Avid, especially before MC5? Don’t even get me started…

The mixed bag of the new

Then comes Premiere CS6, with its ability to select edit points and its improved trimming. And suddenly this old workflow seems less and less viable. The hot zones for edit-point selection are pretty wide. One has to be careful not to accidentally click on an edit point, because then the trimming mode is activated, and Ctrl no longer acts in a predictable manner, giving you the ripple trim as you’d expect. It changes its behavior based on what is selected, and in general makes manipulating the timeline with a mouse much less efficient.

It’s understandable, then, that I found myself drifting more towards the keyboard-oriented workflow – using trimming mode (T), setting in (I) and out (O) points on the timeline, and finally learning the keyboard shortcuts for lift (;) and extract (apostrophe) – something I never needed before, because ripple delete, the razor tool (C), and add edit (Ctrl+K, remapped to Z) were simply quicker. I even started to enjoy the new way of doing things.

And all would be fine and dandy, were it not for the already mentioned fact that Premiere marks the currently displayed frame as part of the selection. This means that if you position your playhead on an edit with the nicely defined shortcut keys (up and down arrows in my case) and press O to mark the out point, you will also include a single frame after the cut.

This is a bit problematic.
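To see the off-by-one in numbers, here is a toy illustration (hypothetical frame numbers, not any Premiere API):

```c
#include <stdio.h>

/* Hypothetical timeline: the outgoing clip ends on frame 99, and the
 * incoming clip starts on frame 100, where the playhead is parked. */
int main(void) {
    int in  = 50;     /* in point earlier in the outgoing clip */
    int out = 100;    /* O pressed with the playhead on the edit point */

    /* Premiere's out point is inclusive, so the selection spans in..out
     * and picks up frame 100 -- the first frame after the cut. */
    printf("selected frames: %d (one too many)\n", out - in + 1);

    /* To select only the outgoing clip's material, you must first step
     * back one frame so the out point lands on frame 99. */
    printf("intended frames: %d\n", (out - 1) - in + 1);
    return 0;
}
```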

I admit I had seen it before – this has been Premiere’s standard behavior from the beginning – but because I hardly ever used in and out points in the timeline, it did not bother me much. However, once selections became the core of my workflow, I found it terribly annoying, and it slowed down my work. When I do any of the following operations, I need to constantly remind myself to go back one frame to avoid including unwanted material:

  • lift and/or extract,
  • overlay edits with in/out in the timeline,
  • exporting based on the in/out selection.

I enjoy editing in CS6 a lot, but this “feature” literally keeps me up at night. It’s such a basic thing that even Avid got it right: when the playhead is positioned on an edit point, the out point is set to the last frame of the outgoing clip.

Why then does Premiere behave like Montgomery and go one frame too far? The British Field Marshal bit off more than he could chew, and in the end he had to withdraw. Every time I have to go back a frame, I feel like I’m losing a battle. Why?

Not one frame back, I say!

Best practices of applying 8-bit effects in Premiere Pro


Summary: Always apply your 8-bit effects as the last ones in the pipeline. 

A few years ago Karl Soule wrote a short explanation of how the 8-bit and 32-bit modes work in Premiere Pro. It’s an excellent overview, although a bit convoluted for my taste (says who), and it does not sufficiently answer the question of when to use or not to use 8-bit effects, and what the gains and losses of introducing them into the pipeline are. Shying away from an Unsharp Mask is not necessarily an ideal solution in the long run. Therefore I decided to run a few tests of my own. I created a simple project file, which you can download and peruse (8-bit vs 32-bit Project file).

In essence, the 32-bit mode affects two issues:

  1. Banding.
  2. Dynamic range and clipping.

To test them, in a Full HD sequence with Max Bit Depth enabled, I created a black video and, with the Ramp plugin, a horizontal gradient from black to white, to see how processing would affect the smoothness and clipping of the footage.

Settings for a simple horizontal ramp to test it.

Next I applied the following effects:

  1. Fast Color Corrector with Input White reduced to 215 (clipping the whites).
  2. RGB Curves with a master curve that roughly mirrors gamma 2.2.
  3. RGB Curves with a master curve that is roughly the inverse of the previous one.
  4. Fast Color Corrector with Output White reduced to 215 (bringing white back into the legal range).
  5. Offset – an 8-bit filter that does nothing except break the 32-bit pipeline.

Settings for the first Fast Color Corrector.

Master curve in the first RGB Curves effect.

Master curve in the second RGB Curves effect.

Settings for the second Fast Color Corrector.

To assess the results, I advise opening a large Reference Monitor window in YC Waveform mode. Looking at the Program Monitor will not always be the best way to spot problems in the video. You should see a diagonal line running through the whole scope, like this (note that the resolution of Premiere’s scopes is pretty low, by the way):

This is how the ramp looks without any effects applied.

Now perform the following operations, and observe the results:

  1. Turn off effect number 5 (Offset), and notice a bit more smoothness in the curve; it’s rather subtle, though, and hardly likely to influence your final delivery.
  2. Turn Offset back on, and move it before the second FCC. You should immediately see clipping in the highlights. This is the result of the 8-bit filter clamping values at 255,255,255 (white) and discarding all the super-white information that the first FCC introduced.
  3. Move Offset between the RGB Curves (between 2 and 3). Apart from the clipping, you will also notice banding and posterization, both in the Program Monitor and in the waveform as little flat steps. This is the result of the 8-bit filter rounding values to the nearest integer (123.4 becomes 123), leaving later modifications less data to work with.
  4. Move Offset before the first FCC. Most likely you will not see anything very different from when the Offset filter was applied last. However, if you had footage with super-white values, they would be clipped and irrecoverable. Similarly, 10-bit footage would be converted to 8-bit, and all the latitude and additional information would be lost as well.
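If you want to play with the same idea outside of Premiere, here is a minimal simulation of the experiment (my own sketch – plain gain standing in for the FCC white adjustments and a 2.2 gamma for the curves; not Premiere’s actual math):

```c
#include <stdio.h>
#include <math.h>   /* compile with -lm */

/* Four float ("32-bit") stages mimicking the effect stack above. */
static float fcc_in_white_215(float x)  { return x * 255.0f / 215.0f; }  /* pushes whites past 1.0 */
static float curve_up(float x)          { return powf(x, 1.0f / 2.2f); } /* rough gamma 2.2 curve  */
static float curve_down(float x)        { return powf(x, 2.2f); }        /* its rough inverse      */
static float fcc_out_white_215(float x) { return x * 215.0f / 255.0f; }  /* brings whites back     */

/* The "Offset" stand-in: an 8-bit step that clamps to [0,1] and rounds
 * to 1/255 increments, discarding everything a float pipeline keeps. */
static float quantize8(float x) {
    if (x < 0.0f) x = 0.0f;
    if (x > 1.0f) x = 1.0f;
    return roundf(x * 255.0f) / 255.0f;
}

int main(void) {
    const float x = 0.95f;   /* a bright ramp value, legal before stage 1 */

    /* 8-bit step last: the superwhite survives the float chain and is
     * brought back under 1.0 before quantization -- detail preserved. */
    float last = quantize8(
        fcc_out_white_215(curve_down(curve_up(fcc_in_white_215(x)))));

    /* 8-bit step mid-chain: the quantizer clips the superwhite at 1.0,
     * and the final FCC cannot bring the lost detail back. */
    float mid = fcc_out_white_215(
        curve_down(quantize8(curve_up(fcc_in_white_215(x)))));

    printf("quantize last:      %.4f\n", last);   /* ~0.9490 */
    printf("quantize mid-chain: %.4f\n", mid);    /* ~0.8431, clipped */
    return 0;
}
```

In the mid-chain case, every input at or above 215/255 collapses to the same value – that is the flat, clipped region the waveform shows.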

The effect of the first Fast Color Corrector on the ramp. Notice the clipping.

The effect of the second Fast Color Corrector on the ramp. Notice the lack of clipping.

The effect of the first RGB Curves on the ramp.

The effect of the second RGB Curves on the ramp.

All four 32-bit effects on.

All filters on with Offset in the last place. Notice a bit less smoothness in scopes.

All filters on with Offset before the second FCC. The 8-bit filter clips the highlights and makes the data irrecoverable.

All filters on with Offset between the RGB Curves. Notice the irregularity in the curve, showing rounding errors and compounding problems.

All filters on with Offset after the first FCC. The curve is not as irregular, but the clipping is still present.

All filters on with Offset in the first place. Notice the similarity to Offset in the last place. The 32-bit pipeline is not broken afterwards, although the footage is converted to 8-bit right at the start.

This simple experiment allows us to establish the following best practices for applying 8-bit effects in Premiere Pro:

  1. Try to apply 8-bit effects last in the chain, especially with footage that is 10-bit or more.
  2. Always perform your technical grade and make your image legal (i.e. not clipping) before you apply any 8-bit effect in the pipeline.
  3. Ideally, do all your color correction before any 8-bit effect is applied.
  4. Avoid heavy color correction or image processing after an 8-bit effect has been inserted, to avoid banding and posterization, especially with 10-bit or RAW footage.

And that’s it. I hope this sheds some light on the mysteries of Premiere Pro’s 32-bit pipeline, and that your footage will always look great from now on.