CODECS DON'T NEED TO BE HARD.
NO, REALLY, THEY DON'T.
By the end of this article, you will be able to pick the best codec for you on each project. My goal is to empower you to make your own informed decisions about codecs, instead of relying on what worked for someone else.

I'm going to walk you through every step in the process of making a video. Click on a heading to jump to that section. I'll cover:

The codec you shoot
The codec you edit
The codec you color-correct
The codec you send to VFX
The codec you export
The codec you archive

At each stage, I'll explain which factors you should be considering as you choose a codec, and I'll give you some examples of the most commonly-used codecs for that stage.

Along the way, we'll cover why low-end codecs and high-end codecs can each slow down your editing, the reasons for a proxy/offline edit, a real-world project walkthrough, some storage-saving strategies, and an explanation of why transcoding cannot improve your image quality.

The benefits of optimizing your codecs can be huge. The right codec will preserve your images in the highest quality, help you work faster, and enable you to take the best advantage of your computer and storage. You'll be able to work faster on a laptop than many can on a high-end tower.
01. WHAT A CODEC DOES
A codec is a method for making video files smaller, usually by carefully throwing away data that we probably don't really need, and codecs are pretty smart about how they do that. A few years ago, I created a video that covers the main compression techniques that many codecs use. It's not required viewing to understand this article, but it certainly won't hurt.
How Codecs Work – Tutorial.
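One of the techniques covered in that video, chroma subsampling, is easy to sketch. This toy Python snippet (not real codec math) keeps one color sample per 2x2 block, the way 4:2:0 subsampling does, and counts how much chroma data survives:

```python
# Toy illustration of one common compression trick: chroma subsampling.
# A real codec works on Y'CbCr image planes; here we just thin out a
# grid of "color" samples 2x2 (as 4:2:0 does) and count what's left.

def subsample_420(chroma):
    """Keep one chroma sample per 2x2 block (4:2:0-style)."""
    return [row[::2] for row in chroma[::2]]

# An 8x8 grid of chroma samples (64 values).
chroma = [[(x + y) % 256 for x in range(8)] for y in range(8)]
reduced = subsample_420(chroma)

full_count = sum(len(r) for r in chroma)   # 64 samples
kept_count = sum(len(r) for r in reduced)  # 16 samples
print(kept_count / full_count)             # 0.25, i.e. 75% of chroma discarded
```

Three quarters of the color information is gone, yet (as the article explains below) your eye barely notices, because it is far more sensitive to brightness than to color.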
Lossiness

One of the columns in the table is "lossiness," which is an important concept with codecs. When I'm talking about lossiness, I don't necessarily mean what your eye sees. I mean the amount of data that is retained by the codec, only some of which you can see. The question is: if I had an uncompressed image, and then I compressed it with this codec, how similar would the new image be to the old image? How much information is lost in the transcode? If the two images are very similar, then the codec is not very lossy, and if they're pretty different, then it's more lossy.
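You can put a rough number on lossiness by round-tripping data through a lossy step and measuring how far the result drifts from the original. In this toy sketch, coarse quantization stands in for a codec:

```python
# A rough way to quantify "lossiness": compress, decompress, and
# measure the drift from the original. Coarse quantization is a
# stand-in for a codec here; real codecs are far more sophisticated.

def quantize(samples, step):
    """Lossy round-trip: snap each value to the nearest multiple of step."""
    return [round(s / step) * step for s in samples]

def mean_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

original = list(range(0, 256, 3))      # fake 8-bit pixel values
mild     = quantize(original, 4)       # "not very lossy"
harsh    = quantize(original, 32)      # "more lossy"

print(mean_abs_error(original, mild))  # small drift
print(mean_abs_error(original, harsh)) # much larger drift
```

The harsher the compression, the further the round-tripped values sit from the originals, even if both versions look fine at a glance.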
[Image comparison: the same frame compressed with h.264 and with DNxHD]
They all look pretty much the same, don’t they? The
visual quality is just about the same, and the H.264
file is a fraction of the size of the DNxHD file. This
is why it’s the recommended setting for YouTube. It
looks just about as good to the eye, and the file is
much easier to upload to the internet.
The trouble with the H.264 version, however, comes when you try to make changes to the image. What if
you wanted to increase the exposure?
[Image comparison: the same frame with exposure raised, h.264 vs. DNxHD]
Now we can see where the highly-compressed image falls apart. Her hair and shirt look terrible in the
h.264 image, and the buildings by the river look all mushy. The DNxHD still looks great, though.
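A toy model shows why the damage only appears once you push the image around: the errors the codec hid in the shadows get multiplied right along with the exposure. (Quantization stands in for heavy compression here.)

```python
# Why the compressed image "falls apart" when you raise exposure:
# the small errors compression hid in the dark values get scaled up
# along with the image itself.

def quantize(samples, step):
    return [round(s / step) * step for s in samples]

def max_error(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

shadows    = [10, 13, 17, 22, 28, 35]   # dark pixel values on an 8-bit scale
compressed = quantize(shadows, 8)       # heavy compression of the darks

err_before = max_error(shadows, compressed)

# Brighten both by the same amount (multiply by 4): the hidden error
# is multiplied too, so it is now 4x larger and plainly visible.
err_after = max_error([s * 4 for s in shadows], [c * 4 for c in compressed])

print(err_before, err_after)
```

Before the exposure change, the error sat below the visible threshold; after it, the same error is four times larger, which is exactly the mushy, banded look in the boosted h.264 frame.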
02. THE CODEC JOURNEY
03. THE CODEC YOU SHOOT WITH
Cost
The first consideration is obviously cost. Generally speaking, the more expensive the camera, the higher quality codecs are available on it. I say generally because there are some "sweet spot" cameras that can offer excellent codecs at a reasonable price. Panasonic's GH series (especially in the early days when the GH2 was hacked) was known for offering better codecs than the other cameras in its price range.

TIP: BETTER CODECS WITH EXTERNAL RECORDERS

One way that people (myself included) have found to capture higher-quality codecs on cheaper cameras is to use an external recorder. The recorder takes the signal from the camera, via HDMI or SDI, and compresses it separately. So you end up with two copies of your footage – one copy heavily compressed on the camera, and a second copy lightly compressed on the external recorder. The key thing here is that the camera sends the signal out to the recorder before compressing it.

One important note here is that many cheaper cameras only output 8-bit, and often not in 4:4:4. An external recorder might be able to compress to a 12-bit codec, but if the camera is only sending 8 bits, the recorder can only record 8 bits. Some cheaper cameras may also not output a "clean" HDMI signal that is suitable for recording. We call an output signal "clean" when it's just the pure image with no camera interface overlays.
Storage
The second factor to consider is storage space. High-quality codecs tend
to be higher bitrate, which means that the files are larger. You need to be
prepared to store and back up all of those files as you're shooting, and
you may also have to upgrade your memory cards in order to be able to
record the high-bitrate data. If you're shooting solo, then you may end up
choosing a lower-quality codec because it allows you to change memory
cards less often and focus on the story instead.
Finishing

Another factor to consider is how much color-correction and VFX
(collectively referred to as finishing) you plan to do. If you're going to be
doing very minimal color-correction and no VFX, then you can probably
get away with the lower bit-depth, chroma subsampling, and macroblocking
that come with lower-quality capture codecs.
Editing Hardware
The last factor to consider is your editing machine, because
most capture codecs are not well suited to editing without
a high-performance computer. H.264 and some raw files
require a powerful CPU/GPU to edit smoothly, and very-
high-bitrate codecs may require high-speed hard drives
or data servers. Unless you happen to be shooting an
edit-friendly codec, you may have to transcode your files
to another codec before editing, which can take time. For
most people, transcoding the footage isn’t a huge issue
because it can be done overnight or on a spare computer.
If you’re working on very tight turn-around times, however,
you may choose a codec that will allow you to start editing
immediately after a shoot, even if that means a higher cost
or a sacrifice in image quality. I explain which codecs are
best for editing in the next section.
04. THE CODEC YOU EDIT WITH
Alright, you’ve shot your film, and you’ve got all of your files onto your computer. Now you need to decide
whether you’re going to edit with these files, or whether you want to transcode into another format.
Most lower to mid-range cameras record with codecs that use temporal compression, also known as long-GOP compression. I will give you a simple explanation here, but if you're interested in learning about it in more detail, check out my codecs video, starting at 19:00.

The simple explanation of long-GOP compression is that, for each frame, the codec only captures what has changed between this frame and the previous frame. If the video doesn't include a lot of motion, then this means that the new file can be a LOT smaller than the original. The difference between this frame and the last frame is just a few pixels, so all you need to store is a few pixels. That's great!
[Diagram: the full frames as decoded vs. the few changed pixels actually stored in the file]
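The stored-diffs idea fits in a few lines of Python. This toy "codec" keeps the first frame whole (like an I-frame) and then stores only the pixels that changed, which is a tiny amount of data when there is little motion:

```python
# Toy long-GOP storage: keep the first frame whole (an I-frame), then
# store only the pixels that changed for each following frame.

def encode_gop(frames):
    keyframe = frames[0]
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        # Record just {pixel index: new value} for changed pixels.
        diffs.append({i: v for i, (p, v) in enumerate(zip(prev, cur)) if p != v})
    return keyframe, diffs

def decode_gop(keyframe, diffs):
    frames = [list(keyframe)]
    for diff in diffs:
        frame = list(frames[-1])       # start from the previous frame...
        for i, v in diff.items():
            frame[i] = v               # ...and apply only the changes
        frames.append(frame)
    return frames

# Three 16-pixel frames with very little motion.
frames = [[0] * 16, [0] * 15 + [9], [0] * 14 + [9, 9]]
key, diffs = encode_gop(frames)

print([len(d) for d in diffs])   # [1, 1]: one changed pixel per frame
assert decode_gop(key, diffs) == frames
```

Notice that decoding any frame requires replaying every diff since the last keyframe, which is exactly why jumping around or playing backward (as the next section describes) is so much more expensive than playing forward.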
The issue, however, is that these codecs tend only to work well when played forward. (If you're curious why, take a look at the video.) That's great for viewing on YouTube and your DVD player, but it's not great for editing, because when you're editing you're often jumping around, or playing a clip backward. It takes a lot more processing power to do those things quickly with a long-GOP codec. A high-end computer might have no trouble, but even a mid-range computer will lag and stutter when you skim through the footage quickly or jump around.

Codecs that aren't long-GOP (a.k.a. intra-frame codecs), however, can play backwards just as easily as forwards, and even a mid-range computer can skip around very smoothly. If you've only ever edited clips straight from the camera, you might not realize what you're missing!

The other thing that can cause issues with playback is raw video. Raw video needs to be converted before it can be displayed (sort of like a codec does), and some computers can't decode the raw file fast enough, especially if it's 4K. Ironically, both the low-end cameras and the highest-end cameras produce files that are hard to edit!

High-Bitrate Codecs Can Slow Down Your Editing

For low to mid-range codecs, you don't have to worry about the bitrates at all. Once you start moving up the ladder, however, high-bitrate codecs can cause issues with editing, especially if you're working on everyday computers. The reason is that your computer needs to be able to read the data from your hard drive at a bitrate that is at least as high as your codec's bitrate. It makes sense — if your codec is 50Mb/s (fifty megabits per second), then your computer needs to be able to read that file from your hard drive at 50Mb/s or else it'll fall behind and stutter.

(Note that Mb/s stands for megabits per second, while MB/s stands for megabytes per second. There are eight bits in a byte, so you need to multiply by 8 when converting from MB/s to Mb/s.)

The good news is that hard drives are getting faster every day, so 50Mb/s is never going to cause any problems. But what if you're editing ProRes 422 HQ at 4K, which is 734Mb/s? The average external hard drive is only just barely fast enough to play that back, and some cheaper hard drives won't manage it. And then, what if you're editing a multicam with three cameras? Suddenly you need 3x that data rate: 2,202Mb/s! At that point, you're going to need to invest in some high-performance hard drives or RAIDs.

Here are some rough guidelines for common data storage speeds, though of course there will always be certain models that underperform or overperform.

Standard spinning drive: 100-120MB/s
Professional spinning drive: 150-200MB/s
Standard SSD: 400-500MB/s
Low-end RAID: 200-300MB/s
High-end RAID: 1000-2000MB/s
Proxy Edit
If you’re going to transcode the native camera The proxy workflow is so common that many high-
files before you edit them, then you’ll use an end cameras record a high-end raw file *and* a
“intermediate” codec. It’s called intermediate ProRes or DNxHD proxy file at the same time. After
because it comes between the capture codec and the shoot, the raw files are backed up and put in
the export codec. There are two common ways of storage, while the proxy files are sent off to the
working with intermediate codecs: editors and to the director/producers for dailies.
The first is the “proxy” workflow or “offline edit.” When choosing a proxy codec, you want to go for
This means that you are transcoding your captured one that does not use temporal compression (aka
footage into an intermediate format, editing with inter-frame compression or long-GOP compression),
that format, and then re-linking back to the original and you want to pick one that has a lower bitrate.
camera files before exporting. Because you will use The low bitrate means that the files are much
the camera files to export and not the proxy files, smaller, so you can use fewer/smaller/cheaper hard
you don’t need to worry so much about picking drives, simplifying your workflow. Woot!
a proxy codec with great image quality – lossy
codecs are fine. You can optimize for editing speed While the proxy files are great for editing, you
and storage convenience instead. shouldn’t do more than basic color-correction with
28
back to index
proxy files. If you are going to do all of your color- Some good choices for proxy codecs
correction inside of your editing software, then it’s
best to re-link back to your camera files because By far the most common proxy codecs are DNxHD/
your proxy files may have lower color quality. DNxHR and ProRes. They have both been around
for years, so they’re very widely supported.
The good news is that most editing software today Everyone knows how to handle them. They are
can switch between the camera files and the proxy both very well suited to a proxy workflow (ProRes
files in just a couple clicks, so you can even go back even has a preset called “proxy”), and are nearly
and forth if you need to. interchangeable when used for proxies.
We’ve published detailed guides for proxy Since DNxHD is made by Avid, and ProRes is
workflows in each of the major NLEs: made by Apple, it makes sense that DNxHD
would work better on Media Composer and
Final Cut Pro X Proxy Workflows
ProRes would work better on Final Cut Pro X.
Premiere Pro Proxy Workflows That used to certainly be true, but nowadays
both codecs work very smoothly on all modern
Avid Media Composer Proxy Workflows
editors (including Premiere Pro). There may be a
slight speed increase in using the codec that was
designed for the system, but it’s very slight.
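If you need to generate proxies outside your NLE, the free ffmpeg tool can write DNxHR on any platform, which sidesteps the difficulty of creating ProRes on a PC. This Python sketch just assembles a standard ffmpeg command; treat the exact flags as a starting point and check them against the ffmpeg documentation for your own footage:

```python
# A sketch of a proxy transcode via ffmpeg. The flags below are standard
# ffmpeg options for its DNxHR encoder, but verify them against the
# ffmpeg docs and your footage before relying on this.

import subprocess

def dnxhr_proxy_cmd(src, dst):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd", "-profile:v", "dnxhr_lb",  # DNxHR LB: low-bandwidth proxy flavor
        "-pix_fmt", "yuv422p",                      # DNxHR expects 4:2:2 input
        "-c:a", "pcm_s16le",                        # uncompressed audio, usual for MXF
        dst,
    ]

# Hypothetical file names, for illustration only.
cmd = dnxhr_proxy_cmd("A001_clip.mp4", "A001_clip_proxy.mxf")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```

The same tool can batch an entire card of footage from a shell loop, which is how the "transcode overnight" step mentioned earlier is usually done in practice.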
The only significant difference between the two for a proxy workflow is the fact that you may have trouble creating ProRes on a PC, while DNxHD is very easy to create cross-platform. The only officially-supported way to create ProRes on a PC is with Assimilate Scratch. There are some other unsupported methods for creating ProRes files on a PC, but they're not always reliable. PCs can easily play back and edit ProRes files, but you can't encode new ProRes files on a PC as easily as DNxHD, and so some editors prefer a DNxHD workflow for that reason.

Regardless of which of the two codecs you pick, you also have to pick which flavor you want. This is really going to depend on your storage constraints – it's a tradeoff between image quality and file size. The good news is that you don't need tip-top image quality when you're editing, so you can choose a low-bitrate codec.

Start off with the smallest ProRes or DNx codec in the same resolution as your capture codec. Look at the GB/hr column and multiply it by the number of hours of footage you have. If you have enough storage space, then you're good – use that codec. If you have lots of extra storage space, think about using the next largest flavor.

If you don't have enough storage space, or if you're on an underpowered machine, then take the resolution down a notch. A lot of huge-budget Hollywood films were edited in 480p just a few years ago, so don't sweat it if you need to lower your resolution from 4K down to 720p for the edit.
Direct Intermediate
The other type of intermediate workflow is something that I'm calling "Direct Intermediate." This means that you transcode your camera files into a codec that is both good for editing and very high-quality (not very lossy). Because the codec is very high quality, almost all of the original information from the camera files has been preserved, and so it's not necessary to re-link back to the camera files – you can just export directly from the intermediate files. There will be some theoretical loss of information when you transcode, but if you pick a good enough intermediate codec, it'll be small enough that you don't need to worry about it.

(Note: I'm calling this process "Direct Intermediate" because there isn't a common name for this workflow. People usually just call this "intermediate," but that can be confusing because proxy workflows are also a kind of intermediate workflow. Some people will also call this an "online" workflow, but that is also confusing, because that term was created to describe a workflow that includes an offline and an online edit, not a workflow that's online from start to finish.)

The key to picking a good Direct Intermediate codec is to make sure that you are preserving all of the information from your capture codec. An intermediate codec will never make your images better (more detailed explanation below), but it can definitely make them worse if you choose the wrong codec. The important thing is to understand the details of your original footage and make sure that your intermediate codec is at least as good as your capture codec in each area. If you capture your footage on a DSLR like a Sony A7Sii at 4K, then you will be recording in a 4:2:0, 8-bit, long-GOP codec at 100Mbps. You want an intermediate codec that is at least 4:2:0 and 8-bit. Going beyond these values (e.g. to 4:4:4 and 12-bit) won't hurt, but it also won't help at all, so it's probably not worth the extra storage space.

Let's say, for example, that we want to go with a ProRes codec. We have 4 options to choose from that are 4:2:2 and 10-bit:

145Mb/s ProRes 422 Proxy
328Mb/s ProRes 422 LT
471Mb/s ProRes 422
707Mb/s ProRes 422 HQ

You might think that all you need is to match the camera bitrate (100Mbps), but you actually need to greatly exceed the camera bitrate. This is because h.264 is a much more efficient codec than ProRes. Because h.264 uses long-GOP compression, it can pack a lot more information into those 100 megabits than ProRes can. In order for ProRes to match the image quality of h.264, you need a much higher bitrate. I would recommend only using ProRes 422 or ProRes 422 HQ if you're starting with a 100Mbps h.264 codec. ProRes 422 will probably do just fine, but if you have lots of storage space, then going up to ProRes 422 HQ will have a slight edge.

While it's fine to simply match the bit-depth and color sampling when choosing an intermediate, you should always increase the bitrate at least a little. If you're going from a long-GOP codec to a non-long-GOP codec, then you should increase the bitrate a lot.

Side note: If you wanted to go with DNxHD instead of ProRes, you have similar options, except that DNxHD also offers an 8-bit version for the lower-end codecs. Since our footage is 8-bit to start with, that won't hurt us at all.
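That "at least as good in each area" rule can be written down directly. Here is a sketch with illustrative spec tables; the bitrate cushion (4x for long-GOP sources, 2x otherwise) is my own rough rule of thumb, not an official formula:

```python
# A sketch of the "at least as good in each area" rule for choosing a
# Direct Intermediate codec: compare bit depth, chroma sampling, and
# bitrate against the capture codec. Spec values are illustrative.

CHROMA_RANK = {"4:2:0": 0, "4:2:2": 1, "4:4:4": 2}

def good_enough(capture, intermediate, long_gop_source=True):
    """True if the intermediate codec shouldn't degrade the capture codec."""
    # Long-GOP sources pack more quality per bit, so demand a big
    # bitrate cushion; otherwise a modest one. (4x/2x are rough guesses.)
    cushion = 4 if long_gop_source else 2
    return (intermediate["bits"] >= capture["bits"]
            and CHROMA_RANK[intermediate["chroma"]] >= CHROMA_RANK[capture["chroma"]]
            and intermediate["mbit_s"] >= capture["mbit_s"] * cushion)

a7sii  = {"bits": 8,  "chroma": "4:2:0", "mbit_s": 100}   # 4K XAVC-S capture
prores = {"bits": 10, "chroma": "4:2:2", "mbit_s": 471}   # ProRes 422 at 4K
proxy  = {"bits": 10, "chroma": "4:2:2", "mbit_s": 145}   # ProRes 422 Proxy

print(good_enough(a7sii, prores))  # True: plenty of headroom
print(good_enough(a7sii, proxy))   # False: bitrate cushion too small
```

This mirrors the recommendation above: ProRes 422 comfortably clears the bar for 100Mbps h.264 source footage, while ProRes 422 Proxy would only be acceptable in a proxy workflow where you re-link before export.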
THE PROXY WORKFLOW SOUNDED PRETTY GOOD. WHY DO THE DIRECT INTERMEDIATE?

Part of the reason why the Direct Intermediate workflow is common is because it used to be a lot harder to use a proxy workflow. Some of the major software providers didn't make it particularly easy to relink back to the original camera files, and so people would choose a Direct Intermediate workflow. Nowadays, however, it's pretty easy to do in any editing package. The main exception is when you have a lot of mixed footage types. If you have multiple frame rates and frame sizes in the same project, switching back and forth from the proxies to the capture codecs can be a headache.

If you are using some third-party tools to help prep and organize your footage before you start cutting, those can also make the relinking process more tricky. One common example might be software that automatically syncs audio tracks or multicam shoots.

Another reason why you might want to use a Direct Intermediate workflow is because you can move right on to the color-correction and VFX ("finishing") process without swapping around any files. Keep reading, and I'll explain more about why that's convenient in the Color-Correction and VFX sections.

One downside, however, is that you can't "bake in" the LUTs for your editor – you're going to need to apply a LUT via a color-correction effect in your editing software. If you were to include the LUT in your transcode for a Direct Intermediate workflow, you would be losing all of the benefits of recording in log in the first place.

The other obvious downside is that you need to store all of these (much larger) files.
This is very important, because it is very commonly misunderstood, and there is a lot of misinformation online. Transcoding your footage before you edit will never increase the quality of the output. There are some extra operations that you could do in the transcode process (such as doing an up-res) that could increase the image quality, but a new codec by itself will never increase the quality of your image.

This is a photo of a rose reflected in a water droplet. It's 4 megapixels, and it looks pretty nice on my 27-inch monitor.
Now suppose I take a photo of that photo with one of the best cameras available today. The Red Helium setup costs about $50,000, it's 35 megapixels, it's raw, and it has one of the best camera sensors ever produced.

Which will be a better image – the 4 megapixel photo, or the 35 megapixel photo?

No matter how good the new camera is, my fancy new image will never be better than the first one. I have a file that is technically higher-resolution, but it does not capture any more of my subject (the rose) than the first one did.

This is what you're doing when you're transcoding. You are making a copy of a copy, taking a photo of a photo. If you use a fancy high resolution camera to take a photo of a photo, you will be able to preserve pretty much all of the information in the original image, but you won't be able to add anything more.
A Real-World Example

Let's say you're editing a documentary that captured 4K footage using a Sony A7Sii camera, recording in the long-GOP version of XAVC-S. Not ideal for editing. If they shot 40 hours of footage for your feature-length documentary, you'd end up with about 2.7TB of camera files, which can fit on one hard drive easily (though you've made other, separate backups, of course!).

You could convert that to a high-quality, not-very-lossy codec for a Direct Intermediate workflow, maybe ProRes 422 HQ in 4K.

So you might decide to use a Proxy workflow instead and transcode your files to the ProRes 422 Proxy 4K format. Then your footage would only take up 2.8TB, just barely more than your captured footage. You can then easily edit off of a single hard drive, and your workflow gets a lot simpler. (For instructions on how to calculate bitrates and file sizes, check out this article: The Simple Formula to Calculate Video Bitrates.)
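The arithmetic behind those numbers is just the bitrate formula from that article: divide Mb/s by 8 to get MB/s, then multiply out the hours. Using the approximate ProRes bitrates quoted earlier (published figures vary a little with frame rate, so treat these as estimates):

```python
# Rough storage math: GB/hour follows directly from the codec bitrate.
# Mb/s -> divide by 8 for MB/s, x3600 seconds for an hour, /1000 for GB.

def terabytes(mbit_s, hours):
    gb_per_hour = mbit_s / 8 * 3600 / 1000
    return gb_per_hour * hours / 1000

print(round(terabytes(145, 40), 1))  # ProRes 422 Proxy 4K, 40 hrs: ~2.6 TB
print(round(terabytes(707, 40), 1))  # ProRes 422 HQ 4K, 40 hrs: ~12.7 TB
```

The Proxy estimate lands close to the ~2.8TB figure above (real files carry audio and container overhead on top of the video bitrate), and the HQ estimate shows why the Direct Intermediate option demands so much more storage here.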
05. THE CODEC YOU COLOR-CORRECT
Imagine you’ve finished your proxy edit – you And now we find another good reason for a
consolidate and transcode, send it off to your Direct Intermediate edit. If you are going to do
colorist, and then decide that you need to make some of your color work and your editing work
some changes to the edit. Now you’ve got go simultaneously, or at least are going to go back
back to the proxies to make the edit and then and forth a couple times, then it can be simpler
re-consolidate and re-send the footage. The to use one codec for both. This is especially
mechanics of that can get pretty messy. In a high- convenient if you are doing your editing and
end post-production workflow, there is usually a finishing in the same software package (or set of
“lock” on the edit so that the finishing processes packages, e.g. Creative Cloud).
can start. This means that (unless bad things
happen) you will try very hard not go back and
make changes to the edit. But hey, bad things
happen, so it’s best to be prepared.
06. THE CODEC YOU SEND TO VFX
Go big or go home
In the VFX process, you tend to use very high-end (high bitrate) codecs for two main reasons. The first is simply that VFX artists need all the information you can give them in order to do their job well. VFX artists are some of the pickiest people when it comes to codecs, and for good reason. Everyone wants high-quality images, but image issues can often pose more of a problem for VFX than they do for editing, color-correction, and final export.

Many tasks in VFX work require very detailed analysis of the image on a pixel-by-pixel level, which most editors never need to do. For instance, if you're doing a green-screen extraction, you want the edge between your character and the greenscreen to be as clean as possible. We've all seen awful greenscreen shots where the edges of the character are all choppy or blurred out. These problems often arise because of image compression artifacts that are invisible to the naked eye. 4:2:2 or 4:2:0 color subsampling, for instance, has almost no visible impact on the image. The human eye cares mainly about contrast and seldom notices low color resolution, but the greenscreen extraction process relies primarily on color values. If the codec has thrown away a large portion of the color values by using 4:2:0 chroma subsampling, a good color key may be impossible.

The second reason why you want to use high-end codecs is because of generation loss. In the VFX process, you will probably have to compress your file multiple times. You will compress the file once when you send it to them. And then, if they need to pass the file on between multiple specialists, they may compress that file two or three times.
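Generation loss is easy to model: each handoff does a little work on the image and then re-encodes it, and the encoding errors stack instead of canceling. A toy sketch, with quantization standing in for compression and a small exposure tweak standing in for each specialist's work:

```python
# A toy model of generation loss: each handoff tweaks the image a little
# (here, a tiny exposure change) and then re-compresses it. Compare the
# lossy chain against an ideal, never-compressed version of the same work.

def recompress(samples, step=8):
    return [round(s / step) * step for s in samples]

def mean_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

original = [v * 1.0 for v in range(7, 250, 13)]
ideal, lossy = list(original), recompress(original)

errors = []
for generation in range(4):
    ideal = [v * 1.02 for v in ideal]              # the "work" done each pass
    lossy = recompress([v * 1.02 for v in lossy])  # ...plus another encode
    errors.append(round(mean_abs_error(ideal, lossy), 2))

print(errors)  # the drift from the ideal image tends to creep upward
```

With a lossless or very-high-bitrate codec, the `recompress` step is nearly a no-op and the chain stays close to the ideal, which is exactly why VFX pipelines pay for the extra bitrate.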
Some high-end VFX workflows will only use lossless compression for
this reason. The good news is that your VFX shots are usually only a
few seconds per clip, which means your file sizes will be small even
with high-end codecs. So go big! If you captured 4:4:4 in the camera,
then definitely send 4:4:4 to VFX. Otherwise, I would pick a top-of-
the-line 4:2:2 codec (ProRes 422 HQ or DNxHR HQX).
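Why 4:2:0 in particular is poison for keying can be shown in one dimension: subsampling averages neighboring color samples, so a hard screen/subject boundary becomes in-between values that a keyer can't classify cleanly. A toy sketch (real 4:2:0 operates on 2D blocks of Y'CbCr; this is just the idea):

```python
# Why chroma subsampling hurts greenscreen keys: a key is pulled from
# color values, and subsampling smears the color edge even when the
# brightness channel stays sharp.

def subsample_2x(chroma):
    """4:2:0-style in 1-D: average each pair of samples, repeat it back out."""
    out = []
    for i in range(0, len(chroma), 2):  # assumes an even-length row
        avg = (chroma[i] + chroma[i + 1]) // 2
        out += [avg, avg]
    return out

# 100 = pure greenscreen chroma, 0 = subject chroma, hard edge between.
row = [100, 100, 100, 0, 0, 0]
print(subsample_2x(row))  # [100, 100, 50, 50, 0, 0]: the edge is now a smear

# A keyer thresholding "chroma > 90 is screen" now misclassifies the 50s:
# those edge pixels are neither clean screen nor clean subject.
```

That in-between band of 50s is the choppy or fringed edge you see in bad greenscreen shots, which is why 4:4:4 (or at worst high-bitrate 4:2:2) source material makes a keyer's job so much easier.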
07. THE CODEC YOU EXPORT
If you have a fast enough connection, you could upload a ProRes 422. Some people have reported slightly (only slightly) better results when uploading ProRes instead of the recommended h.264. If you are delivering a file to a client for them to upload to YouTube, then I would not give them ProRes, since you don't know what kind of bandwidth they're going to have. Fortunately, these sites tend to publish recommended upload specs (just Google it). I personally will take whatever bitrate they recommend and multiply by about 1.5x to 2x.

Your client may also want a file that they can embed directly into their website (though try to dissuade them, if you can). Generally speaking, you want a very heavily-compressed h.264. If you're curious what a good bitrate is, my reasoning is that, if anyone knows what the sweet-spot bitrate is, it's YouTube. I periodically download a video from YouTube and check its bitrate, and use that as a benchmark.

If the video is not public, they may also want a small file that they can email or link directly to their own clients so that they can download it. In these cases, it may be appropriate to deliver two separate files, especially if it's a long video. The file they should upload to YouTube will be too large to email conveniently. In this case, I will usually down-res the file and compress it very heavily. You also have to be realistic and decide whether you think that your client will actually understand the difference between the two files.

If I need to deliver more than one file, I will usually call one of them "HD" in the filename and the other one "small" or "not HD" in the filename. If you try to describe the different codecs to them, I can almost guarantee they'll have forgotten the difference by next week, but they'll probably remember what HD and "not HD" means.
08. THE CODEC YOU ARCHIVE
GOT QUESTIONS?