
Synchronization – What You Need to Know About Time Codes

DulabicStudios-Synchronization

Guest Blog Post by Dave Chick, music composer for Living Life or Waiting to Die 

If you’re an up-and-coming filmmaker (and I’m assuming that you are since you’re on this site), then it probably took you a split second to recognize what that oh-so-lovely model is standing next to.

That series of digits is lovingly referred to as SMPTE (Society of Motion Picture & Television Engineers) Time Code – shortened to either “Simptee” or just plain time code.

Time code is a form of metadata that provides an objective time reference for editing, synchronization and identification in film, video and audio. In a nutshell, it’s the one thing that will help keep everyone working on your film sane.

We won’t get into the technical bit structure of the standard; instead, we’ll focus on the practical benefits it provides.

Format-wise, the SMPTE time code string consists of four two-digit numbers separated by colons (00:00:00:00), corresponding to hours:minutes:seconds:frames. This gives everyone working on the film the ability to accurately and reliably reference any point in the film, down to the specific frame.
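To make the format concrete, here’s a minimal sketch of parsing that string and turning it into an absolute frame count. The function names and the 24 fps default are illustrative assumptions, not part of any standard API.

```python
def parse_timecode(tc: str):
    """Split an 'HH:MM:SS:FF' string into its four two-digit fields."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return hours, minutes, seconds, frames

def to_frame_count(tc: str, fps: int = 24) -> int:
    """Convert a non-drop time code to an absolute frame number."""
    h, m, s, f = parse_timecode(tc)
    return ((h * 60 + m) * 60 + s) * fps + f

print(to_frame_count("00:46:12:04"))  # 66532 - the frame number of that point at 24 fps
```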

SMPTE time code supports the many different frame rates in use today: 24 fps, 25 fps, 29.97 fps (drop frame), 30 fps and others.

Drop frame (29.97 fps) is the only rate where the time code is written differently: the last separator, between seconds and frames, is a semicolon rather than a colon – 00:00:00;00.
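The semicolon signals that certain frame *labels* are skipped – numbers 00 and 01 at the start of every minute, except minutes divisible by 10 – so the displayed time tracks the wall clock even though no actual frames are discarded. A sketch of that labeling arithmetic (the function name is mine):

```python
def frames_to_dropframe(frame_number: int) -> str:
    """Label an absolute frame number in 29.97 fps drop-frame time code.

    Drop frame skips labels ;00 and ;01 at the start of every minute,
    except minutes divisible by 10; no real frames are dropped.
    """
    d, m = divmod(frame_number, 17982)        # 17982 frames per 10-minute block
    skipped = 18 * d                          # 2 labels x 9 minutes per full block
    if m >= 2:
        skipped += 2 * ((m - 2) // 1798)      # 1798 frames per "dropped" minute
    f = frame_number + skipped
    hh, mm = f // 108000, (f // 1800) % 60
    ss, ff = (f // 30) % 60, f % 30
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"  # note the semicolon

print(frames_to_dropframe(1800))  # 00:01:00;02 - labels 00 and 01 were skipped
```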

From tracking daily reels, to editing and pretty well every single aspect of post-production, time code is the thread that can keep your project from devolving into a mess of confusion.

So, Why Should You Care?

Why does Bojan have a film composer and audio guy talking about time code? Well, besides the fact that he left it up to me to choose, he didn’t really have any choice in the matter.

Seriously though, integrating time code into the production and post-production stages of your project will save everyone a LOT of headaches.

I’ll give you an example:

Without Time Code

EXT. HIPSTER COFFEE SHOP – DAY

DIRECTOR
Hey, composer-dude! In that romantic scene,
keep the mood in the cue a bit more subdued until the first
kiss and then really make it swell. When he stares at her,
drop it down to something a bit more forbidding.

A couple of days later…

DIRECTOR
What the… Why did you swell there and drop the mood there?

COMPOSER
What do you mean? That was the first kiss.

DIRECTOR
No it’s not, that’s a peck on the cheek!

COMPOSER
I call it a kiss.

DIRECTOR
I meant where they get into a serious lip-lock…
And why did you drop it down there?
I said to drop it down when he stares at her.

COMPOSER
He kisses with his eyes open. When is he not staring at her?

Both groan

FADE

With Time Code

EXT. HIPSTER COFFEE SHOP – DAY

DIRECTOR
Hey, composer-dude! In that romantic scene that starts at 00:45:32:00,
keep the mood in the cue a bit more subdued until the first kiss at 00:46:12:04
and then really make it swell. When he stares at her at 00:46:58:17,
drop it down to something a bit more forbidding.

A couple of days later…

DIRECTOR
Perfect!

Both: dorky high-five and fist pump.

FADE

Perhaps that’s a bit of a stretch (or is it?), but the point is that using SMPTE time code when communicating with the people working on your project avoids a lot of technical and communication-related issues.

The Process

The advent of SMPTE time code was a boon to linear video editing – specifically for synchronizing multiple video and audio devices. Non-linear, random-access editing may have minimized the need for SMPTE time code to synchronize multiple devices, but it remains a crucial point-of-reference tool on the vast majority of video and audio projects.

It’s gotten to the point where synchronizing music to video is almost taken for granted. Crafting a music cue to match a scene still requires a lot of subjective judgment, but the process of measuring tempo and beats relative to onscreen action is so much simpler than it was just a couple of decades ago.

A Wee History Lesson

DulabicStudios-Synchronization2

The methods that were used in the past almost seem to have required a degree in math to contemplate. Tables and scales to help calculate and cross reference music tempo versus time versus frame rate were cumbersome and confusing.

After a composer and director spotted the film, the composer would go back to map out the cues: jotting down thematic ideas and possible treatments, then inevitably hunting for appropriate tempos and mapping out where the cues could synchronize musical elements with the film.

Charts cross-referencing beat counts with fractions of seconds, lists of frames (or sprocket/perf holes) per second per beat, and a variety of mathematical formulas converting among feet, meters, time, fps, beats and bpm were all employed to find that magical tempo in the name of synchronizing music to film!
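For a sense of what those click books boiled down to: the tempo implied by a click of N frames per beat is 60 × fps ÷ N. A small illustrative sketch, assuming 24 fps film (the function names are mine, not from any historical chart):

```python
FPS = 24  # standard sound-film rate assumed for this illustration

def click_to_bpm(frames_per_beat: float) -> float:
    """Tempo implied by a click of N frames per beat at 24 fps."""
    return 60 * FPS / frames_per_beat

def bpm_to_click(bpm: float) -> float:
    """Frames per beat needed to hit a target tempo at 24 fps."""
    return 60 * FPS / bpm

print(click_to_bpm(12))   # 120.0 - a 12-frame click is exactly 120 BPM
print(bpm_to_click(144))  # 10.0  - 144 BPM needs a 10-frame click
```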

And that’s all well before the scoring session where even more challenges existed in getting a live performance group to play the theoretically synchronized score to film!

In an effort to keep this article from breaking any length limits, I’ll jump to the present and give a quick overview of some of the techniques and tools I employ with present technology.

Back to the Future

DulabicStudios-Synchronization3

One of the first things that I do when setting up a project is to synchronize my DAW (Digital Audio Workstation – I use Logic) to the film. As you can see from the screenshot above, Bojan supplied me with a final cut of his latest film “Living Life or Waiting to Die” with timecode burned into the frame.

After I synchronized my SMPTE time code rate and clock to the burn-in on the film, I was ready to go. From there, Bojan and I spotted the film, and I supplied him with Spotting Notes that listed all the cues with their appropriate SMPTE start and end times.

I began to create appropriate markers in the timeline to flesh out where underscore music was intended to go. Each cue has a definite start and end marker, but it also may have intermediate markers where music elements need to correspond to onscreen action, sound or dialog. Once those are in place, then tempos need to be adjusted to match beats with those markers.

There’s a bit of wiggle room (+/- 2 frames) with respect to how on-the-button something needs to be before it’s perceived to be out of sync. I try to set a general tempo that “feels” appropriate and then adjust it to line a beat up with the film’s sync element.
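The arithmetic behind that tempo hunt can be sketched as follows. The cue start and kiss times are borrowed from the example dialog earlier; the target beat number (62) is an invented value for illustration:

```python
FPS = 24  # assumed frame rate for this sketch

def tc_to_seconds(tc: str, fps: int = FPS) -> float:
    """Seconds represented by an 'HH:MM:SS:FF' time code."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return h * 3600 + m * 60 + s + f / fps

def tempo_for_hit(cue_start: str, hit: str, beats_to_hit: int) -> float:
    """BPM that lands beat number `beats_to_hit` exactly on the hit point."""
    span = tc_to_seconds(hit) - tc_to_seconds(cue_start)
    return beats_to_hit * 60 / span

bpm = tempo_for_hit("00:45:32:00", "00:46:12:04", 62)
print(round(bpm, 2))          # 92.61 - puts beat 62 right on the first kiss
print(round(2 / FPS * 1000))  # 83 - the +/- 2-frame wiggle room in ms at 24 fps
```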

If I’m lucky, then that beat will be the appropriate down-beat, but if not, I make a call to the editor and… No, I don’t go back to the director / editor and tell them to add a few more seconds in the cut!

There are methods a composer can employ to add or drop beats in a piece without them sounding like sonic hiccups or stumbles. Elements such as pauses, pickup notes, tempo ramps and a host of others can be put to use to “trick” the audience into not even noticing that a bar of 3/4 was slipped into a 4/4 passage.

In contrast to the days before non-linear editing, setting tempos to synchronize music to picture is a quick and relatively painless process! Tempo change points, ramps and shifts can be experimented with and implemented far more easily and dynamically than even a couple of decades ago.

DulabicStudios-Synchronization4

Once the tempo map is complete, the “meat” of the scoring happens. When I’m satisfied that a cue matches the director’s vision, I bounce out an MP3 and label it with the appropriate cue information, including the SMPTE time code start point. The director just has to line up his playhead to that SMPTE time code and drop the start of the file in – voila!

If there are any edits or adjustments that need to be made, the director communicates his notes by referencing SMPTE time code points.

The Final Product

That’s all well and good for review, but when it comes to final asset delivery, then in addition to filenames that include SMPTE start times, I deliver AIFF or WAV files – the latter in the Broadcast Wave Format (BWF). These files have metadata embedded in them that includes the SMPTE start time of the file.
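For the curious, a Broadcast Wave file stores that start point in the TimeReference field of its “bext” chunk as a sample count since midnight. A minimal sketch of the conversion, assuming a 48 kHz session and 24 fps time code (the function name is mine):

```python
SAMPLE_RATE = 48_000  # assumed session rate
FPS = 24              # assumed time code rate

def smpte_to_time_reference(tc: str) -> int:
    """Samples since midnight for an 'HH:MM:SS:FF' start time (bext TimeReference)."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    seconds = h * 3600 + m * 60 + s + f / FPS
    return round(seconds * SAMPLE_RATE)

print(smpte_to_time_reference("00:45:32:00"))  # 131136000 samples
```

A DAW that honors the field reads this number back, divides by the sample rate, and places the file at that exact time line position.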

By setting up my system’s SMPTE time code clock to be synchronized with the film, the sound mixer just needs to drop all the files into the project and execute the command that moves each file to its original start time. If all goes as planned – voila again! Not all film editing programs support this functionality, so having a SMPTE reference for each file avoids confusion as to where in the timeline each cue should be placed.

Writing music for film is a funny juggling act between creativity and technical logic – all in support of making sure that the director’s vision is realized. That being said, it all depends on everything coming together as a cohesive whole in the end. Making synchronization and SMPTE time code one of your BFFs will ensure that all the artistic elements in a film live happily ever after.

 

Dave is a film composer based in Vancouver, BC. He is also a producer and audio engineer who contributes articles to professional audio blogs – he was recently featured on the Best Buy Canada Plug-In Blog:
http://bit.ly/1fh5P7f 

More information about him and his work can be found at his site:
http://www.gethatched.ca

Dave can also be found at all the usual social media outlets:
https://twitter.com/chckn8r
http://www.facebook.com/dave.chick
