How
High, a medium-budget comedy, is one of the first studio
features to be shot on 24P video, using the Panavision/Sony
24P camera. The picture features rap stars Method Man and
Redman playing themselves and was directed by Jesse Dylan
(Bob Dylan’s son). In this interview, editor Larry
Bock and assistant editor Erik C. Andersen cover their experience
editing the show: why it was produced this way, what it
was like to work with the new medium and how it affected
the post-production process.
[Photo: For a preview, the production installed a high-definition projector in the theater balcony. Standing: Robert Jacobs, tape operator from Video Applications, which provided the projector; Andersen; Bock; John Haley, audio engineer, Video Applications. Seated: John Woo, assistant editor; Keith R. Brown, editorial production assistant; Daniel O'Keeffe, project engineer at Universal.]
How
did you prepare to work in this new medium?
Bock:
I first called my friend Michael Alberts, who recently finished
cutting a low-budget film called Nicolas, the first feature
shot with the Panavision/Sony 24P high-def camera. He walked
me through the process and alerted me to certain problems
we might expect. Then I called Erik, whose technological knowledge
spans a wide spectrum of different areas in post.
Andersen:
I read everything available on 24P, including the articles
in the Guild Magazine. I also spoke to a couple of assistant
editors who had worked with the format.
Bock:
We also attended the camera and sound tests for How High at
Panavision. I was worried because it seemed like there were
so many opportunities for something to go wrong with sync
or timecode. But things went very smoothly, thanks to the
efforts of the film crew and my team.
What
were some of the advantages and disadvantages of working this
way?
Andersen:
Director Jesse Dylan and director of photography Francis Kenny
had to contend with the fact that the Panavision camera didn’t
have a full set of lenses. [At the time How High was shot,
there were three zooms and one prime available; as of this
writing, there
are four zooms and five primes.]
“Creatively, working in hi-def was no different than working on a film show. In general, I was quite surprised at how much the show looked like it had been shot on film.” — Bock
The
camera cannot be under- or over-cranked, and it has contrast
problems in bright sunlight and signal-to-noise issues
in low light. The viewfinder is black and white, with
very low image quality, but there is a color HD monitor
on hand to check focus, lighting and actors’ performances.
I was very impressed with the image quality on the monitor.
What you see is what you get.
One
big advantage to using the 24P camera is that HDCAM
tapes last 50 minutes (at 24 frames per second), much
longer than film magazines, so the DP was able to keep
the camera rolling between takes and change magazines
less often. Production thought this saved roughly an
hour a day and allowed for more setups, but it meant
that we were getting a lot of unslated material that
had to be sorted out in the cutting room.
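[For scale, a back-of-the-envelope sketch in Python. The 1,000-foot magazine size and the 16-frames-per-foot figure for 4-perf 35mm are standard film facts, not from the interview.]

```python
# Continuous recording time: HDCAM tape vs. a 1,000 ft 35mm magazine.
# 4-perf 35mm runs 16 frames per foot, so at 24 fps it moves 90 ft/min.
feet_per_minute = 24 * 60 / 16          # = 90 ft/min
mag_minutes = 1000 / feet_per_minute    # ~11.1 min per 1,000 ft magazine
tape_minutes = 50                       # HDCAM at 24 fps, per the interview
print(f"35mm mag: {mag_minutes:.1f} min, HDCAM: {tape_minutes} min "
      f"(~{tape_minutes / mag_minutes:.1f}x longer between reloads)")
```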
Were
any other high-definition cameras considered?
Andersen:
They looked at the Panasonic AJ-HDC27A 720P DVCPRO HD
camcorder, which has variable speeds of 3, 6, 12, 24,
and 72 frames per second, but they were concerned about
mixing 720P images with the 1080P material produced
by the Panavision/Sony camera. It would also have meant a second camera package rental and a new set of lenses for the DP.
How
did shooting on 24P affect your interaction with the
crew?
Bock:
For me it was the same as always — no different
than shooting on film.
Andersen:
I talked to the sound and camera departments daily. One
issue was pre-roll. The 24P camera records time-of-day
code, which means there is a timecode break between
each take. There were many times I was unable to digitize
a slate because I only had a second of timecode before
the director yelled, “Action.” I needed
ten seconds of code and often didn’t get it.
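[To make the pre-roll problem concrete, a hypothetical sketch of the check it implies; the take data and names are illustrative, while the ten-second requirement is from the interview.]

```python
# Flag takes whose slate falls less than ~10 seconds after the first
# recorded frame, i.e. not enough continuous timecode for the deck to
# pre-roll and lock before capture.
FPS = 24  # the camera's 23.976 code counts 24 frames per second

def tc_to_frames(tc: str) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

takes = [("12:04:31:10", "12:04:32:08"),   # ~1 s of code before the slate
         ("12:10:05:00", "12:10:16:12")]   # ~11.5 s: plenty

for first_frame, slate in takes:
    preroll = (tc_to_frames(slate) - tc_to_frames(first_frame)) / FPS
    verdict = "OK" if preroll >= 10 else "TOO SHORT to digitize from"
    print(f"slate at {slate}: {preroll:4.1f} s of pre-roll -> {verdict}")
```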
What
was the workflow from the set to the editing room?
Andersen:
Hollywood Digital downconverted the HDCAM masters to
Beta SP. They transferred the full 16:9 image letterboxed
and added two timecode windows in the matte area: 29.97
timecode from the Beta, and 24P (23.976) timecode from
the master. The Beta was digitized into the Avid and
we used audio from the Beta.
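[The relationship between the two burned-in windows is the standard 2:3 pulldown: every four frames of 24P material occupy five frames of 29.97 video. A rough frame-accurate sketch follows; the real conversion interleaves fields, which this ignores.]

```python
# Map a 24P frame count to the first 29.97 frame that carries it.
# Four 24P frames expand to five NTSC frames per pulldown cycle.
def pulldown_frame(frame_24p: int) -> int:
    """First 29.97 frame carrying a given 24P frame (frame-accurate only)."""
    return frame_24p * 5 // 4

for n in range(9):
    print(f"24P frame {n:2d} -> NTSC frame {pulldown_frame(n):2d}")
```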
Did
you record audio on DAT as well as video?
Andersen:
Yes. Cameron Hamza, our sound mixer, hard-wired audio
to the camera via 300 feet of specially built cable.
The sound went from the microphone to the mixing panel;
then four tracks ran out of the mixer: two to the camera
and two to a Fostex PD-2 DAT.
“We received an average of three-and-a-half hours of material a day. Because there’s no telecine, one big change for the assistants was that we had to log the footage ourselves.” — Andersen
On
35mm, the camera, sound and video playback departments
work independently, but with the 24P camera, they are
hardwired together with sound, video and timecode cables.
We added two additional people to the film crew: a hi-def engineer and an extra sound assistant to wrangle all the cable.
One
issue was how to record sound so that camera and DAT
timecode would match. Production used an Evertz Afterburner,
which produces a downconverted image and also converts
the camera’s 23.976 timecode to 29.97, which was
recorded on the DAT. Cameron Hamza used the Fostex DAT
because it was one of the only field recorders that
could lock to an external video sync pulse, which came
from the Evertz. We thought about recording audio timecode
at 23.976 but decided not to risk incompatibilities
with post-production gear. The Afterburner has a processing
time of about five frames. So our DAT timecode had a
consistent offset relative to video, which was easy
for the sound department to fix in post.
As
a test, I digitized from the DAT, then synced in the
Avid via the slate. After accounting for the five-frame
offset, I found an additional one-frame audio advance.
We discovered that video processing in the camera delays
the image by a frame as it is being recorded. The camera
is actually out of sync with itself! To address this,
Hollywood Digital delayed the sound by a frame when
they downconverted our dailies to Beta SP for cutting.
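[In other words, two fixed offsets, each correctable once and globally. A bookkeeping sketch; the frame counts are from the interview, while the helper and its sign convention are mine.]

```python
# Two constant offsets stood between the DAT audio and the picture:
AFTERBURNER_DELAY = 5   # frames: DAT timecode lags camera timecode
CAMERA_IMAGE_DELAY = 1  # frames: recorded picture lags its own audio

def corrected_sync_point(dat_tc_frames: int) -> int:
    """Record-side frame where a DAT audio frame belongs.

    The sign convention is mine; the point is that both corrections are
    constant, so one global fix (account for the timecode offset, then
    delay the sound one frame to match the late picture) repairs every
    take -- the fix Hollywood Digital applied at the Beta SP stage.
    """
    return dat_tc_frames - AFTERBURNER_DELAY + CAMERA_IMAGE_DELAY
```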
How
did cutting room procedures differ from those on a film
show?
Bock:
Since we didn’t have to wait for negative to be
processed, Erik was able to digitize the footage and
I was able to cut the scenes within 24 hours from the
time they were shot. On film, we would have been one
or two days behind camera. John Woo was the second assistant
editor, and Jessica Caggiano was our apprentice. After
principal photography we let go of the apprentice. It
was the smallest crew I ever had on a feature.
[Photo: Andersen and Bock with Mike Epps (Baby Powder) and Scruncho (Baby Wipe).]
Andersen:
Because there’s no telecine, one big change for
the assistants was that we had to log the footage ourselves.
We used Avid MediaLog hooked up to a Beta SP deck. John logged everything, so that if the director asked for a non-circled take, I could load it immediately. John
would log the first tape, start the Avid batch digitizing
and then go back and continue logging. We tried to keep
digitizing non-stop, while Larry and I viewed half-inch
dailies in another room on a projection TV. We didn’t
actually see the high-def footage projected until we
started onlining for previews.
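[A hypothetical sketch of the minimal per-take record this hand-logging workflow implies; the field names and values are invented.]

```python
from dataclasses import dataclass

@dataclass
class LoggedTake:
    tape: str        # Beta SP reel the take lives on
    scene_take: str  # slate identifier, e.g. "42A-3"
    start_tc: str    # timecode in, read off the Beta window burn
    end_tc: str      # timecode out
    circled: bool    # a director-preferred take?

shot_log = [
    LoggedTake("HH_D14_R2", "42A-3", "14:22:10:02", "14:25:31:17", True),
    LoggedTake("HH_D14_R2", "42A-4", "14:25:40:11", "14:29:02:00", False),
]

# Logging everything, not just circled takes, meant any take the
# director asked for could be batch-digitized immediately.
first_pass = [t for t in shot_log if t.circled]
print(f"{len(first_pass)} of {len(shot_log)} takes queued for digitizing")
```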
Bock:
Normally I would be sitting next to the director in
a screening room getting some notes and a sense of what
he had in mind. We didn’t do that on this show.
Jesse looked at dailies on VHS in a trailer. We never
looked at the HD material projected and we never sat
in a projection room in a group situation. For me, that
was very different than previous experiences.
How
did the experience differ for you creatively?
Bock:
Creatively, working in hi-def was no different than
working on a film show.
Were
there any problems with the Avid? Things you wish the
system would do?
Bock:
We knew going into this film that the Avid was the only
system that could handle the workload and the amount
of temp effects we needed. It was also the only system
that would handle the 24P issues.
Andersen:
We used a Windows NT Avid Symphony, which belonged to
the production company, Jersey Films, as well as a Mac-based
Meridian Film Composer from Runway, and tied them together
with Unity storage.
Because we had a 2 GB project with over 600 files, the system took a long time to auto-save and redraw the bins toward the end of the show. It took an hour to back up to a DVD-RAM disk each night. I would like
to see Avid design an automated backup utility that
would work in the background.
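[Something like the utility Andersen describes is easy to picture. A minimal sketch, my own and not an Avid feature; the paths are hypothetical, and the .avb bin extension is the only Avid-specific assumption.]

```python
import shutil
import time
from pathlib import Path

PROJECT = Path("D:/AvidProjects/HowHigh")  # hypothetical project location
BACKUP = Path("E:/Backups/HowHigh")        # hypothetical backup volume

def backup_bins_forever(interval_seconds: int = 1800) -> None:
    """Copy every bin file to the backup volume, then sleep and repeat."""
    while True:
        for bin_file in PROJECT.rglob("*.avb"):  # Avid bin files
            dest = BACKUP / bin_file.relative_to(PROJECT)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(bin_file, dest)         # copy with timestamps
        time.sleep(interval_seconds)             # runs in the background
```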
What
were the financial implications of shooting on video?
Andersen:
The 24P cameras rent for double the price of a 35mm
camera, but the cost of HDCAM tape is considerably less
than 35mm film, developing and printing. The high definition
on-line and color correction bays we used for previews
and final cut cost hundreds of dollars per hour, but
as more shows shoot in HD, those prices will drop. But
even with all the unexpected expenses and the tests
we did, the budget for HD post-production was still
comparable to a traditional film show.
Did
you receive more footage than you would have on a film
project?
Bock:
Jesse kept the camera rolling to make sure he captured
the most comical moments. We received an average of
three-and-a-half hours of material a day. By the end
of principal photography, we had the equivalent of over
750,000 feet of film.
Andersen:
By the end of the show we had a total of about 400 gigabytes
of storage, with our dailies digitized at 14:1. That’s
75 hours of material for a 90-minute movie.
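[Those figures hang together. A quick sanity check, assuming roughly 20 MB/s for uncompressed ITU-R 601 standard-definition video; the other numbers are from the interview.]

```python
UNCOMPRESSED_SD_MBS = 20   # ~MB/s, uncompressed 601 video (my assumption)
hours_of_dailies = 75      # from the interview
runtime_hours = 1.5        # a 90-minute movie

rate_mbs = UNCOMPRESSED_SD_MBS / 14                  # 14:1 -> ~1.4 MB/s
total_gb = rate_mbs * hours_of_dailies * 3600 / 1000
print(f"~{total_gb:.0f} GB of media")                # ~386 GB: "about 400"
print(f"shooting ratio ~{hours_of_dailies / runtime_hours:.0f}:1")  # ~50:1
```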
Did
24P make a difference in how you dealt with visual effects?
Andersen:
Larry put me in charge of creating all the temp effects
and I tried things that we would never have done had
we been working on film. Some of those effects were
then recreated during our on-line sessions. For instance,
we had a shot of an exploding pigeon where the wire
was visible. All we did was paint the wire out, frame
by frame. It took 20 minutes and probably saved the
production a lot of money.
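[A modern analogue of that frame-by-frame paint-out, as a sketch only; OpenCV inpainting is not the tool used on the show, and the file names are hypothetical.]

```python
import cv2  # OpenCV

NUM_FRAMES = 48  # length of the hypothetical shot, in frames

for i in range(NUM_FRAMES):
    frame = cv2.imread(f"pigeon/frame_{i:04d}.png")
    # A hand- or roto-drawn mask marking the visible wire in this frame.
    mask = cv2.imread(f"pigeon/wire_mask_{i:04d}.png", cv2.IMREAD_GRAYSCALE)
    # Fill the masked pixels from their surroundings, frame by frame.
    clean = cv2.inpaint(frame, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
    cv2.imwrite(f"pigeon/clean_{i:04d}.png", clean)
```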
[Figure 1. A split-screen effect, done in the editing room, was used to combine the funniest parts of a take and bring the characters together.]
In
another situation, we needed the character Baby Wipe
to walk into a shot. Larry and I looked through all
the takes, but the other character in the scene, Baby
Powder, had his funniest moments while Baby Wipe was
still completely out of frame. Then I suggested that
we do a split screen, with Baby Wipe’s half of
the frame coming from later in the take than Baby Powder’s
side. I did a quick composite on the Avid, and it looked
great (see Figure 1).
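[The underlying composite is simple. A sketch of the idea in Python with NumPy; the actual effect was of course built with the Avid’s own tools.]

```python
import numpy as np

def split_screen(left_src: np.ndarray, right_src: np.ndarray) -> np.ndarray:
    """Left half of one frame, right half of another of equal size."""
    w = left_src.shape[1]
    out = left_src.copy()
    out[:, w // 2:] = right_src[:, w // 2:]
    return out

# Baby Powder's half from one moment, Baby Wipe's half from later in the
# same take -- e.g. composite = split_screen(frames[n], frames[n + 240]),
# where `frames` is a hypothetical array of decoded frames from the take.
```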
How
did you online?
Andersen:
With the Windows NT Avid Symphony, we could have on-lined the show in NTSC, though not in hi-def, but we would have needed a lot of extra storage to reload the selected material and assemble it uncompressed. So for the projected screenings of the editor’s and director’s cuts, we decided to just output the 14:1 compressed image. For our previews,
we on-lined using a Fire at Hollywood Digital. The Fire
is an uncompressed HD non-linear editing and finishing
system made by Discreet. We used it to digitize our
HDCAM master tapes, which were then auto-conformed
to our EDL, laid back onto HDCAM tape and color-corrected.
One big issue with the Fire, as with a linear HD online, is that if your effects can’t be described in an EDL, you have to rebuild them by hand in the online.
After the online, I loaded the finalized Fire effects
back into the Avid and cut them in, so that we wouldn’t
have to create them again for additional previews.
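[For readers who have never handled one: an EDL can describe cuts, dissolves, wipes and little else. Below, a sketch of two CMX 3600-style events with invented timecodes, a cut and a 24-frame dissolve. A paint-out or a timing-offset split screen has no representation in this format, which is why such effects had to be rebuilt by hand.]

```
001  TAPE001  V  C        01:02:10:00 01:02:18:12 01:00:00:00 01:00:08:12
002  TAPE001  V  C        01:02:18:12 01:02:18:12 01:00:08:12 01:00:08:12
002  TAPE002  V  D  024   02:15:00:00 02:15:05:00 01:00:08:12 01:00:13:12
```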
How
did you preview?
Andersen:
We had three previews, projecting HD with a Panasonic PT 9600U, a 1280 x 1024 projector.
Bock:
Getting ready for previews was very time-consuming,
because we had to on-line, color-correct, lay back audio to each reel, then tie the film together to make one
HD master for screening. But I was surprised by how
clean and bright the picture looked when I first saw
it projected in high-def. Back when I cut on film, we
would preview the print with thousands of splices in
it. Then I started cutting on non-linear machines, and
the assistants would conform the work print. I thought
the film looked a lot better, because it was not being
handled as much and had fewer splices and less dirt. Now with
high-def, we get to project a spliceless print that
has been color-corrected. Although I still prefer the
look of a movie shot on film, I was pleasantly surprised
that the image didn’t have a pronounced video
feel to it. But I was worried about how we would handle
the changes from one preview version to the next.
[Photo: Mike Epps (Baby Powder) and Scruncho (Baby Wipe) teach a class in “pimpology” as Billy O’Drobinak operates and Afshin Shahidi pulls focus.]
Andersen:
The Fire editor recommended archiving our first preview
on D5 tape, then loading it back into the Fire. He conformed
the first preview to the new cut on his own —
we didn’t load any of the onlines into the Avid.
We did this for all three previews. The previews gave
us our first audience reaction to the 24P image. No
one in the audiences noticed that the movie hadn’t
been shot on film, and in fact, there were compliments
about how nice it looked.
Will
this film be blown up to 35mm for release?
Andersen:
The movie won’t be projected digitally —
it’s a film-only release. E-Film did our scan
to negative, and Deluxe is doing our prints. Since it
cost several dollars per frame to scan the 24P to film,
we did a variety of color tests on selected scenes.
Our first test was printed on Kodak Vision Print Film
2383. It showed us that we needed a little more contrast
in the blacks — the image looked a little flat.
For our second test we printed the same negative on
2393 and found that it had more grain and less definition, and the shadows blocked up. So we went back to our color-corrected
masters from the first two previews (which had been
color-corrected differently), pulled two new scenes,
scanned them to 2383 and made a new film print. We discovered
we needed to color correct for the film transfer, not
for HD projection, and that took care of most of our
problems. Each projector is different, and we found that trying to compensate during previews was pretty frustrating.
Bock:
Because we were color-correcting on tape, we were able
to use power windows, soft clipping, and keying, which
wouldn’t be available on film. But I was worried
about what would happen to all that work when the show
was scanned to film. In general, I was quite surprised
at how much it looked like it had been shot on film.
Were
expectations different from what they would have been
on a film project? Were you expected to work faster?
Bock:
They always expect you to work faster. I don’t
think shooting with the 24P hi-def cameras has changed
that.
Overall,
was the experience a good one? Would you want to do
it again?
Bock:
For me, the experience was good. A couple of aspects
of the 24P process did create extra work. Jesse asked
me to look at everything he shot, including uncircled
takes — three to four hours of footage every day,
which meant I had less time to edit. I guess I would
still prefer to work on projects that are shot on film,
but it seems like high-def might become a common medium
for features. The answer prints I’ve seen looked
good. Maybe not as good as film, but for a comedy like
this, I don’t think the audience will ever know
how we shot it.
Andersen:
With technology advancing so quickly, the image will
only get better. I consider myself very lucky to have
worked with Larry on the first HD feature for Universal,
and I look forward to doing it again.
Reprinted from The Motion Picture Editors Guild Magazine, Vol. 23, No. 1, January/February 2002.