Three broad historical phases characterize the development of video
editing: physical film/tape cutting, electronic transfer editing,
and digital non-linear editing. Even before the development of a
successful videotape recording format in 1956 (the Ampex VR-1000),
time-zone differences in national broadcasting required a means
of recording and transporting programs. Kinescopes, filmed recordings
of live video shows made for delayed West Coast airing, served this
purpose. Minimal film editing of these kinescopes was an obligatory
part of network television.
Once videotape found widespread use, the term "stop-and-go recording"
was used to designate those "live" shows that would be shot in pieces
then later edited together. Physically splicing the 2" quad videotape
proved cumbersome and unforgiving, however, and NBC/Burbank developed
a system in 1957 that used 16mm kinescopes--not for broadcasting--but
as "work-prints" to rough-cut a show before physically handling
the videotape. Audible cues on the film's optical sound track allowed
tape editors to match back each cut frame for frame. Essentially,
this was the first "offline" system for video. Known as ESG, this
system of rough-cutting film and conforming on tape (a reversal
of what would become standard industry practice in the 1990s), reached
its zenith in 1968 with Rowan and Martin's Laugh-In. That
show required 350-400 tape splices and 60 hours of physical splicing
to build up each episode's edit master.
A cleaner way to manipulate prerecorded video elements had, however,
been introduced in 1963 with Ampex's all electronic "Editec." With
VTRs (videotape recorders) now controlled by computers, and in-
and out-points marked by audible tones, the era of electronic "transfer
editing" had begun. Original source recordings were left unaltered,
and discrete video shots and sounds were re-recorded in a new sequence
on a second generation edit master. In 1967, other technologies
added options now commonplace in video editing studios. Ampex introduced
the HS-100 videodisk recorder (a prototype for now-requisite slow-motion
and freeze-frame effects) that was used extensively by ABC
in the 1968 Olympics. "Helical-scan" VTRs (which threaded and recorded
tape in a spiral pattern around a rotating head) appeared at the
same time, and ushered in a decade in which technological formats
were increasingly miniaturized (enabled in part by the shift to
fully transistorized VTRs like the RCA TR-22 in 1961). New users
and markets opened up with the shift to helical: educational, community
activist, and cable cooperatives all began producing on the half-inch
EIAJ format that followed; producers of commercials and industrial
video made the three-quarter inch U-matic format pioneered by Sony
in 1973 its workhorse platform for nearly two decades; newsrooms
jettisoned 16mm newsfilm (along with its labs and unions) for the
same videocassette-based format in the late 1970s; even networks
and affiliates replaced venerable two-inch quad machines with one-inch
helical starting in 1977.
The standardization of "time-code" editing, more than any other
development, made this proliferating use viable. Developed by EECO
in 1967, time-code was awarded an Emmy in 1971, and standardized
by SMPTE shortly thereafter. The process assigned each video frame
a digital "audio address," allowed editors to manage lists of hundreds
of shots, and made frame accuracy and rapidly cut sequences the norm.
The explosive growth of non-network video in the 1970s was directly
tied to these and other refinements in electronic editing.
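The frame arithmetic behind time-code editing is simple to sketch. The following is a minimal illustration rather than any vendor's actual implementation, and it assumes a 30 fps non-drop frame rate (broadcast NTSC in fact runs at 29.97 fps, with a "drop-frame" timecode variant to compensate):

```python
FPS = 30  # assumed non-drop frame rate; NTSC drop-frame (29.97 fps) is ignored here

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = FPS) -> str:
    """Convert an absolute frame count back to HH:MM:SS:FF."""
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# A frame-accurate cut: duration between an in-point and an out-point.
cut_in = timecode_to_frames("01:00:10:00")
cut_out = timecode_to_frames("01:00:15:15")
print(cut_out - cut_in)                      # 165 frames
print(frames_to_timecode(cut_out - cut_in))  # 00:00:05:15
```

Addressing every frame this way is what made lists of hundreds of shots manageable: a cut became a pair of numbers rather than a physical splice.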
Digital editing, a third phase, began in the late 1980s both as
a response to the shortcomings of electronic transfer editing, and
as a result of economic and institutional changes (the influence
of music video, and the merging of film and television). To "creative
personnel" trained in film, state-of-the-art online video suites
had become little more than engineering monoliths that prevented
"cutting-edge" directors from working intuitively. In linear time-code
editing, for example, a change made at minute 12 of a program meant
that the entire program after that point had to be re-edited to
accommodate the change in program duration. Time code editing, which
made this possible, also essentially "quantified" the process, so
that the "art" of editing meant merely managing "frame in/out" numbers
for shots on extensive edit decision lists (EDLs). With over 80%
of primetime television still shot on film by the end of the 1980s,
the complicated abstractions and obsolescence that characterized
these linear video formats also meant that many Hollywood television
producers simply preferred to deliver programs to the networks from
film prints--cut on flatbeds and conformed from negatives. The capital-
intensive nature of video post-production also segregated labor
in the suites. Directors were clients who delegated edit rendering
tasks to house technicians and DVE artists. Online linear editing
was neither spontaneous nor user-friendly.
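The edit decision lists described above reduced each cut to pairs of timecodes. A hypothetical event written in the style of the widely used CMX 3600 EDL format (reel names and numbers here are invented for illustration):

```
TITLE: EXAMPLE SHOW
001  TAPE1    V     C        01:00:10:00 01:00:15:00 00:00:00:00 00:00:05:00
002  TAPE2    V     C        02:10:00:00 02:10:03:00 00:00:05:00 00:00:08:00
```

Each line gives an event number, a source reel, a track (V for video), a transition (C for a straight cut), then source in/out and record in/out timecodes. Because the record-side addresses run sequentially, shortening or lengthening any one event meant recalculating every record timecode downstream--the linear bottleneck nonlinear systems would remove.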
Nonlinear procedures rejected videotape entirely and attacked the linear "straitjacket"
on several fronts. Systems were developed to "download" or digitize
(rather than record) film/video footage onto video disks (CMX 6000)
or computer hard-drive arrays (Lightworks, The Cube). This created
the possibility of random-access retrieval of shots as an "edited" sequence.
Yet nonlinear marked an aesthetic and methodological shift as much
as a technological breakthrough. Nonlinear technologies desegregated
the editing crafts; consolidated post-production down to the "desktop"
level of the personal computer; allowed users to intervene,
rework, and revise edited sequences without recreating entire programs;
and enabled editors to render and recall at will numerous stylistic
variations of the same show for clients. Directors and producers now
commonly did their own editing--in their own offices. The trade
journals marveled at the Avid's "32 levels of undo," the ability
to restore extensive changes to various previous states. Nothing
was locked in stone.
This openness allowed for a kind of presentational and formal "volatility"
perfectly suited for the stylistic excesses that characterized contemporary
television in the late 1980s and 90s. When systems like the Avid
and the Media 100 were upgraded to "online" mastering systems in
the 1990s--complete with on-command digital video effects--the anything-can-go-anywhere
premise made televisual embellishment an obligatory user challenge.
The geometric growth of hard-disk memory storage, the pervasive
paradigm of desktop publishing, and the pressure to make editing
less an engineering accomplishment than a film artist's intuitive
statement sold nonlinear procedures and technologies to the industry.
Video editing faces a trajectory far less predictable than that
in the 1950s, when an industrial-corporate triumvirate of Ampex/RCA/NBC
controlled technology and use. The future is open largely because
editing applications have proliferated far beyond those developed
for network oligopoly. Video is everywhere. Nonlinear established
its beachhead in the production of commercials and music videos,
not in network television. Still, by 1993 the mainstream Academy
of Television Arts and Sciences (ATAS) had honored Avid's nonlinear
system with an Emmy. By 1995, traditional television equipment manufacturers
like Sony, Panasonic, and Grass Valley were hedging their bets
by selling user-friendly, non-linear Avid clones even as they continued
slugging it out over digital tape-based electronic editing systems.
At the same time, program producing factories like Universal/MCA
Television continued to use a wide range of editing systems for
their series--film, linear, and nonlinear.
The obsession with "digital interactivity" in the 1990s means that
sequencing video imagery in "post-production" will remain central
to the fabrication of entertainment "software." Storage formats
(film, tape, video disk) will, clearly, continue to change. Yet
industry forays into the "information superhighway" now suggest
a prototype for interactive editing that is closer in spirit to
television's historic paradigm of multi-source "switching." Many
now envision the "video server"--networked by wide bandwidth fiber-optic
cable--as a bottomless, digitized, motion picture storage pit, as
an image-sound repository that does not even need to reside in the
sequencing platform of the digital video editor. If this server-network
model survives, the role of the nonlinear digital editor might then
stand as the very model for all video-on-demand consumers in the
domestic sphere as well. Viewers will become their own editors.