Video synthesizer

A video synthesizer is a device that electronically creates a video signal. It can generate a variety of visual material without camera input through the use of internal video pattern generators, and it can also accept live television camera imagery and “clean up and enhance” or “distort” it. The synthesizer creates a wide range of imagery through purely electronic manipulations; this imagery becomes visible when the output video signal is displayed on conventional video equipment such as TV monitors, theater video projectors, or computer displays.

No computer or camera is needed for the generation itself: the signal is made entirely from scratch, and the output is often analog. This limits the media on which the image can be displayed. CRT screens or analog projectors are usually used, because the video signal is often modified to such an extent that a standard digital display treats it as damaged and shows nothing.

Some video synthesizers are created by modifying old video processors that were originally used to clean up analog video. This conversion involves so-called circuit bending, in which the device’s circuits themselves are modified, for example by adding a switch between previously separate parts of the circuit. Other modifications include adding potentiometers or photoresistors.

Video pattern generators may produce static, moving, or evolving imagery. Examples include geometric patterns (in 2D or 3D), subtitle text characters in a particular font, or weather maps.

Imagery from TV cameras can be altered in color or geometrically scaled, tilted, wrapped around objects, and otherwise manipulated.

A particular video synthesizer will offer a subset of possible effects.

Video synthesis can also take the form of modifying an audio signal for projection on an oscilloscope, an approach used, for example, by Jerobeam Fenderson in his project Oscilloscope Music. The video signal (or its components) can often be modulated by an audio signal, which, together with the ability to work in real time, has found application in VJing at clubs, concerts and other performances.
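
As a minimal sketch of the oscilloscope approach (the sample rate, frequencies, and 3:2 ratio here are arbitrary choices for illustration, not taken from Fenderson's work), the following Python snippet generates a stereo audio buffer whose left and right channels, fed to the X and Y inputs of an oscilloscope in X-Y mode, trace a Lissajous figure:

    import numpy as np

    SAMPLE_RATE = 48000          # audio sample rate in Hz
    DURATION = 2.0               # seconds of audio to generate

    t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

    # Two sine oscillators with a 3:2 frequency ratio; when the left channel
    # drives the X deflection and the right channel drives the Y deflection
    # of an oscilloscope in X-Y mode, the beam traces a Lissajous figure.
    left = np.sin(2 * np.pi * 300 * t)       # X deflection
    right = np.sin(2 * np.pi * 200 * t)      # Y deflection

    stereo = np.stack([left, right], axis=1).astype(np.float32)
    # 'stereo' can be written to a WAV file (e.g. with the soundfile package)
    # and played into an oscilloscope, or previewed directly with matplotlib:
    # import matplotlib.pyplot as plt; plt.plot(left, right); plt.show()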

Video Synthesizers as Real Time performance instruments
The history of video synthesis is tied to a “real time performance” ethic. The equipment is usually expected to function on input camera signals the machine has never seen before, delivering a processed signal continuously and with a minimum of delay in response to the ever-changing live video inputs. Following in the tradition of performance instruments of the audio synthesis world such as the Theremin, video synthesizers were designed with the expectation that they would be played in live concert theatrical situations, or set up in a studio ready to process a videotape from a playback VCR in real time while recording the results on a second VCR. Venues of these performances included “Electronic Visualization Events” in Chicago, The Kitchen in NYC, and museum installations. Video artist/performer Don Slepian designed, built and performed a foot-controlled Visual Instrument at the Centre Pompidou in Paris (1983) and the NY Open Center that combined a genlocked early microcomputer, the Apple II Plus, with the Chromaton 14 Video Synthesizer and channels of colorized video feedback.

Analog and early real time digital synthesizers existed before modern computer 3D modeling. Typical 3D renderers are not real time, as they concentrate on computing each frame from, for example, a recursive ray tracing algorithm, however long it takes. This distinguishes them from video synthesizers, which must deliver a new output frame by the time the last one has been shown, and repeat this performance continuously (typically delivering a new frame regularly every 1/60 or 1/50 of a second). The real time constraint results in a difference in design philosophy between these two classes of systems.
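
The real time constraint can be sketched as a fixed frame-budget loop. In the Python outline below, render_frame is a hypothetical stand-in for whatever synthesis or processing produces a frame, and 60 Hz is one of the rates mentioned above:

    import time

    FRAME_RATE = 60.0                 # one of the standard field/frame rates
    FRAME_BUDGET = 1.0 / FRAME_RATE   # time available to produce each frame

    def render_frame(frame_number):
        # Placeholder for whatever synthesis or processing produces a frame.
        pass

    frame = 0
    next_deadline = time.monotonic() + FRAME_BUDGET
    while frame < 300:                # run for 5 seconds at 60 fps
        render_frame(frame)
        # A real time synthesizer must finish before the deadline; an offline
        # renderer would simply take as long as each frame requires.
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)     # wait out the rest of the frame period
        next_deadline += FRAME_BUDGET
        frame += 1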

Video synthesizers overlap with video special effects equipment used in real time network television broadcast and post-production situations. Many innovations in television broadcast equipment as well as computer graphics displays evolved from synthesizers developed in the video artists’ community and these industries often support “electronic art projects” in this area to show appreciation of this history.

Confluence of ideas of Electronics and Arts
Many principles used in the construction of early video synthesizers reflected a healthy and dynamic interplay between electronic requirements and traditional interpretations of artistic forms. For example, Rutt & Etra and Sandin carried forward, as an essential principle, Robert Moog’s idea of standardized signal ranges, so that any module’s output could be connected to “voltage control” any other module’s input. The consequence of this in a machine like the Rutt-Etra was that position, brightness, and color were completely interchangeable and could be used to modulate each other during the processing that led to the final image. Videotapes by Louise and Bill Etra and by Steina and Woody Vasulka dramatized this new class of effects. This led to various interpretations of the multi-modal synesthesia of these aspects of the image, in dialogues that extended the McLuhanesque language of film criticism of the time.
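
A rough illustration of this interchangeability (the resolution, array names, and displacement amount below are illustrative, not taken from the Rutt-Etra hardware): in this Python sketch, the brightness of each sample is routed to modulate the vertical position of its scanline, the basic move behind Rutt-Etra style raster manipulation.

    import numpy as np

    HEIGHT, WIDTH = 240, 320
    frame = np.random.rand(HEIGHT, WIDTH)   # stand-in for an input video frame

    DISPLACEMENT = 40        # scanlines of deflection added at full brightness
    output = np.zeros_like(frame)

    # Treat brightness as a control voltage: brighter samples push their
    # scanline position further up the screen, as on a scan processor.
    for y in range(HEIGHT):
        for x in range(WIDTH):
            v = frame[y, x]
            new_y = int(y - v * DISPLACEMENT)
            if 0 <= new_y < HEIGHT:
                # Keep the brightest sample where displaced points collide.
                output[new_y, x] = max(output[new_y, x], v)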

EMS Spectron
In the UK, Richard Monkhouse, working for Electronic Music Studios (London) Limited (EMS), developed a hybrid video synthesiser – Spectre, later renamed ‘Spectron’ – which used the EMS patchboard system to allow completely flexible connections between module inputs and outputs. The video signals were digital, but they were controlled by analog voltages: there was a digital patchboard for image composition and an analog patchboard for motion control.

Evolution into Frame Buffers
Video synthesizers moved from analog to the precision control of digital. The first digital effects as exemplified by Stephen Beck’s Video Weavings used digital oscillators optionally linked to horizontal, vertical, or frame resets to generate timing ramps. These ramps could be gated to create the video image itself and were responsible for its underlying geometric texture. Schier and Vasulka advanced the state of the art from address counters to programmable (microcodable) AMD Am2901 bit slice based address generators. On the data path, they used 74S181 arithmetic and logic units, previously thought of as a component for doing arithmetic instructions in minicomputers, to process real time video signals, creating new signals representing the sum, difference, AND, XOR, and so on, of two input signals. These two elements, the address generator, and the video data pipeline, recur as core features of digital video architecture.
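
A small sketch of the counter-and-ALU idea (the resolution and bit depth are arbitrary choices here): in the Python snippet below, horizontal and vertical counters stand in for the timing ramps, and combining them with ALU-style operations such as XOR, AND, and addition directly produces geometric texture.

    import numpy as np

    WIDTH, HEIGHT = 256, 256

    # x and y play the role of the horizontal and vertical address counters.
    y, x = np.mgrid[0:HEIGHT, 0:WIDTH]

    # Combining the two counters with ALU-style functions yields the kind of
    # woven geometric texture seen in early digital pieces.
    xor_pattern = (x ^ y) & 0xFF          # bitwise XOR of the counters
    and_pattern = (x & y) & 0xFF          # bitwise AND of the counters
    sum_pattern = (x + y) & 0xFF          # arithmetic sum, wrapped to 8 bits

    # Each array can be displayed as an 8-bit grayscale image, e.g. with
    # matplotlib: plt.imshow(xor_pattern, cmap="gray")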

The address generator supplied read and write addresses to a real time video memory, which can be thought of as the most flexible evolution of gating the address bits together to produce the video. While the video frame buffer is now present in every computer’s graphics card, it has not carried forward a number of features of the early video synths: its address generator counts in a fixed rectangular pattern from the upper left hand corner of the screen, across each line, to the bottom. This discarded a whole technology of modifying the image through variations in the read and write addressing sequences provided by the hardware address generators as the image passed through the memory. Today, address based distortions are more often accomplished by blitter operations moving data in the memory, rather than by changes in video hardware addressing patterns.
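
Address-based distortion can be sketched as follows (the per-line sine offset is just one possible addressing variation, chosen for illustration): instead of reading the stored frame back in plain raster order, each scanline’s read addresses are offset, warping the image on the way out.

    import numpy as np

    HEIGHT, WIDTH = 240, 320
    framebuffer = np.random.rand(HEIGHT, WIDTH)   # stand-in for stored video

    AMPLITUDE = 20.0      # maximum horizontal offset in pixels
    PERIOD = 60.0         # scanlines per full cycle of the offset

    output = np.zeros_like(framebuffer)
    for y in range(HEIGHT):
        # Vary the read address per scanline instead of counting straight
        # across: here a sine offset shears each line horizontally.
        offset = int(AMPLITUDE * np.sin(2 * np.pi * y / PERIOD))
        src_x = (np.arange(WIDTH) + offset) % WIDTH
        output[y, :] = framebuffer[y, src_x]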

History of Video Synthesizers, Designers, and Artists

1960s
1962, Lee Harrison III’s ANIMAC: (Hybrid graphic animation computer) – predecessor to the Scanimate
1966, Dan Slater’s custom vsynths: Dan Slater has built a number of custom homebrew vsynths over the years, & worked with Douglas Trumbull on various films.
1968, Eric Siegel’s PCS (Processing Chrominance Synthesizer)
1968, Computer Image Corporation Scanimate:
1969, Paik/Abe synthesizer
Built at WGBH Boston; envisioned by Nam June Paik, designed by artist/engineer Shuya Abe.
Several were built at CalArts, the Experimental TV Center at Binghamton University, and WNET NYC; Jim Wiseman still has one in operation.
1969, Bill Hearn’s VIDIUM: (Analog XYZ driver/sequencer)
1969, Glen Southworth’s CVI Quantizer & CVI Data Camera

1970–1974
1970, Eric Siegel’s EVS Electronic Video Synthesizer & Dual Colorizer (Analog)
1970, groove & VAmpire
(Generated Real-time Output Operations on Voltage-controlled Equipment)
(Video And Music Program for Interactive Realtime Exploration/Experimentation).
1970, Lear Siegler’s vsynth: unique high-resolution video processor used in the film “The Andromeda Strain” and by Douglas Trumbull & Dan Slater
Stephen Beck’s Direct Video Synth & Beck Video Weaver
Stephen Beck created some early-1970s synths that had no video inputs; they made video purely from oscillations.
He also modified a few Paik/Abe units.
Walter Wright: One of the first video animators, he worked at Computer Image Corp in the early 1970s and later at Dolphin Productions, where he operated a Scanimate. While at Dolphin, he and Ed Emshwiller worked together on Thermogenesis and Scapemates, and he also made several tapes on his own. In 1973–76, as artist-in-residence at the Experimental Television Center, NY, he pioneered video performance, touring public access centers, colleges and galleries with the Paik/Abe video synthesizer. He also worked with the David Jones colorizer & Rich Brewster’s sequencing modules; these modules were based on Jones’s design for voltage-controlled video amps and became the basis for the ETC studio. He was there when Don McArthur built the SAID. Woody Vasulka and Jeff Schier were close at hand in Buffalo, building computer-based modules including a frame buffer with built-in ALUs, mixers, keyers and colorizers. Wright also worked with Gary Hill at Woodstock Community Video, where they had a weekly cable show of live video/audio synthesis. Wright has developed his own performance video system, the Video Shredder, and uses it to mesmerize audiences wherever and whenever he can; his mission is to create a new music of sound and image. He has performed throughout the east coast of the USA and Canada at art galleries and museums, schools and colleges, media centers, conferences and festivals.
1971, Sandin Image Processor: very early video synth; DIY modular, built by Dan Sandin of Chicago.
1972, Rutt/Etra Video Synthesizer: Co-invented by Steve Rutt & Bill Etra, this is an analog computer for video raster manipulation.
1973, Phil Morton publishes “Notes on the Aesthetics of Copying an Image Processor”. He “proudly referred to himself as the ‘first copier’ of Sandin’s Image Processor. The Sandin Image Processor offered artists unprecedented abilities to create, process and affect realtime video and audio, enabling performances that literally set the stage for current realtime audio-video New Media Art.”
1974, vsynths by David Jones: many creations, the most famous being the Jones Colorizer, a four-channel voltage-controllable colorizer with gray-level keyers.
1974, EMS Spectre: Innovative video synthesiser using analogue and digital techniques, developed by Richard Monkhouse at EMS. Later renamed to ‘Spectron’.

1975–1979
1975, Dave Jones Video Digitizer: an early digital video processor used for video art. It did real-time digitizing (no sample clock) and used a 4-bit ALU to create color effects
1975, Don McArthur’s SAID: Don McArthur developed the SAID (Spatial and Intensity Digitizer), an outgrowth of research on a black and white time base corrector with Dave Jones
1976, Denise Gallant’s vsynth: a very advanced analogue video synthesizer created in the late 1970s.
1976, Chromaton 14
A fairly small analog video synthesizer with color quantizers; it can generate complex color images without any external inputs.
Built by BJA Systems
1977, Jones Frame Buffer: Low resolution digital frame storage of video signals (higher resolution versions, and multi-frame versions were made in 1979 and the early 1980s)
1979, Chromachron: one of the first digital vsynths, designed by Ed Tannenbaum.
1979, Chromascope Video Synthesizer, PAL and NTSC versions. Created by Robin Palmer. Manufactured by Chromatronics, Essex, UK. Distribution by CEL Electronics. Model P135 (2,000 units built) and Model C.101 (100 units built).

1980s
1984, Fairlight CVI Computer Video Instrument: The Fairlight CVI was produced in the 1980s and is a hybrid analog/digital video processor.

2000s
2008, Lars Larsen and Ed Leckie founded LZX Industries and began developing new analog video synthesizer modules (Visionary, Cadet, and Expedition Series).
2011, Critter & Guitari Video Scope: preset video synthesizer.
2013, Critter & Guitari Rhythm Scope: preset video synthesizer.
2014, Critter & Guitari Black & White Video Scope: preset video synthesizer.
2014, Ming Mecca: modular pixel-art-oriented analog video synth
2016, Paracosm Lumen: semi-modular software video synth for MacOS.
2016, Vsynth: a modular software video synthesizer package for Max/Jitter.
2016, Ming Micro: pixel-art-oriented digital video synth
2017, Critter & Guitari ETC: video synthesizer that supports 720p output.