Our proposal for the SPLICE 2018 festival involves two parts (“theory and practice”): First, a presentation that follows the development of machines for audiovisual performance, from early history to the present, along with a discussion of our proprietary (software) machine for creating real-time visuals in a performance environment; this will be followed by a discussion of the development of software tools for live visuals.
Second, an AV performance demonstrates our process and the aforementioned instrument (in conjunction with a number of other music and visual tools) in a live environment.
“The world today is shaped by machines.”
Therefore, declared Bruno Munari's manifesto:
“Artists must be interested in machines.”
A century ago, art movements such as the Futurists (with whom Munari was associated) strove to invent new paradigms for creative endeavor – paradigms that reflected the new reality of the world of machines, and that addressed the ways in which this world inevitably shapes those who inhabit it …
Soon the machine was not only a subject and an aesthetic, but also an active participant in the creation of visual and auditory experiences.
Artists such as László Moholy-Nagy and Nicolas Schöffer developed mechanical and cybernetic instruments that could generate audio-visual performances in real time. In this “new plastic adventure” – as Schöffer described it – “sound, movement, space, light, color, as they interconnect, will form structures with multiple counterpoints in an architecture … without beginning or end.”
[ … a context in which … ]
“The artist no longer creates … works. [We] create … creation.”
In the 21st century, artists have a wealth of new technology with which to work. This of course includes modern CPUs and GPUs … but the “new plastic adventure” arguably takes place in the domain of software.
About a decade ago, while working in performance with both music and visuals … the tools for the visual side (especially for narrative pieces) seemed inadequate: like DJ tools, they were limited to mixing and some effects …
… what was needed was more control to generate visual content on the fly … in a manner analogous to the way a music synthesizer works: providing full, parametric control over the creation of material, not just the mixing of it (a rough sketch of this idea appears below) …
This led to the development of a software tool called VS3 (“visual synthesizer iteration 3”), which was used in performances for a number of years …
… but was eventually in need of an upgrade …
VS4 is the latest version. It takes advantage of the powerful new platforms and graphics engines available today …
… but is still in many ways inspired by the ideas, and the machines (such as the Light Space Modulator of László Moholy-Nagy), of a century ago …
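To make the synthesizer analogy concrete, the sketch below shows, in broad strokes, what “parametric” generation of visuals means: a frame is computed from a handful of controllable parameters rather than mixed from pre-rendered clips. This is a minimal illustration only, not VS3/VS4 code; every function and parameter name here is hypothetical.

```python
# Minimal sketch only (hypothetical; not VS3/VS4 code): a frame is *generated*
# from a few parameters, rather than mixed from pre-rendered clips.
import numpy as np

def render_frame(t, freq=4.0, phase=0.0, hue=0.6, width=320, height=240):
    """Compute one RGB frame from a handful of synthesis parameters."""
    grid = np.mgrid[0:height, 0:width] / float(height)
    y, x = grid[0], grid[1]
    # A simple animated interference pattern, driven entirely by the parameters.
    wave = np.sin(2 * np.pi * (freq * x + phase + t)) * np.cos(2 * np.pi * freq * y)
    value = (wave + 1.0) / 2.0                    # normalize to the range 0..1
    frame = np.zeros((height, width, 3))
    frame[..., 0] = value * hue                   # crude parametric coloring
    frame[..., 1] = value * (1.0 - hue)
    frame[..., 2] = value
    return (frame * 255).astype(np.uint8)

# In performance, a controller (MIDI, OSC, etc.) would update freq, phase, and
# hue on every frame, the way knobs drive an oscillator on a music synthesizer.
frame = render_frame(t=0.0, freq=6.0, phase=0.25, hue=0.4)
print(frame.shape)  # (240, 320, 3)
```

The point is only the control model: parameters in, imagery out, evaluated continuously in real time.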
(Except where noted, all material in the video is original, with visuals generated using VS4 or its predecessor, VS3.)
This video is also available on Vimeo.
Lecture and Discussion
We envision a presentation that follows the development of machines for audiovisual performance, from early history to the present, along with a discussion of our proprietary (software) machine for creating real-time visuals in a performance environment. This will be followed by a discussion of the development of software tools for AV performance, including the use of contemporary game engines (e.g., Unity) for this purpose.
Approximately 45 minutes. Further details coming soon.
Performance
Audiovisual performance; two artists. Equipment will include two computers with custom software, plus additional musical instruments and controllers. Stereo audio mix and video projection. Approximately 30 minutes. Details coming soon.
Personnel
Amy Zimmitti is Course Director in the Recording Arts and Game Development programs at the Los Angeles Film School. She has been involved in preparing and performing AV productions at a variety of venues in the US and abroad for almost 20 years. She studied Game Design and Recording Arts at Full Sail University, Studio Art and Psychology at the University of Rhode Island, and Psychology, Electro-Acoustics, and Public/Landscape Sculpture at Liverpool Hope University. She currently resides in Los Angeles.
Peter G. Johnson is a new media producer, creative director, and technologist with experience at a number of major interactive and experiential agencies. He was the first Liaison Software Engineer at Avid Technology, working with the Media Composer and Film Composer products. His video work has been exhibited at numerous venues, including the Museum of Modern Art in New York City. He holds a BS in Computer Science from Worcester Polytechnic Institute and a BA in Studio Art from the University of Massachusetts. He currently resides in Los Angeles.