Interfacing audio & images
- Michaël Saup - brainware and software
- Bob O'Kane - software
- Gideon May - software
Project: interactive art
URL: http://www.rz.uni-frankfurt.de/~supreme
Videos (QuickTime, 3.2 MB and 1.9 MB) with extracted JPEG images (10-12 KB each), 320x240
Interfacing audio & images is a collection of innovative digital
experiments such as connecting images to sound, introducing
'liquid plasmatic architexture', interactive ballet choreography,
and further approaches to virtual data-deconstruction.
All experiments, transformations & installations have been developed
by 'supreme particles', introducing the 'hybrid matrix-strategy'.
Key words:
- proprietary software
- Silicon Graphics hardware
This text describes several approaches to interfacing audio and images, all done
at the Institute for New Media in Frankfurt, Germany. I strongly believe
that all of the discussed topics could easily be applied to an interactive
stage show using real-time graphics hardware, which no one has yet performed at this scale.
Additional keywords: audio, multimedia, soundtrack, stage show, MIDI, Virtual Reality,
Digital Signal Processing, real-time graphics.
For the past few years, the main focus of my work has been on controlling
pictures through audio input and applying different setups in an art
and performance environment. There already exists some tradition
of synchronizing sound to images, as in multimedia installations or video games.
A main point is that most artistic electronic work, in the sense of craftsmanship,
can be done automatically by a computer instead of an operator. There already exists
a large library of file formats for both audio and images that serve this purpose,
such as Standard MIDI Files (SMF), AIFF, TIFF and so on. Along with specialized
hard- and software, there should now be enough tools for interfacing pictures
with sound, even in real-time applications. These setups could then be used for
live stage shows or other interactive purposes. Even the production of video clips
could be automated in very complex ways. Instead of an exhausting frame-by-frame
rendering of images, 3D models could be extracted directly from video images or
audio input in any format, and the effects could be seen just as they happen.
Audio data can be supplied by the following sources:
- Standard MIDI Files (created by software sequencers like Cubase, Vision, Digital
- MIDI data (created by MIDI keyboards, MIDI guitars, ...)
- Analog sound sources (violin, space shuttle, voice, atomic explosion, ...)
- Prerecorded samples on hard disk
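None of the original tools are reproduced here, but the Standard MIDI File format mentioned above is well documented: every SMF begins with an 'MThd' chunk whose big-endian fields give the file format, the number of tracks and the time division. A minimal header parser might look like this (a sketch, not part of the project's own software):

```python
import struct

def parse_smf_header(data: bytes):
    """Parse the 14-byte header chunk of a Standard MIDI File.

    Returns (format, number_of_tracks, division). Raises ValueError
    if the data does not start with a valid 'MThd' chunk.
    """
    if len(data) < 14 or data[:4] != b"MThd":
        raise ValueError("not a Standard MIDI File")
    # Big-endian: uint32 chunk length, then three uint16 fields.
    length, fmt, ntrks, division = struct.unpack(">IHHH", data[4:14])
    if length != 6:
        raise ValueError("unexpected MThd chunk length")
    return fmt, ntrks, division

# Example: a format-1 file with 2 tracks at 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 1, 2, 480)
print(parse_smf_header(header))  # (1, 2, 480)
```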
Image data can be supplied by the following sources:
- live camera input
- hard disk (Abekas A60, stores 30 seconds of video)
- workstations with real-time graphics capabilities
3) Image manipulation
Non-real-time techniques can supply visual footage for laser disks, hard disks or videotapes
that can be triggered later on. These techniques mainly involve algorithms that require large
amounts of calculation time:
- Digital Signal Processing: FFT, filtering, edge detection, ...
- traditional 2D or 3D animation
- complex, audio-triggered image synthesis
- standard video postproduction
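As an illustration of the edge-detection step listed above (a sketch, not the original implementation), a simple finite-difference gradient on a grayscale grid already marks brightness jumps:

```python
def edge_magnitude(img):
    """Approximate edge strength of a grayscale image (list of rows)
    using horizontal and vertical finite differences."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]   # vertical gradient
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: left half dark, right half bright.
img = [[0, 0, 255, 255] for _ in range(4)]
edges = edge_magnitude(img)
print(edges[1])  # strongest response where brightness jumps
```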
Real-time techniques can be used to manipulate image data in live interaction:
- live mixing with video devices
- computer-controlled hard disk and laser disk access
- mapping of raw or interpreted audio data onto 3D grids or models
- texture mapping on 3D grids
Along with the manipulation of image data, sound can also be processed or changed
with the help of MIDI-controllable devices such as samplers, mixing consoles,
effects units or light controllers.
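MIDI control of such devices ultimately comes down to sending 3-byte channel messages; a Control Change message, for example, is the status byte 0xB0 plus the channel number, followed by the controller number and value. A minimal sketch, not tied to any particular device:

```python
def control_change(channel, controller, value):
    """Build a raw 3-byte MIDI Control Change message.

    channel: 0-15, controller: 0-127, value: 0-127.
    The status byte is 0xB0 OR-ed with the channel number.
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("argument out of range")
    return bytes([0xB0 | channel, controller, value])

# Set controller 7 (channel volume) to 100 on channel 0.
msg = control_change(0, 7, 100)
print(msg.hex())  # b00764
```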
- MIDIbrain: listens to the input and supplies a musical backing
- MIDIcloning: imitates the input
- mathematical methods: random walk, Feigenbaum, ...
- XTRA.TRAX: a program creates video edit lists from Standard MIDI Files,
which then automatically assemble a videoclip.
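The 'Feigenbaum' entry above presumably refers to the logistic map, whose period-doubling route to chaos Feigenbaum analysed; a hypothetical use is to let its iterates drive a visual or musical parameter:

```python
def logistic_series(r, x0, n):
    """Iterate the logistic map x -> r * x * (1 - x) n times,
    returning the whole trajectory including the start value."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# In the chaotic regime (r = 3.9) the series never settles, which
# makes it a cheap source of organic-looking control data.
series = logistic_series(3.9, 0.5, 10)
print(all(0.0 <= x <= 1.0 for x in series))  # True
```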
The Abekas A60 hard disk recorder is controlled by a musical instrument.
With this, a video is no longer cut but controlled instantly in time.
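The actual control protocol of the A60 is not given here; as a purely hypothetical sketch, a MIDI note number could be mapped linearly onto the stored frames (assuming 25 frames per second, 30 seconds gives 750 frames), so playing up and down the keyboard scrubs through the video:

```python
def note_to_frame(note, frames=750):
    """Map a MIDI note number (0-127) linearly onto a frame index.

    Hypothetical mapping: 'frames' assumes 30 s of video at 25 fps.
    """
    if not 0 <= note <= 127:
        raise ValueError("note out of range")
    return note * (frames - 1) // 127

print(note_to_frame(0), note_to_frame(127))  # 0 749
```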
- CYBERTIME: fractal algorithms create sequences of video and a soundtrack.
- FLUTE: a flute transforms texture space in real-time.
- PAULA CHIMES: a musical instrument made of 16 chimes and 2 video monitors
is capable of real-time image warping and automatic composition of music.
- TILT IT: a video that uses its soundtrack for the direct creation of
3D computer graphics. For instance, a sample of a guitar solo is calculated,
animated into 3D space and reflection-mapped with the original picture that
created the sound.
- DWARFMORPH: software that is able to create 3D models out of 2D video images.
In a real-time application, the viewer of a scene can directly become part of the 3D world he looks at.
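The software itself is not reproduced here; its underlying idea, lifting a 2D image into a 3D model, can be sketched by treating each pixel's brightness as a height (an assumption about the method, shown as a minimal height-field conversion):

```python
def image_to_heightfield(img, z_scale=0.1):
    """Turn a grayscale image (list of rows, values 0-255) into
    3D vertices whose z coordinate comes from pixel brightness."""
    verts = []
    for y, row in enumerate(img):
        for x, lum in enumerate(row):
            verts.append((x, y, lum * z_scale))
    return verts

# A tiny 2x2 "video frame": dark and bright pixels become low and high vertices.
img = [[0, 128], [255, 64]]
verts = image_to_heightfield(img)
print(len(verts))  # 4
```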
- ELASTIC GUITAR GRIDS: the sound of a guitar creates waves on a static image.
- BIOGRIDS: images & sounds create colonies of objects.
- HYENA DAYS: the musicians control image and sound devices with their
instruments (Steina Vasulka: violin, Michael Saup: guitar).
- SOUNDSCULPT: sound transforms form.
- WARPING THE BIENNALE VENEZIA 1993: The voice of Peter Weibel warps his own face.
- SOUNDMAPPING: sounds are mapped onto 3D-objects.
Some more comments:
Information taken from a fax from Michaël Saup.