Interfacing audio & images


Images: 640×480

JPEG image (29 KB), JPEG image (22 KB)



Project: interactive art

  • URL :

    Videos and extracted images: 320×240

    QuickTime video -> Film/Video (3.2 MB)
    JPEG images -> (10 KB) (10 KB)

    QuickTime video -> Film/Video (1.9 MB)
    JPEG images -> (11 KB) (12 KB)


    Interfacing audio & images is a collection of innovative digital experiments: connecting images to sound, introducing 'liquid plasmatic architexture', interactive ballet choreography, and further approaches to virtual data-deconstruction. All experiments, transformations & installations have been developed by 'Supreme Particles', introducing the 'hybrid matrix-strategy'.

    Technical Information

    • proprietary software
    • hardware silicon graphics

    More Information...

    • Abstract :

      This text describes several approaches to interfacing audio and images, all developed at the Institute for New Media in Frankfurt, Germany. I strongly believe that all of the discussed topics could easily be applied to an interactive stage show using real-time graphics hardware, which no one has yet performed at this scale. Additional keywords: audio, multimedia, soundtrack, stage show, MIDI, Virtual Reality, Digital Signal Processing, real-time graphics.

      1) Introduction

      For the last few years, the main focus of my work has been on controlling pictures through audio input and applying different setups in an art and performance environment. There already exists some tradition of synchronizing sound to images, as in multimedia installations or video games. A central fact is that most artistic electronic work, in the sense of craftsmanship, can be done automatically by a computer instead of an operator. A large library of file formats for both audio and images already serves this purpose, such as Standard MIDI Files (SMF), AIFF, TIFF and so on. Along with special hard- and software, there should now be enough tools for interfacing pictures with sound, even in real-time applications. These setups could then be used for live stage shows or other interactive purposes. Even the production of video clips could be automated in very complex ways: instead of an exhausting frame-by-frame rendering of images, 3D models could be extracted directly from video images or audio input in any format, and the effects could be seen as they happen.

      2) Inputmaterials

      2.1 Audio

      Audio data can be supplied by the following sources:
      • Standard MIDI Files (created by software sequencers such as Cubase, Vision, Digital Performer...)
      • MIDI data (created by MIDI keyboards, MIDI guitars, ...)
      • analog sound sources (violins, space shuttle, voice, atomic explosion, ...)
      • prerecorded samples on hard disk
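The last source above, a prerecorded sample on hard disk, can be reduced to a simple control value for driving images. A minimal sketch in modern Python (purely illustrative, not part of the original setups; the in-memory WAV merely stands in for a sample stored on disk):

```python
import io
import math
import struct
import wave

def make_test_tone(freq=440.0, seconds=0.1, rate=8000):
    """Build a mono 16-bit PCM WAV in memory (stands in for a
    prerecorded sample on hard disk)."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        n = int(seconds * rate)
        w.writeframes(b"".join(
            struct.pack("<h", int(32000 * math.sin(2 * math.pi * freq * i / rate)))
            for i in range(n)))
    buf.seek(0)
    return buf

def read_samples(src):
    """Decode mono 16-bit PCM frames into floats in [-1, 1]."""
    with wave.open(src, "rb") as w:
        raw = w.readframes(w.getnframes())
    return [s / 32768.0 for (s,) in struct.iter_unpack("<h", raw)]

samples = read_samples(make_test_tone())
peak = max(abs(s) for s in samples)  # a simple amplitude value for driving images
```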

      2.2 Images

      Image data can be supplied by the following sources:
      • live camera input
      • laserdisc
      • hard disk (Abekas A60, stores 30 seconds of video)
      • videotape
      • workstations with real-time graphics capabilities

      3) Image manipulation

      3.1 Non-Real-time

      Non-real-time techniques can supply visual footage for laserdiscs, hard disks or videotapes that can be triggered later on. These techniques mainly include algorithms that require large amounts of calculation time:
      • Digital Signal Processing: FFT, filtering, edge detection...
      • traditional 2D or 3D animation
      • complex, audio-triggered image synthesis
      • standard video post-production
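The Digital Signal Processing entry above can be illustrated with a minimal radix-2 FFT (the classic Cooley-Tukey algorithm, sketched here as an assumption, not the original proprietary software), showing how a dominant frequency can be extracted from an audio window:

```python
import cmath
import math

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# a pure tone occupying bin 8 of a 64-point window
signal = [math.sin(2 * math.pi * 8 * i / 64) for i in range(64)]
spectrum = [abs(c) for c in fft(signal)]
# the loudest bin in the first half of the spectrum is the dominant frequency
dominant = max(range(64 // 2), key=lambda k: spectrum[k])
```

The dominant bin (or the whole spectrum) could then be used to trigger or synthesize image material offline.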

      3.2 Real-time

      Real-time techniques can be used to manipulate image data in live interaction:
      • live mixing with video devices
      • computer-controlled hard-disk and laserdisc access
      • mapping of raw or interpreted audio data onto 3D grids or models
      • texture mapping on 3D grids
      • warping
      • morphing
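The mapping of raw audio data onto 3D grids can be sketched as follows; the grid layout and the sample-to-vertex assignment are assumptions for illustration, not the original Silicon Graphics implementation:

```python
import math

def displace_grid(width, height, samples):
    """Map an audio buffer onto a flat grid: each vertex's z-offset is
    driven by one sample, so the waveform becomes relief in 3D space."""
    grid = []
    for j in range(height):
        row = []
        for i in range(width):
            s = samples[(j * width + i) % len(samples)]
            row.append((i, j, s))  # (x, y, z) with z driven by audio
        grid.append(row)
    return grid

# example: a low-frequency sine "bends" an 8x8 grid
samples = [math.sin(2 * math.pi * i / 16) for i in range(64)]
grid = displace_grid(8, 8, samples)
```

In a live setup the sample buffer would be refilled every frame, so the grid ripples with the incoming sound.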

      4) Soundsynthesis

      Along with the manipulation of image data, sound can also be processed or changed with the help of MIDI-controllable devices such as samplers, mixing consoles, effects units or light controllers.
      • autocomposition:
      • MIDIbrain: listens to the input and supplies a musical accompaniment
      • MIDIcloning: imitates the input
      • mathematical methods: random walk, Feigenbaum, ...
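The two mathematical methods named above, random walk and the Feigenbaum (logistic) map, can be sketched as MIDI note generators; function names and parameter ranges here are hypothetical:

```python
import random

def random_walk_melody(start=60, steps=16, max_step=2, seed=1):
    """Random-walk autocomposition: each MIDI note moves a small
    interval up or down from the previous one."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(steps - 1):
        nxt = notes[-1] + rng.randint(-max_step, max_step)
        notes.append(min(108, max(21, nxt)))  # clamp to the piano range
    return notes

def logistic_melody(r=3.9, x0=0.5, steps=16, low=48, high=72):
    """Feigenbaum-style autocomposition: iterate the logistic map
    x -> r*x*(1-x) and scale the chaotic orbit onto a note range."""
    notes, x = [], x0
    for _ in range(steps):
        x = r * x * (1 - x)
        notes.append(low + int(x * (high - low)))
    return notes
```

Either stream of notes could be sent to a sampler or synthesizer as the "musical backup" a MIDIbrain would supply.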

      5) Results

      • XTRA.TRAX: A program creates video edit lists from Standard MIDI Files, which then automatically assemble a video clip. The Abekas A60 hard-disk recorder is controlled by a musical instrument. In this way, a video is no longer cut, but controlled instantly in time.
      • CYBERTIME: fractal algorithms create sequences of video and a soundtrack.
      • FLUTE: a flute transforms texture space in real time.
      • PAULA CHIMES: A musical instrument made of 16 chimes and 2 video monitors is capable of real-time image warping and autocomposition of music.
      • TILT IT: A video that uses its soundtrack for the direct creation of 3D computer graphics. For instance, a sample of a guitar solo is calculated, animated into 3D space and reflection-mapped with the original picture that created the sound.
      • DWARFMORPH: A piece of software that can create 3D models out of 2D video images. In a real-time application, the viewer of a scene can directly become part of the 3D world he looks at.
      • ELASTIC GUITAR GRIDS: the sound of a guitar creates waves on a static image.
      • BIOGRIDS: images & sounds create colonies of objects.
      • HYENA DAYS: The musicians control image and sound devices with their instruments (Steina Vasulka: violin, Michael Saup: guitar).
      • SOUNDSCULPT: sound transforms form.
      • WARPING THE BIENNALE VENEZIA 1993: The voice of Peter Weibel warps his own face.
      • SOUNDMAPPING: sounds are mapped onto 3D objects.

    • Some more Comments :

      Information taken from a fax by Michael Saup.

  • Copyright © 1994-2024