“ARCHiTEXTURE by SUPREME PARTiCLES 1994” by Saup


Conference:


Type(s):


Entry Number: 02

Title:


    ARCHiTEXTURE by SUPREME PARTiCLES 1994

Program Title:


    The Edge

Presenter(s):


Collaborator(s):


Project Affiliation:


    Institut fuer Neue Medien

Description:


    ARCHiTEXTURE is an interactive image/sound/room installation using real-time image and sound processing. Through the architectural and technological setup, the viewer is confronted with a computer-generated “plasmatic being.” Both visual and acoustic perception are stimulated.

    Implementation of a “brainlike” software structure in virtual reality environments frees the interactive process from its 1:1 transmission of information. The brain creates a history and uses the past to assemble the future. It also acts in a destructive manner: it can forget useless information and substitute it with meaningful data. The connection/interfacing between sound and image creates a double plasma: the visual plasma and the acoustic plasma. Real-time computation of both media requires sophisticated technology.
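    The following is a minimal sketch, not the installation's actual software, of the “brainlike” structure described above: a bounded memory that keeps a weighted history, forgets its least useful entries when full, and assembles new output from fragments of the past. The capacity, salience scores, and blending rule are illustrative assumptions.

    ```python
    # Sketch only: a bounded "brain" memory that keeps a history, forgets its
    # least salient entries, and assembles new output from the past.
    import random

    class PlasmaMemory:
        def __init__(self, capacity=64):
            self.capacity = capacity
            self.items = []          # list of (salience, data) tuples

        def store(self, data, salience):
            if len(self.items) >= self.capacity:
                # "destructive" step: forget the least salient entry
                self.items.remove(min(self.items, key=lambda it: it[0]))
            self.items.append((salience, data))

        def recall(self, k=3):
            # assemble a "future" from a salience-weighted sample of the past
            if not self.items:
                return []
            weights = [s for s, _ in self.items]
            return [d for _, d in random.choices(self.items, weights=weights, k=k)]

    memory = PlasmaMemory(capacity=8)
    for frame in range(20):
        memory.store(data=f"fragment-{frame}", salience=random.random())
    print(memory.recall())
    ```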

    Setup
    In the middle of the room is a pneumatic projection screen that can be controlled with a computer. In front of the projection screen, a circular pattern is painted on the floor. This is the center of action and interaction.

    Interaction
    1. Image
    As the viewer steps into the room, the screen shows the image from a camera: a wide-angle shot of the space. As if in a mirror that illuminates hidden structures, the viewer is shown crossing the room. When the viewer steps into the middle of the circular pattern, two shots are made from two different angles with two different focal lengths. These images are transformed into 3D models and interpolated in real time by the specially designed ‘plasmatic’ software. Liquid, organic images are created. Then the image is put back as a texture over these objects and, through further software-based manipulation, is turned into PLASMA. The (software) mirror starts a life of its own. The “reflection” expands, bulges, and transforms continuously. At the same time, the pneumatic projection screen transforms physical flatness into 3D space to support the “plasmatic” effect.
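    One way to picture the real-time interpolation of the two camera-derived 3D models is the sketch below; the grid size, the oscillating blend weight, and the stand-in vertex data are assumptions, not details of the original ‘plasmatic’ software.

    ```python
    # Sketch only: two 3D vertex grids derived from the two camera shots are
    # blended with a slowly drifting weight, so the "reflection" bulges and
    # flows between the two states.
    import numpy as np

    H, W = 64, 64
    model_a = np.random.rand(H, W, 3)   # stand-in for the first shot's 3D model
    model_b = np.random.rand(H, W, 3)   # stand-in for the second shot's 3D model

    def plasma_frame(t, rate=0.1):
        """Blend the two models; t is the frame index."""
        w = 0.5 + 0.5 * np.sin(rate * t)         # weight drifts between 0 and 1
        return (1.0 - w) * model_a + w * model_b  # per-vertex linear interpolation

    for t in range(5):
        vertices = plasma_frame(t)
        # vertices would now be textured with the live camera image and rendered
        print(t, vertices.mean())
    ```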

    2. Movement and Sound
    Confronted with a fragmented self-image, the viewer acts to correct the reflection. As the viewer moves, the PLASMA starts to react. Not only visual patterns, but also interpolated speech is created. A microphone collects all speech and noises in the room, storing them in small units in the computer’s memory. There they are “soundmorphed” and interpolated into an artificial language that consists not of concrete messages but of sound with a strong emotional value, at least as long as the viewer is moving in a hectic, excited fashion. As the viewer changes to slower, smoother movements, the PLASMA changes, too. It starts giving off pieces of sound that were stored earlier. What is more, the artificial being is no longer reduced to passive imitation of the viewer. It starts moving according to its own laws of interpolation, and as the viewer’s movements become more coordinated and smoother, the reaction of PLASMA is no longer mirror-like. Now it is life-like. And it starts playing the active part, “talking” to the viewer and using its “brain” to construct two-way communication.
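    A rough sketch of how the viewer's movement could steer the PLASMA between mirror-like and autonomous behaviour: frame differencing yields a motion energy, and calmer movement increases the share of autonomous motion. The threshold and mapping are illustrative, not taken from the installation.

    ```python
    # Sketch only: hectic movement (large inter-frame change) keeps the PLASMA
    # in mirror mode; calm movement shifts it toward its own autonomous motion.
    import numpy as np

    def motion_energy(prev_frame, frame):
        """Mean absolute pixel change between two grayscale frames."""
        return float(np.mean(np.abs(frame.astype(float) - prev_frame.astype(float))))

    def behaviour_mix(energy, calm_level=4.0):
        """0.0 = pure mirror of the viewer, 1.0 = fully autonomous PLASMA."""
        return float(np.clip(1.0 - energy / calm_level, 0.0, 1.0))

    prev = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
    curr = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
    e = motion_energy(prev, curr)
    print("energy:", round(e, 2), "autonomy:", round(behaviour_mix(e), 2))
    ```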

    Software
    1. Visual
    The software transforms 2D video images into 3D computer models that are further transformed via local growth, bubbles, texture gravitation, and texture interpolation.
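    As an illustration of lifting a 2D video image into a 3D model with a touch of local growth, the sketch below turns pixel brightness into a height field and relaxes it toward its neighbourhood average; the growth rate and scaling are assumptions, and the installation's actual operators are not documented here.

    ```python
    # Sketch only: brightness becomes height, and a smoothed copy of that
    # height is fed back as crude "local growth".
    import numpy as np

    def image_to_model(gray, growth=0.2):
        h, w = gray.shape
        z = gray.astype(float) / 255.0
        # push each vertex toward its 4-neighbour average
        avg = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0
        z = z + growth * (avg - z)
        ys, xs = np.mgrid[0:h, 0:w]
        return np.dstack([xs, ys, z * 30.0])   # (h, w, 3) vertex grid

    frame = np.random.randint(0, 256, (48, 64), dtype=np.uint8)
    print(image_to_model(frame).shape)
    ```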

    2. Audio
    An organic sound ambience is generated via interpolation of frequencies (soundmorphing), a 3D sound system with 8-12 speakers depending on the room, 3D active MIDI-processes, MIDI-evolution, MIDI-gravitation, and computer-controlled digital signal processing effects processors.
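    Soundmorphing by interpolation of frequencies can be sketched as a cross-fade of magnitude spectra between two stored sound units, as below; the frame length, the linear cross-fade, and the reuse of one unit's phase are simplifying assumptions rather than the installation's actual DSP chain.

    ```python
    # Sketch only: blend the magnitude spectra of two short sound units and
    # resynthesise the result.
    import numpy as np

    def soundmorph(unit_a, unit_b, mix):
        """Blend two equally long mono sound units; mix=0 -> A, mix=1 -> B."""
        spec_a, spec_b = np.fft.rfft(unit_a), np.fft.rfft(unit_b)
        mag = (1.0 - mix) * np.abs(spec_a) + mix * np.abs(spec_b)
        phase = np.angle(spec_a)                  # keep A's phase for simplicity
        return np.fft.irfft(mag * np.exp(1j * phase), n=len(unit_a))

    sr = 8000
    t = np.arange(sr) / sr
    voice_like = np.sin(2 * np.pi * 220 * t)
    noise_like = np.random.randn(sr) * 0.3
    print(soundmorph(voice_like, noise_like, mix=0.5).shape)
    ```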

    Bio-Grid Library
    All functions are designed to create organic behaviour among audio data, digital images, and 3D coordinates. All modules can be connected; the same functions can be applied to audio, images, or coordinate space. The two-dimensional video images are transformed into three-dimensional coordinate space. In this way, an additional space/time dimension is introduced to the system – a topography of the image. Unlike the organisation of images in time, this opens space for image-internal metamorphosis in the space/time dimension. The information of the image can therefore be freed from its static representation and transformed into states with enhanced or reduced complexity. The additional inheritance of external information (audio data, etc.) allows image behaviour to be controlled through external parameters.
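    The idea that one Bio-Grid module applies equally to audio data, images, and coordinate space can be sketched with a single “gravitation” operator that pulls any array of values toward its centroid; the operator and its strength parameter are illustrative assumptions.

    ```python
    # Sketch only: one module, three domains (audio, image, 3D coordinates).
    import numpy as np

    def gravitation(data, strength=0.1):
        """Pull every element toward the centroid of the data set."""
        data = np.asarray(data, dtype=float)
        centroid = data.mean(axis=0)
        return data + strength * (centroid - data)

    audio = np.random.randn(1024)            # one channel of sound
    image = np.random.rand(64, 64)           # grey-level image
    coords = np.random.rand(500, 3) * 10.0   # 3D coordinate space
    for name, d in (("audio", audio), ("image", image), ("coords", coords)):
        print(name, gravitation(d).shape)
    ```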

    Methods of the “PLASMATIC” Software
    Order: Chaos – gravitation, dynamics, random walk, worms
    Life algorithms – transformation from 2D into 3D, external sound-specific control, manipulation in color-space
    Creation of subpatterns / substructures – texture mapping, texture animation, paint and wax effects, 2D / 3D warping
    Disintegration of objects into particles – light animation
    Digital signal processing applied to – 3D objects, 2D images, sound, coordinates / Z-buffers
    Sound-specific controls – control of dynamics through audio input, sound mapping (mapping of audio data onto 3D objects), sound-driven image processing
    Sound-specific parameters – amplitude, frequency, average, minimum, maximum, variance
    MIDI-specific parameters – pitch, velocity, xyz coordinates, instrument/sound choice
    Sound generation – mapping of 3D coordinates onto MIDI parameters, mapping of 3D coordinates onto a 3D speaker matrix, use of 3D coordinates as envelopes and timelines
    Metamorphosis/interpolation of – 3D objects, sound (soundmorphing), images/colors
    Interpolation with the help of – gain/bias functions (see the sketch after this list), fractal algorithms, gravitation/magnetism
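    A small sketch of gain/bias-shaped interpolation (in the spirit of the classic bias/gain remapping curves) applied to a 0..1 blend factor; treating these exact curves as the PLASMATIC software's own is an assumption.

    ```python
    # Sketch only: bias/gain curves reshape a linear blend factor before it is
    # used to interpolate objects, sounds, or colours.
    import math

    def bias(b, t):
        """Remap t in [0,1]; b < 0.5 pushes values down, b > 0.5 pushes them up."""
        return t ** (math.log(b) / math.log(0.5))

    def gain(g, t):
        """Remap t in [0,1]; g controls how values cluster around 0.5."""
        if t < 0.5:
            return bias(1.0 - g, 2.0 * t) / 2.0
        return 1.0 - bias(1.0 - g, 2.0 - 2.0 * t) / 2.0

    def interpolate(a, b_value, t, shape=0.25):
        """Blend two values with a gain-shaped weight instead of a linear one."""
        w = gain(shape, t)
        return (1.0 - w) * a + w * b_value

    print([round(interpolate(0.0, 1.0, t / 10.0), 3) for t in range(11)])
    ```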

