Project Link: https://dmgallery.apps01.yorku.ca/thedarksideofthehills/
## Creative Description
“The Dark Side of the Hills” is an interactive audio project developed with *Max/MSP* and *Ableton*, showcased at the [York University Digital Media Exhibition](https://dmgallery.apps01.yorku.ca/thedarksideofthehills/). The project explores real-time interaction between audio and visuals to create an immersive experience. **Max/MSP** processes audio files to generate a synchronized visualizer that responds dynamically to audio frequencies, while **Ableton** manages and manipulates the soundscapes. By blending sound and visuals, the project lets audiences experience the connection between the two in an interactive format. As viewers engage, both elements evolve in real time, highlighting the potential of digital media and interactive technologies in modern artistic expression.
## Technical Description

**Dynamic Music-Driven 3D Terrain Visualization Using Max/MSP**
This project explores the interplay between music and generative 3D visuals to create an immersive ocean-like terrain that responds in real time to audio elements. Built using Max/MSP, the core visualization employs a dynamically generated plane connected to a custom gen patch, which filters and maps key musical components for real-time terrain transformation.
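The gen patch itself is not reproduced here; as a rough, language-agnostic illustration of the kind of mapping it performs, the Python sketch below drives a plane's height field from a filtered amplitude value, combining a slow ocean-like swell with an amplitude-scaled ripple. The function names and constants are hypothetical, not taken from the actual patch.

```python
import numpy as np

def terrain_height(x, y, t, amp, base_scale=0.15, amp_scale=0.6):
    """Illustrative height field: a slow ocean-like swell plus an
    amplitude-driven ripple. All constants are hypothetical, not values
    from the real gen patch."""
    swell = base_scale * np.sin(0.5 * x + 0.8 * t) * np.cos(0.4 * y + 0.3 * t)
    ripple = amp_scale * amp * np.sin(2.0 * x + 3.0 * t) * np.sin(2.0 * y)
    return swell + ripple

# Evaluate the field over a small grid, as the renderer would for the
# plane's vertices each frame.
xs, ys = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
heights = terrain_height(xs, ys, t=0.0, amp=0.4)
print(heights.shape)  # (64, 64) grid of vertex displacements
```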
### Core Functionality:
1. **Audio Input and Signal Processing**:
- The generative system uses peak amplitude detection on the live music feed as the main driver for terrain modulation. By filtering specific audio characteristics, such as peaks and dynamic changes, the system manipulates the 3D environment in step with the music (a sketch of this detection stage follows the list).
2. **Visual Terrain and Object Manipulation**:
- The primary visual component is a plane geometry that morphs in response to real-time audio signals. The terrain is rendered with Jitter's OpenGL (jit.gl) objects, with the gen patch acting as the control mechanism for dynamic shifts in the landscape.
- A small hidden jit.gl shape serves as an anchor point for the virtual camera, guiding the viewer through the terrain at an offset that simulates travel (see the camera sketch after this list). The result is a continuous forward drift that mimics the ebb and flow of ocean waves, giving a feeling of motion and progression.
3. **Camera and Perspective**:
- The camera maintains a steady perspective that offers an overview of the terrain, enabling viewers to observe the evolving landscape without overwhelming their field of view.
4. **Lighting and Time Transitions**:
- A rotating light source simulates the transition from day to night, reinforcing the passage of time while keeping the central focus on the terrain (see the lighting sketch after this list).
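For item 1, the peak detection happens inside Max (plausibly with a peak-following object such as peakamp~, though the exact object used is an assumption); the Python sketch below shows the equivalent per-block peak computation, with an assumed block size.

```python
import numpy as np

def block_peaks(signal, block_size=512):
    """Peak amplitude per block, analogous to polling a peak follower
    at a fixed interval. The block size is an assumption."""
    n_blocks = len(signal) // block_size
    trimmed = signal[: n_blocks * block_size].reshape(n_blocks, block_size)
    return np.abs(trimmed).max(axis=1)

# Example: a one-second 440 Hz tone with a fade-in, sampled at 44.1 kHz.
sr = 44100
t = np.linspace(0.0, 1.0, sr, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t) * np.linspace(0.0, 1.0, sr)
peaks = block_peaks(tone)
print(peaks[:5], peaks[-1])  # peak values rise as the fade-in progresses
```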
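For item 2, the hidden shape gives the camera something to trail at a fixed offset; a minimal Python sketch of that follow-with-offset behaviour (the offset and easing factor are hypothetical, and the real patch does this with jit.gl objects rather than code like this) could look like the following.

```python
def follow_camera(anchor_pos, cam_pos, offset=(0.0, 2.0, 4.0), smooth=0.1):
    """Ease the camera toward a fixed offset behind and above the hidden
    anchor, moving a fraction of the remaining distance each frame so the
    motion reads as a continuous glide."""
    target = tuple(a + o for a, o in zip(anchor_pos, offset))
    return tuple(c + smooth * (t - c) for c, t in zip(cam_pos, target))

# The anchor drifts forward each frame; the camera trails it smoothly.
anchor = (0.0, 0.0, 0.0)
camera = (0.0, 2.0, 4.0)
for frame in range(3):
    anchor = (anchor[0], anchor[1], anchor[2] - 0.05)  # forward travel
    camera = follow_camera(anchor, camera)
print(camera)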
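For item 4, the day-to-night transition can be modeled as a light direction rotating at a constant rate, with intensity following the light's elevation; the cycle length and dimming floor in this Python sketch are illustrative, not values from the project.

```python
import math

def light_state(elapsed_s, cycle_s=120.0):
    """Rotate a directional light through a full day/night cycle every
    `cycle_s` seconds; intensity falls toward a dim floor as the light
    dips below the horizon."""
    angle = 2 * math.pi * (elapsed_s / cycle_s)
    direction = (math.cos(angle), math.sin(angle), 0.0)  # rises and sets in x/y
    intensity = max(0.05, math.sin(angle))               # ambient floor at night
    return direction, intensity

print(light_state(0.0))   # dawn: light on the horizon
print(light_state(30.0))  # quarter cycle: overhead "noon"
print(light_state(90.0))  # three-quarter cycle: below the horizon, night floor
```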
### Audio Composition:
The original music for this project was composed and performed by Sina Karimi on electric guitar, with all recording and mastering done in a DAW. The composition was written to evoke fluidity, matching the terrain's generative dynamics and integrating seamlessly into the visual experience.
### Technical Challenges:
One major challenge was synchronizing the live-generated visuals with the dynamic musical input: the terrain had to respond fluidly to variations in amplitude and musical structure, which required precise signal processing and real-time adjustment of visual properties. Another difficulty was maintaining a consistent flow of movement and light in the scene to support the narrative focus on staying in the present moment.
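One common way to keep terrain motion locked to the music without flicker is an asymmetric (attack/release) smoothing stage between the detected peaks and the visual parameters; the Python sketch below assumes that approach, with illustrative coefficients, since the project's exact signal chain is not documented here.

```python
def smooth_peaks(peaks, attack=0.6, release=0.05):
    """Asymmetric one-pole smoothing: rise quickly on new peaks (attack),
    decay slowly between them (release), so the terrain tracks the music
    closely without jittering."""
    value, out = 0.0, []
    for p in peaks:
        coeff = attack if p > value else release
        value += coeff * (p - value)
        out.append(value)
    return out

print(smooth_peaks([0.0, 0.9, 0.1, 0.1, 0.8, 0.0, 0.0]))
```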