
ANDANTE
a music application to explore cities

Andante is a musical application for mobile devices aimed at music lovers. The software can be used in any city and aims to create a poetic relationship between users and the environment around them, making exploration of the urban landscape more pleasant and creative. The project is based on a sonification process that translates a physical phenomenon into non-speech sound. More specifically, Andante generates a real-time musical composition that corresponds to the spatial and structural environment surrounding the user.

Starting point
I started my research from the city, exploring the set of services currently available to link urban environments with the people who live there or pass through them. This research showed me that most applications for city navigation are purely functional and rely almost entirely on visual communication. I therefore explored the world of the blind and visually impaired in search of inspiration and new solutions. This was extremely valuable: in this field, all available services exclude visual communication and engage the other senses more deeply. It helped me understand the considerable untapped potential for developing new types of communication and interaction in products on the market today. After initial interviews, I decided not to design a product aimed exclusively at visually impaired users, but I kept many of the principles and inspirations collected during the research phase.

How it works

Andante is designed for musicians, composers and music lovers to make exploring cities more enjoyable and stimulating. The interaction is based on the wide range of possibilities offered by sound communication, using graphics only to support the musical interaction and allowing eyes-free use of the application. Andante creates a musical composition influenced by both the user and the environment. It lets people immerse themselves completely in a landscape that combines sound and visual perception into a poetic augmented reality.
The application transforms data into a musical composition generated in real time. Each building is a note, and the note's properties are defined by a specific correspondence with the building's structural characteristics:

surface = tone
height = volume
surface/height = note value
distance from the user = play sequence
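The mapping above can be sketched in code. The functions below are only an illustration of the idea: the specific formulas, thresholds and MIDI ranges are assumptions for the sketch, not the values used in the actual Andante prototype.

```java
// Illustrative sketch of the building-to-note mapping; the formulas and
// constants here are assumptions, not taken from the Andante source.
public class BuildingNote {

    // surface = tone: map the footprint surface (m^2) to a MIDI pitch,
    // with larger buildings producing lower tones (assumed mapping).
    static int surfaceToPitch(double surfaceM2) {
        int pitch = 96 - (int) (Math.log(surfaceM2 + 1) * 8);
        return Math.max(24, Math.min(96, pitch)); // clamp to a playable range
    }

    // height = volume: map building height (m) to MIDI velocity, 0-127.
    static int heightToVelocity(double heightM) {
        return Math.max(20, Math.min(127, (int) (heightM * 3)));
    }

    // surface/height = note value: wide, low buildings sustain longer
    // than tall, narrow ones (duration in milliseconds).
    static int ratioToDurationMs(double surfaceM2, double heightM) {
        double ratio = surfaceM2 / Math.max(1.0, heightM);
        return 200 + (int) Math.min(1800, ratio * 10);
    }
}
```

The distance from the user would then decide when each note enters the sequence, with nearer buildings sounding first.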

Music generation
The video prototype below was built with Processing and shows the real-time music generation. I analyzed an area of the Dorsoduro district in Venice. Sounds are produced through a MIDI library and played on the computer with the Java synthesizer.
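To give an idea of how notes can be assembled with the standard Java MIDI API, here is a minimal sketch that builds a sequence of note-on/note-off events. It uses only `javax.sound.midi` classes; the note numbers and timings are illustrative, and the actual prototype's Processing code may be structured differently.

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.MidiEvent;
import javax.sound.midi.Sequence;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;

// Minimal sketch: turn a list of pitches into a MIDI sequence that a
// Java synthesizer (or sequencer) can play. Values are illustrative.
public class MidiSketch {

    static Sequence buildSequence(int[] pitches) throws InvalidMidiDataException {
        Sequence seq = new Sequence(Sequence.PPQ, 4); // 4 ticks per quarter note
        Track track = seq.createTrack();
        long tick = 0;
        for (int pitch : pitches) {
            // NOTE_ON on channel 0 at the current tick, velocity 90,
            // followed by NOTE_OFF one beat later.
            track.add(new MidiEvent(
                    new ShortMessage(ShortMessage.NOTE_ON, 0, pitch, 90), tick));
            track.add(new MidiEvent(
                    new ShortMessage(ShortMessage.NOTE_OFF, 0, pitch, 0), tick + 4));
            tick += 4;
        }
        return seq;
    }
}
```

Such a sequence could then be handed to `MidiSystem.getSequencer()` for playback on the default Java synthesizer.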

Eyes-free interaction
The interaction with the device is entirely eyes-free: navigation is driven by sound, and the graphic interface provides only secondary support. In this way users can interact with the device without looking at the screen, and blind or visually impaired users can quickly and easily access all the music functions. The prototype was made in Processing and tested on an HTC Desire S running Android 2.3.5.

Report