

Generally, in digital performance the audience watches passively and receives the message one-way. In this project, we create an interactive performance in which the audience can interact with the dancer using their mobile phones. We developed an iPhone app named Luciérnaga, which is available in the iTunes Store. Luciérnaga acts as a light storage: people use the camera to capture light from real life and play with the resulting light ball on their iPhones. In the theater, audience members use their iPhones to take part in the performance: they can control objects on the stage and generate sounds, creating a soundscape together with others during the live performance. Luciérnaga thus serves as a medium between the audience and the performance.


We set up a local wireless network and divide the theater into 8 zones. On entering the theater, each audience member connects to our control server and is identified by a unique number. On the iPhone, the app analyzes the captured pictures and scales the light ball according to the image brightness. We also designed a dedicated sound engine for the app: it generates sound that varies with gestures, accelerometer data, and the size of the light ball. The control server monitors the connection status of each iPhone and sends commands that trigger different sound effects and music. Across the 8 zones, the iPhones play different sounds at different volumes and in different orders, and together they create a distinctive collective sound experience during the performance.
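The brightness-to-size mapping can be sketched as follows. This is a minimal illustration, not the app's actual code: the Rec. 601 luma weights and the radius range are assumptions chosen for the example.

```python
def average_brightness(pixels):
    """Mean luma of an image given as (r, g, b) tuples in 0-255.
    Uses Rec. 601 luma weights (an assumed choice)."""
    if not pixels:
        return 0.0
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / len(pixels)

def ball_radius(brightness, min_r=10.0, max_r=120.0):
    """Map brightness (0-255) linearly onto a hypothetical radius range."""
    t = max(0.0, min(1.0, brightness / 255.0))
    return min_r + t * (max_r - min_r)
```

A dark capture yields a small ball (`ball_radius(0)` gives the minimum radius), while a bright one fills the screen; the same scalar can also drive the sound engine's parameters.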


We generate real-time animation with Processing, an open-source programming language and environment.

To better translate the dancer's actions into imagery, a Kinect is mounted above the dancer to detect movement and obtain position information. All image effects run in real time, controlled by the dancer's movement.
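One way such a position-to-effect mapping can work is sketched below. The stage dimensions and the particular effect parameters (particle spread, trail length) are hypothetical examples, not the system's actual mapping.

```python
def normalize_position(x, z, stage_width=4.0, stage_depth=3.0):
    """Map a tracked position in meters (x: left-right from center,
    z: distance downstage) to normalized coordinates in [0, 1].
    Stage dimensions are assumed for illustration."""
    u = max(0.0, min(1.0, (x + stage_width / 2.0) / stage_width))
    v = max(0.0, min(1.0, z / stage_depth))
    return u, v

def effect_params(u, v):
    """Hypothetical mapping: horizontal position drives particle
    spread, depth drives trail persistence."""
    spread = 50.0 + 200.0 * u
    trail = 1.0 - 0.8 * v
    return spread, trail
```

Normalizing first keeps the visual code independent of the sensor's coordinate system, so the same effects work if the Kinect is repositioned.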

To increase the projection area on the stage, we adopt a picture-frame stage design, which provides three projection surfaces: the floor, the background screen, and the frames. Three projectors are connected to three separate computers that operate synchronously. Each computer acts as both a sender and a receiver, and the Open Sound Control (OSC) protocol is used for communication between them. This system framework synchronizes the visual generation, the sound, and the iPhone control.
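At the wire level, OSC messages of the kind exchanged between the machines are compact binary packets, typically carried over UDP. The sketch below hand-encodes a minimal OSC 1.0 message with float arguments; the address pattern `/ball/size` is an invented example, not one of the system's actual addresses.

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message with float32 arguments.
    Per the spec, strings are null-terminated and padded to 4-byte
    boundaries, and floats are big-endian 32-bit."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

def send_osc(host, port, address, *floats):
    """Fire one OSC packet over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(osc_message(address, *floats), (host, port))
    finally:
        sock.close()
```

Because every packet is a multiple of 4 bytes and self-describing via its type-tag string, each computer can act as sender and receiver with the same small codec, which is what makes OSC convenient for keeping the visuals, sound, and iPhone control in step.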