Star Wars idea comes to life

Ah, unforgettable moments in sci-fi cinema. Luke Skywalker fiddles with something stuck on R2-D2’s er, body, and from a lens on R2’s er, head, erupts a 3D hologram of Princess Leia.

“Help me, Obi-Wan Kenobi, you’re my only hope,” says Leia. A few times.

As reported in the journal Nature, a team of researchers from the University of Sussex, UK, and Tokyo University of Science, Japan, appears to have brought this cinematic trickery to life.

They’ve created a prototype system that generates a 3D image that can also emit sound and provide a tactile response when “touched”.

The prototype, say the study’s authors, may have applications in biomedical and computational-fabrication fields.

Sussex’s Ryuji Hirayama and colleagues created the Multimodal Acoustic Trap Display (MATD), which can simultaneously produce visual, auditory and tactile content.

Based on the principles of “acoustic tweezers” (where the position and movement of very small objects can be manipulated using sound waves), the system uses sound waves to trap a particle, which is then illuminated with red, green and blue light to control its colour as it moves rapidly through the display.
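To get a feel for the idea, here is a minimal conceptual sketch in Python (not the team’s control software): a single point is swept along a 3D path and given a colour at each position, standing in for the levitated particle and its RGB illumination. The torus-knot path and the colour values are illustrative assumptions.

```python
# Conceptual sketch only: plot the path a trapped particle might sweep
# to "draw" a torus knot, with an RGB colour assigned at each position.
# (Illustrative assumptions - not the MATD team's actual code.)
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 2 * np.pi, 2000)   # sample positions along one sweep
p, q = 2, 3                           # (p, q) torus-knot parameters (assumed)
x = np.cos(q * t) * (3 + np.cos(p * t))
y = np.sin(q * t) * (3 + np.cos(p * t))
z = np.sin(p * t)

# Vary red, green and blue along the path to mimic per-position illumination.
colours = np.stack([(np.sin(t) + 1) / 2,
                    (np.cos(t) + 1) / 2,
                    t / t.max()], axis=1)

ax = plt.figure().add_subplot(projection="3d")
ax.scatter(x, y, z, c=colours, s=2)
ax.set_title("Torus-knot sweep (conceptual)")
plt.show()
```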

The authors have demonstrated their system by producing 3D images including a torus knot, a pyramid and a globe, which can be seen from any point around the display. 

Because the images are created with acoustic fields, the system can also produce sound that appears to come from the displayed content, as well as tactile feedback.

For example, they produced an audio-visual countdown timer that users can start and stop by tapping their finger on the display.

The prototype demonstrated in the work brings us closer to displays that could provide a fully sensorial reproduction of virtual content, the authors conclude. 

Does this mean that not only could Leia talk to Luke and Obi-Wan, the chaps would be able to talk back? Stay tuned…
