Back to RPO

This week, my main goal was to refine the three core pillars of my final year project, which meant diving into a lot of reading to uncover connections and identify gaps I want to explore further. Following Andreas's advice from last Tuesday, I focused on finding articles and studies related to new media experiments using hand gestures. During this search, I came across a particularly intriguing book, Interactive Art and Embodiment: The Implicit Body as Performance by Nathaniel Stern. The book begins with Stern describing "Enter," one of his projects for a physical computing class. His discussion of embodiment in interaction was both insightful and directly relevant to my exploration of gestures. It offered a fresh perspective on how physical engagement can enhance interactive experiences, which aligns well with the themes I'm researching.

Now, as I work on refining my three pillars, I'm leaning toward structuring them as follows:

Pillar 1: Gesture as a Mode of Communication, exploring how gestures function as a universal and intuitive language for interaction.

Pillar 2: Embodiment and Interactive Experiences, focusing on the role of the body in creating immersive, meaningful interactions.

Pillar 3: Human-Machine Collaboration, delving into how humans and machines can work together more effectively, especially through gesture-based interactions.

These pillars not only encapsulate my research interests but also provide a clear framework for guiding the direction of my project. I'm excited to explore these areas further and see how they interconnect as my research progresses.

Motion Sensor Gloves

Today, I spent some time in the media lab with Andreas, working through some technical challenges related to the TouchDesigner file for the Echoes for Tomorrow project. The main task was figuring out how to display the output on two different screens using the same file, even though we only had one Mac mini and one webcam. Since I’m still pretty new to TouchDesigner and there aren’t many tutorials for the specific setup I was trying to achieve, it was a bit tricky. However, after some experimentation, I managed to get it working—yay!

Unfortunately, after getting everything running, we noticed that even a simple project was crashing unexpectedly. Given that no one would be available at the exhibition to restart the application if it failed, we decided it would be safer not to rely on TouchDesigner for the exhibition. Although it was a bit of a setback, I'm glad we caught the issue early enough to pivot and start thinking about a more stable solution.

For my practical work, I focused on experimenting with the motion sensor gloves that Andreas let me borrow. Using the user manual as a guide, I managed to set up all the sensors in their correct positions, which felt like a small victory. However, I hit a roadblock when I discovered that the software required to track motion from the sensors wasn't compatible with my laptop, so I couldn't get the application up and running. It was a bit frustrating, but at least I made progress on the setup, and I'm exploring alternative ways to work with the gloves.