New Wearable Acting Technology

Here’s what the actor will wear on stage. In the photo below, you see a battery connected to our new 3-layer device:

The bottom layer is the TinyDuino microcontroller board.

The middle layer is the 9-Axis Inertial Measurement Unit (IMU for short), which includes a gyroscope, an accelerometer, and a magnetometer.

The top layer is the Bluetooth transmitter.
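As a rough illustration of what the IMU's readings buy us: the accelerometer supplies a gravity reference and the magnetometer a north reference, which together let the device estimate the direction the actor's head is facing even when it is tilted. The sketch below is plain Python rather than the TinyDuino's actual Arduino code, and the sensor-axis conventions are assumptions for illustration; it shows a standard tilt-compensated heading calculation:

```python
import math

def tilt_compensated_heading(accel, mag):
    """Estimate heading (yaw) in degrees from accelerometer and
    magnetometer readings, each an (x, y, z) tuple in the sensor frame.

    Illustrative sketch only: axis conventions are assumed, not taken
    from the actual TinyDuino firmware.
    """
    ax, ay, az = accel
    # Pitch and roll recovered from the gravity vector
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)

    mx, my, mz = mag
    # Rotate the magnetometer vector back into the horizontal plane
    # so the compass reading is correct even when the head is tilted
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))

    heading = math.degrees(math.atan2(-yh, xh))
    return heading % 360.0
```

In practice the gyroscope would be blended in as well (for example with a complementary or Kalman filter), since accelerometer and magnetometer readings alone are too noisy to track fast head turns smoothly.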


Proof of Concept of the New Wearable Acting Technology

- Harry Klein

Finding the Right Technology

Ben and Zeezee’s initial tech request seemed unlikely to work
in the format they wanted: essentially, using a phone worn by
the actor to stream 4K video that was originally recorded as
8K, 360-degree video. Phones just don’t have the processing
power for that kind of resolution, and the wireless connection
would need to be incredibly fast and incredibly reliable for
it to look good.

Our project would require investigating other hardware, software
and wireless technologies for a better solution.

As we met a few more times to discuss the project, I gained a
greater understanding of its goals and proposed an alternative
solution: use either a phone or a microcontroller to capture
the orientation of the actor’s head, then send that
orientation to a computer to do the processing. That way, only
a relatively small amount of information would actually need
to be sent wirelessly – which would mean a much more reliable
connection for performances, better processing speed, and
potentially a smaller device for the actor to wear on stage.
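To make the bandwidth point concrete: a head orientation is just three angles, so each update fits in a handful of bytes. The Python sketch below is an illustrative encoding, not the project's actual wire format; it packs yaw, pitch, and roll (each assumed to lie in ±180°) into a six-byte packet:

```python
import struct

def pack_orientation(yaw, pitch, roll):
    """Pack three angles in degrees (range ±180) into a 6-byte packet
    as little-endian int16s in hundredths of a degree.

    Hypothetical packet format, for illustration only.
    """
    def to_i16(angle):
        return int(round(angle * 100))
    return struct.pack('<hhh', to_i16(yaw), to_i16(pitch), to_i16(roll))

def unpack_orientation(packet):
    """Inverse of pack_orientation: 6 bytes back to degrees."""
    y, p, r = struct.unpack('<hhh', packet)
    return y / 100.0, p / 100.0, r / 100.0
```

At, say, 60 updates per second that is only about 360 bytes per second over the air, versus the tens of megabits per second that streaming high-resolution video would demand.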

Internet discussion threads were a good resource for exploring different technologies. We eventually came up with a path that we believed was feasible in the limited time allotted and went for it. Ben raced the clock to make it to the Micro Center before closing, and returned with the TinyDuino microcontroller we had settled on. Our project was about to move forward. More about that in New Wearable Acting Technology.

Along with the choice of microcontroller, we also needed software to receive the microcontroller's data and tie it in with the video. I had some experience programming with the game engine Unity from the previous summer and knew of its versatile capabilities. I suspected it might be up to the task. In fact, it worked marvelously well, with one hitch.

I tried to use Unity’s built-in movie texture feature to play the 360 video, but that initially produced severe performance issues. After some online investigation, we discovered the AVPro Video plug-in for Windows, which was designed as an alternative to Unity’s built-in functions; it worked really, really well. It was a significant unanticipated expense for our project, but it was key to making it all work!

The last major tech challenge was the need for the technology to work wirelessly during performances. I had no prior experience programming for any wireless technology, so I was concerned it might be a time sink to figure out and program. However, it turned out to be easier than I had imagined because we went with a simple Bluetooth solution.
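For anyone curious what "simple" can look like on the receiving end: classic Bluetooth can expose itself to the computer as an ordinary serial port, so the receiver just reads a text stream. The Python sketch below shows one plausible parsing step; the "yaw,pitch,roll" line format is an assumption for illustration, not the project's actual protocol:

```python
def parse_orientation_line(line):
    """Parse one 'yaw,pitch,roll' text line (angles in degrees) from a
    serial stream. Returns a (yaw, pitch, roll) tuple of floats, or
    None for malformed or partial lines.

    Hypothetical line format, for illustration only.
    """
    parts = line.strip().split(',')
    if len(parts) != 3:
        return None
    try:
        return tuple(float(p) for p in parts)
    except ValueError:
        # Garbled data, e.g. a read that landed mid-packet
        return None
```

With a library like pySerial, feeding it is a one-liner along the lines of `parse_orientation_line(port.readline().decode())`; returning None for bad lines lets the receiver simply skip reads that land mid-packet instead of crashing mid-performance.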

- Harry Klein

The Theatre-Tech Innovation Challenge

I knew that Zeezee was working on an adaptation of the Ramayana and wanted to do a multi-media production. I had offered to help in any way I could early on. But it wasn’t until Ben and Zeezee stopped by my house to check out my VR (virtual reality) headset and what I was doing with programming in the VR space, that they saw how I might contribute to the project.

A couple months earlier, they’d had a passing, wacky, random brainstorming idea. It resurfaced because of vague conceptual similarities between it and the VR space that I was showing them.

The idea?

Wouldn't it be crazy and cool if we could have the audience experience the character Rama’s dream as he is dreaming onstage? How? Possibly by projecting 360 degree video onto stage-surround screens in the theatre, with the video reflecting Rama’s onstage movement. The audience could experience Rama’s dream through his eyes, thus inviting the audience into Rama’s reality and giving the performance space an extra dimension.

They hadn’t really expected the idea would work from a technical standpoint. It would involve tracking an actor’s movement in space and somehow integrating that with 360 video. There would be a lot of technical challenges to be addressed.

Don’t suggest to me that something might be too difficult! I love a challenge, especially when it involves collaborating with amazing friends who have great ideas. While I’m not into acting, I am really fascinated with creating new kinds of experiences for people – which is what drew me to start VR programming and to invite Ben and Zeezee over to check it out. In the theater realm, I had taken a course in lighting design, and I had occasionally stopped by school productions to see what Zeezee was working on and to help out. But my preferred angle on creating experiences always seems to involve some sort of technology, so I jumped at the chance to apply it to King’s Dharma.

- Harry Klein

It's Been a Long Year

One year ago we started the project that would later be called King's Dharma. We took a trip up to Zeezee's house in Vermont, and after watching the movie Inception, the entire story changed.

We were inspired. We took what we knew from Inception and initially mimicked its plot structure. Eventually we decided to stray from Nolan's story and create our own.

A long night was ahead of us, as well as a long year. We thought we could pull this off in six months. Little did we know that preproduction alone would take us more than a year – something we're still doing.

So thank you for being with us for a whole year. This summer we plan on submitting the script and our story to Sundance's new innovation section. Who knows what will happen? We will also be checking out The Mill in Brooklyn and Dartmouth's innovation lab.

Keep coming back to the blog – we're going to try to vlog more. Stay tuned for a post about the DIY camera slider we made for under $30!

360° Video! ...kind of

Wow. It's here. Finally. I guess we did it? Well, that could be said. Thanks to the 360Fly and my wonderful engineering of a custom DIY immersive POV 360° VR-ready camera rig, we filmed Act 1, Scene 7 of King's Dharma.

So why the slight doubt that we did it? We still have to test it. Although Harry hasn't been mentioned much in the last few posts, he has been working incredibly hard on getting the TinyDuino to work (and it does! Video coming soon). As for me, since I rotated the 360Fly HD 90°, I might have messed up the projection of the 360 video (though I did rotate the footage back as well, so it may be fine). You can play with the video using your mouse or the WASD keys to look around!

Yes, there's some awkward stretching... we're trying to figure out what's going on. Also, Zeezee's sister is standing in for Harry (who couldn't make the journey).

Marvelous, right?! So it's not the best video on YouTube, but it's definitely not the worst. Still, we did learn some things after filming with the 360Fly.

  • The 1504 x 1504 resolution is terrible. It's soft and looks like 360p, both on YouTube and in a Premiere Pro VR sequence. I tried upscaling the video to 2160p with After Effects' Detail-Preserving Upscale effect, and it helped, although you can see the interpolation.
  • The 360Fly doesn't capture true 360° video; it's somewhere around 360° by 240° (I believe). Maybe I messed up in the post-processing step, but YouTube doesn't seem to understand that the video isn't technically spherical. While it doesn't matter for our purposes, I understand why it might be annoying.

Okay, enough ranting; back to the creation of the film. After a two-hour drive to Greenfield, MA, we settled on Poet's Seat Tower as our filming location. Despite the six inches of snow and the freezing temperatures, we spent three hours rehearsing the scene for ten minutes of filming. Fun. And although Zeezee's sister can't be heard at the beginning, we were all very pleased with the end result. Now the next step is to integrate the video with the TinyDuino!

Our test is scheduled for around 6 p.m. today, so wish us luck!