

Invoke Portal

13 Aug 2019



As we showed more and more people our previous projection mapping work, three things became apparent.

    1. People with technical minds find it really cool, but others sometimes take a while to "get" what they are looking at.

    2. As it's designed for a single person's perspective, it doesn't engage groups as well as we would like.

    3. People will always get in the way of the projectors and the target objects.

The next stage of our development was therefore designed to address these limitations.

The Invoke Portal is like a window into virtual reality. Unlike a VR headset, the portal lets you share your experience with friends as you explore new dimensions together. In the demo video above, you can walk in and look around a real location captured through photogrammetry. This example is to scale, with objects in the real room arranged to match the virtual furniture (hinting at applications closer to mixed reality). Larger landscapes and structures could instead be viewed from the perspective of a giant. Interactive and invented spaces are also accessible through the portal, and multiple portals can interact in the same space at the same time.
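A "window into virtual reality" of this kind is typically rendered with a generalized off-axis perspective projection: the view frustum is recomputed every frame from the tracked viewer's position relative to the physical screen plane, so the screen behaves like a real window. Our actual pipeline isn't shown in the post, so the following is only a sketch of that standard technique (all names and conventions are illustrative):

```python
import numpy as np

def portal_projection(pa, pb, pc, eye, near, far):
    """Off-axis projection matrix for a planar 'portal' screen.

    pa, pb, pc: screen corners (lower-left, lower-right, upper-left) in world space.
    eye:        tracked viewer position in world space.
    Returns a 4x4 OpenGL-style projection matrix.
    """
    # Orthonormal basis of the screen plane.
    vr = pb - pa; vr /= np.linalg.norm(vr)           # screen right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # screen up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # screen normal, towards the eye

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - eye, pb - eye, pc - eye

    d = -np.dot(va, vn)             # perpendicular distance from eye to screen
    l = np.dot(vr, va) * near / d   # frustum extents at the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style matrix for those (generally asymmetric) extents.
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,         -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,         -1.0,                    0.0],
    ])
    # Rotate world space into screen space, then move the eye to the origin.
    M = np.eye(4); M[:3, :3] = np.vstack([vr, vu, vn])
    T = np.eye(4); T[:3, 3] = -eye
    return P @ M @ T
```

When the viewer stands centred in front of the screen this collapses to an ordinary symmetric frustum; as they move sideways the frustum skews, which is what sells the window illusion to everyone looking at the same display.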

While VR headsets are ideal for individual immersion, and mobile devices like smartphones or tablets can be used in a similar way, the Invoke Portal is intentionally designed with a point of difference. It emphasises the size of your view, physical interaction, and the inclusion of others, making it best suited to public, event or gallery spaces.


Realtime Projection Mapping

12 Mar 2019



We were interested to explore projection mapping as a medium for delivering shared mixed reality experiences. This demo renders interactive virtual objects within physical objects using a projector and the Lighthouse tracking system.
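Conceptually, projection mapping like this renders the virtual scene from the projector's point of view: each tracked object's pose is composed with a calibrated tracker-to-projector transform, and the result is projected through the projector's intrinsics onto its image plane. A minimal sketch of that transform chain (the matrices and names here are illustrative, not our actual pipeline):

```python
import numpy as np

def project_points(points_obj, T_tracker_from_obj, T_proj_from_tracker, K):
    """Map 3D points on a tracked object to projector pixel coordinates.

    points_obj:          (N, 3) points in the object's local frame.
    T_tracker_from_obj:  4x4 tracked pose (e.g. a Lighthouse-tracked object).
    T_proj_from_tracker: 4x4 extrinsic calibration, tracker frame -> projector frame.
    K:                   3x3 projector intrinsics (focal lengths, principal point).
    Assumes the projector looks along +z in its own frame.
    """
    n = len(points_obj)
    pts_h = np.hstack([points_obj, np.ones((n, 1))])   # homogeneous coordinates
    # Object frame -> tracker frame -> projector frame.
    pts_proj = (T_proj_from_tracker @ T_tracker_from_obj @ pts_h.T)[:3]
    pix = K @ pts_proj                                 # perspective projection
    return (pix[:2] / pix[2]).T                        # divide by depth -> pixels
```

Every term in this chain is a potential error source: the tracked pose carries latency and drift, and `T_proj_from_tracker` and `K` carry calibration error, which is why imperfections that are invisible inside a headset show up immediately on a physical surface.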

While the Lighthouse tracking is convincing within a VR environment (where you have no visual reference of where the corresponding objects actually are), latency, drift and calibration errors are much more evident when deployed for mixed reality. Anything that is not "1:1" is clearly observable if you are looking for it.
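The visibility of latency is easy to quantify: with no pose prediction, an object moving at speed v appears offset by v times the end-to-end latency, and constant-velocity extrapolation is a common way to hide part of that lag. A back-of-the-envelope sketch (illustrative only, not taken from our system):

```python
def lag_error_mm(speed_mm_s, latency_ms):
    """Spatial offset of the projection caused by end-to-end latency,
    assuming constant object speed and no pose prediction."""
    return speed_mm_s * latency_ms / 1000.0

def predicted_position(position, velocity, latency_s):
    """Constant-velocity extrapolation: render where the object
    will be when the photons land, not where it was sampled."""
    return position + velocity * latency_s
```

For example, an object moving at a modest 250 mm/s under 40 ms of latency projects 10 mm off target, which is exactly the kind of "not 1:1" error that is invisible in VR but obvious against a real surface.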

That being said, the demo does illustrate the sort of haptic experience we are building into Invoke products. High-tech haptics add unnecessary complexity and will never be as immersive as interacting with a real object. The scene with the cubes tumbling in one of the boxes is a great example of the augmentation adding to an experience that is grounded in the physical object. Of course, wearable mixed reality displays can integrate more complex geometries, textures and materials that fall outside the specific limitations of a projection mapping setup.


LightWing

arc/sec Lab    12 Feb 2019



LightWing is an interactive installation at the intersection of architecture, art and technology. The kinetic object explores the digital augmentation of structure and materiality, and creates a mysterious sensation of tactile data.

Produced by Uwe Rieger and Yinan Liu at the arc/sec Lab, using the Invoke tracking system.

Hosted at the Pah Homestead/TSB Wallace Arts Centre from 12 Feb to 31 Mar 2019.