Actronika: Touching the Metaverse

It’s not enough that we can see and hear the virtual world; to become truly immersed, we need to feel it, to interact with all our senses. That’s exactly what Actronika is doing with its haptics vest, Skinetic, which recreates the sense of touch on your skin, using advanced technology to make the experience ultra-realistic.



Podcast Transcript

Introduction

From meetings on Zoom to the promise of the metaverse, the time we spend in virtual worlds is only set to increase. But to feel truly immersed in those worlds, we need more than sight and sound: we need to feel. Actronika is a haptics company, creating technology that recreates the sense of touch on the skin so we can literally feel what’s happening on our TV and computer screens. They’ve developed a vest called Skinetic that’s set to change the virtual reality experience forever.

The importance of touch

The biggest organ in the human body is the skin. You have seven different kinds of receptor in your skin, and it weighs an average of about seven kilos. Essentially, it’s huge: it’s bigger than your liver, and it’s always on. Not addressing it is totally impossible in any type of metaverse or parallel world that you would like to emulate. The torso is essentially the largest part of your body, where you have the most skin, so covering it is a must-have for any immersive experience.

Good vibrations

Haptics is all about the sense of touch. If you think about the sense of touch, you have to think about the pressure applied to your skin, then the temperature, and then a third factor, the one we consider the most important: the vibrotactile part. Whenever you touch an object, you can tell if it’s warm or cold and whether it’s hard or soft, but the fine texture of an object is too complex to perceive unless you start sliding your fingers across its surface. When you do, your fingers create small vibrations that carry information about the quality of the surface, so you can immediately tell if it’s wood, metal, plastic, or something else. That’s what we try to emulate with vibration.
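To make the sliding-finger idea concrete, here is a minimal sketch, not Actronika’s actual algorithm: a common way to model texture vibration is to treat the surface as a grating with some spatial period, so sliding at speed v over grain of size λ excites vibration at frequency f = v / λ. The function and parameter names below are illustrative.

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz, a common rate for vibrotactile synthesis

def texture_vibration(slide_speed_mps, grain_m, duration_s, amplitude=0.5):
    """Synthesise the vibration felt when sliding over a textured surface.

    slide_speed_mps: finger speed in metres per second
    grain_m:         spatial period of the surface texture in metres
    """
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    freq = slide_speed_mps / grain_m  # temporal frequency of the surface bumps
    return amplitude * np.sin(2 * np.pi * freq * t)

# Coarse wood grain (~2 mm) vs fine brushed metal (~0.2 mm) at the same
# sliding speed produce very different vibration frequencies, which is
# one reason the two materials feel distinct.
wood  = texture_vibration(0.10, 2e-3, 0.5)   # ~50 Hz
metal = texture_vibration(0.10, 2e-4, 0.5)   # ~500 Hz
```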

The direction of history

In the next 10 years, most interfaces are going to be enriched by some kind of touch feeling, and then smell, taste, and so on. You’re going to be totally immersed. This is essentially the direction of history. Things are moving faster, so we’re now able to deliver a sense of touch that is pretty close to reality. Until now, we had a problem because computation speeds were not good enough. Touch is actually the fastest sense in your body, so you need response times under 10 milliseconds. Now that we can achieve that, we can incorporate the sense of touch into interfaces.
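To put that 10-millisecond budget in perspective, here is a back-of-the-envelope check (our sketch, not Actronika’s figures): at a typical 48 kHz sample rate, the audio-style processing buffer alone can consume most of the budget, which is why fast computation matters.

```python
SAMPLE_RATE = 48_000  # Hz, a typical rate for audio-style haptic synthesis

# Buffer latency is buffer_samples / SAMPLE_RATE. Note this counts only
# buffering; sensing, wireless transport, and actuator response add more.
for buffer_samples in (1024, 512, 256, 128):
    latency_ms = buffer_samples / SAMPLE_RATE * 1000
    verdict = "within" if latency_ms < 10 else "over"
    print(f"{buffer_samples:5d}-sample buffer -> {latency_ms:5.2f} ms "
          f"({verdict} the 10 ms budget)")
```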

The design process

The biggest selling point of Skinetic is that you can actually feel the difference between, say, gunshots, fireballs, and lasers, which is not necessarily the case for what’s on the market right now. It’s also important to know that it doesn’t hurt. This is how we imagine a fireball would feel, but a game developer can always decide on something else. We basically design the effects as audio signals within the haptic range, which means very low-frequency content. If you think about audio, the main difference between regular audio and haptics is that the waveforms look the same, but the frequency content sits at the lower end of the spectrum. We strongly believe that eventually there’ll be people specialising in this: audio designers will become haptic designers and design the haptic experience alongside the audio. It will become a part of the design process.
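As an illustration of “audio signals within the haptic range”, here is a minimal sketch in Python. The specific frequencies, envelopes, and function names are our own guesses for illustration, not Skinetic’s actual effect library.

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz

def haptic_burst(freq_hz, duration_s, decay, amplitude=1.0):
    """An exponentially decaying sine: an audio-style waveform whose
    frequency content sits in the vibrotactile band (roughly 10-1000 Hz)."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * freq_hz * t)

# Effects are distinguished mainly by frequency content and envelope
# shape, the way a sound designer would distinguish sounds.
gunshot  = haptic_burst(freq_hz=60,  duration_s=0.15, decay=40)  # short, punchy thump
fireball = haptic_burst(freq_hz=35,  duration_s=0.60, decay=4)   # slow, rolling rumble
laser    = haptic_burst(freq_hz=180, duration_s=0.25, decay=10)  # sharper, higher buzz
```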

Tricking the brain

In the real world, everything is synchronised. Whenever you grab an object, you feel it, see it, and hear it at the same time. If you want to provide that immersive experience in the virtual world, you have to do the same, because your brain has been trained outside of VR for your whole life. So if you expect the brain to be tricked into accepting a new reality, you need to respect the same laws. We focus on environmental interactions like rain or wind. We have created synthesisers with all the parameters you can think of: raindrop size, intensity, the kind of rain (light rain, heavy rain), and we give you access to these parameters so you can control them independently. If you think about rain, it will land on your shoulders when you stand up straight, but when you bend over, it will land on your back. This adds the extra immersive element to the experience, because it’s exactly what would happen in the real world.
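Here is a toy sketch of what such a parametric rain synthesiser might look like; the parameter names, actuator regions, and posture logic are illustrative assumptions, not Actronika’s real API.

```python
import random

class RainSynth:
    """Each parameter is independently controllable, as described above."""
    def __init__(self, drops_per_second=20.0, drop_size=0.3, intensity=0.5):
        self.drops_per_second = drops_per_second  # light rain vs heavy rain
        self.drop_size = drop_size                # scales pulse duration
        self.intensity = intensity                # scales pulse amplitude

    def drops_in_frame(self, dt_s):
        """Number of raindrop impulses to trigger during this frame."""
        expected = self.drops_per_second * dt_s
        # Integer part plus a probabilistic extra drop for the fraction.
        return int(expected) + (random.random() < expected % 1)

def route_drop(posture_pitch_deg):
    """Pick the actuator region a drop lands on from body posture:
    standing upright -> shoulders; bent forward -> upper back."""
    return "upper_back" if posture_pitch_deg > 45 else "shoulders"

# Per frame: trigger `drops_in_frame` short pulses on the region chosen
# by `route_drop`, with amplitude `intensity` and width set by drop_size.
storm = RainSynth(drops_per_second=60.0, drop_size=0.6, intensity=0.9)
```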

Real world applications

The most obvious place haptics have already been used is the mobile industry. Everyone has a phone with haptics in it; it’s just that the quality of haptics it provides is not up to the task. The understanding of haptics was different from what it is now, so the market will take time to catch up with high-definition haptics. As a company, we’ve been working a lot with the automotive industry and the entertainment market. Streaming platforms like Netflix want you to stay home and watch movies on your TV, so cinemas have to reinvent the whole cinema experience into something more than just turning on a TV. For a cinema-like experience you don’t need a headset; you simply get an enhanced way of watching a movie. The haptic content is designed in terms of vibration and spatialisation, and we provide software that helps designers add the haptic component in the most seamless and efficient way.

Keeping it real

Overusing haptics is not a good idea either, so we try to use haptics intelligently. We have haptic silences, where there are no haptics at all, because there’s no good reason to hapticise everything. If you got constant vibration through the whole movie, it would be too much. That’s our approach: designing haptics specifically for the experience, rather than just filtering the audio, gives you that freedom. Hapticising something is a deliberate decision, not just a consequence of the audio being the way it is.

Our experts say…

Assessing Actronika’s needs

To design Skinetic and bring the vision to life, Actronika needs communication and collaboration with its external experts and suppliers to be seamless. Onshape, our cloud-based computer-aided design and product data management platform, helps them collaborate with their extended team in a unique way through several key aspects: access, data sharing, and change management. Onshape is unique in that it runs in a web browser, so everyone, anywhere on Earth, can access the whole system immediately. For collaboration, there’s no copying of files; everyone shares the same master data. Models, drawings, assemblies, parts: anyone on the team can see all of it, and they see changes happen instantly, no matter where they are. As for change management, PDM has been a widely disliked system for many years. With a new generation of PDM, you can manage changes, new versions, and revisions without any copying or locking.

How did Onshape help?

Actronika has been able to save 15-20% of their time thanks to this new generation of PDM built into Onshape. Before using Onshape, Actronika’s teams faced many data management problems, including unknowingly working on an outdated version of a file. Onshape has fixed the PDM problem in a big way: it offers a new generation of PDM that does familiar things like versioning and release management, but without the old problems and with more power. It has also increased traceability. With Onshape, every operation performed can be traced, giving the team versioning and release management, as well as the ability to easily create custom workflows.

Credits

Thanks to Jon for his insight and to Gilles and Rafal for taking us behind the scenes at Actronika’s headquarters.

Please rate, review and subscribe to our bi-weekly Third Angle episodes wherever you listen to your podcasts and follow PTC on LinkedIn and Twitter for future episodes.

This is an 18Sixty production for PTC. Executive producer is Jacqui Cook. Sound design and editing by Ollie Guillou. Location recording by Rebecca Rosman. And music by Rowan Bishop.

Episode Guests

Gilles Meyer, CEO at Actronika

More About Actronika

Rafal Pijewski, CTO at Actronika

More About Actronika

Jon Hirschtick, EVP, Chief Evangelist at PTC

More About Onshape
