At GDC 2023, I sat down in gaming accessory company Razer’s office and felt something I’d never experienced before: a video game in which my controller and headphones vibrated at different intensities that I could adjust to my liking. Then I watched a blockbuster superhero film with headphone vibration tuned to the action — all powered by the same software.
The software development kit, or SDK, created by tech studio Interhaptics, which was acquired by Razer last year, lets companies easily add vibration to their games, films and other media. Interhaptics founder Eric Vezzoli, now Razer’s general manager of Interhaptics, walked me through a demonstration of what the software can do.
He noted that the software takes just a day to implement in a game, after which vibration is automatically added for any feedback device, be it a controller, smartphone, headphones, haptic vest or something else. Even if a developer is supporting peripherals with very different vibration frequency ranges, the software can generate haptic feedback suited to each one. That simplifies the process when, say, trying to make vibration feel similar on iPhones and Android phones, whose motors have very different ranges.
“We take the designer’s intention and we translate it to machine capability,” Vezzoli said.
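That translation from designer intent to machine capability can be pictured as a remapping step. Here’s a minimal, hypothetical sketch in Python — the device names, frequency numbers and function names are my own illustrations, not Interhaptics’ actual API:

```python
# Hypothetical sketch of intent-to-device remapping; not the Interhaptics API.
# A designer authors an effect in a normalized space (0.0-1.0 intensity,
# 0.0-1.0 "sharpness"), and each device maps that into its own real range.

DEVICE_PROFILES = {
    # Illustrative numbers only: (min_freq_hz, max_freq_hz, max_amplitude)
    "iphone_taptic":  (80.0, 230.0, 1.0),
    "android_lra":    (150.0, 250.0, 0.8),
    "haptic_headset": (40.0, 120.0, 1.0),
}

def translate(intensity: float, sharpness: float, device: str) -> tuple:
    """Map one normalized designer intent onto one device's capabilities."""
    lo, hi, max_amp = DEVICE_PROFILES[device]
    freq = lo + sharpness * (hi - lo)         # sharper effects -> higher frequency
    amp = min(max(intensity, 0.0), 1.0) * max_amp
    return freq, amp

# The same authored effect lands differently on each device:
for name in DEVICE_PROFILES:
    print(name, translate(0.7, 0.5, name))
```

The point of the abstraction is that the designer authors the effect once; a per-device profile, not the designer, decides what the motor actually does.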
The haptic composer software, as it’s properly called, also puts vibration control in gamers’ hands. In the game demo I played, I was able to toggle whether vibrations would be triggered by my character, enemies or the environment, as well as tone them down if they were too intense.
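Those listener-side controls amount to a per-source filter with a global intensity dial. A rough sketch of the idea — the source names and fields here are assumptions for illustration, not the real software’s settings:

```python
# Hypothetical sketch of user-facing haptic controls; names are illustrative.
from dataclasses import dataclass, field

@dataclass
class HapticSettings:
    master_scale: float = 1.0  # the global "tone it down" dial
    enabled_sources: set = field(
        default_factory=lambda: {"player", "enemy", "environment"}
    )

def filter_event(source: str, intensity: float, settings: HapticSettings) -> float:
    """Return the intensity to actually play, or 0.0 if the source is muted."""
    if source not in settings.enabled_sources:
        return 0.0
    return min(intensity * settings.master_scale, 1.0)

settings = HapticSettings(master_scale=0.5)
settings.enabled_sources.discard("environment")  # mute ambient rumble

print(filter_event("enemy", 0.9, settings))        # scaled down by the dial
print(filter_event("environment", 0.9, settings))  # muted, so 0.0
```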
The SDK launched with support for PS4, PS5, Meta Quest 2 and XInput controllers, as well as iOS and Android phones. Developers can set up custom vibrations for potentially any number of haptic peripherals, allowing them to pulse or vibrate at different intensities to convey whatever emotion or action fits the game or movie scene.
That list of peripherals includes the Razer Kraken V3 HyperSense headphones, which have haptic motors spread around both earcups and are the headphones I wore for the demo. While I was playing the simple dungeon-crawling game that Vezzoli and his team built to show off the SDK, every sword swing by my character pulsed vibration around my ears, while enemies hitting my character buzzed my ears in a noticeably different way.
Then I watched scenes from films with headphone vibration coinciding with exciting moments — buzzing along while a superhero used their powers, or, during a suspenseful silence, pulsing at a low frequency that subtly alternated between ears, like a heartbeat.
If I’m being honest, it felt weird to have headphones buzzing around my ears with dynamic patterns — the pitter-patter of heartbeats or triumphant vibrating bursts of superheroes clashing, which I’m used to hearing via sound effects, not feeling on my skin.
But I could see how, if I were to get used to dynamic vibrations around my ears — or, with future devices, elsewhere on my body — they could make entertainment more immersive. I remember discovering how much listening for footsteps made me better at finding enemies in first-person shooters, and dynamic vibration cues for explosions or nearby activity could similarly point me in the right direction. Movies and shows, which rely on visuals and soundscapes to convey tone and mood, could add a new layer with haptics — and the technology seems ideally suited for VR developers looking to add texture to their immersive worlds.
Razer and Interhaptics’ software is admittedly a bit future-facing, since controllers and smartphones are far more common than vibrating headphones or other haptic peripherals. But the company is sending out developer kits with the Razer Kraken V3 HyperSense headphones so developers can try adding the SDK to their games.
“It’s a different type of experience, and we believe we can generate enormous value from a user experience playing these games,” said Vezzoli.