Visual experiences dominate the web.
There are a few that also use auditory ones. However, I would like to focus on a sensory dimension currently hardly used: touch. 👈🏻
Although most users access the Internet via mobile devices equipped with vibration motors, few web applications use haptic feedback to enrich the user experience.
Why touch? 👨🏻🔬
Simply put, Our brains process touch faster than sound and vision.
Research shows that we feel tactile signals in just 95ms
, while visual information takes about 170ms
to be registered (Ng et al., 2017).
This is why haptic feedback is so important. When you touch a digital interface and receive a haptic response, your body instantly feels that connection.
In studying Human-Computer Interaction (HCI), researchers found that interfaces that incorporate haptic feedback have created significant improvements in how We experience tech:
- The user response speed improves by up to 35% in complex tasks (Brewster & Brown, 2004);
- Memory retention increases by 23% when combined with visual stimuli (Hoggan et al., 2008);
- Emotional engagement rises by 27% in satisfaction metrics (Lee & Starner, 2010);
- Perceived quality and responsiveness of interfaces increase by 40% (Seaborn & Antle, 2011).
When visually impaired users feel haptic feedback from touchscreen elements, they navigate mobile interfaces 28% faster (Kane et al. 2013).
Similarly, Kuber and Yu (2010) found that when haptic feedback is used with screen readers, people with disabilities understand digital content 32% better.
✨ Simplifying: the useVibration
hook
Many developers shy away from adding vibration because implementing it seems complicated.
But it's pretty simple with the Vibration API. But to make it even easier for those who use React, I built the useVibration
hook:
const [{ isSupported }, { vibrate }] = useVibration();
// In a submission button
<button
onClick={() => {
submitForm();
vibrate(VibrationPatterns.SUCCESS);
}}
>
Submit
</button>
This hook provides:
- Support detection - automatically identifying whether the device supports vibration;
- Predefined patterns - eliminating the need to create patterns from scratch;
- Minimalist API - reducing the learning curve for developers;
- Integrated TypeScript - providing type safety and auto-completion.
🧩 Design and accessibility considerations
When implementing haptic feedback, developers should:
- Be unobtrusive - vibrations should remain subtle to avoid irritating users;
- Be consistent - the same pattern should always convey the same meaning;
- Be optional - users should be able to turn off haptic feedback;
- Use in combination - never rely solely on haptic feedback.
😓 Current limitations
It's important to acknowledge that vibration support varies across browsers and devices:
- Safari on iOS does not support the vibration API;
- Desktop browsers generally lack support;
- Some Android devices may ignore complex patterns.
However, developers can quickly provide visual or auditory alternatives when needed using the hook's
isSupported
check.
📡 The future is multisensory
Haptic feedback represents an opportunity to create more intuitive and accessible interfaces.
According to Moyes and Jordan (2021), touch-based interfaces will soon become a key feature distinguishing exceptional digital experiences from mediocre ones.
When we add thoughtful haptic feedback to our designs, we build websites and apps that people will remember and connect with on a deeper, more intuitive level. 🍀
💎 References
- Ng, Brewster and Williamson (2017) - Study on how mobile device use is affected when occupied with other tasks.
- Brewster and Brown (2004) - How tactile messages (through vibration) can convey information without looking at the screen.
- Hoggan, Brewster and Johnston (2008) - Research on the effectiveness of tactile feedback on touchscreens.
- Lee and Starner (2010) - Study on wrist devices that alert through vibration.
- Seaborn and Antle (2011) - How audio and touch feedback can increase empathy.
- Kane, Wobbrock and Ladner (2013) - Useful gestures for blind people: preferences and performance.
- Kuber and Yu (2010) - Study on touch-based authentication.
- Moyes and Jordan (2021) - How interfaces using multiple senses can be a competitive advantage.
This is my first one. If you have any suggestions or want to contribute, feel free to do so. I would appreciate it very much! 😊
Top comments (4)
That's so many references! Did you take classes on haptic feedback? How did you find the resources?
Great post. I have never really though about using haptic feedback on the web. I love the idea of using it to "increase contrast" in a non visual way and make things more accessible.
I wonder how much of an issue this is. I have gotten the impression that since JAWS is more powerful than Voiceover, most people using a screen reader are not on MacOS. (I should find a reference for that information haha)
They have been my hyperfocus for the past two weeks! 😂
I found the references using Perplexity. I refined my search until I found them. Unfortunately, some of these references required payment, and oh, they are expensive.
Great post about haptic feedback! It's incredible how this functionality enhances user experience on web and mobile while helping developers with its simple code. 😎
And this feedback is very important for people who have motor limitations.
Now, the only thing missing is API support on iOS and Safari (Why, Safari? Why? 😅)
Great take on haptic feedback! 👏 Touch can enhance web experiences, and your useVibration hook makes it easy to implement. 🚀