Apple Music brings its audio haptics feature to all users as part of iOS 18
Apple's Music Haptics feature is now live as part of the official release of iOS 18. This is an accessibility tool that integrates with Apple Music on iPhones. Simply put, it uses the phone's haptics hardware, which the company refers to as the Taptic Engine, to create taps, textures and refined vibrations that match the audio of the song.
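Music Haptics itself is a system-level feature, but for a rough sense of how apps drive the Taptic Engine in general, here's a minimal Core Haptics sketch that plays a short run of taps of increasing intensity. It's purely illustrative and assumes nothing about Apple's actual Music Haptics implementation; the timing and intensity values are arbitrary.

import CoreHaptics

// Illustrative only: a few transient "taps" on the Taptic Engine via Core Haptics.
// This is not the Music Haptics feature, just a sketch of how haptic patterns work.
func playSampleTaps() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four taps spaced 0.25s apart, each slightly stronger than the last.
    var events: [CHHapticEvent] = []
    for i in 0..<4 {
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: 0.4 + 0.15 * Float(i))
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        events.append(CHHapticEvent(eventType: .hapticTransient,
                                    parameters: [intensity, sharpness],
                                    relativeTime: 0.25 * Double(i)))
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}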
This is quite obviously aimed at those affected by hearing loss, allowing them to feel the music. It works with Apple Music, but also with Apple Music Classical and Shazam. The company says it'll also integrate with some third-party apps, so long as the iPhone is connected to Wi-Fi or cellular.
To get started, just head into the Accessibility settings menu and turn on Music Haptics. An easily identifiable logo will appear on the Now Playing screen in the Apple Music app when the feature is active. Tapping this logo will pause the feature and tapping it again will turn it back on. Music Haptics is supported globally on iPhone 12 and later, as long as the device is updated to iOS 18.
To commemorate the launch, Apple Music has released a series of playlists that take advantage of the haptic technology. With names like Haptics Beats and Haptics Bass, these playlists are filled with songs that offer plenty of opportunity for taps and vibrations.
People have already been experimenting with the feature. Some users have suggested that it sounds "like an Atari game" when a phone is placed on a box with Music Haptics turned on. I don't agree but, well, listen for yourself.