Automakers Race to Add Voice AI to Connected Vehicles

When rain begins to fall and a driver says, "Hey Mercedes, is adaptive cruise control on?" the car doesn't just reply. It reassures, adjusts, and nudges the driver to keep their hands on the wheel. Welcome to the age of conversational mobility, where natural dialogue with your car is becoming as routine as checking the weather on a smart speaker.
A new era of human-machine interaction

This shift is more than a gimmick. Conversational interfaces represent the next evolution of vehicle control, allowing drivers to interact with advanced driver-assistance systems without fiddling with buttons or touchscreens. Automakers are embedding generative AI into infotainment and safety systems with the goal of making driving less stressful, more intuitive, and ultimately safer. Unlike earlier voice systems that relied on canned commands, these assistants understand natural speech, can ask follow-up questions, and tailor responses based on context and the driver's behavior. BMW, Ford, Hyundai, and Mercedes-Benz are spearheading this transformation with voice-first systems that integrate generative AI and cloud services into the driving and navigating experience. Tesla's Grok, by contrast, remains mostly an infotainment companion, for now. It has no access to onboard vehicle control systems, so it cannot adjust temperature, lighting, or navigation functions. And unlike the approach taken by the early leaders in adding voice AI to the driving experience, Grok responds only when prompted.
Mercedes leads with MBUX and AI partnerships

Mercedes-Benz is setting the benchmark. Its Mercedes-Benz User Experience (MBUX) system, unveiled in 2018, integrated generative AI via ChatGPT and Microsoft's Bing search engine, with a beta launched in the United States in June 2023. By late 2024, the assistant was active in over 3 million vehicles, offering conversational navigation, real-time assistance, and multilingual responses. Drivers activate it by simply saying, "Hey Mercedes." The system can then anticipate a driver's needs proactively. Imagine a driver steering along the scenic Grossglockner High Alpine Road in Austria, hands tightly gripping the wheel. If the MBUX AI assistant senses via biometric data that the driver is stressed, it will adjust the ambient lighting to a calming blue hue. Then a gentle, empathetic voice says, "I've adjusted the suspension for smoother handling and lowered the cabin temperature by two degrees to keep you comfortable." At the same time, the assistant reroutes the driver around a developing weather front and offers to play a curated playlist based on the driver's recent favorites and mood trends.
A car with Google Maps will today let the driver say "Okay, Google" and then ask the assistant to do things like change the destination or call someone on the smartphone. But the newest generation of AI assistants, meant to be interactive companions and copilots for drivers, presents an entirely different level of collaboration between car and driver. The transition to Google Cloud's Gemini AI, through Mercedes' proprietary MB.OS platform, enables MBUX to remember past conversations and adjust to driver habits-like a driver's tendency to hit the gym every weekday after work-and offer route suggestions and traffic updates without being prompted. Over time, it establishes a driver profile: a set of understandings about what vehicle settings that person likes (warm air and heated seats in the morning for comfort, and cooler air at night for alertness, for example), which it uses to adjust those settings automatically. For privacy, all voice data and driver-profile information are stored in the Mercedes-Benz Intelligent Cloud, the backbone that also keeps the suite of MB.OS features and applications connected.
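To make the idea of a driver profile concrete, here is a minimal Python sketch of how stored preferences and learned habits could drive automatic adjustments and unprompted suggestions. The class, field names, and thresholds are illustrative assumptions, not Mercedes' actual MB.OS data model.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class DriverProfile:
    """Hypothetical sketch of learned preferences; not Mercedes' actual schema."""
    morning_cabin_temp_c: float = 23.5   # prefers warm air and heated seats in the morning
    evening_cabin_temp_c: float = 20.0   # prefers cooler air at night for alertness
    heated_seats_morning: bool = True
    usual_weekday_stop: str = "gym"      # learned habit: gym after work on weekdays


def apply_cabin_settings(profile: DriverProfile, now: datetime) -> dict:
    """Choose climate settings from the stored profile based on time of day."""
    is_morning = 5 <= now.hour < 12
    return {
        "cabin_temp_c": profile.morning_cabin_temp_c if is_morning else profile.evening_cabin_temp_c,
        "heated_seats": profile.heated_seats_morning and is_morning,
    }


def suggest_destination(profile: DriverProfile, now: datetime) -> str | None:
    """Offer an unprompted suggestion on weekday evenings, when the habit usually applies."""
    if now.weekday() < 5 and 16 <= now.hour < 19:
        return profile.usual_weekday_stop
    return None


now = datetime(2025, 6, 2, 17, 30)                  # a Monday, shortly after work
print(apply_cabin_settings(DriverProfile(), now))   # cooler cabin, no heated seats
print(suggest_destination(DriverProfile(), now))    # -> "gym"
```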
BMW: from gesture control to voice first

Although BMW pioneered gesture control with the 2015 7 Series, it's now fully embracing voice-first interaction. At CES 2025, it introduced Operating System X with BMW's Intelligent Personal Assistant (IPA), a generative AI interface in development since 2016 that anticipates driver needs. Say a driver is steering the new iX M70 along an alpine roadway on a brisk October morning. Winding roads, sudden elevation changes, narrow tunnels, and shifting weather make for a beautiful but demanding journey. Operating System X, sensing that the car is ascending past 2,000 meters, offers a bit of scene-setting information and advice: "You're entering a high-altitude zone with tight switchbacks and intermittent fog. Switching to Alpine Drive mode for optimized torque distribution and adaptive suspension damping [to improve handling and stability]." The brain undergirding this contextual awareness now runs on Amazon's Alexa Custom Assistant architecture.
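The rule-based side of that contextual awareness can be pictured as a simple gate on sensor inputs. The Python sketch below is purely illustrative; the thresholds, mode name, and function are assumptions, not BMW's implementation.

```python
def select_drive_mode(altitude_m: float, visibility_m: float, current_mode: str) -> tuple[str, str | None]:
    """Illustrative rule: switch to an alpine profile above 2,000 meters or in fog.

    The threshold values and mode name are hypothetical, not BMW's actual logic.
    """
    if (altitude_m > 2000 or visibility_m < 200) and current_mode != "alpine":
        advisory = ("You're entering a high-altitude zone with tight switchbacks and "
                    "intermittent fog. Switching to Alpine Drive mode for optimized "
                    "torque distribution and adaptive suspension damping.")
        return "alpine", advisory
    return current_mode, None


mode, advisory = select_drive_mode(altitude_m=2150, visibility_m=150, current_mode="comfort")
print(mode)      # -> alpine
print(advisory)  # -> the spoken advisory above
```

In a real assistant the spoken line would come from the generative model rather than a hard-coded string; the sketch only shows the kind of sensor-gated trigger behind it.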
"The Alexa technology will enable an even more natural dialogue between the driver and the vehicle, so drivers can stay focused on the road," said Stephan Durach, senior vice president of BMW's Connected Car Technology division, when Alexa Custom Assistant's launch in BMW vehicles was announced in 2022. In China, BMW uses domestic LLMs from Alibaba, Banma, and DeepSeek AI in preparation for Mandarin fluency in the 2026 Neue Klasse.
"Our ultimate goal is to achieve...a connected mobility experience expanding from a vehicle to fleets, hardware to software, and ultimately to the entire mobility infrastructure and cities." -Chang Song, head of Hyundai Motor and Kia's Advanced Vehicle Platform R&D Division
Ford Sync, Google Assistant, and the path to autonomy

Ford, too, is pushing ahead. The company's vision: a system that lets drivers take Zoom calls while the vehicle does the driving-that is, once Level 3 vehicle autonomy is reached and cars can reliably drive themselves under certain conditions. Since 2023, Ford has integrated Google Assistant into its Android-based Sync system for voice control over navigation and cabin settings. Meanwhile, its subsidiary Latitude AI is developing Level 3 autonomous driving, expected by 2026.
Hyundai researchers test Pleos Connect at the Advanced Research Lab's UX Canvas space inside Hyundai Motor Group's UX Studio in Seoul. The group's infotainment system utilizes a voice assistant called Gleo AI. Photo: Hyundai
Hyundai took a bold step at CES 2024, announcing an LLM-based assistant codeveloped with Korean search giant Naver. In the bad-weather, alpine-driving scenario, Hyundai's AI assistant detects, via readings from vehicle sensors, that road conditions are changing due to oncoming snow. It won't read the driver's emotional state, but it will calmly deliver an alert: "Snow is expected ahead. I've adjusted your traction control settings and found a safer alternate route with better road visibility." The assistant, which also syncs with the driver's calendar, says, "You might be late for your next meeting. Would you like me to notify your contact or reschedule?"
In 2025, Hyundai partnered with Nvidia to enhance this assistant using digital twins-virtual replicas of physical objects, systems, or processes-which, in this case, mirror the vehicle's current status (engine health, tire pressure, battery levels, and inputs from sensors such as cameras, lidar, or radar). This real-time vehicle awareness gives the AI assistant the wherewithal to suggest proactive maintenance ("Your brake pads are 80 percent worn. Should I schedule service?") and adjust vehicle behavior ("Switching to EV mode for this low-speed zone."). Digital twins also allow the assistant to integrate real-time data from GPS, traffic updates, weather reports, and road sensors. This information lets it reliably optimize routes based on actual terrain and vehicle condition, and recommend driving modes based on elevation, road surface conditions, and weather. And because it's capable of remembering things about the driver, Hyundai's assistant will eventually start conversations with queries showing that it's been paying attention: "It's Monday at 8 a.m. Should I queue your usual podcast and navigate to the office?" The system will debut in 2026 as part of Hyundai's "Software-Defined Everything" (SDx) initiative, which aims to turn cars into constantly updating, AI-optimized platforms.
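As a rough illustration of how a mirrored vehicle state could feed those proactive prompts, here is a short Python sketch. The field names, wear threshold, and prompt wording are assumptions made for illustration, not Hyundai's or Nvidia's actual digital-twin API.

```python
from dataclasses import dataclass


@dataclass
class VehicleTwin:
    """Hypothetical snapshot of a vehicle digital twin (not Hyundai's or Nvidia's schema)."""
    brake_pad_wear_pct: float   # 0 = new, 100 = fully worn
    tire_pressure_kpa: float    # front-left tire, for simplicity
    battery_level_pct: float
    zone_speed_limit_kph: int   # from GPS and road-sensor data


def proactive_prompts(twin: VehicleTwin) -> list[str]:
    """Turn the twin's current readings into assistant prompts."""
    prompts = []
    if twin.brake_pad_wear_pct >= 80:
        prompts.append(f"Your brake pads are {twin.brake_pad_wear_pct:.0f} percent worn. "
                       "Should I schedule service?")
    if twin.zone_speed_limit_kph <= 30 and twin.battery_level_pct > 20:
        prompts.append("Switching to EV mode for this low-speed zone.")
    if twin.tire_pressure_kpa < 210:
        prompts.append("Your front-left tire pressure is low. Want directions to the nearest air pump?")
    return prompts


twin = VehicleTwin(brake_pad_wear_pct=80, tire_pressure_kpa=240,
                   battery_level_pct=65, zone_speed_limit_kph=30)
for line in proactive_prompts(twin):
    print(line)
```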
Speaking in March at the inaugural Pleos 25, Hyundai's software-defined-vehicle developer conference in Seoul, Chang Song, head of Hyundai Motor and Kia's Advanced Vehicle Platform R&D Division, laid out an ambitious plan: "Our ultimate goal is to achieve cloud mobility, where all forms of mobility are connected through software in the cloud, and continuously evolve over time." In this vision, Hyundai's Pleos software-defined vehicle technology platform will create "a connected mobility experience expanding from a vehicle to fleets, hardware to software, and ultimately to the entire mobility infrastructure and cities."
Tesla: Grok arrives-but not behind the wheel

On 10 July, Elon Musk announced via the X social media platform that Tesla would soon begin equipping its vehicles with its Grok AI assistant in Software Update 2025.26. Deployment began 12 July across the Model S, 3, X, and Y and the Cybertruck, on vehicles with Hardware 3.0 or newer and AMD's Ryzen infotainment system-on-a-chip technology. Grok handles news and weather queries, but it doesn't control any driving functions. Unlike competitors, Tesla hasn't committed to voice-based semi-autonomous operation. Voice queries are processed through xAI's servers, and while Grok has potential as a copilot, Tesla has not released any specific goals or timelines in that direction. The company did not respond to requests for comment about whether Grok will ever assist with autonomy or driver transitions.
Toyota: quietly practical with AI

Toyota is taking a more pragmatic approach, aligning AI use with its core values of safety and reliability. In 2016, Toyota began developing Safety Connect, a cloud-based telematics system that detects collisions and automatically contacts emergency services-even if the driver is unresponsive. Its "Hey Toyota" and "Hey Lexus" AI assistants, launched in 2021, handle basic in-car commands (climate control, opening windows, and radio tuning) like other systems, but their standout features include minor collision detection and predictive maintenance alerts. Hey Toyota may not plan scenic routes with Chick-fil-A stops, but it will warn a driver when the brakes need servicing or an oil change is due.
UX concepts are validated in Hyundai's Simulation Room. Photo: Hyundai
While promising, AI-driven interfaces carry risks. A U.S. automotive-safety nonprofit told IEEE Spectrum that natural voice systems might reduce distraction compared with menu-based interfaces, but they can still impose "moderate cognitive load." Drivers could also mistakenly assume the car can handle more than it's designed to do without supervision.
IEEE Spectrum has covered earlier iterations of automotive AI-particularly in relation to vehicle autonomy, infotainment, and tech that monitors drivers to detect inattention or impairment. What's new is the convergence of generative language models, real-time personalization, and vehicle system control-once distinct domains-into a seamless, spoken interface.