First sight: Yanfeng XiM20

If you are designing a ride-share autonomous vehicle, how do you make an emotional connection with your passengers?

“The HMI is deliberately simple, there’s only five menu items,” smiles Tim Shih, VP for design and user experience at Yanfeng Technology. “It’s only one level deep. You don’t have to learn an entire HMI architecture when you’re only in a car for minutes.” A concept for short on-demand journeys, the XiM20 – which has just picked up a Red Dot award – proposes a different take on autonomy from some of the luxurious mobile living rooms we’ve seen lately. It’s not short on technology, though, and its very simplicity brings its own set of considerations for designers.

XiM20 – a bird’s eye view

“The car business is built on emotional connection, desirability,” says Shih. “It’s about selling an object people want to buy and own, and to show off to their friends. Designers spend hours and hours trying to create an emotional hook. But what happens when you have an interaction that is only a few minutes or hours at a time? Even for a 15-minute ride, it’s about experiences – and an emotional connection can be powerful, it can make you want to use a service again and again.”

This focus on short-trip travel was driven by Yanfeng’s Quality of Life research study, a survey of 2,000 respondents in Germany, China and the USA. “It’s a global mega-trend, the realisation that people are valuing experiences over possessions. We wanted to understand this more deeply, and how we could apply this to the auto industry, what it means for mobility,” says Shih. A key insight was how users’ priorities within the timeframes of ride- and car-sharing journeys differ from those of vehicle ownership: for a typical trip of 20 minutes or less, saving time, access, convenience and smart purchasing (of the service) dominate, while sensory appeal, leisure time, comfort and ‘surprise and delight’ features really only come into play over longer journeys.

“For 10 minutes, your needs are really limited,” says Volker Dreisbach, director of advanced design and leader of Yanfeng’s design team in Neuss, Germany. “The hypothesis is that in this kind of interior, there is not that much you would use the car [interface] to do – because you are not driving, you would use your phone or tablet. If you have time to watch a movie, what is the appetite to do that through the car when you have your own device? And today, the HMI in a car is only so complex because you have to use it while driving. Here, it is like stepping in a taxi, you just get in and ride.”

XiM20 – the interior controls can be accessed via the Smart Interior Surface (SIS) table in the front, as well as via a smartphone app

Since a journey is pre-booked via an app, Dreisbach says, “it all goes through your own mobile device and the car already knows what your preferences are, your settings. You can link your own addresses [for navigation], ‘home’: it’s a simple process.” HMI in XiM20, therefore, has been stripped down to essentials, mirrored on the app: general information such as the car’s position and direction on a map; when it was last cleansed (by a sweep of sanitising UV light); plus cabin controls, including those for temperature, seat positioning, and the audio system. These are housed in touch-sensitive, underlit capacitive surfaces embedded into the wide, flat wood-covered IP; this was designed to feel like a café table in the windowed front of the cabin, where one might sit and look out, in contrast to the more intimate, enclosed rear area.

The front seat area of the XiM20 concept

The wood was chosen for a natural feel, complementing the deliberate effort not to fit black screens: it showcases Yanfeng’s ‘shy tech’, where controls and switches are hidden until needed. Alongside the absence of visible switchgear, even features and furniture are minimal: storage is limited to open cubbies (with sensors and illumination to alert users if they have left items when they depart), and recesses for phone-charging.

UV sanitiser

“We have been having side conversations with Airbus about interior experiences, features, ambience, and the experience beyond a flight itself,” Shih adds. “There’s a lot we can learn from other industries.” Accordingly, since the experience starts before users get inside, an entry sequence has been choreographed, including welcoming lighting and a message on the ‘bulletin board’ display in the side panel, which people see as they climb in. Access is by palm recognition, although Shih notes that in user testing of the concept, some people expressed a preference for facial recognition for times when they are carrying bags or objects in their hands.

“Once settled, you touch the ‘jewel’ [switch in the centre of the IP] to activate the vehicle,” Shih explains. “Then, the touch icons in the wood dash; and it tells you the time the car was last cleansed.” He points out that the main controls are still forward-facing, as in a conventionally-driven car, and agrees that motion sickness remains a potential drawback in autonomous vehicles, although measures to mitigate it can be incorporated into the cabin controller technology. “An immersive environment, a lack of horizon [view], certain smells, can all be factors. Part of smart cabin technology is being able to react – for example, to sensing impending motion sickness, or detecting fatigue from blinking rate. If the car knows you, it can react: we can adjust the vehicle settings.”

The Active Space in the XiM20 is a unique combination of in-cabin sensing (developed with partner IEE) and surface display technology

Atmosphere can be adjusted by animated lighting on the rear inner panels and headliner: gesture controls allow the passenger to manipulate soft abstract patterns or geometric shapes, all deliberately low-resolution. This gentle ambience highlights the alternative approach to providing ever-larger screens and an ever-more technical feel, whether conceptual or for production. Dreisbach says: “The other development is to get away from screens and make things a bit more friendly, so the tech appears a bit more human.”

He adds: “You have to do the translation from this rather extreme execution. This [XiM20’s technologies] will happen at some point, but the hard part is what you do in between – if the car still has a steering wheel, how much can you introduce to make the next steps interesting? Topics like bringing functionalities into surfaces are relatively close, these things will come relatively soon. There are practical executions in two ways – one is lower-resolution, like in the space in the rear of the car, which is interactive and playful, and the other is the highly-precise way of showing information, where the resolution is as good as on a screen.”

The Smart Interior Surface table has functionality but remains an appealing surface

Dreisbach concludes: “I think the next step is to achieve a higher level of seamlessness to blend more things into the interior surfaces. XiM20 is inspiring the people [Yanfeng’s clients] we’ve presented it to, and then it sparks the discussion, ‘we want a little bit of that in the next car’. We see things happening.”