Future IVX (In-Vehicle Experience)

A simulated installation of the future In-Vehicle Experience.

Year of production: 2019

Xcode, Swift, Processing, Unity, C#, After Effects, Photoshop, Installation

 

The age of self-driving vehicles has arrived. For decades, the main purpose of the car was to keep us moving, but what happens once that problem is considered solved? Part of my interest in cars is the fun of driving. If all cars are fully automated, why buy one at all? BMW's M series, for example, exists for the joy of driving; in an era of autonomous driving, who will guarantee that fun? This project researches how humans and machines can coexist in future vehicles and how automotive machinery can cooperate with people.

What will cars look like in the near future? There are several levels of automation, but within a few years the road may mainly consist of fully autonomous vehicles, like the Google X self-driving car, with no pedals or steering wheel. Does that mean people can do nothing? Not at all. Through human-machine interaction, the machine can make being in a car more fun.

Consider a few examples. First, while the user "drives," the computer can act as a pacemaker: in the city, it improves navigation and assists the user by communicating with pedestrians and obstacles; on the highway, it encourages a fun driving experience, such as high-speed driving and gear shifting. Second, the car can act as an assisting friend, which requires a more emotional interface that reaches beyond the car into everyday life. The system could integrate with the user's other smart devices. On the hardware side, it could provide an ergonomic custom seat or a sound system that uses only the frequencies a specific passenger can hear. On the software side, lighting could respond to the user's emotional mood, switch to a sleep mode when they are tired, and the sound system could seamlessly pick up the music they were listening to at home. The car itself could even come to pick the user up on schedule.

This project therefore focuses on what people can do in that personal moving space, and how they respond to and interact with the autonomous-car environment.

 

Keywords: Autonomous Car, Future Car Simulation, Computational Media, Interactive Installation, Immersive Space


Target

My target users are people who have been in cars, whether as drivers or passengers. This includes anyone who wants to use a car as a private space rather than mere transportation, busy people, and anyone who wants to enjoy the city.


User Scenario Flowchart

The automobile I'm talking about here has a Level 5, fully automated driving system. Some people will still want to drive, others will want to look outside, and still others will think of various activities to do inside the car.


User Test 1


“Monitor mode is really engaging.”

“Tour guide + AR realtime”

“Design the car seat and dashboard.”  

“A car is a tool of transportation. If I’m not driving, I want to find the best way of using that free time.”

“Entertainment should be contextual.”

“I want to sleep, read books and do meditation.”

User Test 2

For the second user test, I set up a car-scaled screen running a Unity simulation and observed how participants naturally engaged with this windshield. At this stage, I simulated touch interaction with my mouse, triggering whatever the user tried to select. The idea comes from future car interiors, which have no steering wheel and only a simple dashboard. I initially assumed that touching the window would be an intuitive, natural interaction. However, most people tried to use voice control as soon as they sat down, and some participants felt uncomfortable touching the windows directly: they wanted to spend their time relaxing while still being able to interact with the vehicle, especially in sleep mode.


Final Prototype

After taking in this feedback, I looked for a solution that would give users more freedom and comfort. I decided to integrate voice recognition to control the simulation, and to move the window touch interaction to a portable device, which lets users interact without being limited to any particular posture. A sketch of one way the device could talk to the simulation follows.
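
The write-up doesn't specify how the portable device communicates with the simulation, so the following is a minimal sketch of one plausible transport, assuming the phone sends plain-text commands such as "mode:sleep" over UDP to the machine driving the projection. The class name, host, port, and message format are hypothetical, not the project's actual protocol.

```swift
import Network

// Hypothetical link between the phone app and the simulation machine.
// Assumes the simulation listens for plain-text UDP commands; the message
// format here is illustrative only.
final class SimulationLink {
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .main)
    }

    // Send a command such as "mode:sleep" or "mode:map".
    func send(_ command: String) {
        connection.send(content: command.data(using: .utf8),
                        completion: .contentProcessed { error in
            if let error = error { print("send failed: \(error)") }
        })
    }
}

// Usage: let link = SimulationLink(host: "192.168.0.10", port: 9000)
//        link.send("mode:sleep")
```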


Mobile Application

I developed this application in Swift with Xcode, using three frameworks (UIKit, Speech, AVFoundation) and one library (SwiftSiriWaveformView).
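
Since the app's voice control is built on the Speech and AVFoundation frameworks named above, here is a minimal sketch of that pipeline, assuming the app maps a few spoken keywords to simulation modes. The class name and keyword list are illustrative, not the actual project code.

```swift
import Speech
import AVFoundation

// Minimal voice-control sketch: stream microphone audio into SFSpeechRecognizer
// and spot a few mode keywords. A real app must first call
// SFSpeechRecognizer.requestAuthorization and obtain microphone permission
// (omitted here for brevity).
final class VoiceController {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func startListening(onCommand: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true
        self.request = request

        // Feed live microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Naive keyword spotting over partial transcriptions. This fires on
        // every partial result; a production app would debounce repeats.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString.lowercased() else { return }
            for keyword in ["sleep", "map", "call"] where text.contains(keyword) {
                onCommand(keyword) // e.g. forward to the simulation link
            }
        }
    }
}
```

The SwiftSiriWaveformView library used for the waveform UI exposes an amplitude property that could be updated from the same audio tap to animate the listening indicator.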

[App screens: voice control, video call, sleep mode, and map, shown on an iPhone X mockup]

Installation

The windshield envelops the user, creating a soft and cozy environment that feels organic.


This is the actual setup I built: a curved screen, rear projection, and a bean bag chair. I made the curved screen by sewing a support structure into a tent, and I chose the bean bag chair because it allows people to sit facing any direction or lie down.

Next Step & Vision

My next step is to explore more varied activities and a unique interface. In the future, there will be different 'kinds' of cars to purchase, beyond the typical binaries of small vs. big, luxury vs. affordable, and red vs. blue. Instead, we will base our purchases on categories such as sleeping cars, meeting cars, and family-friendly cars. Autonomous cars will become more than just a convenience; they'll become extensions of our home, office, and even coffee shop.

Bibliography
 

Academic Journal / Research Paper

Sirkin, D. and Ju, W. Using Embodied Design Improvisation as a Design Research Tool. International Conference on Human Behavior in Design, October 14-17, 2014.
Gkouskos, D. and Chen, F. The Use of Affective Interaction Design in In-Car User Interfaces. Work, Supplement 41: 5057-5061, 2012.
Schmidt, A., Spiessl, W. and Kern, D. Driving Automotive User Interface Research. IEEE Pervasive Computing, 9(1):85-88, Jan. 2010.
Kun, A.L., Boll, S. and Schmidt, A. Shifting Gears: User Interfaces in the Age of Autonomous Driving. IEEE Pervasive Computing, 15(1):32-38, Jan. 2016.
Tang, J.K.T. and Tewell, J. Emerging Human-Toy Interaction Techniques with Augmented and Mixed Reality. In Mobile Services for Toy Computing, 77-105, August 24, 2015.


Project

Yuan, J. The Driver-focused Human Machine Interface, http://www.driverfocusedhmi.com/, 2016
Communication Lighting for Autonomous Vehicle, Hyundai MOBIS at CES 2019
Mini Bedroom on the Road, Volvo 360c


Exhibition

The Road Ahead: Reimagining Mobility, Cooper Hewitt, Smithsonian Design Museum

 
