3.3 Future Development Challenges
Although multi-modal interaction fusion in in-car
AR systems has addressed many of the problems above,
shortcomings remain. In strong light or when
driving in reverse, the AR projection may not be
clear enough, impairing information transmission;
under such conditions, detection of user gestures may
also fail, leaving virtual information impossible to
manipulate. Meanwhile, if too much traffic information
is projected onto the AR display device, or if the
information layout is poorly arranged, information
overload results, which not only hinders interaction
inside the cabin but also impairs the driver's
observation of the external environment. In the
future, many other AR technologies can be applied and
further interaction modalities can be integrated, so
the system still has considerable room for improvement.
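The failure modes above suggest a simple mitigation pattern: degrade gracefully to an alternative input modality when one channel becomes unreliable, and cap the number of simultaneously projected items to avoid overload. The sketch below is purely illustrative; all class names, thresholds, and the voice-fallback policy are hypothetical assumptions, not taken from any system cited in this paper.

```python
# Illustrative sketch (hypothetical names and thresholds): modality
# fallback and overload throttling for an in-car AR interaction manager.
from dataclasses import dataclass, field

MAX_PROJECTED_ITEMS = 5        # cap to limit information overload
GESTURE_CONF_THRESHOLD = 0.7   # below this, gesture tracking is unreliable
LUX_GLARE_THRESHOLD = 50_000   # strong sunlight washes out the projection

@dataclass(order=True)
class ARItem:
    priority: int                        # lower value = more safety-critical
    label: str = field(compare=False)    # excluded from ordering

def select_items(items: list[ARItem]) -> list[ARItem]:
    """Keep only the most safety-critical items to reduce clutter."""
    return sorted(items)[:MAX_PROJECTED_ITEMS]

def choose_input_modality(gesture_confidence: float,
                          ambient_lux: float) -> str:
    """Fall back to voice when the display or gesture channel degrades."""
    if ambient_lux > LUX_GLARE_THRESHOLD:
        return "voice"   # projection unreadable in glare
    if gesture_confidence < GESTURE_CONF_THRESHOLD:
        return "voice"   # gesture detection unreliable
    return "gesture"

if __name__ == "__main__":
    items = [ARItem(3, "lane guidance"), ARItem(1, "collision warning"),
             ARItem(5, "media info"), ARItem(2, "speed limit"),
             ARItem(4, "navigation arrow"), ARItem(6, "weather")]
    print([i.label for i in select_items(items)])
    print(choose_input_modality(gesture_confidence=0.4, ambient_lux=800))
```

The design choice here is that safety-critical items (collision warnings, speed limits) always survive the cap, while discretionary content (media, weather) is dropped first.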
4 CONCLUSIONS
This study discussed how to enhance the
environmental perception and cabin experience of
intelligent vehicles. From the perspective of multi-
modal interaction, the research analyzed interaction
methods such as visual image interaction, touch
interaction, and voice interaction as means of
improving the safety, practicality, and comfort of
intelligent vehicles. At the same time, the study
identified the limitations of single-modality
interaction, as well as the functional conflicts and
clutter caused by the rigid combination of multiple
modalities. To address these pain points, this
article introduced the application of augmented
reality technology in smart cars and concluded that a
closed-loop "human-vehicle-environment" interaction
can be achieved by building an in-car AR system that
integrates multi-modal interaction, thereby
comprehensively enhancing the environmental perception
and cabin experience of intelligent cars while
ensuring safety. These findings highlight the growing
importance of integrating emerging technologies to
optimize human-vehicle interaction. In the broader
context of intelligent transportation systems, future
work should continue to explore innovative approaches
that bridge technology and user needs, driving the
evolution of safer, more intuitive, and more immersive
mobility experiences.