Game Development Based on Physiological Signals
Yan Gao
Industrial Engineering, School of Mechanical Engineering, Southwest Jiaotong University, Chengdu, Sichuan, China
https://orcid.org/0009-0001-5820-2698
Keywords: Physiological Signals, Game Development, Artificial Intelligence.
Abstract: The paradigm of gaming is shifting from mere entertainment products to intelligent emotional interaction
platforms. Integrating physiological signals—such as electroencephalography (EEG), electrocardiography
(ECG), galvanic skin response (GSR), eye tracking, and electromyography (EMG)—significantly enhances
user immersion and engagement. This study systematically analyzes the differential contributions of unimodal
and multimodal physiological signals in game development. The perceptual mechanisms and interaction
potential of key signals—including EEG (encompassing Steady-State Visual Evoked Potential (SSVEP),
P300, and Motor Imagery (MI)), eye tracking, electrooculography (EOG), and EMG—are examined.
Comparative case studies demonstrate unimodal implementations (e.g., EEG focus-controlled games, MI-
based parkour games, EMG-driven Virtual Reality (VR) rehabilitation) and multimodal integrations (e.g.,
EEG-EOG hybrid systems for intent recognition, EEG-eye tracking synchronization in VR). Results indicate
that unimodal approaches offer implementation efficiency and task-specific effectiveness, yet exhibit limited
experiential dimensionality. Conversely, multimodal integration substantially enriches the user experience
and enhances system robustness, albeit presenting significant technical integration challenges. This research
concludes that overcoming multimodal fusion bottlenecks is critical for future advancement. Refining signal
optimization mechanisms will accelerate the application of emotionally interactive gaming paradigms in
entertainment and therapeutic domains.
1 INTRODUCTION
Games have profoundly integrated into daily life and
are undergoing a transformative shift—evolving from
pure entertainment products toward intelligent
affective interaction and immersive experience
platforms. To continuously enhance player
immersion, engagement, and interest, game designers
and researchers persistently pursue more precise and
profound methods for insight into player states.
Traditionally, game design relied predominantly on
questionnaires, playtesting sessions, and designers’
intuition/experience to evaluate player experience
and guide optimization. Crucially, these approaches
exhibit fundamental limitations: most provide only
post-hoc and superficial insights, lacking the capacity
for real-time, uninterrupted, and objective detection
of players’ complex, dynamically fluctuating internal
physiological and psychological states during
gameplay (e.g., excitement, boredom, frustration,
flow state, cognitive load, startle response). This
constrains designers’ understanding of authentic
player experiences and impedes precise tuning of
gameplay and personalized adaptation. To elevate
user immersion, engagement, and interest,
researchers increasingly integrate physiological
signals—such as electroencephalography (EEG),
electrocardiography (ECG), galvanic skin response
(GSR), eye tracking, and electromyography
(EMG)—into game design. These signals offer
inherent objectivity, continuity, and accuracy,
thereby opening novel pathways for game
development. Consequently, integrating
physiological signals into game design not only
serves as a powerful complement to traditional
evaluation methods but more significantly propels
games toward greater intelligence, emotional
resonance, and deep experiential engagement,
substantially enhancing player interest.
This paper first introduces the physiological
signals employed in game research and development,
outlining the advantages of incorporating them.
Subsequently, it presents case studies of game
development utilizing unimodal physiological
signals, followed by an exploration of the benefits of
multimodal signal integration and corresponding
multimodal game development examples. A
comparative analysis concludes that unimodal and
multimodal physiological signal integration serve
distinct roles in game development. This paper aims
to provide a foundation for understanding how
physiological signals can enhance interactive
experiences, and to encourage broader applications in
next-generation game design.
2 COMMON PHYSIOLOGICAL
SIGNALS
EEG is a non-invasive technique for mapping brain
signals, providing direct measurement of cortical
activity with sub-millisecond temporal resolution
(Vaid, Singh, & Kaur, 2015). It measures voltage
fluctuations arising from ionic currents within
neurons. EEG signals are typically represented as a
two-dimensional matrix, where one dimension
corresponds to the spatial arrangement of electrodes
(channels) and the other dimension corresponds to
time, reflecting task-related brain potentials (Altaheri
et al., 2023). Key EEG-derived signals include:
Steady-State Visual Evoked Potential (SSVEP): This refers to the response detected in the visual cortex when a user fixates on a visual stimulus flickering at a specific frequency. The user's intent can be identified based on this frequency-locked response (Huang, 2021). Compared to other EEG signals, SSVEP offers the advantages of operational simplicity, strong signal stability, and superior noise resistance (Müller-Putz, Scherer, Brauneis, & Pfurtscheller, 2005). In gaming, players can interact by simply gazing at specific elements (e.g., fixating on a flashing icon can be interpreted as a command to select or activate a function).

P300: This is an event-related potential (ERP) component, specifically a positive deflection in the EEG waveform occurring approximately 300 ms after the presentation of an infrequent or significant target stimulus (e.g., a flashing letter). It plays a critical role in Brain-Computer Interfaces (BCIs), particularly in applications like the P300 speller, where detecting this waveform allows users to select characters on a screen (Cecotti & Graser, 2011). In games, it may occur when a player focuses on a specific event (e.g., their character being attacked). Detecting the P300 wave can be used to infer player intent, potentially triggering hints or dynamically reducing game difficulty.

Motor Imagery (MI): This denotes the mental rehearsal of a movement without any physical execution. In the context of EEG, MI signals are complex and exhibit high-dimensional structure (Altaheri et al., 2023). When a user imagines limb movement, characteristic changes occur in the power of the μ (8-12 Hz) and β (13-30 Hz) frequency bands within the EEG. MI enables players to control game mechanics through thought (e.g., mentally grasping objects).
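For illustration, the following minimal Python sketch shows how μ and β band power could be extracted from an EEG segment stored as a channels × time array, as described above; the sampling rate, band limits, and Welch parameters are assumptions chosen only for this example, not values from any cited system.

```python
# Minimal sketch: band-power features from a (channels x samples) EEG array.
# Sampling rate and band limits are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz

def band_power(eeg, band, fs=FS):
    """Mean power per channel within a frequency band (e.g. the mu or beta band)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=-1)   # one value per channel

# Example: 8 channels, 2 seconds of random data standing in for real EEG.
eeg = np.random.randn(8, 2 * FS)
mu_power = band_power(eeg, (8, 12))     # mu rhythm (8-12 Hz)
beta_power = band_power(eeg, (13, 30))  # beta rhythm (13-30 Hz)
```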
Eye Tracking involves monitoring eye
movements (gaze points, pupil changes) using
cameras or infrared sensors. Eye tracking technology
has emerged as a promising tool for enhancing user
experience and interaction in Virtual Reality (VR)
games. Research indicates that smooth pursuit eye
movements can be effectively utilized for object
selection and interaction within VR environments
(Khamis et al., 2018). Furthermore, eye tracking
metrics can assess user experience, satisfaction, and
attention in VR games, enabling the development of
adaptive games that respond in real-time to player
performance and behavior (Soler-Dominguez et al.,
2017).
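As an illustration of the smooth-pursuit idea, the sketch below correlates a gaze trace with the trajectories of moving candidate targets and selects the best match; the correlation threshold and data shapes are assumptions for this example and do not reproduce the VRpursuits implementation.

```python
# Minimal sketch of smooth-pursuit selection: pick the moving target whose
# trajectory best correlates with the recorded gaze trace.
import numpy as np

def select_pursuit_target(gaze_xy, target_trajs, threshold=0.8):
    """gaze_xy: (N, 2) gaze samples; target_trajs: dict of name -> (N, 2) positions."""
    best_name, best_score = None, threshold
    for name, traj in target_trajs.items():
        # Correlate horizontal and vertical components separately, then average.
        cx = np.corrcoef(gaze_xy[:, 0], traj[:, 0])[0, 1]
        cy = np.corrcoef(gaze_xy[:, 1], traj[:, 1])[0, 1]
        score = (cx + cy) / 2.0
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None if no target's motion matches the gaze closely enough
```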
Electrooculography (EOG) is a technique for
measuring eye movements by recording the electrical
potential around the eyes. It is based on the
corneo-retinal standing potential, which changes with eye
movement. EOG technology offers significant
advantages for VR game interfaces. EOG-based
systems provide hands-free, natural interaction,
thereby enhancing the immersive experience within
VR environments (Kumar & Sharma, 2016).
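A minimal, hedged sketch of how blink events might be detected from a vertical EOG channel is given below; the amplitude threshold, refractory window, and sampling rate are illustrative assumptions rather than values from the cited work.

```python
# Minimal sketch: threshold-based blink detection on a vertical EOG channel.
import numpy as np

def detect_blinks(veog, fs=250, threshold_uv=100.0, refractory_s=0.3):
    """Return sample indices of blink-like deflections in a vertical EOG trace."""
    candidates = np.flatnonzero(np.abs(veog) > threshold_uv)
    blinks, last = [], -np.inf
    for idx in candidates:
        if idx - last > refractory_s * fs:   # skip samples inside the refractory window
            blinks.append(int(idx))
            last = idx
    return blinks
```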
EMG is a technique used to record the electrical
activity produced by muscles. It measures the
electrical potentials generated during voluntary or
involuntary muscle contractions. This method is
crucial for various applications, including muscle
fatigue detection, control of robotic mechanisms and
prosthetics, and the clinical diagnosis of
neuromuscular disorders (Boyer, Bouyer, Roy, &
Campeau-Lecours, 2023).
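For example, a common way to turn raw EMG into a usable control or fatigue signal is to compute its envelope by rectification and low-pass filtering; the sketch below assumes a sampling rate and cutoff frequency chosen purely for illustration.

```python
# Minimal sketch: a standard EMG envelope (rectification + low-pass filtering).
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(emg, fs=1000, cutoff_hz=5.0):
    """Full-wave rectify a raw EMG trace and smooth it with a low-pass filter."""
    rectified = np.abs(emg - np.mean(emg))   # remove DC offset, then rectify
    b, a = butter(4, cutoff_hz / (fs / 2))   # 4th-order Butterworth low-pass
    return filtfilt(b, a, rectified)
```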
3 HARNESSING
PHYSIOLOGICAL SIGNALS
FOR GAME DEVELOPMENT:
UNIMODAL AND
MULTIMODAL APPROACHES
3.1 Integration of Unimodal
Physiological Signals in Game
Development
Integrating physiological signals into game
development has emerged as a transformative
approach, significantly enhancing both gameplay
experience and entertainment value. Game
development based on unimodal physiological
signals offers relative technical simplicity and
demonstrates high efficacy for direct control or
affective state monitoring, substantially improving
user immersion and overall experience.
Lv et al. designed a BCI game utilizing the TGAM
EEG module and the Unity engine. Their method
involved extracting features from EEG waves via the
TGAM module and processing these features to
derive user focus levels and blink signals. The Unity
engine was then employed to control a ball's speed
changes and jumping behavior based on the extracted
focus data and blink signals. This approach markedly
enhanced user immersion and experience during
gameplay while also improving player attention. It
represents a successful experimental application of
BCI technology within the gaming domain (Lv et al.,
2024).
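The original system was implemented in Unity; the hypothetical Python sketch below only illustrates the described mapping, in which the focus value modulates the ball's speed and a detected blink triggers a jump. The value ranges and gains are assumptions made for illustration.

```python
# Sketch of the described control mapping: focus level -> ball speed,
# blink event -> jump. Tuning constants are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Ball:
    position: float = 0.0
    speed: float = 0.0
    on_ground: bool = True

    def jump(self):
        # Placeholder for the engine's jump behaviour.
        self.on_ground = False

def update_ball(ball, focus_level, blink_detected, dt):
    """focus_level: 0-100 attention value from the headset; blink_detected: bool."""
    base_speed, gain = 1.0, 0.05                   # assumed tuning constants
    ball.speed = base_speed + gain * focus_level   # higher focus -> faster ball
    if blink_detected and ball.on_ground:
        ball.jump()                                # a blink triggers a jump
    ball.position += ball.speed * dt
```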
Du proposed a comprehensive method for
constructing an online MI BCI gaming platform using
portable devices. The method first involved designing
a data acquisition paradigm and incorporating both
offline and online training to enhance subjects' MI
capabilities. Secondly, the author presented a
preprocessing scheme tailored for offline training to
improve the signal-to-noise ratio (SNR) and
conducted a comprehensive analysis of various
feature extraction methods—including Common
Spatial Patterns (CSP) and Riemannian geometry—
to optimize feature selection. Subsequently, the study
focused on analyzing the generalization capability
and robustness of Riemannian geometry classifiers,
conducting comparative experiments using CSP +
Linear Discriminant Analysis (LDA) as a benchmark
across different sessions and subjects. These
experiments validated the method's effectiveness and
feasibility on a self-collected dataset. Finally, a
complete BCI platform—encompassing data
acquisition to game control—and an online MI-
controlled parkour game were designed and
implemented. Following offline training, all 4
subjects successfully achieved real-time control of
the game, attaining a maximum game control success
rate of 87.5%. This platform also serves as an
application testing framework for other BCI systems
(Du, 2022).
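The exact pipeline used in the thesis is not reproduced here, but the following hedged sketch illustrates the kind of benchmark described, comparing a CSP + LDA pipeline with a Riemannian-geometry classifier on epoched MI trials, using MNE, pyRiemann, and scikit-learn as stand-in tools.

```python
# Hedged sketch: cross-validated comparison of CSP + LDA against a
# Riemannian minimum-distance-to-mean (MDM) classifier on MI epochs.
from mne.decoding import CSP
from pyriemann.estimation import Covariances
from pyriemann.classification import MDM
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def compare_mi_pipelines(X, y):
    """X: (n_trials, n_channels, n_samples) band-pass filtered MI epochs; y: labels."""
    csp_lda = make_pipeline(CSP(n_components=4), LinearDiscriminantAnalysis())
    riemann = make_pipeline(Covariances(estimator="oas"), MDM())
    return {
        "CSP + LDA": cross_val_score(csp_lda, X, y, cv=5).mean(),
        "Riemannian MDM": cross_val_score(riemann, X, y, cv=5).mean(),
    }
```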
Duan et al. developed a VR wrist rehabilitation
training system based on EMG signals. This research
involved acquiring surface EMG signals from wrist
movements and decoding movement intention using
muscle synergy theory to control the VR game.
Concurrently, stochastic disturbance forces were
introduced within the game environment, and
impedance control was employed to facilitate
interaction between the user and the virtual
environment, encouraging the exploration of different
movement control strategies. Experiments confirmed
the system's feasibility. Comparative training
experiments (with vs. without stochastic disturbance
forces) evaluated its effectiveness, demonstrating that
training with disturbance forces reduced task
completion time by 24% and increased path
efficiency by 26% compared to training without such
forces. These results prove that the system effectively
promotes the adoption of more efficient motor control
strategies by users, thereby enhancing training
efficiency (Duan, Zeng, & Song, 2024).
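As a rough illustration of the interaction scheme described above, the sketch below combines a simple impedance law with a stochastic disturbance force; the stiffness, damping, and noise parameters are assumptions and do not correspond to the published system.

```python
# Sketch: impedance-style interaction force plus a random disturbance term.
import numpy as np

def interaction_force(x, v, x_target, v_target=0.0,
                      stiffness=50.0, damping=5.0, disturbance_std=2.0):
    """Impedance law F = K*(x_d - x) + B*(v_d - v) plus a stochastic perturbation."""
    impedance = stiffness * (x_target - x) + damping * (v_target - v)
    disturbance = np.random.normal(0.0, disturbance_std)  # random disturbance force
    return impedance + disturbance
```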
3.2 Game Development Based on
Multimodal Physiological Signals
Compared to unimodal approaches, multimodal
integration provides users with richer, more adaptive
gaming experiences and enables more accurate
emotion detection. The fusion of multiple signals
enhances interaction richness and system robustness,
making it particularly suitable for highly immersive
scenarios such as VR.
Li et al. proposed a hybrid BCI system method
based on EEG and blink EOG for recognizing MI
intentions. The system's performance was tested in
both 3D Tetris and 2D game environments. The study
focused on: 1) extracting features from EEG, comparing CSP
methods against a proposed multi-feature extraction
approach; and 2) developing game-BCI control
strategies to improve players' BCI control
proficiency. To validate the effectiveness of the 3D
game environment in enhancing players' ability to
generate Event-Related Desynchronization
(ERD)/Event-Related Synchronization (ERS), the 2D
screen-based game served as the control experiment.
Statistical results demonstrated that the group
executing MI tasks within the 3D Tetris environment
exhibited a significantly greater enhancement in
generating MI-related ERD/ERS. Game score
analysis revealed a clear upward trend in player
scores within the 3D environment, whereas no
significant decreasing trend was observed in the 2D
environment. These findings indicate that an
immersive and control-rich MI environment can
improve relevant mental imagery and enhance MI-
based BCI skills (Li et al., 2017).
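ERD/ERS is conventionally quantified as the relative band-power change of a task interval with respect to a baseline interval; the minimal sketch below illustrates this computation for a single channel, with band limits and windowing chosen only for illustration.

```python
# Minimal sketch of ERD/ERS quantification: relative band-power change of a
# task segment versus a baseline segment. Negative values indicate ERD,
# positive values ERS.
from scipy.signal import welch

def erd_ers_percent(baseline, task, fs=250, band=(8, 12)):
    """baseline, task: 1-D single-channel EEG segments sampled at fs."""
    def band_power(x):
        freqs, psd = welch(x, fs=fs, nperseg=min(len(x), fs))
        return psd[(freqs >= band[0]) & (freqs <= band[1])].mean()

    reference = band_power(baseline)
    activity = band_power(task)
    return (activity - reference) / reference * 100.0
```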
Larsen et al. presented a synchronization method
for multimodal physiological data streams,
specifically integrating EEG with eye-tracking within
a VR headset. They implemented a hybrid SSVEP-
based BCI speller within a fully immersive VR
environment as a proof-of-concept use case.
Hardware latency analysis indicated an average offset
of 36 ms and an average jitter of 5.76 ms between the
EEG and eye-tracking data streams. The proposed
VR-BCI speller concept demonstrated its potential
for real-world applications. These results confirm the
feasibility of combining EEG and VR technology for
neuroscientific research, establishing new pathways
for studying brain activity within VR environments.
This work also lays the groundwork for refining
synchronization methods and exploring application
scenarios such as learning and social interaction
(Larsen et al., 2024).
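The authors' synchronization pipeline is not reproduced here; the generic sketch below merely illustrates how an offset and jitter of this kind can be quantified from the timestamps of shared trigger events recorded in both streams.

```python
# Generic sketch: quantify offset and jitter between two data streams from
# timestamps of the same trigger events recorded in each stream.
import numpy as np

def stream_offset_jitter(eeg_event_times, eye_event_times):
    """Both inputs: timestamps (seconds) of the same events, one array per stream."""
    diffs = np.asarray(eye_event_times) - np.asarray(eeg_event_times)
    offset = diffs.mean()   # systematic lag between the two streams
    jitter = diffs.std()    # variability of that lag around the mean
    return offset, jitter
```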
4 CURRENT LIMITATIONS AND
FUTURE OUTLOOK
Game development based on unimodal physiological
signals is relatively mature. Unimodal signals, such
as Electrodermal Activity (EDA) and EEG, provide
valuable insights into player emotions during
gameplay. This approach enables developers to create
games that adapt to players’ emotional responses,
enhancing engagement and immersion. By leveraging
physiological signals, games can dynamically adjust
based on player reactions. For instance, game
difficulty or narrative elements can be modified in
real-time according to a player’s stress or excitement
levels, leading to more personalized gaming
experiences. The commercial viability of games
incorporating physiological signals is growing. The
release of controllers with integrated physiological
sensors, such as Sony’s Dualshock 5, signifies a trend
toward mainstream acceptance of biofeedback in
gaming. This development will likely drive broader
adoption of physiological signals in game design
(Hughes & Jorda, 2021). Conversely, multimodal
integration combines multiple physiological signals
to deliver richer, more accurate gaming experiences,
significantly boosting player interest. Nevertheless,
multimodal game development faces substantial
technical challenges. Effectively integrating and
optimizing these heterogeneous physiological signals
remains a primary hurdle. Future research should
prioritize optimizing signal utilization to enhance
player experience and therapeutic outcomes, and
exploring the full potential of multimodal approaches
to overcome current limitations.
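As a hedged illustration of the real-time adaptation described above, the sketch below nudges a normalized difficulty value toward a target arousal band; the arousal index, thresholds, and step size are hypothetical and not drawn from any cited system.

```python
# Sketch: simple dynamic difficulty adjustment driven by a physiological
# arousal index (e.g. derived from GSR or heart rate).
def adjust_difficulty(difficulty, arousal, target=(0.4, 0.7), step=0.05):
    """Both difficulty and arousal are assumed to be normalized to the range 0-1."""
    low, high = target
    if arousal > high:          # over-aroused or stressed player: ease off
        difficulty -= step
    elif arousal < low:         # under-aroused or bored player: push harder
        difficulty += step
    return min(1.0, max(0.0, difficulty))
```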
5 CONCLUSIONS
As games evolve from entertainment products toward
intelligent affective interaction media, integrating
physiological signals (EEG, ECG, GSR, eye tracking,
EMG, etc.) into game design is becoming a critical
technological pathway to enhance user immersion,
engagement, and interest. This paper systematically
reviews the sensing mechanisms of common
physiological signals and examines the distinct
contributions and applications of unimodal versus
multimodal physiological signals in game
development.
The research first explains the principles and
game interaction potential of EEG (including SSVEP,
P300, MI), eye tracking, EOG, and EMG signals. It
then focuses on two key dimensions: applications of unimodal physiological signals in games, and game development based on multimodal physiological signals. The unimodal analysis highlights the technical simplicity of this approach through case studies including brain-controlled games (focus/blink-controlled ball movement, MI-based running games) and EMG-driven VR rehabilitation training, demonstrating its effectiveness in enhancing immersion and enabling specific functional control. For multimodal fusion, the paper explores techniques such as EEG+EOG for MI intention recognition and EEG+eye tracking integration in VR, establishing the significant value of multimodal approaches in delivering richer adaptive experiences, improving interaction robustness (especially in VR scenarios), and facilitating user skill acquisition.
This study constructs a methodological
framework for physiological signal selection and
fusion design, summarizing key technologies. This
paper concludes that while unimodal approaches
offer simplicity, they provide limited experiential
dimensions; multimodal integration substantially
enhances experiential richness and accuracy but faces
challenges in technical integration. Future research
should prioritize overcoming multimodal fusion
bottlenecks to advance optimized applications in
gaming and therapeutic domains. This work
establishes theoretical and technical foundations for
developing next-generation intelligent affective
interaction games.
REFERENCES
Altaheri, H., Muhammad, G., Alsulaiman, M., et al. (2023).
Deep learning techniques for classification of
electroencephalogram (EEG) motor imagery (MI)
signals: A review. Neural Computing and Applications,
35(20), 14681–14722.
Boyer, M., Bouyer, L., Roy, J. S., & Campeau-Lecours, A.
(2023). Reducing noise, artifacts, and interference in
single-channel EMG signals: A review. Sensors (Basel),
23(6), 2927.
Cecotti, H., & Graser, A. (2011). Convolutional neural
networks for P300 detection with application to brain-
computer interfaces. IEEE Transactions on Pattern
Analysis and Machine Intelligence, 33(3), 433–445.
Du, S. (2022). Design and implementation of online games
based on motor imagery EEG signals (Master's thesis).
Harbin Institute of Technology.
Duan, Y., Zeng, H., & Song, A. (2024). Research on wrist
training system of myoelectric controlled virtual reality
game. Journal of Nanjing University of Information
Science and Technology, 16(01), 76–82.
Huang, Z. (2021). Research on context-driven AR-BCI
brain-computer interaction methods (Master's thesis).
Zhengzhou University.
Hughes, A., & Jorda, S. (2021). Applications of biological
and physiological signals in commercial video gaming
and game research: A review. Frontiers in Computer
Science, 3, 557608.
Khamis, M., Oechsner, C., Alt, F., & Bulling, A. (2018).
VRpursuits: Interaction in virtual reality using smooth
pursuit eye movements. Proceedings of the 2018
International Conference on Advanced Visual
Interfaces (AVI '18), 1–8.
Kumar, D., & Sharma, A. (2016). Electrooculogram-based
virtual reality game control using blink detection and
gaze calibration. Proceedings of the 2016 International
Conference on Advances in Computing,
Communications and Informatics (ICACCI), Jaipur,
India, 2358–2362.
Larsen, O. F., Tresselt, W. G., Lorenz, E. A., Holt, T.,
Sandstrak, G., Hansen, T. I., ... & Holt, A. (2024). A
method for synchronized use of EEG and eye tracking
in fully immersive VR. Frontiers in Human
Neuroscience, 18, 1347974.
Li, T., Zhang, J., Xue, T., & Wang, B. (2017). Development
of a novel motor imagery control technique and
application in a gaming environment. Computational
Intelligence and Neuroscience, 2017, 5863512.
Lv, Z., Hu, W., Bo, H., et al. (2024). Immersive game
development based on brainwave detection. Internet of
Things Technology, 14(09), 12–14.
Müller-Putz, G. R., Scherer, R., Brauneis, C., &
Pfurtscheller, G. (2005). Steady-state visual evoked
potential (SSVEP)-based communication: Impact of
harmonic frequency components. Journal of Neural
Engineering, 2(4), 123–130.
Soler-Dominguez, J. L., Camba, J. D., Contero, M., &
Alcañiz, M. (2017). A proposal for the selection of eye-
tracking metrics for the implementation of adaptive
gameplay in virtual reality-based games. In S. Lackey
& J. Chen (Eds.), Virtual, Augmented and Mixed
Reality. VAMR 2017. Lecture Notes in Computer
Science, vol 10280 (pp. 359–366). Springer.
Vaid, S., Singh, P., & Kaur, C. (2015). EEG signal analysis
for BCI interface: A review. Proceedings of the 2015
Fifth International Conference on Advanced
Computing & Communication Technologies, Haryana,
India, 143–147.