Author:
Gustav Öquist
Affiliation:
Uppsala University, Sweden
Keyword(s):
Mobile devices, Interaction, Multimodality, Framework, Semiotics.
Related Ontology Subjects/Areas/Topics:
Agents; Artificial Intelligence; Biomedical Engineering; Biomedical Instruments and Devices; Emerging Technologies; Mobile Multimedia Applications; Mobile Software and Services; Personal Communication Systems; Pervasive Computing; Telecommunications; Wireless and Mobile Technologies; Wireless Information Networks and Systems
Abstract:
This paper explores how interfaces that fully use our ability to communicate through the visual, auditory, and tactile senses may enhance mobile interaction. The first step is to look beyond the desktop: we do not need to reinvent computing, but we need to recognize that mobile interaction does not benefit from desktop metaphors alone. The next step is to look at what we have at hand, and as we will see, mobile devices are already quite apt for multimodal interaction. The question is how we can coordinate information communicated through several senses in a way that enhances interaction. By mapping information over the communication circuit, the semiotic representation, and the sense applied for interaction, a framework for multimodal interaction is outlined that can offer some guidance for integration. By exemplifying how a wide range of research prototypes fit into the framework today, it is shown how interfaces communicating through several modalities may enhance mobile interaction tomorrow.