Mobile Devices and Interfaces

Computer interfaces have come a long way in the half century or so that has seen them move from roomfuls of air-conditioned electronics to a chip that sits in your phone, your watch, or half a dozen different places in your car. Print terminals and teletypes have given way to CRT and LCD terminals, and portable computers have evolved from luggables to laptops to pads.

But how much has the human-computer interface changed? Despite the ubiquity of colour graphic screens, webcams, and built-in cameras and microphones, most of our work with computers remains centred on text. Graphical visualisations for web search, though researched for many years, are still not competitive, and alternatives to the keyboard tend to gravitate towards virtual keyboards rather than exploit paradigms appropriate to the new technologies and form factors.

And how well integrated are our portable devices with our fixed infrastructure? We are really just starting to see apps for our phones and pads that can interact with our laptops, our cars and our homes.

The Centre for Knowledge and Interaction Technologies (KIT) targets mobile applications both implicitly and explicitly across its programs, and this page highlights only those projects where the mobile work has developed a momentum of its own. See our pages on Assistive Technologies, Augmented Reality, Computer-supported Cooperative Work, Games and Simulation, Information Retrieval and Visualisation, Talking Thinking Teaching Heads, Autonomous Vehicles, Intelligent Robots & Adhoc/Resilient Mobile Communications for additional examples of mobile interaction technologies.

On this page we highlight features of our research that are specific to our mobile versions, as well as the particular advantages our technology brings to mobile applications.

 

Featured Product

Head X for Mobile (available commercially as Clevertar) brings a virtual character to life, offering a novel yet very natural and intuitive means for users to interface with a software application. Research has demonstrated that users intrinsically exhibit a strong preference for human-human interaction. Indeed, users treat computers as social actors: they respond in social ways to computers when provided with even the most basic social cues. Measures of relationship building, such as trust, have accordingly been observed to be significantly higher for computers with expressive human-like traits. Furthermore, the multimodality of such an interface can improve understanding, reduce ambiguity, and offer a means of conveying non-verbal information, which overall leads to a more human-like interaction that is also more efficient and more satisfying.

 
By emulating relevant aspects of human-human interaction, the Head X for Mobile technology enables existing and new software applications to be enhanced in this way. Specifically, Head X acts as middleware between 3D modelling software, interactive content such as chatbots, and text-to-speech software. Head X interrogates the phonemes contained in the text-to-speech output and renders these as visual phonemes (visemes), so that the 3D model’s mouth displays realistic, synchronised lip movements as the audio output is delivered. This is then blended with non-verbal information through facial expressions and the simulation of appropriate behavioural dynamics. A central event manager maintains oversight of both input and output to ensure the resulting interaction feels ‘natural’.
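
To make that pipeline concrete, the minimal Python sketch below shows the general shape of such a system: a text-to-speech stage that reports phoneme timing, a phoneme-to-viseme mapping, and a central event manager that keeps expression, lip movement and audio in step. It is illustrative only; the names (synthesise, to_visemes, EventManager, the stub renderer and audio classes) are hypothetical stand-ins, not the actual Head X API, and a real system would use a far richer viseme set and continuous blending rather than simple keyframes.

    # Hypothetical sketch of a phoneme-to-viseme pipeline; names are illustrative
    # only and do not reflect the actual Head X API.

    # Minimal phoneme-to-viseme table; a real system uses a much richer mapping.
    PHONEME_TO_VISEME = {"M": "closed", "AA": "open", "IY": "wide", "F": "lip_teeth"}

    def synthesise(text):
        """Stand-in for a TTS engine: returns audio bytes plus timed phonemes."""
        # Each tuple is (phoneme, start_seconds, duration_seconds); the stub
        # ignores the input text and returns fixed example data.
        return b"<audio>", [("M", 0.00, 0.08), ("AA", 0.08, 0.15), ("M", 0.23, 0.10)]

    def to_visemes(timed_phonemes):
        """Map timed phonemes onto viseme keyframes for the model's mouth."""
        return [(PHONEME_TO_VISEME.get(p, "neutral"), t, d) for p, t, d in timed_phonemes]

    class EventManager:
        """Central coordinator: blends lip-sync, expression and audio output."""
        def __init__(self, renderer, audio):
            self.renderer, self.audio = renderer, audio

        def say(self, text, expression="neutral"):
            audio, phonemes = synthesise(text)
            self.renderer.set_expression(expression)           # non-verbal channel
            self.renderer.queue_visemes(to_visemes(phonemes))  # synchronised lips
            self.audio.play(audio)                             # audio delivered in step

    class StubRenderer:
        def set_expression(self, e): print("expression:", e)
        def queue_visemes(self, k): print("visemes:", k)

    class StubAudio:
        def play(self, a): print("playing", len(a), "bytes of audio")

    EventManager(StubRenderer(), StubAudio()).say("Hello, welcome to Flinders!", expression="smile")

The key design point, as described above for Head X itself, is that a single coordinator sees both the speech output and the non-verbal channels, so lip movements, expression and audio remain synchronised.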

Head X for Mobile has the following benefits relative to existing options:

  • It enables a more natural, intuitive and satisfying interaction than is possible through text or speech alone, and can be less intimidating to use than conventional interfaces.
  • It allows for dynamic content: speech and animation need not be pre-recorded (as is typically the case in computer games), so the software can, for example, speak your name and respond to unique situations.
  • It can enhance communication through affect and other non-verbal attributes, e.g. conveying the urgency or importance of a message through intonation and expression. This has a strong impact on the user and prompts quicker responses, because affective information is processed by humans in a cognitive channel parallel to non-affective information.
  • It has a vast range of application possibilities due to its ability to integrate with other software. This extends from offering new ways of interacting with conventional software to novel applications relying on human-like traits (e.g. truly personal assistants).
  • It greatly simplifies creating a human-like agent for any application; the technology is self-contained and can be used by any mobile application developer. Users have been shown to attribute more intelligence and trustworthiness to such agents than to conventional software, which is particularly relevant for applications in marketing or where expertise needs to be conveyed.
  • It requires only the hardware built into current, readily available smartphones and tablets, putting it within reach of millions of users and allowing it to benefit from rapid growth in this area.

Current Research

Please turn off your phone! This common request at the start of a lecture or seminar may be a thing of the past. The mobile phone can be used to interact with the lecturer and other students, including anonymous participation in surveys or in the didactic questions that punctuate a good lecture, delivering higher participation rates than a mere show of hands!

Encouraging proactive student engagement in lectures through collaborative note-taking on mobile devices is a project discussed in detail in the context of collaborative and social computing. In this project, students assist each other in taking notes on what the lecturer is saying, noting down questions they need to follow up, and copying diagrams off the board, jointly capturing as much as possible from the talk while also helping each other direct their learning experience with appropriate comments and questions.

Of course, with a mobile phone or pad, learning can move out of the classroom, and consistent with Flinders' routine streaming of lecture video, students watching a live video link can actually participate in the learning experience.

That's not all! Our Clevertar Thinking Head avatars are also available to help students find their way around campus, and new versions are being developed that will help students remember to do their assignments and provide encouragement and support as part of a total learning experience.

Spin-outs & Products


  • Open Day Buddy – download Matthew Flinders onto your iPhone or Android to show you around Flinders, personally!
 

Coming...

  • Study Buddy – fun, advice and encouragement for undergraduate students at Flinders University.