
Thinking about a text-free touchscreen interface

iMuse is part of the Arts Council England World Stories project in the Ure Museum of Greek Archaeology. Working with a student panel of 14–16-year-olds from three schools, Guja Bandini, the Museum’s education officer, and professional animator Steve Simons, iMuse is tasked with providing an iPad app. This will bring together the material the project produces, interpreting myths in a fun and engaging way based on an object within each display case. The overall project is about young people engaging with the objects in novel ways, with iMuse taking a particular interest in ensuring accessibility and inclusion are considered. Having an app is a means to an end, not a primary aim, so we’d agreed to use the mini web app previously tried in both the Museum of English Rural Life and the Ure. This was initially designed for use with QR codes on object labels, with a simple, layered interface using symbols and only a modicum of text. In the Ure, it could be used alongside a printed map.
While the interface did seem to provide a reasonably accessible way into finding out about objects (http://www.emeraldinsight.com/journals.htm?articleid=17068126&ini=aob), several things have set us experimenting again.
  • a comment from a teacher-participant that text on the main pages could prove a barrier
  • the decision by the Panel that Sophie the owl should act as guide in some way
  • the difficulty of interpreting a map
  • observing that visitors will experiment with a touchscreen without much instruction (e.g. the Ladybird book in the MERL)
  • the open invitation to create whatever materials you like about an object/myth (i.e. unknown numbers/types will arrive)
So, we’ve stepped away from mimicking the old ways (buttons that look as though you are controlling something like a cassette recorder). Instead we are experimenting with an interface which initially has no written text, but has photos of the actual display cases to help orientation. Sophie, as the cursor/guide, follows the visitor’s finger, flying past these and settling on an object when requested. The visitor decides which bits of info they want to look at and can easily ‘fly back/forward’ to other cabinets.
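Since this is a web-type app, the “Sophie follows the finger” behaviour could be sketched in plain JavaScript along these lines. This is only an illustrative sketch, not the project’s actual code: the element id, the easing fraction, and the `followStep` helper are all hypothetical. The idea is that each animation frame moves the owl a fraction of the way toward the last touch point, so she trails the finger with a smooth, owl-like lag rather than snapping to it.

```javascript
// Pure easing step: move a fraction `alpha` of the way toward the target.
// Keeping this separate from the DOM wiring makes the motion easy to tune and test.
function followStep(pos, target, alpha) {
  return {
    x: pos.x + (target.x - pos.x) * alpha,
    y: pos.y + (target.y - pos.y) * alpha,
  };
}

// Browser wiring (guarded so the helper above can also run outside a browser).
// The 'sophie' element id is a hypothetical sprite for the owl.
if (typeof document !== 'undefined') {
  const sophie = document.getElementById('sophie');
  let pos = { x: 0, y: 0 };
  let target = { x: 0, y: 0 };

  // Record the latest touch position as the target Sophie flies toward.
  document.addEventListener('touchmove', (e) => {
    const t = e.touches[0];
    target = { x: t.clientX, y: t.clientY };
  });

  // Each frame, ease 15% of the remaining distance toward the finger.
  function animate() {
    pos = followStep(pos, target, 0.15);
    sophie.style.transform = `translate(${pos.x}px, ${pos.y}px)`;
    requestAnimationFrame(animate);
  }
  animate();
}
```

A smaller `alpha` makes Sophie lazier and more bird-like; an `alpha` of 1 would pin her rigidly to the finger, which is exactly the “old ways” feel we are trying to avoid.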
It’s not sensible to decide the exact interface until we know what material is going to be provided, but already we have some good ideas coming in about highlighting objects, having audio/visual ‘pop-up’ instructions, ensuring any text-based items have audio versions, and giving any visual items audio descriptions. There’s masses to think about here, with potential for more use of media such as signing or captions on video. There are also practical considerations, not least iMuse’s very limited technical coding ability and our requirement that this remains a web-type rather than native app. The good thing is there is time for us all to discuss the possibilities and to do some trialling before the launch in early summer. All comments about the interface are welcome.