Augmented Reality (AR) is a technique in which a user views the real scene in which they are standing through their mobile device (tablet/smartphone), which overlays 2D or 3D virtual figures/objects on that scene. It is heralded as being of particular interest to heritage sites where, for example, parts of a building are missing.
iMuse has so far tried this out on a real site in only a very minor way (see First experiment with Augmented Reality), overlaying a 3D model of a cockerel ‘visiting’ the real hens in the chook run in the garden of the Museum of English Rural Life, Reading, and a virtual 3D cow ‘visiting’ a real model one.
The technique sounds particularly appealing for sites which do not lend themselves to displaying many physical objects, or where it is desired to interpret the site in different ways for visitors – for example to show the site as it might have been in different time periods.
iMuse’s initial idea for an experiment was to place the people from an early photograph of the veranda at Brock Keep, Reading, back on the veranda as the visitor viewed it from the garden. The technique used (see techie stuff, below, for more info) – pointing the mobile device’s camera at a black and white pattern over which the people would be superimposed – has so far proved likely to be impractical in reality, as the pattern would have to be very large and therefore intrusive on the physical site. Alternative techniques, for example using real objects in the scene to tell the system where to ‘place’ the people, may be feasible, but so far have gone beyond what appears practical on a low/no budget using open source software. We need to investigate this further, including the lighting levels required indoors, the possible use of geolocation outdoors, how such a technique can work when crowds surround the visitor or object, and the specification required of visitors’ mobile devices.
iMuse needs to find some real (but low-cost!) practical implementations of the technique in the wild to assure itself that AR is a feasible technology for a low-budget site.
The open source libraries AR.js and A-Frame were used to spot a simple black and white marker, onto which a 3D model of a cockerel (downloaded from Sketchfab in glTF format) was overlaid. While Android devices a few years old, and a five-year-old Windows laptop, were able to ‘find’ the pattern, older iOS devices were not – browser access to the camera requires iOS 11+, and older devices cannot be updated to it. This problem will go away with time of course, but currently, given the ubiquity of iPhones, it’s likely that quite a few visitor devices would not be able to take advantage of AR-based activities.
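For the curious, a page of this kind needs surprisingly little markup. The sketch below shows the general shape of an AR.js + A-Frame marker scene; the script versions, model path and scale are illustrative assumptions, not iMuse’s actual code, and ‘hiro’ is simply AR.js’s built-in sample pattern:

```html
<!-- Sketch of a marker-based AR.js + A-Frame page (paths/versions are placeholders) -->
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0;">
    <a-scene embedded arjs="sourceType: webcam;">
      <!-- When the camera sees the marker, the glTF model is drawn on top of it -->
      <a-marker preset="hiro">
        <a-entity gltf-model="url(models/cockerel.gltf)" scale="0.5 0.5 0.5"></a-entity>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```

Because it is all ordinary web technology, it runs in the visitor’s own browser with nothing to install – which is what makes it attractive to a low-budget site.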
Low lighting levels were problematic for some devices, and if the pattern was printed so the surface was a bit shiny, it was sometimes unreadable. An experiment creating the pattern from black and white velcro (the furry side!) yielded better results, as did placing several different patterns on a cube so that at least one was readable as the user moved around. This multi-marker technique will, however, require some refinement to stop the overlaid image ‘jumping’ due to slight offsets/inaccuracies in the placing of the markers.
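The cube approach amounts to declaring one marker per face, each showing the same model. AR.js also offers smoothing parameters on its markers which may help damp the ‘jumping’; the sketch below assumes custom `.patt` pattern files and illustrative smoothing values, and is not iMuse’s actual code:

```html
<!-- Sketch: one <a-marker> per cube face, with AR.js smoothing to reduce jitter -->
<a-scene embedded arjs="sourceType: webcam;">
  <a-marker type="pattern" url="patterns/face-1.patt"
            smooth="true" smoothCount="10" smoothTolerance="0.01">
    <a-entity gltf-model="url(models/cockerel.gltf)"></a-entity>
  </a-marker>
  <a-marker type="pattern" url="patterns/face-2.patt"
            smooth="true" smoothCount="10" smoothTolerance="0.01">
    <a-entity gltf-model="url(models/cockerel.gltf)"></a-entity>
  </a-marker>
  <!-- …and so on for the remaining faces of the cube -->
  <a-entity camera></a-entity>
</a-scene>
```

Smoothing averages the marker’s pose over recent frames, which steadies the model but makes it lag slightly when the visitor moves quickly – a trade-off that would need testing on site.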
Try it yourself
Print out this marker, or display it on a screen
In your mobile device’s browser go to:
You may be asked whether you will allow access to the camera. Then point your camera at the marker. If your mobile device is able, it should show the cockerel, and the view of the cockerel should alter appropriately as you move around.
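Given the iOS issue above, a page like this could warn visitors whose browsers lack camera access rather than showing a blank scene. A minimal sketch (the function name is ours; `nav` is passed in so the check can be exercised outside a browser):

```javascript
// Sketch: does this browser expose the camera API that marker-based AR needs?
// (getUserMedia arrived in iOS Safari with iOS 11, hence the failures on older iPhones.)
function supportsMarkerAR(nav) {
  return Boolean(
    nav &&
    nav.mediaDevices &&
    typeof nav.mediaDevices.getUserMedia === "function"
  );
}
```

In a real page you would call `supportsMarkerAR(navigator)` before loading the AR scene and show a friendly message when it returns false.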