Project explores the 'internet of things'

May 05, 2010

(PhysOrg.com) -- Researchers from University College London have developed a digital tool that allows people to attach memories to objects in the form of text, audio or video.

The team at UCL's Centre for Advanced Spatial Analysis (CASA) is part of the 'Tales of Things' project, a collaboration between five universities.

'Tales of Things' encourages users to 'tag' objects with digital media using the sort of technology found in Oyster Cards and bar codes.

Users can upload an image of the object and an associated memory in the form of text, audio or video to the project's website - talesofthings.com - or via a dedicated application.

Once the user has entered this information, they receive a unique barcode which they can attach to the object.

Objects are tagged using RFID tags and QR Codes, which are used in products such as the Oyster Card and on consumer goods.

This code can be read by photographing it with a mobile phone or webcam, linking the object back to its entry on the website.
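In practice, a tag like this can work by simply encoding the URL of the object's entry on the project website, so any phone that decodes the code is led straight back to the memories stored there. Below is a minimal sketch using the open-source Python qrcode library; the identifier and URL pattern are assumptions for illustration, not the project's actual scheme.

```python
# Minimal sketch: encode a (hypothetical) object entry URL as a QR Code
# image that can be printed and attached to the object.
import qrcode  # pip install qrcode[pil]

# Hypothetical identifier assigned when the object's image and memory
# are uploaded to the website.
object_id = "1234"
entry_url = f"http://talesofthings.com/things/{object_id}"  # assumed URL pattern

# Generate the QR Code; scanning it with a phone camera resolves the URL
# and links the physical object back to its entry on the site.
img = qrcode.make(entry_url)
img.save(f"tag_{object_id}.png")
```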

The tags will enable future generations to have a greater understanding of the object’s past and offer a new way of preserving social history.

Researchers hope the project will offer people a new way to place more value on their own objects in an increasingly disposable economy.

The project explores the implications of the 'internet of things' - the idea of a network of objects traceable at any time.

Dr Andy Hudson-Smith, from the UCL Centre for Advanced Spatial Analysis, said: "CASA has built the technology for the project, allowing the construction of a database suitable for the 'internet of things'.

"It has developed the concept of making QR Codes read/writable for the development of memories and custom built an iPhone application to interact with any tagged object.

"UCL has tagged BBC Broadcasting House as part of the Radio 4 show Click On - the first building in the world to record the memories of its occupants.

"The project has notable potential, in terms of both global research and commercial opportunities."

The project is part of research run by TOTeM, a collaboration between UCL, Edinburgh College of Art, Brunel University, the University of Dundee and the University of Salford.


More information: talesofthings.com/
