
Bio: Dr George Ghinea is a Reader in Computing at Brunel University.

His interests lie in perceptual multimedia communications, and he leads a group of six researchers exploring ways of furthering our understanding of how we interact with computers.

Enriching Experience
The idea of computer-generated smell, known as olfactory data, is fast becoming a reality. It's an area of multimedia research that is generating interest right across the industry, as its possibilities seem endless. The idea that, through the use of multiple media objects, consumers could be reached on a multi-sensory level is a tantalising one. Traditionally, media objects have engaged just two of our senses, namely our sight and hearing. Not for much longer, it seems.

Olfaction, or smell, is one of the last challenges that multimedia applications have yet to conquer. The demand for increasingly sophisticated multimodal systems and applications that are highly interactive and multi-sensory in nature has led to the use of new media and new user interface devices in computing, and olfactory data is one such media object currently generating a great deal of interest. In a multimodal system, information may be conveyed by the relationships between the combined media objects, as well as by the individual media objects themselves. Benefits of multimodal systems include synergy, redundancy and an increased bandwidth of information transfer.

Using olfactory data in computing is not without its challenges, however. One of these is that smell, unlike other media objects, cannot be stored in a computer data format. Instead, olfactory data has to be held in an external peripheral device attached to a computer, with emission generated by the computer triggering the device's output of smell. Nonetheless, areas of computing that have seen significant use of olfactory data over the years include virtual reality, multimodal displays and alerting systems, and media and entertainment systems.

Sensory overload
If we develop an effective way to involve scent in the user's experience, perception of a multimedia presentation or application will be enriched in ways previously unimaginable. A benefit of such multimodal information is that it appeals to several of our senses at the same time, making the user experience more complete and absorbing. Our visual and auditory senses are already overburdened with the visual and auditory cues we are required to respond to in computer information systems; our attention spans are short, and new techniques are needed to grab our awareness.
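The storage-and-trigger model described above can be sketched in code. Everything in this sketch is a hypothetical assumption for illustration: real scent peripherals each use their own proprietary protocols, and the class and slot layout here are invented, not any actual device's API.

```python
# Hypothetical sketch of the storage-and-trigger model for a scent
# peripheral: scents exist as physical cartridges in an external device,
# and the computer can only trigger emission of what is already loaded.

class ScentDevice:
    """Simulated external scent peripheral (all names are illustrative)."""

    def __init__(self, cartridges):
        # Map cartridge slot number -> scent name loaded into the device.
        self.cartridges = dict(cartridges)
        self.log = []  # record of emission commands sent to the device

    def emit(self, scent, duration_ms=500):
        # The computer cannot synthesise an arbitrary smell; it can only
        # trigger a cartridge that is physically present in the device.
        slot = next((s for s, name in self.cartridges.items()
                     if name == scent), None)
        if slot is None:
            raise ValueError(f"scent '{scent}' is not loaded in the device")
        self.log.append((slot, scent, duration_ms))
        return slot

device = ScentDevice({0: "coffee", 1: "lavender", 2: "citrus"})
print(device.emit("coffee"))    # triggers slot 0
try:
    device.emit("vanilla")      # not loaded, so the device cannot produce it
except ValueError as err:
    print(err)
```

The point the sketch makes is the asymmetry with other media: a video codec can decode any frame it is given, but a scent device is bounded by its physical cartridge inventory.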
Applications used to gain the user's attention, more popularly known as notification or alerting systems, represent one of the areas in which olfactory data output has been found to be quite beneficial. Research now distinguishes between odour output in which the smell released is related to the information to be conveyed (olfactory icons) and smell output that bears only an abstract relationship to the data it expresses (also known as smicons).

Another problem encountered when developing olfactory display devices is that, whilst it is usually easy to combine other data formats (such as sound and video) to produce a desired output, it is extremely difficult to do the same with odour, as no standard additive process or model for combining different scents exists. Consequently, more often than not, smell-generating devices are limited to producing the specific smells that have been loaded into their individual storage systems; the technology has its limits at the moment. Nonetheless, there has been some olfactory data usage in the computing field over the years, with a number of researchers building their own smell-generating systems and others relying on the few that are commercially available. More recently, researchers in Japan [3] have developed a smell-generating device which works by combining chemicals to produce the desired scents as and when required, although this device is still at the development stage and not yet commercially available.

Smell of success
Researchers in Scotland [1] have also used olfactory data for multimedia content searching, browsing and retrieval, more specifically to aid the search of digital photo collections. In their experiment, they compared the effects of text-based and smell-based tagging of digital photos by users searching and retrieving photos from a digital library. To achieve this, they developed an olfactory photo browsing and searching tool, which they called Olfoto.
Smell and text tags were created from participants' descriptions of their photos (personal photographs of the participants were used), and participants then used these tags to label their photos. Fragra, on the other hand, is a visual-olfactory virtual reality game [2] that enables players to explore the interactive relationship between olfaction and vision. The objective of the game is for the player to identify whether the visual cues they experience correspond to the olfactory ones presented at the same time. In the game, there is a mysterious tree that bears many kinds of food. Players can catch these food items by moving their right hands, and when they catch an item and move it in front of their nose, they smell something that does not necessarily correspond to the food item they are holding. A similar interactive computer game, Cooking Game, has also been created at the Tokyo Institute of Technology [3].

There is real potential in this area, and the use of olfaction in the multimedia experience is likely to grow and become more sophisticated. Whilst there are undoubtedly challenges that need to be overcome, it is precisely these that spur on research scientists worldwide in the quest to create ever more complex and exciting

multimedia and multimodal applications for our use and enjoyment. The complete sensory experience is within our grasp.
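Underneath the novelty of smell-based tagging, the retrieval idea Olfoto explores is familiar: a tag, whether a word or a scent, is simply an index key pointing at photos. The sketch below illustrates that idea only; the data, tag names and function are invented for illustration and are not Olfoto's actual implementation.

```python
# Rough sketch of tag-based photo retrieval in the spirit of Olfoto:
# whether a tag is textual ("family") or scent-based ("cut-grass"),
# retrieval reduces to an inverted index from tags to photos.
# All data and names here are illustrative assumptions.

from collections import defaultdict

def build_index(photo_tags):
    """Invert {photo: tags} into {tag: set of photos}."""
    index = defaultdict(set)
    for photo, tags in photo_tags.items():
        for tag in tags:
            index[tag].add(photo)
    return index

photos = {
    "beach.jpg":    {"sea-breeze", "holiday"},      # smell + text tags
    "birthday.jpg": {"candle-smoke", "family"},
    "garden.jpg":   {"cut-grass", "family"},
}

index = build_index(photos)
print(sorted(index["family"]))    # -> ['birthday.jpg', 'garden.jpg']
print(sorted(index["cut-grass"]))  # -> ['garden.jpg']
```

What the smell-based interface changes is not the data structure but the way a user recalls and selects a tag, which is exactly the question the Olfoto study set out to examine.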

REFERENCES
1. Brewster, S.A., McGookin, D.K. & Miller, C.A. 2006, "Olfoto: Designing a smell-based interaction", CHI 2006: Conference on Human Factors in Computing Systems, p. 653.
2. Mochizuki, A., Amada, T., Sawa, S., Takeda, T., Motoyashiki, S., Kohyama, K., Imura, M. & Chihara, K. 2004, "Fragra: a visual-olfactory VR game", SIGGRAPH '04: ACM SIGGRAPH 2004 Sketches, ACM Press, New York, NY, USA, p. 123.
3. Nakamoto, T., Cooking game with scents, Tokyo Institute of Technology. Available: http://www.titech.ac.jp/news/e/news061129.html
