
Context-Aware Pervasive Computing

S. Neelavathy Pari
Assistant Professor
MIT CAMPUS, Anna University

What is context?
Context of a person, software agent, thing, device, situation, etc.
Context: location, time, etc. -> information that can be used to recognize a situation.
Related concept: the situation.

What is a context-aware pervasive computing system? A system that can respond intelligently to contextual information about the physical world, acquired via sensors and other information sources.
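As an illustration of the idea above, context can be represented as structured data. This is a hypothetical sketch; the field names (user, location, time, activity) are illustrative, not from any specific framework.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Context:
    user: str          # whose context this is
    location: str      # e.g. a room or place identifier
    time: datetime     # when the observation was made
    activity: str      # a recognized situation label, e.g. "in a meeting"

ctx = Context(user="alice", location="MIT Campus, Room 101",
              time=datetime(2024, 1, 15, 9, 30), activity="lecturing")
print(ctx.location)  # → MIT Campus, Room 101
```

A system that recognizes situations would populate such records from sensor observations.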

Context-Aware Pervasive Computing Applications


Context-Aware Services
Context-Aware Devices/Appliances/Artifacts
Context-Aware Information Retrieval
Context-Aware Security
Context-Aware Messaging
Context-Aware Environments (rooms, etc.)

Basic context-aware system: Sensing + Reasoning + Acting

What are they? Systems with the capability to sense what is happening or changing in their environment, think, and take appropriate actions to adapt to the changes.
Why? To reduce the burden of excessive user involvement and to provide proactive, intelligent assistance; to realize the era of smart devices and smart environments.

When? Who?
Work on ubiquitous computing began at the Xerox Palo Alto Research Center (PARC) in 1987. Bob Sprague, Richard Bruce, and others proposed developing wall-sized displays. Mark Weiser coined the phrase "ubiquitous computing" around 1988. The concept of pervasive computing emerged out of ubiquitous computing research at Xerox PARC and elsewhere in the early 1990s. The term "context-aware" was first used by Schilit and Theimer in their 1994 paper.

Ongoing research
Xerox's Palo Alto Research Center (PARC), for example, has been working on pervasive computing applications since the 1980s. IBM's project Planet Blue is largely focused on finding ways to integrate existing technologies with a wireless infrastructure. Carnegie Mellon University's Human-Computer Interaction Institute (HCII) is working on similar research in its Project Aura, whose stated goal is "to provide each user with an invisible halo of computing and information services that persists regardless of location." The Massachusetts Institute of Technology (MIT) has a project called Oxygen, named after that substance because they envision a future in which ubiquitous computing devices are as freely available as oxygen.

Principles and Functionality


According to [R2], a context-aware pervasive system can be viewed as having three basic functionalities:
Sensing
Thinking (metaphorically)
Acting
Sensing (Context Acquisition)


Physical sensors
Virtual sensors
Logical sensors

Sensor Fusion
Combining the context values observed by multiple sensors (i.e., sensor 1 + sensor 2 + sensor 3 + sensor 4) gives a more comprehensive view of the physical world than any single sensor alone.
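The fusion step above can be sketched in a few lines. The weighted-average scheme below is an assumption for illustration, not a prescribed fusion method.

```python
def fuse(readings):
    """Combine (value, confidence) pairs from multiple sensors into a
    single estimate, weighted by each sensor's confidence."""
    total_weight = sum(conf for _, conf in readings)
    return sum(value * conf for value, conf in readings) / total_weight

# Four temperature sensors observing the same room:
readings = [(21.0, 0.9), (22.0, 0.8), (20.5, 0.5), (23.0, 0.3)]
print(round(fuse(readings), 2))  # → 21.46
```

Real deployments use more sophisticated schemes (e.g., Kalman filtering), but the principle is the same: several imperfect observations yield one more reliable context value.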

Thinking
Utilization of the data obtained: making sense of it to obtain knowledge. Such knowledge, together with other (perhaps built-in) knowledge, can then be used to infer further knowledge about the context.
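A minimal sketch of this "thinking" step: simple rules combine sensed context with built-in knowledge to infer a higher-level situation. The rules, labels, and thresholds here are invented for illustration.

```python
def infer_situation(ctx):
    """Derive a situation label from raw context values using
    built-in rules (illustrative thresholds)."""
    if ctx["location"] == "office" and ctx["noise_db"] < 40 and ctx["people"] >= 2:
        return "meeting"
    if ctx["location"] == "office" and ctx["people"] == 1:
        return "working alone"
    return "unknown"

sensed = {"location": "office", "noise_db": 35, "people": 3}
print(infer_situation(sensed))  # → meeting
```

In practice the reasoning layer may use ontologies, logic programming, or machine learning rather than hand-written rules, but the input/output shape is the same: context values in, situations out.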

Acting
Context information has been gathered and situations recognized; now it is time to take action. Actions to be taken are application-specific, and an action might itself be to perform further sensing.
Considerations:
Performance - the action should be performed in time
Control - the user should be able to retain control

Architecture
Design considerations for building context-aware systems relate to the previously mentioned three phases of sensing, thinking, and acting. The method of context-data acquisition is very important when designing context-aware systems because it predefines the architectural style of the system at least to some extent.

Methods of context-data Acquisition


Direct sensor access
Used in devices with sensors locally built in. The client software gathers the desired information directly from these sensors. It is not suited for distributed systems

Middleware infrastructure
Introduces a layered architecture with the intention of hiding low-level sensing details

Context server
Gathering sensor data is moved to a context server to facilitate concurrent access by multiple clients.
[Matthias Baldauf et al., "A survey on context-aware systems," Int. J. Ad Hoc and Ubiquitous Computing, Vol. 2, No. 4, 2007]

Middleware Infrastructure based architecture


Layers (bottom to top): Sensors (sensing) -> Raw data retrieval -> Preprocessing -> Storage/Management (thinking) -> Application (acting)
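The layering can be sketched as a chain of objects, each talking only to the layer below, so that low-level sensing details stay hidden from the application. Class and method names here are illustrative assumptions, not from any specific middleware.

```python
class Sensor:                       # bottom layer: a physical sensor
    def read(self):
        return 21.5                 # raw value, e.g. temperature in Celsius

class RawDataRetrieval:             # layer 2: raw data retrieval
    def __init__(self, sensor):
        self.sensor = sensor
    def get_raw(self):
        return {"temperature": self.sensor.read()}

class Preprocessing:                # layer 3: interpret raw data as context
    def __init__(self, retrieval):
        self.retrieval = retrieval
    def get_context(self):
        raw = self.retrieval.get_raw()
        return "warm" if raw["temperature"] > 20 else "cold"

class Storage:                      # layer 4: storage and management
    def __init__(self, preprocessing):
        self.preprocessing = preprocessing
        self.history = []           # keeps past context for later queries
    def query(self):
        ctx = self.preprocessing.get_context()
        self.history.append(ctx)
        return ctx

# top layer: the application queries context without touching sensors directly
storage = Storage(Preprocessing(RawDataRetrieval(Sensor())))
print(storage.query())  # → warm
```

Swapping the `Sensor` class for a different device leaves every layer above unchanged, which is exactly the benefit the middleware style buys.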

Scaling up Context-Aware Computing


How can we live efficiently and effectively in increasingly complex, dense, dynamic, inhabited environments, i.e. places?
-Hypothesis: Place knowledge bases can help...

How can we enhance the physical world with digital information that adds value and meaning to our lives?
-Hypothesis: Physical annotation systems can help...

How can we interact with the ever-increasing complexity of collections of devices and ubiquitous services?
-Hypothesis: Task-oriented abstraction can help...

Smart Places: Place as Context, Context-Aware Mobile Applications for a Place


(with Anh-Tuan Nguyen, Seng Loke, Torab Torabi, Hongen Lu)
Place? A meaningful location/space... at varying granularities.
What can we do if we have detailed knowledge about places? What knowledge about a place should we have? (static, dynamic/real-time, historical, etc.)
Outline: Place-Based Virtual Communities; the scenario; system architecture; the PlaceAware ontology

Place-Based Virtual Communities
A place-based community is one where:
Communications happen at a place and are mediated by computers
The community is supported by place-specific digital services
Environment awareness is supported by sensing devices
Ontology-based shared knowledge about a place: people (and their relationships), places, objects (and their relationships), etc.; history (who was here?), movements, etc. (a place knowledge base)

Scenarios
Friend Finder: by executing a SPARQL query against the place knowledge base.
What happened there/here? What is happening there/here? Who was there/here?
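The Friend Finder idea can be illustrated as a query over a small knowledge base. The real system issues SPARQL queries via the ontology tools named below; here an in-memory dictionary stands in for the triple store, and all names and facts are invented for illustration.

```python
# Toy place knowledge base: who is where, and who is friends with whom.
place_kb = {
    "at_place": {"alice": "library", "bob": "library", "carol": "cafe"},
    "friend_of": {"alice": {"bob", "carol"}},
}

def find_friends_here(user):
    """Return the user's friends currently at the same place,
    analogous to a SPARQL query joining location and friendship facts."""
    here = place_kb["at_place"][user]
    friends = place_kb["friend_of"].get(user, set())
    return sorted(f for f in friends if place_kb["at_place"].get(f) == here)

print(find_friends_here("alice"))  # → ['bob']
```

The "who was there?" queries work the same way, but against the historical part of the place knowledge base.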

System Architecture
The system uses JADE/LEAP to support agents on mobile devices.

In standalone mode, JSR-82 is used for the Bluetooth API.

Protégé, Jena, and the Sesame API are used for SPARQL queries.

A snapshot of the PBVC Ontology in a tool called PlaceAware

PlaceAware software

Finding people

Finding Things (Devices, etc.)


When friends' devices appear, their names and IDs are listed. If we choose the list-all mode, not only friends' devices but also other devices can be detected.

Location-Based Social Networking

Annotating the Physical World


Associate context with notes - markup for the physical world
Associate things with notes
Associate people with notes
Associate space (and points in physical space) with notes

Indoor annotation
Outdoor annotation

Associating Annotations with Things/Objects & Space
A single object
Collections of objects
A single point in physical space
Collections of points in physical space: e.g., a line, an area (defined by a line joining a collection of points)
Annotating physical world

A software system to:
Allow users to leave annotations/notes - to attach a note to a thing, a collection of things, a point in space, or a collection of points in space (or its semantic equivalent, i.e., a building, a field, etc.) in the right context...
Allow users to retrieve annotations/notes in the right context...
(where Context = Time x ObjectID/LocationID x UserID x NearbyObjects x ...)
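The annotation model above can be sketched as a small data structure: a note attached to one or more targets, together with elements of the context tuple. All field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Annotation:
    note: str
    target_ids: list                          # object/location identifiers
    author: str                               # UserID component of the context
    time: datetime                            # Time component of the context
    nearby_objects: list = field(default_factory=list)

def retrieve(annotations, target_id):
    """Retrieve notes in the right context: here, filtered by target;
    a fuller system would also filter on time, user, and nearby objects."""
    return [a.note for a in annotations if target_id in a.target_ids]

notes = [Annotation("Door code changed", ["room-ps1-101"], "alice",
                    datetime(2024, 3, 1, 9, 0))]
print(retrieve(notes, "room-ps1-101"))  # → ['Door code changed']
```

Attaching a note to a collection of points (a line, an area) would simply put several identifiers, or a region descriptor, in `target_ids`.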

Things can get Complex

Spraying RFID tags all over a city? Ucode (in the Tokyo Ubiquitous Technology Project). But start with a zoo:
http://www.tokyo-zoo.net/english/ueno/uc/mail.h

Or in you? Swallow an RFID tag and annotate yourself...


Kodak has developed an edible RFID tag, which they claim has many important benefits. The tags are coated with a thick layer of soft gelatin, which takes a while to dissolve; the tags themselves are intentionally fragile and dissolve when exposed to gastric acids in the stomach. Such tags could also be used in artificial joints, to help notify doctors when a replacement may be needed, or on pills, so that nurses can monitor whether a patient has taken their medication.
http://www.trendhunter.com/trends/ei

What is a smart space?

Making Spaces and Things Taskable with


Seamlessly integrating computational elements into the fabric of everyday life [Weiser 1991]. Everyday objects and environments are aware of their surroundings and peers, and behave smartly.

The aim:
Support our activities, complement our skills, and add to our pleasure, convenience, and accomplishments.

Invisible Computers?
"The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." - Mark Weiser

Can we separate the interface from the computer?

Usability problems with smart spaces

Usability problems

Invisibility & overload of features
Technologies blend into environments. Devices and services are frequently added to and removed from the spaces. One device -> tens of features; different combinations of devices -> hundreds of features.

Inconsistency of user interfaces: brand identification, product differentiation [Rich 2009; Oliveira 2008]
Inconsistency of task executions: the same tasks require different operations/procedures when executed in different smart spaces.
How to tackle these usability problems? Our approach is based on task-oriented computing.

Task-oriented scenarios
Pervasive city scenario
It's 7 p.m., it's raining, and you're walking in the centre of the city. You consult your phone and it suggests "Dinner?", "Taxi?", "Bus?". Selecting "Dinner?" will present restaurants you're apt to like and even dishes that you may want.

Pervasive university campus scenario

You're driving, approaching the Anna University campus; the LCD in your car suggests "Campus map?", "Find a place?", "Parking spot?". Selecting "Parking spot?" will guide you to a parking spot.

Pervasive personal office scenario

You enter your office. The lighting, heating, and cooling levels are automatically adjusted based on your electronic profile. The coffeemaker works to give you a cup of hot white coffee. You look at your smartphone, and it tells you what tasks you can do with your phone here. You point your phone at, or bring it near, a device/artifact, and the phone shows you a list of tasks you can do with it (e.g., the device is an audio system; the artifact is a research paper, a chair, etc.).

Task-Oriented Computing
Our approach is based on task-oriented computing [Wang et al. 2000]:
A task is a user's goal or objective.
Users interact with and think of computing in terms of tasks instead of applications' or devices' functionality.
Users focus on the tasks at hand rather than on the means for achieving those tasks [Masuoka 2003].
Application function is modeled as tasks and subtasks.
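The task/subtask modeling and location-based recommendation described above can be sketched as follows. The task names echo the scenarios earlier in the talk; the location mapping and class design are invented for illustration, not the framework's actual API.

```python
class Task:
    """A user goal, possibly decomposed into subtasks."""
    def __init__(self, name, subtasks=None):
        self.name = name
        self.subtasks = subtasks or []

# Tasks available in different places (cf. the city and campus scenarios):
tasks_by_location = {
    "city centre": [Task("Dinner?", [Task("Pick restaurant"), Task("Order")]),
                    Task("Taxi?"), Task("Bus?")],
    "campus": [Task("Campus map?"), Task("Find a place?"),
               Task("Parking spot?")],
}

def recommend(location):
    """Recommend the tasks relevant to the user's current location."""
    return [t.name for t in tasks_by_location.get(location, [])]

print(recommend("campus"))  # → ['Campus map?', 'Find a place?', 'Parking spot?']
```

Pointing-based recommendation works the same way, except the lookup key is the identity of the device or artifact pointed at, rather than the location.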

Approach: a task-oriented framework

Location-based task recommendation


Location = Anna University Campus
Location = Building PS1

Pointing based Task Recommendation

Current Implementation

Location = Personal Office

Current implementation

Future work, Issues


Design a comprehensive task description language
Develop a graphical editor for authoring task descriptions
Extend the task execution engine
Develop mechanisms for effectively publishing and retrieving task models: indexing, matching, searching, composing, and recognizing task models
Address conflicts of task executions in multi-user environments
Transparent/translucent task execution...
Macro recording of complex tasks...
Performance
Stability
