
Surface computer

A surface computer is a computer that interacts with the user through the surface of an ordinary object, rather than through a monitor and keyboard. The category was created by Microsoft with Surface (codenamed Milan), a surface computer built entirely around a multi-touch interface in a coffee-table-like form, unveiled on 30 May 2007. Users can interact with the machine by touching or dragging their fingertips and objects such as paintbrushes across the screen, or by setting real-world items tagged with special bar-code labels on top of it.

The Surface is a horizontal display in a table-like form. Somewhat like the iPhone, the Surface has a screen that registers multiple touches and uses them to navigate multimedia content. Unlike the iPhone, which uses the electrical properties of fingers to detect touch, the Surface uses a system of infrared cameras to detect input. Uploading digital files only requires placing each object (e.g. a Bluetooth-enabled digital camera) on the Surface. People can physically move pictures across the screen with their hands, or even shrink or enlarge them.

The first units of the Surface will be information kiosks in the Harrah's family of casinos. Also receiving units will be T-Mobile, for comparing several cell phones side by side, and Sheraton Hotels and Resorts, which will use Surface to serve lobby customers in numerous ways.[1][2]

The Surface has a 2.0 GHz Core 2 Duo processor, 2 GB of memory, an off-the-shelf graphics card, a scratch-proof and spill-proof surface, a DLP projector, and five infrared cameras, as mentioned above. However, the expensive components required for the interface also give the Surface a price tag of between $12,500 and $15,000.[3]

See also

Surface computing
Multi-touch
TouchLight
Jeff Han
FTIR[4]

References
1. ^ Grossman, Lev (14 June 2007). "Feeling out the Newest Touch Screens". TIME. Retrieved 2007-06-16.
2. ^ "Microsoft Launches New Product Category: Surface Computing Comes to Life in Restaurants, Hotels, Retail Locations and Casino Resorts". Microsoft. 29 May 2007. Retrieved 2007-06-16.
3. ^ "Microsoft Surface How To Buy". Microsoft. 2009. Retrieved 2009-05-01. [dead link]
4. ^ Jeff Han

External links

Microsoft Surface

Microsoft Surface

Developer(s): Microsoft
Initial release: April 17, 2008[1]
Stable release: Surface 2.0 (2011)
Development status: Commercial applications
Operating system: Surface: Windows Vista; Surface 2.0: Windows 7
Available in: English
Website: www.microsoft.com/surface/

Microsoft Surface (codename Milan) is a multi-touch product from Microsoft, developed as a combined software and hardware technology that allows a user, or multiple users, to manipulate digital content through gesture recognition. This could involve the motion of hands or physical objects. It was announced on May 29, 2007 at the D5 conference.[2] Targeted customers are in hospitality businesses, such as restaurants, hotels, retail and public entertainment venues, as well as the military for tactical overviews.

The preliminary launch was on April 17, 2008, when Surface became available for customer use in AT&T stores.[1] The Surface was used by MSNBC during its coverage of the 2008 US presidential election,[3] and is also used in Disneyland's future home exhibits, as well as various hotels and casinos. The Surface was also featured in the CBS series CSI: Miami and on EXTRA! entertainment news. As of March 2009, Microsoft had 120 partners in 11 countries developing applications for Surface's interface.[4]

On January 6, 2011, Microsoft previewed the latest version of Microsoft Surface at the Consumer Electronics Show (CES) 2011, simply named Microsoft Surface 2.0, which was built in partnership with Samsung.[5]

Contents
1 Overview
2 History
3 Features
4 PixelSense
5 Specifications
    5.1 Surface
    5.2 Surface 2.0
6 Applications development
7 Related Microsoft research projects
8 See also
9 References
10 External links

Overview
Microsoft Surface is a surface computing platform that responds to natural hand gestures and real-world objects. It has a 360-degree user interface and a 30 in (76 cm) reflective surface, with an XGA DLP projector underneath that projects an image onto its underside, while five cameras in the machine's housing record reflections of infrared light from objects and human fingertips on the surface. The surface is capable of object recognition and object/finger orientation recognition and tracking, and is both multi-touch and multi-user. Users can interact with the machine by touching or dragging their fingertips and objects such as paintbrushes across the screen, or by placing and moving physical objects. This paradigm of interaction with computers is known as a natural user interface (NUI). Surface has been optimized to respond to 52 touches at a time. During a demonstration with a reporter, Mark Bolger, the Surface Computing group's marketing director, "dipped" his finger in an on-screen paint palette, then dragged it across the screen to draw a smiley face. Then he used all 10 fingers at once to give the face a full head of hair.

Using specially designed barcode-style "Surface tags" on objects, Microsoft Surface can offer a variety of features: for example, automatically offering additional wine choices tailored to the dinner being eaten, based on the type of wine set on the Surface, or, in conjunction with a password, offering user authentication.

A commercial Microsoft Surface unit is $12,500 (unit only), whereas a developer Microsoft Surface unit costs $15,000 and includes a developer unit, five seats and support.

Partner companies use the Surface in their hotels, restaurants, and retail stores. The Surface is used to choose meals at restaurants, and to plan vacations and spots to visit from the hotel room. Starwood Hotels plans to allow users to drop a credit card on the table to pay for music, books, and other amenities offered at the resort. In AT&T stores, uses of the Surface include interactive presentations of plans, coverage, and phone features, as well as dropping two different phones on the table so the customer can view and compare prices, features, and plans. MSNBC's coverage of the 2008 US presidential election used Surface to share with viewers information and analysis of the race leading up to the election: the anchor analyzed polling and election results, viewed trends and demographic information, and explored county maps to determine voting patterns and predict outcomes, all with the flick of a finger. In some hotels and casinos, users can do a range of things, such as watch videos, view maps, order drinks, play games, and chat and flirt with people between Surface tables.

History

Demonstration using Microsoft Surface

The product idea for Surface was initially conceptualized in 2001 by Steven Bathiche of Microsoft Hardware and Andy Wilson of Microsoft Research.[6] In October 2001, DJ Kurlander, Michael Kim, Joel Dehlin, Bathiche and Wilson formed a virtual team to bring the idea to the next stage of development. In 2003, the team presented the idea to Microsoft chairman Bill Gates in a group review. Later, the virtual team was expanded and a prototype nicknamed T1 was produced within a month. The prototype was based on an IKEA table with a hole cut in the top and a sheet of architect's vellum used as a diffuser. The team also developed some applications, including pinball, a photo browser and a video puzzle. Over the next year, Microsoft built more than 85 early prototypes for Surface. The final hardware design was completed in 2005.

A similar concept was used in the 2002 science fiction movie Minority Report. As noted in the DVD commentary, director Steven Spielberg stated that the concept of the device came from consultation with Microsoft during the making of the movie. One of the film's technology consultants' associates from MIT later joined Microsoft to work on the Surface project.[7]

Surface was unveiled by Microsoft CEO Steve Ballmer on May 30, 2007 at The Wall Street Journal's 'D: All Things Digital' conference in Carlsbad, California.[8] Surface Computing is part of Microsoft's Productivity and Extended Consumer Experiences Group, which is within the Entertainment & Devices division.

The first few companies to deploy Surface will include Harrah's Entertainment, Starwood Hotels & Resorts Worldwide, T-Mobile and a distributor, International Game Technology.[9] On April 17, 2008, AT&T became the first retail location to launch Surface.[10] In June 2008, Harrah's Entertainment launched Microsoft Surface at the Rio iBar[11] and Disneyland launched it in Tomorrowland, in the Innoventions Dream Home.[12] On August 13, 2008, Sheraton Hotels introduced Surface in their hotel lobbies at 5 locations.[13] On September 8, 2008, MSNBC began using the Surface to work with election maps for the 2008 US presidential election on air, with MSNBC's political director, Chuck Todd, at the helm.

Features

Object recognition

Microsoft notes four main components as being important in Surface's interface: direct interaction, multi-touch contact, a multi-user experience, and object recognition.

Direct interaction refers to the user's ability to simply reach out and touch the interface of an application in order to interact with it, without the need for a mouse or keyboard. Multi-touch contact refers to the ability to have multiple contact points with an interface, unlike a mouse, which has only one cursor. Multi-user capability is a benefit of multi-touch: several people can orient themselves on different sides of the surface to interact with an application simultaneously. Object recognition refers to the device's ability to recognize the presence and orientation of tagged objects placed on top of it.

The technology allows non-digital objects to be used as input devices. In one example, a normal paint brush was used to create a digital painting in the software.[14] This is made possible by the fact that, in using cameras for input, the system does not rely on restrictive properties required of conventional touchscreen or touchpad devices, such as the capacitance, electrical resistance, or temperature of the tool used (see Touchscreen).

The computer's "vision" is created by a near-infrared, 850-nanometer-wavelength LED light source aimed at the surface. When an object touches the tabletop, the light is reflected to multiple infrared cameras with a net resolution of 1024 × 768, allowing the system to sense and react to items touching the tabletop.

Surface will ship with basic applications, including photos, music, a virtual concierge, and games, that can be customized for customers.[15] A unique feature that comes preinstalled with Surface is the pond-effect "Attract" application. Simply put, it is a "picture" of water with leaves and rocks in it (much like Microsoft Surface Lagoon, included in the Surface Touch Pack). By touching the screen, users can create ripples in the water, much like a real stream. Additionally, the pressure of touch alters the size of the ripple created, and objects placed into the water create a barrier that ripples bounce off, just as they would in real life.

PixelSense
PixelSense is a technology used in newer Surface devices. It allows recognition of fingers, hands, and objects that are placed on the screen, enabling vision-based interaction without the use of cameras. Sensors in the individual pixels in the display register what is touching the screen.

A step-by-step look at how PixelSense works:

1. An object is placed on the display.
2. An infrared backlight illuminates the object (through the optical sheets, LCD and protection glass).
3. Light reflected back from the object is registered by the sensors integrated in the pixels.
4. Values reported from all of the sensors are used to create a picture of what is on the display.
5. The picture is analyzed using image-processing techniques, creating a corrected image.
6. The corrected sensor image and information about the objects placed on the display are sent to the PC.
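Steps 4 and 5 of this pipeline amount to classic binary image analysis: threshold the per-pixel sensor values, then group adjacent above-threshold pixels into connected components, one per finger or object. The following is an illustrative sketch of that idea in Python, not actual PixelSense code; the threshold value and data layout are assumptions.

```python
from collections import deque

def find_touches(sensor, threshold=128):
    """Group adjacent above-threshold sensor values into touch blobs.

    sensor: 2D list of per-pixel IR intensity values.
    Returns a list of (centroid_row, centroid_col, pixel_count) tuples.
    """
    rows, cols = len(sensor), len(sensor[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if sensor[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one connected component (4-connectivity).
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and sensor[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cy, cx, len(pixels)))
    return blobs
```

Each returned tuple gives a blob's centroid and size, which downstream software could track frame to frame to follow individual fingers or objects.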

Specifications
Surface
Surface is a 30-inch (76 cm) display in a table-like form factor, 22 inches (56 cm) high, 21 inches (53 cm) deep, and 42 inches (107 cm) wide.[15] The Surface tabletop is acrylic, and its interior frame is powder-coated steel. The software platform runs on a custom version of Windows Vista and has wired Ethernet 10/100, wireless 802.11 b/g, and Bluetooth 2.0 connectivity.[15] Surface applications are written using either Windows Presentation Foundation or Microsoft XNA technology.[16]

At Microsoft's MSDN conference, Bill Gates told developers about the "maximum" setup the Microsoft Surface was going to have:

Intel Core 2 Quad Xeon "Woodcrest" at 2.66 GHz, on a custom motherboard form factor about the size of two ATX motherboards
4 GB DDR2-1066 RAM
1 TB 7200 RPM hard drive

The discontinued (as of 6 January 2011) commercially available version had the following specifications:[17]

Intel Core 2 Duo at 2.13 GHz
2 GB DDR2 RAM
250 GB SATA hard drive

Surface 2.0
Samsung's "SUR40 with Microsoft Surface", a third-party production of Microsoft Surface billed as "The Surface 2.0 Experience", has a 40 in (102 cm) 1080p HD LCD screen, a 2.9 GHz AMD Athlon II X2 processor and a Radeon HD 6700M graphics card, and is 4 in (10 cm) thick. Microsoft Surface is now wall-mountable and runs a new, more polished and refined Windows 7 GUI (now including Windows Phone 7 support). For this version, Microsoft created a new technology called PixelSense, in which the IR sensors are made part of the LCD display, allowing the surface of the table to sense (or see) what is on top of it without using a camera.[18]

Applications development
Microsoft Surface applications can be written in Windows Presentation Foundation or XNA. The development process is much like normal Vista development, but custom WPF controls had to be created by the Surface team due to the unique interface of Surface. Developers already proficient in WPF can use the SDK to write Surface applications for deployment in large hotels, casinos, and restaurants.[19]

Multi-touch

Multi-touch screen

In computing, multi-touch refers to the ability of a touch-sensing surface (trackpad or touchscreen) to recognize the presence of two or more points of contact with the surface. This plural-point awareness is often used to implement advanced functionality such as pinch-to-zoom or activating predefined programs. In an effort at disambiguation or marketing classification, some companies further break down the various definitions of multi-touch; for example, 3M defines multi-touch as a touchscreen's ability to register three or more distinct positions.[1]
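The pinch-to-zoom gesture mentioned above reduces to simple arithmetic on two contact points: the zoom factor is the current distance between the points divided by their distance when the gesture began. A minimal sketch follows; the function name and coordinate convention are invented for illustration.

```python
import math

def pinch_scale(start_a, start_b, now_a, now_b):
    """Zoom factor implied by two contact points moving apart or together.

    Each argument is an (x, y) touch coordinate. A result greater than 1
    means the points spread apart (zoom in); less than 1 means a pinch.
    """
    def dist(p, q):
        # Euclidean distance between two touch points.
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(now_a, now_b) / dist(start_a, start_b)
```

For example, two fingers starting 10 units apart and ending 20 units apart yield a scale factor of 2, which an application would apply to the zoom level of the content under the gesture.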

Contents

1 History
2 Brands and manufacturers
3 Implementations
4 Popular culture references
5 See also
6 References
7 External links

History
The use of touchscreen technology to control electronic devices pre-dates multi-touch technology and the personal computer. Early synthesizer and electronic instrument builders such as Hugh Le Caine and Bob Moog experimented with touch-sensitive capacitance sensors to control the sounds made by their instruments.[2] IBM began building the first touch screens in the late 1960s, and, in 1972, Control Data released the PLATO IV computer, a terminal used for educational purposes that employed single-touch points in a 16×16 array as its user interface.

The prototypes of the x-y mutual capacitance multi-touch screens developed at CERN[3]

One of the early implementations of mutual-capacitance touchscreen technology was developed at CERN in 1977,[4][5] based on capacitance touch screens developed in 1972 by Danish electronics engineer Bent Stumpe. This technology was used to develop a new type of human-machine interface (HMI) for the control room of the Super Proton Synchrotron particle accelerator. In a handwritten note dated 11 March 1972, Stumpe presented his proposed solution: a capacitive touch screen with a fixed number of programmable buttons presented on a display. The screen was to consist of a set of capacitors etched into a film of copper on a sheet of glass, each capacitor constructed so that a nearby flat conductor, such as the surface of a finger, would increase its capacitance by a significant amount. The capacitors were to consist of fine lines etched in copper on a sheet of glass, fine enough (80 μm) and sufficiently far apart (80 μm) to be invisible (CERN Courier, April 1974, p. 117). In the final device, a simple lacquer coating prevented the fingers from actually touching the capacitors.

Multi-touch technology began in 1982, when the University of Toronto's Input Research Group developed the first human-input multi-touch system. The system used a frosted-glass panel with a camera placed behind the glass. When a finger or several fingers pressed on the glass, the camera would detect the action as one or more black spots on an otherwise white background, allowing it to be registered as an input. Since the size of a dot depended on pressure (how hard the person was pressing on the glass), the system was somewhat pressure-sensitive as well.[2]
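Stumpe's scheme can be summarized in a few lines of logic: each capacitor in the grid is read out and compared with its untouched baseline, and any button whose capacitance has risen by more than some margin (a nearby finger increases capacitance) is reported as pressed. A toy sketch, with invented units and threshold:

```python
def pressed_buttons(readings, baselines, margin=0.2):
    """Indices of capacitive buttons whose readings rose above baseline.

    readings, baselines: per-button capacitance values (arbitrary units).
    A flat conductor such as a fingertip near a capacitor *increases* its
    capacitance, so a rise larger than `margin` counts as a press.
    """
    return [i for i, (r, b) in enumerate(zip(readings, baselines))
            if r - b > margin]
```

A real controller would recalibrate the baselines continuously to track temperature and humidity drift, but the comparison itself is this simple.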

In 1983, Bell Labs at Murray Hill published a comprehensive discussion of touch-screen based interfaces.[6] In 1984, Bell Labs engineered a touch screen that could change images with more than one hand. In 1985, the University of Toronto group, including Bill Buxton, developed a multi-touch tablet that used capacitance rather than bulky camera-based optical sensing systems.[2]

A breakthrough occurred in 1991, when Pierre Wellner published a paper on his multi-touch Digital Desk, which supported multi-finger and pinching motions.[7][8] Various companies expanded upon these inventions in the beginning of the twenty-first century. The company Fingerworks developed various multi-touch technologies between 1999 and 2005, including Touchstream keyboards and the iGesture Pad. Several studies of this technology were published in the early 2000s by Alan Hedge, professor of human factors and ergonomics at Cornell University.[9][10][11] Apple acquired Fingerworks and its multi-touch technology in 2005.

Mainstream exposure to multi-touch technology occurred in 2007, when the iPhone gained popularity, with Apple stating it "invented multi touch" as part of the iPhone announcement.[12] However, both the function and the term predate the announcement and patent requests, except in the area of capacitive mobile screens, which did not exist before Fingerworks'/Apple's technology (Apple filed patents for it in 2005-2007 and was awarded them in 2009-2010). Publication and demonstration using the term multi-touch by Jefferson Y. Han in 2005 predates these,[13] but Apple did give multi-touch wider exposure through its association with the new product, and was the first to introduce multi-touch on a mobile device.[14]

Microsoft's table-top touch platform, Microsoft Surface, which started development in 2001, interacts with both the user's touch and their electronic devices. Similarly, in 2001, Mitsubishi Electric Research Laboratories (MERL) began development of a multi-touch, multi-user system called DiamondTouch, also based on capacitance but able to differentiate between multiple simultaneous users (or rather, the chairs in which each user is seated or the floor pad on which each user stands); DiamondTouch became a commercial product in 2008.

Small-scale touch devices are rapidly becoming commonplace, with the number of touch-screen telephones expected to increase from 200,000 shipped in 2006 to 21 million in 2012.[15]

Brands and manufacturers

A virtual keyboard on an iPad

Apple has manufactured numerous products using multi-touch technology, most prominently its iPhone smartphone and iPad tablet, and also holds several patents related to the implementation of multi-touch in user interfaces.[16] Apple additionally attempted to register "Multi-touch" as a trademark in the United States, but its request was denied by the United States Patent and Trademark Office because it considered the term generic.[17]

Multi-touch sensing and processing occur via an ASIC sensor attached to the touch surface. Usually, separate companies make the ASIC and the screen that combine into a touch screen; conversely, a trackpad's surface and ASIC are usually manufactured by the same company.

In recent years, large companies have expanded into the growing multi-touch industry, with systems designed for everything from the casual user to multinational organizations. It is now common for laptop manufacturers to include multi-touch trackpads on their laptops, and tablet computers respond to touch input rather than traditional stylus input, which is supported by many recent operating systems. A few companies focus on large-scale surface computing rather than personal electronics, building either large multi-touch tables or wall surfaces. These systems are generally used by government organizations, museums, and companies as a means of information or exhibit display.

Implementations
Multi-touch has been implemented in several different ways, depending on the size and type of interface. The most popular forms are mobile devices, tablets, touchtables and walls. Both touchtables and touch walls project an image through acrylic or glass, and then back-light the image with LEDs.

Types

Multitouch Capacitive Technology
    Surface Capacitive Technology
    Projected Capacitive Touch (PST)
    In-cell: Capacitive

Touch Resistive Technology
    Analog Resistive
    Digital Resistive or In-Cell: Resistive

Multitouch Optical technologies
    Optical Imaging or Infrared technology
    Infrared Grid Technology (opto-matrix) or Digital Waveguide Touch (DWT) or Infrared Optical Waveguide
    Frustrated Total Internal Reflection (FTIR)
    Rear Diffused Illumination (DI) or Diffused Surface Illumination (DSI)
    Kinect
    In-Cell: Optical

Touch Wave Technologies
    Surface Acoustic Wave (SAW)
    Bending Wave Touch (BWT) or Dispersive Signal Touch (DST)

Other
    Force-Based Sensing or Near Field Imaging (NFI)
Optical touch technology functions when a finger or an object touches the surface, causing light to scatter; the reflection is caught by sensors or cameras that send the data to software, which dictates the response to the touch depending on the type of reflection measured. Touch surfaces can also be made pressure-sensitive by the addition of a pressure-sensitive coating that flexes differently depending on how firmly it is pressed, altering the reflection.[18]

Handheld technologies use a panel that carries an electrical charge. When a finger touches the screen, the touch disrupts the panel's electrical field. The disruption is registered and sent to the software, which then initiates a response to the gesture.[19]

In the past few years, several companies have released products that use multi-touch. In an attempt to make the expensive technology more accessible, hobbyists have also published methods of constructing DIY touchscreens.[20]
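The pressure-sensitive coating described above can be modeled as a mapping from measured reflection intensity to an estimated pressure, interpolated between two calibration readings. This is a hedged sketch of that mapping; the calibration values and function name are invented for illustration.

```python
def estimate_pressure(reflection, light_touch=0.2, firm_touch=0.9):
    """Map a reflection intensity (0..1) to an estimated pressure (0..1).

    The coating flexes more under firmer presses, altering the reflection,
    so intensity is interpolated linearly between two calibration points.
    light_touch and firm_touch are made-up calibration readings.
    """
    if reflection <= light_touch:
        return 0.0  # below the lightest detectable touch
    if reflection >= firm_touch:
        return 1.0  # saturated: as firm as the coating can register
    return (reflection - light_touch) / (firm_touch - light_touch)
```

A real system would calibrate the two endpoints per device and per material, but the interpolation step is the core of turning an optical measurement into a pressure value.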

Multitouch GLOBE (domed monitor) with "Multitouch Earth" application

Popular culture references


Popular culture has also portrayed potential uses of multi-touch technology in the future, including in several installments of the Star Trek franchise. The television series CSI: Miami introduced both surface and wall multi-touch displays in its sixth season. Another television series, NCIS: Los Angeles, makes use of multi-touch surfaces and wall panels as part of an initiative to go digital. Another form of multi-touch computer was seen in the film The Island, where the professor, played by Sean Bean, has a multi-touch desktop to organize files, based on an early version of Microsoft Surface.[2] Multi-touch technology can also be seen in the James Bond film Quantum of Solace, where MI6 uses a touch interface to browse information about the criminal Dominic Greene.[21] In an episode of the television series The Simpsons, when Lisa Simpson travels to the underwater headquarters of Mapple to visit Steve Mobs, the erstwhile pretender to the throne of Mapple is shown performing multiple multi-touch hand gestures on a large touch wall. A device similar to the Surface was seen in the 1982 Disney sci-fi film Tron; it took up an executive's entire desk and was used to communicate with the Master Control computer. The interface used to control the alien ship in the 2009 film District 9 features similar technology.[22] Microsoft's Surface was also used in the 2008 film The Day the Earth Stood Still. In the 2002 film Minority Report, Tom Cruise uses a set of gloves that resemble a multi-touch interface to browse through information.[23]

See also

List of multi-touch computers and monitors
Gesture recognition
Human-computer interaction
Pen computing
Multi-touch gestures
Natural user interface
Sketch recognition
Surface computing
Tenori-on
Touchpad
Touch user interface
Sensomusic Usine

TouchLight
From Wikipedia, the free encyclopedia

TouchLight is an imaging touch screen and 3D display for gesture-based interaction.[1] It was developed by Microsoft Research employee Andrew D. Wilson and made known to the public in late 2005.[2] The technology was licensed to Eon Reality in July 2006.[3]

Contents

1 Abilities
2 Cost
3 See also
4 References

Abilities
The TouchLight can both record and project simultaneously, and due to its 3D capabilities it can be used almost as a mirror. The same principle could be applied to link two TouchLights together, allowing two people anywhere in the world to communicate with each other as if they were sitting on opposite sides of the same desk. It can capture a high-definition image of anything placed up against the screen; this image is then displayed in 2D. The user of the TouchLight can manipulate the size, position and orientation of the image by performing the corresponding action with his or her hands. The screen has a built-in microphone that can detect vibration, allowing the user to change settings simply by tapping the screen.[4]

Cost
At present, the high-end product costs upwards of $60,000;[5] however, Eon Reality predicts that within 24 to 36 months the technology will be affordable enough for desktops.[3]

See also

Microsoft Surface

References
1. ^ Andrew D. Wilson. "TouchLight: an imaging touch screen and display for gesture-based interaction". Microsoft Research. Archived from the original on 2006-08-29. Retrieved 2006-12-08.
2. ^ "First look at Microsoft Research Touch-Light (Video)". TheChannel9Team. Channel 9. 2004-08-24. Retrieved 2007-05-19.
3. ^ a b "EON Reality Will Be First Company to Bring Next-Generation 3-D Technology to Market". 2006-07-18. Archived from the original on 2006-11-16. Retrieved 2006-12-08.
4. ^ "Touchlight". CAVI Digital Experience. 2005-12-15. Archived from the original on 2011-07-11. Retrieved 2006-12-10.
5. ^ Laurie Sullivan (2006-07-19). "Microsoft Licenses 3D TouchLight IP". TechnoWeb.

