
Is Smart Making Us Dumb?

A revolution in technology is allowing previously inanimate objects, from cars to trash cans to teapots, to talk back to us and even guide our behavior. But how much control are we willing to give up?

How to live in a world where your trash bin can talk back.

Would you like all of your Facebook friends to sift through your trash? A group of designers from Britain and Germany think that you might. Meet BinCam: a "smart" trash bin that aims to revolutionize the recycling process.

BinCam looks just like your average trash bin, but with a twist: Its upper lid is equipped with a smartphone that snaps a photo every time the lid is shut. The photo is then uploaded to Mechanical Turk, the Amazon-run service that lets freelancers perform small tasks for money. In this case, they analyze the photo and decide whether your recycling habits conform to the gospel of green living. Eventually, the photo appears on your Facebook page. You are also assigned points, as in a game, based on how well you are meeting the recycling challenge. The household that earns the most points "wins." In the words of its young techie creators, BinCam is designed "to increase individuals' awareness of their food waste and recycling behavior," in the hope of changing their habits.
[Photo caption: The HAPIfork is a smart fork that can monitor how quickly you are eating and signal you to slow down using indicator lights.]

BinCam has been made possible by the convergence of two trends that will profoundly reshape the world around us. First, thanks to the proliferation of cheap, powerful sensors, the most commonplace objects can finally understand what we do with them (from umbrellas that know it's going to rain to shoes that know they're wearing out) and alert us to potential problems and programmed priorities. These objects are no longer just dumb, passive matter. With some help from crowdsourcing or artificial intelligence, they can be taught to distinguish between responsible and irresponsible behavior (between recycling and throwing stuff away, for example) and then punish or reward us accordingly, in real time.

Second, because our personal identities are now so firmly pegged to our profiles on social networks such as Facebook and Google+, our every interaction with such objects can be made "social," that is, visible to our friends. This visibility, in turn, allows designers to tap into peer pressure: Recycle and impress your friends, or don't recycle and risk incurring their wrath.

These two features are the essential ingredients of a new breed of so-called smart technologies, which are taking aim at their dumber alternatives. Some of these technologies are already catching on and seem relatively harmless, even if not particularly revolutionary: smart watches that pulsate when you get a new Facebook poke; smart scales that share your weight with your Twitter followers, helping you to stick to a diet; or smart pill bottles that ping you and your doctor to say how much of your prescribed medication remains.

But many smart technologies are heading in another, more disturbing direction. A number of thinkers in Silicon Valley see these technologies as a way not just to give consumers new products that they want but to push them to behave better. Sometimes this will be a nudge; sometimes it will be a shove. But the central idea is clear: social engineering disguised as product engineering.

In 2010, Google Chief Financial Officer Patrick Pichette told an Australian news program that his company "is really an engineering company, with all these computer scientists that see the world as a completely broken place." Just last week in Singapore, he restated Google's notion that the world is a "broken" place whose problems, from traffic jams to inconvenient shopping experiences to excessive energy use, can be solved by technology. The futurist and game designer Jane McGonigal, a favorite of the TED crowd, also likes to talk about how "reality is broken" but can be fixed by making the real world more like a videogame, with points for doing good. From smart cars to smart glasses, "smart" is Silicon Valley's shorthand for transforming present-day social reality and the hapless souls who inhabit it.

But there is reason to worry about this approaching revolution. As smart technologies become more intrusive, they risk undermining our autonomy by suppressing behaviors that someone somewhere has deemed undesirable. Smart forks inform us that we are eating too fast. Smart toothbrushes urge us to spend more time brushing our teeth. Smart sensors in our cars can tell if we drive too fast or brake too suddenly.

These devices can give us useful feedback, but they can also share everything they know about our habits with institutions whose interests are not identical with our own. Insurance companies already offer significant discounts to drivers who agree to install smart sensors in order to monitor their driving habits. How long will it be before customers can't get auto insurance without surrendering to such surveillance? And how long will it be before the self-tracking of our health (weight, diet, steps taken in a day) graduates from being a recreational novelty to a virtual requirement?
The Saturday Essay

How can we avoid completely surrendering to the new technology? The key is learning to differentiate between "good smart" and "bad smart."

Devices that are "good smart" leave us in complete control of the situation and seek to enhance our decision-making by providing more information. For example: An Internet-jacked kettle that alerts us when the national power grid is overloaded (a prototype has been developed by U.K. engineer Chris Adams) doesn't prevent us from boiling yet another cup of tea, but it does add an extra ethical dimension to that choice. Likewise, a grocery cart that can scan the bar codes of products we put into it, informing us of their nutritional benefits and country of origin, enhances, rather than impoverishes, our autonomy (a prototype has been developed by a group of designers at the Open University, also in the U.K.).

Technologies that are "bad smart," by contrast, make certain choices and behaviors impossible. Smart gadgets in the latest generation of cars (breathalyzers that can check if we are sober, steering sensors that verify if we are drowsy, facial recognition technologies that confirm we are who we say we are) seek to limit, not to expand, what we can do. This may be an acceptable price to pay in situations where lives are at stake, such as driving, but we must resist any attempt to universalize this logic.

The "smart bench," an art project by designers JooYoun Paek and David Jimison that aims to illustrate the dangers of living in a city that is too smart, cleverly makes this point. Equipped with a timer and sensors, the bench starts tilting after a set time, creating an incline that eventually dumps its occupant. This might appeal to some American mayors, but it is the kind of smart technology that degrades the culture of urbanism, and our dignity.

Projects like BinCam fall somewhere between good smart and bad smart, depending on how they're executed. The bin doesn't force us to recycle, but by appealing to our base instincts (Must earn gold bars and rewards! Must compete with other households! Must win and impress friends!) it fails to treat us as autonomous human beings, capable of weighing the options by ourselves. It allows the Mechanical Turk or Facebook to do our thinking for us.

The most worrisome smart-technology projects start from the assumption that designers know precisely how we should behave, so the only problem is finding the right incentive. A truly smart trash bin, by contrast, would make us reflect on our recycling habits and contribute to conscious deliberation, say, by letting us benchmark our usual recycling behavior against other people in our demographic, instead of trying to shame us with point deductions and peer pressure.

There are many contexts in which smart technologies are unambiguously useful and even lifesaving. Smart belts that monitor the balance of the elderly and smart carpets that detect falls seem to fall in this category. The problem with many smart technologies is that their designers, in the quest to root out the imperfections of the human condition, seldom stop to ask how much frustration, failure and regret is required for happiness and achievement to retain any meaning.

It's great when the things around us run smoothly, but it's even better when they don't do so by default. That, after all, is how we gain the space to make decisions (many of them undoubtedly wrongheaded) and, through trial and error, to mature into responsible adults, tolerant of compromise and complexity. Will those autonomous spaces be preserved in a world replete with smart technologies?
Or will that world, to borrow a metaphor from the legal philosopher Ian Kerr, resemble Autopia, a popular Disneyland attraction in which kids drive specially designed little cars that run through an enclosed track? Well, "drive" may not be the right word. Though the kids sit in the driver's seat and even steer the car sideways, a hidden rail underneath always guides them back to the middle. The Disney cars are impossible to crash. Their so-called "drivers" are not permitted to make any mistakes. Isn't it telling that one of today's most eagerly anticipated technologies is a self-driving car, now on its way to being rolled out by Google?

To grasp the intellectual poverty that awaits us in a smart world, look no further than recent blueprints for a "smart kitchen," an odd but persistent goal of today's computer scientists, most recently in designs from the University of Washington and Kyoto Sangyo University in Japan. Once we step into this magic space, we are surrounded by video cameras that recognize whatever ingredients we hold in our hands. Tiny countertop robots inform us that, say, arugula doesn't go with boiled carrots or that lemon grass tastes awful with chocolate milk. This kitchen might be smart, but it's also a place where every mistake, every deviation from the master plan, is frowned upon. It's a world that looks more like a Taylorist factory than a place for culinary innovation. Rest assured that lasagna and sushi weren't invented by a committee armed with formulas or with "big data" about recent consumer wants.

Creative experimentation propels our culture forward. That our stories of innovation tend to glorify the breakthroughs and edit out all the experimental mistakes doesn't mean that mistakes play a trivial role. As any artist or scientist knows, without some protected, even sacred space for mistakes, innovation would cease.

With "smart" technology in the ascendant, it will be hard to resist the allure of a frictionless, problem-free future.
When Eric Schmidt, Google's executive chairman, says that "people will spend less time trying to get technology to work, because it will just be seamless," he is not wrong: This is the future we're headed toward. But not all of us will want to go there.

A more humane smart-design paradigm would happily acknowledge that the task of technology is not to liberate us from problem-solving. Rather, we need to enroll smart technology in helping us with problem-solving. What we want is not a life where friction and frustrations have been carefully designed out, but a life where we can overcome the frictions and frustrations that stand in our way.

Truly smart technologies will remind us that we are not mere automatons who assist big data in asking and answering questions. Unless designers of smart technologies take stock of the complexity and richness of the lived human experience, with its gaps, challenges and conflicts, their inventions will be destined for the SmartBin of history.
