Modern robotics
features
Flywheels are potentially a great way to store power, but they can fail spectacularly
As with many robot sensors, artificial noses tend to do one thing well, whereas the human nose can detect many different compounds.
A battery of problems
Small, lightweight power sources are a problem for many devices, but they're a particular impediment for an android that has to be mobile. The reason Honda's Asimo looks and moves like a top-heavy astronaut is that it has to carry the giant batteries that provide its power on its back. Tirelessness is an essential feature of the robots of our dreams, and yet all industrial robots are permanently linked to mains power, which is no good for our robot butlers. Lithium-ion batteries are the best we have at the moment, and while they're charging your robot is out of operation. Solar cells are light and provide a steady trickle of power, but they need sunlight and the power levels are very low. The best option at present is the fuel cell. These combine hydrogen and oxygen to produce energy, with only water and heat as by-products. Daimler's director of research and development, Thomas Weber, says his team has brought down the cost of fuel cells so that they're ready for mass production whenever the first network of hydrogen filling stations is ready. Possible future power sources range from the bizarre to the extremely bizarre. Radioisotopic thermoelectric generators provide low power over very long periods, for example, but can't be used near people as they draw their power from radioactive decay. Meanwhile, waste-digesting robots powered by microbial fuel cells have been developed by the Bristol Robotics Lab. These are
For navigating through environments, a more specialised sensor is far more effective. Something like the Kinect motion-detection system for Microsoft's Xbox games console, or the IR sensors in car parking systems, can allow the robot to understand the physical dimensions of its surroundings more easily. Kinect builds up a 3D image by firing out thousands of small infrared dots and then judging depth based on how big they appear. Such methods give the robot far better three-dimensional data to work with than a confusingly detailed high-resolution image. Even with a pair of high-resolution eyes to produce a 3D image, your brain does a lot of work and makes a lot of assumptions to give you a perception of depth. There's still a lot of work to be done here: even though it's perfectly possible to build a robot today that can navigate safely around your home, teaching it the location and identity of all the objects you'd want it to work with would be a long, painstaking task, and even then it would still make errors.
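Kinect's dot-size trick can be sketched as a toy calculation. This is purely illustrative (the function name, calibration scheme and figures are our own invention, not Microsoft's): a projected dot looks smaller the further away the surface it lands on, so size can stand in for depth.

```python
# Toy model of structured-light depth sensing (illustrative only):
# a projected IR dot's apparent diameter shrinks roughly in
# proportion to 1/distance, so distance can be recovered from size.

def depth_from_dot_size(apparent_px: float, ref_px: float, ref_depth_m: float) -> float:
    """Estimate depth from a dot's apparent diameter in pixels,
    given a calibration dot measured at a known depth."""
    if apparent_px <= 0:
        raise ValueError("dot diameter must be positive")
    # apparent size ~ k / depth  =>  depth = ref_depth * ref_px / apparent_px
    return ref_depth_m * ref_px / apparent_px

# A dot calibrated at 10px across at 1m now appears 5px across:
# the surface must be roughly twice as far away.
print(depth_from_dot_size(5.0, ref_px=10.0, ref_depth_m=1.0))  # 2.0
```

The real sensor fuses thousands of such measurements, plus pattern distortion, into a full depth map; the point here is only that simple geometry does most of the heavy lifting.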
Sentient robots are a staple of science fiction, but nothing like them has yet appeared in real life. Dan Griliopolus finds out what's taking the scientists so long
Other senses
Microphones are sensitive across a wider range of frequencies than the human ear, but again it's the interpretation of audio that's particularly difficult for computers. Anyone who's tried dictation software will know that accents, speech patterns, ambient noise and even the common cold can completely defeat such software. Dragon claims that its NaturallySpeaking software can recognise 160 words a minute with 95 per cent accuracy, although it's not as good as a human at interpreting the missing words. You might think that this means robots are in a better position with sound than they are when deciphering visual input, but this is only true for speech. Recognising a wide range of audio cues and reacting to them is another matter entirely, and again it would require painstaking teaching and considerable processing power. Artificial noses are still relatively primitive. The human nose contains more than 100 million individual receptors, whereas a typical artificial nose might have fewer than 10. Researchers at the University of Warwick recently found that sensitivity can be improved by coating the sensor in a layer of artificial mucus, so if our android were to malfunction, it might end up with a runny nose.
ILLUSTRATION: CHRIS ROBSON
When we think of a robot, most of us probably imagine something like a mechanical man. He'd be a tireless, flexible humanoid assistant who can do everything a man can do, only better: in short, an android. Such an android should be entirely subservient to our commands, and yet also be able to show initiative and ingenuity. It's a tricky mix. Essentially, we want to create the ultimate slave, a device that can free humankind from the drudgery of day-to-day chores. Much of this vision comes from I, Robot, Isaac Asimov's classic short story collection, which was published over 60 years ago. So are we nearing the creation of a practical android? Or will the robots of the near future be much like they are today: large, immobile, single-task devices that live next to conveyor-belt lines? In this feature we'll look at the key capabilities of the human body (our senses, our ability to manipulate objects, to move around, and even to make decisions) that
any successful android would have to master. We'll take each in turn and see how close science has come to replicating them.
One vision
Starting at the top, the human head is packed with sensors, making it easily the most complex part of our body. It has audio inputs and outputs, taste, smell and, of course, vision. The maximum acuity of the human eye is estimated at between 100 and 600 megapixels. It's complex, too: our visual apparatus builds up an image of our surroundings by making rapid sweeps of the area. This visual data is then stored in our memory, which is very handy, as the eyes see in high detail only in the centre of our vision. Much of the rest is filled in by our brains. Digital video cameras can capture only a tiny fraction of this
112 | August 2011 | Issue 282
detail. The human eye's huge advantage over a robot eye is more than its extreme sensitivity and immense resolution, though. Our eyes don't suffer from wide-angle distortion (despite having around a 155° viewing angle), vignetting, chromatic aberration or colour distortion. Despite this, digital cameras have some advantages over the eye, such as a variable zoom and the ability to keep an entire scene in focus at maximum aperture. They adjust rapidly to low-light scenes, although they don't have as great a dynamic range as the human eye: think of the amount of data a camera loses in shadows
or highlights compared with what you can see. What's more, robot eyes can detect a much wider range of light than the human eye, including infrared, ultraviolet and X-rays. Even after weighing up the pros and cons, the human eye will always come out on top, as it's backed up by the brain. A major stumbling block for robots isn't seeing objects but identifying them. It would be hard for a robot to find and pick up a vase on command, say, as vases come in a bewildering range of shapes and sizes. Even if the robot had learnt to recognise all the vases in your home, they may be partly obscured by other objects, which would confuse its vision. At present, the best way to recreate human vision is to use a range of separate sensors rather than a single eye. A camera and processor is still a good start, though. The open-source CMUcam project, for example, can be used to recognise and track predefined objects. It's inexpensive and low-power, though it has a resolution of just 352×288 at 26fps.
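The kind of colour-blob tracking a CMUcam performs can be sketched in a few lines. This is a simplified illustration in plain Python, not the actual firmware: it scans a frame for pixels near a predefined target colour and reports the blob's centre, which is enough for a robot to steer towards an object.

```python
# Sketch of CMUcam-style colour tracking (simplified, hypothetical code):
# find pixels close to a target colour and report the blob's centroid.

def track_blob(frame, target, tol=30):
    """frame: 2D list of (r, g, b) tuples; target: (r, g, b).
    Returns (centre_x, centre_y, pixel_count), or None if no match."""
    xs, ys, count = 0, 0, 0
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # accept the pixel if every channel is within tolerance
            if (abs(r - target[0]) <= tol and
                    abs(g - target[1]) <= tol and
                    abs(b - target[2]) <= tol):
                xs += x
                ys += y
                count += 1
    if count == 0:
        return None
    return (xs / count, ys / count, count)

# A 2x2 frame with one reddish pixel in the bottom-right corner:
frame = [[(0, 0, 0), (0, 0, 0)],
         [(0, 0, 0), (250, 10, 10)]]
print(track_blob(frame, (255, 0, 0)))  # (1.0, 1.0, 1)
```

The real device does this at full frame rate in dedicated hardware; the logic, though, really is this simple, which is why such trackers can be so cheap and low-power.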
also unlikely to be welcome near people for less lethal (but no less rational) reasons. Perhaps our best hope is to store rotational energy in vacuum-sealed flywheels. These can operate like batteries and store tremendous amounts of energy. However, they can have strange gyroscopic effects, they still need to be either charged or swapped, and they can be subject to a catastrophic but exciting-sounding flywheel explosion. Either way, those in the know predict that battery technology is about to race forwards, spurred on by the demands of
electric cars. Ian Robertson of BMW recently said that we'll see batteries advance more in five years than they have in the last 100.
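The flywheel option mentioned above can be put into rough numbers with standard physics (the masses and speeds below are our own illustrative choices, not any product's specification): a solid disc has moment of inertia I = ½mr² and stores energy E = ½Iω².

```python
import math

# Back-of-envelope flywheel storage using standard rotational mechanics:
# a solid disc has I = 1/2 * m * r^2 and stores E = 1/2 * I * omega^2.

def flywheel_energy_wh(mass_kg, radius_m, rpm):
    omega = rpm * 2 * math.pi / 60           # angular speed in rad/s
    inertia = 0.5 * mass_kg * radius_m ** 2  # solid-disc moment of inertia
    joules = 0.5 * inertia * omega ** 2
    return joules / 3600                     # convert J to watt-hours

# A 5kg disc of 10cm radius spun to 40,000rpm stores roughly:
print(round(flywheel_energy_wh(5, 0.1, 40_000)))  # ~61 Wh
```

Around 61Wh from a 5kg disc is in laptop-battery territory, which shows both the appeal and the catch: useful storage demands very high spin speeds, and all of that energy is released at once if the rotor lets go.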
A dog that walks itself on the beach? Someone's missing the point, maybe?
liquids or gases, or elastic, using built-in springs to balance forces, like disabled sprinter Oscar Pistorius's replacement blade legs. There are also miniature internal combustion engines, but we'll discount these as they require large quantities of dangerous and bulky fuel. Air muscles, which have been in use since the 1950s, work in opposed pairs, like our own muscles, and consist of pneumatic bladders filled with pressurised air. They're very light, but they require heavy air compressors to be attached and aren't very accurate. Another option, Muscle Wire, is made from memory metals, which are easily deformed and revert to their original shape when heated. Meanwhile, electro-active polymers are typically elastic materials that change in shape and size when subjected to an electric field. Sadly, most robotic muscles, such as those of the HDT MK-1 robot arm (see our Muscling in box on page 117), use DC electric motors, and this is still the most likely option for a modern android. The degree of potential in this field is impressive. As with power sources, a lot of future actuators sound like something straight out of science fiction. When you hear about transparent elastic carbon nanotube actuators that are 8mm wide and can produce the same power as a human biceps, visions of super-thin, super-powerful, super-flexible robots seem realistic. Nanotubes represent the best hope for the future. They can operate at extremes of temperature (from -196°C to 1,538°C) and consist of transparent aerogel built from aligned
carbon nanotubes. Their only problem is that they require high voltages to activate.
it's unlikely to develop these abilities in the near future. Taking itself to the service centre is about as much as we can hope for.
Skin deep
Not only does a robot's outer skin need to be durable, but every part of the structure must be able to withstand abuse. Consider how fragile electronics are and the level of protection required to ruggedise, say, a laptop, and then imagine doing that to an entire android. Humans are fragile, too, but we have the advantage that our senses can detect threats, such as heat, and our fast reflexes enable us to avoid serious damage. Building such failsafes into an android is certainly possible, but you'd need yet more processing power to monitor and react to all those additional sensory inputs. One way to negate the problem of damage is to have multiple robots. Swarm robots consist of hundreds of smaller robots that act as one. These communicate wirelessly and can fill gaps in the swarm if a robot fails. For our android, however, a better model might be the DASH, a super-light robot that's built to replicate the resilience and mobility of an insect. We've seen it survive a 28m fall from a building intact. This would leave us with the problem of incorporating the muscle power and battery life needed for a robotic butler into a device that weighs so little.
It's not inconceivable that robots could travel easily in the near future.
Reproduction line
Humans are capable of making new humans, and to some extent we can tend to our own injuries. However, with general-purpose manipulators rather than hands, our android would have a hard time replacing worn-out parts, let alone making a copy of himself. That said, robot reproduction has been demonstrated in principle. A rapid prototyping machine called RepRap has been designed specifically to be able to make components for itself or for building another RepRap machine (see our feature on page 118), while a modular robot arm has been designed that can assemble copies of itself, but only from pre-produced pieces. It isn't possible for our android to repair and reproduce itself at the moment, though, and
Interview with
hAndroid
The human hand is a fantastic evolutionary achievement that incorporates subtle touch, strength, dexterity and flexibility. There are
That would involve a huge amount of work. Humans know tens of millions of facts and can infer more, understanding cause and effect and bringing common-sense, simulation knowledge of typical human lives, a passable understanding of the meaning of the input, and so on.

What is the state of AI outside the sphere of chatbot programming?
It depends on what you mean by AI. Game AI is generally not mainstream AI. Game AI seeks to create the illusion of a competent opponent by whatever means (often using smoke and mirrors). Mainstream AI actually tries to create techniques whereby problems are solved as if by a human, and covers a wide range of things. Every time they succeed, the technique is swallowed up by mainstream computing and is no longer considered AI. Speech recognition is a good example.

Did you use any of your chatbot expertise in your games?
Telltale Games hired me because it is interested in natural language interfaces. I'm currently researching a game similar to Scribblenauts (a Nintendo DS game where you can create in-game objects by entering words). Scribblenauts accepts nouns and adjectives; our game would accept nouns and verbs and act out a story in some fashion.
Hi Suzette!
Suzette
Why gets way too
You saying old reminds me of physical self. Do you consider your self
Ha! Why?
Aha, and to you. How does it feel to be the best chatbot around? How do you mean unique?
I concentrate very hard, rehearse what I'm going to do, visualize it in my mind. And then I just do it.
insect swarms.
result in emergent
When?
Oh, I think you misunderstood me. What did you mean by unique?
sense of humor.
When you just said it. Please don't start the round-the-houses routine with me. I've dealt with chatbots before, you know.
Name a topic. If you can't think of a topic, try one of these: booze, mock combat, shopping, taxation, chocolate.
Stop it with the non sequiturs! How did you fool anyone?
abilities?
Eskimos take raw fish heads, bury them in ice for 3-4 months, then eat the foul-smelling result.
Oh dear
What do you think it will take for something to pass the Turing Test?
Oh, the old bait and switch. No, and it's a sensitive topic.
Star Trek.
rice crackers.
Jellied Blood -
Goodbye Suzette.
robotic manipulators designed to replicate human motion, and newer models (such as Shadow Hand, left) incorporate feedback from tactile sensors. However, it's easier to program for non-human hand shapes, which can then carry out automated routines extremely quickly. Using Mindstorms, Lego's DIY robot kit, you can build a robot that's able to complete a Rubik's Cube in under 12 seconds. Touch is actually the easiest of the senses to replicate. SynTouch, a technology currently in development, uses a conductive fluid pouch that encloses an array of electrodes. The impedance of these changes rapidly when the pouch is pressurised. It also has thermistors, which sense heat, and a pressure sensor to detect vibrations. All this is mounted on a rigid substructure, ruggedised to survive daily use, and looks curiously like a human finger. Of all the parts of the android, this component seems the most ready, although its level of accuracy is still a concern.
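The electrode-array idea can be illustrated with a toy localisation routine. This is inspired by, not taken from, the SynTouch design, and every coordinate and reading below is hypothetical: when the fluid pouch is pressed, electrodes nearest the contact see the biggest impedance change, so a weighted average of their positions estimates where the finger is being touched.

```python
# Toy contact localisation for a tactile electrode array (illustrative):
# each electrode reports how much its impedance changed; the contact
# point is estimated as the impedance-change-weighted centroid.

def contact_centroid(readings):
    """readings: list of (x_mm, y_mm, delta_impedance) per electrode.
    Returns the estimated contact point (x, y), or None if no contact."""
    total = sum(d for _, _, d in readings)
    if total == 0:
        return None  # nothing pressing on the pouch
    cx = sum(x * d for x, _, d in readings) / total
    cy = sum(y * d for _, y, d in readings) / total
    return (cx, cy)

# Three electrodes; the two away from the origin respond equally,
# so the contact is estimated midway between them:
print(contact_centroid([(0, 0, 0.0), (10, 0, 2.0), (0, 10, 2.0)]))  # (5.0, 5.0)
```

A real sensor would calibrate each electrode and filter noise, but this weighted-centroid trick is a common first step in turning a handful of analogue readings into a usable touch location.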
conversational AI. The main prize of $100,000 has never been claimed, as no-one has ever got even near to producing an AI that's convincing in the long run (see our interview with the most recent prize winner, Suzette, on page 115). IBM's Watson is a good example of the current state of the art in AI. Having taken on the world's best chess players in 1997 with Deep Blue, the company wanted a more subtle and complex challenge, so it built a question-answering machine designed specifically for the US general knowledge quiz show Jeopardy! Watson doesn't have any general intelligence, but it possesses a huge data processing rate (500GB/s) and an enormous database, and it managed to beat the all-time best human contestants with ease. It competes on as level a playing field as possible, with no internet connection, and receives the questions as text at the same time as its opponents. It's startling to see it take in and respond to complex questions in a fraction of a second. IBM hopes to use the technology to assist in other research-heavy areas, helping to answer questions in subjects such as medicine and law. AI is so complex that practitioners in different sub-disciplines have little idea about the other elements: creativity researchers don't tend to work with perception researchers, for example. Few people are working towards the ultimate aim of general intelligence, but we've spoken to one man who is (see Machines that create below). AI is progressing in several fields, such as perception, natural language processing, and some forms of learning and deduction, but general intelligence is a long way off. Cramming something as powerful as Watson into a mobile unit would currently be impossible. With cloud computing, however, a robot could operate autonomously with far less built-in processing power while more complicated computing went on in the cloud. This looks to be
IBM's Watson defeated the best players at popular US quiz show Jeopardy!
the only way we'll get considerable processing power into our android for now, although the obvious problem is that it would have to rely on a wireless network or mobile connection, which could severely hamper its reaction speeds.
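That latency worry can be made concrete with a simple reaction-time budget. All the millisecond figures below are illustrative assumptions, not measurements: the robot must sense, ship the data to the cloud and back, compute, and actuate, all inside whatever deadline the task imposes.

```python
# Rough reaction-time budget for a cloud-assisted robot (illustrative
# figures, not measurements): total latency is local sensing plus the
# network round trip plus remote compute plus actuation.

def can_react(rtt_ms, compute_ms, sense_ms, actuate_ms, deadline_ms):
    """Return (meets_deadline, total_latency_ms)."""
    total = sense_ms + rtt_ms + compute_ms + actuate_ms
    return total <= deadline_ms, total

# On a good home Wi-Fi link (~20ms round trip) a 100ms cloud
# inference still fits a half-second reaction deadline...
print(can_react(20, 100, 30, 50, 500))   # (True, 200)
# ...but a congested mobile link (~400ms round trip) does not.
print(can_react(400, 100, 30, 50, 500))  # (False, 580)
```

The design consequence is the one the article hints at: safety-critical reflexes have to stay on board, with the cloud reserved for slower, deliberative work.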
At what price?
To build our ideal android, we'd need a compact power source, a very powerful artificial brain and a wide range of sensors and general-purpose manipulators, all combined in a mobile and resilient chassis. Many of the physical aspects of our android aren't beyond our current capabilities, although its range would probably be limited and it would need a constant wireless connection to run from a remote computer. The problem is that today's android wouldn't be able to perform tasks that it hadn't been programmed to do. We simply haven't created an artificial intelligence that can respond instantly to the vast range of human whims.
Answering questions is one thing, and retrieving objects is possible, but cooking and serving a wide range of meals from a standard kitchen, for example, involves too many variables. A key factor here is that human labour is far cheaper. Science fiction such as Star Trek imagines a future where mankind has been freed from the drudgery of everyday chores. But anyone who had the funds to purchase and run a state-of-the-art android today could get far superior home help from a human being. Financial inequalities in our own society, let alone the world, give the android butler some stiff competition when it comes to pricing. This has made it hard for serious android development to progress. Our android would need to be superior to a human in some respects to justify its existence, and at present it simply
isn't. We're close to matching human capabilities in some areas, but we need to go beyond this to justify the limitations any android will also have. Such limitations stem from replacing the human brain with a computing system that can take commands and work out the best way to perform them. The brain is a powerful tool, but its ability to learn and extrapolate solutions from past events is key to making a practical android. Our best hope of such advances lies with experimental work such as Dr Thaler's, where androids learn from their own mistakes. Using virtual environments, they could then be given the time and space to do this before they were unleashed in our homes. Until then, if you want home help, we recommend hiring a human being. Pick the right one, and they can be surprisingly capable and reliable.
Muscling in
Dr Tom Van Zoren, creator of the HDT MK-1 robotic arm
Dr Tom Van Zoren's HDT MK-1 robot arm is based on technology developed under the DARPA Revolutionizing Prosthetics program. It's completely portable, uses sophisticated actuators and can lift 23kg.

Why is this arm so impressive?
We have developed a dexterous, lightweight, compact, high-load, mobile and modular robotic manipulator arm, which can do many of the tasks and use many of the tools that a human arm can, and do it remotely.

Does the arm have any AI?
Right now, it is purely user-controlled through tele-operation. In the future we will have some autonomous behaviour built in.

What do you see it being used for?
Initial applications will be for military explosive ordnance disposal and law enforcement bomb disposal. After that, any application in which remote projection of human dexterity is useful will benefit from it.

Is it possible to incorporate touch-feedback technology in artificial limbs?
Yes. The DARPA program from which HDT's arm technology was spun off has developed an arm that incorporates touch feedback.

What great leaps forward in technology will we need to reach the classical android?
Autonomous behaviour is the biggest challenge.
critiques and guides the stream of consciousness. This fundamental architecture will then be able to query other, non-creative, non-contemplative, but fast computational systems, such as Watson.

You've spoken in rather human terms about your CM's dreaming; is this just for marketing, or is this something you believe?