
Robot With A Biological Brain: New Research Provides Insights Into How The Brain Works

ScienceDaily (Aug. 14, 2008) — A multidisciplinary team at the University of Reading has
developed a robot which is controlled by a biological brain formed from cultured neurons. This
cutting-edge research is the first step to examine how memories manifest themselves in the brain,
and how a brain stores specific pieces of data.

The key aim is that this will eventually lead to a better understanding of development, and of diseases and disorders which affect the brain, such as Alzheimer's disease, Parkinson's disease, stroke and brain injury.

The robot's biological brain is made up of cultured neurons which are placed onto a multi-
electrode array (MEA). The MEA is a dish with approximately 60 electrodes which pick up the
electrical signals generated by the cells. This is then used to drive the movement of the robot.
Every time the robot nears an object, signals are directed to stimulate the brain by means of the
electrodes. In response, the brain's output is used to drive the wheels of the robot, left and right,
so that it moves around in an attempt to avoid hitting objects. The robot has no additional control from a human or a computer; its sole means of control is its own brain.
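The sense-stimulate-actuate loop described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the function names, the proximity threshold and the spike-count steering heuristic are assumptions, not the Reading team's actual software.

```python
def control_step(distance_cm, stimulate, read_mea):
    """One cycle of a hypothetical sense-stimulate-actuate loop.

    stimulate(strength): deliver a stimulation pulse through the MEA electrodes.
    read_mea(): return spike counts from the (left, right) electrode regions.
    """
    THRESHOLD_CM = 30  # assumed proximity threshold for stimulation
    if distance_cm < THRESHOLD_CM:
        # Nearby obstacle: stimulate the culture; closer objects -> stronger pulse.
        stimulate(strength=1.0 - distance_cm / THRESHOLD_CM)
    left_spikes, right_spikes = read_mea()
    # Map neural output to differential wheel speeds so the robot turns
    # away from whichever side of the culture responds more strongly.
    total = left_spikes + right_spikes or 1
    left_wheel = right_spikes / total   # more right-side activity -> veer left
    right_wheel = left_spikes / total
    return left_wheel, right_wheel
```

In a real run this loop would execute continuously, with the culture's responses, rather than any programmed rule, determining the mapping over time.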

The researchers are now working towards getting the robot to learn by applying different signals
as it moves into predefined positions. It is hoped that as the learning progresses, it will be
possible to witness how memories manifest themselves in the brain when the robot revisits
familiar territory.

Professor Kevin Warwick from the School of Systems Engineering, said: "This new research is
tremendously exciting as firstly the biological brain controls its own moving robot body, and
secondly it will enable us to investigate how the brain learns and memorises its experiences. This
research will move our understanding forward of how brains work, and could have a profound
effect on many areas of science and medicine."

Dr Ben Whalley from the School of Pharmacy, said: "One of the fundamental questions that
scientists are facing today is how we link the activity of individual neurons with the complex
behaviours that we see in whole organisms. This project gives us a really unique opportunity to
look at something which may exhibit complex behaviours, but still remain closely tied to the
activity of individual neurons. Hopefully we can use that to go some of the way to answer some
of these very fundamental questions."
Cultured neurons from rats are placed onto a multi-electrode array -- a dish with approximately
60 electrodes which pick up the electrical signals generated by the cells. (Credit: Image courtesy
of University of Reading)

Researchers Create Self-Assembling Nanodevices That Move and Change Shape on Demand

ScienceDaily (June 21, 2010) — By emulating nature's design principles, a team at Harvard's
Wyss Institute for Biologically Inspired Engineering, Harvard Medical School and Dana-Farber
Cancer Institute has created nanodevices made of DNA that self-assemble and can be
programmed to move and change shape on demand. In contrast to existing nanotechnologies,
these programmable nanodevices are highly suitable for medical applications because DNA is
both biocompatible and biodegradable.

The work appears in the June 20 advance online edition of Nature Nanotechnology.

Built at the scale of one billionth of a meter, each device is made of a circular, single-stranded
DNA molecule that, once it has been mixed together with many short pieces of complementary
DNA, self-assembles into a predetermined 3D structure. Double helices fold up into larger, rigid
linear struts that are connected by intervening single-stranded DNA. These single strands of DNA pull
the struts up into a 3D form -- much like tethers pull tent poles up to form a tent. The structure's
strength and stability result from the way it distributes and balances the counteracting forces of
tension and compression.

This architectural principle -- known as tensegrity -- has been the focus of artists and architects
for many years, but it also exists throughout nature. In the human body, for example, bones serve
as compression struts, with muscles, tendons and ligaments acting as tension bearers that enable
us to stand up against gravity. The same principle governs how cells control their shape at the
microscale.

"This new self-assembly based nanofabrication technology could lead to nanoscale medical
devices and drug delivery systems, such as virus mimics that introduce drugs directly into
diseased cells," said co-investigator and Wyss Institute director Don Ingber. A nanodevice that
can spring open in response to a chemical or mechanical signal could ensure that drugs not only
arrive at the intended target but are also released when and where desired.

Further, nanoscopic tensegrity devices could one day reprogram human stem cells to regenerate
injured organs. Stem cells respond differently depending on the forces around them. For
instance, a stiff extracellular matrix -- the biological glue surrounding cells -- fabricated to mimic
the consistency of bone signals stem cells to become bone, while a soupy matrix closer to the
consistency of brain tissue signals the growth of neurons. Tensegrity nanodevices "might help us
to tune and change the stiffness of extracellular matrices in tissue engineering someday," said
first author Tim Liedl, who is now a professor at Ludwig-Maximilians-Universität in Munich.

"These little Swiss Army knives can help us make all kinds of things that could be useful for
advanced drug delivery and regenerative medicine," said lead investigator William Shih, Wyss
core faculty member and associate professor of biological chemistry and molecular
pharmacology at HMS and Dana-Farber Cancer Institute. "We also have a handy biological
DNA Xerox machine that nature evolved for us," making these devices easy to manufacture.

This new capability "is a welcome element in the structural DNA nanotechnology toolbox," said
Ned Seeman, professor of chemistry at New York University.

This research was funded by the Wyss Institute for Biologically Inspired Engineering at Harvard
University, National Institutes of Health, Deutscher Akademischer Austauschdienst Fellowship,
Swedish Science Council Fellowship and Claudia Adams Barr Program Investigator award.

Journal Reference:

1. Tim Liedl, Bjorn Hogberg, Jessica Tytell, Donald E. Ingber, William M. Shih. Self-assembly of 3D prestressed tensegrity structures from DNA. Nature Nanotechnology, 2010; DOI: 10.1038/nnano.2010.107
Left: A tensegrity built with wooden rods and string. Middle: A diagrammatic image of a
tensegrity built with DNA struts (shown as colored ladders folded into rods) and DNA cable
strands (shown as colored single lines). Light gray arrows show contractile forces exerted by the
cable strands, while dark gray arrows show compressive forces along the struts. Right: An
electron micrograph of an actual nanoscale tensegrity built using the new DNA-based, self-
assembling nanofabrication capabilities. Scale bars equal 20 nanometers (billionths of a meter).
(Credit: Images by Tim Liedl)

Researchers Unlock Key To Memory Storage In Brain

ScienceDaily (Apr. 20, 2007) — Scientists know little about how the brain assigns cells to
participate in encoding and storing memories. Now a UCLA/University of Toronto team has
discovered that a protein called CREB controls the odds of a neuron playing a role in memory
formation. The April 20 edition of Science reports the findings, which suggest a new approach
for preserving memory in people suffering from Alzheimer's or other brain injury.

"Making a memory is not a conscious act," explained Alcino Silva, principal investigator and a
professor of neurobiology and psychiatry at the David Geffen School of Medicine at UCLA.
"Learning triggers a cascade of chemicals in the brain that influence which memories are kept
and which are lost.

"Earlier studies have linked the CREB protein to keeping memories stable," added Silva, a
member of the UCLA Brain Research Institute. "We suspected it also played a key role in
channeling memories to brain cells that are ready to store them."

Silva and his colleagues used a mouse model to evaluate their hypothesis. They implanted CREB
into a virus, which they introduced into some of the cells in the animal's amygdala, a brain region
critical to emotional memory.

Next they tested the mouse's ability to recall a specific cage it had visited before. The cage was
outfitted with patterned walls and a unique smell.

To visualize which brain cells stored the mouse's memories about the cage, the scientists tracked
a genetic marker that reveals recent neuron activity. When the team examined the animals'
amygdalas after the experiment, they found substantial amounts of CREB and the marker in
neurons.

"We discovered that the amount of CREB influences whether or not the brain stores a memory,"
said Silva. "If a cell is low in CREB, it is less likely to keep a memory. If the cell is high in
CREB, it is more likely to store the memory."

Human implications of the new research could prove profound.

"By artificially manipulating CREB levels among groups of cells, we can determine where the
brain stores its memories," he explained. "This approach could potentially be used to preserve
memory in people suffering from Alzheimer's or other brain injury. We may be able to guide
memories into healthy cells and away from sick cells in dying regions of the brain."

Our memories define who we are, so learning how the brain stores memory is fundamental to
understanding what it is to be human, Silva observed.

"A memory is not a static snapshot," he said. "Memories serve a purpose. They are about
acquiring information that helps us deal with similar situations in the future. What we recall
helps us learn from our past experiences and better shape our lives."

The study was funded by the National Institute on Aging and the NARSAD: The Mental Health
Research Association. Silva's coauthors included Steven Kushner and Robert Brown of UCLA;
Sheena Josselyn, Jin-Hee Han, Adelaide Yiu and Christy Cole of the University of Toronto;
Rachel Neve of Harvard University; and John Guzowski of UC Irvine.

Story Source:

The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials
provided by University of California - Los Angeles, via EurekAlert!, a service of AAAS.

DNA Could Be Backbone of Next-Generation Logic Chips
(http://www.sciencedaily.com/releases/2010/05/100511133833.htm)

ScienceDaily (May 12, 2010) — In a single day, a solitary grad student at a lab bench can
produce more simple logic circuits than the world's entire output of silicon chips in a month.

So says a Duke University engineer, who believes that the next generation of these logic circuits
at the heart of computers will be produced inexpensively in almost limitless quantities. The
secret is that instead of silicon chips serving as the platform for electric circuits, computer
engineers will take advantage of the unique properties of DNA, that double-helix carrier of all
life's information.

In his latest set of experiments, Chris Dwyer, assistant professor of electrical and computer
engineering at Duke's Pratt School of Engineering, demonstrated that by simply mixing
customized snippets of DNA and other molecules, he could create literally billions of identical,
tiny, waffle-looking structures.

Dwyer has shown that these nanostructures will efficiently self-assemble, and when different
light-sensitive molecules are added to the mixture, the waffles exhibit unique and
"programmable" properties that can be readily tapped. Using light to excite these molecules,
known as chromophores, he can create simple logic gates, or switches.

These nanostructures can then be used as the building blocks for a variety of applications,
ranging from the biomedical to the computational.

"When light is shined on the chromophores, they absorb it, exciting the electrons," Dwyer said.
"The energy released passes to a different type of chromophore nearby that absorbs the energy
and then emits light of a different wavelength. That difference means this output light can be
easily differentiated from the input light, using a detector."

Instead of conventional circuits using electrical current to rapidly switch between zeros and ones, or yes and no, light can be used to stimulate similar responses from the DNA-based switches -- and much faster.
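As a rough illustration of how a chromophore cascade could act as a switch, the toy function below models a light-driven AND gate: the final acceptor emits only when both input excitations are present to drive the energy-transfer chain. The wavelengths and the choice of gate are assumptions for illustration, not the specific designs reported in the paper.

```python
def chromophore_and(excite_450nm, excite_520nm):
    """Toy model of a chromophore-based AND gate.

    excite_450nm, excite_520nm: booleans, whether each (hypothetical) input
    wavelength is shining on its donor chromophore.
    Returns the (hypothetical) emitted output wavelength in nm, or None if
    the acceptor stays dark.
    """
    donor_a_excited = excite_450nm  # donor A absorbs the 450 nm input
    donor_b_excited = excite_520nm  # donor B absorbs the 520 nm input
    # Energy reaches the emitting acceptor only via the complete transfer
    # chain, i.e. when both donors are excited -- the AND condition.
    return 650 if (donor_a_excited and donor_b_excited) else None
```

Because the output wavelength (650 nm here) differs from both inputs, a detector can distinguish the gate's output from the light used to drive it, which is the point Dwyer makes above.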

"This is the first demonstration of such an active and rapid processing and sensing capacity at the
molecular level," Dwyer said. The results of his experiments were published online in the journal
Small. "Conventional technology has reached its physical limits. The ability to cheaply produce
virtually unlimited supplies of these tiny circuits seems to me to be the next logical step."

DNA is a well-understood molecule made up of pairs of complementary nucleotide bases that have an affinity for each other. Customized snippets of DNA can be synthesized cheaply by putting the pairs in any order. In their experiments, the researchers took advantage of DNA's natural ability to latch onto corresponding and specific areas of other DNA snippets.

Dwyer used a jigsaw puzzle analogy to describe what happens when all the waffle ingredients are mixed together in a container.

"It's like taking pieces of a puzzle, throwing them in a box and as you shake the box, the pieces
gradually find their neighbors to form the puzzle," he said. "What we did was to take billions of
these puzzle pieces, throwing them together, to form billions of copies of the same puzzle."
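The "pieces find their neighbors" behavior rests on Watson-Crick complementarity: a snippet binds a target only where it matches the target's reverse complement. A minimal sketch, ignoring partial matches, binding energies and secondary structure:

```python
# Watson-Crick base-pairing rules.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Base-by-base complement of a DNA strand."""
    return "".join(COMPLEMENT[base] for base in strand)

def binds(snippet, target):
    """True if `snippet` can base-pair with `target`.

    Strands pair antiparallel, so a snippet binds a target when it equals
    the target's reverse complement -- this specificity is what lets each
    'puzzle piece' find its unique neighbor in the mixture.
    """
    return snippet == complement(target)[::-1]
```

Because each designed snippet binds only its intended partner, billions of copies of the same structure can assemble in parallel simply by shaking the "box."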

In the current experiments, the waffle puzzle had 16 pieces, with the chromophores located atop
the waffle's ridges. More complex circuits can be created by building structures composed of
many of these small components, or by building larger waffles. The possibilities are limitless,
Dwyer said.

In addition to their use in computing, Dwyer said that since these nanostructures are basically
sensors, many biomedical applications are possible. Tiny nanostructures could be built that could
respond to different proteins that are markers for disease in a single drop of blood.

Dwyer's research is supported by the National Science Foundation, the Air Force Research
Laboratory, the Defense Advanced Research Projects Agency and the Army Research Office.
Other members of the Duke team were Constantin Pistol, Vincent Mao, Viresh Thusu and Alvin
Lebeck.

Story Source:

The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials
provided by Duke University, via EurekAlert!, a service of AAAS.

Journal Reference:

1. Constantin Pistol, Vincent Mao, Viresh Thusu, Alvin R. Lebeck, Chris Dwyer. Encoded Multichromophore Response for Simultaneous Label-Free Detection. Small, 2010; 6 (7); DOI: 10.1002/smll.201090020

Virtual Museum Guide
ScienceDaily (Feb. 15, 2010) — Archaeological treasures are brought to life by Fraunhofer
software. Real images are enriched with digital information on a virtual tour through ancient
buildings, creating a more vivid experience for the museum visitor.

Many visitors would like to embark on a virtual time journey into the past. Researchers have
already set the stage for just such a journey, as exemplified by a recent exhibition in the Allard
Pierson Museum in Amsterdam, where visitors could take a stroll through historical sites. A flat
screen on a rotating column stood beside the many art works, showing an extract of the image on
the wall -- a gigantic black and white photo of the Roman Forum ruins. When the column is
rotated to the left, this correspondingly changes what the viewer sees. A camera connected to the
back of the movable display provides information about the new view appearing on the monitor
-- in this case, the Temple of Saturn ruins. At the same time, a digital animation shows what the
temple might have looked like when intact. If the screen is rotated further, it displays
information, pictures and videos about other ancient buildings, including the Colosseum.

The sophisticated animation is based on software developed by the Fraunhofer Institute for
Computer Graphics Research IGD in Darmstadt. "We have taught the computer to recognize the
image," explains Fraunhofer IGD researcher, Michael Zöllner. "The program knows where the
center of the camera is pointing and can superimpose the relevant overlay -- a text, video or
animation." The original image can always be clearly seen under the overlays, so that visitors
always know where they are on the virtual tour. Experts refer to this technology as augmented reality.

The Fraunhofer IGD software in the museum currently runs on a mini-computer, controlled via a
touch screen. This handy console clearly indicates a trend towards mobile, virtual guidebooks.
When tourists hold their consoles in front of a baroque prince's palace, the relevant customized information will appear immediately on their screens. Fraunhofer IGD researchers
have tested this vision in practice in the iTACITUS project, in which Zöllner's team programmed
a portable computer to act as an electronic tourist guide for the Royal Palace of Venaria near
Turin. New mobile phone technology could accelerate acceptance of augmented reality. "The
smart phone means that augmented reality is at last suitable for the mass market," Zöllner says.
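Setting aside the image recognition itself, the core overlay logic, choosing which content to superimpose for a given camera direction, can be sketched as a nearest-direction lookup. The directions and exhibit labels below are invented for illustration; the Fraunhofer software recognizes the image content rather than relying on an angle alone.

```python
def pick_overlay(heading_deg, overlays):
    """Return the overlay whose registered direction is nearest the camera
    heading. `overlays` maps a direction in degrees to a content label
    (text, video or animation to superimpose)."""
    def angular_dist(a, b):
        # Shortest distance between two compass headings, wrapping at 360.
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(overlays.items(),
               key=lambda kv: angular_dist(kv[0], heading_deg))[1]
```

For example, with overlays registered at 0, 90 and 270 degrees, a camera heading of 85 degrees would select the 90-degree content, so rotating the column smoothly walks the visitor from one superimposed reconstruction to the next.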
Intelligent Therapies With Virtual Reality for the Psychological Treatment of Patients Suffering from Fibromyalgia

ScienceDaily (May 24, 2010) — Researchers of the Labpsitec at the Universitat Jaume I of Castellon (UJI) and the LabHuman Group at the Universidad Politecnica of Valencia (UPV) and the University of Valencia (UVEG) have developed a new therapy based on the use of mobile devices and virtual reality for the psychological treatment of patients suffering from fibromyalgia.

This therapy is currently being validated by researchers of the UJI and the University of the Balearic Islands (UIB) with a group of 24 patients, and relies on the essential collaboration of the Rheumatology Department of the Hospital General of Castellón, supervised by Dr Belmonte.

Fibromyalgia is a complex and chronic pain syndrome which causes generalized pain and deep exhaustion, among other symptoms. It is a serious public health problem, more common among adult women, which causes significant negative psychological effects. In fact, 35% of affected patients suffer from depression and anxiety.

"Our aim is to achieve that woman patients learn strategies to face the pain which are an
alternative to those they use and which are adaptive in order to improve their physical and mental
state and their quality of life," points out Beatriz Rey, researcher of the LabHuman of the UPV.
The method developed by the researchers is made of three applications. The first one is an
evaluation system of the chronic pain key factors through mobile devices. It is based on a
commercial PDA and a made-to-measure device. The device monitors the degree of physical
activity (accelerometer) and communicates with the PDA via Bluethooth.

The PDA runs an application that presents questions the patient has to answer three times a week: intensity of pain (on a scale from 0 to 10), intensity of fatigue (on a scale from 0 to 10) and mood (on a scale from 1 to 7; in this case, the application shows a series of emoticons). The answers to the three questions are stored on the PDA. When the user goes to the medical office, the PDA can be synchronized with the doctor's computer and the data can be stored on a server.
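The three self-report scales could be captured and range-checked with a small record type like the following sketch (the class and field names are assumptions for illustration, not part of the actual PDA application):

```python
from dataclasses import dataclass

@dataclass
class SymptomEntry:
    """One self-report entry, using the scales described above."""
    pain: int      # intensity of pain, 0-10
    fatigue: int   # intensity of fatigue, 0-10
    mood: int      # mood, 1-7 (chosen from a series of emoticons)

    def __post_init__(self):
        # Reject out-of-range answers at entry time, before they are
        # stored on the PDA or synchronized to the server.
        if not 0 <= self.pain <= 10:
            raise ValueError("pain must be on a 0-10 scale")
        if not 0 <= self.fatigue <= 10:
            raise ValueError("fatigue must be on a 0-10 scale")
        if not 1 <= self.mood <= 7:
            raise ValueError("mood must be on a 1-7 scale")
```

Validating at the point of entry keeps the thrice-weekly records consistent, so the clinician sees comparable numbers across sessions.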

A new version of the virtual reality system EMMA, which works together with this system, has been designed to induce positive emotions in patients. "The psychologist supervises the group sessions using a single-screen projection system," points out Azucena García-Palacios, researcher at the Labpsitec of the UJI.

Those sessions are carefully guided and use contents (texts, sounds, videos, music, images, etc.) selected to induce positive emotions. The therapist is present during the session and guides its development. During each session, the system helps the patients set a feasible objective they must fulfil before taking part in the next one. To evaluate the system, patients will follow a three-week treatment with two sessions a week.

The therapy also includes a telepsychology application (intelligent therapy) for mobile devices, so that patients can continue the treatment outside the doctor's office, for example from home. "The application runs on the PDA and also allows patients to watch videos on the screen. The videos are fragments of the treatment sessions with EMMA, which are used to induce positive emotions across sessions," points out Rosa Baños of the UVEG.

Computer Intelligence Predicts Human Visual Attention for First Time

ScienceDaily (June 17, 2010) — Scientists have just come several steps closer to understanding
change blindness -- the well studied failure of humans to detect seemingly obvious changes to
scenes around them -- with new research that used a computer-based model to predict what types
of changes people are more likely to notice.

These findings on change blindness were presented in the Journal of Vision.

"This is one of the first applications of computer intelligence to help study human visual
intelligence, " said author Peter McOwan, professor at Queen Mary, University of London. "The
biologically inspired mathematics we have developed and tested can have future uses in letting
computer vision systems such as robots detect interesting elements in their visual environment."

During the study, participants were asked to spot the differences between pre-change and post-change versions of a series of pictures. Some of these pictures had elements added or removed, or colors altered, with the location of the change based on attention-grabbing properties (this is the "salience" level referred to in the article).

Unlike previous research where scientists studied change blindness by manually manipulating
such pictures and making decisions about what and where to make a change, the computer model
used in this study eliminated any human bias. The research team at Queen Mary's School of
Electronic Engineering and Computer Science developed an algorithm that let the computer
"decide" how to change the images that study participants were asked to view.

While the experiments confirmed that change blindness can be predicted using this model, the
tests also showed that the addition or removal of an object from the scene is detected more
readily than changes in the color of the object, a result that surprised the scientists. "We expected
a color change to be a lot easier to spot, since color plays such an important role in our day-to-
day lives and visual perception," said lead researcher Milan Verma of Queen Mary.

The authors suggest that the computer-based approach will be useful in designing essential displays, such as road signs and emergency-services, security and surveillance displays, that must draw attention to a change or to a part of the display requiring immediate attention.

"We live in a world in which we are immersed in visual information," explained Verma. "The
result is a huge cognitive burden which may hinder our ability to complete a given task. This
study is an important step toward understanding how visual information is processed and how we
can go about optimizing the presentation of visual displays."

Story Source:

The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials
provided by Association for Research in Vision and Ophthalmology, via EurekAlert!, a
service of AAAS.

Journal Reference:

1. M. Verma, P. W. McOwan. A semi-automated approach to balancing of bottom-up salience for predicting change detection performance. Journal of Vision, 2010; 10 (6): 3; DOI: 10.1167/10.6.3
