Welcome to the desk of an Engineer......

For an optimist the glass is half full, for a pessimist it’s half empty, and for an engineer it’s twice as big as it needs to be.

Wednesday, May 30, 2012

Ion-Based Electronic Chip to Control Muscles: Entirely New Circuit Technology Based On Ions and Molecules

Klas Tybrandt, doctoral student in organic electronics at Linköping University, Sweden, has developed an integrated chemical chip.

The chemical chip can control the delivery of the neurotransmitter acetylcholine. This enables chemical control of muscles, which are activated when they come into contact with acetylcholine. (Credit: LiU/Ingemar Franzén)


The Organic Electronics research group at Linköping University previously developed ion transistors for transport of both positive and negative ions, as well as biomolecules. Tybrandt has now succeeded in combining both transistor types into complementary circuits, in a similar way to traditional silicon-based electronics.
An advantage of chemical circuits is that the charge carriers are chemical substances with functions of their own. This opens new opportunities to control and regulate the signal paths of cells in the human body.
"We can, for example, send out signals to muscle synapses where the signalling system may not work for some reason. We know our chip works with common signalling substances, for example acetylcholine," says Magnus Berggren, Professor of Organic Electronics and leader of the research group.
The development of ion transistors, which can control and transport ions and charged biomolecules, was begun three years ago by Tybrandt and Berggren, respectively a doctoral student and professor in Organic Electronics at the Department of Science and Technology at Linköping University. The transistors were then used by researchers at Karolinska Institutet to control the delivery of the signalling substance acetylcholine to individual cells. The results were published in the Proceedings of the National Academy of Sciences.
In conjunction with Robert Forchheimer, Professor of Information Coding at LiU, Tybrandt has now taken the next step by developing chemical chips that also contain logic gates, such as NAND gates that allow for the construction of all logical functions.
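The claim that NAND gates suffice to build all logical functions is a standard result (NAND is "functionally complete"), and can be illustrated with a short sketch. This is a generic demonstration of NAND universality, not a model of the chemical chip itself:

```python
# NAND is functionally complete: NOT, AND, OR and XOR can all be
# expressed using nothing but NAND gates.
def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)            # NAND with both inputs tied together

def and_(a, b):
    return not_(nand(a, b))      # invert NAND to recover AND

def or_(a, b):
    return nand(not_(a), not_(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)

def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))  # classic four-NAND XOR
```

Since any Boolean function has a truth table, and truth tables can be realized with AND, OR and NOT, a chip that implements NAND can in principle realize any logic function.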
His breakthrough creates the basis for an entirely new circuit technology based on ions and molecules instead of electrons and holes.

Monday, May 28, 2012

Gestures Fulfill a Big Role in Language

People of all ages and cultures gesture while speaking, some much more noticeably than others. But is gesturing uniquely tied to speech, or is it, rather, processed by the brain like any other manual action?


Scientists have discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain’s understanding of speech than simply gesturing as if stirring a spoon in a cup. (Credit: Image courtesy of Acoustical Society of America (ASA))
A U.S.-Netherlands research collaboration delving into this tie discovered that actual actions on objects, such as physically stirring a spoon in a cup, have less of an impact on the brain's understanding of speech than simply gesturing as if stirring a spoon in a cup. This is surprising because there is less visual information contained in gestures than in actual actions on objects. In short: Less may actually be more when it comes to gestures and actions in terms of understanding language.
Spencer Kelly, associate professor of Psychology, director of the Neuroscience program, and co-director of the Center for Language and Brain at Colgate University, and colleagues from the National Institutes of Health and Max Planck Institute for Psycholinguistics will present their research at the Acoustics 2012 meeting in Hong Kong, May 13-18, a joint meeting of the Acoustical Society of America (ASA), Acoustical Society of China, Western Pacific Acoustics Conference, and the Hong Kong Institute of Acoustics.
Among their key findings is that gestures -- more than actions -- appear to make people pay attention to the acoustics of speech. When we see a gesture, our auditory system expects to also hear speech. But this is not what the researchers found in the case of manual actions on objects.
Just think of all the actions you've seen today that occurred in the absence of speech. "This special relationship is interesting because many scientists have argued that spoken language evolved from a gestural communication system -- using the entire body -- in our evolutionary past," points out Kelly. "Our results provide a glimpse into this past relationship by showing that gestures still have a tight and perhaps special coupling with speech in present-day communication. In this way, gestures are not merely add-ons to language -- they may actually be a fundamental part of it."
A better understanding of the role hand gestures play in how people understand language could lead to new audio and visual instruction techniques to help people overcome major challenges with language delays and disorders or learning a second language.
What's next for the researchers? "We're interested in how other types of visual inputs, such as eye gaze, mouth movements, and facial expressions, combine with hand gestures to impact speech processing. This will allow us to develop even more natural and effective ways to help people understand and learn language," says Kelly.

It's in the Genes: Research Pinpoints How Plants Know When to Flower

Determining the proper time to flower, important if a plant is to reproduce successfully, involves a sequence of molecular events, a plant's circadian clock and sunlight.

Understanding how flowering works in the simple plant used in this study -- Arabidopsis -- should lead to a better understanding of how the same genes work in more complex plants grown as crops such as rice, wheat and barley, according to Takato Imaizumi, a University of Washington assistant professor of biology and corresponding author of a paper in the May 25 issue of the journal Science.
"If we can regulate the timing of flowering, we might be able to increase crop yield by accelerating or delaying this. Knowing the mechanism gives us the tools to manipulate this," Imaizumi said. Along with food crops, the work might also lead to higher yields of plants grown for biofuels.
At specific times of year, flowering plants produce a protein known as FLOWERING LOCUS T in their leaves that induces flowering. Once this protein is made, it travels from the leaves to the shoot apex, a part of the plant where cells are undifferentiated, meaning they can either become leaves or flowers. At the shoot apex, this protein starts the molecular changes that send cells on the path to becoming flowers.
Changes in day length tell many organisms that the seasons are changing. It has long been known that plants use an internal time-keeping mechanism known as the circadian clock to measure changes in day length. Circadian clocks synchronize biological processes during 24-hour periods in people, animals, insects, plants and other organisms.
Imaizumi and the paper's co-authors investigated what's called the FKF1 protein, which they suspected was a key player in the mechanism by which plants recognize seasonal change and know when to flower. FKF1 protein is a photoreceptor, meaning it is activated by sunlight.
"The FKF1 photoreceptor protein we've been working on is expressed in the late afternoon every day, and is very tightly regulated by the plant's circadian clock," Imaizumi said. "When this protein is expressed during days that are short, this protein cannot be activated, as there is no daylight in the late afternoon. When this protein is expressed during a longer day, this photoreceptor makes use of the light and activates the flowering mechanisms involving FLOWERING LOCUS T. The circadian clock regulates the timing of the specific photoreceptor for flowering. That is how plants sense differences in day length."
This system keeps plants from flowering when it's a poor time to reproduce, such as the dead of winter when days are short and nights are long.
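The mechanism Imaizumi describes is a form of coincidence detection: the clock expresses FKF1 in a fixed late-afternoon window, and flowering is promoted only if daylight overlaps that window. A toy sketch of that logic, with entirely made-up hours (the window and day-length numbers below are illustrative assumptions, not measured values):

```python
# Toy coincidence-detection model of day-length sensing.
# The circadian clock expresses the FKF1 photoreceptor in a fixed
# late-afternoon window; the flowering signal fires only when daylight
# overlaps that window. Hours used here are hypothetical.
FKF1_WINDOW = range(16, 20)          # assumed clock-driven expression, hours 16-19

def daylight_hours(day_length):
    """Hours of daylight, centred on noon (hour 12)."""
    start = 12 - day_length // 2
    return range(start, start + day_length)

def promotes_flowering(day_length):
    """True if the FKF1 expression window overlaps daylight."""
    light = set(daylight_hours(day_length))
    return any(hour in light for hour in FKF1_WINDOW)

# Short winter day: FKF1 is expressed after dark, so it is never activated.
# Long summer day: the window falls in daylight, activating the
# FLOWERING LOCUS T pathway.
```

In this sketch an 8-hour winter day never triggers the pathway, while a 16-hour summer day does, mirroring how a long-day plant like Arabidopsis distinguishes the seasons.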
The new findings come from work with the plant Arabidopsis, a small plant in the mustard family that's often used in genetic research. They validate predictions from a mathematical model of the mechanism that causes Arabidopsis to flower that was developed by Andrew Millar, a University of Edinburgh professor of biology and co-author of the paper.
"Our mathematical model helped us to understand the operating principles of the plants' day-length sensor," Millar said. "Those principles will hold true in other plants, like rice, where the crop's day-length response is one of the factors that limits where farmers can obtain good harvests. It's that same day-length response that needs controlled lighting for laying chickens and fish farms, so it's just as important to understand this response in animals.
"The proteins involved in animals are not yet so well understood as they are in plants but we expect the same principles that we've learned from these studies to apply."
First author on the paper is Young Hun Song, a postdoctoral researcher in Imaizumi's UW lab. The other co-authors are Benjamin To, who was a UW undergraduate student when this work was being conducted, and Robert Smith, a University of Edinburgh graduate student. The work was funded by the National Institutes of Health, and the United Kingdom's Biotechnology and Biological Sciences Research Council.

Saturday, May 26, 2012

Bacterial Trick Keeps Robots in Sync

You don’t have to watch Dancing with the Stars to know that keeping in sync is tough — and it’s even tougher for a robot. A new approach keeps several robots in step, and even enables a dancing robot that loses its footing to seamlessly rejoin its synchronized peers.
One way to synchronize a group of robots is for each to communicate with one another about their positions, but distance between the robots can lead to time delays. And when many robots are involved, the complexity of this communication network grows. To skirt such problems, researchers from MIT have taken inspiration from bacteria that synchronize their behavior not by checking in with each other, but by checking in with their environment.

Synchronizing robots this way might work well in rescue operations where robots are damaged and need to be replaced, says Paola Flocchini, a distributed computing expert at the University of Ottawa in Canada.
Many bacteria coordinate via a process called quorum sensing, releasing a steady stream of signaling molecules into the environment and also sensing the signaling molecules. When enough bacteria are around that the local concentration of these molecules soars, it’s time for group action: Genes get turned on, molecular switches are flipped and the bacteria all change their behavior in sync.
Similarly, MIT’s Jean-Jacques Slotine and Patrick Bechon coordinated the behavior of eight dancing humanoid robots by having the bots send information to — and get information from — an external computer server. The work was posted May 14 on arXiv.org.
The robots go through cycles of prescribed actions, such as bobbing their heads, and send the server information about where they are in these cycles. The server then sends the average of this information back to all the robots. So a robot joining its dancing peers will check in with the server about what the other robots are doing. It can then calculate what the next movement is in the synchronized cycle and rejoin the group. Information about the music — in the test case, Michael Jackson’s “Thriller” — is also embedded in the information sent back to the robots.
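The averaging scheme can be sketched in a few lines. This is a minimal illustration of phase synchronization through a shared mean, not the authors' actual implementation; the update gain and cycle speed are arbitrary assumptions:

```python
import math

def circular_mean(phases):
    """Mean of phases on a cycle (radians), via vector averaging.
    This is what the server broadcasts back to every robot."""
    x = sum(math.cos(p) for p in phases)
    y = sum(math.sin(p) for p in phases)
    return math.atan2(y, x)

def step(phases, omega=0.2, gain=0.5):
    """One update round: each robot advances through its movement cycle
    (omega) and nudges its phase toward the broadcast mean (gain)."""
    mean = circular_mean(phases)
    updated = []
    for p in phases:
        # Shortest signed angular distance from p to the mean.
        err = math.atan2(math.sin(mean - p), math.cos(mean - p))
        updated.append((p + omega + gain * err) % (2 * math.pi))
    return updated

# Three robots; the third has "fallen" and is out of step.
phases = [0.0, 0.1, 2.5]
for _ in range(30):
    phases = step(phases)
# After repeated rounds the straggler has rejoined the synchronized cycle.
```

The appeal of the scheme, as with quorum sensing, is that no robot talks to any other robot directly: each one only reads and writes the shared signal, so robots can drop out and rejoin without reconfiguring a communication network.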
Incorporating math that describes the oscillating movements of body parts, such as arms and heads, is quite clever, says Mehran Mesbahi of the University of Washington in Seattle, whose research includes spacecraft navigation and control. It’s much harder to incorporate information on position, angles and music, he says, than to have a simple command such as “March.”