How to talk a computer into creating a program for you


With Apple's Siri and other voice-recognition software becoming commonplace, you might take it for granted that we can now talk to computers. Yet as dependable as these systems have become, you cannot get them to do anything they are not already programmed to do. Now Regina Barzilay and colleagues at the Massachusetts Institute of Technology have talked a computer into writing new software.

Their system takes a task described in natural language and automatically generates the computer code to carry it out – an important first step toward allowing people who are not familiar with computer code to program computers. "It won't replace the need for programmers, but it can help with specific programming tasks," says Barzilay.
The team focused on a common problem – writing software that reads the input given to a computer. By generating this code automatically, programmers are freed up to write the parts of software that require more creativity.
Code that checks input is at the heart of web forms, spreadsheets and databases. The challenge is to specify what kind of input is allowed. When you log in to a website, for example, software code checks that what you type matches the required format for a password or email address. An email address must consist of letters and/or digits, then an @ symbol, more letters and/or digits, and end with ".com" or ".co.uk" or similar.
Barzilay's team developed a technique that takes a natural-language description of the required input and automatically generates the code to check for that input. The system works by extracting noun phrases – such as "one or more letters and digits" and "an 'at' symbol" – and building code accordingly.
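The kind of input-checking code the article describes can be illustrated with a short sketch. This is not the MIT system's actual output – just a hand-written example of a checker assembled from the noun phrases in the email-address description above ("letters and/or digits", "an @ symbol", a ".com" or ".co.uk"-style ending).

```python
import re

# Pattern built piece by piece from the phrases in the description.
# Each fragment corresponds to one noun phrase.
EMAIL_PATTERN = re.compile(
    r"[A-Za-z0-9]+"         # one or more letters and/or digits
    r"@"                    # an @ symbol
    r"[A-Za-z0-9]+"         # more letters and/or digits
    r"(\.[A-Za-z]{2,3})+"   # ends with ".com", ".co.uk" or similar
)

def is_valid_email(text: str) -> bool:
    """Return True if the input matches the simplified email format."""
    return EMAIL_PATTERN.fullmatch(text) is not None
```

A real generated checker would of course handle many more formats than this toy pattern, but the structure – one code fragment per extracted phrase – is the idea the researchers exploit.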

Imprecise and ambiguous

They tested it with 106 natural-language descriptions of different input formats taken from ACM International Collegiate Programming Contests – input formats designed to challenge coders. They found that their system could automatically generate correct software for over 70 per cent of these descriptions.
The work will be presented in August at the annual meeting of the Association for Computational Linguistics in Sofia, Bulgaria.
Robert Chatley at software development consultancy Develogical in London agrees that generating code from natural language can be useful. But he notes that we are a long way from doing this for more complex tasks. "Natural language can often be imprecise, ambiguous and idiomatic, none of which are handled well by computers," he says.

Fake smile in a mirror makes you buy what you try on



"Does my smile look big in this?" Future fitting-room mirrors in clothing stores could subtly tweak your reflection to make you look – and hence feel – happier, encouraging you to like what you see.
That's the idea behind the Emotion Evoking System developed by Shigeo Yoshida and colleagues at the University of Tokyo in Japan. The system can manipulate your emotions and personal preferences by presenting you with an image of your own smiling or frowning face.

The idea that physiological changes can drive emotional ones – that laughter comes before happiness, rather than the other way around – is well established.
The researchers wanted to see if this idea could be used to build a computer system that manipulates how you feel. The system works by presenting the user with a webcam image of his or her face – as if they were looking in a mirror. The image is then subtly altered with software, turning the corners of the mouth up or down and changing the area around the eyes, so that the person appears to smile or frown.
Without telling them the aim of the study, the team recruited 21 volunteers and asked them to sit in front of the screen while performing an unrelated task. When the task was complete the participants rated how they felt. When the faces on screen appeared to smile, people reported that they felt happier. Conversely, when the image was given a sad expression, they reported feeling less happy.

As you like it

Yoshida and his colleagues then tested whether manipulating the volunteers' emotional state would influence their preferences. Each person was given a scarf to wear and again presented with the altered webcam image. The volunteers who saw themselves smiling while wearing the scarf were more likely to report that they liked it, and those who saw themselves not smiling were less likely.
The system could be used to manipulate consumers' impressions of products, say the researchers. For example, mirrors in clothing-store fitting rooms could be replaced with screens showing altered reflections. They also suggest people may be more likely to find clothes attractive if they see themselves looking happy while trying them on.
"It's certainly an interesting area," says Chris Creed at the University of Birmingham, UK. But he notes that using such technology in a shop would be harder than in the lab, because people will use a wide range of expressions. "Attempting to make slight differences to these and ensuring that the reflected image looks believable would be much more challenging," he says.
Of course, there are also important ethical questions surrounding such subtly manipulative technology. "You could argue that if it makes people happy what harm is it doing?" says Creed. "On the other hand, I can imagine that many people may feel manipulated, uncomfortable and cheated if they found out."

NASA's upcoming astronaut capsule has hints of Apollo




A flashback to the space future (Image: NASA/Robert Markowitz)
For an out-of-this-world commute, you need a perfectly tricked-out vehicle. With sky-blue LED lighting and seating for seven, this space capsule certainly fits the bill.
This photo gives a glimpse inside the CST-100, a commercial crew capsule being built by Boeing with support from NASA, which aims to restore the US's ability to independently launch astronauts into space.
The full-scale mock-up of the capsule recently underwent a day-long series of tests by two NASA astronauts. The purpose of the tests was to see how the astronauts were able to work with the space and equipment available before the design is finalised.
Don't be fooled by its retro, Apollo-like exterior appearance – the CST-100 uses the latest technology, including enhanced thermal protection for that long drop back through the atmosphere and touchscreen tablets to replace the sea of buttons seen in space capsules of yore.
"What you're not going to find is 1100 or 1600 switches," says Chris Ferguson, a former astronaut and director of Boeing's commercial crew development programme. "We don't want to burden [the astronauts] with an inordinate amount of training to fly this vehicle. We want it to be intuitive."
The project is funded by NASA in its bid to get the US back in the astronaut transport business after it retired the shuttle programme in July 2011. Currently, US astronauts are dependent on Russia's Soyuz capsules. The US forks out $71 million (£46 million) per seat to reach the International Space Station.
Boeing has plans to test the CST-100 in 2016 in a crewed, three-day orbital flight, riding an Atlas V rocket into space. The capsule will attempt to dock with the ISS in 2017 – as long as NASA gets the funding from the US Congress.
NASA is also funding the development of Boeing's rivals: the Sierra Nevada Corporation's Dream Chaser spaceplane and SpaceX's Dragon spacecraft. Dragon is already powering ahead, charged with delivering crucial supplies to the ISS, having first successfully docked with it in May 2012.

Sensor knows when you're lying through your teeth


A sensor embedded in a tooth could one day tell doctors when people have defied medical advice to give up smoking or eat less. Built into a tiny circuit board that fits in a tooth cavity, the sensor includes an accelerometer that sends data on mouth motion to a smartphone.

Machine learning software is taught to recognise each telltale jaw motion pattern, then works out how much of the time the patient is chewing, drinking, speaking, coughing or smoking.
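The classification step described above can be sketched in a few lines. This is an assumed, simplified illustration rather than the Taipei team's implementation: each window of accelerometer readings is summarised with two features (mean and variance of acceleration magnitude), and a new window is labelled with the closest activity "centroid" learned from labelled examples.

```python
import math

def features(window):
    """Mean and variance of acceleration magnitudes in one window
    of (x, y, z) accelerometer samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, var)

def train(labelled_windows):
    """Average the feature vectors for each activity label,
    giving one centroid per activity (chewing, speaking, ...)."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def classify(window, centroids):
    """Label a new window with the nearest activity centroid."""
    f = features(window)
    return min(centroids,
               key=lambda lab: (f[0] - centroids[lab][0]) ** 2
                             + (f[1] - centroids[lab][1]) ** 2)
```

A production system would use richer features and a proper classifier, but the pipeline – windowed sensor data, feature extraction, supervised labelling – is the standard shape of this kind of activity recognition.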
The inventors – Hao-hua Chu and colleagues at National Taiwan University in Taipei – want to use the mouth as a window on a variety of health issues. The device can be fitted into dentures or a dental brace, and the team plans to miniaturise it to fit in a cavity or crown.
The researchers say the sensor shows great promise: in tests on eight people with a prototype implant installed in their dentures, the system recognised oral activities correctly 94 per cent of the time.
The prototype was attached to a power source by an external wire, so the team still needs a way to include a microbattery.
Once they manage this, the researchers want to add a Bluetooth radio to the device. But as that is a microwave energy source – albeit a very low power one – Chu says medical experts are advising the team on how to ensure the implant would be safe.
If miniaturised and made wireless, the device has potential, says Trevor Johnson, vice-chair of research at the Faculty of General Dental Practice in the UK. "This could have a number of uses in dentistry, for example as a research tool, for monitoring patients who clench or grind their teeth, and for assessing the impact of various dental interventions," he says.