Programming smart molecules: Machine-learning algorithms could make chemical reactions intelligent

Dec 12, 2013
Ryan P. Adams is an assistant professor of computer science at Harvard SEAS. Credit: Eliza Grinnell, SEAS Communications.

Computer scientists at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard University have joined forces to put powerful probabilistic reasoning algorithms in the hands of bioengineers.

In a new paper presented at the Neural Information Processing Systems conference on December 7, Ryan P. Adams and Nils Napp have shown that an important class of artificial intelligence algorithms could be implemented using chemical reactions.

These algorithms, which use a technique called "message passing inference on factor graphs," are a mathematical coupling of ideas from graph theory and probability. They represent the state of the art in machine learning and are already critical components of everyday tools ranging from search engines and fraud detection to error correction in mobile phones.
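As a concrete illustration (not taken from the paper), the short Python sketch below runs sum-product message passing on a toy factor graph with two binary variables: a factor-to-variable message is formed by multiplying in the incoming message and summing out the other variable, and the resulting belief is the exact marginal because the graph is a tree. All factor values here are made up for illustration.

```python
# A minimal sketch of sum-product message passing on a two-variable factor
# graph: unary factors f_x, f_y and a pairwise factor f_xy (values are
# illustrative, not from the paper).
import numpy as np

f_x = np.array([0.7, 0.3])            # unary factor on binary variable X
f_y = np.array([0.4, 0.6])            # unary factor on binary variable Y
f_xy = np.array([[0.9, 0.1],          # pairwise factor f_xy[x, y]
                 [0.2, 0.8]])

# Variable-to-factor message from X is just its unary factor here.
msg_x_to_f = f_x

# Factor-to-variable message: multiply in the incoming message and
# marginalize (sum out) X.
msg_f_to_y = f_xy.T @ msg_x_to_f

# Belief at Y: product of incoming messages, then normalize.
belief_y = f_y * msg_f_to_y
belief_y /= belief_y.sum()
print("P(Y) =", belief_y)             # exact marginal, since the graph is a tree
```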

Adams and Napp's work demonstrates that some aspects of artificial intelligence (AI) could be implemented at microscopic scales using molecules. In the long term, the researchers say, such theoretical developments could open the door for "smart drugs" that can automatically detect, diagnose, and treat a variety of diseases using a cocktail of chemicals that can perform AI-type reasoning.

"We understand a lot about building AI systems that can learn and adapt at macroscopic scales; these algorithms live behind the scenes in many of the devices we interact with every day," says Adams, an assistant professor of computer science at SEAS whose Intelligent Probabilistic Systems group focuses on machine learning and computational statistics. "This work shows that it is possible to also build intelligent machines at tiny scales, without needing anything that looks like a regular computer. This kind of chemical-based AI will be necessary for constructing therapies that sense and adapt to their environment. The hope is to eventually have drugs that can specialize themselves to your personal chemistry and can diagnose or treat a range of pathologies."

Adams and Napp designed a tool that can take probabilistic representations of unknowns in the world (probabilistic graphical models, in the language of machine learning) and compile them into a set of chemical reactions that estimate quantities that cannot be observed directly. The key insight is that the dynamics of chemical reactions map directly onto the two types of computational steps that computer scientists would normally perform in silico to achieve the same end.
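The paper's compilation scheme itself is not reproduced here, but a toy example hints at why the mapping is natural: under mass-action kinetics, two interconverting species settle into steady-state fractions that equal a normalized probability distribution when the rate constants are set to unnormalized beliefs. The species, rates, and beliefs below are hypothetical; this is a sketch of the flavor of the idea, not the authors' construction.

```python
# Toy illustration (not the authors' construction): two species Y0 and Y1
# interconvert under mass-action kinetics, Y0 -> Y1 at rate b[1] and
# Y1 -> Y0 at rate b[0].  At steady state the species fractions equal the
# normalized beliefs b / b.sum().
import numpy as np

b = np.array([0.28, 0.12])        # hypothetical unnormalized beliefs
y = np.array([1.0, 0.0])          # initial concentrations of Y0, Y1
dt = 0.01                         # simple forward-Euler integration of the ODE
for _ in range(100_000):
    flux = b[1] * y[0] - b[0] * y[1]   # net conversion rate Y0 -> Y1
    y += dt * np.array([-flux, flux])

print("steady-state fractions:", y / y.sum())   # approaches [0.7, 0.3]
print("normalized beliefs    :", b / b.sum())   # [0.7, 0.3]
```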

This insight opens up interesting new questions for computer scientists working on statistical machine learning, such as how to develop novel algorithms and models that are specifically tailored to tackling the uncertainty molecular engineers typically face. In addition to the long-term possibilities for smart therapeutics, it could also open the door for analyzing natural biological reaction pathways and regulatory networks as mechanisms that are performing statistical inference. Just like robots, biological cells must estimate external environmental states and act on them; designing artificial systems that perform these tasks could give scientists a better understanding of how such problems might be solved on a molecular level inside living systems.

"There is much ongoing research to develop chemical computational devices," says Napp, a postdoctoral fellow at the Wyss Institute, working on the Bioinspired Robotics platform, and a member of the Self-organizing Systems Research group at SEAS. Both groups are led by Radhika Nagpal, the Fred Kavli Professor of Computer Science at SEAS and a Wyss core faculty member. At the Wyss Institute, a portion of Napp's research involves developing new types of robotic devices that move and adapt like living creatures.

"What makes this project different is that, instead of aiming for general computation, we focused on efficiently translating particular algorithms that have been successful at solving difficult problems in areas like robotics into molecular descriptions," Napp explains. "For example, these algorithms allow today's robots to make complex decisions and reliably use noisy sensors. It is really exciting to think about what these tools might be able to do for building better molecular machines."

Indeed, the field of machine learning is revolutionizing many areas of science and engineering. The ability to extract useful insights from vast amounts of weak and incomplete information is not only fueling the current interest in "big data," but has also enabled rapid progress in more traditional disciplines such as computer vision, estimation, and robotics, where data are available but difficult to interpret. Bioengineers often face similar challenges, as many molecular pathways are still poorly characterized and available data are corrupted by random noise.

Machine learning can overcome these challenges by modeling the dependencies between random variables and using those dependencies to extract and accumulate the small amounts of information that each random event provides.
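As a simplified picture of that accumulation, the sketch below combines a sequence of weak, noisy binary readings of a hidden state through repeated Bayesian updates; the sensor accuracy and the readings are hypothetical.

```python
# Minimal sketch of accumulating weak evidence: each noisy binary reading is
# only slightly informative, but repeated Bayesian updates concentrate the
# posterior on the hidden state.  Sensor model and readings are hypothetical.
import numpy as np

p_correct = 0.6                          # each reading is right only 60% of the time
readings = [1, 1, 0, 1, 1, 1, 0, 1]      # hypothetical observations

posterior = np.array([0.5, 0.5])         # uniform prior over hidden state {0, 1}
for r in readings:
    # Likelihood of this reading under each possible hidden state.
    likelihood = np.where(np.arange(2) == r, p_correct, 1 - p_correct)
    posterior = posterior * likelihood   # fold in the new evidence
    posterior /= posterior.sum()         # renormalize

print("P(hidden state | readings) =", posterior)
```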

"Probabilistic graphical models are particularly efficient tools for computing estimates of unobserved phenomena," says Adams. "It's very exciting to find that these tools map so well to the world of cell biology."

User comments (2)


11791 (Dec 12, 2013):
Has anyone read Blood Music? There are dangers of misusing the technology discussed here. Nanomachines and "germs that think" can do us in and make the human race extinct.
Huns (Dec 13, 2013):
That's true, but the tech will come one way or another. We have to work on figuring out some kind of countermeasures because it's only a matter of time before it falls into the hands of someone who wants to murder the planet, or simply makes a mistake and creates "gray goo."