How is artificial intelligence changing science?

May 25, 2018, Intel

Intel's Gadi Singer believes his most important challenge is his latest: using artificial intelligence (AI) to reshape scientific exploration.

In a Q&A timed with the first Intel AI DevCon event, the Intel vice president and architecture general manager for its Artificial Intelligence Products Group discussed his role at the intersection of science—computing's most demanding customer—and AI, how scientists should approach AI and why it is the most dynamic and exciting opportunity he has faced.

Q. How is AI changing science?

Scientific exploration is going through a transition that, within the last 100 years, might only be compared to the shift of the '50s and '60s toward data and large data systems. By the '60s, the amount of data being gathered was so large that the frontrunners were no longer those with the finest instruments, but those best able to analyze the data in any scientific area, whether climate, seismology, biology, pharmaceuticals, the exploration of new medicines, and so on.

Today, data has grown to levels far exceeding the ability of people to pose particular queries or look for particular insights on their own. The combination of this data deluge with modern computing and deep learning techniques is providing new, and often far more disruptive, capabilities.

Q. What's an example?

One of them, which uses the basic strength of deep learning, is the identification of very faint patterns within a very noisy dataset, and even in the absence of an exact mathematical model of what you're looking for.

Think about cosmic events happening in a far galaxy, where you're looking for some characteristic of the phenomena in order to spot them in a very large dataset. This is an instance of searching without a known equation: you provide examples, and through them let the deep learning system learn what to look for and ultimately find the particular pattern.

Q. So you know what you're looking for, but you don't know how to find it?

You can't define the exact mathematical equation or the queries that describe it. The data is too large for trial and error, and earlier big-data analytics techniques lack sufficiently well-defined features to search for the pattern successfully.

You know what you're looking for because you tagged several examples of it in your data, and you can generally describe it. Deep learning can help you spot occurrences from such a class within a noisy multidimensional dataset.
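The idea can be sketched in a toy form. Below, a faint fixed "event signature" (hypothetical, standing in for a real astrophysical template) is buried in noise, and a minimal learned detector is trained purely from tagged examples; logistic regression is used here for brevity, where real work would use a deep network, but the principle is the same: the model is never told the pattern, only shown labeled data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a faint, fixed event signature buried in noise.
# Positives contain the faint template plus noise; negatives are pure noise.
dim = 20
template = 0.8 * np.sin(np.linspace(0.0, 2.0 * np.pi, dim))  # unknown to the learner

def make_dataset(n):
    labels = rng.integers(0, 2, size=n)
    noise = rng.normal(0.0, 1.0, size=(n, dim))
    signals = noise + labels[:, None] * template  # pattern added to positives only
    return signals, labels

X_train, y_train = make_dataset(2000)
X_test, y_test = make_dataset(500)

# A minimal learned detector (logistic regression via gradient descent),
# standing in for a deep network: it learns what to look for from tagged
# examples alone, with no explicit model of the signal.
w = np.zeros(dim)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))   # predicted event probability
    w -= lr * (X_train.T @ (p - y_train)) / len(y_train)
    b -= lr * np.mean(p - y_train)

pred = (X_test @ w + b) > 0.0
accuracy = np.mean(pred == y_test)
print(f"held-out detection accuracy: {accuracy:.2f}")
```

Despite each individual example being dominated by noise, the learned weights converge toward the hidden template, so the detector spots events well above chance on held-out data.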

Q. Are there other ways AI can change the scientific approach?

Another example is when you do have a mathematical model, such as a set of accurate equations. In this case you can use AI to achieve comparable results with as much as 10,000 times less time and computation.

Say you have a new molecular structure and you want to know how it's going to behave in some environment for pharma exploration. There are very good predictive models on how it will behave. The problem is that those models take a tremendous amount of computation and time—it might take you weeks to try just one combination.


In such a case, you can use a deep learning system to shadow the accurate system of equations. You iteratively feed sample cases into the equations and collect the results, which may arrive days later. The deep learning network learns the relationship between the inputs and the outputs without knowing the equation itself; it simply tracks it. It has been demonstrated in multiple cases that, after training with enough examples, the deep learning system shows excellent ability to predict the result the exact model would give. This translates to an efficiency that can turn hours or days into seconds.

Granted, sometimes the full computation will be required for ultimate model accuracy. However, that would only be needed for a small subset of cases. The fact that you can generate an accurate result so much faster with a fraction of the power and the time allows you to explore the potential solution space much faster.
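The surrogate workflow described above can be sketched in a few lines. Both pieces here are hypothetical stand-ins: an analytic function plays the slow, accurate simulation, and a polynomial least-squares fit plays the trained neural network; the point is the workflow, not the specific models.

```python
import numpy as np

def expensive_model(x):
    # Stand-in for a slow, accurate physical simulation (hypothetical;
    # a real one might take hours or days per batch of evaluations).
    return np.sin(3.0 * x) * np.exp(-0.5 * x)

# Step 1: run the accurate model on a modest number of sample inputs.
x_train = np.linspace(0.0, 2.0, 40)
y_train = expensive_model(x_train)

# Step 2: fit a cheap surrogate to the input/output pairs. A polynomial
# least-squares fit stands in here for a trained deep network.
coeffs = np.polyfit(x_train, y_train, deg=9)
surrogate = np.poly1d(coeffs)

# Step 3: sweep the solution space with the fast surrogate instead of the
# slow model; only promising candidates would be re-verified with the
# full computation for ultimate accuracy.
x_sweep = np.linspace(0.0, 2.0, 10_000)
max_error = np.max(np.abs(surrogate(x_sweep) - expensive_model(x_sweep)))
print(f"max surrogate error over the sweep: {max_error:.2e}")
```

The surrogate evaluates the whole sweep almost instantly while staying close to the accurate model, which mirrors the point above: explore the space cheaply, then spend the full computation on a small subset of candidates.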

In the last couple years, new machine learning methods have emerged for "learning how to learn." These technologies are tackling an almost-endless realm of options—like all the possible mutations in human DNA—and are using exploration and meta-learning techniques to identify the most relevant options to evaluate.

Q. What's the big-picture impact on the scientific method, or on the approach a scientist would take with AI?

Scientists need to partner with AI. They can greatly benefit from mastering its tools in order to explore phenomena that are less well defined, or when they need performance that is orders of magnitude faster to address a large solution space. Scientists can partner with machine learning to explore and investigate which new possibilities have the best likelihood of breakthroughs and new solutions.

Q. I'm guessing you could retire if you wanted to. What keeps you going now?

Well, I'm having a great time. AI at Intel today is about solving the most exciting and most challenging problems the industry and science are facing. This is an area that moves faster than anything I've seen in my 35 years at Intel, by far.

The other aspect is that I see a change brewing in the interaction between humans and machines, and I want to be part of the effort of creating this new link. When I talk about the partnership of science and AI, or autonomous vehicles and other areas, there's a role here for broader thinking than just how to deliver the fastest processor for the task. This newly forged interaction between people and AI is another fascinating part of this space.
