As sensors proliferate, opportunities are emerging in the field of machine learning

[Image: Roundworm: 302 neurons]

Biological learning systems run the gamut from the lowly roundworm (Caenorhabditis elegans) with its 300 or so neurons, all the way up to the adult elephant brain, with its 200 billion neurons. Whether they're located in fruit flies or cockroaches, chimpanzees or dolphins, all neurons do the same thing: they process and transmit information. And the reason for this is the same across the biological board: To avoid danger and maximize success in sustaining and propagating themselves, all organisms must be able to sense the environment, respond to it accordingly, and remember those stimuli that indicate risks and rewards.

Learning, in short, is a prerequisite for the survival of individuals and species in the natural world. The same iron law, however, is becoming increasingly applicable to the world of man-made systems.

According to Dr. Volker Tresp, one of Siemens' top machine learning authorities and a computer science professor at Ludwig Maximilian University in Munich, there are three kinds of learning: memorization (such as the ability to remember facts); skills (such as the ability to learn to throw a ball); and abstraction (such as the ability to form rules based on observations). Computers, which are born whizzes in the first area, are rapidly catching on to the other two.

[Image: Fruit fly: 100,000 neurons]

Take, for instance, the skill needed to produce a flawlessly even sheet of steel of a given thickness—an area in which Siemens has been a leader for over 20 years. "Here," says Tresp, "the simplest learning schema is to make a prediction, and then check to see if the output product meets the desired specification." Confronted with an output requirement for, say, a particularly high grade of steel, an automated rolling mill would take sensor data (composition, strip temperature, etc.) into account, estimate the required pressure on the basis of previously learned information, and then adjust itself in real time in response to its measurement data until it reached exactly the pressure that yields the desired thickness. "In a neural network-based learning system," explains Tresp, "this would be achieved by adjusting the relative weight matrix (see diagram) of all the factors that influence a given parameter, such as thickness."
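Tresp's predict-and-check schema can be sketched in a few lines of Python. The linear pressure model, the sensor values, and the learning rate below are illustrative assumptions for the sake of the sketch, not Siemens' actual mill controller:

```python
# Minimal sketch of the predict-and-check learning loop described above.
# The linear model, sensor readings, and learning rate are illustrative
# assumptions, not the actual rolling-mill control algorithm.

def predict_pressure(weights, sensors):
    """Estimate rolling pressure as a weighted sum of sensor readings."""
    return sum(w * x for w, x in zip(weights, sensors))

def update_weights(weights, sensors, error, rate):
    """Shift each weight against its contribution to the error."""
    return [w - rate * error * x for w, x in zip(weights, sensors)]

weights = [0.5, 0.5]    # initial (untrained) row of the weight matrix
sensors = [900.0, 0.3]  # e.g. strip temperature, carbon content
target = 450.0          # pressure that yields the desired thickness

# Each cycle: predict, compare with the specification, adjust the weights.
for _ in range(1000):
    error = predict_pressure(weights, sensors) - target
    weights = update_weights(weights, sensors, error, rate=1e-7)

print(round(predict_pressure(weights, sensors), 1))  # -> 450.0
```

After enough cycles the prediction converges on the target, which is the sense in which the mill "achieves exactly the right pressure."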

[Image: Cockroach: 1,000,000 neurons]

Beyond memorization and the ability to optimize skills, artificial systems are increasingly being called upon to generalize, or abstract the characteristics that make an individual item a member of a group. Optical character recognition (OCR), which has traditionally been used for high-speed postal sorting, is a case in point. Since approximately 1985, when this technology was first developed, accuracy has skyrocketed from single digits to over 95 percent for handwritten Latin alphabets and over 90 percent for Arabic handwriting. In fact, in 2007, Siemens' ARTread won first place in the International Conference on Document Analysis and Recognition contest for OCR in Arabic. Given its exceptionally high reliability, OCR technology is beginning to migrate to applications such as automatic license plate recognition and industrial vision.
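The generalization step that OCR depends on, grouping a never-before-seen glyph with its class, can be illustrated with a toy nearest-neighbour classifier. The 3x3 bitmaps and the Hamming-distance metric below are illustrative assumptions, not the ARTread system's method:

```python
# Toy illustration of generalization in character recognition.
# The glyph bitmaps and distance metric are illustrative assumptions.

# Labelled training examples: flattened 3x3 bitmaps of "I" and "O".
TRAINING = [
    ((0, 1, 0, 0, 1, 0, 0, 1, 0), "I"),
    ((1, 1, 1, 1, 0, 1, 1, 1, 1), "O"),
]

def classify(bitmap):
    """Assign the label of the closest known example (Hamming distance)."""
    def distance(example):
        return sum(a != b for a, b in zip(bitmap, example[0]))
    return min(TRAINING, key=distance)[1]

# A noisy "I" (one pixel flipped) is still grouped with its class.
print(classify((0, 1, 0, 0, 1, 1, 0, 1, 0)))  # -> I
```

Even this crude scheme classifies an input it has never seen, which is the essence of abstracting class membership from examples.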

[Image: Human: 100,000,000,000 neurons]

What else will be possible in the future? Better performance and increasing numbers of sensors will open up great new opportunities, especially for industry. More and more data is becoming accessible locally and through networks. However, this flood of data has to be intelligently analyzed if it is to be useful.

[Image: Elephant: 200,000,000,000 neurons]
"Machine learning plays an important role in the development of new smart data applications," explains Tresp. Unlike purely statistical applications, which focus on interpretable parameters, or data mining, which primarily recognizes patterns in a sea of data, machine learning systems such as artificial neural networks supply forecasts that can be used to make automatic decisions. Today's "deep learning" processes use up to 100,000 simulated neurons and 10 million simulated connections. They thus break all previous records for artificial intelligence and make new applications possible in areas such as automatic image recognition.

[Image: Octopus: 300,000,000 neurons]

These new deep methods use far more levels of artificial neurons than was previously the case. Each level handles a single level of abstraction of the material to be learned. Through the interconnection of numerous levels, the resulting insights are much more detailed than with earlier artificial neural networks. Most of us carry one around with us, because such deeply layered neural networks power the voice recognition systems of all state-of-the-art Android smartphones. However, Tresp's team is going a step further by modeling mathematical knowledge networks encompassing up to 10 million objects.

These networks can make up to 10¹⁴ possible predictions regarding the relationships between these objects—a figure that corresponds more or less to the number of synapses in the brain of an adult human.
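One common way such knowledge networks rank vast numbers of candidate relationships is by comparing learned vector representations of the objects. The tiny embeddings and the dot-product score below are illustrative assumptions about this class of model, not the specific system built by Tresp's team:

```python
# Sketch: scoring candidate links in a knowledge network by comparing
# learned object embeddings. Vectors and the dot-product score are
# illustrative assumptions about this class of model.

# Learned two-dimensional embeddings for a handful of objects
# (a real knowledge network would hold millions).
embedding = {
    "Munich":  (0.9, 0.1),
    "Germany": (0.8, 0.2),
    "Mozart":  (0.1, 0.9),
}

def score(subject, obj):
    """Plausibility of a link: similarity (dot product) of embeddings."""
    a, b = embedding[subject], embedding[obj]
    return sum(x * y for x, y in zip(a, b))

# With n objects the model can rank on the order of n * n candidate
# links, although it has directly observed only a tiny fraction of them.
print(score("Munich", "Germany") > score("Munich", "Mozart"))  # -> True
```

This is how 10 million objects can yield on the order of 10¹⁴ scoreable object pairs: the predictions come from the geometry of the learned vectors, not from stored facts.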

Neural network-based learning system (1) based on input information (2) and providing an output prediction (3) of gas demand over seven days, based on a 14-day training phase. Learning is shown in three snapshots, from randomly weighted (4) to partially learned (5) to fully trained (6). Neural network-based systems can process huge amounts of input data in order to adjust their output. To do so, such a system must build a mathematical model that duplicates its real-world counterpart. Such a model is essentially a community of decision units. Collectively, the interaction of the decision units can be represented as a matrix (see inset in each box). Depending on the complexity of the application, hundreds of interaction matrices may be required. Initially, the interactions among the decision units are random, so when the system begins its training phase (see timeline at left), its error level — the difference between expectation and observation — is high (4). Once compared with the actual output, the error level is fed back into each matrix (arrows pointing right to each box), modifying the internal weights of each decision unit away from randomness and altering each input parameter based on what has been learned (arrows pointing left from each box). Eventually, after thousands of iterations, each designed to reduce the error level, the system learns to describe the entire flow of input information over time so well that its output exactly duplicates (6), and eventually predicts, the behavior of the real world.
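The training cycle the caption describes, random weights, comparison with observation, error fed back, thousands of iterations, can be condensed into a short loop. The toy demand series, target value, and learning rate below are illustrative assumptions, not the actual gas-demand model:

```python
# Condensed sketch of the training cycle in the diagram: start from
# random weights (4), feed the error back, and iterate until the output
# tracks the data (6). The demand series and rate are illustrative.
import random

random.seed(1)

history = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0, 0.8]       # observed daily demand
weights = [random.uniform(-1, 1) for _ in history]  # (4) random start

def predict(w, xs):
    """Forecast as a weighted sum over the recent demand history."""
    return sum(wi * xi for wi, xi in zip(w, xs))

target = 1.05  # next observed demand value in the training data
for step in range(2000):                             # (4) -> (5) -> (6)
    error = predict(weights, history) - target       # expectation vs observation
    weights = [w - 0.01 * error * x for w, x in zip(weights, history)]

print(abs(predict(weights, history) - target) < 1e-6)  # -> True
```

Each pass reduces the error a little; after thousands of iterations the random initial matrix has been reshaped until prediction and observation coincide, which is stage (6) in the diagram.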


Provided by Siemens
Citation: As sensors proliferate, opportunities are emerging in the field of machine learning (2014, November 12) retrieved 25 June 2019 from https://phys.org/news/2014-11-sensors-proliferate-opportunities-emerging-field.html