Researchers take a step toward light-based brain-like computing chip
A team of European scientists from Germany and the UK has revealed a pioneering new way to create optical neural networks that 'teach themselves' to recognise patterns in image data.
Traditional computers are built on the von Neumann architecture, with separate memory and processor units operating one command at a time.
Compared with the brain, where processing and memory are co-located and operations run massively in parallel, this can be very inefficient.
Developing computers that work more like the brain requires hardware devices that behave like biological neurons and synapses, combined into large-scale networks capable of real-world tasks.
Prof Wolfram Pernice from the University of Muenster, lead partner in the study, explains, "We have made significant steps towards this goal – working here with light-based devices rather than electronics – demonstrating integrated photonic neurosynaptic networks that can recognise patterns, identify letters and numbers, and even correctly differentiate between the languages of written text."
Prof Harish Bhaskaran, co-author from Oxford University, added, "Working with photons instead of electrons will allow us to exploit well-known benefits of optical technologies – wavelength division multiplexing, ultra-high bandwidths, low energy consumption – but here in the realm of computing rather than the more usual communications field."
Johannes Feldmann, first author of the paper, also from Muenster, pointed out, "Key to our work is the successful merging of phase-change devices and silicon photonics – this gives us the ability to successfully mimic the behaviour of biological neurons and synapses, at least in a basic way."
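As an illustrative toy model only (not the authors' actual device physics), one way to picture this merging is to treat each phase-change synapse as a tunable optical attenuator: the material's crystalline fraction sets how much light passes, and a neuron "fires" when the summed optical power from its weighted inputs crosses a threshold. All numbers and function names below are invented for the sketch.

```python
# Toy sketch (hypothetical values, not from the paper): a photonic "synapse"
# modeled as a transmission weight set by the phase-change material's state,
# and a "neuron" that fires when summed optical power crosses a threshold.

def synapse_transmission(crystalline_fraction):
    """Assumed linear model: more crystalline material absorbs more light."""
    t_amorphous, t_crystalline = 0.9, 0.1  # illustrative transmission levels
    return t_amorphous + (t_crystalline - t_amorphous) * crystalline_fraction

def neuron_fires(input_powers, synapse_states, threshold=1.0):
    """Weighted sum of optical inputs; emit a spike if above threshold."""
    total = sum(p * synapse_transmission(s)
                for p, s in zip(input_powers, synapse_states))
    return total >= threshold

# Two equal inputs: one strong (amorphous) synapse, one weak (crystalline)
print(neuron_fires([1.0, 1.0], [0.0, 1.0]))  # 0.9 + 0.1 = 1.0 -> True
```

In the sketch, "learning" would amount to adjusting the crystalline fractions so the right input patterns push the neuron over threshold, loosely mirroring how synaptic weights are trained.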
Prof C David Wright, co-author of the study from the University of Exeter, summed up by saying, "This is, we believe, a significant experimental milestone – a fully scalable integrated photonic system that can process and store information in a brain-like fashion. Our approach could find widespread utility in power-critical situations such as mobile and so-called 'edge computing' applications."
Provided by University of Exeter