Optalysys will launch prototype optical processor

August 9, 2014 by Nancy Owano

UK-based startup Optalysys is moving ahead to deliver exascale levels of processing power on a standard-sized desktop computer within the next few years, HPCwire reported earlier this week. The company itself announced on August 1 that it is "only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer." The company will demo its prototype, which meets NASA Technology Readiness Level 4, in January next year. Though the January date represents only a proof-of-concept stage, the processor is expected to run at over 340 gigaFLOPS, which will enable it to analyze large data sets and produce complex model simulations in a laboratory environment. Engadget commented that those numbers were not bad for a proof of concept. HPCwire pointed at the potential significance of this work in its article's headline, "Is This the Exascale Breakthrough We've Been Waiting For?" Optalysys' technology uses light to perform compute-intensive mathematical functions at speeds that exceed what can be achieved with electronics, at a fraction of the cost and power consumption.

The company CEO, Dr. Nick New, said that, "Using low power lasers and high resolution liquid crystal micro-displays, calculations are performed in parallel at the speed of light." Engadget contributing editor Steve Dent wrote how "low-intensity lasers are beamed through layers of liquid crystal grids, which change the light intensity based on user-inputted data. The resulting interference patterns can be used to solve mathematical equations and perform other tasks. By splitting the beam through multiple grids, the system can compute in parallel much more efficiently than standard multi-processing supercomputers."
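The mathematical principle behind optical processors of this kind is typically the convolution theorem: a lens performs a 2-D Fourier transform of the entire light field in a single pass, turning a convolution into a simple pointwise multiplication of two patterns. The sketch below illustrates that equivalence numerically; it is not Optalysys' actual method, and the array sizes and variable names are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
image = rng.random((n, n))    # stands in for the pattern on one liquid crystal grid
kernel = rng.random((n, n))   # stands in for the pattern on a second grid

# "Optical" route: multiply the two Fourier transforms pointwise, then invert.
# In an optical system the transforms happen at the speed of light, all
# pixels at once, regardless of resolution.
fourier = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)))

# Electronic reference: naive circular convolution, O(n^4) multiply-adds here.
naive = np.zeros_like(image)
for i in range(n):
    for j in range(n):
        for k in range(n):
            for l in range(n):
                naive[i, j] += image[k, l] * kernel[(i - k) % n, (j - l) % n]

# The two routes agree to floating-point precision.
assert np.allclose(fourier, naive)
```

The contrast in the comments above is the point: the naive electronic route scales steeply with resolution, while the Fourier-domain product is what an optical system computes in one parallel step.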

Optalysys technology can provide a big improvement over traditional computing, the company noted. "Electronic processing is fundamentally serial, processing one calculation after the other. This results in big data management problems when the resolutions are increased, with improvements in processor speed only resulting in incremental improvements in performance. The Optalysys technology, however, is truly parallel – once the data is loaded into the grids, the processing is done at the speed of light, regardless of the resolution. The Optalysys technology we are developing is highly innovative and we have filed several patents in the process."

The company's target audience for its technology? "We hope the technology will be used by everyone, but to begin with we envisage the first users to be the users of high power Computational Fluid Dynamics (CFD) simulations and Big Data sets."


Optalysys chairman James Duez said in applications such as aerofoil design, weather forecasting, MRI data analysis and quantum computing, "it is becoming increasingly common for traditional computing methods to fall short of delivering the performance needed." CFD, noted Duez, can be used to predict the weather, design cars and model airflow, "but the speed of processing needed to create models is constrained by current electrical capabilities."

Dr. New said "We are currently developing two products, a 'Big Data' analysis system and an Optical Solver Supercomputer, both of which are expected to be launched in 2017."


Optalysys was founded in 2013 by New and Duez. The team includes specialists in software development, free-space optics, optical engineering and production engineering.


More information: www.optalysys.com/faq/





5 / 5 (2) Aug 09, 2014
Some more uses: modelling the universe, modelling viruses and bacteria completely, and maybe artificial intelligence
5 / 5 (1) Aug 09, 2014
The way it's described makes it an optical math co-processor. Very impressive for implementing highly parallel mathematical calculations, but it doesn't appear to be useful for other big data tasks like searching, sorting, and moving data - unless I'm missing something here. I don't think anyone will be implementing database engines with this. Still, it should be a boon for highly parallel mathematical tasks, saving a lot of space and energy while delivering extremely high performance levels.
5 / 5 (3) Aug 10, 2014
What have they achieved so far? How can it be programmed? Is it open for developers community?
not rated yet Aug 11, 2014
It'll be a stone bitch to program, I'm thinking - at least until software matures for controlling the thing. That'll take a long time, if history is any guide.

I bet they'll have reliability problems with their microlasers, too.

But even so, I hope they succeed. It seems worth trying.
