Optalysys will launch prototype optical processor

Aug 09, 2014 by Nancy Owano
Credit: Optalysys

UK-based startup Optalysys is moving ahead with plans to deliver exascale levels of processing power on a standard-sized desktop computer within the next few years, HPCwire reported earlier this week. The company itself announced on August 1 that it is "only months away from launching a prototype optical processor with the potential to deliver exascale levels of processing power on a standard-sized desktop computer." The company will demo the prototype, which meets NASA Technology Readiness Level 4, in January next year. Though the January demo represents only a proof-of-concept stage, the processor is expected to run at over 340 gigaFLOPS, enabling it to analyze large data sets and produce complex model simulations in a laboratory environment. Engadget commented that those numbers were not bad for a proof of concept. HPCwire pointed to the potential significance of this work in its article's headline, "Is This the Exascale Breakthrough We've Been Waiting For?" Optalysys' technology uses light to perform compute-intensive mathematical functions at speeds that exceed what can be achieved with electronics, at a fraction of the cost and power consumption.

The company CEO, Dr. Nick New, said: "Using low power lasers and high resolution liquid crystal micro-displays, calculations are performed in parallel at the speed of light." Engadget contributing editor Steve Dent described how "low-intensity lasers are beamed through layers of liquid crystal grids, which change the light intensity based on user-inputted data. The resulting interference patterns can be used to solve mathematical equations and perform other tasks. By splitting the beam through multiple grids, the system can compute in parallel much more efficiently than standard multi-processing supercomputers."
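The interference-pattern approach Dent describes is the classic principle of Fourier optics: a lens optically Fourier-transforms a coherent wavefront, so a "4f" correlator can match patterns in a single pass of light. A minimal numerical sketch of that principle, using NumPy's FFT in place of the physical lenses (the function name and sizes are illustrative, not an Optalysys API):

```python
import numpy as np

def optical_correlate(scene, template):
    """Correlate two equal-sized 2D patterns the way a 4f optical system
    would: transform, multiply by the conjugate filter, transform back."""
    Scene = np.fft.fft2(scene)                # first "lens": Fourier plane
    Filter = np.conj(np.fft.fft2(template))   # matched filter in the Fourier plane
    return np.real(np.fft.ifft2(Scene * Filter))  # second "lens": output plane

# A bright spot in the output marks where the template matches the scene.
scene = np.zeros((64, 64))
scene[20:24, 30:34] = 1.0                     # a 4x4 patch of "light"
template = np.zeros((64, 64))
template[0:4, 0:4] = 1.0
peak = np.unravel_index(np.argmax(optical_correlate(scene, template)), scene.shape)
print(peak)  # brightest output pixel sits at the patch's offset, (20, 30)
```

In the optical version, the two transforms cost essentially nothing in time or energy; only loading the grids and reading the output plane remain electronic.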

Optalysys technology can provide a big improvement over traditional computing, the company noted. "Electronic processing is fundamentally serial, processing one calculation after the other. This results in big data management problems when the resolutions are increased, with improvements in processor speed only resulting in incremental improvements in performance. The Optalysys technology, however, is truly parallel – once the data is loaded into the grids, the processing is done at the speed of light, regardless of the resolution. The Optalysys technology we are developing is highly innovative and we have filed several patents in the process."
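The serial-versus-parallel contrast the company draws can be illustrated in software (a rough analogy only, not how the optical hardware is programmed): an explicit element-by-element loop that performs one calculation after another, versus a single whole-grid operation that stands in for one optical pass.

```python
import numpy as np

def serial_transform(grid):
    """Serial style: compute each output element one calculation at a time."""
    n = grid.shape[0]
    out = np.zeros_like(grid, dtype=complex)
    for u in range(n):                        # O(n^2) sequential output steps,
        for v in range(n):                    # each summing over the whole grid
            out[u, v] = sum(
                grid[x, y] * np.exp(-2j * np.pi * (u * x + v * y) / n)
                for x in range(n) for y in range(n)
            )
    return out

grid = np.random.default_rng(0).random((8, 8))
parallel = np.fft.fft2(grid)                  # whole-grid pass, like one optical shot
assert np.allclose(serial_transform(grid), parallel)
```

Both produce the same 2D transform; the point of the analogy is that the serial path's work grows with resolution, while a single optical pass transforms every grid element at once.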

The company's target audience for its technology? "We hope the technology will be used by everyone, but to begin with we envisage the first users to be the users of high power Computational Fluid Dynamics (CFD) simulations and Big Data sets."


Optalysys chairman James Duez said that in applications such as aerofoil design, weather forecasting, MRI data analysis and quantum computing, it is becoming increasingly common for traditional computing methods to fall short. CFD, noted Duez, can be used to predict the weather, design cars and model airflow, "but the speed of processing needed to create models is constrained by current electrical capabilities."

Dr. New said "We are currently developing two products, a 'Big Data' analysis system and an Optical Solver Supercomputer, both of which are expected to be launched in 2017."


Optalysys was founded in 2013 by New and Duez. The team includes specialists in software development, free-space optics, optical engineering and production engineering.


More information: www.optalysys.com/faq/

www.optalysys.com/blog/light-s… ow-only-months-away/



User comments: 4


srikkanth_kn
5 / 5 (2) Aug 09, 2014
Some more uses - modelling the universe, modelling viruses and bacteria completely, and maybe artificial intelligence
Code_Warrior
5 / 5 (1) Aug 09, 2014
The way it's described makes it an optical math co-processor. Very impressive for implementing highly parallel mathematical calculations, but it doesn't appear to be useful for other big data tasks like searching, sorting, and moving data - unless I'm missing something here. I don't think anyone will be implementing database engines with this. Still, it should be a boon for highly parallel mathematical tasks, saving a lot of space and energy while delivering extremely high performance levels.
raducanmariuscatalin
5 / 5 (3) Aug 10, 2014
What have they achieved so far? How can it be programmed? Is it open for developers community?
Urgelt
not rated yet Aug 11, 2014
It'll be a stone bitch to program, I'm thinking - at least until software matures for controlling the thing. That'll take a long time, if history is any guide.

I bet they'll have reliability problems with their microlasers, too.

But even so, I hope they succeed. It seems worth trying.