Lack of effective timing signals could hamper 'Internet of things' development

March 20, 2015, National Institute of Standards and Technology

Our fast-approaching future of driverless cars and "smart" electrical grids will depend on billions of linked devices making decisions and communicating with split-second precision to prevent highway collisions and power outages. But a new report released by the National Institute of Standards and Technology (NIST) warns that this future could be stalled by our lack of effective methods to marry computers and networks with timing systems.

The authors, who include NIST's Marc Weiss and seven experts from academia and industry, are concerned about the way most modern data systems are designed to process and exchange data with one another and what that could mean for a world of discrete processors and mechanical devices linked by an information network—the "Internet of Things" (IoT). In addition to giving you access to the status of your home appliances anywhere, anytime, the IoT encompasses many potentially important but delicate applications such as cars that drive themselves and telemedicine surgical suites that allow doctors to operate on patients from remote locations. People are still imagining applications for the IoT, but GE predicts that nearly half the global economy can benefit from it.

The trouble is that these applications frequently will depend on precision timing in computers and networks, which were designed to operate optimally without it. For example, for a driverless car to decide whether what it senses ahead is a plastic bag blowing in the wind or a child running, its decision-making program needs to execute within a tight deadline. Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require.
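The gap between probabilistic and guaranteed execution times is easy to observe. The sketch below is illustrative only: the 10-millisecond deadline and the stand-in workload are made-up numbers, not figures from the NIST report. It times a placeholder decision routine repeatedly and compares its timing statistics against a hypothetical deadline:

```python
import time
import statistics

def classify(frame):
    # Stand-in for a perception routine; a real system would run
    # something far heavier, such as a neural-network inference.
    return sum(x * x for x in frame) % 2

DEADLINE_S = 0.010  # hypothetical 10 ms decision deadline

samples = []
for _ in range(200):
    frame = list(range(1000))
    start = time.perf_counter()
    classify(frame)
    samples.append(time.perf_counter() - start)

mean = statistics.mean(samples)
worst = max(samples)
print(f"mean {mean * 1e6:.0f} us, worst {worst * 1e6:.0f} us")

# A statistical guarantee ("usually fast enough") is not the hard
# bound a safety-critical system needs:
misses = sum(1 for s in samples if s > DEADLINE_S)
print(f"deadline misses: {misses}/{len(samples)}")
```

On a general-purpose operating system the worst case can be many times the mean, because interrupts, garbage collection, and scheduling all add unpredictable delay; that spread is exactly the "probabilities on execution times" the report worries about.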

In addition, many IoT systems will require precision synchronization across networks. "Imagine writing a letter to your friend saying it is now 2:30 p.m., and then sending it by snail mail so he can synchronize his watch with yours," says Weiss. "That's the equivalent of how accurate the timing of messages is in computers and systems right now. The transfer delay must be accounted for to do the things that are expected of the IoT."
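Weiss's letter analogy is essentially the problem that clock-synchronization protocols such as NTP address: a one-way timestamp is useless unless the transfer delay is measured and accounted for. A minimal sketch of the classic four-timestamp estimate follows; the timestamps are illustrative numbers, not a real protocol trace:

```python
def ntp_offset(t1, t2, t3, t4):
    """Four-timestamp offset/delay estimate, as used in NTP.

    t1: client send time    t2: server receive time
    t3: server send time    t4: client receive time
    Assumes roughly symmetric network paths.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # client clock error
    delay = (t4 - t1) - (t3 - t2)            # round-trip network delay
    return offset, delay

# Example: client clock runs 5.0 s behind the server,
# with 0.1 s of one-way latency in each direction.
offset, delay = ntp_offset(t1=100.0, t2=105.1, t3=105.1, t4=100.2)
print(round(offset, 6), round(delay, 6))  # → 5.0 0.2
```

The round-trip exchange is what makes the delay observable; a single "it is now 2:30 p.m." message, like the letter in the analogy, carries no way to recover how long it spent in transit.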

In their paper, the authors review the state of the art in several areas central to the use of timing signals and find that crosscutting research is needed to improve current technologies and approaches. These areas include clock design, the use of timing in networking systems, hardware and software architecture, and application design, among others.

Weiss outlines research areas that could address these issues. Networked components need a way to combine time-sensitive processes with those that can be done whenever the system gets around to them, he says. Additionally, systems need to be designed from the ground up to accommodate updates to computer applications and networks where timing is critical, updates that are virtually inevitable as systems change and grow. Currently, systems need to be rebuilt and recalibrated whenever components become obsolete, costing time and money.
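The first research area Weiss mentions, letting time-sensitive and best-effort work coexist in one system, can be illustrated with a toy scheduler. This is a hypothetical sketch, not a design from the report: deadline tasks run earliest-deadline-first, and best-effort tasks run only when no deadline task is pending.

```python
import heapq

class Scheduler:
    """Toy scheduler mixing deadline tasks with best-effort tasks."""

    def __init__(self):
        self._deadline = []    # min-heap of (deadline, seq, fn)
        self._besteffort = []  # simple FIFO queue
        self._seq = 0          # tie-breaker for equal deadlines

    def add_deadline(self, deadline, fn):
        heapq.heappush(self._deadline, (deadline, self._seq, fn))
        self._seq += 1

    def add_besteffort(self, fn):
        self._besteffort.append(fn)

    def run(self):
        order = []
        while self._deadline or self._besteffort:
            if self._deadline:
                # Earliest-deadline-first: always serve the most
                # urgent deadline task before any best-effort work.
                _, _, fn = heapq.heappop(self._deadline)
            else:
                fn = self._besteffort.pop(0)
            order.append(fn())
        return order

sched = Scheduler()
sched.add_besteffort(lambda: "log upload")        # can wait
sched.add_deadline(0.010, lambda: "brake decision")  # 10 ms deadline
sched.add_deadline(0.050, lambda: "lane update")     # 50 ms deadline
result = sched.run()
print(result)  # → ['brake decision', 'lane update', 'log upload']
```

A real system would also need preemption and admission control, but the ordering shows the core idea: timing-critical work is treated as a first-class scheduling input rather than left to chance.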

"The kind of growth in the IoT that is expected to happen will be severely hampered without these improvements," he says. "It won't be able to grow the way people want."

More information: M. Weiss, J. Eidson, C. Barry, D. Broman, L. Goldin, B. Iannucci, E.A. Lee and K. Stanton. Time-Aware Applications, Computers, and Communication Systems (TAACCS). NIST Tech Note 1867, Feb. 21, 2015.

Comment (Mar 20, 2015):
"Yet modern computer programs only have probabilities on execution times, rather than the strong certainties that safety-critical systems require."

They're talking about the difference between "hard" and "soft" real-time (RT) operating systems as employed in different computer systems.

The issue is largely present only because developers are increasingly relying on deeper and deeper levels of abstraction, because they are no longer familiar with the hardware. We see products such as a memory card built with an SoC that runs an entire Linux operating system (which is not RT) in order to run an HTTP web server, in order to display a web page, in order to run a simple JavaScript snippet that lets you browse the contents of the card over Wi-Fi.

It could be done much more efficiently than that, but it would cost the company competent programmers and time.
