Using Big Data to monitor societal events shows promise, but the coding tech needs work

September 30, 2016 by Thea Singer
Distinguished Professor David Lazer and his colleagues analyzed global-scale databases of news events and found them wanting. Their recommendations for improvements would enable researchers to build models anticipating everything from the escalation of conflicts to the progression of epidemics. Credit: Matthew Modoono/Northeastern University

In the age of Big Data, automated systems can track societal events on a global scale. These systems code and collect vast stores of real-time "event data"—happenings gleaned from news articles covering everything from political protests to ecological shifts around the world.

In new research published Thursday in the journal Science, Northeastern network scientist David Lazer and his colleagues analyzed the effectiveness of four global-scale databases and found they are falling short when tested for reliability and validity.

Misclassification and duplication

The two fully automated systems studied were the Integrated Crisis Early Warning System, or ICEWS, maintained by Lockheed Martin, and the Global Database of Events, Language, and Tone, or GDELT, developed and run out of Georgetown University. The other two were the hand-coded Gold Standard Report, or GSR, generated by the nonprofit MITRE Corp., and the Social, Political, and Economic Event Database, or SPEED, at the University of Illinois, which uses both human and automated coding.

First the researchers tested the systems' reliability: Did they all detect the same protest events in Latin America? The answer was "not very well." ICEWS and GDELT, they found, rarely reported the same protests, and ICEWS and SPEED agreed on just 10.3 percent of them.
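As a rough illustration of what such a pairwise reliability check involves, the sketch below counts how many events in one database also appear in another when events are matched on country and date. The Event fields and the matching rule are assumptions for illustration, not the study's actual procedure.

```python
# Sketch of a pairwise reliability check between two event databases.
# Matching on (country, date) alone is an illustrative assumption; a real
# comparison would also need to reconcile locations, actors, and event types.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    country: str
    date: str  # ISO date, e.g. "2015-06-01"

def agreement(db_a, db_b):
    """Share of events in db_a that also appear in db_b under the matching rule."""
    if not db_a:
        return 0.0
    matched = sum(1 for event in db_a if event in db_b)
    return matched / len(db_a)

# Toy data standing in for two coders' protest records for Latin America.
icews_like = {Event("Brazil", "2015-06-01"), Event("Chile", "2015-06-02")}
speed_like = {Event("Brazil", "2015-06-01"), Event("Peru", "2015-06-03")}
print(f"Agreement: {agreement(icews_like, speed_like):.1%}")  # 50.0% on this toy data
```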

Next they assessed the systems' validity: Did the protest events reported actually occur? Here they found that only 21 percent of GDELT's reported events referred to real protests. ICEWS' track record was better, but the system reported the same event more than once, inflating the protest count.
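The duplication problem suggests the kind of cleanup step a downstream analyst would need. The sketch below drops repeat records under the simplifying assumption that duplicates share country, city, and date; the field names are hypothetical, and a real system would need fuzzier matching on actors, wording, and timing.

```python
# Illustrative deduplication pass over coded event records.
# Keying on (country, city, date) is a simplifying assumption for this sketch.
def deduplicate(events):
    seen = set()
    unique = []
    for event in events:
        key = (event["country"], event["city"], event["date"])
        if key not in seen:
            seen.add(key)
            unique.append(event)
    return unique

reports = [
    {"country": "Mexico", "city": "Mexico City", "date": "2015-07-10"},
    {"country": "Mexico", "city": "Mexico City", "date": "2015-07-10"},  # same story, second wire pickup
    {"country": "Mexico", "city": "Oaxaca", "date": "2015-07-10"},
]
print(len(deduplicate(reports)))  # 2 unique events, not 3
```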

The systems were also vulnerable to missing news. "If something doesn't get reported in a newspaper or a similar outlet, it will not appear in any of these databases, no matter how important it really is," says Lazer, Distinguished Professor of Political Science and Computer and Information Sciences, who also co-directs the NULab for Texts, Maps, and Networks.

"These global-monitoring systems can be incredibly valuable, transformative even," added Lazer. "Without good data, you can't develop a good understanding of the world. But to gain the insights required to tackle global problems such as national security and climate change, researchers need more reliable event data."

And what about the reported protests that actually weren't protests at all? "Automated systems can misclassify words," says Lazer. For example, the word "protest" in a news article can refer to an actual political demonstration, but it can also refer to, say, a political candidate "protesting" comments from a rival candidate.

"It's so easy for us as humans to read something and know what it means," says Lazer. "That's not so for a set of computational rules."

Analysis begets policy

Among the researchers' policy recommendations are community building among scholars and the formation of multidisciplinary groups, within which teams could compete against one another to spur innovation.

"Transparency is key," says Lazer. In the best-case scenario, the development methods, the software, and the source materials would be available to everyone involved. "But many of the source materials have copyright protection, and so they can't be shared widely," he says. "So one question is: How do we develop a large publicly shareable corpus?"

Participants should be able to test their varying coding methods on open, representative sets of event data to see how the methods compare, Lazer says, and contests could serve as a catalyst. Finally, the researchers recommend establishing a consortium to balance the business needs of the news providers with the source needs of the developers and event-data users.

The authors suggest that reliable data-tracking systems can be used to build models that anticipate the escalation of conflicts, forecast the progression of epidemics, or trace the effect of global warming on the ecosystem.

More information: W. Wang et al., "Growing pains for global monitoring of societal events," Science (2016). DOI: 10.1126/science.aaf6758
