Unconventional visualization method wins jury prize at media festival

Jan 03, 2013

A collaboration between the Remote Data Analysis and Visualization Center (RDAV) and University of Tennessee (UT), Knoxville, artist Evan Meaney, examining the interplay of data, information, and knowledge, has won the jury prize for the Distributed Microtopias exhibition at the 15th Annual Finger Lakes Environmental Film Festival (FLEFF).

The RDAV–Meaney collaborative project, entitled "Null_Sets," is a collection of artwork that visualizes the size and structure of data. The artwork was created using an open-source script developed at RDAV with which whole bodies of text, from classic literature to HTML, can be exported as images.

"In a gallery, we can analyze these data sets side by side and consider the differences between, say, Moby Dick and an X-chromosome," Szczepanski said. "Our method relies on an encoding that represents the changes in pixel color and intensity, and might be adapted to explore how values in a dataset change."

"Null_Sets explores the gap between data and information," Meaney said. "This project makes it possible to visualize both the size and architecture of large-scale data sets through an aesthetic lens."

The novel use of encoding employed by Null_Sets coincides with the focus of this year's FLEFF, the exploration of what it terms "Distributed Microtopias" and defines as projects that "run across distributed networks like the Internet to provoke and educate from remote locations on a sustainable scale, expand knowledge rather than contain it, invite participation and exploration, and unhinge familiar habits of thinking to envision new possibilities for historical and cultural clarity."

The project took shape in the spring of 2010 when Szczepanski, searching for digital media artists with whom RDAV could collaborate, contacted Meaney on the advice of UT's visual arts committee.

After discussing Null_Sets and the theory behind it with Meaney, Szczepanski wrote the initial code, and a student later took over development. As project designer and director, Meaney suggested revisions to the code to improve the work, chose the texts, handled the production of the physical images, submitted the work to shows and festivals, and printed catalogs, Szczepanski said.

"The techniques we developed in this project laid the groundwork for a larger that will likely use the Nautilus supercomputer in the future," she said.

Nautilus is managed for the National Science Foundation by the National Institute for Computational Sciences (NICS).
