Augmented reality lets students operate a chemical plant
The chemical engineering major from Milwaukee, Wisconsin, repeated that comment more than once during the next half hour, as he and two other students continually rearranged coffee mugs and popsicle sticks on the tabletop's glass surface to simulate reactions in a real-life, sprawling chemical plant.
The exercise was part of an innovative, augmented reality (AR) teaching experiment in which:
Coffee mugs became virtual 10-cubic-meter reactors, both plug flow reactors (PFRs) and continuous stirred tank reactors (CSTRs); popsicle sticks served as the virtual pipes that connect them; a knob let students adjust the temperature inside each reactor as it was added to the configuration; QR codes on the bottom of the reactors enabled a camera inside the table to capture each reactor's precise location; the information was relayed to a computer where the simulations were run; and a projector inside the table flashed the results onto the tabletop, all in real time.
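The kind of steady-state model such a table might run for each reactor can be sketched in a few lines. The sketch below is purely illustrative, not the project's actual software: it assumes a first-order reaction with Arrhenius temperature dependence and made-up kinetic parameters, and uses the textbook design equations for CSTR and PFR conversion.

```python
import math

# Illustrative sketch only -- rate parameters A and Ea are invented,
# not taken from the actual AR table's simulation.
def rate_constant(temp_K, A=1.0e6, Ea=50_000.0, R=8.314):
    """Arrhenius rate constant: k = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * temp_K))

def cstr_conversion(temp_K, volume_m3=10.0, flow_m3_s=0.05):
    """Steady-state first-order conversion in a CSTR: X = k*tau / (1 + k*tau)."""
    tau = volume_m3 / flow_m3_s  # residence time in seconds
    k = rate_constant(temp_K)
    return k * tau / (1.0 + k * tau)

def pfr_conversion(temp_K, volume_m3=10.0, flow_m3_s=0.05):
    """Steady-state first-order conversion in a PFR: X = 1 - exp(-k*tau)."""
    tau = volume_m3 / flow_m3_s
    k = rate_constant(temp_K)
    return 1.0 - math.exp(-k * tau)

# For a first-order reaction at the same size and temperature, a PFR
# converts at least as much as a CSTR -- one trade-off students can
# explore by rearranging reactors and turning the temperature knob.
for T in (300.0, 350.0, 400.0):
    print(f"T={T:.0f} K  CSTR X={cstr_conversion(T):.3f}  "
          f"PFR X={pfr_conversion(T):.3f}")
```

Raising the temperature increases the rate constant and hence conversion in both reactor types, which is exactly the effect the knob lets students see projected onto the tabletop.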
"We're trying to use AR as a way to enable new types of STEM (science, technology, engineering, and math) undergraduate laboratories that weren't possible before," explains Andrew White, assistant professor of chemical engineering.
"What we've done is build a hands-on, tactile, collaborative lab where students can explore putting together multiple reactors at different temperatures, and see what effect this has on optimizing a chemical reaction."
More important, however, is the effect the table could have on optimizing the students' educational experience.
"This is a really needed piece in higher education," says April Luehmann, an associate professor and director of secondary science education at the Warner School of Education, who is collaborating on the project.
"We know a lot about what's important in learning that doesn't ever get translated into the classroom," she says. Opportunities to engage in dialogue with fellow learners; to make mistakes; to wrestle with complex, real-life problems that have no single answer; and to physically interact in an environment—all of these should be part of the process, she says.
"That doesn't happen when students are asked to do problems one through four on page 262, turn it in, and the only interaction they have is with a professor who tells them whether the final number they arrived at is right or wrong."
"A table like this can allow so much more than that to happen," Luehmann says.
Eventually, the table will be connected to the University's supercomputer, allowing for even more sophisticated simulations, says Brendan Mort, director of the Center for Integrated Research Computing (CIRC), who is also collaborating on the project.
White, Luehmann, and Mort have also proposed working with the Rochester Museum & Science Center on developing an AR platform simulating oil and water at the molecular level, to show what happens when there's an oil spill.
White, whose expertise is in using experiments, molecular simulations, and machine learning to design new materials, has explored other innovative ways to teach his students.
For example, he loaded all of his lectures and course content for a class on numerical methods and statistics onto an open source web application called Jupyter Notebook, where students can:
create their own notebooks to do homework and keep notes; easily copy and paste all the equations and other course content they need; use the platform's interactive features to solve the equations and to create dynamic graphs; incorporate videos, text, and code all in the same documents; and export their work as websites, PDFs, or slideshows.
And they don't have to spend $160 or more for the textbook that would otherwise be used to teach the class.
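A notebook cell in such a course might look something like the following. This is an invented example in the spirit of a numerical methods class, not material taken from White's actual notebooks: it implements Newton's method, a standard root-finding topic.

```python
# Illustrative notebook-style cell (not from the actual course content):
# Newton's method for root finding, a staple of numerical methods courses.
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f via the iteration x_{n+1} = x_n - f(x_n)/df(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: approximate the square root of 2 as the root of x^2 - 2.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)
```

Because a notebook mixes the code, its output, and explanatory text in one document, students can tweak `x0` or `tol`, rerun the cell, and immediately see how convergence changes, which is the interactivity the list above describes.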
To assess the effectiveness of the AR table as a teaching tool, members of Luehmann's research team conducted an experiment. They videotaped four students doing a task at the table, while a control group of four other students addressed the same task in a classroom, using computers, spreadsheets, and white boards.
About 30 minutes into the exercise, Luehmann's team noticed that as the students at the table leaned in to reconfigure the reactors and pipes, they would say something like, "how about . . ."—and then move a reactor without even finishing the sentence.
In other words, they "started reasoning with their bodies," Luehmann says. "At some point, the need for dialogue and words was transcended. That's great because the table gave the students more resources to communicate, a richer set of literacies."
"If we're going to prepare chemical engineering students to be part of a knowledge society," she says, "they need to be able to negotiate complex, ill-structured tasks. And that's the kind of thing that happens at that table."
Luehmann's team is still analyzing the videos, as well as the pre- and post-experiment surveys and interviews with the students involved in the exercise. The results will help the team refine its methodologies for further assessing the table's effectiveness when it is used as part of a chemical engineering class later this semester.
But as far as Eder is concerned, the verdict is already in. "I would love to see this kind of teaching tool become more prevalent," he says of incorporating AR into classrooms and labs. "Bringing useful technology to hands-on experiential learning gives us an excellent resource."
"Engineering is all about real-world applications," adds Sabrina Westgate '19, a chemical engineering major from Conway, Massachusetts, who also had a chance to use the table. "We learn a lot in the classroom, but to be able to see a visual breakdown like this is really helpful. I think this has a lot of potential to help both students and actual engineers in the field, which is awesome."