Automated-grading skeptic uses Babel to expose nonsense essay

Apr 30, 2014 by Nancy Owano weblog

(Phys.org) —The good news: a former MIT instructor and a group of students have come up with software that can write an entire essay in less than one second; just feed it up to three keywords. The bad news: the essay is gibberish. Oh, wait, more news: the nonsense essay was fed through an online writing product that uses essay-scoring technology. The instructor, Les Perelman, pasted the essay into the answer field, clicked "submit," and the paper received a score of 5.4 out of 6. The essay, after all, had good grammar and an impressive vocabulary. The result was nonetheless nonsense.

For the curious, the essay had sentences such as "Privateness has not been and undoubtedly never will be lauded, precarious, and decent." The program was fed one keyword: "privacy." The key players behind the program, called Babel, are Les Perelman, former director of undergraduate writing at the Massachusetts Institute of Technology, together with Harvard and MIT students. Perelman is concerned about the idea of using software to grade essays. The Babel program delivers intentionally gibberish essays to expose the weaknesses of automated essay-grading software. Babel stands for "Basic Automatic B.S. Essay Language" Generator.
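The article doesn't include Babel's source code, but the general trick it exploits can be sketched: assemble grammatically well-formed sentences around a keyword using inflated vocabulary, producing text that looks fluent while meaning nothing. The templates and word lists below are invented for illustration and are not the actual Babel Generator.

```python
import random

# Toy sketch of a keyword-driven gibberish generator -- NOT the real
# Babel Generator. It slots random "impressive" words into a fixed
# grammatical template, so every sentence parses but says nothing.
ADJECTIVES = ["lauded", "precarious", "decent", "quixotic", "veracious"]
VERBS = ["demonstrates", "repudiates", "catalyzes", "obfuscates"]
NOUNS = ["adjuration", "plaudit", "casuistry", "exhortation"]

def gibberish_sentence(keyword: str) -> str:
    """Fill one template with randomly chosen inflated vocabulary."""
    return (f"{keyword.capitalize()} {random.choice(VERBS)} the "
            f"{random.choice(ADJECTIVES)} {random.choice(NOUNS)}.")

def gibberish_essay(keyword: str, sentences: int = 5) -> str:
    """Chain several template sentences into an 'essay'."""
    return " ".join(gibberish_sentence(keyword) for _ in range(sentences))

print(gibberish_essay("privacy"))
```

A real generator would vary its templates and sentence structure far more, but the weakness it probes is the same: a grader rewarding vocabulary and syntax alone cannot tell this from genuine argument.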

A detailed report on Perelman's work appeared in Monday's The Chronicle of Higher Education, where Steve Kolowich wrote that Perelman's fundamental problem with essay-grading automatons is that they are not measuring any of the real constructs that have to do with writing. Kolowich said Babel Generator is turning "the concept of automation into a farce: machines fooling machines for the amusement of human skeptics."

Still, not everyone falls into Perelman's camp; with further development, machines that give students feedback may become increasingly useful. Kolowich observed that interesting work on automated essay grading is also taking place at MIT, where computer scientists at edX, the online-learning venture the university co-founded, are developing an automated essay-scoring system called the Enhanced AI Scoring Engine, or EASE. The engine is described as a library that allows machine learning-based classification of textual content, for tasks such as scoring student essays.

"Essentially," wrote Kolowich, "the edX software tries to make its machine graders more human. Rather than simply scoring essays according to a standard rubric, the EASE software can mimic the grading styles of particular professors."

At the same time, the report noted, Perelman taught a course with Anant Agarwal, chief executive of edX, back in the 1990s, and the two "have been talking about running some experiments to see if the Babel Generator can be used to inoculate EASE against some of the weaknesses the generator was designed to expose."
