Mozilla to improve JPEG compression with mozjpeg tool

Mar 06, 2014 by Nancy Owano
An 8x8-pixel subimage used as an example for JPEG compression. Credit: Wikipedia

(Phys.org) — Mozilla announced on Wednesday its new project to provide a production-quality JPEG encoder that improves compression. Project mozjpeg will bring better compression efficiency to JPEG, the popular image format with proven staying power since 1992. The Wednesday blog announcement from the Mozilla Foundation presented the reasons for the launch: the number of photos the average Web site displays has grown over the years, as has the size of those photos, while HTML, JS, and CSS files are relatively small in comparison. The Mozilla blog noted, too, that photos can easily make up the bulk of the network traffic for a page load. "Reducing the size of these files is an obvious goal for optimization." Why did the Mozilla team focus on JPEG specifically? The announcement had an answer for that, too.

"Nearly every photograph on the Web is served up as a JPEG. It's the only lossy compressed image format which has achieved nearly universal compatibility, not just with Web browsers but all software that can display images."

The mozjpeg project grew out of frequent discussions about JPEG encoders. "Production JPEG encoders have largely been stagnant in terms of efficiency, so replacing JPEG with something better has been a frequent topic," said the blog. (End users gain from compression in that smaller files arrive much faster, but compression can be tricky: used poorly, it can sacrifice image quality.)

Discussions at Mozilla involved whether JPEG encoders, after more than 20 years, had really reached their full compression potential. The answer, even within the constraints of compatibility requirements, was no. The mozjpeg software is now at version 1.0, hosted on GitHub. The tool is a fork of libjpeg-turbo with 'jpgcrush' functionality added. Mozilla Corporation's Senior Technology Strategist Josh Aas, author of the Wednesday blog post, said, "We noticed that people have been reducing JPEG file sizes using a perl script written by Loren Merritt called 'jpgcrush', references to which can be found on various forums around the Web. It losslessly reduces file sizes, typically by 2-6% for PNGs encoded to JPEG by IJG libjpeg, and 10% on average for a sample of 1500 JPEG files from Wikimedia."
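The jpgcrush savings Aas describes are lossless: the tool reorganizes how a JPEG's already-quantized data is written out (progressive scan scripts and optimized Huffman tables), so the decoded pixels are unchanged. As a minimal, hedged sketch of the same general idea, the snippet below shells out to jpegtran, the lossless transcoder that ships with libjpeg-turbo and mozjpeg; the file names are placeholders, and a plain -optimize/-progressive pass will not match mozjpeg's jpgcrush-style scan search exactly.

```python
import os
import subprocess

src, dst = "photo.jpg", "photo.opt.jpg"   # hypothetical input/output paths

# jpegtran transcodes losslessly: -optimize recomputes the Huffman tables,
# -progressive rewrites the scan structure, and -copy none drops metadata.
# None of these change the decoded pixel values.
with open(dst, "wb") as out:
    subprocess.run(
        ["jpegtran", "-optimize", "-progressive", "-copy", "none", src],
        stdout=out, check=True)

before, after = os.path.getsize(src), os.path.getsize(dst)
print(f"{before} -> {after} bytes "
      f"({100.0 * (before - after) / before:.1f}% smaller)")
```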

Aas also said the next goal for the project is to improve encoding through the use of trellis quantization, a rate-distortion optimization technique that searches for the combination of quantized DCT coefficients giving the best trade-off between file size and image quality, without requiring any changes to decoders.
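As a rough illustration of the principle (not mozjpeg's actual algorithm, which the announcement does not detail), trellis quantization can be framed as a dynamic program over one block's AC coefficients in zig-zag order: for each coefficient the encoder weighs a few candidate quantized levels and keeps the path minimizing squared error plus lambda times an estimate of the bits JPEG's run-length/Huffman stage would spend. The toy sketch below uses a deliberately crude bit-cost model and made-up input data.

```python
import numpy as np

# Crude bit-cost guesses standing in for real Huffman code lengths:
# ZRL = a run of 16 zeros, EOB = end-of-block, SYMBOL = one (run, size) code.
ZRL_BITS, EOB_BITS, SYMBOL_BITS = 11, 4, 4

def level_candidates(c, q):
    """Candidate quantized levels for one coefficient: zero, the nearest
    level, and the next level toward zero."""
    nearest = int(round(c / q))
    toward_zero = nearest - (1 if nearest > 0 else -1 if nearest < 0 else 0)
    return sorted({0, nearest, toward_zero})

def trellis_quantize_ac(coeffs, qtable, lam=1.0):
    """Toy trellis search over the 63 AC coefficients of one 8x8 block,
    given in zig-zag order.  DP state = current run of zeros; path cost =
    squared reconstruction error + lam * estimated bits."""
    states = {0: (0.0, [])}        # run length -> (best cost, levels so far)
    for c, q in zip(coeffs, qtable):
        nxt = {}
        for run, (cost, levels) in states.items():
            for lvl in level_candidates(c, q):
                dist = (c - lvl * q) ** 2
                if lvl == 0:
                    nrun, bits = run + 1, 0.0          # rate is paid later
                else:
                    nrun = 0
                    size = abs(lvl).bit_length()
                    bits = (run // 16) * ZRL_BITS + SYMBOL_BITS + size
                total = cost + dist + lam * bits
                if nrun not in nxt or total < nxt[nrun][0]:
                    nxt[nrun] = (total, levels + [lvl])
        states = nxt
    # Trailing zeros are all swallowed by a single end-of-block symbol.
    _, best_levels = min(
        (cost + (lam * EOB_BITS if run > 0 else 0.0), levels)
        for run, (cost, levels) in states.items())
    return best_levels

# Made-up, decaying AC coefficients for one block and a flat quantization table.
rng = np.random.default_rng(0)
ac = rng.normal(0.0, 60.0, 63) * np.linspace(1.0, 0.05, 63)
q = np.full(63, 16)
naive = [int(round(c / 16)) for c in ac]           # plain rounding
tuned = trellis_quantize_ac(ac, q, lam=2.0)        # rate-distortion search
```

Raising lam makes the search zero out more of the small high-frequency coefficients, trading a little extra distortion for fewer bits; that trade-off is what a real trellis-quantizing encoder tunes per image.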

More information: blog.mozilla.org/research/2014… the-mozjpeg-project/
github.com/mozilla/mozjpeg

User comments: 5

Eikka
5 / 5 (3) Mar 06, 2014
The internet is full of people who upload photographs and game screenshots as 32 bit PNG files because that's the default file format of their graphics editor, or 500 frame video clips as GIF files, or simply neglect to even remove ICC profiles and metadata from files taken straight out of their digital cameras or cellphones and uploaded to websites as-is. If the picture is too large to fit the layout, they simply force it down to size in the html tag.

Stupidity and ignorance are the main cause of internet traffic - not the efficiency of JPEG compression. Nobody even bothers to use pngcrush to make their files smaller, so why would they use this?
hangman04
not rated yet Mar 06, 2014
I could only see this working for image hosting/sharing boards/sites. From their point of view, a net 10% decrease is a gain.
Jarek
not rated yet Mar 06, 2014
There is a new approach to entropy coding which combines speed of Huffman coding with accuracy of arithmetic coding - would be perfect e.g. to encode DCT coefficients here.
Some implementation and sources: https://github.co...eEntropy
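What the comment describes sounds like the asymmetric numeral systems (ANS) family of entropy coders, and the truncated link appears to point at an implementation of that family; both readings are assumptions here. Under that assumption, the toy sketch below shows the range variant (rANS) in its simplest form: the whole message is folded into a single arbitrary-precision integer state, and each symbol grows that state by roughly log2(M/f) bits. Real coders add renormalization and a byte stream; this is only the core recurrence.

```python
# Toy rANS coder with a fixed, assumed frequency table.
freqs = {"a": 3, "b": 1, "c": 4}                   # symbol frequencies
M = sum(freqs.values())                            # total frequency
cum, acc = {}, 0
for s in freqs:                                    # cumulative frequencies
    cum[s], acc = acc, acc + freqs[s]

def encode(message, x=1):
    # Push symbols in reverse so the decoder pops them in original order.
    for s in reversed(message):
        f, c = freqs[s], cum[s]
        x = (x // f) * M + c + (x % f)
    return x

def decode(x, n):
    out = []
    for _ in range(n):
        slot = x % M                               # identifies the symbol
        s = next(t for t in freqs if cum[t] <= slot < cum[t] + freqs[t])
        out.append(s)
        x = freqs[s] * (x // M) + slot - cum[s]    # exact inverse of encode
    return "".join(out)

msg = "cacabac"
state = encode(msg)
assert decode(state, len(msg)) == msg
```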
alfie_null
5 / 5 (1) Mar 07, 2014
Quoting Eikka: "Stupidity and ignorance are the main cause of internet traffic - not the efficiency of JPEG compression. Nobody even bothers to use pngcrush to make their files smaller, so why would they use this?"

The Internet is full of lossily compressed jpeg images that don't lend themselves to manipulation or reuse. Can't blow them up. Full of artifacts. I suppose I could attribute this to stupidity and ignorance, but I think, rather, inertia.
Eikka
not rated yet Mar 07, 2014
Quoting alfie_null: "The Internet is full of lossily compressed jpeg images that don't lend themselves to manipulation or reuse."

Maybe they aren't meant to? Maybe the author never wanted it to?

JPEG is a publishing format, and usually when people publish an image for viewing, they aren't thinking about someone grabbing it for use elsewhere. Many people just erroneously think that any image on Google image search is public domain.

Mostly what you're describing is a case of "Ooh I like that, I wish I had that, but I don't want to pay StockPhotos for the full res one".