Is Extreme Image Optimization Possible in Java?


I am attempting to write image optimization software in Java. The first and most obvious step was to strip EXIF metadata, which I have done successfully. I then tried to compress images using ImageIO and the compression quality parameter, as shown below:

    String filepath = chosen.getCanonicalPath() + "-temp.jpg";
    File file = new File(filepath);
    Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpg");
    if (!writers.hasNext()) {
        throw new IllegalStateException("No writers found");
    }
    OutputStream os = new FileOutputStream(file);
    ImageWriter writer = writers.next();
    ImageOutputStream ios = ImageIO.createImageOutputStream(os);
    writer.setOutput(ios);

    ImageWriteParam param = writer.getDefaultWriteParam();
    param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    param.setCompressionQuality(0.9f);
    writer.write(null, new IIOImage(optimized, null, null), param);
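For reference, the metadata-stripping step mentioned above can be done with plain ImageIO: decoding to a BufferedImage keeps only pixel data, so re-encoding it drops EXIF and similar segments. A minimal, self-contained sketch (the sample image and temp files are made up for illustration):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;

public class StripExifSketch {
    public static void main(String[] args) throws Exception {
        // Build a small sample image so the sketch is self-contained.
        BufferedImage sample = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        File original = File.createTempFile("original", ".jpg");
        ImageIO.write(sample, "jpg", original);

        // ImageIO.read decodes only the pixel data; EXIF and other
        // metadata segments are not carried into the BufferedImage.
        BufferedImage pixels = ImageIO.read(original);

        // Re-encoding those pixels yields a file without the original
        // metadata blocks (at the cost of one lossy re-encode).
        File stripped = File.createTempFile("stripped", ".jpg");
        ImageIO.write(pixels, "jpg", stripped);
        System.out.println("stripped=" + (stripped.length() > 0));
    }
}
```

Note that this round-trip re-encodes the JPEG, so it is not byte-for-byte lossless; tools that only strip metadata segments avoid that.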

However, this doesn't work well. It works reasonably when the source image is in a different format (i.e. not JPEG), but when compressing a JPEG to a JPEG it actually makes the file size larger.
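That growth is typical when re-encoding at a quality higher than the source was saved at. One defensive pattern, sketched here under the assumption that you simply discard the re-encoded file whenever it is not smaller than the source:

```java
import javax.imageio.*;
import javax.imageio.stream.ImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.*;
import java.util.Random;

public class CompressIfSmaller {

    // Encode the image as JPEG at an explicit quality setting.
    static void writeJpeg(BufferedImage image, File out, float quality) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            ImageWriteParam param = writer.getDefaultWriteParam();
            param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
            param.setCompressionQuality(quality);
            writer.write(null, new IIOImage(image, null, null), param);
        } finally {
            writer.dispose();
        }
    }

    // Re-encode at the given quality, but keep the result only when it
    // is actually smaller than the source; otherwise return null.
    static File compressIfSmaller(File source, float quality) throws IOException {
        BufferedImage image = ImageIO.read(source);
        File candidate = File.createTempFile("candidate", ".jpg");
        writeJpeg(image, candidate, quality);
        return candidate.length() < source.length() ? candidate : null;
    }

    public static void main(String[] args) throws IOException {
        // A noisy sample image stands in for a real photo here.
        BufferedImage noise = new BufferedImage(128, 128, BufferedImage.TYPE_INT_RGB);
        Random rng = new Random(42);
        for (int y = 0; y < 128; y++)
            for (int x = 0; x < 128; x++)
                noise.setRGB(x, y, rng.nextInt(0x1000000));
        File source = File.createTempFile("source", ".jpg");
        writeJpeg(noise, source, 0.95f);

        File smaller = compressIfSmaller(source, 0.5f);
        System.out.println("shrunk=" + (smaller != null));
    }
}
```

Dropping the quality below the source's original setting is what actually shrinks a JPEG-to-JPEG re-encode; the guard just makes the failure mode harmless.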

Sites like www.tinyjpg.com claim to (and do) reduce the file size of JPEG images by 40%-60% with no quality loss. How on earth do they do this (both procedurally and programmatically)? What types of data are they removing, and how is it possible to remove that data with no quality loss? Can I possibly achieve this in Java?

Any guidance and/or resources you can give me would be appreciated!

From Wikipedia's JPEG article (the "Lossless further compression" section), I'd guess TinyJPG has been using improved algorithms developed after the creation of the standard tools. Such improved algorithms are implemented in packJPG, which is conveniently open-sourced. There doesn't appear to be a Java implementation, though.
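With no Java port available, one pragmatic option is to shell out to the native packJPG binary from Java. This sketch assumes a `packjpg` executable on the PATH that takes the input file as its only argument; the exact command-line interface is an assumption, so check the tool's own documentation:

```java
import java.io.File;
import java.io.IOException;

public class PackJpgWrapper {
    // Attempt to run an external packJPG binary (assumed command name:
    // "packjpg") on the given file. Returns true if the tool ran and
    // exited cleanly, false if it is not installed or the call failed.
    static boolean recompress(File jpeg) {
        try {
            Process p = new ProcessBuilder("packjpg", jpeg.getAbsolutePath())
                    .inheritIO()
                    .start();
            return p.waitFor() == 0;
        } catch (IOException | InterruptedException e) {
            return false; // binary not on PATH, or the call was interrupted
        }
    }

    public static void main(String[] args) {
        File input = new File(args.length > 0 ? args[0] : "photo.jpg");
        System.out.println("packjpg ran: " + recompress(input));
    }
}
```

Shelling out keeps the heavy lifting in well-tested native code, at the cost of a deployment dependency on the binary.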

As a side note: packJPG claims a 20% improvement while TinyJPG claims 70%. One of those might be an overstatement, so you might want to test both claims anyway.
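When testing those claims, the number to compare is the percentage reduction in file size. As a quick sanity check on what the two figures above would mean for a hypothetical 1 MB file:

```java
public class SavingsCheck {
    // Percentage reduction in size: (before - after) / before * 100.
    static double reductionPercent(long before, long after) {
        return 100.0 * (before - after) / before;
    }

    public static void main(String[] args) {
        // A 20% claim (packJPG-style): 1,000,000 bytes -> 800,000 bytes.
        System.out.println(reductionPercent(1_000_000, 800_000));
        // A 70% claim (TinyJPG-style): 1,000,000 bytes -> 300,000 bytes.
        System.out.println(reductionPercent(1_000_000, 300_000));
    }
}
```

Measuring on your own corpus of images matters more than either headline number, since savings depend heavily on how the sources were originally encoded.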

