researchandtechnology.net
Benchmarks of the BCIF lossless image compression algorithm
http://www.researchandtechnology.net/bcif/benchmarks.php
The BCIF compression algorithm has been compared with the main lossless compression standards: JPEG 2000 (implementation from Kakadu Software) and JPEG-LS (reference implementation). It is the most rece...
researchandtechnology.net
About the BCIF lossless image compression algorithm
http://www.researchandtechnology.net/bcif/about.php
The BCIF lossless image compression algorithm was born as an evolution of the older PCIF algorithm. It has been developed and realized by Stefano Brocchi.
en.wikipedia.org
Lossless compression - Wikipedia, the free encyclopedia
https://en.wikipedia.org/wiki/Lossless_data_compression
From Wikipedia, the free encyclopedia. (Redirected from Lossless data compression.) Lossless compression is a class of data compression algorithms that allows the original data to be perfectly reconstructed from the compressed data. By contrast, lossy compression permits reconstruction only of an approximation of the original data, though this usually improves compression ratios (and therefore reduces file sizes). Lossless data compression is used in many applications; for example, it is used in the ZIP file format. Most lossless compres...
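The defining property in the snippet above — perfect reconstruction of the original data — can be demonstrated with any lossless codec. A minimal sketch using Python's standard-library `zlib` (the DEFLATE algorithm used by ZIP); the sample string is illustrative:

```python
import zlib

original = b"Lossless compression must restore every byte. " * 200

# Compress, then decompress: the round trip is bit-for-bit identical.
compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)
assert restored == original

# Redundant input compresses well; ratio = raw size / compressed size.
ratio = len(original) / len(compressed)
assert len(compressed) < len(original)
```

Unlike a lossy codec, the achievable ratio depends entirely on how much redundancy the input contains; incompressible data may even grow slightly.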
fastcompression.blogspot.com
RealTime Data Compression: Portability woes : Endianess and Alignment (Part 2)
http://fastcompression.blogspot.com/2014/11/portability-woes-endianess-and.html
Development blog on compression algorithms. Tuesday, November 25, 2014. Portability woes: Endianess and Alignment (Part 2). In the previous part, endianess and 32/64-bit detection were detailed. Now we'll have a look at more complex memory alignment troubles. Alignment is a lesser-known issue, but its impact can be huge: it will crash your program or result in a disproportionate slowdown. ... But it has a cost: it makes the CPU more complex and consumes some precious transistor space. As a consequence, several CPU vendors selected ...
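Both portability issues named in the post can be observed from Python's `struct` module — a hedged illustration, not the blog's own C code: the `<`/`>` format prefixes fix the byte order explicitly, while `@` uses the host's native order and inserts the same alignment padding a C compiler would.

```python
import struct
import sys

value = 0x01020304

little = struct.pack('<I', value)  # explicit little-endian layout
big    = struct.pack('>I', value)  # explicit big-endian layout
assert little == b'\x04\x03\x02\x01'
assert big    == b'\x01\x02\x03\x04'

# Native byte order: matches whichever layout the host CPU uses.
native = struct.pack('=I', value)
assert (native == little) == (sys.byteorder == 'little')

# Alignment: '@' pads like a C struct, '=' packs with no padding.
packed_size  = struct.calcsize('=BI')  # 1 + 4 = 5 bytes, no padding
aligned_size = struct.calcsize('@BI')  # typically 8: padding after the B
assert packed_size == 5
assert aligned_size >= packed_size
```

The padding bytes are exactly what native code "spends" to keep the 4-byte integer on an aligned address; serialization formats avoid them by defining an explicit, packed layout.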
fastcompression.blogspot.com
RealTime Data Compression: Zstandard
http://fastcompression.blogspot.com/p/zstandard.html
Development blog on compression algorithms. Zstd, short for Zstandard, is a new lossless compression algorithm which provides both good compression ratio and speed. "Standard" translates into everyday situations which neither look for the highest possible ratio (which LZMA and ZPAQ cover) nor extreme speeds (which LZ4 covers). A reference source code is provided as a BSD-licensed package, hosted on GitHub, which also hosts some up-to-date compression benchmarks. You can download the latest release there. ...
fastcompression.blogspot.com
RealTime Data Compression: 5/24/15 - 5/31/15
http://fastcompression.blogspot.com/2015_05_24_archive.html
Development blog on compression algorithms. Friday, May 29, 2015. Since I started programming a few years ago, having selected data compression as my little hobby and obsession, I nonetheless remained a part-time, amateur programmer. I therefore started programming with a good excuse: I was convinced that it helped me understand and communicate with programming teams, hence making me a better ... That couldn't last. With a baby soon to come, it became clear that I would either have to stop, by starvation...
fastcompression.blogspot.com
RealTime Data Compression: Compression benchmark
http://fastcompression.blogspot.com/p/compression-benchmark.html
Development blog on compression algorithms. On this page, a new methodology is proposed to compare different compression programs depending on target speed criteria. We're using a simple scenario in which a file is compressed, sent over a variable-speed pipe, and then decompressed. The times are added and the totals compared, which gives the following graph (click on the graph for an enlarged display). Therefore, a new representation is proposed to improve results clarity, using the same figures. Matt's LTCB (Large...
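The scenario the snippet describes — compress, send over a pipe, decompress, add the times — reduces to simple arithmetic. A sketch with made-up, illustrative numbers (the function name and figures are not from the benchmark page):

```python
def round_trip_time(raw_mb, ratio, compress_mbps, pipe_mbps, decompress_mbps):
    """Total time in seconds to compress a file, send it over a pipe,
    and decompress it. Sizes in MB, speeds in MB/s; ratio = raw/compressed."""
    compressed_mb = raw_mb / ratio
    return (raw_mb / compress_mbps        # time to compress
            + compressed_mb / pipe_mbps   # time on the wire
            + raw_mb / decompress_mbps)   # time to decompress

# Hypothetical codec: 2.5:1 ratio, 400 MB/s compression, 1000 MB/s
# decompression, sent over a 10 MB/s pipe.
t_codec = round_trip_time(100, 2.5, 400, 10, 1000)  # 0.25 + 4.0 + 0.1 = 4.35 s
t_raw   = 100 / 10                                  # sending uncompressed: 10 s
assert t_codec < t_raw
```

The point of the methodology follows directly from the model: the winning program changes with pipe speed — on a slow pipe, ratio dominates; on a very fast pipe, compression and decompression speed dominate and a strong-but-slow codec can lose to sending raw data.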
fastcompression.blogspot.com
RealTime Data Compression: Sampling, or a faster LZ4
http://fastcompression.blogspot.com/2015/04/sampling-or-faster-lz4.html
Development blog on compression algorithms. Tuesday, April 7, 2015. Sampling, or a faster LZ4. Quite some time ago, I had been experimenting with some unusual sampling methods in an attempt to find a better way to compress data with LZ4. It turned out my expectations were too optimistic: any time I tried to reduce the update rate, it would result in a correspondingly reduced compression ratio. With that experiment failed, I settled for an "optimal" sampling pattern, which became the core of LZ4. LZ4 fast ...
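The "update rate" the post refers to is how often the match-finder's table of previously seen positions is consulted and refreshed; skipping positions trades compression ratio for speed. A much-simplified toy sketch of that trade-off — the skip rule, names, and dict-keyed table are illustrative assumptions, not LZ4's actual code, which hashes 4-byte sequences into a fixed-size table:

```python
def matched_bytes(data, acceleration=1):
    """Toy greedy matcher: counts how many input bytes are covered by
    back-references found through a table of previously seen 4-byte keys.
    Higher acceleration -> larger skips after misses -> faster, fewer matches."""
    table = {}
    i = 0
    matched = 0
    misses = 0
    while i + 4 <= len(data):
        key = data[i:i + 4]
        prev = table.get(key)
        table[key] = i  # update the table at the current position
        if prev is not None:
            # Extend the match as far as the bytes keep agreeing.
            length = 4
            while i + length < len(data) and data[prev + length] == data[i + length]:
                length += 1
            matched += length
            i += length
            misses = 0
        else:
            # Skip step grows with consecutive misses, scaled by acceleration.
            i += 1 + (misses >> 6) * acceleration
            misses += 1
    return matched

# Highly repetitive input: nearly everything is covered by one long match.
assert matched_bytes(b"abcd" * 100) == 396
```

Raising `acceleration` makes the scanner advance faster through regions where no match is found, which is exactly where the time goes on poorly compressible data — at the cost of overlooking some matches, hence the reduced ratio the post describes.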