Man, this is exactly what I needed a year ago. We had to twiddle all kinds of knobs in JPEG compression to make images from an onboard camera fit into our very limited uplink budget, and we still couldn't guarantee anything because of the nature of JPEG. The compression target-size feature is very important for actually handling images as scientific data in a constrained environment.
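For anyone curious, that knob-twiddling usually amounts to a binary search over the quality setting with no guarantee of hitting the budget. A minimal sketch, with `encode` standing in for a hypothetical JPEG encoder call:

```python
# Binary-search a codec's quality knob toward a byte budget.
# `encode(q)` is a stand-in for a real encoder (hypothetical
# signature); unlike ICER's target-size mode there is no
# guarantee -- even the lowest quality may overshoot.
def fit_to_budget(encode, max_bytes, q_lo=1, q_hi=95):
    best = None
    while q_lo <= q_hi:
        q = (q_lo + q_hi) // 2
        data = encode(q)
        if len(data) <= max_bytes:
            best, q_lo = data, q + 1   # fits: try higher quality
        else:
            q_hi = q - 1               # too big: lower quality
    return best                        # None if nothing fits

# stand-in "encoder" whose output size grows with quality
fake = lambda q: b"x" * (100 + 10 * q)
assert len(fit_to_budget(fake, 600)) <= 600
assert fit_to_budget(fake, 50) is None
```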
I recognised the artifacts at lower quality levels, thought "this looks like JPEG2000", and as expected, it's wavelet-based. A quick skim through the specification shows that it's very similar to JPEG2000, but also greatly simplified.
This is great. I will have a look later tonight for sure.
I used a NASA shape-from-shading algorithm as the basis of a Python script I wrote for Blender3D back in the early 2000s to turn photos into 3D bas reliefs for carving on my 4'x8' router table. I felt pleased to get something out of my tax dollar!
I was experimenting with wavelets to analyze EKG data (there's some cool stuff out there with this). I'll have to see what happens to time-series EKG data under this kind of compression!
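The core idea is the same in 1-D: split the signal into a coarse approximation plus detail coefficients. A toy one-level Haar transform (not the integer filters ICER actually uses, just an illustration):

```python
# One level of a Haar wavelet transform on a 1-D signal:
# pairwise averages (coarse approximation) and pairwise
# half-differences (details). Perfectly invertible.
def haar_forward(x):
    avg = [(a + b) / 2 for a, b in zip(x[0::2], x[1::2])]
    det = [(a - b) / 2 for a, b in zip(x[0::2], x[1::2])]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

signal = [4, 6, 10, 12, 8, 8, 0, 2]
avg, det = haar_forward(signal)
assert haar_inverse(avg, det) == signal   # lossless round trip
```

Compression comes from the detail coefficients being mostly near zero for smooth signals, so they quantize cheaply.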
I like the cnc carving bit. I have long thought about setting that up for myself, but I am worried I will almost immediately run out of projects. What are some of the things you use it for?
I was making carvings for wooden kayaks built from 4mm Okoume plywood that I CNC-cut on the same table.
I also started a business, The Wooden Image, where people could upload a photo, preview a rendering of the bas relief created from it and order it to be cut on some 1" maple board. Staining was extra, but really brought out the relief and made it pop.
I received a call from a guy who created bronze plaques for gravestones and wanted to buy my software. Nothing ever came of that, probably because I was on to the next thing by then. "thewoodenimage.com" shows up on the Internet Archive, but without the photos and the photo-upload section...
In package repositories for Debian, Fedora, etc.? As this library has only just been open-sourced, it hasn't been added to Debian yet, but it may one day appear here: https://packages.debian.org/search?keywords=icer
Something like glib2 or boost maybe? They have lots of separate modules (or at least you can pick and choose which bits to use and what to ignore), but (especially glib) form a single library with a fairly consistent style.
Error-correcting codes can perfectly recover the original data as long as the error rate stays below a threshold, but past that threshold they can recover almost nothing.
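You can see the cliff with the simplest code there is, a 3x repetition code with majority-vote decoding:

```python
# 3x repetition code: any single bit flip per triple is corrected,
# but two flips in the same triple make the majority vote lie.
# Below the threshold recovery is perfect; past it, errors go
# straight through -- the "cliff" of error-correcting codes.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
coded = encode(data)
coded[0] ^= 1                     # one flip in the first triple
assert decode(coded) == data      # corrected
coded[1] ^= 1                     # a second flip in the same triple
assert decode(coded) != data      # majority vote now decodes wrong
```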
If you combine error-correcting codes with a standard image codec, which usually has an error-intolerant entropy-coding stage, an unrecoverable error means you probably lose a significant portion of the image.
With a method like this, instead, the image quality smoothly degrades as the error rate increases.
Video formats meant for editing (e.g. MJPEG) are a series of standalone images (intra-frame only), but video formats meant for playback (e.g. H.264, HEVC, AV1) are much more compressed because frames depend on other frames (inter-frame): blocks of pixels in one frame copy a block of pixels from another frame (backwards and forwards!) and apply a "diff".
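The copy-plus-diff idea in miniature (a toy sketch, not any real codec's bitstream):

```python
# Toy inter-frame prediction: a block in the current frame is coded
# as a motion vector into the reference frame plus a small residual
# ("diff"); the decoder copies the referenced block and adds the diff.
def get_block(frame, y, x, size=4):
    return [row[x:x + size] for row in frame[y:y + size]]

ref = [[8 * r + c for c in range(8)] for r in range(8)]  # previous frame
mv = (1, 1)                                              # found by block search
pred = get_block(ref, 2 + mv[0], 3 + mv[1])              # decoder-side copy
cur = [[p + 1 for p in row] for row in pred]             # scene moved, brightened
residual = [[c - p for c, p in zip(cr, pr)] for cr, pr in zip(cur, pred)]
rebuilt = [[p + d for p, d in zip(pr, dr)] for pr, dr in zip(pred, residual)]
assert rebuilt == cur   # motion vector + tiny residual reproduces the block
```

This is also why a single lost reference frame corrupts every frame that predicts from it until the next intra frame arrives.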
Yeah!
There could be some interesting uses. Because this is loss-tolerant, I'm thinking of consumer drones, where the stream is usually H.264: when you lose a frame, the entire following second looks bad.