NASA ICER image compression algorithm as a C library (github.com/therealorange)
95 points by asicsp on March 24, 2023 | hide | past | favorite | 22 comments


Man this is exactly what I needed a year ago. We had to twiddle all kinds of knobs in jpeg compression to make images from an onboard camera fit into our very limited uplink budget. We still couldn't guarantee anything, because of the nature of jpeg. The compression target size feature is very important for actually handling images as scientific data in a constrained environment.
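The usual workaround when the codec itself has no rate control is an outer loop that binary-searches the quality knob until the output fits the byte budget (ICER builds this in). A sketch of that loop, using zlib's compression level as a stand-in quality knob since no JPEG encoder is assumed here:

```python
import zlib

def fit_to_budget(encode, qualities, budget):
    """Binary-search the highest quality whose encoded size fits the byte budget.
    Assumes encoded size is (roughly) non-decreasing in quality."""
    lo, hi, best = 0, len(qualities) - 1, None
    while lo <= hi:
        mid = (lo + hi) // 2
        if len(encode(qualities[mid])) <= budget:
            best = qualities[mid]     # fits: try a higher quality
            lo = mid + 1
        else:
            hi = mid - 1              # too big: back off
    return best

# Stand-in "encoder": map quality 0..9 onto zlib levels 9..0,
# so higher quality means larger output, as with JPEG.
data = bytes(range(256)) * 64
encode = lambda q: zlib.compress(data, 9 - q)

best = fit_to_budget(encode, list(range(10)), budget=2000)
```

The catch the commenter ran into: this costs one full encode per probe, and there is still no hard guarantee the search lands exactly on the budget, which is why a codec with built-in rate control is so much nicer.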


I recognised the artifacts at lower quality levels, thought "this looks like JPEG2000" and as expected, it's wavelet-based. A quick skim through the specifications shows that it is very similar to JPEG2000, but also much simplified.


> compressed size 69913, time taken: 0.054055

Converting the same image with bpgenc yields 19092 bytes. Probably not tolerant of transmission errors, though.


In that case, compressing with bpgenc and then interleaving some error correction to get to a similar size would probably yield far more fault tolerance.


been a fan of this format for years. it has nice lossless compression, too


This is great. I will have a look later tonight for sure.

I used a NASA shape-from-shading algorithm as the basis of a Python script I wrote for Blender3D back in the early 2000s to turn photos into 3D bas reliefs for carving on my 4'x8' router table. I felt pleased to get something out of my tax dollar!

I was experimenting with wavelets to analyze EKG data (there's some cool stuff out there with this). I'll have to see what happens to a time series EKG data using compression!


I like the cnc carving bit. I have long thought about setting that up for myself, but I am worried I will almost immediately run out of projects. What are some of the things you use it for?


I was making carvings for the wooden kayaks that were built from 4mm CNC'd Okoume plywood I cut on the same table.

I also started a business, The Wooden Image, where people could upload a photo, preview a rendering of the bas relief created from it and order it to be cut on some 1" maple board. Staining was extra, but really brought out the relief and made it pop.

I received a call from a guy who created bronze plaques for gravestones and he wanted to buy my software. Nothing ever came of that probably due to the fact that I was on to the next thing back then. "thewoodenimage.com" shows on the internet archive, but not with the photos and the upload photo section...


This image compression algorithm is optimized for progressive download.

In space, bandwidth is low and latency is high. A good progressive algorithm helps a lot.
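ICER's real scheme is a 2-D wavelet plus bit-plane coding, but why a truncated wavelet stream still decodes to something useful can be shown with a toy 1-D Haar transform (my sketch, nothing from the library): the average band alone already reconstructs a coarse version of the signal, and each detail coefficient that arrives later refines it.

```python
def haar(x):
    """One level of the 1-D Haar transform: pairwise averages and differences."""
    avg = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    diff = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return avg, diff

def unhaar(avg, diff):
    """Invert one Haar level; with zeroed diffs this yields a coarse signal."""
    out = []
    for a, d in zip(avg, diff):
        out += [a + d, a - d]
    return out

signal = [10, 12, 9, 7, 4, 6, 8, 8]
avg, diff = haar(signal)

exact = unhaar(avg, diff)            # full stream: exact reconstruction
coarse = unhaar(avg, [0] * len(avg)) # truncated stream: blurred but recognizable
```

Stopping the download early is equivalent to zeroing the coefficients that never arrived, which is why quality degrades smoothly instead of the image simply failing to decode.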


I like these small, single purpose, C libraries. Is there a maintained list of these somewhere?



In package repositories for Debian, Fedora etc? As this library has only just been open sourced it hasn't been added to Debian yet but may one day appear here: https://packages.debian.org/search?keywords=icer


I know exactly what you mean, but I am going to be intentionally obtuse as a joke.

Small, single-purpose code? I usually call that a library. Where are you finding multipurpose libraries?


Something like glib2 or boost maybe? They have lots of separate modules (or at least you can pick and choose which bits to use and what to ignore), but (especially glib) form a single library with a fairly consistent style.


I just learned about fountain codes about 2 seconds ago. I wonder how the trade-offs compare between something like Raptor and this algorithm.

https://en.wikipedia.org/wiki/Raptor_code
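For the curious, the core fountain-code idea fits in a few lines. This is a toy LT-style sketch with a made-up degree distribution, nothing like a real Raptor implementation: the encoder emits an endless stream of random XOR combinations of source blocks, and the receiver just collects symbols until a peeling decoder recovers everything, with no retransmission requests needed.

```python
import random

def fountain_symbols(blocks, rng):
    """Endless stream of symbols: (set of source indices, XOR of those blocks)."""
    k = len(blocks)
    while True:
        d = rng.choice([1, 2, 2, 3, 4])            # toy degree distribution
        idxs = frozenset(rng.sample(range(k), d))
        val = 0
        for i in idxs:
            val ^= blocks[i]
        yield idxs, val

def decode_fountain(k, symbol_stream):
    """Collect symbols until a peeling decoder has recovered all k blocks."""
    known, pending = {}, []
    for idxs, val in symbol_stream:
        pending.append([set(idxs), val])
        progress = True
        while progress:                            # peel until no progress
            progress = False
            for sym in pending:
                solved = sym[0].intersection(known)
                for i in solved:                   # subtract known blocks
                    sym[1] ^= known[i]
                sym[0] -= solved
                if len(sym[0]) == 1:               # degree-1: block solved
                    (i,) = sym[0]
                    if i not in known:
                        known[i] = sym[1]
                        progress = True
        if len(known) == k:
            return [known[i] for i in range(k)]

src = random.Random(0)
blocks = [src.randrange(256) for _ in range(8)]    # 8 one-byte source "blocks"
recovered = decode_fountain(len(blocks), fountain_symbols(blocks, random.Random(1)))
```

The trade-off versus ICER's approach: a fountain code protects an arbitrary bitstream against erasures but decodes all-or-nothing, while ICER makes the image bitstream itself degrade gracefully.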


Error correcting codes will be able to perfectly recover the original data as long as the error rate stays below a threshold, but past that threshold they can hardly recover anything.

If you combine EC codes with a standard image codec, which usually has some non-tolerant entropy stage, an unrecoverable error means you probably lose a significant portion of the image.

With a method like this, instead, the image quality smoothly degrades as the error rate increases.
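That threshold behaviour is easy to see with the simplest possible code, a 3x repetition code (purely an illustration; real deep-space links use far stronger codes): one flipped bit per triple is corrected exactly, but two flips and the majority vote goes the wrong way.

```python
def encode3(bits):
    """3x repetition code: each bit is sent three times."""
    return [b for bit in bits for b in (bit,) * 3]

def decode3(coded):
    """Majority vote over each triple."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1, 0]
coded = encode3(msg)

one_err = coded[:]
one_err[0] ^= 1                      # one flip in a triple: corrected
assert decode3(one_err) == msg

two_err = coded[:]
two_err[0] ^= 1
two_err[1] ^= 1                      # two flips in a triple: past the threshold
assert decode3(two_err) != msg       # the vote now decodes the wrong bit
```

Below the threshold the output is bit-perfect; above it the decoder confidently outputs garbage, which is the cliff the parent comment describes.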


Could this be extended to video compression? I guess not, but naively thinking a video is just 30 pics a sec...


Video formats meant for editing (e.g. mjpeg) are a series of images (intra frame only), but video formats meant for playback (e.g. H.264, HEVC, AV1) are much more compressed, by making frames depend on other frames (inter frame). Blocks of pixels in one frame copy a block of pixels from another frame (backwards and forwards!), and apply a "diff".


> naively thinking a video is just 30 pics a sec

No, that's literally all video is. The problem comes in the implementation of inter-frame diffs.


Yeah! There could be some interesting uses. Because this is loss tolerant, I'm thinking of consumer drones, where the stream is usually H.264: when you lose a frame, the entire following second looks bad.


Motion ICER, or MICER?


There is also Google's Guetzli JPEG encoder: https://github.com/google/guetzli



