
Compressing large arrays of floating point data?

category: code [glöplog]
Well, I thought the erosion was done on the 2-4k² heightmap, not as a step going from the 8x8 seed heightmap up to the final 2-4k² heightmap.

But it might be worth extracting a diff between your nicely eroded + hand-painted 2-4k² heightmap and a simpler expansion of the seed heightmap, and compressing that diff.
added on the 2013-04-18 13:01:04 by p01 p01
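A minimal sketch of that diff idea in numpy (expand_seed() is a hypothetical stand-in for whatever cheap upscaling of the 8x8 seed the intro already has):

import numpy as np

def expand_seed(seed, factor):
    # cheap nearest-neighbour upscale, standing in for the intro's own
    # expansion of the 8x8 seed (illustrative only)
    return np.kron(seed, np.ones((factor, factor), seed.dtype))

def make_diff(eroded, seed):
    base = expand_seed(seed, eroded.shape[0] // seed.shape[0])
    return eroded.astype(np.int32) - base.astype(np.int32)  # compress this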
16-bit greyscale PNG is one easy solution, or JPEG2000 if you're OK with lossy compression. That said, if you are into lossless compression: PNG uses Deflate, while LZMA is usually significantly better, size-wise. I used it to store spritesheets and got 25% smaller files with something rather less clever than rrrola's approach:
* pixels in Z-order
* deltas between pixels, rather than the values themselves

If it's a heightmap, you might consider encoding it at scales 1x, 2x, 4x, ..., where the elevations at scale 2^(n+1) are the deltas to add to scale 2^n. Z-order lets you store that neatly; you can see it as an implicit quadtree. For your typical heightmap, the larger scales store most of the "entropy", so they are the hardest to compress and contribute the least to the overall heightmap look.
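A lossless sketch of that implicit-pyramid idea in numpy (the function names and the nearest-neighbour predictor are mine, not from the post):

import numpy as np

def to_pyramid(h16):
    # split a 2^n x 2^n heightmap into a 1x1 base plus per-level deltas
    levels = []
    cur = h16.astype(np.int32)
    while cur.shape[0] > 1:
        coarse = cur[::2, ::2]                         # 2x nearest downsample
        predicted = np.repeat(np.repeat(coarse, 2, 0), 2, 1)
        levels.append(cur - predicted)                 # residual at this scale
        cur = coarse
    levels.append(cur)                                 # 1x1 base value
    return levels[::-1]                                # coarse to fine

def from_pyramid(levels):
    cur = levels[0]
    for delta in levels[1:]:
        cur = np.repeat(np.repeat(cur, 2, 0), 2, 1) + delta
    return cur.astype(np.uint16)

The fine levels carry most of the entropy, as described; concatenating them (in Z-order if you like) and handing the result to Deflate or LZMA is then the actual compression step.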
wavelets.
added on the 2013-04-18 13:45:20 by kusma kusma
xtrim: this is from one of my own landscape rendering experiments, using a 1024² 8-bit texture with 1 or 2 detail height maps added:

BB Image

No visible issue with stepping. From what I can remember, I generated the normals from a blurred copy of the height map, where the blur radius depended on the local image contrast (i.e. 'rough' areas get little blurring, 'smooth' areas get a lot). This removes the stepping from the lighting, at least. I then simply stored the normals alongside the height map texture as RGBA (it's a mixed raytrace/march setup, so this saves time on the normal calculation and extra texture lookups).
added on the 2013-04-18 13:45:25 by psonice psonice
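Roughly how that contrast-dependent blur could look (my reconstruction, not psonice's actual code; the radii, window size and max-minus-min contrast measure are all guesses):

import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def adaptive_blur(height, r_small=1.0, r_large=8.0, window=9):
    height = height.astype(np.float64)
    # local contrast: max minus min over a small window, normalized to 0..1
    contrast = maximum_filter(height, window) - minimum_filter(height, window)
    contrast /= max(contrast.max(), 1e-9)
    # high-contrast ('rough') areas keep the lightly blurred version,
    # low-contrast ('smooth') areas get the heavy blur
    return (contrast * gaussian_filter(height, r_small)
            + (1.0 - contrast) * gaussian_filter(height, r_large))

def normals_from_height(height, z_scale=1.0):
    gy, gx = np.gradient(height.astype(np.float64) * z_scale)
    n = np.dstack((-gx, -gy, np.ones_like(gx)))
    return n / np.linalg.norm(n, axis=2, keepdims=True)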
@xTr1m: inspiring stuff!
added on the 2013-04-18 15:06:51 by trc_wm trc_wm
Quote:
procedural generation already takes 45 seconds for that, due to the bruteforce and highly serial erosion algorithm I'm using. Precalc is not an option anymore :)


on the CPU or the GPU?

(in case of CPU: do a GPU version of it, or use multithreading on the CPU)
added on the 2013-04-18 16:06:42 by abductee abductee
What kusma said. But I guess properly applying some of the more trivial approaches presented here should already be good enough.
added on the 2013-04-18 16:15:05 by las las
abductee: I'm sorry, but I explicitly said "highly serial" algorithm. That means that it can't be parallelized, at least not easily.
added on the 2013-04-18 16:18:53 by xTr1m xTr1m
Quote:
As a reference, here's the same heightmap I linked earlier, rendered with L3DT. Notice how 8-bit precision is just not enough.

Actually I like the look of the quantized version much more than the smooth one...

JPEG2000 is optimized for visual perception, so it doesn't work particularly well on data that doesn't represent brightness.
Those "0.25 bits/pixel" figures are far from realistic for 16-bit data.
added on the 2013-04-18 16:38:54 by hfr hfr
xTr1m: oh, sorry, i must have missed that... :P

added on the 2013-04-18 16:55:48 by abductee abductee
xtrim: how small do you need your compressed heightmap to be?
added on the 2013-04-18 18:53:20 by trc_wm trc_wm
I've got no reference value; I just know that 16MB for 2048² or 64MB for 4096² (as raw 32-bit floats) is overkill. I also need space for the music, meshes, textures and code (incl. used libs), all within a total size of 64MB.
added on the 2013-04-18 20:25:16 by xTr1m xTr1m
First results:

2048² 16bit = 8MB uncompressed, 4.83MB png 16-bit grayscale. Compression rate: 60.375%
4096² 16bit = 32MB uncompressed, 17.40MB png 16-bit grayscale. Compression rate: 54.375%
added on the 2013-04-18 20:43:54 by xTr1m xTr1m
Actually, slightly reducing the high-frequency details lowers the 4096² version to 15.2MB, for a 47.5% compression rate.
added on the 2013-04-18 20:49:27 by xTr1m xTr1m
You can get better compression ratios using a custom compression algo, for sure.

I used a predictor + rice encoding for lossless audio compression and got around 20% of the original size. This should work for height fields also, perhaps even better because of the 2D structure.
added on the 2013-04-18 21:15:36 by trc_wm trc_wm
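A toy version of that scheme (the left-neighbour predictor and k=4 are arbitrary choices, and real code would pack bits instead of building a string):

import numpy as np

def rice_encode(values, k=4):
    # Rice code for non-negative ints: unary quotient, then k remainder bits
    out = []
    for v in values:
        q, r = int(v) >> k, int(v) & ((1 << k) - 1)
        out.append('1' * q + '0' + format(r, '0%db' % k))
    return ''.join(out)

def compress_row(row, k=4):
    residual = np.diff(row.astype(np.int64), prepend=np.int64(0))
    mapped = [2 * v if v >= 0 else -2 * v - 1 for v in residual]  # zigzag
    return rice_encode(mapped, k)

For a height field you'd predict each sample from its left and upper neighbours (e.g. west + north - northwest) to exploit the 2D structure mentioned above.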
Nice results. Certainly better than 64meg.

You should definitely go for some lossy compression imho. The hills and slopes of the terrain don't need to be reproduced with the same per-pixel precision as cliffs and rivers, and there are lots of smooth areas. The compression rate can easily go up to 90%.

This image describes perfectly what I want to say.

If you put it in a tree/graph-like structure, you can run an algorithm which removes nodes that can be reproduced by your favorite polynomial interpolation with less than, say, 10% error.

I'd probably just use jpeg though.
added on the 2013-04-18 21:16:37 by musk musk
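musk doesn't pin the structure down, but one concrete reading is a quadtree whose cells collapse to their 4 corner values whenever bilinear interpolation stays within a tolerance (bilinear instead of a fancier polynomial, and the absolute tolerance, are my simplifications; assumes a (2^n)+1-sized map):

import numpy as np

def prune(h, x0, y0, size, tol):
    # return ('leaf', corner values) if bilinear interpolation of the 4
    # corners reproduces the cell within tol, else subdivide into 4 children
    sub = h[y0:y0 + size + 1, x0:x0 + size + 1].astype(np.float64)
    t = np.linspace(0.0, 1.0, size + 1)
    ty, tx = np.meshgrid(t, t, indexing='ij')
    approx = (sub[0, 0] * (1 - tx) * (1 - ty) + sub[0, -1] * tx * (1 - ty)
              + sub[-1, 0] * (1 - tx) * ty + sub[-1, -1] * tx * ty)
    if size == 1 or np.abs(approx - sub).max() <= tol:
        return ('leaf', sub[0, 0], sub[0, -1], sub[-1, 0], sub[-1, -1])
    half = size // 2
    return ('node', [prune(h, x0 + dx, y0 + dy, half, tol)
                     for dy in (0, half) for dx in (0, half)])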
What I'd definitely do is split it into two standard JPEG files: one stores the low frequencies, the other the higher ones. Their sum gives back the original image, and you get roughly 16-bit precision.
added on the 2013-04-18 21:27:42 by musk musk
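A sketch of that split (the sigma, JPEG quality settings and filenames are made up; the one real subtlety is taking the residual against the *decoded* low-frequency JPEG, so that its error is corrected by the second image):

import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter

def split_two_jpegs(h16):  # h16: uint16 heightmap
    low = gaussian_filter(h16.astype(np.float32), sigma=8.0)
    low8 = np.clip(low / 256.0, 0, 255).astype(np.uint8)
    Image.fromarray(low8).save('low.jpg', quality=90)
    low_dec = np.asarray(Image.open('low.jpg'), np.float32) * 256.0
    resid = h16.astype(np.float32) - low_dec  # small wherever 'low' fits well
    hi8 = np.clip(resid + 128.0, 0, 255).astype(np.uint8)
    Image.fromarray(hi8).save('high.jpg', quality=95)
    # reconstruction: low_dec + (high - 128); clipping at steep cliffs is
    # one reason the result is only roughly 16-bit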
This is of absolutely no interest for your problem, but your heightmap, already seen in "work in progress", reminds me of diffusion-limited aggregation, as in
BB Image
but with the particles going from the center to the edge of a disc rather than the other way around.
added on the 2013-04-18 21:32:31 by baah baah
Huge improvement: I'm now saving two 8-bit grayscale PNGs, one with the high-order bytes and the other with the low-order bytes. Here they are:
11.6MB (low bytes)
976KB (high bytes)
That makes a total of 12.5MB, only 39.06% of the original size, lossless.
added on the 2013-04-18 22:04:50 by xTr1m xTr1m
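The byte-plane split described above is a few lines of numpy (PIL as the PNG reader/writer; the file names are made up):

import numpy as np
from PIL import Image

h16 = np.asarray(Image.open('heightmap16.png'), np.uint16)  # 16-bit grayscale
Image.fromarray((h16 >> 8).astype(np.uint8)).save('high_bytes.png')
Image.fromarray((h16 & 0xFF).astype(np.uint8)).save('low_bytes.png')

# lossless round trip
hi = np.asarray(Image.open('high_bytes.png'), np.uint16)
lo = np.asarray(Image.open('low_bytes.png'), np.uint16)
assert np.array_equal((hi << 8) | lo, h16)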
For 2048² it's 3.57MB and 441KB, which is 49.63% of the original size. So at both sizes, splitting the high and low bytes into two PNGs is definitely an improvement.

I've also implemented loading and reconstructing the height values back into floats, and the 16-bit precision is quite sufficient :) I'm pretty much satisfied now! Thanks for all your help :)
added on the 2013-04-18 22:49:16 by xTr1m xTr1m
Nice saving :) Have you tried the same thing with jpeg to see if there's any serious downside?
added on the 2013-04-18 23:47:37 by psonice psonice
The problem is that if you split the byte planes BEFORE delta-encoding, then your lower byte plane will mostly be just noise with a lot of discontinuities. Try what rrrola said: delta-encode the whole thing (although personally I'd only do it in 1D), offset it back, and THEN split it into two planes.
added on the 2013-04-18 23:53:15 by Gargaj Gargaj
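Gargaj's ordering, sketched in 1D numpy (uint16 wraparound stands in for the "offset it back" step):

import numpy as np

def delta_then_split(h16):
    flat = h16.astype(np.uint16).ravel()
    delta = np.diff(flat, prepend=np.uint16(0))  # deltas wrap mod 65536
    return (delta >> 8).astype(np.uint8), (delta & 0xFF).astype(np.uint8)

# inverse: np.cumsum(deltas, dtype=np.uint16) recovers the flat array

The high-byte plane of the deltas then ends up almost entirely runs of 0x00/0xFF and compresses to nearly nothing.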
now please use terragen or smth to do better mountains than the one you showed :P
added on the 2013-04-18 23:54:58 by maali maali
Gargaj: if you look at the MSB map http://www.xtr1m.com/temp/heightMap2.png you'll see that it's already very good for PNG compression. I'm thinking of delta-coding only the LSB map.
added on the 2013-04-19 00:01:33 by xTr1m xTr1m
Be sure to run your PNGs through pngout before coming to conclusions about compression ratios. As iq says, the PNG format includes all sorts of delta encoding tricks that can help here, and I wouldn't count on any random graphics util/library being able to exploit those tricks to maximum effectiveness.
added on the 2013-04-19 00:18:30 by gasman gasman
