Re: mesh encoding for 4k intros

category: general [glöplog]
Interesting read.

You might also want to check out:

(<2 bits for triangle connectivity, <1 bit for vertex locations => <3 bits per triangle total)

http://www.cs.technion.ac.il/~gotsman/AmendedPubl/SpectralCompression/Spectral Compression.pdf
(makes a good argument for quantizing frequency spectrum instead of directly quantizing vertex locations)

added on the 2007-10-18 19:15:21 by kusma
Hmm, some of those decompression routines look too large for 4k...
added on the 2007-10-18 19:45:22 by auld
auld: Loonies already used a simplified version of Edgebreaker in Benitoite
added on the 2007-10-18 20:22:18 by kusma
I'm no PC coder, but as far as I know meshes should be generated, not encoded :)
added on the 2007-10-18 20:37:19 by Oswald
True... but I wonder if iq's method beats it due to its simplicity and reliance on the compressor?
added on the 2007-10-18 20:50:35 by auld
mrtheplague, as far as I understand the Edgebreaker paper, they only talk about topology compression, and then 2 bits/tri is not impressive; I think I'm around 5 bits/tri with my cheap methods. The real problem is coding the vertices (prediction errors), but unfortunately they don't talk about that (don't give figures) in the paper.

However, I know the state of the art is a few bits per triangle, geometry and topology included. But that only works for meshes where prediction works very well, i.e. highly tessellated (read: smooth) meshes... Which raises the question of whether it's worth encoding a low-res mesh at a high bitrate and subdividing, instead of directly storing a high-res mesh and skipping the subdivision code... My feeling is the second will be best for 4k, but I'm not sure of course.

Should I put some of the meshes I used in the public domain so people can test and try to beat my compression rates? That would be cool, and very useful for all of us!

ps - Oswald, only if you want to show extruded cubes and planes for the rest of your life... ;)
added on the 2007-10-18 21:23:09 by iq
sorry, I meant "first will be best for 4k" (low_poly_mesh + subdiv)
added on the 2007-10-18 21:25:01 by iq
i think you should
added on the 2007-10-18 23:43:07 by doomdoom
Yeah iq, I think you're right that these papers focus on large meshes and don't consider code size. The Edgebreaker paper does talk about vertex compression, but they use a parallelogram predictor which, as you say, doesn't make much sense for meshes that aren't locally flat. Compressing control meshes for subdivision surfaces is tougher since they're already in such a compact format... so basically I agree with everything you're saying! :) Just thought I'd add those links for people who hadn't seen them in case they inspired some new ideas.

I think you should definitely release your data for people to play with. It's easy to sit around on a message board and speculate about what works best, but more useful to actually go and try it out!
ok, I'll try to do it asap...

1. I guess a plain OBJ or XML file with the list of vertices and quad indices should be fine for everyone? Or even simpler, just two small C arrays (no parsing needed)?
2. Should we do it (posting results or whatever) here in a pouet thread?
added on the 2007-10-19 18:46:53 by iq
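The "two small C arrays" option could look something like the fragment below (the names and the single unit quad are made up for illustration, not the actual contest data):

```c
/* Hypothetical "no parsing needed" mesh format: positions and quad
 * indices as plain C arrays, compiled straight into the intro. */
static const float mesh_verts[] = {
    /*  x,     y,     z  */
    -1.0f, -1.0f,  0.0f,
     1.0f, -1.0f,  0.0f,
     1.0f,  1.0f,  0.0f,
    -1.0f,  1.0f,  0.0f,
};
static const unsigned short mesh_quads[] = {
    0, 1, 2, 3,  /* one quad = four indices into mesh_verts */
};
enum {
    MESH_NUM_VERTS = sizeof(mesh_verts) / (3 * sizeof(float)),
    MESH_NUM_QUADS = sizeof(mesh_quads) / (4 * sizeof(unsigned short)),
};
```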
That would be fun!
added on the 2007-10-19 21:23:38 by doomdoom
Oh, and please do post some object I can load in Max. Like .obj. But C arrays would also be handy.
added on the 2007-10-19 21:24:42 by doomdoom
iq: wavefront .obj works everywhere. :)
added on the 2007-10-21 12:44:51 by smash smash
what about starting a little competiton...
we take a base object (maybe the pouet pig obj?) and try to get the best possible quality/size relation results!
added on the 2007-10-21 14:13:15 by las
Yeah, I vote for OBJ too.

If you want to play around with the pig model, I'm hosting it here: http://p.oisono.us/origins

If we're going to have a competition, we'll have to define an error metric (i.e., a way of determining how far the reconstructed model is from the original model). Maybe someone can code up a little "judge" app that sucks in two OBJs and spits out the relative error. [Maybe that person can be me.]
or how about public voting?
added on the 2007-10-21 21:17:56 by kusma
mrtheplague, could you upload the unsubdivided mesh? That one is far too high-poly for the purpose of the competition...

ah, you can develop the error calculator, that would be nice.
added on the 2007-10-22 01:45:07 by iq
Ok, preliminary version of the contest page: http://www.rgba.org/iq/trastero/4kmesh

Sorry for my poor English. Please comment so we can set the contest up asap.
added on the 2007-10-22 01:50:15 by iq
(ah, I will prepare the basic sample app as soon as I have one hour free)
added on the 2007-10-22 01:52:01 by iq
The links-section is a bit... wrong? :)
added on the 2007-10-22 02:03:53 by kusma
"We introduce FreeLence, a novel and simple single-rate compression coder for triangle manifold meshes. Our method uses free valences and exploits geometric information for connectivity encoding. Furthermore, we introduce a novel linear prediction scheme for geometry compression of 3D meshes. Together, these approaches yield a significant entropy reduction for mesh encoding with an average of 30% over leading single-rate region-growing coders, both for connectivity and geometry."
added on the 2007-10-22 03:27:43 by update
i don't see the point. why would you go to great lengths to encode a mesh in 4k when you could just as well code the mesh? except for a few specific 4k ideas of course, doing it by hand or with some homebrew tool (and not importing some 3ds file) sounds much more reasonable to me...
added on the 2007-10-22 08:00:16 by skrebbel
>mrtheplague, could you upload the unsibdivided mesh?

yeah, it had a bunch of non-quads in it which were a bitch to get rid of, but I've now added a quad-only control mesh:


(there's also a link from the original page)