Comments on Procedural World: "Playing Dice with Mesh Simplification"
Blog author: Miguel Cepero

MCeperoG (2011-11-21 09:23):
@Voodoo: The chance of error does go up dramatically, close to 100%. But by that point you are almost done with the simplification. See an earlier comment about this issue.

Deja Voodoo (2011-11-21 02:29):
Quick question about the algorithm: suppose you're simplifying a structure with ~100,000 edges down to 1%, i.e. 1,000 edges. Wouldn't the rate of error go up as fewer edges remain? It's a bit beyond my level; I'm just interested in understanding how this works.

Abnaxis (2011-10-28 15:06):
Derp. I should've known the thing wasn't initialized with costs. That's the point...

Nonetheless, the initial mesh has to be generated at some point, and since the probability curve I am envisioning is based on shape, not mesh (areas with sharp changes need more detail), and since it's going to take some integration over the surface, mesh generation seems like a good step in which to do it.
I wish I had more time to spend looking at this...

MCeperoG (2011-10-28 14:47):
@Abnaxis: With the MCA approach you don't need to initialize the costs for each edge anymore. Costs are computed every time for all the edges in the set and immediately discarded once one edge is selected for collapse.

It may seem like overhead, but it is not. When costs are precomputed, you still need to update some of them after each edge collapse. A single collapse usually alters 5 to 15 neighbor edges depending on mesh connectivity, which means each collapse requires computing an average of 10 edge errors. That is roughly the same work the MCA does for each set.

Your idea of guiding the random function is intriguing. However, it may come at some cost; the blind-luck approach could still be faster. There is no way to tell without measuring it.

Abnaxis (2011-10-28 14:29):
If I understand correctly, you still initialize the cost for each vertex when you initialize this subroutine. I wonder if you couldn't guide the routine a bit at this step, by generating a smoothed surface representing the level of detail needed in an area (the bunny's nose, to use an example from the paper).

You could then use that curve to weight the probabilities as you randomly pick your points, favoring areas where mesh simplification is more likely to be beneficial instead of using a uniform random function.
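The weighted-pick idea above can be sketched in a few lines. This is only an illustration, not code from the post: the `pickEdgeWeighted` name, the per-edge weight vector, and the use of `std::discrete_distribution` are all assumptions about how such a guided random pick might look.

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Hypothetical guided pick: instead of choosing an edge uniformly at
// random, draw it with probability proportional to a per-edge weight
// (high in flat regions where a collapse is likely beneficial, low
// near detailed features such as the bunny's nose).
std::size_t pickEdgeWeighted(const std::vector<double>& collapseWeight,
                             std::mt19937& rng)
{
    // discrete_distribution returns index i with probability
    // collapseWeight[i] / sum(collapseWeight).
    std::discrete_distribution<std::size_t> dist(collapseWeight.begin(),
                                                 collapseWeight.end());
    return dist(rng);
}
```

The weights themselves would come from the smoothed detail surface described in the comment; computing them once during mesh generation keeps the per-pick overhead to a single distribution draw.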
It wouldn't require any significant additional overhead (just an extra couple of operations every time you generate a random number, and the curve generation can be done while you are initializing the mesh), and it could make the method that much more efficient...

MCeperoG (2011-10-28 09:15):
@Sam: Thanks.
All the code is in C++ using OpenCL and OpenGL. I create the basic architecture blocks in 3ds Max. It is all running on Windows.

Sam Gould (2011-10-28 07:38):
Looks like great work you are doing here.
This is something I want to learn myself. How are you creating it? For instance, what programming language are you using, and are the procedural buildings' parts first made in a 3D program and then mixed together procedurally?

Potado (2011-10-27 22:54):
Really great article! Lots of the stuff on here is too high-level for my skill set, but I really like clever posts like this one and your "Political Landscape" post!

MCeperoG (2011-10-27 13:57):
@Julien: I mean you need to re-sort the edges that have changed. You can keep a heap with the lowest-error edge on top, but as soon as you collapse that edge, two vertices change their positions. The remaining edges sharing these two vertices change too; easily 5 to 15 edges can be affected. That means the error metric initially computed for them is no longer valid. As soon as you recompute their errors, you need to see where they go in the heap. After all, one of the changed edges may now become the best next choice and hence the top of the heap.

This step is actually in the slides you linked (slide 48): "update the mesh and the keys". Updating the keys means recomputing the errors for the edges that changed and seeing where they end up in the heap.

Julien Koenen (2011-10-27 13:28):
Again: you don't need to sort all the edges.
You can just use a heap data structure for that. A description of the complete algorithm (using a heap) can be found here: http://www.cs.mtu.edu/~shene/COURSES/cs3621/SLIDES/Simplification.pdf

MCeperoG (2011-10-27 13:02):
@Julien: You don't need to re-evaluate all of them, only the ones affected by the previous collapse. But then you need to re-sort those, because their error metrics have changed. This is what the optimized serial greedy method does. Still, the MCA approach is twice as fast.

Julien Koenen (2011-10-27 12:46):
If you collapse an edge you don't have to re-evaluate *all* other edges in your mesh, because the only changed edges are the newly introduced ones. So why not just use a heap (ordered by maximum error) and add the new edges into the heap (which should be O(log N))? Repeat, and you have a nice O(N log N) algorithm.

MCeperoG (2011-10-27 12:24):
@Worthstream: I actually did this for a while: a first pass using MCA, then switching to greedy once the number of remaining edges was small enough. It did not make a big difference error-wise, so I ended up removing the greedy phase and relaxing the final polygon budget a little.
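The MCA pass discussed throughout this thread (draw a small random set of edges, evaluate the error of each candidate on the fly, collapse only the cheapest, and respect a maximum-error cutoff) can be sketched roughly as follows. This is a minimal illustration, not the post's actual code: the `Edge` stand-in with a precomputed `cost` field, and the names `mcaRound` and `setSize`, are assumptions.

```cpp
#include <cstddef>
#include <limits>
#include <random>
#include <vector>

// Stand-in edge record: a real implementation would derive the cost
// from an error metric (e.g. quadrics) at evaluation time.
struct Edge {
    double cost;             // error introduced by collapsing this edge
    bool collapsed = false;
};

double collapseCost(const Edge& e) { return e.cost; }
void   collapse(Edge& e) { e.collapsed = true; }

// One multiple-choice round: draw `setSize` random candidates, compute
// their costs on the fly, collapse only the cheapest one, then discard
// the costs. No priority queue has to be kept up to date between
// rounds. Assumes `edges` is non-empty.
void mcaRound(std::vector<Edge>& edges, std::size_t setSize,
              double maxError, std::mt19937& rng)
{
    std::uniform_int_distribution<std::size_t> pick(0, edges.size() - 1);

    std::size_t best = edges.size();   // sentinel: nothing chosen yet
    double bestCost = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i < setSize; ++i) {
        const std::size_t candidate = pick(rng);
        const double cost = collapseCost(edges[candidate]);
        if (cost < bestCost) { bestCost = cost; best = candidate; }
    }
    // Maximum-error test: if even the cheapest candidate would degrade
    // quality beyond the budget, this round produces no collapse.
    if (best < edges.size() && bestCost <= maxError)
        collapse(edges[best]);
}
```

A real version would also remove collapsed edges from the pool and update the affected neighbors; counting consecutive no-collapse rounds gives the stopping criterion mentioned in the thread.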
What also happens is that very often the 1% of remaining edges would never collapse anyway, because they fail a maximum-error test. This maximum error is an input parameter to the method: you are not only interested in the triangle count, some minimum quality has to be preserved as well. So as you approach the finish line, you start getting sets of edges that produce no collapse at all, because they all fail the maximum-error test. When this happens too often, the algorithm knows it is time to stop.

Anyway, the MCA paper I linked shows some error metrics comparing the two methods; they are not bad at all.

Worthstream (2011-10-27 11:57):
Great article!
About the chances of errors: in the last iteration there are 1,001 edges remaining and the algorithm picks 10 of them. The chance of correctly picking the worst edge is 10 in 1,001, around 1%, so there is roughly a 99% chance of an error in that simplification step.

What about switching from the greedy heuristic to the original algorithm once the chances of making an error are too high (tens of thousands of edges remaining, perhaps)?

MCeperoG (2011-10-27 10:59):
I remember looking at some different libraries. I'm not very sure about MeshLab, but I think they use a greedy serial QEM implementation. I looked at qslim, OpenMesh, MeshLib and several others. They were either too slow or bloated with feature creep. But the main issue is that even the good ones will impose a mesh structure on you.

The Multiple Choice simplification method is only a couple hundred lines of code. I think it is better to write it yourself and get to know it well. It will probably take you one day of work, and it will pay off later, since you will be able to control and fine-tune it yourself.

ALoopingIcon (2011-10-27 10:39):
Just a question, but have you tried some open-source mesh simplification algorithms, like the one in MeshLab?
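The error-chance arithmetic debated in this thread can be checked with a few lines. This sketch (the function name `chanceHitsTopM` is illustrative, not from the thread) computes the probability that a uniformly drawn set of k distinct edges out of n contains at least one of the m worst edges, i.e. 1 - C(n-m, k) / C(n, k):

```cpp
// Probability that a random set of k distinct edges, drawn from n,
// contains at least one of the m largest-error edges. Evaluated as a
// running product to avoid computing huge binomial coefficients.
// Assumes 0 <= m and k + m <= n.
double chanceHitsTopM(int n, int k, int m)
{
    double missAll = 1.0;   // probability the set misses all m worst edges
    for (int i = 0; i < k; ++i)
        missAll *= static_cast<double>(n - m - i) / (n - i);
    return 1.0 - missAll;
}
```

For the example above (n = 1001, k = 10, m = 1) this gives 10/1001, about 1%, matching the comment's figure. Relaxing the target from the single worst edge to, say, any of the worst 5% (m = 50) raises the hit chance to about 40%, which is why a blind random set is a reasonable candidate pool until very few edges remain.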