On 9/09/2010 4:52 AM, Simonson, Lucanus J wrote:
Your algorithm could find a home in Boost itself as a "strategy" for polygon clipping that can be passed into the Boost.Geometry generic interface. If that interests you, I suggest you join the Boost.Geometry project and mailing list and pursue contributing your code. I expect that your participation would be welcomed by the other members of the project; certainly I welcome it. It is possible to make a floating point implementation of Vatti, such as yours, numerically robust, and that is something we could pursue as a community in the Boost.Geometry project.
Thanks Luke for your kind welcome. I wasn't hoping for Clipper to be integrated into the Boost project in any way; I just chose the Boost License for its simplicity. I think that multiple solutions for the same task in a code library are generally confusing to users, though I can see that sometimes two are justified. However, I'm very open to some form of collaboration where the better ideas from each algorithm might be integrated into one solution, assuming that's possible.
I like the looks of this page. It looks like you are doing what I would call the "right" kind of testing, with randomly generated circles and polygons. I approve of beating GPC at its own game of intersecting Great Britain with a spiral. Bravo. GPC is substantially robust; not 100%, but darn close. While you are making your comparison you might want to mention that. The reason GPC is so reliable stems from its suboptimal sweepline data structure: the tree data structure recommended by Vatti and Bentley-Ottmann has its invariant violated by numerical robustness errors, while the list GPC maintains tolerates those errors better.
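The list-versus-tree point can be illustrated with a small sketch (illustrative C++ only, not GPC or Vatti code; all names here are invented). A sweep status kept as a plain list is searched by linear scan, which still finds an edge even when two entries end up out of order after a robustness error; a balanced tree's find() instead depends on the ordering invariant and can miss an element once that invariant is violated:

```cpp
#include <cassert>
#include <vector>

// Illustration only (invented names, not GPC or Vatti code).
struct ActiveEdge {
    int id;          // identity of the edge
    double xAtSweep; // x coordinate where the edge crosses the sweep line
};

// Order-independent lookup: a linear scan tolerates an incorrect sort of
// 'active', which is the property that makes the list structure forgiving.
static const ActiveEdge* listFind(const std::vector<ActiveEdge>& active, int id) {
    for (const ActiveEdge& e : active)
        if (e.id == id)
            return &e;
    return nullptr;
}
```

Here the entries can sit in the wrong x-order and the lookup still succeeds, whereas a std::set keyed on xAtSweep would be searching a tree whose comparator no longer matches its layout.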
I believe Clipper is currently substantially more reliable than GPC, because I don't think GPC properly handles complex intersections. I'm not aware of the tree data structure recommended by Vatti et al. (I couldn't find any mention of it in Vatti's original paper); my implementation uses the singly and doubly linked list data structures that are described in Vatti's paper. I would be very interested if you could provide a link to Vatti's and Bentley-Ottmann's discussion of a tree-structured approach.

Concerning robustness: I believe numerical robustness problems only arise when complex intersections are encountered. Specifically, how does an algorithm cope with multiple intersections occurring in very close proximity? Simply relying on a 'tolerance' when comparing floating point values will guarantee failure at some point. On failure, intersection operations are performed in an incorrect order, resulting in incorrect edge sorts. Fortunately, this is easily detected after each sweep. While not a final solution, I have an update close to release that tests the edge sort before actually processing the intersections. If the test sort order is wrong, the 'tolerance' value is adjusted (both up and down) and the sort is retested. This testing has only a minor impact on performance (about 10%).

In my stress tests*, using this 'tuned' tolerance approach, the incidence of failure drops to about one in a million tests. (*Each test generates random subject and clip polygons of 100 vertices each, rotated at varying degrees, with the vertices specifically tuned to generate many complex intersections.) Given that complex intersections are infrequent events in normal clipping use (which is why GPC rarely blinks in normal use), I would argue that the likelihood of users encountering a failure with Clipper is very much lower than one in a million.

You also mentioned scaling as an issue with floating point implementations. I agree that it is an issue, and input polygons with very small or very large coordinate values should be scaled. Anyhow, a fully numerically robust solution is still my final goal.
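As a rough sketch of the kind of pre-scaling this implies (my illustration, not actual Clipper code; the function name and the target range are invented), one can pick a power-of-two factor that brings the largest coordinate magnitude into a moderate range. A power of two only changes the floating point exponent, so the scaling step itself introduces no rounding error:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

struct Point { double x, y; };

// Hypothetical pre-scaling step (not Clipper's actual code): choose a
// power-of-two factor that maps the largest coordinate magnitude near
// 'target', so intersection arithmetic keeps more significant bits.
static double chooseScale(const std::vector<Point>& pts, double target = 1024.0) {
    double maxAbs = 0.0;
    for (const Point& p : pts)
        maxAbs = std::max({maxAbs, std::fabs(p.x), std::fabs(p.y)});
    if (maxAbs == 0.0)
        return 1.0; // degenerate input: leave coordinates untouched
    return std::exp2(std::round(std::log2(target / maxAbs)));
}
```

Coordinates would be multiplied by the returned factor before clipping and divided by it again afterwards.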
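The edge-sort test with a 'tuned' tolerance might look roughly like the following sketch (illustrative only; the names and the retry factors are invented, and the real update operates on Clipper's internal edge lists rather than a plain vector):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

struct SweepEdge {
    double xAtSweep; // x coordinate where the edge crosses the sweep line
};

// Is the active edge list sorted by x, allowing disagreements up to 'tol'?
static bool sortedWithinTolerance(const std::vector<SweepEdge>& edges, double tol) {
    for (std::size_t i = 1; i < edges.size(); ++i)
        if (edges[i].xAtSweep < edges[i - 1].xAtSweep - tol)
            return false; // out of order by more than the tolerance
    return true;
}

// Test the sort before processing intersections; if it fails, adjust the
// tolerance both up and down and retest, committing to the first value that
// yields a consistent order. Returns a negative value when none works.
static double tuneTolerance(const std::vector<SweepEdge>& edges, double baseTol) {
    const double factors[] = {1.0, 10.0, 0.1, 100.0, 0.01};
    for (double f : factors)
        if (sortedWithinTolerance(edges, baseTol * f))
            return baseTol * f;
    return -1.0;
}
```

Only when some tolerance passes the test would the intersections actually be processed; a negative return corresponds to the rare outright failure case.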
I can review your code if you like. I expect that, with 15 years of experience programming in any language, your C++ will be reasonably well written in terms of procedural or object-oriented style. Lack of generic programming style is not a fatal design flaw in my book. I can quickly identify performance problems due to C++ usage to help you get the most out of the language. We can add templates for the coordinate data type and so forth to bring it up to the standards required by Boost.Geometry, should you choose to pursue contributing your code.
I would certainly welcome your review of my code. However, I'm not seeking to have it accepted into Boost (for the reason I mentioned above) unless others feel strongly that this is warranted.
Please read the archives of the Boost dev mailing list to learn more background on the geometry work done in Boost over the past several years, and to find the link for joining the Boost.Geometry (ggl) mailing list if you are interested.
I'll do that. Thanks again for your welcome and encouraging feedback. Angus