Angus wrote:
Simonson, Lucanus J http://search.gmane.org/?author=Simonson%2C+Lucanus+J&sort=date wrote:
"I've never seen a comparison with clipper before, I've never heard of clipper and would like to learn more. Can you provide a link or reference? I found a sourceforge site with minimal description of clipper, but it left me with a lot of unanswered questions."
Hi Luke. I'm the author of Clipper and am happy to answer your questions as best I can, but perhaps I can preempt some of them with a little background about Clipper and myself.

Firstly, I've only recently created the Clipper library (i.e. within the last 3-4 months), which may explain why you haven't heard of it. I originally wrote it in Delphi Pascal, but I've translated it into C++ too, and it has been released under the Boost license. I wrote Clipper because I wanted a polygon clipping library (in Delphi) that wasn't encumbered with restrictive licensing. I'm still refining the library and have several updates in the pipeline (including polyline-polygon and polybezier-polygon clipping).

Clipper is based on the Vatti clipping algorithm, which places no restrictions on the types of input (subject and clipping) polygons. However, Vatti's original paper completely omits any discussion of 1. how to handle what I call 'complex' intersections - intersections involving more than 2 edges; and 2. how to handle overlapping horizontal edges. My Clipper library extends the Vatti algorithm by addressing both these issues. Clipper also extends the Vatti algorithm by implementing the NonZero polygon filling rule for input polygons (as well as Vatti's EvenOdd filling).

I'm aware of one limitation of my Clipper library: because it uses floating point math it isn't 'robust' in the strict sense. However, errors (which are extremely rare) are properly trapped and reported as a failed clipping operation.
Vatti is the right algorithm to implement. GPC implements a sort of crippled version of Vatti that uses a suboptimal data structure for the sweepline. I expect that you are using a tree-based sweepline data structure (as Vatti suggests) to implement the optimal sweepline algorithm. My own implementation is similar to Vatti in some ways, and I cite Vatti in my journal article. In addition to handling the high-order intersection points and high-order overlapping collinear line segments, I also handle self-intersecting and self-overlapping polygons as well as multiple overlapping polygons in an input.

I implement a Positive polygon filling rule, which is equivalent to the NonZero filling rule for typical cases; but for cases where there are holes that project outside of polygons, or polygons with "bow-tie" self-intersections, it is different and more mathematically consistent than the NonZero or Odd filling rules.

I take extreme care in my implementation to provide a 100% robust solution. This has forced me to take a divide-and-conquer approach to line segment intersection instead of sweepline. I then feed the fully intersected line segments into a sweepline pass to compute the Positive filling rule, and a second pass to form output polygons. The sweepline that forms polygons is substantially similar to Vatti's algorithm. I do extensive calculations to check that the results of arithmetic are numerically trustworthy. The three passes instead of one, the sub-optimality of divide and conquer, and the high constant-factor cost of providing robust arithmetic all have performance drawbacks when compared to a straightforward implementation of Vatti. Also, I limit my coordinate data type to integer (similar to polyboolean, but actually robust as opposed to purportedly robust). I would expect that your implementation could well be faster than mine at all scales of input, except for the cases where yours fails due to numerical robustness problems.
While you state that such errors are rare, the reality is that as the scale of the data being processed increases, the probability of such an error occurring approaches 100%. In my testing, non-robust algorithms (such as polyboolean) have a limit in the scale of data beyond which they are completely unreliable and effectively useless. This undermines the value of algorithmic optimality. Polyboolean is not optimal, by the way, and is comparable to GPC in this respect.

Right now we have yet another implementation of polygon clipping accepted into Boost as part of the new Boost.Geometry (formerly GGL) library that is neither numerically robust nor algorithmically optimal. However, it has much better chances than an optimal Vatti sweepline implementation of successfully producing an output, because sweepline is a somewhat temperamental algorithm and numerical errors are often fatal conditions, whereas the Boost.Geometry algorithm tolerates them better. My expectation is that for large cases your implementation should be faster than the Boost.Geometry algorithm, but more likely to fail due to numerical errors.

Your algorithm could find a home in Boost itself as a "strategy" for polygon clipping that can be passed into the Boost.Geometry generic interface. If that interests you, I suggest you join the Boost.Geometry project and mailing list and pursue contributing your code. I expect that your participation would be welcomed by the other members of the project; certainly I welcome it. It is possible to make a floating point implementation of Vatti, such as yours, numerically robust, and that is something we could pursue as a community in the Boost.Geometry project.
In my testing (see http://www.angusj.com/delphi/clipper.php#features ), Clipper has a number of advantages over GPC, a well-known clipping library that's also based on Vatti's algorithm. Clipper: 1. is significantly faster; 2. properly handles complex intersections; 3. accepts both EvenOdd and NonZero filling for input polygons; 4. doesn't require payment for commercial use.
I like the looks of this page. It looks like you are doing what I would call the "right" kind of testing, with randomly generated circles and polygons. I approve of beating GPC at its own game of intersecting Great Britain with a spiral. Bravo. GPC is substantially robust, however; not 100%, but darn close. While you are making your comparison you might want to mention that. The reason GPC is so reliable stems from its suboptimal sweepline data structure: the tree data structure recommended by Vatti and Bentley-Ottmann has its invariant violated by numerical robustness errors, while the list GPC maintains tolerates errors better.
The SourceForge site for Clipper is: http://sourceforge.net/projects/polyclipping/ About myself: I've been writing computer code for 30 years as a hobby. (I was a medical practitioner before I retired 15 years ago due to illness.) For the last 15 years I've been using Delphi (Object Pascal) and until now have never used C++. The C++ translation of my Delphi Clipper library is the first code I've written in C++, which should explain any unorthodoxy in my C++ coding style. (I'm aware that the C++ code is currently about 15% slower than the Delphi code, so I suspect I'm not making the most of this language.) I'm very happy to receive constructive feedback and hope I can answer any questions too.
I can review your code if you like. I expect that with 15 years of experience programming in any language your C++ will be reasonably well written in terms of procedural or object oriented style. Lack of generic programming style is not a fatal design flaw in my book. I can quickly identify performance problems due to C++ usage to help you get the most out of the language. We can add templates for coordinate data type and so forth to get it up to the standards required by Boost.Geometry, should you choose to pursue contributing your code. Please read the archives of the boost dev mailing list to learn more background on the geometry work being done in boost over the past several years, and to find the link to get added to the Boost.Geometry (ggl) mailing list if you are interested. Thanks, Luke