That's a worthwhile goal to be sure. When I discussed scalability I meant algorithmic complexity and the ability to process large volumes of input geometry in a single test case. Ahh, thanks for the clarification. I haven't yet done any significant testing with very large data volumes, nor am I competent to determine a big-O value, so I'll defer to your estimation there.
Also, I believe GPC does handle high-order intersections, because I'm quite certain I was generating large numbers of them in the tests I was running, and GPC seemed to handle them just fine. I could be wrong of course, but I suggest you double-check the claim that GPC doesn't properly deal with complex intersections. I know it doesn't deal with overlapping or self-intersecting polygons at all, but that is a separate issue from polygons that have vertices in common, or polygons that abut along edges but do not overlap. GPC is used quite extensively and paid for by many people, so we should be responsible about making claims about its shortcomings.
I was careful to qualify my statements about GPC with "I don't think" and "I believe". However, I accept that this still isn't fair treatment and I should be sure before making these claims. Let me simply state that under my stress-testing, GPC fails about 4% of the time (and with extreme stress-testing - rounding random vertices to the nearest 100 instead of the nearest 10 - it fails far more frequently). The full source code for my stress-tests can be downloaded here: http://www.angusj.com/delphi/clipper.php
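For anyone curious why coarser rounding stresses a clipper harder, the idea can be sketched as below. This is only an illustration of the vertex-snapping technique, not code from the actual stress-test suite (which is Delphi, linked above); the function name and parameters are my own.

```python
import random

def random_snapped_polygon(n, size=1000, grid=10, seed=None):
    """Generate n random vertices snapped to a grid of the given spacing.

    The coarser the grid (e.g. grid=100 versus grid=10), the more likely
    two independently generated polygons share vertices or abut along
    collinear edges - exactly the degenerate cases that expose bugs in
    a clipping algorithm's intersection handling.
    """
    rng = random.Random(seed)
    return [
        (round(rng.uniform(0, size) / grid) * grid,
         round(rng.uniform(0, size) / grid) * grid)
        for _ in range(n)
    ]

# With a coarse grid, "subject" and "clip" polygons frequently end up
# with coincident vertices, which a robust clipper must handle.
subject = random_snapped_polygon(50, grid=100, seed=1)
clip = random_snapped_polygon(50, grid=100, seed=2)
shared_vertices = set(subject) & set(clip)
```

Feeding many such polygon pairs through a clipping operation and checking the output for validity is essentially what a stress test of this kind does.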