Hi,

My application, overlay, constructs and destroys millions of cpp_rational variables. As overlay executes, its memory use continually grows. My variables don't have that many digits, but each one constructed is slightly wider or narrower than the one before. Without studying the cpp_rational source, I surmise that it uses heap storage that gets badly fragmented. Apart from the wasted space, each allocation is probably taking ever more time, which is annoying because overlay would otherwise run in linear time.

What should I do? Are there easy ways to control this problem? Are other rational implementations better? Boost gives me a choice of several. Are there better implementations outside Boost?

-----

Not that it's relevant, but FYI: my application overlays two planar graphs, intersecting all the edges to find the output faces. There might be 100,000 edges. Roundoff error is a serious problem that causes topological impossibilities. The idea is to use rationals to prevent roundoff. Upon reading each input float, I use a continued fraction approximation to convert it to the closest rational justified by the number of significant digits (a sketch is below). That entails constructing and destroying variables of varying widths as the approximation converges, and this is probably what is thrashing the memory. However, if I don't do that, the values start out with perhaps 20 digits, which double each time line equations are computed and lines are intersected. Perhaps I could restructure my code to avoid that, but IMHO it is quite well structured now, and making it ugly to be fast would be a pity.

Help? Thanks.
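
P.S. In case it helps, here is roughly the shape of my conversion loop. The function and variable names are illustrative, not my actual code; the point is the churn of slightly different-sized temporaries at every step:

```cpp
#include <boost/multiprecision/cpp_int.hpp>  // cpp_rational lives here
#include <cmath>

using boost::multiprecision::cpp_int;
using boost::multiprecision::cpp_rational;

// Return the simplest rational within `tol` of x, built from the
// continued-fraction convergents p_k/q_k of x.
cpp_rational to_rational(double x, double tol)
{
    cpp_int p_prev = 1, q_prev = 0;                   // p_{-1}, q_{-1}
    cpp_int p = static_cast<cpp_int>(std::floor(x));  // p_0 = floor(x)
    cpp_int q = 1;                                    // q_0 = 1
    double frac = x - std::floor(x);

    while (frac > 0.0 &&
           std::fabs(x - p.convert_to<double>() / q.convert_to<double>()) > tol)
    {
        double inv = 1.0 / frac;
        cpp_int a  = static_cast<cpp_int>(std::floor(inv));  // next CF term
        frac       = inv - std::floor(inv);

        // Standard convergent recurrence; each step makes p and q a little
        // wider, which is exactly the varying-width construction I described.
        cpp_int p_next = a * p + p_prev;
        cpp_int q_next = a * q + q_prev;
        p_prev = p;  q_prev = q;
        p = p_next;  q = q_next;
    }
    return cpp_rational(p, q);
}
```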
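
And a toy example (with made-up coefficients) of the digit growth I mean without the conversion step: intersecting two lines whose coefficients have roughly 20-digit terms gives coordinates with roughly 40-digit terms.

```cpp
#include <boost/multiprecision/cpp_int.hpp>
#include <iostream>

using boost::multiprecision::cpp_rational;

int main()
{
    // Two lines a*x + b*y = c whose coefficients already have ~20-digit
    // numerators (made-up values).
    cpp_rational a1("12345678901234567891/7"), b1("-98765432109876543211/13"),
                 c1("19999999999999999999/17");
    cpp_rational a2("22222222222222222223/3"), b2("31415926535897932385/11"),
                 c2("27182818284590452353/19");

    cpp_rational det = a1 * b2 - a2 * b1;          // products of ~20-digit values
    cpp_rational x   = (c1 * b2 - c2 * b1) / det;  // intersection by Cramer's rule
    cpp_rational y   = (a1 * c2 - a2 * c1) / det;

    std::cout << numerator(x).str().size() << "-digit numerator, "
              << denominator(x).str().size() << "-digit denominator for x\n";
    (void)y;  // y blows up the same way
}
```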