Georgios Samaras wrote:
Thanks for the replies; it is clearer now. However, the minimum dimension I have in my datasets is 100, and I want to run NN with dimension 10000.
So how am I going to initialize a point?
template <int CompileTimeDimension>
void fill(point& tmp)
{
    for (int j = CompileTimeDimension - 1; j >= 0; --j)
    {
        bg::set<CompileTimeDimension-->(tmp, 5);
    }
}
Even I can see that the above code won't even compile. Writing bg::set<0>(tmp, 5), ..., bg::set<10000>(tmp, 5) manually does not really sound like a good idea. So what should I do?
Or are Boost's nearest neighbors not meant to be used in such high dimensions?
I'm curious about the result! I didn't test Boost.Geometry for such big
dimensions. One point will take 40 kB, right (10000 coordinates * 4 bytes
per float)? 1M points would then require 40 GB of memory. I'm wondering
how big the difference in performance would be between, e.g., a simple
O(N) search over a std::vector and rtree kNN. How many points do you want
to store in the rtree? If the number is too small you may not see the
difference.
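For instance, something along these lines could be used to compare the two approaches (just a sketch, shown at dimension 3 so it is cheap to compile; the sample data is made up):

    #include <cstddef>
    #include <iostream>
    #include <iterator>
    #include <vector>
    #include <boost/geometry.hpp>
    #include <boost/geometry/geometries/point.hpp>
    #include <boost/geometry/index/rtree.hpp>

    namespace bg = boost::geometry;
    namespace bgi = boost::geometry::index;

    typedef bg::model::point<float, 3, bg::cs::cartesian> point;

    int main()
    {
        std::vector<point> pts;
        pts.push_back(point(0, 0, 0));
        pts.push_back(point(1, 1, 1));
        pts.push_back(point(5, 5, 5));

        point query(0.9f, 0.9f, 0.9f);

        // O(N) baseline: scan the vector, keep the closest point.
        std::size_t best = 0;
        float best_d = bg::comparable_distance(query, pts[0]);
        for (std::size_t i = 1; i < pts.size(); ++i)
        {
            float d = bg::comparable_distance(query, pts[i]);
            if (d < best_d) { best_d = d; best = i; }
        }

        // rtree kNN: bulk-load the points, then ask for the 1 nearest.
        bgi::rtree<point, bgi::linear<16> > rt(pts.begin(), pts.end());
        std::vector<point> nearest;
        rt.query(bgi::nearest(query, 1), std::back_inserter(nearest));

        std::cout << "scan: " << bg::get<0>(pts[best])
                  << ", rtree: " << bg::get<0>(nearest[0]) << "\n";
    }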
In order to handle (more or less) arbitrary dimensions you might replace
the for-loop with a compile-time recursive template (see the code below),
assuming the compiler is able to compile it (it will take some time).
E.g. VS2010 stops at CompileTimeDimension=3030 with "error C1001: An
internal error has occurred in the compiler" and the note "compiler is
out of heap space". It would probably be possible to raise that limit
with some compiler parameter, but I wouldn't recommend it.
So if you're sure that you can't specify a fixed set of dimensions for
which your program should work, or at least narrow the possible range of
dimensions, e.g. you know that there can only be {100, 1000, 10000} or at
least [100, 200, 300, ... 10000], then Boost.Geometry is probably not
suitable for your needs. Or maybe someone else has some other idea?
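If you can restrict yourself to such a set, one option is a runtime switch that dispatches to a few compile-time instantiations. A sketch (run_nn here is a hypothetical worker that would define the point type and run the query):

    #include <cstddef>
    #include <iostream>
    #include <stdexcept>

    // Hypothetical worker templated on the compile-time dimension; in a
    // real program it would define the point type and run the NN query.
    template <std::size_t D>
    void run_nn()
    {
        std::cout << "running NN for dimension " << D << "\n";
    }

    // Map a dimension known only at runtime to one of the supported
    // compile-time instantiations. Each case still has to compile, so
    // the compiler limits mentioned above still apply.
    void run(std::size_t runtime_dim)
    {
        switch (runtime_dim)
        {
            case 100:   run_nn<100>();   break;
            case 1000:  run_nn<1000>();  break;
            case 10000: run_nn<10000>(); break;
            default: throw std::runtime_error("unsupported dimension");
        }
    }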
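The compile-time recursive template mentioned above might look roughly like this (a sketch; the struct name filler and the fill value 5 are only illustrative):

    #include <cstddef>
    #include <boost/geometry.hpp>
    #include <boost/geometry/geometries/point.hpp>

    namespace bg = boost::geometry;

    // Set coordinate I-1, then recurse towards the base case.
    template <typename Point, std::size_t I>
    struct filler
    {
        static void apply(Point& p)
        {
            bg::set<I - 1>(p, 5);
            filler<Point, I - 1>::apply(p);
        }
    };

    // Base case: no coordinates left to set.
    template <typename Point>
    struct filler<Point, 0>
    {
        static void apply(Point&) {}
    };

    int main()
    {
        typedef bg::model::point<float, 100, bg::cs::cartesian> point;
        point p;
        filler<point, 100>::apply(p); // one nested instantiation per coordinate
    }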