Hi!

I am trying to use the BGL to create random graphs with 10^6 nodes and approximately 10^12 edges. My system is: Debian Sarge with kernel 2.6.8 on a Pentium M 1.6 GHz with 1 GB of RAM; g++ 3.4.2; Boost 1.32.0 (latest draft tarballs). I am not using the STL libraries from SGI, but the standard libstdc++6.

My first shot with the adjacency_list ate all my main memory, which isn't really surprising. But adjacency_matrix also failed when I passed 10^5 nodes. Running the program within gdb gives me at least the following error message:

    0x080c0fec in boost::detail::get_edge_exists<char>
        (edge_proxy=@0x73fbdc28) at adjacency_matrix.hpp:84
    84          return edge_proxy;

The basic program layout is:

    #include <cstdlib>
    #include <iostream>
    #include <boost/graph/adjacency_matrix.hpp>

    using namespace boost;
    using namespace std;

    typedef adjacency_matrix< undirectedS > graph_t;

    // number of vertices
    const std::size_t vertices = std::size_t(1e5);

    // stand-in for the (unspecified) random vertex generator
    graph_t::vertex_descriptor rand_gen() { return std::rand() % vertices; }

    int main(int, char*[])
    {
        // probability of a connection between two vertices
        const double pc = 0.3;
        // yielding number of edges to be distributed randomly
        const std::size_t edges = std::size_t(pc * vertices * (vertices - 1) / 2);

        graph_t g(vertices);

        std::size_t i = 0;
        while (i++ < edges) {
            if ((i % int(1e5)) == 0)
                cout << "Inserting edge no. " << i << "; completed "
                     << (double(i) / double(edges)) * 100.0 << "% ..." << endl;

            graph_t::edge_descriptor e;
            bool inserted = false;
            do {
                graph_t::vertex_descriptor a = rand_gen(), b;
                do {
                    b = rand_gen();
                } while (a == b);
                tie(e, inserted) = add_edge(a, b, g);
                // if insert fails, then this link already exists -> redo!
            } while (!inserted);
        }
    }

Any help would be really appreciated. Thanks in advance!

Greetings, Sebastian
Sebastian Weber wrote:
Hi!
I am trying to use the BGL to create random graphs with 10^6 nodes and approximately 10^12 edges. My system is:
That's about 10^12 = (10^3)^4 ≈ (2^10)^4 = 2^40 edges, which is more than a 32-bit processor can address.
Debian Sarge with kernel 2.6.8 on a Pentium M 1.6 GHz with 1 GB of RAM; g++ 3.4.2; Boost 1.32.0 (latest draft tarballs). I am not using the STL libraries from SGI, but the standard libstdc++6.
My first shot with the adjacency_list ate all my main memory, which isn't really surprising. But adjacency_matrix also failed when I passed 10^5 nodes.
An adjacency matrix for 10^5 vertices requires (10^5)^2 = 10^10 bytes ≈ 2^33 bytes (roughly 10 GB), I think.
Running the program within gdb gives me at least the following error message:
    0x080c0fec in boost::detail::get_edge_exists<char>
        (edge_proxy=@0x73fbdc28) at adjacency_matrix.hpp:84
    84          return edge_proxy;
That's not an error message ;-)
Any help would be really appreciated.
It looks like you should either get a 64-bit processor with many gigabytes of memory, or use some other approach. - Volodya
On Nov 17, 2004, at 3:04 AM, Sebastian Weber wrote:
I am trying to use the BGL to create random graphs with 10^6 nodes and approximately 10^12 edges. My system is:
That's a _really_ dense graph. An adjacency matrix is clearly the right representation for it, but note that with ~10^12 edges you're on the scale of a terabyte. Unless you have far more memory than I do, you'll need to distribute the graph across a whole bunch of computers. We've been going in this direction in our own research on the Parallel BGL, but have so far focused only on sparse graphs with a distributed adjacency_list, so we won't be much help at the moment :(.

If you can, your best bet is to scale back the problem size dramatically. You might also be able to reduce the memory requirements of adjacency_matrix a bit by allowing it to use a vector<bool> instead of a vector<char>.

Doug
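[The vector<bool> idea can be illustrated outside the BGL as a hand-rolled bit-packed triangular edge matrix (my own sketch, not a BGL API): one bit per unordered vertex pair, an 8x saving over one char:]

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Standalone sketch: bit-packed upper-triangular edge set for an
// undirected graph -- not part of the BGL.
class bit_matrix {
    std::size_t n_;
    std::vector<bool> bits_;  // packed, ~1 bit per vertex pair

    std::size_t index(std::size_t a, std::size_t b) const {
        if (a > b) std::swap(a, b);           // canonicalize: a <= b
        return a * n_ - a * (a + 1) / 2 + b;  // row-major upper triangle
    }

public:
    explicit bit_matrix(std::size_t n) : n_(n), bits_(n * (n + 1) / 2) {}

    // Returns true if the edge was newly inserted (mimics add_edge's flag).
    bool add_edge(std::size_t a, std::size_t b) {
        std::size_t i = index(a, b);
        if (bits_[i]) return false;
        bits_[i] = true;
        return true;
    }

    bool has_edge(std::size_t a, std::size_t b) const {
        return bits_[index(a, b)];
    }
};
```

[For 10^5 vertices this needs n(n+1)/2 bits ≈ 625 MB, which just fits the poster's 1 GB machine, though it still cannot reach 10^6 vertices on 32-bit hardware.]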
participants (3)
- Doug Gregor
- Sebastian Weber
- Vladimir Prus