[REBOL] Depth first search
From: nate:wordplace at: 4-Dec-2001 10:18
I am trying to make a web crawler that traverses the links of a site up to a
depth of 3. First a block of data is created that contains all the pages to
visit with duplicates eliminated. When I run this program on my Red Hat Linux
7.2 box it starts out fine, then slows to a crawl, and starts using 60-80% of
the memory on my machine. I think the problem is that I am passing a copy of
the giant block to every recursive call to a function. **Isn't there any way
to pass a reference to a data structure that will be modified instead of
copying it every time and then returning a new structure?**
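(For what it's worth, REBOL series values such as block! are already passed by reference: a function that appends to a block argument modifies the caller's block in place, with no copy made. A minimal sketch illustrating this; the names `visited` and `note-page` are made up for the example:)

```
visited: copy []            ; one shared block for the whole crawl

note-page: func [pages [block!] page [url!]] [
    ; append modifies the series in place -- no copy is made
    if not find pages page [append pages page]
]

note-page visited http://www.rebol.com
note-page visited http://www.rebol.com  ; duplicate, ignored
probe visited                           ; the shared block now holds the URL
```

(If the block really is being duplicated somewhere, the likely culprit is an explicit `copy` or a function that `return`s a rebuilt block rather than the parameter passing itself.)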
Thanks
Nate