[REBOL] Re: Rebol - core dump
From: arolls:bigpond:au at: 16-Jan-2001 0:27
I vaguely remember an earlier post like this.
Someone suggested at the time to occasionally do a 'recycle.
Try recycling, say, every 50 iterations and see how it goes.
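A minimal sketch of how that could be worked into your forever loop (the `count` counter is my own addition, not part of your script):

```rebol
count: 0
forever [
    ; ... your existing foreach [url description] checking code goes here ...
    count: count + 1
    if count // 50 = 0 [recycle]  ; force a garbage collection every 50 passes
]
```

REBOL normally recycles on its own, so this just forces the collector to run more often in case memory pressure is what is tripping the interpreter.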
I'm running your script here to try it out, to see if I get
the same problem, with each of these versions:
["REBOL/Core 2.3.0.3.1 24-Jun-2000"
 "REBOL/View 0.10.38.3.1 28-Nov-2000"
]
Is your list of urls private? (consider posting [to me]).
Anton.
> I'm trying to run the script below on rebol/core (also tried view).
>
> The script continually reads http pages from a web server and reports any
> difference in subsequent reads to a log file. The script works fine but
> crashes after 3-4 hours with a segmentation fault or core dump - I'm
> running Mandrake 6.1 Linux & Rebol/core 2.3.0.4.2.
>
> Any ideas about how I can solve the problem - I should be testing the web
> server for a crash problem & all I can get is Rebol crashing :-((
REBOL []

urls_to_check: make block! read %stress_urls.txt
result_blk: []
old_result_blk: []
secure [file allow]

forever [
    foreach [url description] urls_to_check [
        if error? url_result: try [
            checksum read make url! url
        ][
            url_result: disarm :url_result
        ]
        switch type?/word url_result [
            integer! [
                url_result: join "Checksum = " [to-string url_result]
            ]
            object! [
                url_result: join "Access Error = " [to-string url_result/code]
            ]
        ]
        print url_result
        print newline
        append result_blk join url [" " description " " url_result newline]
    ]
    if result_blk <> old_result_blk [
        write/append %results.txt join now [newline result_blk newline]
    ]
    old_result_blk: copy result_blk
    clear result_blk
]