Beginner Help
[1/5] from: sieglec::hotmail::com at: 9-Jan-2004 12:04
I would appreciate your assistance in creating a script. I would like to
download a list of binary files from a site and save the list to my hard
drive.
The files are binary:
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=1&scale=40
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=2&scale=40
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=3&scale=40
When these URLs are viewed in a browser, the files can be saved as PNGs.
What I would like to do is loop up to 50 pages, saving each URL as a PNG to
my hard drive under a different name based on the page number and the
current date, until the error "Error: Unspecified error" is received:
WSJ_20040109_1.png
WSJ_20040109_2.png
WSJ_20040109_3.png
I know I would have to use a loop but am having a hard time getting it to
work.
Thank you!
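[Editor's note: a minimal sketch of the loop being asked for, in REBOL. The
base-url value is a shortened placeholder for the long NDImageServer URL
above, and the date is hardcoded here as in the example filenames; this is a
sketch, not a tested solution.]

    ; Sketch only -- base-url is a placeholder, not the real server URL.
    base-url: http://example.com/ndimageserverisapi.dll?file=XXXX&page=
    repeat page 50 [
        url: rejoin [base-url page "&scale=40"]
        file: to-file rejoin ["WSJ_20040109_" page ".png"]
        ; stop at the first page that errors out (e.g. "Unspecified error")
        if error? try [write/binary file read/binary url] [break]
    ]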
[2/5] from: sieglec:hotm:ail at: 9-Jan-2004 12:14
Sorry, the proper URL is:
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=1&scale=40
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=2&scale=40
http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=3&scale=40
[3/5] from: SunandaDH:aol at: 9-Jan-2004 12:21
Hi Chris,
> I would appreciate your assistance in creating a script. I would like to
> download a list of binary files from a site and save the list to my hard
> drive
Play around with something like this (and watch the line wraps on the URL
when you cut'n'paste):
file-list: [
    http://online.newspaperdirect.com/NDImageServer/ndimageserverisapi.dll?file=10152004010900000000001001&page=3&scale=40
    ; ...other urls here
]
for nn 1 length? file-list 1 [
    print ["reading file " nn]
    temp: read/binary file-list/:nn
    write/binary join %file- [nn ".png"] temp
]
print "Done!!"
And watch out for any copyright violations on those images too.
Sunanda
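[Editor's note: Sunanda's snippet names the files %file-1.png and so on. To
get the WSJ_20040109_1.png style names the original poster asked for, a date
stamp can be built from now/date; a sketch, assuming REBOL 2 date accessors:]

    d: now/date
    ; zero-pad month and day: form 100 + 9 gives "109"; next skips the "1"
    stamp: rejoin [d/year next form 100 + d/month next form 100 + d/day]
    page: 1
    file: to-file rejoin ["WSJ_" stamp "_" page ".png"]
    ; on 9-Jan-2004 this gives %WSJ_20040109_1.png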
[4/5] from: sieglec:hotmai:l at: 9-Jan-2004 13:36
Thanks Sunanda.
[5/5] from: antonr:iinet:au at: 11-Jan-2004 17:43
You can also generate your url:
repeat n 50 [
;?? n
url: join http://online.new....&page= [n "&scale=40"]
?? url
request-download/to url join %file- [n %.png] ; <- requires View
]
The download can also fail (e.g. due to a 60-second timeout), so if the
images are not too big, you can catch the error and try again:
if error? set/any 'err try [request-download ...] [
    ; error occurred!
    ; need to try again
]
Anton.
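[Editor's note: Anton's catch-and-retry idea can be wrapped in a small loop.
A sketch only, assuming REBOL 2; url and file here are hypothetical
placeholders, and the limit of 3 attempts is arbitrary:]

    url: http://example.com/image.png ; placeholder
    file: %WSJ_20040109_1.png
    attempts: 0
    done: false
    while [all [not done  attempts < 3]] [
        attempts: attempts + 1
        done: not error? try [write/binary file read/binary url]
    ]
    unless done [print "download failed after 3 attempts"]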