Mailing List Archive: 49091 messages

[REBOL] FTP large files (Answering my own question)

From: doug::vos::eds::com at: 5-Apr-2002 15:28

About FTP of large files. Here is a quote from the REBOL documentation... (sorry, I did not read it first...)

Transferring Large Files

Transferring large files requires special considerations. You may want to transfer the file in chunks, to reduce the memory required by your computer and to provide user feedback while the transfer is happening. Here is an example that downloads a very large binary file in chunks:

    inp: open/binary/direct ftp://ftp.site.com/big-file.bmp  ; example source URL
    out: open/binary/new/direct %big-file.bmp
    buf-size: 200000
    buffer: make binary! buf-size + 2
    total: 0  ; must be initialized before the loop
    while [not zero? size: read-io inp buffer buf-size][
        write-io out buffer size
        total: total + size
        print ["transferred:" total]
    ]

Be sure to use the /direct refinement; otherwise, the entire file will be buffered internally by REBOL. The read-io and write-io functions allow reuse of the buffer memory that has already been allocated. Other functions, such as copy, would allocate additional memory.

If the transfer fails, you can restart FTP from where it left off. To do so, examine the output file or the total variable to determine where to restart the transfer. Open the file again with a custom refinement that specifies restart and the location from which to start the read. Here is an example of the open function to use when the total variable indicates the length already read:

    inp: open/binary/direct/custom ftp://ftp.site.com/big-file.bmp reduce ['restart total]

Note that restart only works for binary transfers. It cannot be used with text transfers, because the line-terminator conversion that takes place would cause incorrect offsets.
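The restart idea translates directly to other languages. Below is a minimal Python sketch of the same resume-from-offset logic, using only local files so it runs without an FTP server; the helper name `resume_copy` and its buffer size are illustrative, not from the REBOL documentation:

```python
import os

def resume_copy(src_path, dst_path, buf_size=200_000):
    """Copy src to dst in chunks, resuming from however many bytes
    dst already holds (the analogue of REBOL's 'restart offset)."""
    total = os.path.getsize(dst_path) if os.path.exists(dst_path) else 0
    with open(src_path, "rb") as inp, open(dst_path, "ab") as out:
        inp.seek(total)  # restart from where the last run stopped
        while True:
            buffer = inp.read(buf_size)
            if not buffer:
                break
            out.write(buffer)
            total += len(buffer)
            print("transferred:", total)
    return total
```

Against a real FTP server, Python's standard library exposes the same offset directly: `ftplib.FTP.retrbinary` accepts a `rest` argument, so passing `rest=total` resumes a binary download at the given byte position, just like REBOL's `['restart total]` block.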