For an Optimum Binary Read?
[1/3] from: tim:johnsons-web at: 13-Sep-2000 13:37
Hello:
I'd like to pose a question for comment:
I'd like to read a binary index file.
This file will contain 32-bit (4-byte) offsets into a database.
This file could have over 100k offsets in it, so could be over 400k in size.
I think it would be less than optimal to read one integer
at a time, but reading the entire 400k into memory could
also be a drain on system resources.
If my presumption is correct, then the alternative would be
to use a buffer and read part of the file at a time.
As in:
bin-buf: make binary! 256
What, then, would be an optimal read size?
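A buffered read along those lines might look like this in REBOL 2 (a minimal sketch; the filename %index.dat, the 4096-byte buffer, and big-endian offset storage are all assumptions, not givens from the question):

    ; Open the index as a direct binary port so reads come straight from disk
    port: open/binary/direct %index.dat
    while [buf: copy/part port 4096] [   ; copy/part returns none at end of file
        while [not tail? buf] [
            offset: to integer! copy/part buf 4   ; 4 bytes -> 32-bit integer
            ; ... use offset here ...
            buf: skip buf 4
        ]
    ]
    close port

Picking the buffer as a multiple of 4 keeps each chunk aligned on whole offsets, so no offset straddles two reads.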
Comments and opinions would be appreciated.
Thanks
Tim
[2/3] from: al:bri:xtra at: 14-Sep-2000 21:13
Tim wrote:
> This file could have over 100k offsets in it, so could be over 400k in size.
> ...
> What then could be an optimum read size?
Read the entire file into memory, and let the OS virtual memory system
determine what's best. It can do a far better job than a person can.
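Following that approach, the whole-file read is only a couple of lines in REBOL 2 (a sketch; %index.dat, big-endian storage, and 1-based indexing are assumptions):

    index: read/binary %index.dat    ; entire ~400k index held in memory

    ; Fetch the nth 32-bit offset (1-based), assuming big-endian storage
    nth-offset: func [n [integer!]] [
        to integer! copy/part skip index (n - 1) * 4 4
    ]

    first-offset: nth-offset 1

At roughly 400k the index is small by virtual-memory standards, so the OS can keep it resident or page it as it sees fit.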
Andrew Martin
ICQ: 26227169
http://members.ncbi.com/AndrewMartin/
http://members.xoom.com/AndrewMartin/
[3/3] from: tim::johnsons-web::com at: 14-Sep-2000 8:47
Thank you Andrew:
[Al--Bri--xtra--co--nz] wrote:
> Tim wrote:
> > This file could have over 100k offsets in it, so could be over 400k in
<<quoted lines omitted: 3>>
> Read the entire file into memory, and let the OS Virtual Memory determine
> what's best. It can do a far better job than a person can do.
Since this will run mostly on Linux, I'm inclined to concur.
:)Tim
Notes
- Quoted lines have been omitted from some messages.
  View the message alone to see the lines that have been omitted.