r3wp [groups: 83 posts: 189283]

World: r3wp

[Core] Discuss core issues

JaimeVargas
24-Aug-2005
[1777]
find-each: func [dataset [series!] value /local result word][
    result: copy []
    parse dataset [
        some [
            set word string! (if find word value [append result word])
            | skip
        ]
    ]
    result
]

>> find-each ["Jaime" 1 "Carl" 2 "Cyphre" 3 http://google.com "Ladislav"] "a"
== ["Jaime" "Carl" "Ladislav"]
Pekr
24-Aug-2005
[1778x4]
I wonder if that one will be faster than a loop?
That should be easy to test; will do so tomorrow ...
I remember someone used 'parse in the past as a trick to get a pointer 
to binary data in REBOL ...
it was something with images, IIRC ...
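For Pekr's speed question, a loop-based equivalent to compare against might look like this (a sketch; FIND-EACH-LOOP is a hypothetical name, and it assumes the same string-filtering behavior as Jaime's FIND-EACH above):

```rebol
find-each-loop: func [dataset [series!] value /local result] [
    result: copy []
    foreach item dataset [
        ; keep only string values that contain the search value
        if all [string? item find item value] [append result item]
    ]
    result
]

find-each-loop ["Jaime" 1 "Carl" 2 "Cyphre" 3 "Ladislav"] "a"
; == ["Jaime" "Carl" "Ladislav"]
```

Timing both versions over a large dataset would settle the question.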
Geomol
25-Aug-2005
[1782]
Anton, I got access to your include.r now. Interesting option to 
only include certain functions or words from a script!
Anton
25-Aug-2005
[1783x2]
oh.. ? I think it's essential.
.. to avoid inadvertent pollution of your target context (usually 
the global context). This should avoid many bugs. It always annoyed 
me, when coding in C, that when I included a library for a particular 
function, I had to check that library to see what else I was including. 
Then of course some libraries include things from other libraries...
Geomol
25-Aug-2005
[1785]
This reminds me of Tao Elate. In that OS, all library functions are 
small VP asm files on disk. So if you use e.g. printf, only that 
function and not the whole stdlib is loaded in memory. The same function 
is also shared among all running programs minimizing memory overhead. 
Genius, as I see it!


Something like that could be implemented in REBOL with the use of objects 
within objects (which are not duplicated in memory). It's the way e.g. the 
feel object is implemented in View. To be really efficient, only 
the functions (in the object) that are needed should be loaded into memory 
from disk.
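The sharing Geomol describes can be sketched roughly as follows; since REBOL assigns objects by reference, a nested object is not copied per container unless you copy it explicitly (the names here are hypothetical):

```rebol
; one feel-like object, created once
shared-feel: make object! [
    engage: func [face action event] [print action]
]

; two containers referencing the same nested object
face1: make object! [feel: shared-feel]
face2: make object! [feel: shared-feel]

same? face1/feel face2/feel
; == true  -- one copy in memory, shared by both
```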
Ladislav
25-Aug-2005
[1786]
note: that feature can be imitated using

    make object! [#include %somedefinition.r]


when using my INCLUDE. The only trouble is when the author of the 
script does some "nonstandard" things; then it may not be protective 
enough. That is the case with Anton's include too, where you have 
to rely on the discipline of the original author.
eFishAnt
27-Aug-2005
[1787x4]
when you need a reduce/deep and there isn't one, what do you use 
instead?
I want to reduce something inside a nested block:
    reduce ['blah [to-word "desired-literal-word"]]  ; sorta thing
reduce ['blah reduce [to-word "desired-literal-word" reduce [to-word 
"deep-literal-word"]]]  ; this works... just talking to myself... nevermind
hmmn, reduce/deep or reduce/nested would be more elegant nonetheless.
Volker
27-Aug-2005
[1791x2]
what are you doing?
is compose/deep/only an option? Also a reduce/deep would be short, 
if you need it.
eFishAnt
27-Aug-2005
[1793]
trying to reduce a set of nested blocks ... compose/deep/only would 
not reduce the inner blocks, but leave them as they are...
Volker
27-Aug-2005
[1794]
compose/deep [ (a) [ (b) ] ] ; would work
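With concrete values, Volker's compose/deep example would go something like this (a sketch):

```rebol
a: 1 + 2  ; 3
b: 3 + 4  ; 7
compose/deep [(a) [(b)]]
; == [3 [7]]  -- parens are evaluated at every depth
```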
eFishAnt
27-Aug-2005
[1795]
I guess reduce/deep would not be very hard to implement...was just 
surprised there isn't one already...;-)
Volker
27-Aug-2005
[1796]
Depends: if you code some template, then maybe compose/deep; if it's 
data, maybe better reduce/deep.
eFishAnt
27-Aug-2005
[1797]
compose makes strings...I am trying to get it down to literal words.
Volker
27-Aug-2005
[1798]
No, compose makes blocks. If you mean it flattens (splices) inner blocks, use /only.
eFishAnt
27-Aug-2005
[1799]
going from VID to literal-words...to do comms syncing...so say a: 
"cat" and b: "dog"  I want [cat [dog]] NOT ["cat"["dog"]]
Volker
27-Aug-2005
[1800]
compose [ blah [  (to-word "desired-literal-word") ] ]
eFishAnt
27-Aug-2005
[1801]
[blah [(to-word "desired-literal-word")]]  ; is what it returns, instead 
of [blah [desired-literal-word]]
Volker
27-Aug-2005
[1802x7]
sorry,
  compose/deep [ blah [  (to-word "desired-literal-word") ] ]
and to be defensive, compose/deep/only.
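The defensive /only matters when a paren evaluates to a block; without it the result is spliced into the surrounding block (a sketch):

```rebol
b: [1 2]
compose/deep [blah (b)]
; == [blah 1 2]    -- block result spliced in
compose/deep/only [blah (b)]
; == [blah [1 2]]  -- block result kept as a block
```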
A first version:

reduce-deep: func [blk] [
    blk: reduce blk
    forall blk [
        if block? blk/1 [
            blk/1: reduce-deep blk/1
        ]
    ]
    blk
]

probe reduce-deep [1 + 2 [3 + 4]]
probe reduce-deep [a: [1 + 2] a/1]  ; limitation, does not work
reduce-deep: func [blk] [
    forall blk [
        if block? blk/1 [
            blk/1: reduce-deep blk/1
        ]
    ]
    reduce blk
]
probe reduce-deep [1 + 2 [3 + 4]]
probe reduce-deep [a: [1 + 2] a/1]

[3 [7]]
[[3] 3]
Better. The perfect one is hard.
this breaks then, because inner blocks are evaluated first:
  a: 2 probe reduce-deep [ a: 1 [a]]
eFishAnt
27-Aug-2005
[1809]
that's interesting, I figured the order of evaluation might be why 
there isn't one... why I instinctively thought of /nested as a refinement... 
thanks for the insights.
Volker
27-Aug-2005
[1810x4]
The problem here is, we need do/next to know how long one expression 
is. But before do/next, we cannot reduce subblocks; that should 
be done only for one expression, and we do not know which blocks that is before 
do/next... Maybe it should really be built in?
But if you don't do such tricky assigning in reduce, this should 
work.
What did I write? Trying the explanation again... We need do/next to know 
which blocks are in the next expression, then expand only those. 
But by using do/next, we have already used the old, unreduced blocks...
So the workaround is to expand all blocks at once, either before 
reducing that level or after. That breaks the semantics a bit; it means: do not 
depend on evaluation order.
eFishAnt
28-Aug-2005
[1814]
thanks...sorry, got pulled off by my daughter to google when her 
power would be restored...your thoughts here are greatly appreciated, 
as always, Volker.
Maarten
28-Aug-2005
[1815]
I went down this road once. Using do/next terribly slows things down. 
I think that you need parse for this.
Brett
29-Aug-2005
[1816x3]
Just passing. Thought that the function would be interesting, came 
up with this draft.
reduce-deep: func [
	block [block!]
	/local queue result eval-state sub-result
][
	; Initialise
	queue: make list! 10
	result: reduce []

	; Loop until we exhaust all unfinished blocks and the current block.
	until [
		either empty? :block [
			; If finished this block, but more to process - add our
			; result to higher block result and set that block as current.
			if not empty? queue [
				set/any 'sub-result get/any 'result
				set [result block] queue/1
				insert/only tail result get/any 'sub-result
				queue: remove queue
			]
		][
			; Process current block item.
			either block? block/1 [
				; Save current level to process later,
				; set this new block as current.
				queue: head insert/only queue reduce [result next block]
				result: reduce []
				block: block/1
			][
				; Evaluate item.
				eval-state: do/next block
				insert/only tail result eval-state/1
				block: eval-state/2
			]
		]

		all [tail? block tail? queue]
	]

	; Return final result
	result
]
Not robustly tested.
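As a quick check, Brett's draft should reproduce the earlier test case (untested here, per his caveat; note that a block consumed inside a larger expression, such as one following a set-word, is taken by do/next before the block? test sees it, so such cases behave differently from Volker's version):

```rebol
probe reduce-deep [1 + 2 [3 + 4]]
; [3 [7]]
```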
JaimeVargas
30-Aug-2005
[1819]
How can I access a network drive?
Graham
30-Aug-2005
[1820]
I'm sure it's just  %uncname/
JaimeVargas
30-Aug-2005
[1821]
Humm. Not working here.
Graham
30-Aug-2005
[1822]
perhaps it's %/uncname
Volker
30-Aug-2005
[1823]
IIRC /myserver/volker/...
Anton
31-Aug-2005
[1824]
umm.. tried leading double slash ? eg. %//uncname/
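Collecting the suggestions above into one place (untested; uncname and share are placeholders for the actual server and share names):

```rebol
read %/uncname/share/    ; leading single slash, as Volker recalls
read %//uncname/share/   ; leading double slash, mirroring \\uncname\share
```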
Graham
31-Aug-2005
[1825x2]
My primary SCSI hard drive in my server killed Windows 2003 somehow. 
It fails the Seagate diagnostics, though SeaTools can't identify 
the drive. I put in another SCSI drive, but even though they all 
have different IDs, the new Windows 2003 server dies when I reattach 
the old drive (I want to recover files off it). It just reboots, 
even in safe mode. Anyone any ideas as to why it would do that?
Unfortunately BartPE doesn't have the driver for the SCSI controller, 
so I can't use that to examine the drive.