
[REBOL] Re: Cross-platform (was NTFS Stream Fun)

From: joel:neely:fedex at: 8-Aug-2002 11:33

Hi, Ashley,

My purpose was two-pronged, so let me separate the pointy parts... ;-)

[atruter--hih--com--au] wrote:
> While I agree with you at the academic / theoretical level, this
> just doesn't make commercial sense in many cases. Make no mistake,
> I am a *big* fan of cross-platform tools, but remember that these
> tools are designed to make things easier for the *developer* not
> the end user. One only needs look at the reception of Java by
> developers vs users to see that cross-platform is almost
> irrelevant to the end-user (Not trying to start a Java debate here
> folks ;) ).
I understand your point, but disagree with the conclusion in the following way: there are differences in technical issues which only developers may "understand", but which ultimately surface as visible characteristics to the user.

For example, I recall a situation from my past in which an application had been in use on a small, increasingly antiquated box until such time that the increasing workload (and decreasing reliability) forced the team to move the application to a new platform. As usual, the term "forced" is appropriate, because the conversion wasn't done until the situation had become a near-emergency. The application had some characteristics/technology which were specific to the old platform, and therefore extra time was required to convert those aspects to the new environment. The user wailed and moaned mightily at the fact that developer time and effort were going into a system upgrade that offered no new, desired features. (I started to say "squealed like a scalded pig", but that didn't sound sufficiently respectful! ;-) The developers felt it was important to minimize the number of "moving parts" and get the previous functionality up and stable on the new platform before making any changes to provide different functions than the previous version.

My point is that we live in an increasingly networked computing world, where scaling, access to resources, improving performance, deploying with "thin clients", etc. are all significantly degraded when we allow proprietary features to get in the way of ease of mobility of our code. I certainly understand that (currently!) the majority of end-user desk boxen are running some form of Windows, but even the option of moving cycles back and forth between clients and servers is often a useful tool in tuning system behavior, scaling (or degrading!) gracefully, etc.

It is also the case that economic conditions, the commoditization of computer hardware, and the rate of progress in computing technology (Moore's Law is alive and well) all give weight to the idea that I should be able to run today's code on tomorrow's box with as few assumptions and constraints as possible.

Ironically, one can argue that the very success of Microsoft was due to its determination to provide a uniform, consistent layer of functionality that was independent of the underlying hardware. This is consistent with the overall pattern of the development of the computing field, where a melange of ad hoc hardware, O/Ss, assembly languages, and custom variations of "standard" languages has slowly given rise to a layered model in which variations at one level (which may be perfectly legitimate *AT*THAT*LEVEL* for a variety of reasons, including performance, cost, etc.) are hidden beneath an abstracted interface that lets the next layer up deal with only the concerns that are relevant at that next layer. Because Microsoft made PC hardware a commodity and attempted to establish Visual CP/M (called "Windows" in their literature ;-) as the universal platform, the user could stop worrying about the insides of his/her box (assuming there were enough memory, the hard drive were big enough, etc...)

Along comes the 'Net. Now the next stage of evolution is to make the details of the client-vs-server distinction, the operating system, etc. irrelevant in the face of open protocols (TCP/IP, FTP, HTTP, etc.) This is, of course, resisted mightily by monopolists who are loath to cede their place of privilege which they hope would allow them to be the gatekeepers for all of computing.
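As a concrete illustration of that layering, here is a minimal REBOL sketch (the file name and URLs below are placeholders, not anything from the original exchange): the same READ word works on a local file or a network URL, with the scheme selecting the transport, so code written at this level moves between platforms untouched.

    ; One word, several transports: the scheme of the value
    ; (file!, http, ftp) picks the lower layer, not the code.
    local-data: read %notes.txt                        ; local file
    web-data:   read http://www.example.com/notes.txt ; HTTP
    ftp-data:   read ftp://ftp.example.com/notes.txt  ; FTP

    ; Something like an NTFS stream path, by contrast, would
    ; pin the same logic to a single O/S and filesystem.
    print length? web-data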
I suggest that it is in our users' best interests for us to look out for those interests even in areas where the user has no direct expertise. That's how we add value, after all. I don't know enough biochemistry to know precisely how the medicine operates in my body, but I *DO* expect my doctor to know enough about it to prescribe the best medication for me, to avoid simultaneously prescribing drugs with harmful interactions, etc.

My other point is that using loss of platform-neutrality as an argument against open source is invalid, IMHO, for two reasons:

1) The opportunities for breaking neutrality are already all around us, and it is our professionalism/attitude/commitment that keeps us from breaking neutrality, not the properties of our tools.

2) Availability of open source tends, IMHO, to make a tool available on even *MORE* platforms, thereby further promoting widespread use and enhancing the recognition that neutrality is in the best interest of users and geeks alike.
> > The best way to avoid becoming Nazgul is to decline Sauron's
> > offer of a ring, no matter how shiny (and powerful) it may be.
>
> Yes, but Frodo had to "accept" it (and its corrupting effects) for
> the short term so as it could be destroyed in the long run. ;)
Yes, but it almost killed him in the process (and *DID* kill others). The odds aren't good. ;-)

And, of course, there's the fact that they really didn't have the option to "just say no" as do we.

-jn-