
Mark Leighton Fisher
http://mark-fisher.home.mindspring.com/

I am a Systems Engineer at Regenstrief Institute [regenstrief.org]. I also own Fisher's Creek Consulting [comcast.net].

Friday November 30, 2007, 01:01 PM

Tim Berners-Lee, the World Wide Web, and the Dexter Model


Tim Berners-Lee's abandonment of the Dexter Model for hypertext, a model in which all links must be resolvable at all times, was (IMHO) the single biggest factor in creating a successful World Wide Web.

Before the Web, hypertext systems were assumed to have all links resolvable at all times. This was not a robust design. Now, you would think it would be more robust than the Web, but it fails even for single-file hypertext systems. Early in my career, I realized that computer systems were not 100% reliable, so if you wanted to create software that failed safe (or at least failed soft), you had to account for errors at every step of the way. A single-file hypertext system can still fail if access to that single file is disturbed. And across the Internet, where all of the computers have not been up at the same time since the late 1970s (and possibly not even then), you cannot build a Dexter Model hypertext system, because not all of your links can be resolved all of the time.
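As a small illustration of accounting for errors at every step, here is a minimal Perl 5 sketch (the filename is hypothetical, and this is not code from any particular hypertext system) that treats even the single file of a single-file hypertext system as something that can go missing:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # 'manual.html' is a hypothetical single-file hypertext document.
    my $hypertext_file = 'manual.html';

    if (open my $fh, '<', $hypertext_file) {
        while (my $line = <$fh>) {
            # ... resolve and render the links found in $line here ...
        }
        close $fh;
    }
    else {
        # Fail soft: report the problem and carry on (or shut down
        # cleanly) instead of crashing with no explanation.
        warn "Cannot read '$hypertext_file': $!\n";
    }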

Microsoft's Help system has become much more usable since they moved to a Web-based (i.e., HTML) system. At the risk of being redundant: even if you have a lint program to verify all hypertext links and destinations, file access errors will still derail your hypertext system when you use an all-resolvable-all-the-time design (and I don't know whether Microsoft had such a lint tool).
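For the curious, a link-lint tool of the kind just described can be sketched in a few lines of Perl 5 with the CPAN module HTML::LinkExtor. This is only an illustrative sketch for local files, not Microsoft's tooling, and it skips remote links entirely:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTML::LinkExtor;
    use File::Spec;
    use File::Basename qw(dirname);

    # Extract href/src targets from one HTML file, then report any local
    # targets that do not exist on disk.
    my $file = shift @ARGV or die "usage: $0 file.html\n";
    my @targets;

    my $extor = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        push @targets, grep { defined } @attr{qw(href src)};
    });
    $extor->parse_file($file);

    my $dir = dirname($file);
    for my $target (@targets) {
        next if $target =~ m{^[a-z][a-z0-9+.-]*:}i;  # skip http:, mailto:, etc.
        (my $path = $target) =~ s/[#?].*//;          # drop fragment/query
        next if $path eq '';
        my $full = File::Spec->catfile($dir, $path);
        print "MISSING: $target (in $file)\n" unless -e $full;
    }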

It boils down to handling failures with at least a small amount of grace. Unix/Linux systems handle errors much better than Microsoft Windows 1.0-3.x did, because their processes cope better with out-of-bounds memory errors (Windows NT and its descendants fall in between Unix/Linux and 16-bit Windows). I once wrote a Perl 4-based server that would run for months at a time because it could either recover gracefully from an error or stop gracefully upon one. The Web runs as well as it does because its software handles link errors with that same small amount of grace, rather than throwing up its hands or dying horribly. Thank Tim Berners-Lee and his fellow designers for the reliability of the Web we have today.
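The "recover gracefully or stop gracefully" pattern looks roughly like this in modern Perl 5 (the original server was Perl 4 and its details are not given here, so handle_request() is a hypothetical stand-in):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $running = 1;
    $SIG{TERM} = sub { $running = 0 };   # stop gracefully when asked to

    while ($running) {
        my $ok = eval {
            handle_request();            # hypothetical per-request work
            1;
        };
        unless ($ok) {
            my $err = $@ || 'unknown error';
            warn "request failed: $err";
            # A recoverable error: log it and keep serving.
            # An unrecoverable one: stop the loop and shut down cleanly.
            last if $err =~ /FATAL/;
        }
    }

    warn "server shutting down cleanly\n";

    sub handle_request {
        # Placeholder: real code would read a request and send a response.
        sleep 1;
    }

Wrapping each unit of work in eval means a single bad request can be logged and skipped instead of taking the whole server down, which is the same spirit in which the Web treats a broken link.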
