
All the Perl that's Practical to Extract and Report

Journal of scrottie (4167)

Friday June 10, 2005
06:51 AM

Why we need RISC - mourning the death of the workstation

[ #25134 ]
Assuming that x86 CISC is now faster and cheaper than RISC, and that consumer hardware has completely outstripped high-performance specialized computing hardware, we should be sad. It means no one is doing research on entirely new sorts of things that people could someday have at home. It means the child ate his mother. Oh, but what could the home computer, fantastic as it is, possibly want from high-performance computing? Why not just kill it off? Because in the past, the workstation makers gave us: GL 3D graphics acceleration; floating-point processors; memory management units; and they popularized the GUI, Ethernet, multiheaded systems, remote access, TCP/IP, and, of course, Unix.

Without workstations, we probably never would have thought to add any of this to our computers. There wouldn't have been a demonstrated use for it, as mass-produced consumer hardware doesn't get new features unless there's a demonstrated use and demand for them. The only thing the home computer has done is make innovation affordable and faster: more RAM, more disk, faster 3D, faster Ethernet.

This brings me to Apple. Apple, being relatively high-end, has thus far pushed innovation. They introduced FireWire, demonstrating the utility of high-speed serial networking and its application to video editing. Before then, PC video capture devices were unusable, and no one was talking about streamlining the interface between camcorders and computers. Apple also had external drives (SCSI) ages before PCs did, and they brought dual-CPU systems into the home. Without their external USB hard drives, BIOS makers wouldn't have seen the utility of booting from USB devices, and the whole flash-keychain-dongle revolution never would have happened.

You might argue that Apple will continue to innovate -- and indeed they might. Switching to x86 alone doesn't kill innovation (despite historical evidence). However, the reasons that prompt a company to switch to x86 often do kill innovation. Such companies want to lower costs and go mainstream -- Apple, in this case, is trying to take on Microsoft's market more directly (Jobs said as part of his speech that he's ready to go after Microsoft). That means getting rid of a lot of the overhead associated with doing things differently (heh, "Think Different" my ass). It means streamlining production and not offering things that can't be demonstrated to pay for themselves. It means cutting R&D off at the knees -- just like HP recently did. And Apple has been having such great luck recently with mere development (programming), why bother with research?

Jobs said that the spirit of Apple lives in the operating system -- he wasn't saying that right before he ditched Mac OS 9. It sounds an awful lot like Apple doesn't want to be Apple any more. And with things like the iPod, they don't need to be the Apple of yore. The problem is, no one wants to be who they were. Sun doesn't want to make RISC workstations; they want to sell x86 boxen and peddle Java-related goods. IBM doesn't want to make computers so much as they want to make guts for game systems (IBM, go ask Motorola how that went for them with Atari). Everyone else -- SGI, HP (including Compaq and Digital), etc. -- just wants to be a reseller for Intel now.

No high-end workstations means power users aren't dabbling with companies' ideas of the future and thereby supporting research. And of course we can't afford the quarter-million-dollar RISC starter systems that IBM, SGI, and Sun sell (okay, Sun is still a little better).

It's impossible to imagine what the future would have brought us if we hadn't killed the workstation. But now what do we get? Will Intel give us innovation? They're offering us DRM -- ooh, yippee. What about Gateway? Gateway Country and Windows ME -- super, thanks. We've fucked ourselves. We have nowhere to turn. And the few players left -- the ones everyone delegated everything to -- are becoming increasingly hostile to hobbyists who want to innovate for themselves. Cray is innovating more than the lot of them, and given the way we treat Cray, do we deserve it? Will we even manage to keep them around? Of course, all of this is the result of turning the computing experience into a mechanically produced, canned product for old people and record-industry execs, cynically writing off the future as uninteresting or not immediately marketable. You think HP's Carly Fiorina was bad? Well, Carly's views are about the same as the rest of the industry execs': short-sell the future.


P.S.: I'm leaving comments open in search of sympathy and insight, even though I've never in my life seen a worthwhile comment anywhere. Since I tend to berate commenters, I suggest you not post unless you're ready to be berated.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • Is a hardware vendor -- they don't care what OS you run on the thing and happily release technical information (at one point, Sun was interesting because it was selling Unix cheaply for the day)

  • Has no conflicts of interest with its customers -- they've entered into no deals where they've promised to restrict what their customers can do, be it listen to music or attempt to use an ISP other than one of the three whose icons are on the desktop

  • Supports the hardware and operating system for any use

  • Good, relevant historical review.


    Isn't graphics technology now being adequately driven by the gamers? If the Unix workstation market migrates completely to commodity hardware with a generic high-speed bus, doesn't that make commodity adoption of new niche developments easier?
    # I had a sig when sigs were cool
    use Sig;
    • Wew, wew, wew! We have a winner! The "completely missed the point" award goes to n1vux. No, you dimwit. I was drawing a contrast between "brand new things" and "just speeding up what already exists". Graphics card makers, driven by gamers, aren't trying anything new. The basic architecture for 3D acceleration was laid down by SGI in the mid-1990s. Adding fans doesn't count as innovation. My question was, who is making entirely new computer architecture that might someday go mainstream after being an expensi
      • Who's calling whom names, youngster?

        Why do you assume innovation is only possible on novel engineering workstations? Because that's all you've ever seen and you haven't read the history of this and other industries? Or are you just stupid?

        Do you assume that progress can only come from cycling back around the "it's a new architecture" merry-go-round? CISC-to-RISC and "let's create another layer of cache" recapitulate the phylogeny as much as break new ground, but the new engineers are quite impressed with their inventiveness.
        # I had a sig when sigs were cool
        use Sig;
  • Systems Software Research is Irrelevant, according to Rob Pike, due to Unix. A similar perspective in a different area.

    Guess the future is bland.

    Also, strange how most everyone else seems to have more luck with their commenters.