
Where Have All The Cycles Gone?

Mai writes "Computers are getting faster all the time, or so they tell us. But, in fact, the user experience of performance hasn't improved much over the past 15 years. This article takes a look at where all the precious processor time and memory are going."
This discussion has been archived. No new comments can be posted.

  • by fembots ( 753724 ) on Monday February 07, 2005 @05:13PM (#11600541) Homepage
    2% word processing
    3% gaming
    5% internet
    90% feet warming
    • by BeyondALL ( 248414 ) on Monday February 07, 2005 @05:17PM (#11600588) Homepage
      that's why I run Seti@Home in the winter :)
    • by Anonymous Coward on Monday February 07, 2005 @05:24PM (#11600698)
      My girlfriend has a cycle every month. It causes me problems, so I can imagine it must cause some problems for the computer as well.
  • by marika ( 572224 ) on Monday February 07, 2005 @05:14PM (#11600556)
    I used to be able to fit the System, the Finder, MacWrite, and three 8K docs on a floppy. Now I can barely fit a Word document.
    • by bongoras ( 632709 ) * on Monday February 07, 2005 @05:22PM (#11600657) Homepage

      But you can still fit an entire book on a floppy if you use LaTeX. The moral of the story: Don't want a slow, bloated system? Then tough it out and don't use one. But don't complain when you have to type:

      \begin{enumerate}
      \item Open a terminal window by right clicking on your desktop and selecting ``Open Terminal''
      \item In that window, become root by typing {\tt su}
      \item Now put a blank CD in your drive and burn the iso image to it by typing \\
      {\tt cdrecord dev=0,0,0 cdimageyouwanttoburn.iso}
      \end{enumerate}

      instead of clicking the bullet button or asking a paperclip to make a list. It's all a matter of what you want. There are plenty of lean, mean systems out there. Don't bitch about UI slowness unless you are willing to use a plain-text console with "screen", "mutt", and "elinks" as your main applications.

      • Comment removed (Score:4, Insightful)

        by account_deleted ( 4530225 ) on Monday February 07, 2005 @05:27PM (#11600729)
        Comment removed based on user account deletion
        • by ph43drus ( 12754 ) on Monday February 07, 2005 @09:13PM (#11602705)
          There is. I'm running Slackware 10.0 (can't wait to get my 10.1 disks! w00t!). I use vim+LaTeX for all my document prep needs on the command line (use an xterm if you so choose). As others have mentioned, LyX isn't bad.

          I'm a physics major at UW, so I do a decent bit of scientific work on my computer. I use GNUPlot, XFig and the Gimp to generate drawings for lab reports and whatnot.

          I'm typing this from Firefox running on the X.org 6.7.0 server + Window Maker 0.91. The key here is to use a lightweight window manager. Blackbox and fluxbox are other good choices (light, usable, not fugly (cough, fvwm, cough)). If you have to have that desktop environment, go with Xfce.

          The only gap that I occasionally feel in my user experience is a good spreadsheet. I haven't found one. KSpread, OpenOffice Calc and Gnumeric either are, or require the use of, the heavy GUI software we are trying to avoid (KDE and Gnome are not as big as XP, but far too big to run comfortably on my system). I've glanced at Siag, but haven't really tried it out (I don't know Scheme and don't have the time to figure it out right now--see physics undergraduate work).

          I use mutt or pine, depending on which email address I'm checking. Thunderbird looks promising for being light and good, if you want a GUI based email client.

          Recompile your kernel to match your hardware (trim the fat and optimize for your processors), and turn off any extra servers that you don't need (don't need telnetd, ftpd, &c. running? Turn off inetd--it's also more secure). Customize your boot sequence to start and load only what your system needs and what you actually use.

          I also boot to the command line and don't run xdm or the like. I do a lot of work from the command line, and X+light WM doesn't take long to start. It is, again, one less thing wasting clock cycles on my machine.

          For reference, I'm running my Slack 10 system on an Abit BP6 with two PIII 866MHz processors underclocked to 650MHz (long story... Has to do with the fact that the BP6 doesn't technically support the PIII). I've got 384MB of RAM and a GF4 video card. It is lightning fast. The only exception to this is when I'm running X with the closed nVidia drivers (damn thing has a 3MB kernel module... grrr...), but that only adds a hang of a couple seconds when switching between X and the consoles, and that's it. If I'm not playing Quake or dealing with 3D visualization stuff, I can use the OSS driver (2D accel only), and get rid of even that performance problem.

          So, yes, the middle ground is there, and it rocks. My computing experience is awesome; my slightly dated hardware is rock solid and perfectly responsive. Take a good, customizable Linux distribution, run lightweight software, turn off stuff in the background, and run a lean, mean, customized kernel, and you'll reclaim those lost cycles as interface responsiveness. I suggest Slackware [slackware.com] for this. FreeBSD, Debian, and any other distro aimed at power users will be good for setting up a configuration like this.

          Mandrake, RHAT (RHEL & Fedora), SuSE and the other user-friendly distros are ill-suited to this, IMO. Not that you can't, but my experience with these distros and their high-level admin tools is that if you try to do something too different from the default, it gets extra hard. So Slackware and the like just end up being simpler, and now you know what Slack users mean when they say "it's simple." So stop giving us funny looks when we say it.

          Jeff
      • by Trogre ( 513942 ) * on Monday February 07, 2005 @05:58PM (#11601101) Homepage
        Or just use a WYSIWYG LaTeX tool like this one [lyx.org] to do all that nasty coding for you :)

    • by EnronHaliburton2004 ( 815366 ) on Monday February 07, 2005 @05:26PM (#11600717) Homepage Journal
      Now I can barely fit a Word document.

      And how much of that bloat in Word is useful information?

      If I open Word, type the letter 'a', and save the document, it's a 20K file.

      If you type 'a' 2000 times, it's still only a 22K file.

      What the heck is in that other 99.9% of the document?
      • by cosmo7 ( 325616 ) on Monday February 07, 2005 @05:38PM (#11600862) Homepage
        Here's that MS Word native format:

        64 bytes: Cryptic Masonic signature
        64 bytes: Reserved for Carnivore
        8KB: Macro playground
        8KB: Random extracts from King James Bible
        64 bytes: Run-length encoded document contents
        8KB: Uncompressed copy of above for compatibility
      • by Smidge204 ( 605297 ) on Monday February 07, 2005 @05:42PM (#11600894) Journal
        Well, you have to be careful... some file systems allocate space in fixed increments. For example, on my NTFS-formatted system, a plain text document with one "a" in it is officially 4KB, even though there is only one byte of data in it.
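
        To make the rounding concrete, here's a back-of-the-envelope sketch in C. The 4KB allocation unit is just a common NTFS default; real volumes vary, and NTFS can store very small files inside the MFT without using a cluster at all, so treat this as the idea rather than exact NTFS behavior:

        #include <stdio.h>

        /* Round a logical file size up to the allocation unit ("cluster").
           4096 bytes is a common NTFS default; the real value varies by volume. */
        static unsigned long size_on_disk(unsigned long size, unsigned long cluster)
        {
            if (size == 0)
                return 0;
            return ((size + cluster - 1) / cluster) * cluster;
        }

        int main(void)
        {
            printf("1 byte -> %lu bytes on disk\n", size_on_disk(1, 4096));
            printf("4097 bytes -> %lu bytes on disk\n", size_on_disk(4097, 4096));
            return 0;
        }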

        This is not an excuse for a BLANK MS Word document being 19,456 bytes, of course. But there is "useful" data in there...

        I'm running Win2K, and if I right-click on the file and select "Properties", there is a Summary tab that displays all the info stored in that 19K. (You might have to click "Advanced".)

        The data includes:
        -Title
        -Subject
        -Category
        -Keywords
        -Template name
        -Page Count
        -Word Count
        -Character Count
        -Line Count
        -Paragraph Count
        -Scale (No idea what this means)
        -"Links Dirty?" (No idea what this is... maybe it's true if there's porn links in it?)
        -Comments
        -Author (From computer info)
        -Last Saved By... (From computer info)
        -Revision Number (Number of saves?)
        -Application
        -Company Name (From registration info)
        -Creation Date (Separate from file system creation date)
        -Last Saved Date (Separate from file system modified date)
        -Edit time

        Now is this ACTUALLY useful? I dunno. It might be in some situations. There should be an option for not saving this metadata though, for security if not for file size.
        =Smidge=
      • Back in 2000 or so, I was trying to figure out why Word docs were so bloated, so I looked at them in a hex editor. I noticed a ton of NULLs in the document. So I wrote a simple C program to count the NULLs.

        Believe it or not, something like 60% of the document was NULLs.

        So it's not really bloated, it's just full of nothing :)
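
        The original program isn't shown, but a NULL-counter of the kind described only takes a couple dozen lines of C; a guess at its shape:

        #include <stdio.h>

        /* Count NUL (0x00) bytes in a file. */
        int main(int argc, char **argv)
        {
            FILE *f;
            long nulls = 0, total = 0;
            int c;

            if (argc != 2) {
                fprintf(stderr, "usage: %s file.doc\n", argv[0]);
                return 1;
            }
            if ((f = fopen(argv[1], "rb")) == NULL) {
                perror("fopen");
                return 1;
            }
            while ((c = fgetc(f)) != EOF) {
                total++;
                if (c == 0)
                    nulls++;
            }
            fclose(f);
            printf("%ld of %ld bytes are NULs\n", nulls, total);
            return 0;
        }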
  • by fembots ( 753724 ) on Monday February 07, 2005 @05:15PM (#11600567) Homepage
    Launch a few applications simultaneously and time their start-ups. Try it again in five years to see whether the time has improved.

    I think it'll be the same, given the same machine.
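
    If you want to run the experiment, a crude launch-to-exit timer along these lines gives a rough lower bound (POSIX; it can't see when a GUI app actually becomes usable, only when the process exits):

    #include <stdio.h>
    #include <sys/time.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* Time a command from launch to exit, like a bare-bones time(1). */
    int main(int argc, char **argv)
    {
        struct timeval t0, t1;
        pid_t pid;

        if (argc < 2) {
            fprintf(stderr, "usage: %s command [args...]\n", argv[0]);
            return 1;
        }
        gettimeofday(&t0, NULL);
        if ((pid = fork()) == 0) {
            execvp(argv[1], &argv[1]);
            _exit(127);                    /* exec failed */
        }
        waitpid(pid, NULL, 0);
        gettimeofday(&t1, NULL);
        printf("%.3f seconds\n",
               (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1e6);
        return 0;
    }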
  • by R2.0 ( 532027 ) on Monday February 07, 2005 @05:15PM (#11600568)
    spyware, trojans, p2p apps.
  • by Anonymous Coward on Monday February 07, 2005 @05:15PM (#11600570)

    Mr. Seebach points out that "computers are, in fact, doing more than they used to. A lot of the things computers do are fairly subtle, happening beneath the radar of a user's perception. Many functions are automatic and, as discussed in last month's column, you could probably do without some of them."

    This recalls an analogy drawn by a recent Economist article [economist.com]. Unlike most automobile analogies popular among Slashbots, this one is actually rather appropriate: "By the 1930s, ... the car had become more user-friendly and ready for the mass market. ... [T]he makers' increasing skill at hiding the technology from drivers ... meant that cars got hugely more complex on the inside, because most of the tasks that had previously been carried out by drivers now had to be done automatically. This presented drivers with a radically simplified surface, or 'interface' in today's jargon."

    Given this lesson drawn from history, I disagree with Seebach's conclusion that "the worst is probably over" in terms of code bloat and complexity. Computers still have a long way to go before they can approach the ease of use and stability we demand of every other consumer appliance in our lives.

    The aforementioned article requires a paid subscription to view, so in the interests of convenience, I'll reproduce it here.

    --

    SURVEY: INFORMATION TECHNOLOGY

    Now you see it, now you don't

    Oct 28th 2004
    From The Economist print edition


    To be truly successful, a complex technology needs to "disappear"

    THERE has never been anything quite like information technology before, but there have certainly been other complex technologies that needed simplifying. Joe Corn, a history professor at Stanford University, believes that the first example of a complex consumer technology was clocks, which arrived in the 1820s. Clocks were sold with user manuals, which featured entries such as "How to erect and regulate your device". When sewing machines appeared in the 1840s, they came with 40-page manuals full of detailed instructions. Discouragingly, it took two generations until a trade publication was able to declare in the 1880s that "every woman now knows how to use one."

    At about the same time, the increase in technological complexity gathered pace. With electricity came new appliances, such as the phonograph, invented in 1877 by Thomas Alva Edison. According to Mr Norman, the computer-design guru, despite Mr Edison's genius for engineering he was a marketing moron, and his first phonograph was all but unusable (in fact, initially he had no particular uses in mind for it). For decades, Mr Edison fiddled with his technology, always going for the most impressive engineering solution. For instance, he chose cylinders over discs as the recording medium. It took a generation and the entry of a new rival, Emile Berliner, to prepare the phonograph for the mass market by making it easier to use (introducing discs instead of cylinders) and giving it a purpose (playing music). Mr Edison's companies foundered whereas Mr Berliner's thrived, and phonographs became ubiquitous, first as "gramophones" or "Victrolas", the name of Mr Berliner's model, and ultimately as "record players".

    Another complex technology, with an even bigger impact, was the car. The first cars, in the early 1900s, were "mostly a burden and a challenge", says Mr Corn. Driving one required skill in lubricating various moving parts, sending oil manually to the transmission, adjusting the spark plug, setting the choke, opening the throttle, wielding the crank and knowing what to do when the car broke down, which it invariably did. People at the time hired chauffeurs, says Mr Corn, mostly because they needed to have a mechanic at hand to fix the car, just as firms today need IT staff and

  • by winkydink ( 650484 ) * <sv.dude@gmail.com> on Monday February 07, 2005 @05:16PM (#11600577) Homepage Journal
    'memory is cheap'
    'disks are fast'
    'processors are fast'

    nobody cares about optimizing code anymore.
    • by cortana ( 588495 ) <sam@robots[ ]g.uk ['.or' in gap]> on Monday February 07, 2005 @05:21PM (#11600641) Homepage
      That's just what the assembly programmers said when researchers were moving to C. ;)
    • by dattaway ( 3088 ) on Monday February 07, 2005 @05:22PM (#11600665) Homepage Journal
      Back in the old days, popular applications were stand-alone, written in assembly, and made to fit within a single code segment. Since resources were scarce, much care was taken to get the most out of so little. Software back then was simple, like a motorcycle: the basics bolted to a simple frame, and off it went. Today we have software written with stock libraries, made to work with all kinds of resources and standards, and required to work with large filesystems and memory maps. Applications back then fly today, but seem like a small insect when it comes to functionality.
      • Applications back then fly today, but seem like a small insect when it comes to functionality.

        Well, a 10-year-old NeXTSTEP computer can do a lot of stuff that I still can't do with Gnome and KDE, and yet it is still faster at some things than today's computers. Same with Inkscape: I tried to play around with some of the stuff I did years ago in CorelDraw on a P90 with 24MB RAM, and Inkscape turned out to have huge problems rendering it on a 1GHz Athlon with 768MB RAM; it was almost unusable. It's true.
    • by Rorschach1 ( 174480 ) on Monday February 07, 2005 @05:23PM (#11600685) Homepage
      Nobody except embedded programmers. My biggest project of late runs on an 8-bit, 8 MHz CPU with about 7k of Flash and 192 BYTES of RAM. Not megs, not kilobytes, but bytes. That's equivalent to less than three lines worth of text. And the code's written in C, rather than assembly, so while it's easier to maintain, it takes more effort to make sure it stays efficient.

      I think all programming students should have to code for a system like this. It gives you a MUCH greater appreciation for what the compiler is doing for you, and what the consequences of simple changes can be.
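
      For flavor, this is the kind of trick 192 bytes forces on you (a sketch with invented flag names, not from the actual project): pack eight booleans into one byte instead of spending a byte, or an int, on each.

      #include <stdint.h>

      /* With 192 bytes of RAM you don't spend a byte (or an int) per boolean;
         pack status flags into a single byte instead. Flag names are made up. */
      #define FLAG_MOTOR_ON (1u << 0)
      #define FLAG_OVERTEMP (1u << 1)
      #define FLAG_LOW_BATT (1u << 2)

      static volatile uint8_t flags;         /* eight booleans, one byte */

      void motor_start(void)   { flags |= FLAG_MOTOR_ON; }
      void motor_stop(void)    { flags &= (uint8_t)~FLAG_MOTOR_ON; }
      int  motor_running(void) { return (flags & FLAG_MOTOR_ON) != 0; }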
      • by KiltedKnight ( 171132 ) on Monday February 07, 2005 @05:49PM (#11600990) Homepage Journal
        I think all programming students should have to code for a system like this. It gives you a MUCH greater appreciation for what the compiler is doing for you, and what the consequences of simple changes can be.

        I agree completely. I've done some programming for OS-9, and when we were creating some software libraries, we had to worry about things like program footprint size and memory allocation/deallocation. We were using a cross-compiler and doing development in C and C++. Something as simple as the order in which you declare the variables could make a noticeable difference in program size. Memory allocation and deallocation had to be done by the top level of the program. The support libraries had to be written to accept a memory block to use, along with its size. The last thing we wanted was to use up the 4MB of RAM (which had to hold the OS, plus any programs you were running) by fragmenting large chunks of it into uselessness with repeated malloc() and free() calls. We couldn't count on whatever garbage collection scheme existed to operate properly... assuming there even was one. (This was 1997.)
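
        Roughly the shape such a library interface takes; this is a sketch with invented names, since the original code isn't shown. The caller hands the library one block up front, and the library sub-allocates from it with a simple bump pointer:

        #include <stddef.h>

        /* Library-side pool, carved out of a block the *caller* owns. */
        typedef struct {
            unsigned char *base;
            size_t         size;
            size_t         used;
        } mempool_t;

        void pool_init(mempool_t *p, void *block, size_t size)
        {
            p->base = block;
            p->size = size;
            p->used = 0;
        }

        /* Bump-pointer allocation: no free list, no fragmentation. */
        void *pool_alloc(mempool_t *p, size_t n)
        {
            void *out;
            n = (n + 7) & ~(size_t)7;          /* keep 8-byte alignment */
            if (p->used + n > p->size)
                return NULL;                   /* caller's block exhausted */
            out = p->base + p->used;
            p->used += n;
            return out;
        }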

        Of course, if you want speed, you have to learn to take advantage of the "short circuit" behavior of && and ||. Nobody's really going to notice the few nanoseconds used up by a single !strncmp(str1, str2, n), but when you process millions of rows from a database, it can make a big difference to guard the expensive call with a cheap test that short-circuits it:
        if (str1[0] == 'a' && !strncmp(str1, str2, n))...
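
        A toy, self-contained version of the same guard (hypothetical data; the point is that strncmp() only runs when the cheap first-byte test passes):

        #include <stdio.h>
        #include <string.h>

        /* Toy demonstration with made-up data: strncmp() is only called for
           rows whose first byte passes the cheap test. */
        int main(void)
        {
            const char *rows[] = { "apple", "banana", "avocado", "cherry" };
            size_t i;
            int hits = 0;

            for (i = 0; i < sizeof rows / sizeof rows[0]; i++) {
                if (rows[i][0] == 'a' && !strncmp(rows[i], "avo", 3))
                    hits++;        /* strncmp skipped for "banana", "cherry" */
            }
            printf("%d match\n", hits);
            return 0;
        }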

        The mindset we have now is a direct result of the prevailing attitude that memory is cheap and processors get faster. A friend of mine is no longer asked to interview prospective candidates because he would always ask questions about optimizing code and making it run faster. The candidates nearly always had the look of a deer caught in headlights, and these supposedly knowledgeable programmers (interviewee AND interviewers) couldn't answer these questions.

      • A little tip from when I wrote 8-bit embedded code...

        Our C compiler had an output format that would list the C code and the resulting assembly language intermixed. I wrote a quick little program that would read this, count the bytes of code per line, strip the assembly, and then just print out each line of C with the byte count at the beginning of the line.

        This was easier to look over, and you could see if some C expression was really bloated -- I'd then go and simplify the code.

        For example, I've been disassem
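
        A sketch of the kind of listing-cruncher described above. The listing format here is hypothetical (real compiler listings vary), so treat it as an illustration of the approach, not the original tool: assume source lines are echoed with a ';' prefix and assembly lines start with a hex offset followed by byte pairs.

        #include <ctype.h>
        #include <stdio.h>
        #include <string.h>

        /* Assumed (invented) listing format:
               ; x = a + b;
               0042  8B 45 08
           Prints each C line prefixed by its object-byte count. */
        int main(void)
        {
            char line[512], src[512] = "";
            int bytes = 0, have_src = 0;

            while (fgets(line, sizeof line, stdin)) {
                if (line[0] == ';') {          /* next C source line */
                    if (have_src)
                        printf("%4d  %s", bytes, src);
                    strcpy(src, line + 1);
                    bytes = 0;
                    have_src = 1;
                } else if (isxdigit((unsigned char)line[0])) {
                    /* count the two-character byte pairs after the offset */
                    char *p = strchr(line, ' ');
                    while (p) {
                        while (*p == ' ')
                            p++;
                        if (isxdigit((unsigned char)p[0]) &&
                            isxdigit((unsigned char)p[1]))
                            bytes++;
                        p = strchr(p, ' ');
                    }
                }
            }
            if (have_src)
                printf("%4d  %s", bytes, src);
            return 0;
        }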
    • by DeadVulcan ( 182139 ) <dead.vulcan@nOspam.pobox.com> on Monday February 07, 2005 @06:51PM (#11601694)

      nobody cares about optimizing code anymore.

      You can optimize in many different ways: for run-time performance, maintainability, extensibility, usability, compatibility, and probably a bunch of other ways I can't think of just now.

      Many of these are at odds with each other. And since computers are getting faster, I think it's perfectly reasonable to start trading off run-time performance against some of these other things.

  • by brian0918 ( 638904 ) <brian0918@gma[ ]com ['il.' in gap]> on Monday February 07, 2005 @05:16PM (#11600580)
    Windows 3.1 and Notepad run nice and fast on my 3.2GHz 8GB RAM box.
  • What about.... (Score:4, Interesting)

    by BWJones ( 18351 ) * on Monday February 07, 2005 @05:16PM (#11600581) Homepage Journal
    Computers are getting faster all the time, or so they tell us. But, in fact, the user experience of performance hasn't improved much over the past 15 years. Peter looks at where all the processor time and memory are going.

    Ummmmm..... No.

    A number of years ago, I had a project that required three days for each calculation. Just for kicks, when I got my dual G5, I ran the same calculation with the same parameters, and it completed almost instantaneously. Yes, yes... I know: memory-bound performance versus disk swapping of memory space, but at the time the memory on that system was maxed out (128 MB for $5000).

    I also know that one of the games I helped work through beta (Halo) would absolutely not run on hardware much older than a few years.

  • Clippy (Score:5, Funny)

    by rlp ( 11898 ) on Monday February 07, 2005 @05:17PM (#11600597)
    Someone needs to ask Clippy what he's doing with all those spare cycles.
  • by litewoheat ( 179018 ) * on Monday February 07, 2005 @05:18PM (#11600611)
    There's been so much change in the past few decades that people keep expecting the same amount of change everywhere. Many people know nothing else. UI as developed by PARC, then refined by Apple and Microsoft, hasn't really changed much except for evolutionary steps. There's no revolution coming. Cars have been driven the same basic way since the Model T. It's my firm belief that there will not be revolutionary things such as the printing press, radio, TV, and the Internet coming within the next 100 or so years. It's time to start refining what we've created rather than look to supplant it.
  • A few things (Score:5, Informative)

    by macklin01 ( 760841 ) on Monday February 07, 2005 @05:18PM (#11600613) Homepage

    Some good things that have eaten more memory and cycles (all of which have improved the user experience, as opposed to what the summary states):

    1 Programs that check your work as you go (e.g.: autocalculate on spreadsheets)

    2 More help dialogs, things watching for cameras, and whatnot to smooth the user experience.

    3 More use of IM and other software in the background much of the time.

    4 Services running so that it's faster to sort and search files, open your favorite programs, etc.

    In short, lots of stuff running to make your experience smoother, even if it doesn't look like it's doing much more.

    Some bad things:

    1 More viruses, etc.

    2 The mandatory virus scanner that has to run in the background all the time because of (1)

    3 All the crap adware that gets installed far more than it used to.

    These are just a few of the trends I can think of. -- Paul

  • by nharmon ( 97591 ) on Monday February 07, 2005 @05:19PM (#11600617)
    Where have all the cycles gone, long time passing
    Where have all the cycles gone, long time ago
    Where have all the cycles gone, gone to spyware everyone.

    When will they ever learn?
    When will they ev-ear learn?

    • by Buran ( 150348 ) on Monday February 07, 2005 @05:43PM (#11600919)
      Where has all the spyware gone? Long time passing?
      Where has all the spyware gone? Long time ago
      Where has all the spyware gone?
      Gone to spammers, everyone.

      When will we ever learn?
      When will we ever learn?

      (Apologies to Mikhail Sholokhov, Pete Seeger & parent poster)
  • by Hamlet D'Arcy ( 316713 ) on Monday February 07, 2005 @05:19PM (#11600618) Homepage
    Right now I have 12 windows open.

    So a lot of my cycles are going to managing my ability to work in several programs at once. My old iBook at home allows me to have all of two windows open at once... and with noticeable performance drops.
  • by toddestan ( 632714 ) on Monday February 07, 2005 @05:21PM (#11600645)
    That would be the number one way to waste cycles on any low-end system nowadays. I swear, I've seen P4's with Intel graphics run slower than PIII's with even a mediocre card in them.
  • by iMaple ( 769378 ) on Monday February 07, 2005 @05:21PM (#11600646)
    But, in fact, the user experience of performance hasn't improved much over the past 15 years

    Now that really depends on what you would call 'user experience'.
    Compare a file manager from 15 years ago (PC Tools had one, right? ... for DOS) to the KDE/Gnome file managers (OK, MC still looks the same :) ).
    Compare pine to Thunderbird (though I still use pine on my old laptop :) ).
    Compare Usenet clients, or say Lynx to Firefox.
    Compare Doom 3 to Pac-Man.
    Compare the fancy graphics on OS X to Win 3.1, or whatever OS the Mac had then.

    No sirrr, I say the user experience of performance HAS changed. Maybe not in direct proportion to the processor speed increase (due to code bloat?), but still it's much, much better. That's my $0.02.
  • by G3ckoG33k ( 647276 ) on Monday February 07, 2005 @05:21PM (#11600650)
    The Hello World of today is larger than it was ten years ago
    • title Hello World Program (hello.asm)
      ; This program displays "Hello, World!"

      dosseg
      .model small
      .stack 100h

      .data
      hello_message db 'Hello, World!',0dh,0ah,'$'

      .code
      main proc
      mov ax,@data
      mov ds,ax

      mov ah,9
      mov dx,offset hello_message
      int 21h

      mov ax,4C00h
      int 21h
      main endp
      end main

      "submitted by: bronson@engr.latech.edu (Patrick Bronson)"
    • It's true, 380K (WTF Mate?) [gnu.org] vs 83K in '93. Still, whatever happened to just good old
      #include <stdio.h>

      int main(void)
      {
          printf("Hello World!\n");
          return 0;
      }
      ?
  • Heh (Score:5, Insightful)

    by FiReaNGeL ( 312636 ) <`moc.liamtoh' `ta' `l3gnaerif'> on Monday February 07, 2005 @05:22PM (#11600651) Homepage
    I think the article writer just realized what a lot of computer buyers don't: CPU speed != more performance, ESPECIALLY when you look at graphical display and word processing (at least he didn't include "web surfing speed").

    Where are my CPU cycles and memory going on my AMD 3500+ with a gig of 400MHz DDR RAM? Most of the time, nowhere. 1% CPU usage, commit charge 150 megs / 1 gig. Honestly, if you don't use CPU-intensive apps, there's a limit to the 'improvement' you can expect in 'graphical display' and 'word processing' speed. But sales reps will tell you otherwise, for sure.
    • Re:Heh (Score:3, Insightful)

      by hackstraw ( 262471 ) *
      Where are my CPU cycles and memory going on my AMD 3500+ and 1Gig 400MHz DDR Ram? Most of the time, nowhere.

      That's true and untrue, depending on how you look at it. I have a CPU meter on my personal machine and I expect it to be hovering around 0. Right now it's got just a little activity going as I type this, because it is spell checking each word as I type. Much less than 10% of the CPU. (I wish OSes came with these things so that people were aware of what was going on with their machine, like if it's busy with something in the background.)
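
      For the curious, a bare-bones version of such a meter on Linux can sample /proc/stat like this (aggregate "cpu" line, field layout per proc(5); error handling mostly omitted, and it's Linux-specific by design):

      #include <stdio.h>
      #include <stdlib.h>
      #include <unistd.h>

      /* Read the aggregate "cpu" line from /proc/stat (Linux-specific). */
      static void sample(long long *idle, long long *total)
      {
          long long v[8] = {0};
          int i;
          FILE *f = fopen("/proc/stat", "r");
          if (f == NULL) {
              perror("/proc/stat");
              exit(1);
          }
          fscanf(f, "cpu %lld %lld %lld %lld %lld %lld %lld %lld",
                 &v[0], &v[1], &v[2], &v[3], &v[4], &v[5], &v[6], &v[7]);
          fclose(f);
          *idle = v[3] + v[4];               /* idle + iowait ticks */
          *total = 0;
          for (i = 0; i < 8; i++)
              *total += v[i];
      }

      int main(void)
      {
          long long i0, t0, i1, t1;
          for (;;) {
              sample(&i0, &t0);
              sleep(1);
              sample(&i1, &t1);
              printf("\rCPU: %5.1f%% ",
                     100.0 * (1.0 - (double)(i1 - i0) / (double)(t1 - t0)));
              fflush(stdout);
          }
      }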
  • Uhmmm, no. (Score:5, Funny)

    by Scratch-O-Matic ( 245992 ) on Monday February 07, 2005 @05:24PM (#11600694)
    I didn't rtfa, but..no. Nine years ago I used to start my word processor (Ami Pro!), then go take a leak while it loaded. What a BS claim.
  • by coffeeaddict007 ( 813128 ) on Monday February 07, 2005 @05:24PM (#11600697)
    I started out on an 8088 processor 11-12 years ago. Now I am using a dual-proc G5 at work, which is so fast I can no longer blame the computer for my coffee breaks. It takes a good bit of video rendering to keep it busy long enough for me to get a coffee refill.
  • by grahamsz ( 150076 ) on Monday February 07, 2005 @05:25PM (#11600710) Homepage Journal
    Generally (at least on my less-used Windows box) I have most of the following running most of the time...

    1) VoIP Client
    2) Messaging Client
    3) Word Processor
    4) Multiple Web Browsers
    5) Email Client
    6) Probably some graphics or photo editing tool
    7) Something playing music

    In addition there are various other background processes like desktop indexing, things watching for my digital camera being plugged in, smart start stuff...

    Linux is probably worse, since I keep Apache and often Tomcat running all the time.

    Back in the day, this was never how it was done. You'd optimize config.sys to get the absolute max amount of free conventional memory.

    Multitasking has improved to the point that many users probably run close to 100 processes at any point in time..

    prstat here says i'm on a system with
    Total: 3741 processes, 6739 lwps

    Fair enough, it's a shared box, but that scale was impossible a decade ago.
  • Mac OS X (Score:5, Insightful)

    by weave ( 48069 ) * on Monday February 07, 2005 @05:28PM (#11600745) Journal
    An interesting thing I've noticed since I got a laptop running OS X is that it seems every new release and patch for it "improves performance" in some area.

    So Apple is bucking the trend, or their first versions of OS X were an inefficient piece of crap and they are just now optimizing it.

    • Re:Mac OS X (Score:3, Interesting)

      Yes and yes. Apple got religion a couple years ago and got on the profiling tools [apple.com] bus internally.

      Mac users are demanding and impatient. All that typical slowness you see logging in, opening apps, closing windows, etc., with no feedback on XP makes Mac users want to pluck their eyes out.

      You can come out with something quite elegant, like iPhoto 2, but if the performance isn't there that's all you're going to hear about. Mac users will whine incessantly until it's fixed.
    • Re:Mac OS X (Score:5, Informative)

      by mj_1903 ( 570130 ) * on Monday February 07, 2005 @06:36PM (#11601539)
      I think it's a bit of both.

      Apple pushed to get OS X released to the public and so they followed the belief of "make it work then optimise". Today we can see the fruits.

      An example of this is Quartz. Quartz basically had all the components you needed in 10.0 to do some great on screen rendering and it was reasonably fast. Through each iteration of Mac OS X though it has improved. In 10.1 the speed of the code was improved. In 10.2 we had partial acceleration via the GPU. In 10.3 more optimising. In 10.4 we can see they have completely pulled apart sections of Quartz and rewritten it as well as buffering it all onto the graphics card. That is but one example though, there are plenty of others.

      On the other hand, apps like iPhoto and GarageBand were really sluggish and the system reflected that. Mac users cried foul and now you have iPhoto 5 which is blazingly fast and literally all the apps have been following that trend. I know as a developer myself I spend a good 20-30% of my time optimising code simply so users get the speed that they are now used to. It's good, we needed it, especially when we were stuck on the G4's. Now with the G5's it's just icing on the cake.
  • by bani ( 467531 ) on Monday February 07, 2005 @05:28PM (#11600750)
    just like users will manage to fill most of the storage space available (no matter how large that may be), user tasks will manage to fill most of the cpu available (no matter how fast the cpu is).

    the subjective performance of overall data processing hasn't changed much, but that's just because task complexity has increased as cpu speed increased.

    15 years ago, most applications were far less computationally complex than they are today. it has little to do with code bloat.
  • In the old days, ... (Score:4, Interesting)

    by Great_Geek ( 237841 ) on Monday February 07, 2005 @05:30PM (#11600769)
    In the old days, things were done with only a few cycles:

    Apple II (1 MHz 6502) did animated graphics with sound and controlled floppy access while polling the keyboard (The Bard's Tale)

    Amiga (7 MHz 68000) had a complete GUI and multi-tasking in 256K of RAM.

    The old saying that "Intel giveth, and Microsoft taketh away" is about right. The CPUs have gotten faster, with the Microsoft OS taking more and more cycles to do the same thing.

  • DCTI (Score:5, Funny)

    by Leroy_Brown242 ( 683141 ) on Monday February 07, 2005 @05:35PM (#11600830) Homepage Journal
    distributed.net [distributed.net] is where all the smart CPU cycles have gone! :)
  • by Thunderstruck ( 210399 ) on Monday February 07, 2005 @05:37PM (#11600850)
    My friends ask me why I haven't upgraded my 400MHz machine in years. Look at all this new stuff, they say; look at all this eye candy. Look at these great new games.

    And then I load up my MUD client, with simple, 16 color text in a 12 point font. This is my favorite game.

    And then I load up my word processor, AbiWord, which renders as fast as I can type and has a nice spell-checker. This is my favorite word processor.

    And then I load up Kmail, Mozilla, and all the other "normal applications" which have never had a problem with virii or worms.

    And after all this they realize, the problem with my computer is THEIR expectations, not my software and hardware.

    (And then they ask me when I'm going to replace my rotary phone... I can't win them all.)

  • by merlin_jim ( 302773 ) <{James.McCracken} {at} {stratapult.com}> on Monday February 07, 2005 @05:42PM (#11600898)
    Is that it is (like nearly every other system out there) an economic system; it's all about supply and demand.

    In this case there's a plentiful supply of cycles. There's an (existing) demand for a certain level of performance; if the supply outstrips that demand, then the supply is devalued, and consequently the programmers don't spend as much time conserving that resource.

    Or to put it another way, programs behave like a gas with respect to responsiveness and user expectation; they expand to fill the available space.

    Or to reword it another way (quoting from the article): computers are, in fact, doing more than they used to. A lot of the things computers do are fairly subtle, happening beneath the radar of a user's perception. Many functions are automatic and, as discussed in last month's column, you could probably do without some of them.
  • by nick_davison ( 217681 ) on Monday February 07, 2005 @05:44PM (#11600923)
    I haven't had to set an IRQ or DMA setting in years. I've not had to mess with himem or any other arcane memory configs and boot disks, restarting my entire system each time I want to run a different game.

    Each time I plug in a new joystick and it just works, each time I plug in a new digital camera and it's just there as another drive, each time I alt-tab out of a game, check a walkthrough website, then alt-tab back, I think back to the old days where code was really efficient and didn't do any wasteful background tasks like that.

    I remember helping a friend with a C++ assignment, via the net. Each time, she'd have to exit her telnet program, run Borland's C++ compiler from the command line, check the output, quit the compiler, reopen telnet, reconnect to the MUD we were talking over, then describe what had happened. Now... She'd just show me what's on her desktop via Messenger while we kept chatting.

    And if some cycles get used up doing weird UI gimmicks that I'll never use, like making the UI scalable so the partially sighted can use it, I'm willing to trade that.

    For all those reasons, I'm more than happy that my 2^(years / 1.5) faster PC "wastes" all of those extra cycles. And that's before we get on to things like built in spell checkers and real time code debugging as I write it.

    I don't want a 2^(years / 1.5) faster experience. I want all those cycles put in to making things work closer and closer to how I just expect them to work.

    I don't know about anyone else, but I can't code 2^(years / 1.5) faster, so I wouldn't be able to keep up with that damn responsive text-based compiler. On the other hand, I am that much faster overall, as I now call an API that adds all that "bloatware" instead of having to code my own damn mouse drivers, my code is largely debugged on the fly, and I can't remember the last time I lost several days just trying to format a newsletter into columns.

    So, before saying the cycles are wasted:

    Pick an everyday but semi-complex task that people do now. For example: for a homework project, go online, grab half a dozen graphics and ten blocks of text from various websites, and put them all into a stylishly laid out newsletter format. Do that on a P4, then do it on a DOS PC from 15 years ago.

    See if matching the same quality of work doesn't take you 2^10 times as long on that old PC, assuming you can even do it at all.

    Those cycles aren't wasted. Sure, we do the same basic tasks, but we do them with vastly more flexibility, and we don't have to waste days of our lives wrestling with configs to do what we now consider simple tasks. That's where the speed is.
  • by vertigo ( 341 ) on Monday February 07, 2005 @05:45PM (#11600939) Homepage
    I somewhat disagree with the premise of the article. 10 years ago, a barely usable word processor took 15 seconds to start up. I remember Wordperfect 6 being a terrible resource hog on my measly pc. Compare that with the load-times and responsiveness of AbiWord and a lot of progress has been made. The same with Mozilla/Firefox. Netscape 4.0 took something like a minute to load on my 386dx40/4mb iirc. Firefox takes 2 seconds (amd 1800+, 512mb) and has much more functionality. So, in my opinion, user experience of performance has improved. I think people might have forgotten how bad stuff really used to be performance-wise.
  • by neomage86 ( 690331 ) on Monday February 07, 2005 @05:45PM (#11600942)
    Five years from now: some *nix (most likely the latest iteration of Mac OS) on parallel Cell CPUs. Ten years: a unified virtual machine, so applications will be OS-independent (I think by Microsoft). Fifteen years: computing will be a commodity. I will pay $N a month for computer usage (based on what specs I want my computer to have). There will be ubiquitous dumb terminals (everywhere from home, to work, to school), and each will have a massive (by today's standards) data pipe. You authenticate, and it becomes your computer: your desktop of choice, your files, your preferences. There will be a few massive datacenters, so everything is amazingly scalable and centralized. If I want to do video editing, I will have access to supercomputers' worth of power while I need it, and while word processing I will use almost nothing. Right now, it looks like Google will run these datacenters.
  • by MobyDisk ( 75490 ) on Monday February 07, 2005 @05:45PM (#11600944) Homepage
    Software spending CPU cycles hiding complexity is a good thing. Software spending CPU cycles hiding simplicity is a bad thing. Many times, "wizards" are used that make things harder than the manual process. For example:

    The desktop files & folders paradigm is fine if marketing dweebs stop designing wizards that hide simplicity in a layer of complexity. What if I had a maid who said, "I see you just set a piece of paper on your desk? Do you want me to file it for you? Great, I'll just shred this original while I'm at it, and you can conveniently ask me to find it whenever you need it!"

    Example 1:
    My dad plugs in his digital camera, and it displays a camera wizard. Great! It asks for the album name and places it in a convenient album with a nice slide-show.

    The next day, he wants to edit one of the pictures, or copy it, or rename it. Too bad. Because it's now in a proprietary format in an album management program. The wizard was completely unnecessary. It would have been easier for him to create a folder and drag the files into it. It would have functioned in the normal way files and folders work. He would know where they are, and could open, email, rename, delete, etc.

    Another example:
    My mom inserts a CD and Media Player asks her if she wants to rip the files to the media library. It even does a CDDB lookup and names the albums accordingly. Great! So where's that .MP3 file? No? Maybe it's a .WMV file? .OGG? .WAV? No... it's in the media library. And there it lies forever. You can't play it with anything else. Now I show her how to use CDEX, and click the CDDB button, then the RIP button, then whoa! And she can do whatever she wants with it.

    Now I want to email that file. But I can't. Because it's not in a file on the file system, it's hiding in some "convenient" media library for me. And I want to view the pictures in the order the camera took them.

  • Vested Interests (Score:3, Insightful)

    by benjamin_pont ( 839499 ) on Monday February 07, 2005 @05:45PM (#11600945)
    Don't you think that if we all had zippy computers with slim, efficient operating systems and applications that made modest use of resources, and had only the features people wanted, then there would be a lot of bankrupt technology companies and unemployed programmers since no one would be upgrading their systems (much)?

    * Excerpt from Aldous Huxley's Brave New World *

    "We condition the masses to hate the countryside," concluded the Director. "But simultaneously we condition them to love all country sports. At the same time, we see to it that all country sports shall entail the use of elaborate apparatus. So that they consume manufactured articles as well as transport. Hence those electric shocks."

    "I see," said the student, and was silent, lost in admiration.


    By the way, current number of mouse-clicks to configure viewing an MS Outlook sender in a given color:

    17

    Don't $top that fat gravy train from rolling! Keep the bloatware coming!

  • Start-up time (Score:3, Interesting)

    by Trogre ( 513942 ) * on Monday February 07, 2005 @05:55PM (#11601062) Homepage
    It would be interesting to graph system startup times year-by-year with then-standard distros running on then-standard hardware. I suspect start-up times haven't changed significantly since the 70's.

    Does anyone here recall the famous, if not accurate, "Whoa, Win95 boots in under 3 seconds!!!" Usenet thread?

    Startup time is currently an area where the likes of Windows XP excels over Linux. On an Athlon 2600+, XP takes 6 seconds to boot (and become usable) whilst Fedora Core 3 takes closer to 90 seconds.

    Yes, both use prelinking (or prefetch if you like), but Linux distros still don't load independent services in parallel, and I suspect Fedora's prelinking is far from optimized.

  • by UnknowingFool ( 672806 ) on Monday February 07, 2005 @06:09PM (#11601244)
    at where all the precious processor time and memory are going.

    1) pr0n
    2) Sharing pr0n.

  • by WoTG ( 610710 ) on Monday February 07, 2005 @06:35PM (#11601532) Homepage Journal
    I hate talk about bloat, slowness, and lazy programmers. For the same tasks, computers are far easier to use today than even 5 years ago. Never mind the fact that we can also do about 10 times as many different tasks now, than we did when I first started using PCs 10-15 years ago.

    Yes, if I could find a floppy drive, and get a dos boot disk to boot, I could theoretically run a wickedly fast instance of WordPerfect 5.1. I wouldn't be able to surf the web, send email, listen to MP3's, work wirelessly, or work with graphics though -- and yes, graphics (e.g. diagrams) do have a proper place in day to day work!

    Do people even remember the non-PnP days? IRQs, I/O ports and the rest? Non-multitasking? No memory protection (i.e., complete OS crashes from app errors)? These issues didn't seem so bad back then, since "that's the way it was", but now I dread ever having to deal with those limitations again. Futzing with IRQs for an hour just to make a modem stop locking up a PC is not my idea of productivity.

    Hardware is cheap. Time isn't. I just hope we keep finding more ways to make my use of computers even easier.
  • by rsw ( 70577 ) on Monday February 07, 2005 @06:46PM (#11601645) Homepage
    I don't know about you guys, but back when I was running Linux on a 33 MHz 80386, my kernel compiles ran overnight. Now they take, oh, ten minutes.

    I'd say that's an improvement, wouldn't you?
  • by harlows_monkeys ( 106428 ) on Monday February 07, 2005 @08:19PM (#11602491) Homepage
    I disagree that the user experience of performance has not improved over the last 15 years. 15 years ago, I was using a Mac II.

    When I bought a Centris 650 in the early 90's, it was noticeably faster--so much faster that I brought it to work to show my boss, as I was sure he would not believe my stories of how fast it was.

    This same thing has happened to me with every generation of PCs, too...it's not just a Mac thing. I buy a new machine, and marvel at how much faster it is.

    Furthermore, I can go the other way to verify this. I still have my Centris 650 in storage, and booted it up a couple years ago. It was so slow that I could not believe that I ever found such a slow machine usable.

    What is really going on is that it doesn't take us long to get used to a fast machine, and since we normally never go back, we don't realize just how much faster things are now.

  • Thats easy (Score:4, Funny)

    by BCW2 ( 168187 ) on Monday February 07, 2005 @09:31PM (#11602818) Journal
    I had a new Dell 3.4GHz come into the shop last week. It was slower than our cash register box (a PII 450). Of course, any box out there with 64 processes running at startup would be a bit slow. The customer had the box for 3 weeks. First scan with Ad-Aware: 2803 critical items. A new store record. Plus 247 on Spybot, 8 virii, 15 trojans. I'm really surprised it didn't blue-screen at boot (we had 2 of those last week).

    Crap uses up processor time.
