Things That Turbo Pascal Is Smaller Than
theodp writes "James Hague has compiled a short list of things that the circa-1986 Turbo Pascal 3 for MS-DOS is smaller than (chart). For starters, at 39,731 bytes, the entire Turbo Pascal 3.02 executable (compiler and IDE) is less than 1/4th the size of the image of the white iPhone 4S at apple.com (190,157 bytes), and less than 1/5th the size of the yahoo.com home page (219,583 bytes). Speaking of slim-and-trim software, VisiCalc, the granddaddy of all spreadsheet software, which celebrated its 32nd birthday this year, weighed in at a mere 29K."
Killer App (Score:3, Interesting)
Re: (Score:2)
You could fit VisiCalc into the code space of an Arduino. Good luck fitting much of a spreadsheet and the system variables into 2K of RAM, though.
Re: (Score:3)
Thankfully it's still covered by copyright, or the whole market would implode!
Pascal v/s C (Score:3)
But under 'C' it felt somehow very natural and easy to understand, never had a problem with 'C' pointers, and data structures. R.I.P. DMR, your book really opened my eyes to the wonderful world of computing.
Re: (Score:2)
Really? I felt it was the other way around: pointers on Pascal felt intuitive, while pointers in C made my programs more prone to failure.
Re:Pascal v/s C (Score:4, Interesting)
It was the same with me. I learned Turbo Pascal and knew about pointers, but only when I switched to C did I realize that pointers are numbers, like indexes into an array.
There were a lot of things that were easier to understand in C than in Pascal. For example, scanf and printf were just library functions, while in Pascal readln and writeln were part of the language. Also, what "#include" did was perfectly clear - a simple text substitution, i.e. the same as if I had gone to the header and copy-pasted its contents into the .c file - while in Pascal, when you write "uses crt;", I wasn't sure what actually happened. The fact that text was an array of numbers was not clear to me while I was using Pascal, what with all the Chr and Ord functions to move between Character and Integer, and strings were part of the language and were like black boxes.
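The post gives no code, but a minimal C-style sketch of the "pointers are numbers, text is numbers" idea might look like this (illustrative only):

    #include <cstdio>

    int main() {
        char text[] = "Hi";                          // text really is an array of small numbers
        std::printf("%d %d\n", text[0], text[1]);    // prints the ASCII codes 72 and 105

        int a[3] = {10, 20, 30};
        int *p = a;                                  // a pointer is just an address, i.e. a number
        std::printf("%p -> %d\n", (void*)p, *p);
        ++p;                                         // adding 1 moves to the next int, like an index
        std::printf("%p -> %d\n", (void*)p, *p);
        return 0;
    }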
Re: (Score:3)
Text is not an array of numbers. That's why a lot of applications required heavy-handed porting to unicode. Text is an array of characters. I don't routinely need Chr nor Ord (nor their equivalents in any other language). Their use is pretty much limited to a somewhat hacky implementation of base conversion for I/O, and even there it's a safer bet to use a lookup array unless you really need to optimize memory use.
If you have a string and want ASCII output (or input), you use a codec to do it. Any non-trivi
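The lookup-array approach mentioned here isn't spelled out in the post; a hypothetical sketch (the function name to_base is made up):

    #include <string>

    // Convert a non-negative value to a string in the given base (2..16) using a
    // lookup array instead of Chr/Ord-style arithmetic. Illustrative only.
    std::string to_base(unsigned value, unsigned base) {
        static const char digits[] = "0123456789ABCDEF";   // the lookup array
        std::string out;
        do {
            out.insert(out.begin(), digits[value % base]);
            value /= base;
        } while (value != 0);
        return out;
    }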
Re: (Score:3)
That's because you learned what pointers were in Pascal first. I did that too, Pascal first, then C. C made sense because I knew what pointers were.
--PM
Re: (Score:2)
Bytes? (Score:5, Funny)
Bytes are kind of weird. Can't they give these numbers in terms of Library of Congress?
Re: (Score:2)
[deadpan]
I'm confused. They say the size was 24K. What's a K? Is that a typo, shouldn't it be M?
[/deadpan]
Fraction of an M (Score:2)
Re: (Score:2)
I don't know how many Lines Of Crap the library of congress has, does anyone know?
So what? (Score:2, Insightful)
First thing that comes to mind is: so what? This whole argument that smaller is better is crap. The reason that software is bigger these days is that it does more for you. How productive was the GUI for Turbo Pascal (it sucked)? How good were the other tools that came with it (nonexistent)? How fast were the release cycles (about the same as today)? So with what people call bloat, we get better tools that make us more productive, thereby driving down the cost of software development.
Or to put it another w
Re: (Score:2)
Chill. It's a typical /. entry about how grass was greener (I testify, by the way, being a relative geezer).
Re: (Score:2)
"touch" does more than a Pascal compiler?
Re: (Score:2)
Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?
I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?
The reality is that people don't optimize software because they generally don't have to outside of embedded programming. Even software for iOS and Android can be horrendously bloated in terms of the resources it uses. Interestingly, it's console hardware limitations that have been leading to bet
Buying one SUV vs. two vehicles (Score:2)
I think a more accurate comparison might be to ask why you need a Hummer to take your two kids to school when you can do it equally well in a Ford Focus which is half the size and probably 10 times as fuel efficient?
If there are in fact situations where one needs a Hummer or similar SUV, is it cheaper to fuel an SUV than to buy an additional small passenger car for trips that don't need the SUV?
Re: (Score:2)
Re:Smaller IS better (Score:3)
Sorry, your post went a little south with the Vista debacle. Because MS had to fast track a recovery of a whole new architecture, the result was not optimized at all ... and netbooks got crushed. Windows 7 is more sensible because they did have time to snip out a lot of the junk code.
Currently we're disparaging the need for tight code, but give it one skipped cycle of Moore's law and suddenly the software side will have to take up the slack. Currently it's the mobile phones with their weaker processors that
Re: (Score:2)
Currently it's the mobile phones with their weaker processors that are preventing "dock your phone into a workstation shell" from being the universal desktop in your pocket.
Are you sure it's just the processor, or is it Apple's restrictions on what gets accepted into its App Store?
Re: (Score:2)
Intellisense/autocompletion is the only IDE tool that is vaguely useful (though vastly overrated). You can use vim, or hell, notepad, to do all your coding, for real projects, along with gcc and make.
Code is/can be compact (that's why demos can be so small), and even in today's world, there's still value in knowing how to code compactly.
Re: Code Compactly in Notepad (Score:4, Interesting)
Speaking of bloat, there's a humorously insightful article here: http://www.trygve.com/doomsday.html [trygve.com]
Those WYSIWYG creators produce the most gawdawful code full of
Re: (Score:2)
The reason that software is bigger these days is that it does more for you.
It depends on whether I actually want that "more". For example, I use mpc-hc for playing music files because I only want a media player, not a media player + CD ripper + MP3 encoder + CD writer + music store + something else. So, for me, iTunes is bloated because the software has all those functions I do not usually need. When I need them, I can start another small program to do that function.
Re: (Score:3)
>The reason that software is bigger these days is that it does more for you.
Like Clippy?
Re: (Score:2)
Smaller is usually better for a given functionality. Your point that software does more is valid too.
To interject with my own babble, why does a composited desktop require OpenGL? That stuff can be done with a few K of highly optimized C or ASM code. Another example of bloat in the name of easy to implement - never mind the added depen
Re: (Score:2)
The reason that software is bigger these days is that it does more for you.
That doesn't explain everything. In fact, it doesn't even explain very much. The only thing on that chart that does more than Turbo Pascal is the Erlang parser: the rest either deals with simple Unix commands or libraries, or is simple documentation (which doesn't do anything by itself).
Or to put it another way: if you really don't like bloat, when are you going to trade in your car and start driving to work in a hot wheels?
The Hot Wheels car doesn't do what I want. De-bloating software means cutting out unnecessary crap, not necessary things.
Small, yes, but keep some perspective... (Score:2)
Sure, these programs were small, but try to keep in perspective that they were leveraging the OS to get their compactness. It's kind of like saying a "Hello, World" GUI app is only 3 lines of code.... sure, 3 lines, plus 35 megs of library files running atop 1 gig of OS support. The .exe may only be 2k, but good luck getting that to do anything without serious support.
Re: (Score:2)
In the case of Turbo Pascal, probably pretty much everything was in that .exe. Don't forget there was no dynamic linking of libraries, let alone much in the way of libraries on MS-DOS in the first place. There would have been various int# calls within the executable to call the "operating system" but the OS (BIOS + DOS) itself wasn't all that much bigger.
Certainly in the case of the likes of spreadsheet programs for 8 bit systems, at most you probably had a 16K ROM in addition to the program itself, so stil
Re: (Score:2)
Even DOS provided a lot of high level file, screen and keyboard I/O functionality.
Yes, DOS itself wasn't that big, and, hey, I've implemented entire 8-bit systems with significant RS232 based I/O on 16K PROMs with no OS, so, yes, it can be done smaller.
I just get tweaked when somebody calls out "look ma, it's only 4K" and they're sitting on top of some ginormous library that's doing everything for them.
Not an EXE, it was a COM file, people (Score:4, Informative)
My suspicion is/was that the RTL (run-time library) was hand-coded in assembly language, and judging from the .COM file sizes of stuff compiled with Turbo Pascal 3.0, that RTL ran maybe about 10-12K. That is, the Turbo Pascal image had the hand-coded RTL in the first 12K of the image, and the rest -- editor and Pascal compiler -- was written and compiled in Turbo Pascal and occupied the remainder, which was about the size/scale you'd expect for a simple editor and a Pascal compiler, based on the complexity of source code for those things that were "around." The cool thing, especially on dual floppy disk PCs, was that the 39K was everything, no overlays, no nothing else. The 12K RTL got plopped into the COM file compiled from your source code.
The thing about it is that yeah, yeah, you had the limitations of Pascal, the Small memory model, 64K data segment, and Borland didn't even get the 8087 math coprocessor support right (inline instead of high-overhead function calls to a math library) until Turbo 4, which wasn't anywhere as kewl as Turbo 3 from the standpoint of compactness. But you could develop useful apps with this thing on a dual-floppy machine.
The other thing about this is the Pascal language itself. I had a conversation with a dude who was selling some 3rd party library for the Turbo Pascal ecosystem who expressed the view that, hate the begin-end, hate the quirky use of semicolon as a statement "separator" instead of "terminator", hate the bondage-and-discipline aspects as you may (although the Turbo dialect of Pascal solved the fixed-length string problem and gave you enough overrides to the Pascal type safety to let it do anything C can), Pascal is the Ur single-pass-compiler language. I guess the arch-language of simple parsing at the expense of stupid-looking source would be Lisp, but Pascal was close behind in terms of simple syntax and simple compiler implementations. Back in the day, before we had Cray Y-MPs on our desks as we effectively do today, having large programs compile in the time of a sneeze instead of a long coffee break was a huge, huge productivity booster that made up for whatever people hated about Pascal.
So ol' Nicky Wirth was a smart dude when he invented Pascal, and Anders Hejlsberg (Philippe Kahn was just the front man) was also a smart hacker in coming up with Turbo 3, and you have to give the man his propers in hackerdom. For what it is worth, Hejlsberg crossed over to the Dark Side and is credited as the Chief Architect behind the abortive Microsoft Java ecosystem, J-somethingoranother, from which came the good Visual Studio versions, C#, and all of that.
Re: (Score:2)
Turbo Pascal 5.5 had overlay file support, which was something like a dynamically linked library. I don't know whether older versions could do that; I never used them.
Re: (Score:3)
Re: (Score:2)
"they were leveraging the OS to get their compactness"
You do realize 1986 Turbo Pascal ran on DOS, right? Back when developers wrote directly to the monitor and disk drives? When we had to have an intimate relationship with the stack? When there were no threads and one process owned the entire CPU?
Leverage that, young whippersnapper, and get off my lawn.
Re: (Score:2)
DOS did more than people realize. I'm not saying Gates was a genius or anything, but there's a reason he got a lock-in: if DOS was truly useless, people would have re-coded their own drivers.
I seem to remember purchasing a third party graphics library in the early '90s, MetaWindows or something like that; it gave me access to accelerated graphics that DOS wouldn't, without having to re-code for every graphics card on the market. But, DOS still took care of the file system and keyboard (I think MetaWindo
Re: (Score:2)
Re: (Score:3)
Think about this: the entire image of Pascal and DOS would run in what we would consider to be 'minimum CPU cache' these days...
Oh. Maybe 3 years ago there was a comment on Slashdot where some guy said he was able to disconnect the RAM of a running system on the fly, and Windows 95 ran for a good while straight from the cache of a modern CPU. The core components did fit there.
This just in (Score:5, Insightful)
Software grows to fill the available ram.
Code is always a tradeoff between code size, development time and ram needed for execution. I'm fairly sure you can optimize code today to a point that would put those programs (which were optimized 'til they squeaked to squeeze out that last bit of performance) to shame, but why? What for? 30 years ago, needing a kilobyte of ram less was the make-or-break question. Drivers weighed in at the 10kb range, you still calculated which ones you absolutely needed to load for the programs you planned to run, and you turned off anything and everything to get those extra 2 kb to make the program run. Today, needing a few megabytes of ram more is no serious issue. And mostly because it just really doesn't matter anymore. Do you care whether that driver, that program, that tool needs a megabyte more to run? Do you cancel it because it does? No. Because it just doesn't matter.
We passed the point where "normal" people care about execution speed a while ago. Does it matter whether your spreadsheet needs 2 milliseconds longer to calculate the results? I mean, instead of 0.2 you now need 0.202 seconds, do you notice? Do you care? Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?
Because, and that's the key here, optimizing code takes time. And that costs money. Why should anyone optimize code if there's really no need for it anymore? And it's not the "lazy programmers" or the studios that don't care about the customers. The customers don't care! And with good reason they don't. They do care about the program being delivered on time and for a reasonable price, but they don't care whether it needs a meg more of ram. Because it just friggin' doesn't matter anymore!
So yes, yes, programs back in the good ol' days were so much better organized and they used ram so much better, they had so much niftier tricks to conserve ram and processing time, but in the end, it just doesn't matter anymore today. You have more ram and processing power than your ordinary office program could ever use. Why bother with optimization?
Re: (Score:2)
Today, you can waste processing time on animating cursors and create colorful file copy animations. Why bother with optimization?
Because some people have netbooks with Atom CPUs, and some people still use really old PCs because they're paid for. Clock-for-clock, a 1.6 GHz Atom is roughly comparable to a 1.6 GHz P4.
Re: (Score:2)
Even 1.6GHz single core is already way more than you could possibly need for pretty much any office application. We outgrew the office needs a long, long while ago. My guess would be somewhere around the P2/P3 era.
We need "faster" to run more crap, that's pretty much it. And yes, running a bloated OS on that atom clock will not result in satisfactory performance. It's not the program that needs more ram or more HD space that's the problem here, though. It's not even the program's CPU hunger. It's the many,
Uninstalling helps but isn't perfect (Score:2)
It's the many, many little programs that litter your ram, from drivers for god knows what kind of freaky hardware you rarely if ever use, to "enhancements", and the omnipresent crud you get for free with every new Netbook.
Uninstalling unused programs helps, and it's the first thing I try when family members tell me a computer is running slow. But the fact remains that some programs won't run acceptably on a netbook or an old P4 PC even after I've uninstalled all that shit. I've seen applications fail to run because their forms are laid out for a monitor at least 768 pixels tall. I've seen Adobe Flash fail to keep up on Flash videos, and I've seen Firefox fail to keep up on HTML5 videos. I've seen native games fail to run bec
Re: (Score:2)
Old PCs should run with software from the same era. You can't expect today's software to run on hardware from 10 years ago, or else why stop there? Why not ask for Windows 7 to run on a 486? OSX on an Apple II?
As for Atoms, they have no problem running Windows 7 or the latest Linux distros.
Re: (Score:2)
Explain please how using less ram and less HD space saves any energy.
Making a new computer costs energy (Score:2)
Re: (Score:2)
Sorry, won't fly.
Using SSD instead of HDD would not fly for the obvious reason I gave in the entry of my first post: Space in Ram or HD will be filled by programs. There are plenty of SSDs today that have more than enough room to hold OS, office and data files. Saying that the HD footprint of a program keeps you from buying a SSD is like saying that the size of your computer keeps you from buying a nicer, but smaller, apartment closer to the city. It's not the program that keeps you from buying a SSD. It's
Re: (Score:2)
The point of saving ram is that you can get away with actually having less of it available in the first place, thereby saving energy that would be needed to power the extra transistors that would have made up the unused portion of ram.
Of course, this is much more important for things like embedded devices than it is for PC's.
Newer editors support more languages (Score:3)
Take a simple text editor from 10 years ago and compare it to a modern one. The modern one doesn't really do much more
The old editor supports only 8-bit encodings of left-to-right characters, which means English and a few other European languages. The new Pango-powered editor supports UTF-8, large character inventories, stacked diacritics, and bidirectional writing, which allows for every national language on the planet including Chinese, Arabic, and other official languages of emerging economies.
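Neither post shows code, but a rough sketch of one small part of that extra work - counting UTF-8 code points rather than bytes - might look like this (illustrative only; a real editor also needs grapheme clusters, shaping and bidi):

    #include <cstdio>
    #include <string>

    // Count UTF-8 code points by counting only lead bytes (continuation bytes
    // have the bit pattern 10xxxxxx).
    std::size_t utf8_length(const std::string& s) {
        std::size_t n = 0;
        for (unsigned char c : s)
            if ((c & 0xC0) != 0x80) ++n;
        return n;
    }

    int main() {
        std::string s = "na\xC3\xAFve";   // "naive" with a diaeresis: 6 bytes, 5 code points
        std::printf("%zu bytes, %zu code points\n", s.size(), utf8_length(s));
        return 0;
    }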
Re: (Score:2)
Energy consumption.
Speed
I hated Turbo Pascal (Score:4, Interesting)
In my first job, I was responsible for developing a programming environment for a Pascal-like language that included a visual editor, interpreter and debugger. I remember my boss showing up in my office and showing me an ad he had cooked up, with big, bold lettering saying "Runs in 256 kB!"
As a young developer, it was one of the tougher moments in my life to admit that we were going to need a full 512 kB.
It was difficult living in a world where Turbo Pascal ran comfortably on a 64 kB machine.
Coincidence (Score:2)
39,731 is the exact number of milliseconds it takes to lose interest in an Apple fanboi blog.
Pictures are large (Score:2)
Yes, it does stand to reason that something from 1986 is smaller than IMAGE files (yahoo's homepage, wiki's C++ page, PDFs, etc). That 256-bit color depth means we need 256 bits to define the colors after all.
Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?
I would be tempted to call this a slashvertisement, but they're not even advertising something. Where's the 'pointlesslyIdle' tag when we need it?
Re: (Score:2)
Instead of comparing to PDFs, image files, and things written 30 years later, why not compare to contemporaries?
He also compares it to the 'touch' command. Not even close to being an editor, let alone a compiler as well. And still bigger.
Re: (Score:2)
Sidekick (Score:2)
that chart (Score:5, Funny)
At 88,225 bytes, the image showing the comparison is also bigger. Oh the irony
Turbo Pascal Rocked (Score:2)
"Doesn't this violate the laws of thermodynamics?" (Score:2)
That's what my friend and boss Wayne Holder said about Turbo Pascal when I demo'd it for him back when it first came out in the early 80s. It wasn't just that TP was vastly smaller than any other Pascal (C, FORTRAN, etc.) compiler out there, it's that it compiled much, much faster -- in some cases, an order or two of magnitude faster. ..bruce..
Despite being Pascal, it was tight. (Score:4, Interesting)
Turbo Pascal was pretty sweet, even though it came from Borland, and even if it was Pascal. It could compile 5,000 lines of code in the blink of an eye. Embedding assembly into it? No problem. It didn't care. The editor was supreme as well. Even when I stopped using TP, I still used the editor every day for a decade after the fact because it could do absolutely everything.
I'm not sure where all the hating is coming from, because TP did not generate hugely bloated executables. The only problem with it was that it eventually was discontinued, so special hacks like paspatch were required to patch TP compiled executables on the P II and higher to allow them to run.
It was actually closer to 512K with all of its dependencies, but it was damn fine.
Re:Quite sad how bloated everything is (Score:4, Insightful)
I think back to playing vast adventure games, like Below the Root, that amazingly fit on two sides of a 5.25" floppy; the same game now would probably be written to take up a CD-ROM, even using the same graphics. Programmers have lost the ability to optimize.
I think they have the ability (some of them at least) but don't have the need or the time.
Re: (Score:3)
Re:Quite sad how bloated everything is (Score:5, Interesting)
And that attitude is why we lost the phone and tablet markets. There was a time when Linux was perfect for older systems... the sort of specs that also happen to match up with new small platforms. But we got that 'screw em, let them buy a real computer' attitude and now /bin/touch on my Fedora 15 laptop is 60856 bytes. The little gadget in my XFCE tray to allow me to control the backlight is currently reporting 6200K in resident set. XFCE is supposed to be the 'lighter' alternative to the GNOME freak show. Ever wonder why Google passed all the userland by and made their own for Android? Well now you know and your attitude is what caused it.
Nokia was stupid enough to believe they could build small devices by reusing parts of the Linux desktop; they failed. Good grief, look how much bloat is in little things like esd or pulseaudio. Megabytes of resident set sitting around in case something wants to make a sound? In hardware that had as little as 64MB of RAM (Nokia N770 tablet), that sort of resource misuse killed them.
There was a time when System V UNIX, running on a machine with a MB or two of RAM, with terminals hanging off serial ports and a couple tens of megabytes of hard drive, could run a retail operation.
Yes, there is something to be said for trading developer time for hardware. The place to do that is vertical apps and other applications where the number of deployed systems is small compared to the developer hours available. In a mass-deployed application, the developers should be required to care a little more about what they are asking millions of users to throw away to the great God of the upgrade treadmill.
Re:Quite sad how bloated everything is (Score:4, Insightful)
Then again, today's programmers would rather import the whole STL just to be able to use one String, rather than take 15 minutes to write their own class. (oops, they couldn't write one in 15 minutes ... oh well ...)
Now everyone is writing their own String class, and you have to pay them for that effort. That 15 minutes may not seem like a lot, but if everyone is doing shit like that all the time, the costs will add up. Also, at some point you will want to interoperate with some third party library. Wouldn't it be great if there were some sort of standardized String class so you don't have to convert from your String to their (inevitably screwy) String class? Repeat this for many data structures and third party tools and libraries.
Higher level languages didn't arise for the hell of it; if we needed to be worried about 128k of RAM, we'd still be writing code like we did in the old days. Now, we don't have to (minus certain domains), so why not trade space for time / money? We make all kinds of optimization trade offs already; ease of maintenance tends to not be one we often think of.
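For the interoperability point above, a minimal sketch (the third-party function is hypothetical):

    #include <cstdio>
    #include <string>

    // Hypothetical third-party C-style API that expects a null-terminated string.
    void third_party_log(const char* msg) { std::printf("%s\n", msg); }

    int main() {
        std::string greeting = "hello";     // the shared, standardized string type
        greeting += ", world";              // growth and memory are handled for us
        third_party_log(greeting.c_str());  // interop: hand over a plain C string
        return 0;
    }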
Re:Quite sad how bloated everything is (Score:4, Insightful)
First, you only have to have ONE person in your org write that custom string class that does exactly what you want, no unpredictable side effects, no bloat.
That 15 minutes pays itself back almost immediately, both in easier debugging (less code to debug) and quicker compile times,
I wouldn't say it's less code to debug, but more, because now you have to maintain your string class.
Second, 3rd party libraries are always going to be a problem, but usually you just give them a pointer to (a copy of) the data structure, never to your class. No big deal. A null-terminated string (C-style string) is a null-terminated string. A string with the first n bytes giving the actual size (a Pascal-style string, which can also be used for BLOBs) is a string with the first n bytes giving the actual size. These are the two standard ways of modeling string data.
These are the standards, but Crazy Ass Corp Super Deluxe Hyper-whatever Library is going to do whatever you're not doing, and in a way that you can't just point to the internal C string. Never underestimate vendors: instead of your nice, 1960s-style null-terminated array of 1-byte characters, they're going to use an array of 64-bit integers where they've packed in multiple 8-bit characters, but have decided to leave the last byte of each 64-bit integer as 0xFF for future use, and they use EBCDIC. Yes, this example is highly contrived and nonsensical, but never underestimate the inability of your colleagues to write software.
This doesn't even touch on the STL's various algorithms to e.g. loop over all characters in a string and perform a function. Again, it's easy to write it yourself, but the STL is written in a nice, general fashion that makes it easier to interoperate, makes it easier to understand what is going on, and doesn't require you to continuously reinvent the wheel. Yes, once you write your BetterString class, you can reuse it, but over time you will keep adding functionality to it until it becomes std::string, and your office on the other side of the country may not know and may have written their own, etc.
But you don't pay for it over and over again: your target audience is running machines with at least 512MB of RAM, very likely 1GB of RAM at least, and many will have at least 2GB of RAM. Saving, say, 100K of RAM by not using the well-defined library string class is not a useful optimization outside of more specialized problem domains. Outside of your underpowered embedded-type systems, or extreme high-performance games with custom memory allocation, or massively parallel real-time trading applications where each 0.000001 nanosecond of delay costs you trillions of dollars, that 100K is dwarfed by every other facet of your program for anything nontrivial. Grandpa's PDP-11 can't even run your target OS(es), so why do we care whether our program might be RAM-lean enough to fit in its memory?
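For reference, a byte-level sketch of the two representations quoted above (not from the original thread):

    #include <cstdio>
    #include <cstring>

    int main() {
        // C-style: the characters followed by a terminating zero byte.
        const char c_style[] = {'H', 'i', '\0'};

        // Turbo Pascal-style short string: a length byte, then the characters.
        const unsigned char pascal_style[] = {2, 'H', 'i'};

        std::printf("C string length: %zu\n", std::strlen(c_style));          // scans for '\0'
        std::printf("Pascal string length: %u\n", (unsigned)pascal_style[0]); // reads the prefix
        return 0;
    }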
Re: (Score:3)
So rather than locking just a small section, if you're using the STL you end up having to go with more coarse-grained locks, because otherwise subtle bugs emerge under load.
You honestly don't believe that handling thread synchronization at container/primitive level is the right thing to do, do you?
You honestly don't think it's good architecture even needing a lot of locking in parallel code, do you?
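A sketch of what "synchronization at the container level" tends to look like in practice - one coarse lock around the whole STL container - which is exactly the pattern being questioned here (illustrative only):

    #include <mutex>
    #include <vector>

    // One mutex guards the whole container, since STL containers make no
    // per-element thread-safety guarantees of their own.
    class SharedLog {
    public:
        void add(int value) {
            std::lock_guard<std::mutex> lock(mutex_);
            entries_.push_back(value);
        }
        std::size_t size() const {
            std::lock_guard<std::mutex> lock(mutex_);
            return entries_.size();
        }
    private:
        mutable std::mutex mutex_;
        std::vector<int> entries_;
    };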
Re: (Score:3)
Desqview... thank you for bringing up such awful memories. I remember lusting after that stupid thing for so long, only to discard it about 20 minutes after installation.
OS/2 Warp ftw.
Re: (Score:2)
Programmers have lost the ability to optimize.
The ability and the need (programmers of embedded systems may be an exception). I think nothing of allocating an input buffer that is larger than the entire memory of the first machine I worked on (a Z80 box with 2K of memory).
New NES games (Score:3)
The ability and the need (programmers of embedded systems may be an exception).
That and dedicated TV games [wikipedia.org], which commonly use [wikipedia.org] an architecture not unlike that of the Nintendo Entertainment System. New games are still being developed for the NES [nintendoage.com], and many are roughly the size of Turbo Pascal or smaller.
Re: (Score:2, Informative)
Obviously Your machine is poorly configured. My $400 Dell notebook starts Win7 in about 15 seconds.
Re: (Score:2)
I call BS. You can't even get through the BIOS bullshit on a Dell in 15 seconds.
Re: (Score:2)
Obviously Your machine is poorly configured. My $400 Dell notebook starts Win7 in about 15 seconds.
Yeah, then you spend three minutes waiting for all the crap to load after you log in.
Re: (Score:2)
Waking from sleep does not count.
Re: (Score:2)
Crazy. Even today, in code review, I see programmers who have implemented exponential sorting algorithms. It's almost as if the existence of faster CPUs and larger memory has enticed some to be extremely lazy. But it's never enough for some. Example: my MacBook Air goes from "ding-chord!" to signed in and usable in about 15 seconds. For Windows 7 on the very same machine, the number is about 3:30, including 30 seconds of watching a fucking white cursor blink on an otherwise black screen! What the hell is it trying to do?
Do programmers still implement sort algorithms? I thought whatever library/framework they're using took care of sorting for them?
Re: (Score:2)
There's always somebody who thinks they're smarter than the standard libraries. And you'll always get stuck fixing their "innovations."
Re: (Score:2)
They don't set out to implement a sort algorithm, they do it unknowingly. They don't recognize that the problem they are working on involves sort, but yes, generally any library/framework sorting call they could otherwise have made would have sufficed--and saved them a lot of time...
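A contrived sketch of both versions (not from the thread): the hand-rolled quadratic sort that creeps in unnoticed, and the library call that would have sufficed:

    #include <algorithm>
    #include <utility>
    #include <vector>

    // The accidental version: a hand-rolled selection sort hiding inside
    // "repeatedly find the smallest remaining item" logic -- O(n^2).
    void sort_slow(std::vector<int>& v) {
        for (std::size_t i = 0; i < v.size(); ++i)
            for (std::size_t j = i + 1; j < v.size(); ++j)
                if (v[j] < v[i]) std::swap(v[i], v[j]);
    }

    // The library call the poster has in mind: O(n log n) and already debugged.
    void sort_fast(std::vector<int>& v) {
        std::sort(v.begin(), v.end());
    }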
Re: (Score:3)
Cute, but I would argue that the concept of optimizing slow code after the fact is a prima facie thought crime. The biggest performance gains come from choosing the right algorithms in the design stage.
Re:MOD PARENT DOWN!!! (Score:4, Informative)
That 30 seconds of cursor blinking? The bootloader hack that you used to make your unlicensed copy of Windows 7 think that it's genuine is waiting for feedback from the bios. The 30 seconds is how long it takes to give up and continue booting. There are other/newer hacks that avoid that issue.
Re: (Score:2)
Wrong. Valid, licensed, genuine, blah blah blah. Google "black screen blinking cursor"...
Re:MOD PARENT DOWN!!! (Score:5, Insightful)
"It's almost as if the existence of faster CPUs and larger memory has enticed some to be extremely lazy"
Or just made them focus on getting stuff done rather than implementing optimisations no one will ever notice.
Oh, and your MacBook startup vs. your Windows startup? That's because Windows supports an ever changing set of hardware configurations and retains support for legacy software. Your MacBook has the luxury of retaining a relatively small set of hardware configurations and Apple being happy to chuck backwards compatibility out the Window.
Sure, Windows is slower to boot up, but it works on more hardware and has superior backwards compat. Sure, your MacBook has poor backwards compat. for older Mac software and won't ever support some hardware configurations, but it's got a better startup time. Those are the tradeoffs you face with this sort of thing.
Surely you understand this, though, if you're an optimisation guru - that, you know, it's all about tradeoffs? Or perhaps, if you're one of those who's all about optimisation whatever the cost in man-hours and however negligible the benefits, then you don't understand that it's all about picking the right balance.
So no, don't "MOD PARENT DOWN!!!". You have a rose-tinted view of an era when all software was ultra-optimised by super non-lazy ninja programmers. I remember it more as an era when software still took longer to load and performed far more poorly than it does now, crashed far more often in far more fatal manners, had far more dangerous security flaws like root access exploits rather than just SQL injection exploits, and where usability was out the window as you had to spend hours configuring your system to even get it to run a game or whatever.
I don't think the past was really as rosy as you think.
Re: (Score:3)
It doesn't matter if vendors write the device drivers; the device drivers still have to interface with the OS, and that means the OS providing services the drivers can interact with to, you know, do something useful.
Apple has no qualms about deprecating support for older or obscure hardware whilst Windows always does its best to cope. Windows also supports hardware you see in industry which Apple would never dream of bothering to support. Both have their pros and cons - MacOS X is sleeker and faster as a re
Re: (Score:2)
If you call "installing Win7 by choosing the default options and attaching to a local domain" screwing it up, I guess I screwed up by choosing M$ products...
Re:MOD PARENT DOWN!!! (Score:4, Informative)
That's your problem, right there: "attaching to a local domain". Windows does piles of things when attached to a domain that it otherwise doesn't do. It seems slow, but most likely it is a bunch of network timeouts waiting for something that will never happen.
Quite simply proven, really. Put in the wrong password on a non-domain computer, and it comes back instantly. Do the same on a domain computer, and time it. It first has to check whether the domain controller is there, then whether there is a new password, and then fall back on the locally cached hash.
It is also constantly sending out device discovery information, publishing and receiving info about who has printers and such, and on startup this information has to be collated from scratch (or so the OS thinks).
You can look into administration a little and optimize your startup to stop doing some of these things, which I would recommend even if you don't care about speed.
39K ? Luxury! (Score:4, Insightful)
Back in my day we had BASIC running on a 2K Altair. Kids these days don't know the meaning of a kilobyte.
Re: (Score:3)
> Kids these days don't know the meaning of a kilobyte.
Shouldn't that be a kibibyte?
Re:39K ? Luxury! (Score:5, Funny)
Only if you enjoy choking on penises.
Re:39K ? Luxury! (Score:4, Insightful)
No, it most certainly should not. That forced nomenclature is worse than what it ostensibly tries to solve.
Re: (Score:3)
How? It resolves ambiguity.
It doesn't, because most people won't use a retarded name like 'kibibyte' or whatever the heck it is. So when Joe User says they have four gigabytes of RAM in their PC, you still have to know that they mean four gibibytes and not four billion bytes.
It's a dumb idea, the name sounds like some kind of metrosexual bar snack, and it's increased ambiguity because you no longer know what people mean when they say 'gigabyte'.
Re: (Score:2)
(which is very sad, I agree)
When I have children, they'll only have access to a C64 with a Retro Replay or possibly a Chameleon Cart. That way, they will learn how a computer actually works, unlike kids (those under the age of 30) today
Re: (Score:3)
Blasphemy. My son's first project will be a paperclip computer. I have the original book and everything :)
http://lab16.wordpress.com/2009/01/29/paperclip-computer/ [wordpress.com]
Re: (Score:2)
No, they will learn how an obsolete computer works, if they even bother with it, which they almost assuredly will not.
I have a cousin in high school who in his spare time develops video games for obsolete computers because making graphics for them is a lot easier than making graphics for a 3D engine.
Re:Quite sad how bloated everything is (Score:5, Insightful)
How easily we overlook the difference between "bloated" and "quantity of useful information".
Just the words on this page (no markup, no graphics, and after a few comments) would have exceeded the capacity of your beloved 5-1/4 floppy. That's only the raw information, without bloat.
My first screen (a DECScope) had 12 lines x 80 columns each (I couldn't afford the 2K RAM that would have given me 24 x 80.) The screen I'm reading this on can display over 2 million RGB pixels. Calling things "bloat" is like telling me I should honor a display that's less than the size of the "close [X]" icon, because 12x80 isn't "bloat".
By the same twisted logic, Turbo Pascal itself was bloatware, and I thought it produced horribly slow and big code. Assemblers were where the real efficiency lay, and they were a lot smaller than 39K.
Nostalgia is fine. But leave it in the past.
Re: (Score:2)
Just the words on this page (no markup, no graphics, and after a few comments) would have exceeded the capacity of your beloved 5-1/4 floppy. That's only the raw information, without bloat.
I dunno, sometimes there seems to be a lot of bloat even in the raw data around here.
Re: (Score:2)
I've been thinking the exact same thing.
I saw the comment where someone used some video game that fit on two floppies back in 198x as an example of how programmers forgot how to optimize. Tell that to the developers of Rage [wikipedia.org] (a game that comes on 3 DVDs), who manage to stream hundreds of megabytes of texture data at 60 frames per second.
Stylized games need less VRAM bandwidth (Score:2)
Tell that to the developers of Rage (a game that comes on 3 DVDs) who manage to stream hundreds of megabytes of texture data at 60 frames per second.
Perhaps the point is that an art style that requires "stream[ing] hundreds of megabytes of texture data at 60 frames per second" isn't the only art style for a fun video game.
Re: (Score:3)
Turbo Pascal itself was bloatware, and I thought it produced horribly slow and big code.
Different memories here, and Wikipedia gives support: "The Turbo name alluded to the speed of compilation and of the executables produced. The edit/compile/run cycle was fast compared to other Pascal implementations because everything related to building the program was stored in RAM, and because it was a one-pass compiler written in assembly language. Compilation was very quick compared to that for other languages (e
Re: (Score:3)
Just the words on this page (no markup, no graphics, and after a few comments) would have exceeded the capacity of your beloved 5-1/4 floppy.
Huh? Have you read any documents that are 1.2 million characters long? There are about 2000 characters per page of standard text. A plain text document that would fill a floppy would be about 600 pages long. Generously assuming that one comment is about one page of text, that's a lot of comments - somewhat substantially more than "a few." Not that many complete threads on Slashdot get 600 comments.
So, no, your hyperbolic statement is incorrect.
Re: (Score:2)
TI-89 (Score:2)
Re: (Score:2)
Useful has to be taken in context - even if TP is lame by today's standards, it was state of the art for its time; it gave you a competitive edge when you used it.
Was a time when a hand-axe was a prized tool because poorly sized or shaped rocks just couldn't get the job done as quickly.
Re: (Score:2)
hardware is relatively cheap, so too few people care about optimizing the size of stuff
One might come up with a theory that postulates that marketing has developed a toolset to counter Moore's law, with secretly handing out incentives to keep code bloated being one such evil.
CC.
Re: (Score:3)
You could argue that software uses more memory and cpu-cycles, but apart from that how has quality decreased? Letting Java (or any other language/libraries) do stuff for you decreases the number of errors you make. Surely using a library is better than everyone writing their own string comparison algorithms? People make mistakes, for every n lines of code not written, a bug is averted.
You seem to hate Java but it's not that much slower than other languages such as C despite the 'bloat'. http://shootout.alio [debian.org]