Programming

Donald Knuth Worried About the "Dumbing Down" of Computer Science History

An anonymous reader writes: Thomas Haigh, writing for Communications of the ACM, has an in-depth column about Donald Knuth and the history of computer science. It's centered on a video of Knuth giving a lecture at Stanford earlier this year, in which he sadly recounts how we're doing a poor job of capturing the development of computer science, which obscures vital experience in discovering new concepts and overcoming new obstacles. Haigh disagrees with Knuth and explains why: "Distinguished computer scientists are prone to blur their own discipline, and in particular a few dozen elite programs, with the much broader field of computing. The tools and ideas produced by computer scientists underpin all areas of IT and make possible the work carried out by network technicians, business analysts, help desk workers, and Excel programmers. That does not make those workers computer scientists. ... Computing is much bigger than computer science, and so the history of computing is much bigger than the history of computer science. Yet Knuth treated Campbell-Kelly's book on the business history of the software industry (accurately subtitled 'a history of the software industry') and all the rest of the history of computing as part of 'the history of computer science.'"
  • Sheeit, journalist (Score:3, Insightful)

    by Anonymous Coward on Friday December 26, 2014 @11:26AM (#48675663)

    When you find yourself disagreeing like this with Don Knuth, of all people, and essentially calling him a myopic old coot, you had better check yourself. The article doesn't even try to engage with counterarguments to its own position.

    • by bmajik ( 96670 ) <matt@mattevans.org> on Friday December 26, 2014 @12:15PM (#48675903) Homepage Journal

      If you look up "appeal to authority" logical fallacy, there is an exception for Donald Knuth:

      "footnote: It is never fallacious to properly cite Donald Knuth in lieu of providing your own argument."

    • I thought the article unfairly patronized Knuth too, but it makes an interesting point -- that if the kind of history Knuth wants to see is to continue, it will have to be done in Computer Science departments. It also provides examples of other disciplines, like mathematics, that have made space for historians internally.

      I'd love to see this happen, but I don't know if it's politically viable within the academy.

  • Knuth is right. (Score:5, Insightful)

    by Anonymous Coward on Friday December 26, 2014 @11:29AM (#48675667)

    One of the problems this causes is the lack of appreciation for the mathematics that defines computer science, and computers.

    The end result is politicians making stupid laws and judges making stupid rulings...

    With stupid patents on software being the stupid result.

    • One of the problems this causes is the lack of appreciation for the mathematics that defines computer science

      How does mathematics define computer science?

      • Re: (Score:3, Informative)

        by Anonymous Coward

        Discrete mathematics is the basis for computing

        • by ShanghaiBill ( 739463 ) on Friday December 26, 2014 @01:21PM (#48676291)

          Discrete mathematics is the basis for computing

          Not at the semiconductor junction level. It is quantum wave functions at that level. Talking about computers as discrete devices, and ignoring the quantum physics, is just dumbing it down. Kids should not be learning programming until they can independently derive both Schrödinger's equation and Heisenberg's matrices.

          • by Anonymous Coward on Friday December 26, 2014 @01:58PM (#48676489)

            The physics does NOT define Computer Science. Computer Science has nothing that depends on transistors, or tubes, or levers and gears.

            Computers can be designed and built, and computing performed, at many different levels of physical abstraction.

            You can do computer science all on paper, for fuck's sake.

            Ever heard of this guy called Alan Turing?

            Knuth is right; the ignorance, even among technical people, is astounding.
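
            To make the paper-and-pencil point concrete, here is a tiny Turing-machine simulator in Python (my own sketch, not part of the original comment): the entire "computer" is just a table of rules and a strip of tape, which you could execute by hand on paper.

                # Minimal Turing machine simulator: computation without any particular hardware.
                # The example machine flips every bit on the tape, then halts.
                def run(rules, tape, state="start", head=0):
                    while state != "halt":
                        symbol = tape[head] if head < len(tape) else "_"
                        state, write, move = rules[(state, symbol)]
                        if head == len(tape):
                            tape.append("_")
                        tape[head] = write
                        head += 1 if move == "R" else -1
                    return tape

                flip_bits = {
                    ("start", "0"): ("start", "1", "R"),
                    ("start", "1"): ("start", "0", "R"),
                    ("start", "_"): ("halt", "_", "R"),
                }

                print(run(flip_bits, list("0110")))  # ['1', '0', '0', '1', '_']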

            • Not to mention the lack of humor.
          • Re: (Score:3, Insightful)

            by Mr. Slippery ( 47854 )

            Discrete mathematics is the basis for computing

            Not at the semiconductor junction level.

            You are confusing computing with computers. Indeed, a "computer" used to be a human being implementing algorithms with a mechanical adding machine; then computers were tube-based electrical systems, and in the future they may use something wholly other than semiconductors. Computing, however, remains the same. A bubble sort is still a bubble sort.
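
            For instance, here is that bubble sort as a minimal Python sketch (my own illustration, not part of the original comment) - the same procedure whether it is carried out by vacuum tubes, a modern CPU, or a patient clerk with pencil and paper:

                def bubble_sort(items):
                    # Repeatedly swap adjacent out-of-order pairs until a full pass makes no swaps.
                    swapped = True
                    while swapped:
                        swapped = False
                        for i in range(len(items) - 1):
                            if items[i] > items[i + 1]:
                                items[i], items[i + 1] = items[i + 1], items[i]
                                swapped = True
                    return items

                print(bubble_sort([3, 1, 2]))  # [1, 2, 3]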

          • by plopez ( 54068 )

            You do not need electronics to compute or even build computers. You can use water, electrical circuits, wooden blocks, etc. But in any case, the Mathematics does not change. Just the implementation.

            Electronics are only the substrate on which we (currently) perform (most) computing, not the computation itself. They're not even the first substrate we've ever used for such things: ENIAC is widely credited as the first all-electronic computer, but hybrid electromechanical devices had been in use for several decades before then. Purely mechanical devices have also been used in limited capacities, even if we discount the works of Charles Babbage (since his Analytical Engine was never actually built).

            But e

      • Re: (Score:2, Informative)

        by Anonymous Coward

        It is the mathematics that IS the computer. Studies on math proofs, generation of ballistics tables, navigation tables...

        And the realization that the ideal person to carry out mathematical steps is someone without imagination, who can't be distracted, and who doesn't lose their place... In short, a "mechanical calculator"... Thus the invention of calculators - going back to the abacus (1300 BC), Napier's Bones for multiplication (1612 AD), slide rules (1622 AD), the first mechanical adding machine (Pascal, 1642), the Le

      • Nobody has defined "computer science" in a clear way, at least not with any kind of consensus behind it.

        • The entire field is a bit young, and I always thought 'computer science' was a bit of an awkward term. But IME everyone agrees that it's the study of algorithms and their implementation in computing devices. It's basically applied math.
          • by tlhIngan ( 30335 )

            The entire field is a bit young, and I always thought 'computer science' was a bit of an awkward term. But IME everyone agrees that it's the study of algorithms and their implementation in computing devices. It's basically applied math.

            It's more a misunderstanding of science versus engineering. Computer science is like the other sciences - biology, physics, chemistry, etc. They're concerned with studying the theory of their branch of science, so for computer science, it's about computability - can you do something? And in what kind of time/space constraints?

            • But a purely theoretical discipline shouldn't include the word 'computer', since that's an actual physical device, and the design of actual physical devices is the domain of engineering. That's why I think the term is awkward. The theoretical part is covered by disciplines like informatics, cybernetics, and algorithmic complexity theory. I'm not sure we even need an umbrella term for them. Many other disciplines are used in the design of computing devices; those that I named appeared in response to the demands of designing computing devices historic
            • by Tablizer ( 95088 )

              can you do something? And in what kind of time/space constraints?

              But the practical bottleneck is usually human (coder) grokking, not direct physics. And the human mind is poorly understood. It's thus a field closer to psychology than to chemistry. Some want "Computer Science" to shy away from mind-centric issues, which would keep it "pure" but probably less useful as a tradeoff.

      • Re:Knuth is right. (Score:5, Insightful)

        by beelsebob ( 529313 ) on Friday December 26, 2014 @01:31PM (#48676357)

        If you don't know how mathematics defines computer science, then you need to go and study more computer science.

      • Re:Knuth is right. (Score:5, Informative)

        by swilly ( 24960 ) on Friday December 26, 2014 @05:28PM (#48677503)

        Computer Science is a pretty broad area of study, but I consider these three problems to be the most fundamental.

        Computability: What can a computer do and what can it not do? There are an uncountably infinite number of problems that a computer cannot solve (undecidable), and only a countably infinite number of problems that a computer can solve (decidable). Fortunately, most of the interesting problems are decidable.

        Complexity: If a computer can do something, how efficiently can it be done? This goes beyond the Big O you are taught as an undergraduate, and considers complexity classes such as P, NP, PSPACE, EXPTIME, and so on. It also considers not only computation time but also space (unfortunately, few undergraduates are introduced to the space constraints of algorithms; a great interview question is to name a sorting algorithm that takes constant extra space).
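
        To make that interview question concrete, here is a minimal Python sketch of heapsort (my own example, not part of the original comment): it sorts in place in O(n log n) time while using only O(1) auxiliary space.

            def heapsort(a):
                # In-place heapsort: O(n log n) time, O(1) auxiliary space.
                def sift_down(start, end):
                    root = start
                    while 2 * root + 1 <= end:
                        child = 2 * root + 1
                        if child + 1 <= end and a[child] < a[child + 1]:
                            child += 1
                        if a[root] < a[child]:
                            a[root], a[child] = a[child], a[root]
                            root = child
                        else:
                            return

                n = len(a)
                for start in range(n // 2 - 1, -1, -1):   # build a max-heap
                    sift_down(start, n - 1)
                for end in range(n - 1, 0, -1):           # move the max to the end, shrink the heap
                    a[0], a[end] = a[end], a[0]
                    sift_down(0, end - 1)

            xs = [5, 2, 9, 1]
            heapsort(xs)
            print(xs)  # [1, 2, 5, 9]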

        Equivalence: Given two algorithms, do they perform the same computation, meaning that given the same inputs they will always produce the same outputs (and yes, side effects are also outputs)? A less strict question (but one of more practical importance) is whether or not a program meets a specification.

        Computability and complexity are both important parts of the theory of computation, which is usually built on top of Language Theory, which is itself built on top of Set Theory. The hardest problem in modern mathematics may be P = NP, which is also a Computer Science problem. The third problem requires creating mathematical proofs using Formal Logic. It is also an excellent example of an undecidable problem, meaning that there is no general algorithm that can perform it for every program (in other words, it's something that a computer cannot do).
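
        As an illustration of why, here is the classic diagonalization argument sketched in Python (my own example; halts() is a hypothetical oracle, not a real function): if a total, always-correct halting decider existed, the program below would halt exactly when it does not, a contradiction. Equivalence checking is at least as hard, since deciding whether a program is equivalent to one that loops forever amounts to deciding that it never halts.

            def halts(program_source, input_data):
                # Hypothetical oracle: True iff the given program halts on the given input.
                # The argument below shows that no total, always-correct version can exist.
                raise NotImplementedError("assumed only for the sake of contradiction")

            def paradox(src):
                # Loop forever exactly when the oracle says 'src run on itself' halts.
                if halts(src, src):
                    while True:
                        pass

            # Feeding paradox its own source code yields the contradiction:
            # paradox(source_of_paradox) halts  <=>  halts(...) returned False
            #                                   <=>  paradox(source_of_paradox) does not halt.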

        In addition to Set Theory and Formal Logic, Computer Science relies heavily on Boolean Algebra, Graph Theory, and other areas of Discrete Mathematics. Computer Science is inherently cross-disciplinary, but at its core it is closer to Mathematics than it is to Engineering or Science.

        • In addition to Set Theory and Formal Logic, Computer Science relies heavily on Boolean Algebra, Graph Theory, and other areas of Discrete Mathematics. Computer Science is inherently cross-disciplinary, but at its core it is closer to Mathematics than it is to Engineering or Science.

          You miss the parts that are very close to Linguistics and Information science: Ontologies, Information retrieval, Semiotics, and the all-important Human-Computer Interaction - how to build a computation environment that's efficient for humans to interact with. Maybe this is not a well-defined problem in a mathematical sense, but it's at the core of all programming activity beyond the level of micro-instructions.

          This is not merely cross-disciplinary work; those are also essential parts of the science of comp

    • Re:Knuth is right. (Score:5, Interesting)

      by Penguinisto ( 415985 ) on Friday December 26, 2014 @12:29PM (#48675963) Journal

      One of the problems this causes is the lack of appreciation for the mathematics that defines computer science, and computers.

      The end result is politicians making stupid laws and judges making stupid rulings...

      With stupid patents on software being the stupid result.

      Umm, dunno how else to say it, but honestly? Ignorance of mathematics isn't the cause of stupid laws and policy around technology; lobbyist money, bullshit ideological agendas, and self-serving BS flowing from big tech corporations would be your most likely sources for that.

      I'm perfectly willing and eager to be proven wrong on this, but I figure in the list of causes? Ignorance of CompSci-oriented mathematics is waaaaaaaaaaaaaay down on the list of causes for stupid governmental tech policy, somewhere around "Clippy".

      • by plopez ( 54068 )

        "Ignorance of mathematics isn't the cause of stupid laws and policy around technology..."

        I agree somewhat, but I think ignorance allows bad policy to exist. Some of what you hear coming from hucksters would, IMO, be akin to repealing the laws of gravity or building a warp drive. I am very skeptical of things such as facial recognition, trawling email to stop terrorist attacks, data mining, anything that handles data which is not ACID compliant, missile defense, and AI (AFAIAC it will never truly exist) thou

  • by patniemeyer ( 444913 ) <pat@pat.net> on Friday December 26, 2014 @11:30AM (#48675673) Homepage

    I get the gist of it but the summary is so mangled that it doesn't really make much sense.

  • by careysub ( 976506 ) on Friday December 26, 2014 @11:34AM (#48675701)

    Which is: there are no good technical histories of computer science.

    Read TFA - he spends the majority of the article explaining in detail why Knuth is right - that there are indeed no good technical histories of computer science, and little prospect of any.

    Where Haigh takes issue with Knuth is in arguing that the histories of computers and software, which are not technical histories, are nonetheless valuable in their own right, and thus Knuth's dismay at their publication is misplaced. But he otherwise agrees with what Knuth has to say.

    • by Dracos ( 107777 ) on Friday December 26, 2014 @12:59PM (#48676173)

      there are indeed no good technical histories of computer science, and little prospect of any.

      I see the posthumous reactions to Steve Jobs and Dennis Ritchie as indicators that Knuth is absolutely right. Jobs, who was essentially just a marketing asshole, gets every manner of fanfare commemorating his "world-changing" achievements. Ritchie on the other hand is almost completely ignored in the media, even though he is one of the giants upon whose shoulders Jobs undeservingly stood.

      I bet anyone here would agree that co-authoring UNIX is a far more important event than being the iPod/iPhone taskmaster.

      • by Anonymous Coward

        Jobs, who was essentially just a marketing asshole

        I don't want to read your history books. You're doing exactly the thing the article is complaining about.

      • by plopez ( 54068 )

        " far more important event than being the iPod/iPhone taskmaster"

        I am actually going to argue that point with you. From an applications POV, iPods, iPads, iPhones, and OS X took the capabilities of Unix, and a host of other technologies, and leveraged them into something with a huge impact on social order, business, art, and science. Having Unix at the ready was important. But creating those products took the power of Unix into a whole new realm.

        By analogy, knowing 2 Ca3SiO5 + 7 H2O ---> 3 CaO.2SiO2.4H2O

        • You are saying Jobs = Apple, which is not true. Jobs without Wozniak and thousands of other engineers and scientists would have been a soda-pop seller... In other words, he created nothing and took credit for the creations of other people. Google Jobs' insistence on having his name on every Apple patent...

          Now, Ritchie, he co-authored Unix. He did not manage the creation of Unix. And he created the programming language C. Personal accomplishments.

          See the difference?

          • Apparently, Jobs was a bullying perfectionist. Without him, the excellent products Apple created would not have existed anywhere. Whether he damaged people's lives in the process is something I don't know, but there are bosses out there with his personality and practices that have left behind human mental carnage.
          • by Rakarra ( 112805 )

            You are saying Jobs = Apple, which is not true.

            Perhaps, but Apple without Jobs in the 80s and 90s was not a pretty sight. Apple right now is stagnating, riding upon past accomplishments.

    • Yep, the whole article is basically "we couldn't make any money if we actually wrote history about the thing you're interested in, so... tough tits"

      • Yep, the whole article is basically "we couldn't make any money if we actually wrote history about the thing you're interested in, so... tough tits"

        Which isn't true... or rather, maybe they couldn't, but a more competent writer surely could.

  • by cruff ( 171569 ) on Friday December 26, 2014 @11:36AM (#48675709)
    Seems to me he's already sold out. :-) http://www.ibiblio.org/Dave/Dr-Fun/df200002/df20000210.jpg [ibiblio.org]
  • by Anonymous Coward on Friday December 26, 2014 @11:36AM (#48675711)

    Steve Jobs invented computers, smartphones, innovation, and minimalism.

    What more does one need to know about computer science history?

    • Steve Reich invented minimal music. Steve Jobs just invented the second record company called Apple (the first one being invented by The Beatles - nowhere near as historical as Sun Records). Alan Turing invented the Turing Machine - after Konrad Zuse had built a working computer. Al Bell is credited with the telephone. Innovation was patented by the biblical author of Ecclesiastes.

  • This doesn't seem like a tough call. I have four volumes of Knuth on my shelf (just found 4A existed, so its cover is still pretty fresh), and I refer to them frequently. Even the oldest ones (though I did buy a fresh copy of Vol 1 after it was updated). It's my first stop when I need to start researching an algorithm, and often I don't need to go further.

    OK, now Thomas Haigh. Googled him. Checked his credentials. PhD dissertation in the sociology of computer science. Umm, OK. Think I'll go analyze th

    • by methano ( 519830 )
      Give the poor guy a break. He's got a PhD in the sociology of computer science. He's got to do something with that degree. This seems about as useful as anything you can do with a PhD in the sociology of computer science. What else is he gonna do?
    • Anyone who venerates TAoCP like you do is a drone formatted by the establishment, an elitist, or both.

  • I'm more worried about the dumbing down of mankind in general.

  • by ripvlan ( 2609033 ) on Friday December 26, 2014 @11:53AM (#48675785)

    I returned to college several years ago after a 20-year hiatus (the first 6 years were my creative period). My first time around I studied what might be called pure Computer Science. A lot has happened in the industry in 20 years, and I very much enjoyed conversations in class - especially with the perspective of the younger generation. I found it fascinating how many kids today hope to enter the gaming industry (in my generation, Zork was popular when I was a kid and Myst was a breakout success on a new level). Kids today see blockbuster gaming as an almost get-rich experience - plus a "real world" job that sounds like fun.

    But more interesting were the concepts of Computer Engineering vs. Computer Science. What is science vs. engineering? Are software "engineers" really scientists? Do they need to learn all this sciencey stuff in order to enter the business school? I attended a large semi-well-known university. Back in the '80s the CS department was "owned" by the school of business. Programming computers was thought to be the money maker - only business really used them, with a strong overlap into engineering because computers were really big calculators. However, it was a real CS curriculum with only one class for business majors. Fast forward a dozen years and CS is now part of the engineering school (with business on its own). The "kids" wondered why they needed to study Knuth et al. when they were just going to be programming games. What about art? Storytelling? They planned on using visual creative studio tools to create their works. Why all this science stuff? (This was in a haptics class.) Should a poet learn algorithms in order to operate MS Word?

    Since computers are ubiquitous, they are used everywhere. I tell students to get a degree in what interests them - and learn how to use/program computers because...well...who doesn't use a computer? I used to program my TI calculator in high school to pump out answers to physics & algebra questions (basic formulas).

    Are those who program Excel Macros computer scientists? No. Computer Engineers? no. Business people solving real problems? Yes/maybe. The land is now wider. Many people don't care about the details of landing a man on the moon - but they like it when the velcro strap on their shoes holds properly. They receive entertainment via the Discovery Channel and get the dumbed down edition of all things "science."

    When creating entertainment - it needs to be relatable to your target audience. The down and dirty details and technicalities interest only a few of us. My wife's eyes glaze over when I talk about some cool thing I'm working on. Retell it as saving the world and improving quality - she gets it (only to politely say I should go play with the kids -- but at least she was listening to that version of events).

    I think that the dumbing down of history is ... well.. history. There was this thing called World War 2. The details I learned in grade school - lots of details. Each battle, names of important individuals. Today - lots of history has happened in the meantime. WW2 is now a bit dumbed down - still an important subject - but students still only have 8 grades in school with more material to cover.

    My brain melts when I watch the Discovery Channel. I'm probably not the target audience. The details of historical science probably interest me. The history of Computing needs to be told like "The Social Network."

    • by ledow ( 319597 ) on Friday December 26, 2014 @01:23PM (#48676295) Homepage

      It's even simpler than that.

      Computers are a tool. That's what they were designed to be, that's what they are. You can use them, or not.

      Computer science isn't about using a tool. It's about creating a tool that's useful, and enhancing existing tools.

      Every idiot can pick up a hammer and bash a nail in. Not everyone could forge a hammer-head, wedge it strongly enough into a well-prepared wooden handle, etc.

      That you can use the tools made by others to get rich - it's undeniable. It's also very rare and down to little more than chance. And gaming is the one that attracts young minds because they are ALL users of games and games devices.

      But there was a bucket of clone games around both before and while things like Minecraft, Angry Birds, etc. were being developed.

      The programmer who wrote the map editor for Half-Life probably couldn't put a level together. But a 3D artist can take that tool and slap a level together even if he doesn't really understand what a shader is. They're two entirely separate areas that people STILL confuse.

      Want to play games? Go ahead. You just need a computer. Want to write games? You have to become a coder, or use the tools other coders have written for you. Want to write the tools? You have to be a coder. Working in IT in schools, I get a lot of parents telling me their kids are "good with computers" and should be in the top IT classes, etc., and asking what university they should go to to write games. I advise all of them against it when they come from that angle, because immediately my first question is: Have you ever written one? No? Then find another career path. Or go away, write one, come back in six months and ask me again.

      The parents get miffed, but they are the ones that have come to me for the advice. And yet, the ones who COULD make it in computer science, they don't need to ask. They know where they're heading. They can knock up something in an afternoon or tell you how to go about it.

      Using the tools can be a skill. I wouldn't want to be up on an oil rig handling some specialist device to build the platform, and it probably takes years of on-the-job and other training to do it properly and safely. But the guy who designed it? You'll probably never see him. If he turns up on the oil rig, it's in a hardhat and business suit to look at the job, and then he's gone.

      Everyone can use a basic tool. Some can use a complex tool skilfully. Others can design and make the tools in the first place. It applies to all walks of life and all careers, though, not just IT.

      You can no-doubt drive a car. But you'll never win a rally no matter how good you think you are. And though you might be able to cobble together parts to make something that moves, to build and design the car to similar specifications from nothing takes decades of experience and a high level of skill.

      You can no-doubt browse the web on your computer. But you'll never run your own network effectively. And though you can cobble together parts to make something that works, to build and design the chips, the protocols, the electrical specifications, etc. takes decades of experience and a high level of skill.

      You can no-doubt play some kind of instrument. But you'll never be a concert performer. And though you can cobble together parts to make something that makes a good sound, to build and design and PLAY the instruments properly takes decades of experience and a high level of skill.

      You can no-doubt draw. But you'll never be an artist. And though you can cobble together parts to make something that looks good, to knock up a work of art takes decades of experience and a high level of skill.

      We just need to separate the idea in people's heads. Using a computer is different to "being good" with a computer, which is different to "knowing" about the computer, which is different to programming the computer, which is different to designing the computer.

      The deeper you go, the more skill and knowledge you need.

      Wor

      • Computers are a tool. That's what they were designed to be, that's what they are. You can use them, or not. Computer science isn't about using a tool. It's about creating a tool that's useful, and enhancing existing tools.

        Well said.

    • by dbIII ( 701233 )

      My brain melts when I watch the Discovery Channel

      Offtopic, but I decided to watch a "documentary" made by those guys on Charles Manson. They made him look like some evil genius instead of some loser who tried to kill people out of spite when things went wrong with his music career, got the wrong ones, and was caught by park rangers for vandalism.

  • Sorry, but when has anyone in the field been "good" at documentation? I'd say the best "history" we've got is probably just to pull all the comments off of the Linux code, or the dev groups, but that wouldn't be safe for work. If someone were to look at the comments in any major program, you'd probably come to the conclusion that we're all mental patients and criminals being hired out of some asylum. Best to leave the history as Lovelace, then Turing, and then nothing but chaos.
    • Sorry, but when has anyone in the field been "good" at documentation?

      Knuth himself, ever since he wrote WEB for literate programming.

  • Since the mid-2000s I feel like I've been seeing a lot more BFI solutions, BAD BFI solutions, than I did back in the '90s. I guess back then you had to use some finesse in your programming to get the performance you needed out of the system. Either that or I'm working with more bad developers lately. I suppose that's also possible.
  • it's everywhere (Score:3, Insightful)

    by Virtucon ( 127420 ) on Friday December 26, 2014 @12:19PM (#48675913)

    we've raised at least two generations of self-obsessed, no-attention-span kids who want instant gratification. Retards like Justin Bieber, who today tweets that he bought a new plane. As the later generations grow into the workforce and into fields like journalism, history, and computer science, it's no wonder they want to reduce everything down to one-liners or soundbites. Pick your field, because these kids started with censored cartoons and wound up with SpongeBob. Shit, even the news is now brokered into short paragraphs that just say "this shit happened, now onto the next.."

    Screw that! Yeah I'm getting older so get the fuck off my lawn!

    • we've raised at least two generations of self obsessed, no attention-span kids who want instant gratification.

      As a Gen X'er, I blame the Baby Boom generation for today's mess. It'll get worse as this sorry lot retires and discovers that life doesn't owe them squat.

    • If I bought a plane, I'm pretty sure I would at least send a little word about it to my friends and family. That would be a major life event.

  • Look no further than the web "development" industry, which seems to be all the media and colleges can conceive of programming entailing. It generally employs people who've never even opened a half-decent computer science book, never mind read one. More often than not they've done some irrelevant degree or qualification (social "science", graphics design, whatever) that just about proves they can walk upright and breathe through their noses at the same time, and somehow they manage to end up "coding" HTML and

    • I have a two-year A.S. degree in computer programming, which required one web "development" course in HTML. Since this was an online course with no hard deadlines, I waited until the very last day to complete all the assignments in six hours before taking the final exam. Since I had taught myself HTML with a text editor, acing the exam was a breeze. My only complaint was that all classes were taught in Java since the school couldn't afford to renew the Microsoft site license for Visual Studio. The Linux inst

  • History is what the winners make it. So what to believe and whom to believe is irrelevant - what matters is whether you were _there_ and saw it happen - that is where real history comes from.
    • by donaldm ( 919619 )

      Was easier for me to read.

      The article would be a less onerous read if the web designer had a basic understanding of typesetting. I will give him a hint.

      1. Use a serif font. Sans serif fonts are OK for captions, not sentences.
      2. Make your writing black on white, not grey on white, since it's easier to read.
      3. Have white space margins to the left and right of your web page. Again, easier to read.
      4. The PDF and digital copies are OK, but a little thought could make them much more readable. As an example, do one column or, if you really need column
      • This will take care of your first two suggestions:

        Right-click on the page; from the pop-up menu choose Web Developer -> CSS -> Disable Styles -> Disable All Styles.

        (YMMV depending on your browser and your installed extensions.)

        An alternative is to edit their CSS to address the first three of your points.

        My point is: there are plenty of tools you can use to make web pages appear how you would like them to appear, such as: browser extensions; custom style sheets (e.g., Firefox's userContent.css); an

    • Gaga comes, the audience is mostly women.
      Knuth comes, the audience is mostly men.

      Clearly, Google's pro-woman hiring strategy is not having the desired effects.

  • There's a gem of a documentary about the history of computing before the web.

    The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced.
    It's a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players — several of whom have passed away since filming.

    Episode 1 featured interviews with, among others:
    Paul Ceruzzi (computer historian), Doron Swade (London Science Museum), Ko

    • P.S. if the BitTorrent tracker doesn't work, edit the Torrent and use udp://open.demonii.com:1337

  • Donald Knuth Worried About the "Dumbing Down" of Computer Science History

    Whether CS education is appropriate for all people who do computer-assisted technical work is largely irrelevant to me, since practical forces in real life simply settle that issue.

    The problem I care about is a problem I have seen in CS for real: I've met quite a few CS grads who don't know who Knuth, Lamport, Liskov, Hoare, Tarjan, or Dijkstra are.

    If you (the generic CS grad) do not know who they are, how the hell do you know about basic CS things like routing algorithms, pre- and post-conditions, data structures,

  • Every bloody math book blah-blahs about the great figures of math, and I am sure that I have heard the story of Gauss summing 1 to 100 in grade school 5050 times. I have pretty much zero interest in who figured out the for loop, and pretty much zero interest in holding Knuth up as some kind of Euler and forcing generations of programmers to learn even how to pronounce his name. Much of computer science happened because it was ready to happen.

    Even in science many people were just doing the right work in th
    • Knuth did a nice job of articulating why he wants to look at the history of things at the beginning of the video. Those reasons might not resonate with you but he does have definite reasons for wanting technical histories (not social histories which pander to "the stupid") to be written.

"For the love of phlegm...a stupid wall of death rays. How tacky can ya get?" - Post Brothers comics

Working...