Programming Education

Back To 'The Future of Programming'

Posted by Soulskill
from the coding-at-88-mph dept.
theodp writes "Bret Victor's The Future of Programming (YouTube video; Vimeo version) should probably be required viewing this fall for all CS majors — and their professors. For his recent DBX Conference talk, Victor took attendees back to the year 1973, donning the uniform of an IBM systems engineer of the times, delivering his presentation on an overhead projector. The '60s and early '70s were a fertile time for CS ideas, reminds Victor, but even more importantly, it was a time of unfettered thinking, unconstrained by programming dogma, authority, and tradition. 'The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor. 'Because once you think you know what you're doing you stop looking around for other ways of doing things and you stop being able to see other ways of doing things. You become blind.' He concludes, 'I think you have to say: "We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'"
This discussion has been archived. No new comments can be posted.


  • by Joe_Dragon (2206452) on Friday August 09, 2013 @01:44PM (#44523661)

    Time for real apprenticeships in tech and not years of theory?

  • Re:Short version? (Score:4, Interesting)

    by Alwin Henseler (640539) on Friday August 09, 2013 @01:52PM (#44523763) Homepage

    You must be new here. That "pretentious philosophical BS" is like the spark in a fuel-and-oxygen filled chamber. It ignites into a heap of comments, and those comments are the actual story. Who needs an article when you can browse +5 funny / informative / interesting and -1 trolls?

    As for the linked articles, that's just a cleverly disguised DDoS botnet setup. Some figured it out, but few seem to care the /. botnet is still operating. Heck, I'm even contributing people-time to it (on top of CPU cycles).

  • Re:70s yeah right! (Score:5, Interesting)

    by phantomfive (622387) on Friday August 09, 2013 @01:59PM (#44523847) Journal

    The future of programming, from the seventies, it's all hippie talk...

    What you don't understand is that around 1980, with the microcomputer, computer engineering got set back decades. Programmers were programming with toggle switches, then stepped up to assembly, then started programming with higher-level languages (like C). By the '90s objects started being used, which brought the programming world back to 1967 (Simula). Only now are mainstream languages starting to get first-class functions. What a concept; where has that been heard before?

    Pretty near every programming idea that you use daily was invented by the 80s. And there are plenty of good ideas that were invented back then that still don't get used much.

    My two favorite underused (old) programming ideas:

    Design by contract.
    Literate programming.

    If those two concepts caught on, the programming world would be 10 times better.
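
    Design by contract can be approximated even in plain C, with assert() standing in for real language support (this is a minimal sketch; find_char is a hypothetical function, and Eiffel-style contracts offer much more, like inherited invariants):

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* Precondition: s must not be NULL (caller's obligation).
     * Postcondition: result is -1, or s[result] really is c
     * (function's obligation). */
    static int find_char(const char *s, char c)
    {
        assert(s != NULL);                 /* precondition */
        const char *p = strchr(s, c);
        int idx = p ? (int)(p - s) : -1;
        assert(idx == -1 || s[idx] == c);  /* postcondition */
        return idx;
    }

    int main(void)
    {
        printf("%d\n", find_char("hello", 'l')); /* prints 2 */
        printf("%d\n", find_char("hello", 'z')); /* prints -1 */
        return 0;
    }
    ```

    The contracts double as checked documentation: the caller knows exactly what it owes the function and what it gets back.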

  • by bAdministrator (815570) on Friday August 09, 2013 @02:02PM (#44523901)

    .. is that C was seen as a major setback by Frances E. Allen and others.

    It [C] was a huge setback for--in my opinion--languages and compilers, and the ability to deliver performance, easily, to the user.

    Source:
    Frances E. Allen
    ACM 2006 Conference
    http://www.youtube.com/watch?v=NjoU-MjCws4 [youtube.com]

    The context here concerns abstractions and not letting users (programmers) play with pointers directly (in C and, later, C++), which is a setback for optimization, because raw pointers tie your code to assumptions about the underlying machine that the compiler can no longer see through.

    If you want to learn more about the ideas of the 1960s and 1970s, I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell), Carl Hewitt (actor model), Dan Ingalls, Frances E. Allen (programming language abstractions and optimization), Barbara Liskov ("data OOP" which is C++ in a nutshell), and don't stop there.
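
    Allen's point about pointers hindering optimization can be seen in miniature with C99's restrict qualifier; a minimal sketch (hypothetical function names, and the effect is only visible in generated code):

    ```c
    #include <stdio.h>

    /* Without restrict, a and b may alias: a store to a[i] could
     * change *b, so the compiler must reload *b every iteration. */
    void scale_may_alias(int *a, const int *b, int n)
    {
        for (int i = 0; i < n; i++)
            a[i] = a[i] + *b;
    }

    /* With restrict, the programmer promises no aliasing, so the
     * compiler is free to hoist *b into a register for the whole
     * loop; the optimizer finally gets the guarantee the language
     * otherwise denies it. */
    void scale_no_alias(int *restrict a, const int *restrict b, int n)
    {
        for (int i = 0; i < n; i++)
            a[i] = a[i] + *b;
    }

    int main(void)
    {
        int v[4] = {1, 2, 3, 4};
        int k = 10;
        scale_no_alias(v, &k, 4);
        printf("%d %d %d %d\n", v[0], v[1], v[2], v[3]); /* 11 12 13 14 */
        return 0;
    }
    ```

    In a language without raw pointers the no-aliasing guarantee is structural, which is exactly the optimization freedom Allen is talking about.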

  • by Qbertino (265505) on Friday August 09, 2013 @09:27PM (#44527917)

    I think he got it wrong why we got lost.

    It's not because we didn't or don't know. It's because software was free back then. Hardware was so bizarrely expensive and rare that no one gave a damn about giving away software and software ideas for free. It's only when software was commercialised that innovation in the field started to slow rapidly. The interweb is where it was 18 years ago simply because, ever since, people have been busy round the clock trying to monetise it rather than ditching bad things and trying new stuff.

    Then again, x86 winning as an architecture and Unix as a software model probably has a little to do with it as well. We're basically stuck with early-'80s technology.

    The simple truth is:
    CPU and system development needs its iPhone/iPad moment, where a bold move is made to ditch decades-old concepts to make way for entirely new ones!

    Look what happened since Steve Jobs and his crew redid commodity computing with their touch-toys. Imagine that happening with system architecture; that would be awesome. The world would be a totally different place five years from now.

    Case in point: we're still using SQL (Apollo-era software technology for secretaries to manually access data; SQL is a fricking END-USER INTERFACE from the '70s!!!) as a manually built and rebuilt access layer to persistence from the app level. That's even more braindead than keeping binary in favour of ASM, as given as an example in the OP's video talk.

    Even ORM to hide SQL is nothing but a silly crutch from 1985. Java is a crutch to bridge across platforms, because since the mid-'70s people in the industry have been fighting turf wars over their patented platforms and basically halted innovation (MS, anyone?). The skeuomorphic desktop metaphor is a joke, and always has been. Stacked windowing UIs are a joke, and always have been. Our keyboard layout is a provisional relic from the steam age, from before the zipper was invented (!!). E-mail, one of the most bizarre things still in widespread use, is from a time when computers weren't even connected yet: different protocols for every little shit it does, plus bizarre, pointless, braindead and arcane concepts like the separation of MUA and editor, and separate protocols for sending and receiving. A human async communication system and protocol so bad it's outclassed by a shoddy commercial social networking site running on web scripts and browser-driven widgets. I mean, WTF??? I could go on and on ...

    The only thing that isn't a total heap of shit is *nix as a system, and that's only because everything worthwhile being called Unix today is based on FOSS, where we can still tinker and move forward with baby steps like fast no-bullshit non-tiling window managers, fully OpenGL-accelerated avant-garde UIs (I'm thinking of Blender here), a workable userland/OS separation, and a matured way to handle text-driven UI, interaction and computer control (zsh & modern bash).

    That said, I do believe that if we came up with a new, entirely FOSS hardware architecture in 2013, with a complete redo and a focus on massively parallel concurrency, and built a logic-and-constraint-driven, touch-based direct-manipulation interface system (think Squeak.org completely redone today for a modern retina touch display, *without* the crappy desktop) that does away with the filesystem/persistence separation and other ancient dead ends, we'd be able to top and drop *nix in no time.

    We wouldn't even miss it. ...

    But building the bazillionth web framework and the next half-assed X.org window manager and/or accompanying Windows clone, or redoing the same audio-player app / file manager / UI desktop toolkit every odd year from bottom to top again, appears to be more fun, I guess.

    My 2 cents.

  • by SuricouRaven (1897204) on Saturday August 10, 2013 @01:40AM (#44528877)

    Some of the problems were pointed out:
    - The device access model is still stuck in the ISA age, when peripherals were just wired up to the address and data buses. That isn't how things are done now - even the PCI-e 'bus' is actually a series of high-speed serial links. This means that all device drivers have to run in kernel memory space. Stability and security problems result.

    - The 16-bit 'real' addressing mode. Another relic of the past, but it still can't be abandoned without breaking the boot process. Lose that, and you could lose some complexity in silicon.

    - Even the 32-bit mode could arguably go. The only upside it has over 64-bit is slightly lower memory usage when there are a lot of pointers being used, and it's a real headache at the OS level maintaining two variations on every library to support both 32-bit and 64-bit programs. Lose 32-bit, and you lose a load more complexity. Also means you could lose PAE as a bonus.

    - There are opcodes for handling BCD. These are just completely pointless.

