Back To 'The Future of Programming'
theodp writes "Bret Victor's The Future of Programming (YouTube video; Vimeo version) should probably be required viewing this fall for all CS majors — and their professors. For his recent DBX Conference talk, Victor took attendees back to the year 1973, donning the uniform of an IBM systems engineer of the times, delivering his presentation on an overhead projector. The '60s and early '70s were a fertile time for CS ideas, reminds Victor, but even more importantly, it was a time of unfettered thinking, unconstrained by programming dogma, authority, and tradition. 'The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor. 'Because once you think you know what you're doing you stop looking around for other ways of doing things and you stop being able to see other ways of doing things. You become blind.' He concludes, 'I think you have to say: "We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'"
I knew it (Score:4, Funny)
Every time some of my stupid colleagues told me I was doing it wrong, I kept thinking they were closed-minded idiots.
Turns out, I was right all along!
Re: (Score:2)
Hmm (Score:5, Insightful)
On the one hand, it is a good thing to guard against constrained thinking. I work with someone who thinks exclusively in design patterns; in many cases it leads to solid code, but sometimes it's also a detriment to his work (overcomplicated designs, patterns used for the sake of patterns).
Unlearning all we have figured out in computer science is silly, though. Use the patterns and knowledge we've spent years honing, but use them as tools and not as crutches. I think that as long as you look at a problem and accurately determine that a known pattern/language/approach is a near-optimal way to solve it, that's a good application of that pattern/language/approach. If you're cramming a solution into a pattern, though, or only using a language because it's your hammer and everything looks like a nail to you, that's bad.
Re:Hmm (Score:5, Insightful)
Use the patterns and knowledge we've spent years honing, but use them as tools and not as crutches.
Having just watched this video a few hours ago (it sat in my queue for a few days; providence was seemingly on my side, having me watch it right before this story popped up), I can say he argues against this very idea.
He mentions late in the talk how a generation of programmers learned very specific methods for programming, and in turn taught the next generation of programmers those methods. Because the teaching only involved known working methods and disregarded any outlying ideas, the next generation believes that all programming problems have been solved and therefore never challenges the status quo.
Much of his talk references the fact that many of the "new" ideas in computing were actually discussed and implemented in the early days of programming. Multi-core processing, visual tools and interactions, and higher-level languages are not novel in any way; he's trying to point out that the earliest programmers had these ideas too, but we ignored or forgot them due to circumstances. For example, it is difficult to break out of the single-processing-pipeline mold when one company is dominating the CPU market by pushing out faster and faster units that excel at exactly that kind of processing.
While TFS hits on the point at hand (don't rest on your laurels), it is worth noting that the talk is trying to emphasize open-mindedness toward approaches to programming. While that kind of philosophical take is certainly a bit broad (most employers would rather you produce work than redesign every input system in the office), it is important that innovation still be emphasized. I would direct folks to the Etsy "Code as Craft" blog as an example of people who are solving problems by being creative and innovating instead of simply applying all the known "best practices" on the market.
I suppose that final comment best sums up the talk in my mind: don't rely on "best practices" as if they were the best solution to all programming problems.
Re: (Score:2)
Much of his talk references the fact that many of the "new" ideas in computing were actually discussed and implemented in the early days of programming. Multi-core processing, visual tools and interactions, and higher-level languages are not novel in any way; he's trying to point out that the earliest programmers had these ideas too, but we ignored or forgot them due to circumstances. For example, it is difficult to break out of the single-processing-pipeline mold when one company is dominating the CPU market by pushing out faster and faster units that excel at exactly that kind of processing.
I can attest to this. The phrase "Everything old is new again." (Or "All of this has happened before, and all of this will happen again." for you BSG fans) is uttered so frequently in our office that we might as well emblazon it on the door. It's almost eerie how well some of the ideas from the mainframe era fit into the cloud computing ecosystem.
Re: (Score:2)
So what's the point? They want a cookie? They want people not to use these concepts even now that they are viable bec
Re: (Score:2)
Re: (Score:3)
Designs are only complicated when they are unique. If I write my own LinkedHashMap to store 2 values, it is overcomplicated. If I just use a standard java LinkedHashMap to store 2 values, then it's the same design, but since everyone knows what a java LinkedHashMap does, it is simple. Also, it can be swapped out for a simple array with relative ease if the code is designed in a way that is maintainable.
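A minimal Java sketch of the point above (the keys and values are made up purely for illustration): lean on the standard java.util.LinkedHashMap and code against the Map interface, so the implementation can later be swapped for something simpler without touching the callers.

import java.util.LinkedHashMap;
import java.util.Map;

public class ConfigPairDemo {
    public static void main(String[] args) {
        // Everyone knows what a LinkedHashMap does: insertion-ordered key/value storage.
        Map<String, String> settings = new LinkedHashMap<>();
        settings.put("host", "localhost"); // hypothetical example values
        settings.put("port", "8080");

        // Callers depend only on Map, so swapping in another implementation
        // (or a tiny two-element wrapper) later is trivial.
        settings.forEach((key, value) -> System.out.println(key + " = " + value));
    }
}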
Even if you are using design patterns, you should be leveraging not just the knowledge that other peo
Would you happen to be an InfoSys trainer? (Score:5, Funny)
>> We don't know what programming is. We don't know what computing is. We don't even know what a computer is.
Aha - they found the guy who trains InfoSys employees.
70s yeah right! (Score:4, Funny)
The future of programming, from the seventies, it's all hippie talk...
"We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'"
Next thing we'll throw our chairs out and sit on the carpet with long hair, smoking weed and drinking beer...
Re:70s yeah right! (Score:5, Insightful)
If you aren't doing it that way already, then you're doing it wrong.
Re: (Score:2)
Re:70s yeah right! (Score:5, Interesting)
The future of programming, from the seventies, it's all hippie talk...
What you don't understand is that around 1980, with the microcomputer, computer engineering got set back decades. Programmers were programming with toggle switches, then stepped up to assembly, then started programming with higher-level languages (like C). By the 90s objects started being used, which brought the programming world back to 1967 (Simula). Now mainstream languages are starting to get first-class functions. What a concept; where has that been heard before?
Pretty near every programming idea that you use daily was invented by the 80s. And there are plenty of good ideas that were invented back then that still don't get used much.
My two favorite underused (old) programming ideas:
Design by contract.
Literate programming.
If those two concepts caught on, the programming world would be 10 times better.
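For readers who haven't run into it, here is a rough sketch of what design by contract looks like in plain Java (which, unlike Eiffel, has no native support for it); the Account class and its contract are hypothetical, and the pre/postconditions are checked with assert, so they only fire when the JVM is run with -ea.

public class Account {
    private long balanceCents;

    // Precondition:  amountCents > 0 and amountCents <= balance
    // Postcondition: balance decreases by exactly amountCents
    public void withdraw(long amountCents) {
        assert amountCents > 0 : "precondition: amount must be positive";
        assert amountCents <= balanceCents : "precondition: insufficient funds";
        long before = balanceCents;

        balanceCents -= amountCents;

        assert balanceCents == before - amountCents : "postcondition violated";
    }

    public void deposit(long amountCents) {
        assert amountCents > 0 : "precondition: amount must be positive";
        balanceCents += amountCents;
    }

    public long getBalanceCents() {
        return balanceCents;
    }
}

Languages with real contract support also check class invariants automatically and inherit contracts along with interfaces; the assert version is only an approximation.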
Re: (Score:2)
I did an entire thesis with Tangle and Weave, and I'm glad that I did, but I'm not convinced that a narrative exposition is any better than the more random-access style you get from a hierarchical directory layout with some decent (embedded and out-of-line) documentation and a viewer IDE.
Rgds
Damon
Re: (Score:3)
If you have any other ideas on the topic I'd be interested in hearing them.
Re: (Score:3)
Naturally I *have* to ultimately present the program text in a form that the computer will be happy with, but I am very hot on appropriate human-centric documentation in-line and off-line, and phrases that make me spit include:
"It's self documenting."
"Oh it's hard to keep the comments in sync with the code."
Farg me!
On a new project, I'm suffering from a slight lack of consideration for what the coder following in your footsteps would need in order to understand what was intended versus what actually happened.
Re:70s yeah right! (Score:5, Insightful)
Re: (Score:2)
imagine how much we could make those resources achieve if we used them with the attitude those people had towards their *limited* resources.
We would gain nothing; hell, we would still have god-damn teletype machines if everyone was worried about wasting nanoseconds of compute time. We would get some multiple of compute increase, but we would lose out on the exponential increases in human productivity that come from dealing with all of those abstractions automatically.
But I think we agree in that we need to focus on fixing the problems that need fixing. It's more important to figure out what you need done and then figure out whether you will h
Re: (Score:2)
Re: (Score:2)
Speaking of setting programming back, the current push in languages to get rid of declaring types of variables and parameters has set us back a few decades. In languages like Ruby, you can't say ANYTHING about your code without executing it. You have no idea what type of objects methods receive or return, whether formal and actual parameters match, or whether types are compatible in expressions, etc. I actually like a lot of aspects of Ruby, but it seems like it's thrown about 50 years of improvement in p
Re: (Score:2)
Ruby is actually rather strongly typed. Shell is far more like what you are describing.
Re: (Score:3)
Yes and no. It's true that objects have classes, but that's entirely malleable, and there's no way to look at a particular piece of Ruby code and have any idea what class an object has, unless you actually see it being created (yes, yes, even then you don't know because classes can be modified on the fly, but let's ignore that for the moment). Basically, I can't look at a method and do anything except guess what parameters it takes. Personally, I think that's a bad thing.
Re: (Score:2)
That I agree with. Ruby is strongly typed with very aggressive and implicit type conversions. Anyway, statically typed languages are dominant: C++, C#, Java; so it makes sense the alternatives are dynamic. Moreover, scripting has always been dynamic.
I can understand the virtues of static typing. IMHO dynamic typing works best in programs under 20 lines of code, works OK from 20-1000, and starts to fall apart after 1000. Most Ruby programs are under 1000 lines.
Re:70s yeah right! (Score:4)
I'm a static language guy myself, but it's important to keep in mind that different problems have different solutions.
Doing heavy image processing or transactional operations, number crunching, I/O with third-party APIs, etc.? Yeah, static languages are probably better.
Doing prototyping, or UI-intensive work? Most UI frameworks suck, but the ones designed for static languages generally suck more, because some stuff just can't be done (well), so they have to rely on data-binding expressions, strings, etc., that are outside the control of the language. At least dynamic languages deal with those like they deal with everything else and have them as first-class concepts.
Case in point: an arbitrary JSON string, in a dynamic language, can be converted to an ordinary object without needing to know what it will look like ahead of time. In a static language, you either need a predesigned contract, or you need a mess of a data structure full of strings that won't be statically checked, so you're back at square one. These types of use cases are horribly common in UI.
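A small Java sketch of that trade-off, assuming the Jackson library is available (the User class and the field names are made up for illustration): with a predesigned contract the compiler checks every field access, while the "arbitrary JSON" path falls back to string keys that nothing checks until runtime.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonDemo {
    // Option 1: the predesigned contract.
    public static class User {
        public String name;
        public String email;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"Alice\",\"email\":\"alice@example.com\"}";
        ObjectMapper mapper = new ObjectMapper();

        // Statically shaped: field access is checked by the compiler.
        User user = mapper.readValue(json, User.class);
        System.out.println(user.name);

        // Arbitrary JSON: access is via strings the compiler cannot verify;
        // misspell "email" here and you only find out at runtime.
        JsonNode node = mapper.readTree(json);
        System.out.println(node.get("email").asText());
    }
}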
Re: (Score:2)
I dabble in image processing algorithms. A lot of the things I write for my own use end up being a C program to do the serious number crunching, with a perl script for the interface. Perl does all the pretty work, then just calls upon the compiled executable when serious performance is required.
Re: (Score:2)
Re: (Score:2)
Maybe you do need to know, maybe you don't. Maybe the object is being introspected, maybe it's used to feed a template engine, maybe it's just converted from one format to another. All these things can be done in any language. Some languages make it easier than others.
Static languages make structured stuff easy and dynamic stuff hard(er). Dynamic languages make dynamic stuff easier. Shocker, I know.
Re: (Score:2)
You seem to be saying that using strings - and thus bypassing the static type checking of a static language - is worse than using a language with no static type checking in the first place. I don't see that at all.
Either you're fine with not knowing what the object looks like ahead of time - in which case you can't directly reference member names in any case, and strings are far better than reflection - or you have a specific subset of the object that you understand and needs to be how you expect it to be,
Re: (Score:2)
Dynamic languages have better support for introspection, handling what happens when a property is missing, dealing with objects that aren't QUITE the same but "close enough", and deep nested dynamic data structures.
If you want to represent the same things in a static language, you need things like arbitrarily nested dictionaries or a very complex data structure with metadata. That's why to handle a JSON string in Java or .NET, you'll need a JSON object framework. The parsing is the trivial part.
I also won't
Re: (Score:3)
Re: (Score:2)
Well, it's a problem if I'm trying to actually read the code and understand what it does and what arguments it takes. Not to mention the wasted time when I pass the wrong argument type to a method and the problem doesn't show up until runtime. And god forbid that it's Ruby code using "method_missing"; then I'm really screwed in so many ways it's hard to imagine. For example, in a Rails app you want to see if you are in the development environment:
if Rails.env.dev?
and that runs just fine... but ALWAYS retu
Re: (Score:2)
Well, it's a problem if I'm trying to actually read the code and understand what it does and what arguments it takes.
Don't write code so complicated that you can't easily tell what type is needed from inspection. Seriously, these are solved problems. You can write code with strong type-safety, or you can write code with runtime binding. Both are workable.
Re: (Score:3)
Design by contract is my favorite way of handling interfaces. It really is a good idea.
Literate programming, though, I'm not sure I see much point to. There are cool examples like Mathematica notebooks, but in general even very good implementations like Perl's POD and Haskell's literate mode just don't seem to offer all that much over ordinary source code. API documentation just doesn't need to be that closely tied to the underlying source, and the source documentation just doesn't need to be literate.
As
Re: (Score:2)
As for your 1990s and objects, I also disagree. Objects were used for implicit parallelism and complex flow of control. No one had flows of control like a typical GUI to deal with in 1967. Event programming was a hard problem solved well.
I don't understand what the 1990s, objects, and implicit parallelism have to do with each other; you'll have to explain it more clearly. But the complex flow required by an OS managing multiple resources is significantly more difficult than
Re: (Score:3)
I understand the ideas behind it. But I'm not sure why understanding the structure of the program matters much. If it does, throw a few paragraphs in about the structure or include a doc to the side.
OK. Modern GUIs create a situation where operating systems have huge numbers of tasks lying around. Thousands and thousands of asymmetric potential threads passin
Re: (Score:2)
I understand the ideas behind it. But I'm not sure why understanding the structure of the program matters much. If it does, throw a few paragraphs in about the structure or include a doc to the side.
Because structure is the key to understanding, in programming and literature.
Re: (Score:3)
Thousands of potential threads. And all of them: OSX, Windows, KDE, Gnome. They all utilize tremendous numbers of objects able to operate with implicit parallelism. Generally, in terms of execution threads, some sort of thread pool is used to match actual CPUs to potential threads. As for modern systems using only one thread: look at any of the design-of-systems books; that just ain't true. As for this being the actor model of concurrency, yes it is. The event-driven model's concurrency system was
Re: (Score:2)
As for literate programming, you aren't answering the question of what the point is of a human understanding that limited amount about the program. You are just sort of asserting that limited understanding by a human is useful.
The opposite. Literate programming makes it easier to understand programs.
As for modern systems using only one thread: look at any of the design-of-systems books; that just ain't true.
Which GUI system doesn't? Swing? OpenStep? .NET? Android? They all use a single UI thread.
Over-generalization error in line 4 (Score:2)
Unless of course you know you know what you are doing, because you also know to never stop looking for new ways of doing things.
Re: (Score:2)
But if you know what you are doing, you still have a majority with no clue around you, in the worst case micro-managing you and destroying your productivity. I think the major difference between today and the early years of computing is that most people back then were smart, dedicated, and wanted real understanding. Nowadays programmers are >90% morons or at best semi-competent.
Re: (Score:2)
Re: (Score:2)
Well, I did include the semi-competent, those that eventually do get there, with horrible code that is slow, unreliable, a resource-hog and a maintenance nightmare. Plain incompetent may indeed just be 80%. Or some current negative experiences may be coloring my view.
Re: (Score:2)
Unless of course you know you know what you are doing, because you also know to never stop looking for new ways of doing things.
If you have to look for a new way to do something, then you don't know the answer, so how can you know you know what you are doing when you know you don't know the answer? When you are 100% confident in the wrong answer, you know you know what you are doing (and are wrong). If *ever* you know you know what you are doing, you don't.
Re: (Score:2)
I'm not a big enough moron to think that there is one answer that can be called the answer. Your mileage clearly varies.
Slashdot really needs a -1 anti-insightful option.
Epic facepalm (Score:3)
The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor.
Yeah. I bet Vincent Van Gogh thought he was total shit at painting, didn't know anything about paint mixing, brushes, or any of that. Look, I know what you're trying to say, Victor, but what you actually said made my brain hurt.
However, exploring new things and remembering old things are two different things. You can be good at what you do and yet still have a spark of curiosity to you and want to expand what you know. These aren't mutually exclusive. To suggest people murder their own egos in order to call themselves creative is really, really, fucking stupid.
You can, in fact, take pride in what you do, and yet be humble enough to want to learn more. It happens all the time... at least until you're promoted to management.
Re: (Score:3)
The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor.
Yeah. I bet Vincent Van Gogh thought he was total shit at painting, didn't know anything about paint mixing, brushes, or any of that.
Um... yes actually. Van Gogh actually only sold one painting in his entire life, and he considered himself somewhat of a failure as a painter. He did not become famous until after his death.
Re: (Score:2)
Um... yes actually. Van Gogh actually only sold one painting in his entire life, and he considered himself somewhat of a failure as a painter. He did not become famous until after his death.
He considered himself a failure commercially... Because he was. He never stopped painting. That's fairly compelling evidence he knew he didn't suck... and that it was the world that was wrong, not him.
Just because you're bad at business doesn't mean you're bad at what you do. I know, I know... it's hard for people these days to understand that, but 'tis true.
Re: (Score:2)
Um... yes actually. Van Gogh actually only sold one painting in his entire life, and he considered himself somewhat of a failure as a painter. He did not become famous until after his death.
He considered himself a failure commercially... Because he was. He never stopped painting. That's fairly compelling evidence he knew he didn't suck... and that it was the world that was wrong, not him.
Just because you're bad at business doesn't mean you're bad at what you do. I know, I know... it's hard for people these days to understand that, but 'tis true.
Van Gogh was notoriously depressed. His entire career as an artist was little more than five years, ending with his suicide in 1890. The nature of his work changed dramatically at a rapid pace, pieces from a year before could almost be from another artist entirely. This all suggests that he was never truly satisfied with his works. It has nothing to do with the lack of financial success, but rather the lack of acceptance from his peers, who often derided him. He continued painting, not because he thought he
Re: (Score:3)
When I was doing design work, my mentor taught me the rules and told me to stay within them. After you've mastered the rules, learning the successes and mistakes of everybody before, then you can start breaking them as you explore new possibilities.
I am afraid this will convince people who know nothing yet to just go off in whatever direction they please, wasting massive amounts of time on things others already learned not to do, and subjecting the rest of us to more horrible code.
Re: (Score:2)
That's guaranteed to happen. The only question is the extent. There's bound to be a few who say, "Hey! I don't have to know what I'm doing. That one guy said so!" In reality, we know different. Progress is made by learning from the mistakes of others. :)
Re: (Score:2)
Yeah. I bet Vincent Van Gogh thought he was total shit at painting,
He probably did. He died a commercial failure. Reviews at the time were very critical of his work.
Re: (Score:2)
Re: (Score:2)
We creative people are the good ones...those others...gosh, they're capable of violence.
I don't see how the two are mutually exclusive. Oh, the creative ways I murder people in my fantasies!
After all, what kind of creative would call himself fearful of people who can't create so much as a scrapbook unless they're following an example from youtube posted by...a creative.
Depends. Are they armed with just a scrapbook and a laptop, or something more substantial?
The fact is, there are way too many non-creatives and they are screwing up the planet. Just imagine how much better the world would be if every member of the Tea Party suddenly disappeared overnight. Oh, we can dream....
A true creative doesn't want people dropping dead or disappearing... they want them doing something useful and productive so they don't have time to "screw up the planet."
Re: (Score:2)
A true creative doesn't want
Would you please stop abusing the English language so roughly? What did it ever do to you? I'm sure it's very sorry, and wishes you would just stop hurting it like this.
Seriously, this is worse than mixing up lose and loose, because you're doing it on purpose.
Time for real apprenticeships in tech and not year (Score:5, Interesting)
Time for real apprenticeships in tech and not years of theory?
Re: (Score:2)
No. Time for real theory coupled with real experience. Apprenticeships only work when the profession is ruled by real craftsmen. The programmers today rarely qualify, hence real apprenticeships would only make things worse.
Re: (Score:2)
Won't change much. Even the "real theory" is half-assed except in a select few colleges, usually (but not always) the high-end ones. And the professors who are good at the theory are usually impossibly terrible at the engineering aspect but still pass their words on as laws.
It's really an awkward situation.
Re: (Score:2)
I had the luck to have a really good theoretician do the introductory CS year at university for me, one who invested a lot of effort in finding out how to do these things well in practice. I only later found out that the years before and after (they did rotate the introductory year) got a far, far worse education, either from bad practitioners or from theoreticians with exactly the problem you describe.
The bottom line is however that to be really good, you have to understand both theory and practice and you hav
can't have unpaid internships any more (Score:2)
and what about internships without the university part?
Patents (Score:5, Insightful)
One reason I had so many patents relatively early in my career is I wound up doing hardware design in a much different area than I had planned on in school. I did not know the normal way to do things. So I figured out ways to do things.
Sometimes I wound up doing stuff normally but it took longer; this was OK, as a bit of a learning curve was expected (they hired me knowing I didn't know the area yet).
Sometimes I did things a bit less efficiently than ideal, though this was usually fixed in design reviews.
But sometimes I came up with something novel, and after checking with more experienced folks to make sure it was novel, patented it.
A decade later, I know a way to do pretty much everything I need to do, and get a lot fewer patents. But I finish my designs a lot faster. :)
You need people who don't know that something isn't possible to advance the state of the art, but you also need people who know the lessons of the past to get things done quickly.
Wow man (Score:4, Funny)
Become One with the WTF (Score:2)
I agree having an open mind is a good thing. There is, of course, such a thing as taking it too far. Just throw away everything we've spent the last 40-50 years developing? Is there some magical aura we should tap into, and rock back and forth absorbing instead? Should we h
Re: (Score:2)
We don't even know what a computer is.
Think of it like this. If you believe you already know what a computer is, then you are not likely to look for alternatives. If you're looking for alternatives, then you might come up with something interesting like this [hackaday.com]. If you just accept that superscalar pipelines, the way Intel does them, are the best way, then you're not going to find a different, potentially better way of doing it.
Re: (Score:2)
Far from it. I seem to recall a researcher I read about over a decade ago who was designing a chip that worked more like a human neuron. Superscalar pipelines are just how Intel does instructions, and even they're trying to get away from them due to the cost of cache misses becoming more expensive as pipeline lengths increase. Giving a talk on not being constrained by accepted dogma and outright throwing away all known concepts are two completely different things.
The very fact that you and I can even have this co
Re: (Score:3)
From TFA
It's possible to misinterpret what I'm saying here. When I talk about not knowing what you're doing, I'm arguing against "expertise", a feeling of mastery that traps you in a particular way of thinking.
But I want to be clear -- I am not advocating ignorance. Instead, I'm suggesting a kind of informed skepticism, a kind of humility.
Ignorance is remaining willfully unaware of the existing base of knowledge in a field, proudly jumping in and stumbling around. This approach is fashionable in certain hacker/maker circles today, and it's poison.
Knowledge is essential. Past ideas are essential. Knowledge and ideas that have coalesced into theory is one of the most beautiful creations of the human race. Without Maxwell's equations, you can spend a lifetime fiddling with radio waves and never invent radar. Without dynamic programming, you can code for days and not even build a sudoku solver.
It's good to learn how to do something. It's better to learn many ways of doing something. But it's best to learn all these ways as suggestions or hints. Not truth.
Learn tools, and use tools, but don't accept tools. Always distrust them; always be alert for alternative ways of thinking. This is what I mean by avoiding the conviction that you "know what you're doing".
Does that sound better?
Too simplistic (Score:2)
We do know a number of things about programming. One is that it is hard and that it requires real understanding of what you intend to do. Another is that languages help only to a very, very limited degree. There is a reason much programming work is still done in C: once people know what they are doing, language becomes secondary, and often it is preferable if it does not help you much but does not stand in your way either. We also know that all the 4G and 5G hype was BS and that no language will ever make th
Re: (Score:2)
The big disappointment when I read about the first "design patterns" was that, while I liked the idea of encouraging generic and somewhat uniform descriptions of solutions, I found many of them were solutions to problems created by Java and other restricted thinking.
See Fast Food Nation, the parts about industrialization and how it took expertise and eliminated it in favor of simple, mechanistic, low-wage, zero-talent jobs.
Re: (Score:2)
Indeed. Some things were nice to see a bit more formalized, but the patterns are typically far too lean to fit a real problem. I also think that many things were formalized in the pattern community early on just to get some mass. Most of these would have been better left informal, as good programmers had no issues with them anyway and bad programmers did not get what they truly meant.
The analogy with Java is a good one, as my experience with Java is that once you have a certain skill, it stands constantly in you
Re: (Score:2)
Re: (Score:2)
Well said. One permanent problem with Java is that it is inherently slow and takes extraordinary skill to get it to run fast. Most Java folks do not have that skill. I have seen really large, critical and expensive Java projects fail because things were just too damn slow. The same thing could never have been slow in C, as it would have meant intentionally wasting inordinate amounts of CPU and memory. True, the team that wrote that Java would never have finished the code in C, but a much smaller team of peopl
Re: (Score:3)
Re: (Score:2)
Try writing something like GIMP in Java; it will fail horribly. GIMP is OO C (yes, not C++), something that requires actual skill. This does not make people who have that skill slower, it makes them faster. Yes, I admit that I nowadays mix Python and C modules, because doing glue code that has low performance needs is easier in modern scripting, while the C classes give me the control and performance needed for many things.
But Java? That thing is an abomination. Neither fast nor simple. Neither well struct
'Back to the Future' of Programming (Score:2)
better feel for stability in the 70s/80s (Score:2)
Most people I worked with in the 80s (and learned from in the 70s) had a good feel for concepts like "stable systems", "structural integrity", "load bearing weight", and other physical engineering concepts. Many from engineering degrees (most of them weren't CS grads like me), and a lot from playing with legos, erector sets, chemistry sets, building treehouses (and real houses). These concepts are just as important in software systems, but I can only think of a handful of people I've worked with over the
Software reinvents the wheel (Score:3)
I find it interesting that people in software think they are the first ones to ever design complicated things. It seems there are so many arguments over design styles and paths. All they need to do is look at what other engineering fields have done for the past 100+ years. It's pretty simple. When you are working in a small project where the cost for failure and rework is low you can do it however you want. Try out new styles and push technology and techniques forward. When it comes to critical infrastructure and projects where people will die or lose massive amounts of money you have to stick with what works. This is where you need all of the management overhead of requirements, schedules, budgets, testing, verification, operation criteria, and the dozens of other products besides the "design".
I'm a mechanical and a software engineer. When I'm working on small projects with direct contact with the customers it's easy and very minimal documentation is needed. But as more people are involved the documentation required increases exponentially.
You mean like the Antikythera Mechanism? (Score:2)
The base is set. The abstractions are open (Score:2)
So, we know what the computer does. It's this: List of x86 instructions. [wikipedia.org] It executes those instructions. The device stores and executes instructions. [wikipedia.org]
We think in terms of programming languages. The language abstracts away the complexity of manually generating the instructions. Then we build APIs to abstract away even more. So we can program a ball bouncing across a screen in just a few lines rather than generating tens of thousands of instructions manually, because of abstraction built upon abstraction.
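As a toy illustration of that claim (not anything from the talk, just a sketch using the standard Swing/AWT abstractions), a ball bouncing around a window takes only a few dozen lines, with the window system, rasterization and event loop all hidden behind library calls:

import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.Timer;
import java.awt.Graphics;

public class BouncingBall extends JPanel {
    private int x = 0, y = 80, dx = 4, dy = 3;

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        g.fillOval(x, y, 20, 20); // the ball
    }

    public static void main(String[] args) {
        BouncingBall panel = new BouncingBall();
        JFrame frame = new JFrame("Bouncing ball");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(panel);
        frame.setSize(400, 300);
        frame.setVisible(true);

        // ~60 updates per second; bounce off the panel edges.
        new Timer(16, e -> {
            panel.x += panel.dx;
            panel.y += panel.dy;
            if (panel.x < 0 || panel.x > panel.getWidth() - 20) panel.dx = -panel.dx;
            if (panel.y < 0 || panel.y > panel.getHeight() - 20) panel.dy = -panel.dy;
            panel.repaint();
        }).start();
    }
}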
In ha
Suggested Reading: Mythical Man Month (Score:2)
What was surprising to me was the fact that something written in the mid-'70s about 1960s software development is still very relevant today.
The engineers who worked on the IBM System/360 OS discovered software engineering through pure trial and error.
One of the classic insights from the book that I've seen companies (i.e. Microsoft) violate over and over is Brook
The mess at the bottom (Score:5, Insightful)
A major problem we have in computing is the Mess at the Bottom. Some of the basic components of computing aren't very good, but are too deeply embedded to change.
The Pascal/Modula/Ada family of languages tried to address this. All the original Macintosh applications were in Pascal. Pascal was difficult to use as a systems programming language, and Modula didn't get it right until Modula 3, by which time it was too late.
Re:The mess at the bottom (Score:4, Insightful)
The whole x86/64 architecture is a mess when you get deep enough. It suffers severely from a commitment to backwards compatibility: your shiny new i7 is still code-compatible with an 80386, and you could install DOS on it quite happily. But the only way to fix this by now is a complete start-over redesign that reflects modern hardware abilities rather than trying to pretend you are still in the era of the Z80. That just isn't commercially viable: it doesn't matter how super-awesome-fast your new computer is when no one can run their software on it. Only a few companies have the engineering ability to pull it off, and they aren't going to invest tens of millions of dollars in something doomed to fail. The history of computing is littered with products that were technologically superior but commercially non-viable; just look at how we ended up with Windows 3.11 taking over the world when OS/2 was being promoted as the alternative.
The best bet might be if China decides they need to be fully independent from the 'Capitalist West' and design their own architecture. But more likely they'll just shamelessly rip off one of ARM's or AMD's designs (easy enough to steal the masks for those; half their chips are made in China anyway) and slap a new logo on it.
Re:The mess at the bottom (Score:4, Interesting)
Some of the problems were pointed out:
- The device access model is still stuck in the ISA age, when peripherals were just wired up to the address and data buses. That isn't how things are done now - even the PCI-e 'bus' is actually a series of high-speed serial links. This means that all device drivers have to run in kernel memory space. Stability and security problems result.
- The 16-bit 'real' addressing mode. Another relic of the past, but still can't be abandoned without breaking the boot process. Lose that, and you could lose some complexity in silicon.
- Even the 32-bit mode could arguably go. The only upside it has over 64-bit is slightly lower memory usage when there are a lot of pointers being used, and it's a real headache at the OS level maintaining two variations on every library to support both 32-bit and 64-bit programs. Lose 32-bit, and you lose a load more complexity. Also means you could lose PAE as a bonus.
- There are opcodes for handling BCD. These are just completely pointless.
Re: (Score:2)
Regarding C/C++: those languages are optimized to be close to the hardware; that's their forte: they are semi-assembly languages. If you optimize the language for software-engineering improvements (code design & reliability), then you likely de-optimize it for hardware.
This is a common, and dangerous, misconception. It's quite possible to have efficient languages that are close to the hardware without having buffer overflows all over the place. Pascal did it. The various Modulas did it. Ada does it. Go is getting close. Subscript checking is really cheap, and often free, if the compiler understands how to optimize it. Hoisting subscript checks out of loops is important. The current Go compiler gets the easy cases (FOR loops), which is enough to keep the overhead down for
Other than shared memory latency, pretty good. (Score:2)
The Future is to scrap what you think is and . (Score:2)
.... realize what IS! http://abstractionphysics.net/pmwiki/index.php [abstractionphysics.net]
Entertaining, but... (Score:4, Insightful)
It's an entertaining presentation, but I don't think it's anything nearly as insightful as the summary made it out to be.
The one thing I take away from his presentation is that old ideas are often more valuable in modern times now that we have the compute power to implement those ideas.
As a for-example, back in my university days (early-mid 1980s), there were some fascinating concepts explored for computer vision and recognition of objects against a static background. Back then it would take over 8 hours on a VAX-11/780 to identify a human by extrapolating a stick figure and paint a cross-hair on the torso. Yet nowadays we have those same concepts implemented in automatic recognition and targeting systems that do the analysis in real time, and with additional capabilities such as friend/foe identification.
No one who read about Alan Kay's work can fail to recognize where the design of the modern tablet computer really came from, despite the bleatings of patent holders that they "invented" anything of note in modern times.
So if there is one thing that I'd say students of programming should learn from this talk, it is this:
Learn from the history of computing
Whatever you think of as a novel or "new" idea has probably been conceptualized in the past, researched, and shelved because it was too expensive/complex to compute back then. Rather than spending your days coding your "new" idea and learning how not to do it through trial and error, spend a few of those days reading old research papers and theories relevant to the topic. Don't assume you're a creative genius; rather assume that some creative genius in the annals of computing history had similar ideas, but could never take them beyond the proof-of-concept phase due to limitations of the era.
In short: learn how to conceptualize and abstract your ideas instead of learning how to code them. "Teach" the machine to do the heavy lifting for you.
Solved in 1959 (Score:2)
The near ideal programming language was invented in 1959: Lisp. It has nearly open-ended abstraction ability via simple syntax: flexibility without clutter. It's a thing of design beauty.
However, nobody has figured out how to leverage it for team programming and fungible staff. Regimented languages just seem to work better for your standard cubicle teams.
I think he got it wrong why we got lost ... (Score:4, Interesting)
I think he got it wrong why we got lost.
It's not because we didn't or don't know. It's because software was free back then. Hardware was so bizarrely expensive and rare that no one gave a damn about giving away software and software ideas for free. It's only when software was commercialised that innovation in the field started to slow rapidly. The interweb is where it was 18 years ago simply because ever since then people have been busy round the clock, 24/7, trying to monetise it rather than ditching bad things and trying new stuff.
Then again, x86 winning as an architecture and Unix as a software model probably has a little to do with it as well. We're basically stuck with early-'80s technology.
The simple truth is:
CPU and system development need their iPhone/iPad moment - where a bold move is made to ditch decades-old concepts to make way for entirely new ones!
Look what has happened since Steve Jobs and his crew redid commodity computing with their touch-toys. Imagine that happening with system architecture - that would be awesome. The world would be a totally different place in 5 years from now.
Case in point: we're still using SQL (Apollo-era software technology for secretaries to manually access data - SQL is a fricking END-USER INTERFACE from the '70s!!!) as a manually built and rebuilt access layer to persistence from the app level. That's even more braindead than keeping binary in favour of assembly, as given as an example in the OP's video talk.
Even ORM to hide SQL is nothing but a silly crutch from 1985. Java is a crutch to bridge across platforms, because since the mid-'70s people in the industry have been fighting turf wars over their patented platforms and basically halted innovation (MS anyone?). The skeuomorphic desktop metaphor is a joke - and always has been. Stacked windowing UIs are a joke and always have been. Our keyboard layout is a provisional left over from the steam age, from before the zipper was invented (!!). E-mail - one of the most bizarre things still in widespread use - is from a time when computers weren't even connected yet, with different protocols for every little shit it does, and bizarre, pointless, braindead and arcane concepts like the separation of MUA and editor and separate protocols for sending and receiving - a human async communication system and protocol so bad it's outclassed by a shoddy commercial social networking site running from web scripts and browser-driven widgets - I mean WTF??? etc... I could go on and on ...
The only thing that isn't a total heap of shit is *nix as a system, and that's only because everything worthwhile being called Unix today is based on FOSS, where we can still tinker and move forward with baby steps like fast no-bullshit non-tiling window managers, complete OpenGL-accelerated avant-garde UIs (I'm thinking Blender here), workable userland/OS separation and a matured way to handle text-driven UI, interaction and computer control (zsh & modern bash).
That said, I do believe that if we came up with a new, entirely FOSS hardware architecture in 2013 - with a complete redo and a focus on massive parallel concurrency - and built a logic-and-constraint-driven, touch-based direct-manipulation-interface system - think Squeak.org completely redone today for a modern retina touch display *without* the crappy desktop - that does away with the separation of filesystem and persistence and other ancient dead ends, we'd be able to top and drop *nix in no time.
We wouldn't even miss it. ...
But building the bazillionth web framework and the next half-assed X.org window manager and/or accompanying Windows clone, or redoing the same audio-player app / file manager / UI desktop toolkit every odd year from bottom to top again, appears to be more fun, I guess.
My 2 cents.
Re:Short version? (Score:4, Funny)
Re: (Score:2)
Re:Short version? (Score:4, Interesting)
You must be new here. That "pretentious philosophical BS" is like the spark in a fuel-and-oxygen filled chamber. It ignites into a heap of comments, and those comments are the actual story. Who needs an article when you can browse +5 funny / informative / interesting and -1 trolls?
As for the linked articles, that's just a cleverly disguised DDoS botnet setup. Some figured it out, but few seem to care the /. botnet is still operating. Heck, I'm even contributing people-time to it (on top of CPU cycles).
Re: (Score:2)
All I know about 1973 .. (Score:4, Interesting)
.. is that C was seen as a major setback by Frances E. Allen and others.
Source:
Frances E. Allen
ACM 2006 Conference
http://www.youtube.com/watch?v=NjoU-MjCws4 [youtube.com]
The context here is abstraction and whether to allow users (programmers) to play with pointers directly (as C, and later C++, do), which is a setback for optimization because of the assumptions/connections you make about/with the underlying machine.
If you want to learn more about the ideas of the 1960s and 1970s, I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell), Carl Hewitt (actor model), Dan Ingalls, Frances E. Allen (programming language abstractions and optimization), Barbara Liskov ("data OOP" which is C++ in a nutshell), and don't stop there.
Re: (Score:2)
I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell),
Do you have a specific talk in mind here? This sounds fascinating.
Re:All I know about 1973 .. (Score:5, Informative)
The thing to get here is that there are basically two kinds of OOP, so to speak.
Here's a short discussion that covers it:
https://news.ycombinator.com/item?id=2336444 [ycombinator.com]
Kay OOP is closely related to the actor model by Carl Hewitt and others.
Liskov had her own idea of OOP, and she was not aware of Smalltalk (Kay, Ingalls) at the time. She started work on her own language, CLU, at the same time as Smalltalk was developed.
Re: (Score:2)
Sure, if you're doing high-level programming (and plenty were in the 70s, just as today), C is a bad tool. If you're writing an I/O driver and the hardware works through updates to specific memory addresses, well, you need to be aware of pointers.
To me, the biggest failing of C itself was this notion of "int", where you don't know how many bits that is. If you're writing the kind of code that belongs in C, you have to know that, and endless 16-32 and 32-64 bit porting nightmares were the result. It wasn't u
Re: (Score:2)
If you're writing the kind of code that belongs in C, you have to know that, and endless 16-32 and 32-64 bit porting nightmares were the result. It wasn't until C99 that int32_t became standard.
I've always suspected (and I could certainly be wrong) the main cause of 32/64-bit pain is not actually that the programmer can't (or rather, shouldn't) depend on the limits of the fixed-point primitive types, but instead that programmers stupidly assume things like "an int will always be wide enough to hold the value of a pointer". C being over-lenient with implicit-casts is largely to blame, of course.
The mistake was mistaking "you can write a program that will compile for all platforms" for "your program will do what you expect on all platforms for which it compiles". The latter being rather more useful.
You're ignoring the reason C went the way it does: performance. 'int' can translate to whatever is fastes
Re: (Score:3)
In C99 this is called int_fast32_t: give me the fastest size that holds at least 32 bits. That's what was needed all along - well, with a shorter, less obnoxious name. If I'm counting higher than 2^16, I just don't care how fast that 16-bit int is; my program will fail mightily. But if you can do 64 bits faster than 32 bits, maybe that's OK.
Re: (Score:2)
We don't even know what a "job" is (Score:2)
We don't even know what "employment" is, what a "salary" is, and what "benefits" are . . .
Re: (Score:2)
Ever heard of analog computers? They existed. There were a few made using ternary logic too, and a lot of the early ones used base-ten arithmetic in hardware via dekatron tubes.