Power of Modern Programming Languages is That They Are Expressive, Readable, Concise, Precise, and Executable (scientificamerican.com) 268
An anonymous reader shares a Scientific American article: Programming has changed. In first generation languages like FORTRAN and C, the burden was on programmers to translate high-level concepts into code. With modern programming languages -- I'll use Python as an example -- we use functions, objects, modules, and libraries to extend the language, and that doesn't just make programs better, it changes what programming is. Programming used to be about translation: expressing ideas in natural language, working with them in math notation, then writing flowcharts and pseudocode, and finally writing a program. Translation was necessary because each language offers different capabilities. Natural language is expressive and readable, pseudocode is more precise, math notation is concise, and code is executable. But the price of translation is that we are limited to the subset of ideas we can express effectively in each language. Some ideas that are easy to express computationally are awkward to write in math notation, and the symbolic manipulations we do in math are impossible in most programming languages. The power of modern programming languages is that they are expressive, readable, concise, precise, and executable. That means we can eliminate middleman languages and use one language to explore, learn, teach, and think.
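A quick illustration of what that claim means in practice (not the article's own example; the data and names here are made up): the textbook formula for a mean and population variance can be transcribed almost symbol-for-symbol into executable Python, so the math notation, the pseudocode, and the program collapse into one artifact.

# var(x) = (1/n) * sum((x_i - mean(x))^2), transcribed directly into Python
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(mean(data), variance(data))  # 5.0 4.0

Whether that justifies the article's broader claims is what the comments below argue about.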
alexa turn off lamp (Score:2)
The Desk (Score:2)
I'm waiting for Douglas Adams's desk-based computer interface. You sit at it and start working to solve your problem; the AI looks at what you are doing, infers the problem you are trying to solve, then solves it for you. Seems like the best possible use of an AI. Of course, one might say Clippy tried to do that. But AI has come a long way.
Re: (Score:2)
Re: alexa turn off lamp (Score:2)
Re: (Score:2)
I swear the computers are training us more than we're training them, though. Natural language is changing; even my kids change the way they talk when they are talking to the Amazon Dot.
That's because they've figured out that the voice recognition is very limited in its capability.
If everyone starts speaking "flat" English to these things, the manufacturers will have little motivation to ever improve the voice recognition.
So, you are right.
What? (Score:5, Insightful)
With modern programming languages -- I'll use Python as an example -- we use functions, objects, modules, and libraries
Who writes this shit? So we're confirming that C uses neither functions, nor objects, nor modules, nor libraries?
Re:What? (Score:5, Insightful)
Re:What? (Score:5, Informative)
Neither Fortran nor C is a first generation language
In fact, every language he mentions in the article is a textbook example of a third-gen language (3GL). I didn't see any mention of even a single first generation language.
I'll give him the benefit of the doubt and assume he was trying to keep his writing approachable, but "first generation languages" has had an established meaning in this field for the last several decades, so misusing the terminology just makes it look like he has no idea what he's talking about. A far better way to phrase it would have been "In early high-level languages like FORTRAN and C [...]", since that would have been accurate while also being descriptive enough for most lay people to understand well enough.
For anyone who doesn't know about them, programming language generations refer to the level of abstraction, NOT to when they were introduced: 1GL is raw machine code, 2GL is assembly, 3GL is high-level languages like FORTRAN, C, and Python, 4GL is declarative or domain-specific languages like SQL, and 5GL is constraint- and logic-based languages like Prolog.
All of which is to say, starting off an article with a glaring misuse of established terminology is a great way to get anyone with a passing awareness of the topic to immediately dismiss anything he has to say. As a professor, I'd hope he'd have a better awareness about the importance of using those terms correctly. And as the publisher, I'd hope that Scientific American would have an editor who would recognize when they were out of their depth and would need to call in a domain expert to proof the text.
Re: (Score:2)
I worked with Prolog. You specify the problem as a series of associations and constraints. Then the system would run a brute-force scan through all the possible combinations and come up with the possible solutions. Worked great for generating university timetables and solving any combinatorial problem.
Maybe Matlab could be 5GL, since many companies seem to use that to generate algorithms.
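For anyone curious what that brute-force, constraint-style search looks like outside Prolog, here is a rough Python analogue (a toy sketch, not Prolog and not the actual timetabling system described above; the courses, slots, and rooms are invented): enumerate every combination and keep only the ones that satisfy the constraints.

from itertools import product

courses = ["Math", "Physics", "Chemistry"]
slots = ["Mon 9am", "Mon 11am", "Tue 9am"]
rooms = ["R1", "R2"]

def ok(assignment):
    # Constraint: no two courses may share the same (slot, room) pair.
    used = [(slot, room) for _, slot, room in assignment]
    return len(used) == len(set(used))

# Brute-force scan through all possible (course, slot, room) combinations.
candidates = product(*[[(c, s, r) for s in slots for r in rooms] for c in courses])
solutions = [a for a in candidates if ok(a)]
print(len(solutions), "feasible timetables; first one:", solutions[0])

A real constraint solver prunes the search space instead of enumerating all of it, which is where Prolog (or a dedicated library) earns its keep.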
Re: (Score:2)
Hmm.
Re: (Score:2)
SQL was called a 4GL by our database lecturer. The fact that you could write some scripts to process a relational database on the network was a revolution at the time. Spreadsheet scripting came close, along with any kind of scripting functionality within an application. No compiling or command prompt commands, just a safe little command line window.
Re: (Score:2)
There were macro assemblers before Fortran, and assemblers before macro assemblers, and octal op codes before assemblers, and panel switches before octal op codes, and wiring panels before panel switches. I suppose I could keep going back until I reached gear wheels, but anything prior to Babbage/Ada Lovelace doesn't really count as the same subject.
Now just where you start calling one of those things a language is something we could argue about, but that's one of those artificially chosen boundaries...whe
I put a hex on you (Score:2)
Octal was for people who hadn't learned their digits up to F.
Re: (Score:2)
The Chrysler Imperial was the first car to feature all-wheel anti-lock brakes, but we wouldn't call it a first generation car, since it was clearly preceded by others. The iPhone 5 was the first to feature a Retina display, but no one's suggesting it's a first-gen iPhone, since the name alone tells you that it's part of a line that goes back before it.
So, why, then, do you refer to FORTRAN as a first generation language, just because it was the first to feature a high level of abstraction? The fact that it'
Re: (Score:2)
Fact-checking myself: it was the iPhone 4, apparently, that introduced the Retina display. Mea culpa.
Re: What? (Score:2)
C libraries (Score:2)
Re: C libraries (Score:2)
Re: (Score:2)
C doesn't support libraries directly; that's done by the linker. But C does support modules. It's not even convoluted: you just make each module a separate file, and only expose what you want visible in the header file. The static and extern keywords also help when used in various places.
Note: The actual connection of the thus-created modules is still done by the linker. But because of the header file and the static and extern keywords, it's fair to say that C supports them.
That said, I don't like t
Re: C libraries (Score:2)
Re: (Score:2)
OTOH, the C compiler doesn't even know about the existence of header files.
Re: (Score:2)
OOP is only a style, not a thing. A language can have special features to support it, but code doesn't lack objects just because the language doesn't have special features for that.
If it is programmed using OOP semantics then it is OO regardless of language. Lots of OO is done in C. Even the whole Gtk library is OOP in plain C. In the 90s I had all the headers printed out as a desk reference.
So you don't need a struct. You could use one, but generally with OOP in C all you have to do is pass the reference t
This is MTBS - Marketing-type BullShit (Score:2)
The people who write stuff like this also write mission statements like:
DoIT's mission is to empower the State of Illinois through high-value, customer-centric technology by delivering best-in-class innovation to client agencies fostering collaboration and empowering employees to provide better services to residents, businesses, and visitors.
Giveth me a break.
Re: (Score:2)
you missed a spot: ... by using synergistic energies between stakeholders ... /*pukes*
Re: (Score:2)
With modern programming languages -- I'll use Python as an example -- we use functions, objects, modules, and libraries
Who writes this shit?
Obviously, not a programmer.
Re: (Score:2)
With modern programming languages -- I'll use Python as an example -- we use functions, objects, modules, and libraries
Who writes this shit?
Obviously, not a programmer.
Those who can, do. ...
Those who can't
Re: (Score:2)
Also bizarre to call Python a modern language. It's been around a long time and is fundamentally not that different from other scripting languages that showed up around that time.
If you want expressive, readable, concise, precise, and executable, then use Scheme! Sure, you have to actually *learn* the language first, but that must be true for every language.
Re: (Score:2)
Now go - write a device driver or Linux kernel in Python.
I wonder if that can be done. Write the driver in Python and use Cython [cython.org] to convert Python to C.
I once wrote a Python script to roll a pair of dice 1M times that took 123 seconds. Used Cython to convert the script to C that ran in a second.
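For reference, the script in question was presumably something along these lines (a reconstruction, not the commenter's actual code):

import random

def roll_dice(n=1_000_000):
    # Roll a pair of dice n times and tally the totals (2 through 12).
    counts = [0] * 13
    for _ in range(n):
        counts[random.randint(1, 6) + random.randint(1, 6)] += 1
    return counts

print(roll_dice())

Most of the pure-Python cost is per-iteration interpreter overhead; compiling essentially the same source with Cython (cythonize plus a C compiler) removes much of it, and adding static type declarations removes more.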
Re: (Score:2)
I doubt it would work with memory mapped I/O. You need to set/read data at a specific address. The high level language is going to shove stuff wherever it likes.
If modern systems don't use memory mapped I/O then don't shoot me. It was 30 years ago that I last meddled with a device driver (in 8086 assembly - yuck!).
Having said that you can do anything in a language like Sinclair Basic because it has PEEK and POKE, though some might say that's cheating.
Re: What? (Score:2)
You need to set/read data at a specific address.
Many higher-level languages such as Oberon supported this with a set of low-level "loophole" utility functions. That's how the Oberon system was written without a single line of assembly language anyway.
Re: (Score:2)
There was something called "Named Common" storage in Fortran that is quite capable of being assigned to absolute memory locations. If you don't mind the fact that the only structured data concept in Fortran was array data. Peek and Poke are loopholes to I/O ports, but perfectly functional ones. Volatile memory is harder to manage without language extensions, especially when compilers start seriously optimizing code.
Prime Computer actually optimized their machine instruction set to run Fortran optimally - th
Re: (Score:2)
Re: (Score:2)
Memory mapped I/O is still a thing.
However, there is the ctypes module for Python that will let it actually work with that, though it might be a bit clunky compared to just doing it in C.
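For the curious, here is roughly what that looks like from Python on Linux using the standard mmap module (a hedged sketch: the register address and offset are invented for illustration, it needs root, and it only works on platforms that expose physical memory through /dev/mem):

import mmap
import os

REG_BASE = 0x3F200000   # hypothetical, page-aligned peripheral base address
REG_OFFSET = 0x1C       # hypothetical 32-bit register within that page

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
try:
    mem = mmap.mmap(fd, mmap.PAGESIZE, mmap.MAP_SHARED,
                    mmap.PROT_READ | mmap.PROT_WRITE, offset=REG_BASE)
    # Read a 32-bit register at a fixed physical address, set its low bit, write it back.
    value = int.from_bytes(mem[REG_OFFSET:REG_OFFSET + 4], "little")
    mem[REG_OFFSET:REG_OFFSET + 4] = (value | 0x1).to_bytes(4, "little")
    mem.close()
finally:
    os.close(fd)

It works, but it is clunky next to a volatile pointer dereference in C, and you still can't service interrupts this way.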
Re: (Score:2)
Clearly there's been a paradigm leap or a quantum shift and you missed it because you're a square old daddy-o. Something like that.
I haven't read it in ages but I thought Scientific American was supposed to be OK. This is the kind of plop you expect from Betawireverge.
Re: (Score:2)
The power of modern programming languages is that they are expressive, readable, concise, precise, and executable
Maybe SciAm has decided to compete with the Onion. Of course the sentence above isn't really Onion class, but it did make me chuckle.
Anyone have an example of a non-executable programming language and a situation where you might want to use it?
Re:What? (Score:5, Informative)
Everybody simmer down. TL;DR:
1. TFA is not an article in Scientific American magazine, it is a "guest blog." Quote: "The views expressed are those of the author(s) and are not necessarily those of Scientific American."
2. The author teaches computer science at a private undergraduate college. He has written a series of books, all of which use Python examples to explain concepts.
3. When he talks about "modern languages," he's clearly referring to languages that look and feel a lot like Python. Mainly Python.
4. When he talks about "eliminating intermediary languages," he means things like pseudocode. He believes (and gives an example to illustrate) that Python is now expressive enough that anything you might previously have expressed in pseudocode you can now express in Python in one go (see the sketch after this list for the flavor of what he means).
5. From there, he makes the leap that you no longer need to teach computer science from a "math-first" approach; you can instead use a "code-first" approach, where you teach students to "think" in Python.
6. Hand-waving about the future from there.
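To give point 4 some flavor (this is not the example from TFA, just an illustration of the kind of thing he means): the classic pseudocode for binary search is already valid Python once you pick names, so there is no separate pseudocode step.

def binary_search(items, target):
    # Return the index of target in the sorted list items, or -1 if absent.
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3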
Re: (Score:2)
Oh, and adding to my own comment: No, he never once uses terms like "functional programming," "Lisp," "Scheme," etc.
Re: (Score:2)
If you simmer down a word salad, I'm not sure it is really any improvement.
I can tell you from my Perl experience that just because the language is expressive doesn't mean that no translation is required to get from abstract business logic to implementation. I've used Python, and it gives no advantage in this area either.
My advice to the blog author is to stick to being a blog author, because in the real world the implementations are not all trivial.
Re: (Score:2)
Perl is about as expressive as Helen Keller.
Re: (Score:2)
No math-first means he's at another of those wannabe colleges that are trade schools in disguise. Removing the science from Computer Science.
Re: (Score:2)
So the definition of a real language is the ability to program device drivers? Are you a moron or just playing one on /.?
Re: (Score:2)
(or a device driver)
Let's talk hammers (Score:2)
When your only hammer is Python, all problems begin to look like they take huge amounts of memory via a parasitic interpreter, and can only be solved in trillions of low level compute cycles.
And I say that as a huge fan of Python. But I would never let it be my only hammer, or make the ludicrous claims in TFS (nor does TFS give me any reason to spend time with TFA.)
Finally: You can usually spot unqualified-in-c / c++ programmers by the claims they make about c / c++ that are based upon their own incompetenc
Re: (Score:2)
No, a real language should allow you to program anything, including device drivers.
Re: (Score:2)
Well, to be fair, you can, even with Python or Perl.
Write a script that accepts a text file of op codes, convert the op codes to raw machine-code bytes using pack();
write them to a file in binmode.
Simple! (Stupid for sure, but simple.)
I guess that's really a compiler more than direct...
Make your compiler able to convert Python to asm. There: stupider and still simple.
TL;DR of above:
I *dislike* Python (I'm a Perl guy, I want my curlies and semis damnit...). Sure I could contrive to write a driver in a scripting language, but why? If I'm mu
Re: (Score:2)
It would be interesting to see a device driver in COBOL or Fortran.
Re: (Score:2)
Re: (Score:2)
Why would you expect the editors to be professional programmers? Must they also be astronauts and chemical engineers? They are paid to pick submissions to place on the front page. They are clearly not experts in technical fields, nor should they need to be. Unfortunately, they are also not paid to edit those submissions for grammar or spelling.
No, the fault here lies squarely with Scientific American for publishing this drivel in the first place. WTF were they thinking? They really should have technical ex
Re: (Score:2)
Everyone loves a good train wreck. Just sayin'.
Re: What? (Score:2)
Re: What? (Score:2)
Re: (Score:2)
This is funny stuff, all around. When I'm using Ruby, I solve this problem by not asking that types do any validation. Instead, each object has to be able to validate all its input fields, and generally will be expected to have staged data input where you set the values, validate them, and then save the object using an accessor that raises an error (or generates a log entry) without saving if the validation fails. This works really well even for things like, "the first parameter is an ascending sequence of
Re: (Score:2)
You can still define an interface using dynamically-typed languages.
Having interface police built into the language may or may not be good in one situation or another, but it is not a requirement of being able to define an interface.
Words matter, even in a logically fuzzy field like programming. ;)
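A quick Python illustration of the point above, since Python is what TFA is pushing (the Serializer class and its methods are invented for the example): you can declare an interface explicitly with the standard abc module even though nothing in the language forces you to.

import json
from abc import ABC, abstractmethod

class Serializer(ABC):
    # An explicit interface: concrete subclasses must implement dumps().
    @abstractmethod
    def dumps(self, obj) -> str:
        ...

class JsonSerializer(Serializer):
    def dumps(self, obj) -> str:
        return json.dumps(obj)

# Serializer() raises TypeError: can't instantiate an abstract class.
print(JsonSerializer().dumps({"hello": "world"}))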
Re: What? (Score:2)
Hey, hey, hey, don't recycle the charter of COBOL (Score:5, Insightful)
Re: (Score:3)
My dad was a COBOL programmer for more than 30 years, learned on an IBM System/360 in the late sixties and early seventies. He maintained that COBOL was very human-readable, especially compared to languages like C where something like
for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);
will clean-compile.
If one considers the growth of business-computing, business has always been about what the nontechnical person can understand, as the nontechnical person is usually the business manager. That's in-part why nowadays everything is either GUI-only or at least h
Re: (Score:2)
COBOL was/is a lot more readable because of the pseudo-english syntax. We used to try and write 'dirty' COBOL too: PERFORM UNNATURAL-ACTS VARYING PARTNERS FROM
Those long winter evenings just flew by.
Re: (Score:2)
Re: (Score:2)
The stubborn human race. (Score:5, Insightful)
"...That means we can eliminate middleman languages and use one language to explore, learn, teach, and think."
One solution for all? Never gonna happen.
Some prime examples:
"That means we can eliminate the standard system and use one metric system to measure everything."
"That means we can eliminate the right-side driving wheel and everyone will drive on the same side of the road."
"That means we can eliminate all of the world's individual spoken languages and use only one language to communicate."
Humans are stubborn. Like really fucking stubborn.
Re: (Score:2)
A hammer is better.
No! A wrench is better!
You're both wrong, a screwdriver is the best!
concise? readable? (Score:5, Insightful)
I don't buy this. A simple hello world in Java is much more complex and wordy than the same functionality in 50 year-old BASIC. And any language that relies on whitespace to modify the program flow cannot be described as readable.
And many object-oriented programs have so much of their basic functionality hidden away in inheritance and class definitions that a printed form of a program is impractical. I would not call that "progress".
As for natural language, it tends to be incredibly imprecise: the meaning is only apparent when the context of its use is taken into account. I would love to see a translator that tried to convert "natural language" sarcasm into executable code. But I wouldn't want it running in my driverless vehicle or airplane.
Re: (Score:2)
Re: (Score:2)
I don't buy this. A simple hello world in Java is much more complex and wordy than the same functionality in 50 year-old BASIC. And any language that relies on whitespace to modify the program flow cannot be described as readable.
Now use the 50 year old BASIC to do a modern task in a way that is portable and provides a graphical output to an arbitrary device.
I don't think esoteric languages were mentioned in the article (not that I've read it - but it would make no sense). What real language are you thinking of that uses whitespace for control flow?
Re: (Score:2)
... what real language are you thinking about that uses whitespace for control flow?
Is that a serious question?
Re: (Score:2)
I call BS. Modern programming languages = bloat (Score:5, Insightful)
Case in point: with memory-managed languages, programmers don't need to manage memory themselves - and memory requirements for these programs are huge due to inadequate planning. How many programmers take object pooling into account? (See the sketch after this comment.)
Case in point: The Motorola StarTAC was a very limited device but had its programming in hardware - you could not type faster than the device. New smartphones have 2-3 GB of memory and yet are less responsive.
Case in point: a field programmable gate array was never intended for production use - yet every computer today uses these
Case in point: how many Java programmers think about the cost of += string concatenation in a loop versus using a StringBuilder?
While the K&R manual is correct that every complex problem can be further simplified by one more level of indirection, it is not true that there is no cost.
Our computers today are 1000 times more powerful and solve the same problems as before. So what has changed? Our efficiency in coding has dropped and we are not using the resources at hand to make better solutions but sloppier ones that require less effort on our part but more computational overhead.
We have become lazy and complacent and we call it progress. Until our programs can optimize to the level that we can generate by hand, I would not deem to consider our current state of programming languages an improvement in anything other than readability.
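Since the comment above asks about object pooling, here is a minimal sketch of the idea in Python (an illustration only, not tied to any particular framework; the Connection class is a stand-in for something genuinely expensive to create): reuse a fixed set of objects instead of allocating and discarding them.

class Connection:
    # Stand-in for an object that is expensive to create.
    def __init__(self):
        self.reset()
    def reset(self):
        self.buffer = bytearray(64 * 1024)

class Pool:
    def __init__(self, factory, size):
        self._factory = factory
        self._free = [factory() for _ in range(size)]
    def acquire(self):
        # Reuse a pooled object if one is free, otherwise create a new one.
        return self._free.pop() if self._free else self._factory()
    def release(self, obj):
        obj.reset()
        self._free.append(obj)

pool = Pool(Connection, size=4)
conn = pool.acquire()
# ... use conn ...
pool.release(conn)

Whether this helps or hurts depends on the runtime: pooling long-lived objects can work against generational garbage collectors.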
Re: (Score:2)
Re: (Score:2)
P.S. Every example provided in Python can be done just as readably in C or D or even C++. It is not a function of the language but a function of the programmer.
Yup, in C++ it's even easier now because C++11 has language support for lambdas
Re: (Score:2)
Our computers today are 1000 times more powerful and solve the same problems as before. So what has changed? Our efficiency in coding has dropped and we are not using the resources at hand to make better solutions but sloppier ones that require less effort on our part but more computational overhead. We have become lazy and complacent and we call it progress. Until our programs can optimize to the level that we can generate by hand, I would not deem to consider our current state of programming languages an improvement in anything other than readability.
While I agree with your criticisms of TFA, I think it not quite correct to assume that "require less effort on our part" or "readability" are unimportant pieces of the picture. As I see it, software has multiple kinds of unavoidable costs: cost to design, cost to implement/code, cost to test, cost to maintain, cost in hardware resources to run.
Using a high-level language may increase the last cost, while reducing all the other kinds of cost. That may well be the wise tradeoff.
Of course, that sounds good u
Re: I call BS. Modern programming languages = bloa (Score:2)
Case in point: non-memory managed languages don't need to manage memory - memory requirements for these programs are huge due to inadequate planning. How many programmers take into account object pooling?
Sounds like several issues being conflated into one, badly. Case in point: Java and Smalltalk not having space-efficient headerless value types is a distinct problem from having automated memory management, and object pooling fucking up generational memory management schemes is a third.
Re: (Score:2)
Unless you're talking about C, or D, the fact of the matter is that you've hidden all the computational overhead in multiple layers of automated translation behind your syntactical sugar.
Surely you must mean C++ not C. What would you call Boost? Syntactical sugar?
Furthermore, not all applications need 100% optimized performance or scalability to meet the demands of millions of users. Real seasoned programmers use the right tool for the job based on what's needed not some ivory tower bullshit. If we need 3 or 4 9's of high availability we use one approach, if we expect a small number of concurrent users on some mobile app, we use a different approach.
No offense, but you sound inexperienc
Re: (Score:2)
Re: (Score:2)
Unless you're talking about C, or D, the fact of the matter is that you've hidden all the computational overhead in multiple layers of automated translation behind your syntactical sugar.
What about all other languages for system programming* or otherwise? You are wrong BTW, even _real_ high level languages can have thin abstraction layers.
Also I don't think you understand what the term syntactic sugar means - it isn't an indication of abstraction but an indication of a feature intended to be "sweeter" for the programmer. Garbage collection isn't syntactic sugar, the C pre/post increment support is. Many languages support alternative looping constructs that reduce typing compared to a generi
Re:I call BS. Modern programming languages = bloat (Score:4, Funny)
#define unless(x) if(!(x))
^how to *really* piss off the Perl hater in your C development team.
Sounds fundamentally wrong (Score:3)
If the assertion is that one language can be used for everyday purposes and also for specifying programs, it is hopelessly wrong. The merit of a programming language, like any other scientific specification, is that it is exact, precise, correct, and unambiguous. If it is easy to understand, that is a nice bonus - but it cannot be a high priority.
Natural language, on the other hand, revels in deliberate ambiguity, multiple shades of meaning, and even saying slightly different things to different listeners. That's why synthetic languages like Esperanto never succeed beyond a small circle of fans: you can't write poetry in them, or even any stirring emotional prose.
The purpose of programming languages is to eliminate doubt and ambiguity - to perfect communication. The purpose of natural languages, inasmuch as they have a single purpose, is to sell, seduce, persuade and - yes - confuse.
Re: (Score:2)
That's why synthetic languages like Esperanto never succeed beyond a small circles of fans: you can't write poetry in them, or even any stirring emotional prose.
Is that really true? I don't know, I'm wondering.
Rubbish (Score:2)
What kind of nonsense is this?! (Score:3)
Programming languages are "executable" now?
Really...
He's talking about interpreted vs. compiled, and interpreted languages have been around since before C (BASIC, anyone?!)
And as others have posted COBOL made these same claims years ago.
Expressive isn't necessarily better and really depends on your interpretation of such. It is far easier to "express" new hardware concepts/manipulations (like memory handling) in C than it is in Java. It is also easier to write "expressive" code in C++, much to the detriment of most maintainers!
Python and Java (like BASIC before them - and we're talking about the one with line numbers, not that MS bs) provide low barriers to programming - code, run, repeat. They're great for learning and beginners and to quickly hammer out program concepts. With faster hardware these days you generally can get away with the performance you get from these languages without having to transcribe them to a lower level language (C, Assembly).
It used to be standard practice to write the algorithm in C, then run the C compiler to output the assembly it was going to generate, and then hand-tweak the assembly! That we don't need to do THAT anymore is a testament to the hardware and less to the languages used. (And in some cases because you can't anymore, because the hardware is too disparate - i.e. phones.)
Executable? (Score:2)
Non-modern programming languages are not executable? What does that mean? They can obviously be compiled/interpreted into executable code, just like modern languages.
Re: Executable? (Score:2)
Re: (Score:2)
I'm not sure I follow.
Wouldn't the proper term then be "evaluatable"?
I understand how in devops even server configurations are "executable", but those aren't programming languages being used to generate those configurations; it's JSON or some other definition format that may contain executable chunks (like JavaScript, shell calls, etc).
And even though HTML and XML have the term "language" in their names, they're not "programming" languages per se.
Explains everything (Score:2)
By using a modern programming language you no longer need to think or plan. You can just tell it something to go do. I'm sure that is in no way related to the garbage some of those languages have produced.
Re: (Score:2)
For trivial programs there are many posers who can quickly turn out unmaintainable crap code heavily copy/pasted from Google and Stackoverflow without thinking or planning.
The posers can't build a large, non-trivial program, even in a modern language. Thinking and planning are still necessary. Experience also counts. (Experience measured in code and learning from past p
What a piece of crap (Score:4, Insightful)
By executable the author actually means interpreted. Any decent programmer knows this means the language is slower than need be. Translation is not a bad thing; it is another word for compiling and/or assembling, the process that converts human-readable code into actual machine instructions. Any language could be compiled, but in practice some languages (for example C) are compiled (and assembled) while others (for example Basic) are typically interpreted. Interpreting makes the language slower, as a step that occurs one time in the creation of a compiled program before the program is ever run must now happen each time the program is run (and in many cases it happens each time a line of code is rerun, although there are "just-in-time" tricks that can avoid the repeated interpreting). Interpreted languages are an unfortunate side effect of faster computers: people get lazy, don't want to go through a separate compilation step when making a simple change, and figure the interpreted language is "good enough" for the user, even though it means the program will run slower than need be each time every single user runs it.
Interpreted languages have their place. Basic was a good introductory teaching language in its day, as it was intended to quickly let beginners write and test their code. But only the feeble-minded would have tried to use it for production coding. There are great special-purpose interpreted languages like AWK that are quite good for quick and dirty one-time tasks. And I'm saying all of this as a Forth programmer who was even on a team that implemented a Forth on the C64. (Forth actually does a lot of the "compilation" when each line or word is written, but it is still inefficient as it spends most of its time in subroutine calls and returns.) But interpreted languages will always sacrifice speed, and you can write a compiler to translate any interpreted language into true machine code (although in many cases it isn't worth doing).
Re: What a piece of crap (Score:4, Informative)
Refreshingly optimistic... (Score:2)
That's an idiotic article (Score:3)
Just from the quote.... Is that trying to suggest that C doesn't have functions?
You can write good, modular code in almost any language. And I've *seen* spaghetti Python, as well as spaghetti in enough other languages.
And making the code clean and readable depends on the programmer, and whether they pay any attention to standards and best practices.
Not Modern at all (Score:2)
I'll use Python as an example -- we use functions, objects, modules, and libraries to extend the language, and that doesn't just make programs better, it changes what programming is.
Ok young whipper-snapper programmer that thinks you're all that and a bag of potato chips and you're using some new fancy language that supposedly has all these "new" features. Check out, just to name a few, these languages that have been around for several decades: C++, Pascal, Smalltalk, Java... even C# has been around for 15 years now. Do I need to continue? Even Python isn't really all that new.
This article just makes the author sound like they have no clue what
Garbage article. (Score:2)
Absolute garbage.
Baseless language holy war spoken from ignorance and a skewed vision of the past. "It uses libraries!" Really? Is that really all you're trying to bring to the table here? And he discovers that you can use actual code rather than pseudo code?
Scrap away the buzzwords and the remaining statements are just plain false. Does he think C doesn't make use of libraries?
It's garbage and shouldn't have made its way to Slashdot. Come on msmash, are you just trying to get a rise out of us?
And Verbose... (Score:2)
Poster forgot to include that modern languages are verbose.
For example, JavaScript is insanely verbose for even the simplest of things.
Python sales pitch (Score:3)
Re: (Score:2, Insightful)
The article is rubbish. OOP, informally at least, has been around since the 1960s. Procedural programming has been around as long. Yes, the very first generation of languages was pretty bereft of "modern" features, but by the 1970s we had most of the paradigms of modern programming in place.
Re: (Score:2)
Re: (Score:2)
Then what? You just stare blankly at your monitor? I don't think your manager would appreciate that.
Heard this quote today: "Don't mistake activity for achievement." - John Wooden
Weed? (Score:2)
You mean the hobbit's leaf? ;)
Re: (Score:2)
You are just showing your ignorance - computer programming is closer to engineering now than it has ever been before.
The problem is that the development model is fucked up, that most people programming don't know actual software engineering*, and that crap is accepted as good software.
(* it still isn't real engineering IMNSHO)
Re: (Score:2)
Re: (Score:2)
Someone that knows what they are doing can use new high level languages as a powerful lever when appropriate.
Re: (Score:2)
Forth was created as a cut-down attempt at Fortran for a machine that controlled a radio-telescope, and couldn't support a real Fortran compiler. So it's a derivative of Fortran, even though it looks quite different, and even though the assembler shows through in a lot of the implementations. (That's not a required feature.) Even the name is cut down. The computer wouldn't support 6 letter names.