Are We Experiencing a Great Software Stagnation? (alarmingdevelopment.org)
Long-time programmer/researcher/former MIT research fellow Jonathan Edwards writes a blog called "Alarming Development: Dispatches from the User Liberation Front."
He began the new year by arguing that software "is eating the world. But progress in software technology itself largely stalled around 1996." Slashdot reader tonique summarizes Edwards' argument: In 1996 there were "LISP, Algol, Basic, APL, Unix, C, Oracle, Smalltalk, Windows, C++, LabView, HyperCard, Mathematica, Haskell, WWW, Python, Mosaic, Java, JavaScript, Ruby, Flash, Postgress [sic]". After that we're supposed to have achieved "IntelliJ, Eclipse, ASP, Spring, Rails, Scala, AWS, Clojure, Heroku, V8, Go, React, Docker, Kubernetes, Wasm".
Edwards' main thesis is that the Internet boom around 1996 caused this slowdown because programmers could suddenly get rich quick. Smart, ambitious people moved to Silicon Valley and founded startups, but you can't do research at a startup due to time and money constraints. Today only "megacorps" like Google, Facebook, Apple and Microsoft are supposedly able to do relevant research, because of their vast resources.
Computer science wouldn't help, either, because "most of our software technology was built in companies" and because computer science "strongly disincentivizes risky long-range research". Further, according to Edwards, the aversion to risk and "hyper-professionalization of Computer Science" is part of a larger and worrisome trend throughout the whole field and all of western civilisation.
Edwards' blog post argues that since 1996 "almost everything has been cleverly repackaging and re-engineering prior inventions. Or adding leaky layers to partially paper over problems below. Nothing is obsoleted, and the teetering stack grows ever higher..."
"[M]aybe I'm imagining things. Maybe the reason progress stopped in 1996 is that we invented everything. Maybe there are no more radical breakthroughs possible, and all that's left is to tinker around the edges. This is as good as it gets: a 50 year old OS, 30 year old text editors, and 25 year old languages.
"Bullshit. No technology has ever been permanent. We've just lost the will to improve."
Frameworks (Score:5, Informative)
Re: (Score:2)
Re:Frameworks (Score:5, Funny)
s/"same one"/"same ones"/
incremental changes
Re: (Score:2)
Because everyone codes - poorly. No engineering (Score:5, Insightful)
That's all that most "programmers" were taught.
If you pick any random subject, whether that be music, geology, chemistry, or whatever, and open any chapter in any textbook, you'll find they generally all start out the same. They begin by teaching you the vocabulary of the subject. A book or video teaching music composition may start by defining melody, harmony and rhythm, then teach a bunch about each of these topics. A later book may define terms like alla breve and fermata before teaching about those subjects.
It's similar with, say, a chemistry book, which will define vocabulary words like "organic chemistry" and "pH", because you have to know the vocabulary before you can learn the field.
In software engineering we have vocabulary words like foreach and struct. You need to know the terms before you can learn the craft, like anything else. But in software, somewhere along the way, people who apparently don't know about software engineering got the idea that if you teach someone a programming language (the vocabulary of programming) then that's it: they are a qualified software engineer.
In any other field, we know better. We know that just because someone knows the language of auto mechanics, vocabulary words like "engine" and "exhaust", doesn't make them a qualified mechanic, a good mechanic. Knowing the vocabulary of doctors doesn't make you a doctor - learning the language is prerequisite to learning to be a doctor.
Yet we think that someone is a software engineer because they know what "declare Age int" means - though they don't even know what the word "engineering" means! We teach programming languages, and completely fail to teach the art and science of programming, or of software engineering.
That's like expecting that teaching someone the language used by Maya Angelou will make them a poet. Nope: knowing the language (English, C#) is a prerequisite to learning to be a poet or a software engineer.
We do teach some computer science. But computer science is to building information systems as physics is to designing and building bridges. We need to teach information systems design.
Re: (Score:3)
Everyone codes poorly, but everyone is a critic, too. In a sense, every person on the planet acts poorly, and then they die. But this is a depressing view of life and doesn't capture the truth.
Computers are highly deterministic, very precise machines, and as human beings we shouldn't try to compare ourselves to them. Rather, we should accept that computers allow us to make mistakes and to see them, and that they enable us to solve problems we didn't even know we had. We then don't stop learning after we've left
Re: (Score:2)
frameworks.
why not.
try machine learning.
and think of your product as just part of a system.
things get done differently then
I don't use frameworks. I'm not somebody's bitch. (Score:5, Interesting)
The problem with frameworks, as opposed to libraries, is that a library gets integrated into your program, while a framework wants you to integrate into *its* program.
This leads to monolithism, because you cannot have any other framework beside it.
I think both frameworks and monolithism are software design anti-patterns. But hey, it's not like you can't go even worse and have an entire OS ("platform") to run on... on top of another OS... Right, Chrome? Right, Mozilla? ;)
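To make the distinction concrete, here is a minimal Python sketch of that inversion of control. (The TinyFramework class and its names are invented purely for illustration; real frameworks are vastly bigger, but the shape is the same: with a library your code owns main() and calls in, with a framework the framework owns the loop and calls out to you.)

    # Library style: your program is in charge and calls the library when it wants.
    import json  # a plain library: it does nothing until you call it

    def run_my_app():
        config = json.loads('{"debug": true}')  # you decide when/how it is used
        print("app running, debug =", config["debug"])

    run_my_app()

    # Framework style: the framework owns the control flow and calls *your* code.
    class TinyFramework:
        def __init__(self):
            self.handlers = []

        def route(self, func):      # you register callbacks with the framework...
            self.handlers.append(func)
            return func

        def run(self):              # ...and it decides when (and whether) they run
            for handler in self.handlers:
                handler()

    app = TinyFramework()

    @app.route
    def my_handler():
        print("the framework called me")

    app.run()  # the framework is now the program; your code just lives inside it

And because the framework owns the control flow, a second framework cannot own it at the same time, which is where the monolithism comes from.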
Re:I don't use frameworks. I'm not somebody's bitc (Score:5, Funny)
Chrome and Mozilla stole the idea from Emacs when they started extending an application into an operating system. So, not even points for originality.
Re:I don't use frameworks. I'm not somebody's bitc (Score:5, Funny)
Re:Frameworks (Score:4, Interesting)
I came here to write this - frameworks are convenient, but they don't add any new creative solutions and sometimes even prevent fresh new ideas.
At the same time a framework is a great security risk because once a system is developed and deployed it requires a lot of testing and re-testing when the framework is updated. A total solution that is delivered has a lifetime of maybe 10 years (some shorter, some longer) and the customer is not prepared to invest much money into the solution once delivered as long as it works. This means that security updates aren't always put into place and a framework may have had a major version update that breaks backward compatibility so it's not possible to get the necessary security fixes installed without a total rewrite - the customer may decide to run with what works and hope for the best. This often works well for internal systems and on a few systems on the internet that don't utilize the compromised functionality.
But if a framework is included from a remote server on the net, then a web site can go down, stop working, or give strange browser errors for unexplained reasons when that framework is updated or the framework hosting site goes down. It can be a disaster for the customer when that happens and the developers that were involved have moved on to greener pastures and nobody has a clue about what happened. This is not unusual when it comes to client-side JavaScript, something that ad servers provide.
Frameworks also have the disadvantage of adding performance overhead, much like using an 18-wheeler to transport a single toy dumper: https://i.redd.it/szerhxj9grhz.jpg [i.redd.it]
Re: (Score:3)
I came here to write this - frameworks are convenient but they don't add any new creative solutions and sometimes even prevents new fresh ideas.
While "creativity" and "new fresh ideas" may be useful when writing computer games or other consumer software, they are unnecessary and can be dangerous when writing system-critical or safety-critical software. That's why the great majority of such systems are still written in COBOL, even though it is 60 years old and was considered awkward and boring as soon as it was standardised.
If I fly on an aircraft controlled by computers, or have my money handled by a financial system, I positively want those comput
Re: (Score:3)
Frameworks have also the disadvantage of adding a performance overhead
That is unlikely.
If you would not use a framework, you would more or less write that code yourself. How that would result in faster code is beyond me. Not to mention coding time.
Unfortunately, many frameworks today are influenced by inexperienced hobbyists, so there might be flaws; you are right there. But bottom line: if you write something from scratch in a topic you are inexperienced in, you will probably produce similar flaws.
Re: (Score:3)
Sometimes "that" code is much lighter weight. A few lines of "awk" or "nroff" can often replace entire suites of Python, Perl, Java, Rust, or Ruby modules to accomplish well-defined small tasks.
Re: (Score:3)
Re: (Score:3)
Screw thread sizes. Machine tools that have standard gauges. Standard voltages on batteries. Standard AC frequency and voltage. Standard gauges for wire, thread, pipes.
I can go on and on and on.
These things, and, notably, the manufacturing industry behind them, are all gigantic frameworks with things that nobody wants to reinvent. All large systems consist of companies working together and things coordinated by industry bodies like ISO, ITU and IEEE.
When you buy a car you're buying into a system, since you can
Re: (Score:3)
You're talking about standards for compatibility and interoperability. Your examples are more like saying all web servers speak HTTP or every programming language in the universe has a FFI to call C code. That doesn't mean you have to use the same web server or programming language for everything.
Let UIX be the next wave (Score:4, Insightful)
Seriously, not only is modern-day UIX absolute garbage, it's actually getting worse. There is such a huge need for sanely developed interfaces, but it seems as though every new version of whatever software package you'd care to name just gets worse.
So hopefully someone other than myself realizes this and we start to see well-designed user interfaces make a comeback. That charge won't be led by Microsoft, or Google... or even Apple. It'll have to be someone small enough that they can throw out the inertia that got us here.
Re:Let UIX be the next wave (Score:5, Interesting)
Re: (Score:2)
The thing I would point out is the obvious and unavoidable intersection between the glacially slow pace of human evolution (since this sets hard limits on how adaptable humans are) and the rapid pace that software developers are accustomed to.
Throw into that the concept of Pareto optimality, applied to the resulting constrained solution space, and you end up with the inevitable situation where that rapid pace must grind to a halt.
The question is "Have we reached that", or are we just trapped in a local maxima,
Re: Let UIX be the next wave (Score:2)
["Maxima" is the plural, by the way]
Well, both, kinda.
It always depends on the box you are willing to think outside of.
Sure, there is nothing stopping us from building a real, actual AI and telepathically communicating with it, including choosing to experience it as hallucinations, so that even concepts we have no words or images for can be transported. (In theory, neurons have an electric field that can be both read and manipulated.)
But as you can clearly see, there's a teensy hindrance there in terms o
Re: (Score:2)
The flat Metro UI is actually giving me the feeling of going back to Windows 2.x.
Re:Let UIX be the next wave (Score:5, Insightful)
Two things killed good UIs:
1. Tablet fever. More precisely, the perceived need to be able to run a UI on hardware slower than a 10 year old mid-range laptop... in a way that's finger-friendly. IMHO, Aero Glass was the pinnacle of Windows' UI... everything from that point onward has been a step backwards. I have a goddamn computer with more computing power than the sum total of every computer that ever existed prior to 1970, with a graphics card capable of realtime hardware-accelerated raytracing, and thanks to goddamn tablets... a UI that looks like a higher-res version of Windows 2.0 with dropshadows.
2. Dark patterns... explicit attempts to trick users into clicking things they don't want to click, and manipulate users into navigating a specific way by hiding the way to follow the navigation path the user WANTS to take. Perfect example:
Drop everything and breathlessly upgrade to Windows 10 RIGHT FUCKING NOW?!?
[ YES!!! RIGHT NOW!!! ]
[ No, force a reboot and install it in 10 minutes. ]
(tiny text that says something like "copyright information and legal disclaimers" that doesn't appear to be clickable... but if you blindly click it anyway, then scroll 80% of the way down, there's a tiny link hidden in section 5, paragraph 3(b)(ii) to cancel the installation, under the heading "Beware of the tiger!")
3... Oh, and don't even get me STARTED on insane UI designers who think medium-gray 6pt text on a low-contrast mauve background, or displaying 40 normal pages of text 3 lines at a time (with some tedious, convoluted gadget for advancing through it) is actually good.
Modern web UI design really, really makes me nostalgic for CSS 1.0 with IE5's semi-proprietary extensions (like hover)... enough functionality to transcend bold, italic, and underline, but still limited enough to encourage developers to make clickable links LOOK like clickable links, and pages with meaty content you could actually print without ending up with 47 printed pages that have one sentence apiece.
Re: (Score:3)
One point to add to your list: UI is used to intentionally blur the lines between web applications and desktop applications. A prime example of this is the Office 365 UI, where they use a flat white ribbon so it looks as if you are accessing it via a web browser.
I think UI designer should be added to the 'most hated profession' list of 2020, right behind predatory debt collectors and parking-ticket rent-a-cops.
Re: (Score:3)
Re: Let UIX be the next wave (Score:2)
That is mostly only true if you treat software like a business, and not like something like Linux.
Because then you *have* to keep finding reasons for people to give you even more money. And things for your employees to do. Even though it's already good enou . . . OOOOH . .
I just realized: The entire damn industry is bullshit jobs [wikipedia.org] by now!
Seriously... Don't treat creative work of any kind like a business, people! It's mutually exclusive with target group maximization and profit maximization, by definition.
Re: (Score:2)
AI is the new hotness! (Score:3, Funny)
Isn't AI supposed to be the new hotness? That's what all the kids are saying.
And have been since the 60's?
Re: (Score:3)
Isn't AI supposed to be the new hotness?
When people say "AI" today, they are usually referring to machine learning using deep ANNs. ANNs are useful for solving only a narrow set of problems. They are not a replacement for general-purpose programming.
Re: AI is the new hotness! (Score:2, Insightful)
They usually mean, basically, a tensor of weights that transforms an input data set into an output.
I always say that "universal function" is a better name: a function for stuff that you do not understand, or are too incompetent to code up, but want done anyway, as long as it gets it mostly right.
I think that just means that one is a bad programmer/scientist.
Re: AI is the new hotness! (Score:5, Insightful)
I think that just means that one is a bad programmer/scientist.
Can you write a program to look at pixels and determine if a photo is of a dog or a cat? Can you be done with your program in an hour?
If you can't, are you a bad programmer?
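For what it's worth, the honest answer today is "no by hand, yes with a trained model". A rough Python sketch of the latter, assuming PyTorch/torchvision and their pretrained ResNet-18 (the file name is hypothetical, and the class ranges are the usual ImageNet dog/cat label blocks):

    # Classify dog vs. cat using a model pretrained on ImageNet.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    DOG_CLASSES = set(range(151, 269))  # ImageNet classes 151-268: dog breeds
    CAT_CLASSES = set(range(281, 286))  # ImageNet classes 281-285: cats

    def dog_or_cat(path):
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            pred = model(img).argmax(dim=1).item()
        if pred in DOG_CLASSES:
            return "dog"
        if pred in CAT_CLASSES:
            return "cat"
        return "neither"

    print(dog_or_cat("photo.jpg"))  # hypothetical input image

Nobody hand-codes the pixel logic; the "program" is mostly learned weights, which is exactly the point.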
Re: AI is the new hotness! (Score:4, Interesting)
For regression we have a mathematical basis for what neural networks are doing. For most types of systems there exists a piecewise-polynomial approximation to the solution. In 2D we can use a spline; in 3D splines really start to break down; in 4D+ they pretty much stop working. You can still use an ANN for the problem, though, and can put bounds on its mathematical accuracy.
This is why there has been so much success with n-body simulations, fluid mechanics, material properties, etc. with neural networks. Solving the actual equations is extremely time-consuming, but on any given bounded area there must exist a piecewise-polynomial approximation to arbitrary accuracy, and an ANN is a good way to build one.
Neural networks for classification, though... there's not much theory to back that up, which is also why they get confused and often screw up badly.
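A toy illustration of that regression point, assuming numpy/scipy/scikit-learn (the target function and network size are arbitrary): fit noisy samples of sin(x) with a 1-D spline, which works beautifully, and with a small MLP, which is the construction that keeps working as the input dimension grows:

    # Approximate a function from samples: piecewise polynomial vs. small ANN.
    import numpy as np
    from scipy.interpolate import UnivariateSpline
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(-3, 3, 200))
    y = np.sin(x) + 0.05 * rng.normal(size=x.size)   # noisy samples of sin(x)

    spline = UnivariateSpline(x, y, s=0.5)           # the 1-D-input case

    mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                       random_state=0).fit(x.reshape(-1, 1), y)

    test = np.linspace(-3, 3, 5)
    print("true:  ", np.round(np.sin(test), 3))
    print("spline:", np.round(spline(test), 3))
    print("mlp:   ", np.round(mlp.predict(test.reshape(-1, 1)), 3))

In one dimension both land close to sin(x); the spline route is the one that stops scaling as dimensions are added, while the MLP carries over essentially unchanged.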
Re: (Score:2)
"Plastics!"
Re: (Score:2)
"Plastics!"
I'm guessing that whoever downmodded you didn't get the reference. Kids these days!
Re: AI is the new hotness! (Score:2)
Can someone point us there? Cause I was born in the 70s, when people still had humor where we have only "references" today, and didn't catch that.
Re: (Score:3)
60 second clip of the scene [youtube.com]
Re: (Score:3)
It's not AI unless it won't let you turn it off.
Re: AI is the new hotness! (Score:2)
"... try to ..."
Because one can "turn you off" quite easily. :P
Cryptocurrencies? (Score:2)
Re: Cryptocurrencies? (Score:2)
Which returns to the issue of momentum others mention. Blockchains can especially have utility for supply-chain authenticity, but these are less about pure currency and more about a market with certain consumers and producers. Brave's work is fascinating too. However, in virtually all these markets, we have to break from conventions that have often had years or decades to become ingrained.
Things have changed, that's for sure (Score:2)
In a way, we *have* invented everything, when it comes to software. Turns out, there aren't any shortcuts to anything, and actual, by-the-book Artificial Intelligence is still a pipe dream.
The advances we have had in the past 10 years, the *real* advances, have all been thanks to doing statistical analysis on ENORMOUS datasets built from user-provided data. We didn't have truly useful voice recognition until Google and Apple started training on huge piles of voice data. Predictive keyboards on your phone ar
Re: (Score:2)
The field of math is where our innovations come from.
Re: Things have changed, that's for sure (Score:2)
*ba-dum TISS*
Math without verification of usefulness via observations in the real world is not even science, but somewhere between religion and metaphysical navel-gazing.
You have to strongly separate those two sides of math. Sadly, that rarely happens, and it is drowned in arrogance instead.
Re: Things have changed, that's for sure (Score:4, Insightful)
Mathematics is not models. Mathematics is a formal system, founded on a small set of axioms. Models are hypotheses about how the world works, which are manipulated using typically very simple math. We frequently hypothesize the wrong model, but when we get the right one, we invariably find that it is composed of math.
Sometimes something miraculous happens: some structure we discovered in pure math turns out to be a good model of something out in the world.
Re: (Score:2)
Yeah, there is nothing that software can do that I can't do with discrete logic elements, e.g. transistors.
And the laws of physics haven't changed in the last couple decades.
So it seems if progress has slowed, there must be a lack of new use cases to implement solutions for.
If there is new math, it won't help programming one squat, because the new math is never a new algebra or new statistics; it is always some kind of calculus that has to be carefully reduced to steps by a human, even including some trial a
Re: (Score:3)
The advances we have had in the past 10 years, the *real* advances, have all been thanks to doing statistical analysis on ENORMOUS datasets built from user-provided data. We didn't have truly useful voice recognition until Google and Apple started training on huge piles of voice data.
Lolwut.
No the advance we had is that someone figured they'd make products based on that training available to the general public "for free". That's what Apple and Google have done.
Plenty of others have been working successfull
What the hell happened??? (Score:5, Insightful)
Re: What the hell happened??? (Score:2)
I think the issue is the user space. With the backend you still often care about complexity, and patterns like REST seem rather compact, but it's the way these services are often rolled up into user-facing websites. Everything needs to be flashy, and simplicity is often exchanged for UI that looks alright but is functionally lacking. And if you change your UI, better be prepared to keep a classic mode around for some time for bitter vets. In a lot of ways efficiency on the backend pays dividends. Effici
Re: (Score:2)
Re: (Score:3)
Well, we got HiDPI and mixed DPI and all sort of fractional scaling to deal with in desktop software nowadays. So there ought to be some extra bloat.
In Win98 days, too many windows / UI controls / buttons could hang the OS. Excel spreadsheets were limited to 65536 rows and 256 columns. Files and software could only contain one extra language other than English, and multiple language families could not co-exist in the same environment / document. Emoji were emulated by special font files and could not be stored
Re: (Score:2)
Weird, the old software I wrote all still uses the same number of ops to run as it did when I first compiled it.
This list is ridiculous. (Score:2)
Freshmeat and Tucows gone (Score:3, Insightful)
You used to be able to go see what new software people released in open source; it was fun. Now most forums don't want people publishing new-software posts.
I think there is new software out there, but people don't know it exists.
Re: (Score:2)
Re:Freshmeat and Tucows gone (Score:4, Informative)
You used to be able to go see what new software people released in open source; it was fun. Now most forums don't want people publishing new-software posts.
I think there is new software out there, but people don't know it exists.
Former Tucows guy here. They closed our office in 2001. It was kind of hard to make money off internet ads after the dot com bubble burst.
I think they kept it going with some sort of skeleton staff in the Toronto office for a while.
Re: (Score:2)
I was literally thinking about the stack today... (Score:2)
...and the Jai programming language. Although it hasn't been released yet, it holds promise to some developers to blow out a lot of cruft. I was pondering whether it could be considered part of a new movement, and if so, what that movement should be called.
There is plainly a yearning among many developers, as expressed by TFS, to dump the baggage overboard. In hindsight this movement might be known as the "great flattening" or "the great delete"; but if it is even to become a movement, it seems to have barely
Re: (Score:2)
WASM is a thing but it's still not the standard
It is literally a standard now :) but companies like Apple want to scuttle it because it allows non-app store software to run on their phones. And I do mean their phones - people with iPhones are just renting them.
Re: (Score:2)
and now WASM is a thing but it's still not the standard.
WASM is standard but it still doesn't have access to the DOM. After that it will be ready.
What's wrong with 30 year old text editors? (Score:2)
Re: (Score:2)
Yeah, the text editor part of the argument certainly seems weak. I'm not sure what one would expect to change, except for stuff that's under the hood (e.g. multibyte character handling).
UX vs code (Score:2)
The problem with a 30-year-old text editor is that it cannot handle 32-bit, let alone 64-bit, OSes. Or that it cannot load files above a certain length (somewhat shorter than a full book). I mean, I could go on. But you meant "what's wrong with a 30-year-old UX". And the answer to that is that while there's nothing wrong with most, there have been innovations. Autocomplete actually works well. Syntax highlighting. More importantly, when you type a function name in an IDE it can actually pull and display the com
Obsolescence. (Score:2)
Nothing is obsoleted
That will help - rewriting old software that already works in a new oxidation-based language will surely get us out of stagnation.
and the teetering stack grows ever higher..."
Yeah, because it turns out most people don't like things that work breaking for no good reason. Of course, what we need here is NOT innovation - just time and money to do the things that need to be done: code cleanups, refactoring, hardening, using more modern techniques where it makes sense. Many things should be discarded, but they still need to be replaced with something r
Causes (Score:4, Insightful)
It's because all of the development since approximately that time frame has started going into DRM and other software development that is designed to STOP you from doing things instead of enabling you. Up until shortly after 1995 there was no real thought put into stopping people from doing whatever they wanted with their computers.
Tools are becoming mature, apps not so much (Score:5, Insightful)
Using Edwards' approach, I could also argue that
- Architecture
- Aeronautics
- Automobiles
- Motors (ICE and Electric)
etc.
are all stagnating because the tools aren't changing - we use the same tools (hammer, nails, saw, block and tackle, etc.) to build houses as were used two thousand years ago. They've definitely been improved, but they still work in the same basic ways.
When I look through Edwards' lists, he primarily lists programming languages and a couple of IDEs, with Windows thrown in (and, despite what you think of Microsoft, it's disingenuous to say that it has stagnated). I wouldn't expect them to fundamentally change what they do, but what they're being used for is radically different.
Edwards discusses machine learning as something new, but there have been huge advances across many application disciplines that weren't thought possible in 1996, which makes me feel like he's missing the big picture: we're using established, mature tools to create the products of the future.
Re: (Score:2)
The physical engineering fields are all stagnating because there are no groundbreaking developments in materials that are also capitalistically viable. No new materials, no need for drastically different tools, because the things that can be done with existing materials have been exhausted.
Think
Re: (Score:2)
I would argue that we still don't have a language that does code organization well.
Re: (Score:3)
Well, in motors the new generation of reluctance motors require very very expensive software to calculate the rotor configuration, because Maxwell's Equations are great and all, but a real bear to apply to a real motor design. And you can't easily build a reluctance rotor without calculating the changing flux path.
An induction rotor is more straightforward: you only need to calculate various totals and the phase shift in the rotor. The flux flow within the rotor is obvious.
With a BLDC or any sort of synch
Shitty Example (Score:2)
I mean, not really. A modern construction crew should use electric and pneumatic tools, and there have been advances there. Hell, even modular metal scaffolding is a fairly recent (post-WWII) innovation. Whole-house sheet wrapping is now a concept (although it involves new ways of designing roofs and overhangs). Hell, even wood is going through a relatively recent innovation cycle, as di
Parabolic software quality (Score:2)
An application's quality often forms a parabola when graphed over time.
For example:
The first release version comes out, and it's full of promise, but its shortcomings are glaringly obvious to all, so when the second release comes out with most of them addressed, it's an undeniable improvement.
Then the third release comes out a bit later, and cleans up the remaining major shortcomings and a lot of minor issues besides.
The fourth release cleans up the last of the minor quibbles, and adds a few new features t
Re: (Score:2)
Interesting. Personally, I'd argue that the third release would generally be the best - for the fourth release, the developers are starting to look for features to add and lose the thread of what the app does or how the UI is set up and the new features are actually degrading the user experience while quality is probably a bit better (as you noted).
...Isn't that the point? (Score:5, Interesting)
Edwards' blog post argues that since 1996 "almost everything has been cleverly repackaging and re-engineering prior inventions.
There are two pieces to this statement, both of which seem to indicate that Edwards might be right, but that he fails to establish a problem.
vi and emacs (and nano, which is my favorite, go ahead and crucify me now) may need ever-so-slight iterations here and there, but by and large, they're done. They're very small programs that do a very specific thing and are generally optimized to be effective in what they do. They solve a very specific use case and are effective in doing so. Every so often, somebody tries reinventing the wheel with a new text editor, and I do still appreciate the 10,001 different functions of Textpad and Notepad++, but as far as being able to edit config files on CLI-only systems, the problem has been solved. Time to move on to the next problem.
On the flip side, there are no shortage of SaaS applications that are basically "WebUI / Mobile App frontends on a database". Why is this? It's because there are no shortage of Main Street businesses that have slightly different needs, but know their craft. Whether it's the hair stylist or the dentist or the auto mechanic or the dry cleaner or the restaurant or the tutor or the travel agency or the insurance broker...all of them have business needs that can be adequately solved with "a WebUI / Mobile App frontend on a database". Let 'em pay $20/user/month for a service that gets out of their way and lets them focus on styling hair or cleaning teeth or fixing cars or cleaning suits.
Each of these businesses will have slightly different business needs. The insurance broker doesn't need inventory management, but the restaurant does. The hair stylist doesn't need to bill insurance companies, but the dentist does. That MariaDB and jQuery-or-nodeJS-or-whatever can be tweaked slightly with common business logic for the particular disciplines is exactly the point of having both software and developers in the first place. Nothing is stopping the butcher, the baker, or the candlestick maker from rolling their own custom software for the task, but odds are pretty damn good that each of these specialists are far happier avoiding spending a lot of time to create their own just-barely-works implementation (I still have a customer stuck on MS Access 2007 SP2 for this reason - not kidding).
I don't understand why Edwards would have a problem with the fact that we've gotten to the point where a number of readily-made pieces of software exist, and are in a state of sufficient maturity that they can be easily retooled to solve far more problems with far less effort than before. Sure, it's helpful to take a step back from time to time and make sure that we aren't stuck in a "because that's how we've always done it" rut, but vi and emacs and nano aren't going to be readily replaced just because their code bases are relatively old.
Software? Or new languages? (Score:2)
The post's lists include little else besides software development languages and tools. So at best, he is pointing to a stagnation in new programming languages, not stagnation in the entire software world!
Replatforming (Score:2)
While there are structural imbalances between research and product development, the main cause of stagnation is the replatforming treadmill:
Well, we might finally be done with replatforming now and can get on to real work after 25 years.
BTW, asking Google/Siri/Alexa for information by voice, without being at a (desktop) computer, and getting a voice response is Star Trek-like software advancement not considered in 1996 to be feasible within our lif
Re: (Score:2)
and getting a voice response is Star Trek-like software advancement not considered in 1996 to be feasible within our lifetimes.
I don't know why you think that, we already had all the major pieces available in 1996: voice recognition, voice synthesis, and knowledge databases.
Re: Replatforming (Score:2)
See, that is precisely what I mean by touting a batshit-insane degeneration that only exists for flashiness as "innovation".
The "easy" stuff has already been done (Score:2)
There's no real demand for a better text editor, because there are already multiple choices of highly capable text editors. Notepad++, Sublime Text, Visual Studio Code are three popular ones, there are many, many more. Creating yet another new one would be like creating yet another new calculator app. Why? And why would anyone want one?
The same is true for programming languages. It used to be that programming languages only had to handle basic I/O to a few devices. If you wanted to zip or unzip a file, you
I was supposed to get rich quick? (Score:2)
Oh, well. At least I have kids that are still speaking to me. A couple of them.
Re:I was supposed to get rich? (Score:2)
Quick or slow, I was supposed to get rich?
He is not wrong (Score:2)
In particular, CS research is a complete mess. If you cannot assure results fast (and if you can, then it is not research), you do not get funding or you do not get that PhD. Absolute insanity when you look at some of the research being done in other STEM fields. Too many people seem to think that applied CS is now in a final state (when it is anything but) and that the only "research" worth doing is following the next hype.
Expectations of usability (Score:2)
American base assumption (Score:2)
Thinking only in $money and running after get-rich-quick schemes is known as a distinctly American, and to some extent Western, cultural trait.
Sure, the rest of the world doesn't want to be poor either. But that isn't our goal in itself. We want many other things, and money is merely one tool to get them. Any other way is just as fine: friendship, respect, education, social status, etc.
And I figure that's actually kinda true for Americans too, no?
So I don't even follow the basis of this argument. It wasn't get-r
No (Score:2)
It's a troll post. He doesn't even mention breakthroughs in distributed version control, CI, Rust, machine learning, the cloud, etc, that have changed everything.
Re: (Score:2)
It's a troll post. He doesn't even mention breakthroughs in distributed version control,
How is distributed version control impressive? Everyone still just uses a centralized repository and pushes to it.
Loudmouthed asshat (Score:4, Interesting)
What a bizarre metric for "progress" and "stagnation".
instead of making new versions (Score:3)
What is old is new again. (Score:3)
Part of it is that we're just starting to get into ubiquitous computing and what can be done with it.
In 2000 I was using Dragon NaturallySpeaking to issue command strings to my computer. But because the framework on the computer could barely reach beyond itself, I could control the computer and not much else.
Now Alexa is using similar command frameworks to control lights, my Roku, my TV, and to adjust streaming music for whole-house audio, etc.
When every switch, outlet, and light bulb, and most devices, can interact, we will see bigger changes. The merging of multiple OSes into one that can be used on a variety of platforms is beginning (macOS, iOS, watchOS)... soon.
Yes, it has been tried before. But the hardware and software frameworks weren't there yet to make it usable on a large scale.
People are amazed at the Apple M1 processor. But what happens when Apple puts that processor in an iPhone?
hyper-professionalization of Computer Science (Score:5, Funny)
Computer Science has stalled because we've become a cookie-cutter industry where too many people refuse to build something from the ground up, simply because it's either too hard or they don't actually understand what they're working on, so they have to leverage a framework to hide the lack of qualification.
A few years ago I was tasked with writing an RTOS for an embedded testing platform, and instead of trying to roll a custom Linux image and then throw some logic on top, it was simpler to write the RTOS from the ground up, from both a complexity and a time standpoint. About a year after I wrote it (I had moved on from the company, and it was working in production), a new developer took over the project. His first comment to me was: "Why would you roll your own when you could have used Linux?" I explained several times why, outlining exactly how the code worked, and a few months later I was called by the owner because all the production test beds had failed. The problem turned out to be that the person who took over the project felt the pointless need to rebuild the RTOS with Linux and ended up breaking everything. Luckily I had a backup stored and was able to upload the original version, but that developer could not wrap his head around the concept of building something from the ground up.
Around that same time our UX / UI designer was working on rebuilding the company website, and he kept running into issues. I asked him to send me the code and it was just a cluster bomb of bad libraries, craptastic frameworks (jQuery
I'm not going to bother getting into all the examples I have, but needless to say, it's almost always a lack of qualification hidden behind frameworks and libraries, plus enough missing knowledge that it's a stunning example of the Dunning-Kruger effect.
Faulty premise (Score:3)
The Ship of Theseus and Technology. (Score:4, Interesting)
The language, the OS, and the editors (especially the editors) will continue to be called the same while becoming radically different in behavior.
Back in 1999 I bought an editor called SlickEdit for one simple reason: it gave me Emacs keystrokes and captured stdout from Windows programs automatically. I was transitioning from Sun Solaris/HP-UX to Windows after joining the company. I upgraded it regularly and have the latest version now.
The latest version can read an MSDev solution file (.sln), build a symbol table, and do context-sensitive, class-structure-aware labeling of every symbol in the entire solution space: about 12,000 cpp and h files, spread over 40 static libraries and 4 executable targets. Reference lookup, class-structure lookup, static code analysis, customizable macros and keystrokes, step-through debugging via command-line debuggers... you name it, it's got it. But it is still called SlickEdit.
It is a question as old as civilization. If you replace the parts of Theseus' ship one by one, until all the parts are new and none of the old parts survive, is it still Theseus' old ship? If not, at what point did it stop being Theseus' ship? If it is, can you call a new ship built using exactly the same parts Theseus' ship?
How many of you coders ... (Score:3)
Re: (Score:2)
Re: (Score:2)
Swift gets rid of the pointer syntax that remained from Objective-C's heritage.
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
After Swift, what else is needed?
I don't know, but I do know that Apple could re-release Visual Basic under its own brand as replacing Swift, and all the Apple developers would say how great it is and get mad at those who disagreed.
Re: AfterSwift (Score:2)
Observe Haskell, and see that you have been blind.
Re: I built a creimer simulator ten years ago (Score:2)
To be fair, though, who the hell wants to be friends with ... humans?
That's like someone in the 14th century going "Hey, let's be friends with Yersinia pestis!".
Re: TFA missed a few major software advancements (Score:2)
Lol.
Why does your list contain so much cancer?
Node.js? Are you on drugs?