Programming

Are We Experiencing a Great Software Stagnation? (alarmingdevelopment.org) 286

Long-time programmer/researcher/former MIT research fellow Jonathan Edwards writes a blog called "Alarming Development: Dispatches from the User Liberation Front."

He began the new year by arguing that software "is eating the world. But progress in software technology itself largely stalled around 1996." Slashdot reader tonique summarizes Edwards' argument: In 1996 there were "LISP, Algol, Basic, APL, Unix, C, Oracle, Smalltalk, Windows, C++, LabView, HyperCard, Mathematica, Haskell, WWW, Python, Mosaic, Java, JavaScript, Ruby, Flash, Postgress [sic]". After that we're supposed to have achieved "IntelliJ, Eclipse, ASP, Spring, Rails, Scala, AWS, Clojure, Heroku, V8, Go, React, Docker, Kubernetes, Wasm".

Edwards' main thesis is that the Internet boom around 1996 caused this slowdown because programmers could get rich quick. Smart, ambitious people moved to Silicon Valley and founded startups, but you can't do research at a startup due to time and money constraints. Today only "megacorps" like Google, Facebook, Apple and Microsoft are supposedly able to do relevant research, because of their vast resources.

Computer science wouldn't help, either, because "most of our software technology was built in companies" and because computer science "strongly disincentivizes risky long-range research". Further, according to Edwards, the aversion to risk and the "hyper-professionalization of Computer Science" are part of a larger, worrisome trend throughout the whole field and all of Western civilisation.

Edwards' blog post argues that since 1996 "almost everything has been cleverly repackaging and re-engineering prior inventions. Or adding leaky layers to partially paper over problems below. Nothing is obsoleted, and the teetering stack grows ever higher..."

"[M]aybe I'm imagining things. Maybe the reason progress stopped in 1996 is that we invented everything. Maybe there are no more radical breakthroughs possible, and all that's left is to tinker around the edges. This is as good as it gets: a 50 year old OS, 30 year old text editors, and 25 year old languages.

"Bullshit. No technology has ever been permanent. We've just lost the will to improve."
  • Frameworks (Score:5, Informative)

    by theshowmecanuck ( 703852 ) on Sunday January 03, 2021 @12:19AM (#60889376) Journal
    Everyone uses frameworks. By their nature it means nobody is doing anything really new, unless they are creating a new framework. But most people use the same one, and all that happens with those are incremental changes to CRUD features. People find different ways to use the frameworks but mostly they use frameworks. So little innovation.
    • s/"same one"/"same ones"/
    • Mostly. There are still people who write without frameworks for various reasons, like efficiency or security, on relatively small projects. I agree, though, that using frameworks is far more common.
    • by raymorris ( 2726007 ) on Sunday January 03, 2021 @01:32AM (#60889596) Journal

      That's all that most "programmers" were taught.

      If you pick any random subject, whether that be music, geology, chemistry, or whatever, and open any chapter in any textbook, you'll find they generally all start out the same: by teaching you the vocabulary of the subject. A book or video teaching music composition may start by defining melody, harmony and rhythm, then teach a bunch about each of those topics. A later book may define terms like alla breve and fermata before teaching about those subjects.

      It's similar with, say, a chemistry book, which will define vocabulary words like "organic chemistry" and "pH", because you have to know the vocabulary before you can learn the field.

      In software engineering we have vocabulary words like foreach and struct. You need to know the terms before you can learn the craft, like anything else. But somewhere along the way, people who apparently don't know about software engineering got the idea that if you teach someone a programming language - the vocabulary of programming - then that's it: they are a qualified software engineer.

      In any other field, we know better. We know that just because someone knows the language of auto mechanics - vocabulary words like "engine" and "exhaust" - that doesn't make them a qualified mechanic, a good mechanic. Knowing the vocabulary of doctors doesn't make you a doctor; learning the language is a prerequisite to learning to be a doctor.

      Yet we think that someone is a software engineer because they know what "declare Age int" means - though they don't even know what the word "engineering" means! We teach programming languages, and completely fail to teach the art and science of programming, or of software engineering.

      That's like expecting that teaching someone the language used by Maya Angelou will make them a poet. Nope: knowing the language (English, C#) is a prerequisite to learning to be a poet or a software engineer.

      We do teach some computer science. But computer science is to building information systems as physics is to designing and building bridges. We need to teach information systems design.

      • Everyone codes poorly, but everyone is a critic, too. In a sense, every person on the planet acts poorly, and then they die. But this is a depressing view of life and doesn't capture the truth.

        Computers are highly deterministic, very precise machines, and as human beings we shouldn't try to compare ourselves to them. Rather, we should accept that computers allow us to make mistakes and to see them, and enable us to solve problems we didn't even know we had. We then don't stop learning after we've left...

    • Frameworks.
      Why not?
      Try machine learning.
      And think of your product as just part of a system.
      Things get done differently then.

    • by BAReFO0t ( 6240524 ) on Sunday January 03, 2021 @02:19AM (#60889658)

      The problem with frameworks, as opposed to libraries, is that a library gets integrated into your program, while a framework wants you to integrate into *its* program.
      This leads to monolithism, because you cannot have any other framework beside it.

      I think both frameworks and monolithism are software design anti-patterns. But hey, it's not like you cannot go even worse, and have an entire OS ("platform") to run on... on top of another OS... Right Chrome? Right Mozilla? ;)
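
      The usual name for that distinction is inversion of control: your code calls a library, but a framework calls your code. A minimal Python sketch using only the standard library (json standing in for any library, http.server for any framework):

          import json
          from http.server import BaseHTTPRequestHandler, HTTPServer

          # Library: your code stays in control and calls in when it wants.
          payload = json.dumps({"status": "ok"})

          # Framework: you hand over main() and it calls *your* code back.
          class Handler(BaseHTTPRequestHandler):
              def do_GET(self):                       # invoked by the framework
                  self.send_response(200)
                  self.send_header("Content-Type", "application/json")
                  self.end_headers()
                  self.wfile.write(payload.encode())

          if __name__ == "__main__":
              # serve_forever() owns the control flow from here on
              HTTPServer(("", 8080), Handler).serve_forever()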

    • Re:Frameworks (Score:4, Interesting)

      by Z00L00K ( 682162 ) on Sunday January 03, 2021 @02:42AM (#60889708) Homepage Journal

      I came here to write this - frameworks are convenient, but they don't add any new creative solutions and sometimes even prevent fresh new ideas.

      At the same time, a framework is a great security risk, because once a system is developed and deployed, it requires a lot of testing and re-testing whenever the framework is updated. A delivered solution has a lifetime of maybe 10 years (some shorter, some longer), and the customer is not prepared to invest much money into it once delivered, as long as it works. This means that security updates aren't always put into place. A framework may also have had a major version update that breaks backward compatibility, so it's not possible to get the necessary security fixes installed without a total rewrite - the customer may decide to run with what works and hope for the best. This often works well for internal systems, and for a few systems on the internet that don't utilize the compromised functionality.

      But if a framework is included from a remote server on the net, then a web site can go down, stop working, or give strange browser errors for unexplained reasons when that framework is updated or the framework's hosting site goes down. It can be a disaster for the customer when that happens: the developers that were involved have moved on to greener pastures and nobody has a clue about what happened. This is not unusual with client-side JavaScript, the kind that ad servers provide.

      Frameworks also have the disadvantage of adding performance overhead, much like using an 18-wheeler to transport a single toy dump truck: https://i.redd.it/szerhxj9grhz.jpg [i.redd.it]

      • I came here to write this - frameworks are convenient, but they don't add any new creative solutions and sometimes even prevent fresh new ideas.

        While "creativity" and "new fresh ideas" may be useful when writing computer games or other consumer software, they are unnecessary and can be dangerous when writing system-critical or safety-critical software. That's why the great majority of such systems are still written in COBOL, even though it is 60 years old and was considered awkward and boring as soon as it was standardised.

        If I fly on an aircraft controlled by computers, or have my money handled by a financial system, I positively want those computers...

      • Frameworks also have the disadvantage of adding performance overhead
        That is unlikely.
        If you don't use a framework, you more or less write ghat code yourself. How that would result in faster code is beyond me. Not to mention the coding time.

        Unfortunately, many frameworks these days are influenced by inexperienced hobbyists, so there may be flaws - you are right there. But bottom line: if you write something from scratch in an area you are inexperienced in, you will probably produce similar flaws.

        • Sometimes "ghat" code is much lighter-weight. A few lines of awk or nroff can often replace entire suites of Python, Perl, Java, Rust, or Ruby modules to accomplish well-defined small tasks.
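
          For a concrete (hypothetical) example, summing the third column of a CSV needs one line of awk and no modules at all:

              awk -F, '{ sum += $3 } END { print sum }' data.csv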

    • Screw thread sizes. Machine tools that have standard gauges. Standard voltages on batteries. Standard AC frequency and voltage. Standard gauges for wire, thread, pipes.

      I can go on and on and on.

      These things, and, notably, the manufacturing industry behind them, are all gigantic frameworks with things that nobody wants to reinvent. All large systems consist of companies working together and things coordinated by industry bodies like ISO, ITU and IEEE.

      When you buy a car you're buying into a system, since you can...

      • You're talking about standards for compatibility and interoperability. Your examples are more like saying all web servers speak HTTP or every programming language in the universe has a FFI to call C code. That doesn't mean you have to use the same web server or programming language for everything.

  • by grasshoppa ( 657393 ) on Sunday January 03, 2021 @12:22AM (#60889384) Homepage

    Seriously, not only is modern-day UI/UX absolute garbage, it's actually getting worse. There is a huge need for sanely developed interfaces, but it seems as though every new version of whatever software package you'd care to name just gets worse.

    So hopefully someone other than myself realizes this and we start to see well-designed user interfaces make a comeback. That charge won't be led by Microsoft, or Google... or even Apple. It'll have to be someone small enough that they can throw out the inertia that got us here.

    • by nbzso ( 7592526 ) on Sunday January 03, 2021 @01:03AM (#60889498)
      I am a UI designer. Actually, you can blame Microsoft and the invention of the flat Metro UI. Microsoft was never able to do something meaningful in this field; imagine my shock when Apple and Google adopted the idea. Yes, skeuomorphic UI was killed prematurely. We now have computers capable of real-time 3D that visualize interface abstractions with flat icons and associate files with folders. The desktop paradigm is repeated to death and the user is treated like a dumb kid. Mobile interfaces have the interaction model of the web from 1995. All for money, because it's cheaper to build software without design, and every programmer hates the idea of adopting something new. There is widespread hate for design now; designers who are capable of change are not welcome in the age of Figma, prebuilt design systems, and hipsters decorating and posting two paragraphs on Medium about something UX-related. I don't see a path to change other than academia; as usual, big corp will steal it if users are willing to adopt it. Hey, if you ask Musk, the interface in a Tesla is temporary, until the human element is removed from control :) So, a brave new world built by introverts with a god complex.
      • The thing I would point out is the obvious and unavoidable intersection between the glacially slow pace of human evolution (since this sets hard limits on how adaptable humans are) and the rapid pace that software developers are accustomed to.

        Throw into that the concept of Pareto optimality, from the resulting constrained solution space, and you end up with the inevitable situation where that rapid pace must grind to a halt.

        The question is "Have we reached that?", or are we just trapped in a local maxima...

        • ["Maxima" is the plural, by the way]

          Well, both, kinda.
          It always depends on the box you are willing to think outside of.

          Sure, there is nothing stopping us from building a real, actual AI and telepathically communicating with it, including choosing to experience it as hallucinations, so that even concepts we have no words or images for can be transported. (In theory, neurons have an electric field that can be both read and manipulated.)
          But as you can clearly see, there's a teensy hindrance there in terms of...

      • by Z00L00K ( 682162 )

        The flat Metro UI is actually giving me the feeling of going back to Windows 2.x.

      • by Miamicanes ( 730264 ) on Sunday January 03, 2021 @03:33AM (#60889794)

        Two things killed good UIs:

        1. Tablet fever. More precisely, the perceived need to be able to run a UI on hardware slower than a 10 year old mid-range laptop... in a way that's finger-friendly. IMHO, Aero Glass was the pinnacle of Windows' UI... everything from that point onward has been a step backwards. I have a goddamn computer with more computing power than the sum total of every computer that ever existed prior to 1970, with a graphics card capable of realtime hardware-accelerated raytracing, and thanks to goddamn tablets... a UI that looks like a higher-res version of Windows 2.0 with dropshadows.

        2. Dark patterns... explicit attempts to trick users into clicking things they don't want to click, and manipulate users into navigating a specific way by hiding the way to follow the navigation path the user WANTS to take. Perfect example:

        Drop everything and breathlessly upgrade to Windows 10 RIGHT FUCKING NOW?!?

        [ YES!!! RIGHT NOW!!! ]

        [ No, force a reboot and install it in 10 minutes. ]

        (tiny text that says something like "copyright information and legal disclaimers" that doesn't appear to be clickable... but if you blindly click it anyway, then scroll 80% of the way down, there's a tiny link hidden in section 5, paragraph 3(b)(ii) to cancel the installation, under the heading "Beware of the tiger!")

        3... Oh, and don't even get me STARTED on insane UI designers who think medium-gray 6pt text on a low-contrast mauve background, or displaying 40 normal pages of text 3 lines at a time (with some tedious, convoluted gadget for advancing through it) is actually good.

        Modern web UI design really, really makes me nostalgic for CSS 1.0 with IE5's semi-proprietary extensions (like hover)... enough functionality to transcend bold, italic, and underline, but still limited enough to encourage developers to make clickable links LOOK like clickable links, and pages with meaty content you could actually print without ending up with 47 printed pages that have one sentence apiece.

        • by sinij ( 911942 )
          Reading your post fills me with rage, because of how spot-on your points are.

          One point to add to your list: UIs intentionally blurring the lines between web applications and desktop applications. A prime example is the Office 365 UI, where they use a flat white ribbon so it looks as if you are accessing it via a web browser.

          I think UI designer should be added to the 'most hated professions' list of 2020, right behind predatory debt collectors and parking-ticket rent-a-cops.
    • Agreed. Modern UIs often seem to be built just for the sake of novelty - product version 12 must have a *new*, *different* interface than version 11.
      • That is mostly only true if you treat software like a business, and not like something like Linux.
        Because then you *have* to keep finding reasons for people to give you even more money, and things for your employees to do. Even though it's already good enou . . . OOOOH . . .

        I just realized: The entire damn industry is bullshit jobs [wikipedia.org] by now!

        Seriously... Don't treat creative work of any kind like a business, people! It's mutually exclusive with target group maximization and profit maximization, by definition.

        • Haven't Linux desktops also been infected with new UIs? At least in Linux you can turn them off, but the default desktop on major distributions like Ubuntu has changed, and to my mind, not for the better. It's possible those changes are driven by business interests - I don't know the Linux business landscape very well.
  • by kwerle ( 39371 ) <kurt@CircleW.org> on Sunday January 03, 2021 @12:24AM (#60889386) Homepage Journal

    Isn't AI supposed to be the new hotness? That's what all the kids are saying.

    And have been since the 60's?

    • Isn't AI supposed to be the new hotness?

      When people say "AI" today, they are usually referring to machine learning using deep ANNs. ANNs are useful for solving only a narrow set of problems. They are not a replacement for general-purpose programming.

      • They usually mean, basically, a tensor of weights that transforms an input data set into an output.
        I always say that "universal function" is a better name: a function for stuff that you do not understand, or are too incompetent to code up, but want done anyway, as long as it gets it mostly right.

        I think that just means that one is a bad programmer/scientist.
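
        A minimal numpy sketch of that "tensor of weights" view (the weights below are random placeholders standing in for trained ones):

            import numpy as np

            rng = np.random.default_rng(0)
            # Placeholder "trained" weights: input 4 -> hidden 8 -> output 2.
            W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
            W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

            def universal_function(x):
                h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
                return h @ W2 + b2                # output is whatever the weights encode

            print(universal_function(np.ones(4)))  # input -> output, no explicit logic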

        • by ShanghaiBill ( 739463 ) on Sunday January 03, 2021 @02:55AM (#60889732)

          I think that just means that one is a bad programmer/scientist.

          Can you write a program to look at pixels and determine if a photo is of a dog or a cat? Can you be done with your program in an hour?

          If you can't, are you a bad programmer?

        • by Ambassador Kosh ( 18352 ) on Sunday January 03, 2021 @05:57AM (#60890052)

          For regression, we have a mathematical basis for what neural networks are doing. For most types of systems there exists a piecewise polynomial approximation to the solution. In 2D we can use a spline; in 3D splines really start to break down; in 4D+ they pretty much stop working. You can still use an ANN for the problem, though, and can put bounds on its mathematical accuracy.

          This is why there has been so much success with n-body simulations, fluid mechanics, material properties, etc. with neural networks. Solving the actual equations is extremely time-consuming, but on any given bounded area there must exist a piecewise polynomial approximation to arbitrary accuracy, and an ANN is a good way to build one.

          Neural networks for classification, though... there is not much theory to back that up, which is also why they get confused and often screw up badly.
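
          To put rough numbers on the spline claim (a back-of-the-envelope sketch, assuming a tensor-product spline with 20 knots per axis):

              # Tensor-product spline: k knots per axis needs k**d coefficients.
              k = 20
              for d in (2, 3, 4, 8):
                  print(f"{d}D grid: {k**d:,} coefficients")

          At 8 dimensions that is already 25,600,000,000 coefficients, which is exactly where an ANN becomes the practical way to build the approximation.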

    • by Tablizer ( 95088 )

      "Plastics!"

    • It's not AI unless it won't let you turn it off.

  • There's been a great deal of innovation there. Bitcoin unfortunately gets about 99% of the press though, so you don't hear so much about the smaller teams working on cutting edge stuff unless you really look for it.
    • Which returns to the issue of momentum others mention. Blockchains can have real utility for supply-chain authenticity, but those uses are less about pure currency and more about a market with certain consumers and producers. Brave's work is fascinating too. However, in virtually all these markets we have to break from conventions that have had years or decades to become ingrained.

  • In a way, we *have* invented everything, when it comes to software. Turns out, there aren't any shortcuts to anything, and actual, by-the-book Artificial Intelligence is still a pipe dream.

    The advances we have had in the past 10 years, the *real* advances, have all been thanks to doing statistical analysis on ENORMOUS datasets built from user-provided data. We didn't have truly useful voice recognition until Google and Apple started training on huge piles of voice data. Predictive keyboards on your phone are...

    • The field of math is where our innovations come from.

      • *ba-dum TISS*

        Math without verification of its usefulness via observations in the real world is not even science, but somewhere between religion and metaphysical navel-gazing.
        You have to strongly separate those two sides of math. Sadly, that rarely happens, and it is drowned in arrogance instead.

      • Yeah, there is nothing that software can do that I can't do with discrete logic elements, e.g. transistors.

        And the laws of physics haven't changed in the last couple decades.

        So it seems if progress has slowed, there must be a lack of new use cases to implement solutions for.

        If there is new math, it won't help programming one squat, because the new math is never a new algebra or new statistics; it is always some kind of calculus that has to be carefully reduced to steps by a human, even including some trial and error...

    • The advances we have had in the past 10 years, the *real* advances, have all been thanks to doing statistical analysis on ENORMOUS datasets built from user-provided data. We didn't have truly useful voice recognition until Google and Apple started training on huge piles of voice data.

      Lolwut.

      No, the advance we had is that someone figured they'd make products based on that training available to the general public "for free". That's what Apple and Google have done.

      Plenty of others have been working successfully...

  • by LarryBanks ( 7394048 ) on Sunday January 03, 2021 @12:28AM (#60889398)
    All I know is that my Pentium II PE (400 MHz) from 1999 with 192MB of RAM seemingly did the same things as my current i9 with 16GB... surf the web, internet chats, MS Office, email, etc... but it was able to do it all with a LOT less!!! Seriously - the size of apps has gotten out of hand!!! How many GB is Office now? How much RAM does Outlook take to sit there and do nothing??? Why is Calibre (eBook reader software) 380 MB??? No question that the hardware has gotten much more powerful (and readily available), but the functional utility enabled by it seems seriously lacking... It seems like we've packed on framework after framework that over-bloats everything, such that even word processing now takes GBs of RAM and disk! Back in the day, they drilled us on stuff like computability, complexity, Big-O, etc... I guess today's crowd doesn't care about stuff like this anymore - it's too easy to just toss in a boatload of frameworks to do simple stuff these days...
    • I think the issue is the user space. With the backend, you still often care about complexity, and patterns like REST seem rather compact, but it's the way these services are rolled up into user-facing websites. Everything needs to be flashy, and simplicity is easily exchanged for UI that looks alright but is functionally lacking. And if you change your UI, better be prepared to keep a classic mode around for some time for the bitter vets. In a lot of ways, efficiency on the backend pays dividends...

    • What happened, it seemed, was that people confused server side software with desktop software and thought that, because companies were more willing to throw money at more and more hardware, the same development practices could apply to the desktop.
    • Well, we got HiDPI and mixed DPI and all sort of fractional scaling to deal with in desktop software nowadays. So there ought to be some extra bloat.

      In Win98 days, too many windows / UI controls / buttons could hang the OS. Excel spreadsheets were limited to 65536 rows and 256 columns. Files and software could only contain one extra language other than English, and multiple language families could not co-exist in the same environment / document. Emoji were emulated by special font files and could not be stored...

    • Weird, the old software I wrote all still uses the same number of ops to run as it did when I first compiled it.

  • How the fuck did Heroku make the list when TikTok didn't?
  • by BrookHarty ( 9119 ) on Sunday January 03, 2021 @12:31AM (#60889404) Journal

    Used to be able to go see what new software people released in open source; it was fun. Now most forums don't want people posting about new software.

    I think there is new software out there, but people don't know it exists.

    • There's trending repos [github.com] on github
    • by NateFromMich ( 6359610 ) on Sunday January 03, 2021 @12:40AM (#60889428)

      Used to be able to go see what new software people released in open source; it was fun. Now most forums don't want people posting about new software.

      I think there is new software out there, but people don't know it exists.

      Former Tucows guy here. They closed our office in 2001. It was kind of hard to make money off internet ads after the dot com bubble burst.
      I think they kept it going with some sort of skeleton staff in the Toronto office for a while.

      • Tucows (The Ultimate Collection of Winsock Software) was awesome in the '90s! One of the many useful sites from back then that I miss.
  • ...and the Jai programming language. Although it hasn't been released yet, it holds promise for some developers to blow out a lot of cruft. I was pondering whether it could be considered part of a new movement, and if so, what that movement should be called.

    There is plainly a yearning among many developers, as expressed by TFS, to dump the baggage overboard. In hindsight this movement might be known as the "great flattening" or "the great delete"; but if it is even to become a movement, it seems to have barely begun...

    • WASM is a thing but it's still not the standard

      It is literally a standard now :) but companies like Apple want to scuttle it, because it allows non-App Store software to run on their phones. And I do mean their phones - people with iPhones are just renting them.

    • and now WASM is a thing but it's still not the standard.

      WASM is a standard, but it still doesn't have access to the DOM. After that it will be ready.

  • I switched companies recently and my main computer now uses a different operating system with a different IDE, and I'm programming in a different language. However, the IDE has a plugin to emulate the text editor I learned 30 years ago (and still use regularly), and it works great - it's made the change so much easier. Being able to use all those memorized key commands is a lifesaver.
    • Yeah, the text editor part of the argument certainly seems weak. I'm not sure what one would expect to change, except for stuff that's under the hood (e.g. multibyte character handling).

    • The problem with a 30-year-old text editor is that it cannot handle 32-bit, let alone 64-bit, OSes. Or that it cannot load files above a certain length (somewhat shorter than a full book). I mean, I could go on. But you meant "what's wrong with a 30-year-old UX". And the answer to that is: while there's nothing wrong with most of it, there have been innovations. Autocomplete actually works well. Syntax highlighting. More importantly, when you type a function name in an IDE it can actually pull and display the com...

  • Nothing is obsoleted

    That will help - rewriting old software that already works in a new oxidation-based language will surely get us out of stagnation.

    and the teetering stack grows ever higher..."

    Yeah, because it turns out most people don't like things that work breaking for no good reason. Of course, what we need here is NOT innovation - just the time and money to do the things that need to be done: code cleanups, refactoring, hardening, using more modern techniques where it makes sense. Many things should be discarded, but they still need to be replaced with something...

  • Causes (Score:4, Insightful)

    by rtkluttz ( 244325 ) on Sunday January 03, 2021 @12:56AM (#60889476) Homepage

    It's because all of the development since approximately that time frame has started going into DRM and other software development that is designed to STOP you from doing things instead of enabling you. Up until shortly after 1995 there was no real thought put into stopping people from doing whatever they wanted with their computers.

  • by mykepredko ( 40154 ) on Sunday January 03, 2021 @12:56AM (#60889480) Homepage

    Using Edwards' approach, I could also argue that
    - Architecture
    - Aeronautics
    - Automobiles
    - Motors (ICE and Electric)
    etc.

    are all stagnating because the tools aren't changing - we use the same tools (hammer, nails, saw, block and tackle, etc.) to build houses as were used two thousand years ago. They've definitely been improved, but they still work in the same basic ways.

    When I look through Edwards' lists, he primarily lists programming languages and a couple of IDEs, with Windows thrown in (and, despite what you think of Microsoft, it's disingenuous to say that it has stagnated). I wouldn't expect those tools to fundamentally change what they do, but what they're being used for is radically different.

    Edwards discusses machine learning as something new, but there have been huge advances across many application disciplines that weren't thought possible in 1996, which makes me feel like he's missing the big picture: we're using established, mature tools to create the products of the future.

    • I think I would argue they're also stagnating. There really haven't been any seriously groundbreaking developments in those fields, and so the tools they use don't really need to be groundbreakingly innovative.

      The physical engineering fields are all stagnating because there are no groundbreaking developments in materials that are also capitalistically viable. No new materials, no need for drastically different tools, because the things that can be done with existing materials have been exhausted.

      Think...
    • I would argue that we still don't have a language that does code organization well.

    • Well, in motors, the new generation of reluctance motors requires very, very expensive software to calculate the rotor configuration, because Maxwell's equations are great and all, but a real bear to apply to a real motor design. And you can't easily build a reluctance rotor without calculating the changing flux path.

      An induction rotor is more straightforward: you only need to calculate various totals, and the phase shift in the rotor. The flux flow within the rotor is obvious.

      With a BLDC or any sort of synchronous...

    • we use the same tools (hammer, nails, saw, block and tackle, etc.) to build houses as were used two thousand years ago.

      I mean, not really. A modern construction crew uses electric and pneumatic tools, and there have been advances there. Hell, even modular metal scaffolding is a fairly recent (post-WWII) innovation. Whole-house sheet wrapping is now a concept (although it involves new ways of designing roofs and overhangs). Hell, even wood is going through a relatively recent innovation cycle, as di...

  • An application's quality often forms a parabola when graphed over time.

    For example:

    The first release comes out, and it's full of promise, but its shortcomings are glaringly obvious to all, so when the second release comes out with most of them addressed, it's an undeniable improvement.

    Then the third release comes out a bit later, and cleans up the remaining major shortcomings and a lot of minor issues besides.

    The fourth release cleans up the last of the minor quibbles, and adds a few new features...

    • Interesting. Personally, I'd argue that the third release would generally be the best: by the fourth release, the developers are starting to look for features to add, and they lose the thread of what the app does or how the UI is set up, so the new features actually degrade the user experience, even while quality is probably a bit better (as you noted).

  • by Voyager529 ( 1363959 ) <.voyager529. .at. .yahoo.com.> on Sunday January 03, 2021 @01:02AM (#60889496)

    Edwards' blog post argues that since 1996 "almost everything has been cleverly repackaging and re-engineering prior inventions."

    There are two pieces to this statement, both of which seem to indicate that Edwards might be right, but that he fails to establish a problem.

    vi and emacs (and nano, which is my favorite, go ahead and crucify me now) may need ever-so-slight iterations here and there, but by and large, they're done. They're very small programs that do a very specific thing and are generally optimized to be effective in what they do. They solve a very specific use case and are effective in doing so. Every so often, somebody tries reinventing the wheel with a new text editor, and I do still appreciate the 10,001 different functions of Textpad and Notepad++, but as far as being able to edit config files on CLI-only systems, the problem has been solved. Time to move on to the next problem.

    On the flip side, there is no shortage of SaaS applications that are basically "WebUI / mobile app frontends on a database". Why is this? It's because there is no shortage of Main Street businesses that have slightly different needs but know their craft. Whether it's the hair stylist or the dentist or the auto mechanic or the dry cleaner or the restaurant or the tutor or the travel agency or the insurance broker... all of them have business needs that can be adequately solved with "a WebUI / mobile app frontend on a database". Let 'em pay $20/user/month for a service that gets out of their way and lets them focus on styling hair or cleaning teeth or fixing cars or cleaning suits.

    Each of these businesses will have slightly different business needs. The insurance broker doesn't need inventory management, but the restaurant does. The hair stylist doesn't need to bill insurance companies, but the dentist does. That the MariaDB and jQuery-or-nodeJS-or-whatever can be tweaked slightly with common business logic for the particular discipline is exactly the point of having both software and developers in the first place. Nothing is stopping the butcher, the baker, or the candlestick maker from rolling their own custom software for the task, but odds are pretty damn good that each of those specialists is far happier avoiding spending a lot of time creating their own just-barely-works implementation (I still have a customer stuck on MS Access 2007 SP2 for this reason - not kidding). A sketch of the pattern is below.
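
    The whole pattern fits on a page. A minimal sketch of such a backend with Flask and sqlite3 (table and field names are hypothetical stand-ins for any of those businesses):

        import sqlite3
        from flask import Flask, jsonify, request

        app = Flask(__name__)
        DB = "shop.db"  # hypothetical database file

        def db():
            conn = sqlite3.connect(DB)
            conn.row_factory = sqlite3.Row
            return conn

        with db() as conn:  # one-time setup
            conn.execute("CREATE TABLE IF NOT EXISTS appointments "
                         "(id INTEGER PRIMARY KEY, customer TEXT, time TEXT)")

        @app.route("/appointments", methods=["GET"])
        def list_appointments():
            rows = db().execute("SELECT * FROM appointments").fetchall()
            return jsonify([dict(r) for r in rows])

        @app.route("/appointments", methods=["POST"])
        def add_appointment():
            with db() as conn:
                conn.execute("INSERT INTO appointments (customer, time) VALUES (?, ?)",
                             (request.json["customer"], request.json["time"]))
            return "", 201

        if __name__ == "__main__":
            app.run()

    Swap the table for inventory, invoices, or bookings and you have the next vertical's product: the business logic differs, the shape does not.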

    I don't understand why Edwards would have a problem with the fact that we've gotten to the point where a number of readily-made pieces of software exist, and are in a state of sufficient maturity that they can be easily retooled to solve far more problems with far less effort than before. Sure, it's helpful to take a step back from time to time and make sure that we aren't stuck in a "because that's how we've always done it" rut, but vi and emacs and nano aren't going to be readily replaced just because their code bases are relatively old.

  • The post's lists include little else besides software development languages and tools. So at best, he is pointing to stagnation in new programming languages, not stagnation in the entire software world!

  • While there are structural imbalances between research and product development, the main cause of stagnation is the replatforming treadmill:

    1. DOS -> Windows
    2. Windows -> web
    3. web -> mobile

    Well, we might finally be done with replatforming now and can get on to real work after 25 years.

    BTW, asking Google/Siri/Alexa for information by voice, without being at a (desktop) computer, and getting a voice response is Star Trek-like software advancement not considered in 1996 to be feasible within our lifetimes.

    • and getting a voice response is Star Trek-like software advancement not considered in 1996 to be feasible within our lifetimes.

      I don't know why you think that; we already had all the major pieces available in 1996: voice recognition, voice synthesis, and knowledge databases.

    • See, that is precisely what I mean by touting a batshit-insane degeneration that only exists for flashiness as "innovation".

  • There's no real demand for a better text editor, because there are already multiple choices of highly capable text editors. Notepad++, Sublime Text, Visual Studio Code are three popular ones, there are many, many more. Creating yet another new one would be like creating yet another new calculator app. Why? And why would anyone want one?

    The same is true for programming languages. It used to be that programming languages only had to handle basic I/O to a few devices. If you wanted to zip or unzip a file, you...
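
    Whereas today the batteries are included. For instance, in Python (file names hypothetical):

        import zipfile

        # Zip and unzip with nothing but the standard library.
        with zipfile.ZipFile("archive.zip", "w", compression=zipfile.ZIP_DEFLATED) as z:
            z.write("notes.txt")          # assumes notes.txt exists alongside

        with zipfile.ZipFile("archive.zip") as z:
            z.extractall("restored/")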

  • Oh, well. At least I have kids that are still speaking to me. A couple of them.

  • In particular, CS research is a complete mess. If you cannot assure results fast (and then it is not research), you do not get funding, or you do not get that PhD. Absolute insanity when you look at some of the research being done in other STEM fields. Too many people seem to think that applied CS is now in a final state (when it is anything but) and the only "research" worth doing is following the next hype.

  • Back in the '90s, apps were absolute garbage compared to anything available these days. The average person was writing for a heavily tech-savvy audience. These days, even professional users are so spoiled by fancy IDEs and tools that anything small development teams can put together for free is generally garbage in comparison. I mean, how many people remember the quality of software back in the days of Windows 95?
  • Thinking only in money and chasing get-rich-quick schemes is known as a distinctly American, and to some extent Western, cultural trait.
    Sure, the rest of the world doesn't want to be poor either. But that isn't our goal in itself. We want many other things, and money is merely one tool to get them. Any other way is just as fine: friendship, respect, education, social status, etc.
    And I figure that's actually kinda true for Americans too, no?

    So I don't even follow the basis of this argument. It wasn't get-rich-quick...

  • by roca ( 43122 )

    It's a troll post. He doesn't even mention breakthroughs in distributed version control, CI, Rust, machine learning, the cloud, etc., that have changed everything.

    • It's a troll post. He doesn't even mention breakthroughs in distributed version control,

      How is distributed version control impressive? Everyone still just uses a centralized repository and pushes to it.
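
      That's a workflow choice rather than a limit of the tool, though: every clone is a complete repository. A quick sketch (paths hypothetical):

          # No central server required: clone straight from a colleague's copy...
          git clone /home/alice/project mycopy
          cd mycopy
          git log            # full history, available offline
          # ...and pull from any other peer's clone just as easily.
          git pull /home/bob/project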

  • Loudmouthed asshat (Score:4, Interesting)

    by nyet ( 19118 ) on Sunday January 03, 2021 @03:47AM (#60889814) Homepage

    What a bizarre metric for "progress" and "stagnation".

  • by FudRucker ( 866063 ) on Sunday January 03, 2021 @09:12AM (#60890476)
    Enough with new features: let's make the stuff we already have work better together. Too many library version conflicts and abandoned headers & includes cause packages to break, so the next time you upgrade to the next version of your favorite Linux distro, all the out-of-tree software you liked to build and run won't work anymore.
  • by peragrin ( 659227 ) on Sunday January 03, 2021 @09:59AM (#60890584)

    Part of it is that we are just starting to get into ubiquitous computing and what can be done with it.

    In 2000 I was using Dragon NaturallySpeaking to issue command strings to my computer. But because the framework on the computer could barely reach beyond itself, I could control the computer and not much else.

    Now Alexa is using similar command frameworks to control lights, my Roku, and my TV, and to adjust streaming music across whole-house audio, etc.

    When every switch, outlet, and light bulb, and most devices, can interact, we will see bigger changes. The merging of multiple OSes into one that can be used on a variety of platforms is beginning (macOS, iOS, watchOS), soon...

    Yes, it has been tried before, but the hardware and software frameworks weren't there yet to make it usable on a large scale.

    People are amazed at the Apple M1 processor. But what happens when Apple puts that processor in an iPhone?

  • by Murdoch5 ( 1563847 ) on Sunday January 03, 2021 @11:19AM (#60890764) Homepage
    The "hyper-professionalization of Computer Science" is exactly what the problem is! When we're spending time arguing about whether Master/Slave should be replaced with Primary/Replica, or whether Blacklist/Whitelist are acceptable terms, what kind of real progress are you going to make? I wouldn't say we've been stalled since 1996 - I think that's premature - but in the last couple of years everything has moved from roll-your-own to the use-a-framework-and-library combo.

    Computer Science has stalled because we've become a cookie-cutter industry where too many people refuse to build something from the ground up, simply because it's either too hard or they don't actually understand what they're working on, so they have to leverage a framework to hide the lack of qualification.

    A few years ago I was tasked with writing an RTOS for an embedded testing platform, and instead of trying to roll a custom Linux image and then throw some logic on top, it was simpler to write the RTOS from the ground up, from both a complexity and a time standpoint. About a year after I wrote it - I had moved on from the company, and it was working in production - a new developer took over the project. His first comment to me was: "Why would you roll your own when you could have used Linux?" I explained several times why, outlining exactly how the code worked, and a few months later I was called by the owner because all the production test beds had failed. The problem ended up being that the person who took over the project felt the pointless need to rebuild the RTOS on Linux and ended up breaking everything. Luckily I had a backup stored and was able to upload the original version, but that developer could not wrap his head around the concept of building something from the ground up.

    Around that same time, our UX/UI designer was working on rebuilding the company website, and he kept running into issues. I asked him to send me the code, and it was just a cluster bomb of bad libraries, craptastic frameworks (jQuery... why does it even exist?), and enough slop CSS to make pigs happy. I recommended he start over and do the job properly without relying on his library/framework soup, which he wasn't even using correctly and didn't need. His comment to me was: "No, this is how you do it; you never want to write new code when you can use libraries." Today, about a year later, our site is functional but unmaintainable, and is truly a teachable example of how not to code.

    I'm not going to bother getting into all the examples I have, but needless to say, it's almost always a lack of qualification hidden behind frameworks and libraries - a gap in knowledge stunning enough to be a textbook example of the Dunning-Kruger effect.
  • by MakerDusk ( 2712435 ) on Sunday January 03, 2021 @11:22AM (#60890772)
    More software is written than is used for desktop computers. 3D printers weren't around in the '90s, nor was IoT. The author is clearly only looking through university-approved corporate pipelines and is too brainwashed to even perceive that there are small groups carrying out advanced projects, like volumetric hologram displays, drone projects, mesh networks, etc.
  • by 140Mandak262Jamuna ( 970587 ) on Sunday January 03, 2021 @12:44PM (#60890990) Journal
    “I don't know what the language of the year 2000 will look like, but I know it will be called Fortran.” —Tony Hoare, winner of the 1980 Turing Award, in 1982.

    The language, the OS, and the editors - especially the editors - will continue to be called the same, while becoming radically different in behavior.

    Back in 1999 I bought an editor called SlickEdit for one simple reason: it gave me Emacs keystrokes and captured stdout from Windows programs automatically. I was transitioning from Sun Solaris/HP-UX to Windows after joining the company. I upgraded it regularly and have the latest version now.

    The latest version can read an MSDev solution file (a .sln file) and build a symbol table with context-sensitive, class-structure-aware labeling of every symbol in the entire solution space: about 12,000 .cpp and .h files, spread over 40 static libraries and 4 executable targets. Reference lookup, class-structure lookup, static code analysis, customizable macros and keystrokes, step-through debugging via command-line debuggers... you name it, it's got it. But it is still called SlickEdit.

    It is a question as old as civilization. If you replace the parts of Theseus' ship one by one, until all the parts are new and none of the old parts survive, is it still Theseus' old ship? If it is not, at what point did it stop being Theseus' ship? If it is, can you call a new ship built using exactly the same parts Theseus' ship?

  • by Jerry ( 6400 ) on Sunday January 03, 2021 @04:22PM (#60891696)
    I worked in an IT department headed by a suit, many times a political appointee (gov shop) or the owner's son (private firm), who didn't have a clue about how computers, much less programming, worked. And how many times did that suit consult with you or anyone on your team of coders about software purchasing decisions affecting everyone, especially taxpayers? I've seen millions go down the drain after a suit purchased a do-all app that supposedly would allow clerks to write their own software. One such app was called OneWorld. Another is called Apex. Clerks, code? They can't even keep their keyboards and monitors clean.
