Are Trendy Developers Ignoring Tradeoffs and Over-Engineering Workplaces? (github.io) 211

An anonymous reader shares an article titled "Does IT Run on Java 8?"

"After more than ten years in tech, in a range of different environments, from Fortune 500 companies, to startups, I've finally come to realize that most businesss and developers simply don't revolve around whatever's trending on Hacker News," argues one Python/R/Spark data scientist: Most developers -- and companies -- are part of what [programmer] Scott Hanselman dubbed a while ago as the 99%... "They don't read a lot of blogs, they never write blogs, they don't go to user groups, they don't tweet or facebook, and you don't often see them at large conferences. Lots of technologies don't iterate at this speed, nor should they.

"Embedded developers are still doing their thing in C and C++. Both are deeply mature and well understood languages that don't require a lot of churn or panic on the social networks. Where are the dark matter developers? Probably getting work done. Maybe using ASP.NET 1.1 at a local municipality or small office. Maybe working at a bottling plant in Mexico in VB6. Perhaps they are writing PHP calendar applications at a large chip manufacturer."

While some companies are using Spark and Druid and Airflow, some are still using ColdFusion... Or telnet... Or Microsoft TFS... There are reasons updates are not made. In some cases, it's a matter of national security (like at NASA). In others, people get used to what they know. In some cases, the old tech is better... In some cases, it's both a matter of security, AND IT is not a priority. This is the reason many government agencies return data in PDF formats, or in XML... For this variety of reasons and more, the majority of companies that are at the pinnacle of success in America are quietly running Windows Server 2012 behind the scenes.

And, not only are they running Java on Windows 2012, they're also not doing machine learning, or AI, or any of the sexy buzzwords you hear about. Most business rules are still just that: hardcoded case statements decided by the business, passed down to analysts, and done in Excel sheets, half because of bureaucracy and inertia, and sometimes, because you just don't need machine learning. Finally, the third piece of this is the "dark matter" effect. Most developers are simply not talking about the mundane work they're doing. Who wants to share their C# code moving fraction-of-a-cent transactions between banking systems when everyone is doing TensorFlow.js?
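
As a hypothetical illustration (the rule and the discount tiers below are invented, not taken from the article), this is roughly what such a hardcoded business rule looks like in practice: a handful of case statements, no model in sight.

    def shipping_discount(order_total, customer_tier):
        # Rule handed down by the business: no training data, no ML pipeline.
        if customer_tier == "gold" or order_total >= 500:
            return 0.10
        if customer_tier == "silver" and order_total >= 200:
            return 0.05
        return 0.0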

In a footnote to his essay, Hanselman had added that his examples weren't hypothetical. "These people and companies all exist, I've met them and spoken to them at length." (And the article includes several tweets from real-world developers, including one which claims Tesla's infotainment firmware and backend services were all run in a single-location datacenter "on the worst VMware deployment known to man.")

But the data scientist ultimately asks if our online filter bubbles are exposing us to "tech-forward biases" that are "overenthusiastic about the promises of new technology without talking about tradeoffs," leading us into over-engineered platforms "that our companies don't need, and that most other developers that pick up our work can't relate to, or even work with...

"For better or worse, the world runs on Excel, Java 8, and Sharepoint, and I think it's important for us as technology professionals to remember and be empathetic of that."

Comments Filter:
  • Yes (Score:5, Insightful)

    by ArchieBunker ( 132337 ) on Saturday May 18, 2019 @08:46PM (#58616374)

    Too many abstraction layers. Virtual python environments running inside filesystem images. Nobody but the author can compile anything because it needs a dozen obscure libraries so they just make a container. Sounds a lot like another operating system that people always bitched about, DLL hell I believe.

    • by msauve ( 701917 )
      Yep. Ease of development may have little long term value. Any organization worth anything would have a focus on paying for the logic and not any particular implementation. Pay less now for code using the language-of-the-day, then pay someone else much more later to maintain that code base. Or pay a bit more to develop in something well established, and reap the rewards because it's easier to maintain in the future.

      The most successful organizations take a long term view.
      • by Kjella ( 173770 )

        Yep. Ease of development may have little long term value. Any organization worth anything would have a focus on paying for the logic and not any particular implementation. Pay less now for code using the language-of-the-day, then pay someone else much more later to maintain that code base. Or pay a bit more to develop in something well established, and reap the rewards because it's easier to maintain in the future.

        I think that's a bit of an oversimplification, rapid implementation isn't only about cost it's also about adapting to consumer demand and shifting competition, meeting deliverables and deadlines. Sometimes taking a bit longer and "doing it right" is a pretty big deal and is often why large, slow enterprises are outmaneuvered by small, nimble contenders. But if you're constantly grabbing the latest tool from the toolbox you're leaving behind a wake of legacy solutions that all have to be maintained. If you'r

    • .... and now it is "But it runs in my container!!"

      Nothing changes, it just gets wrapped in more indirection

    • by sjames ( 1099 )

      The containers are a pet peeve of mine. Developer constructs something with so many undocumented interlocking dependencies even he can't get it to install in a clean environment, so just cram the whole kitchen sink into a container and declare that the one and only distribution method. If it's open source, throw the source in a tarball with build instructions that don't stand a chance of working and hope nobody ever tries.

    • by AmiMoJo ( 196126 )

      Eh, it's not that much better elsewhere. For example on the embedded side we keep around old versions of the IDE and toolchain, sometimes even check the installers into Git. Most of the decent IDEs let you have multiple versions of the toolchain, libraries and header files installed so you can build software that hasn't been touched for a decade and no longer compiles on the latest versions.

      The biggest problem is that the toolchains stop working on newer versions of the OS, so you end up keeping a Windows X

    • Ok Don Quixote, settle down.

  • Duh (Score:5, Insightful)

    by Austerity Empowers ( 669817 ) on Saturday May 18, 2019 @08:52PM (#58616398)

    I haven't seen any major tool shift in language or compiler over 20 years. It's not that we aren't aware of alternatives, it's that the alternatives are solving problems we either don't have, or do not make the risk/reward payoff. C/C++ are probably never going to be replaced.

    The only major shift I've seen in my line of work is scripting. Perl gave way to Python, TCL gave way to Ruby. I do not think any of these are "end-game" languages like C. The one thing I haven't seen a clean implementation of in these languages is threading; the global interpreter lock seems to haunt all of them. This doesn't seem like an interesting problem to the developers of these languages; their heads are way too far up the ass of computer science philosophy and idiomatic coding, but the problem is there to be solved.
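
    A minimal sketch of the GIL limitation described above (assuming CPython 3, nothing else): CPU-bound work gets no faster with threads, while separate processes sidestep the lock.

        import time
        from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

        def burn(n):
            # Pure-Python CPU-bound loop; holds the GIL the whole time.
            total = 0
            for i in range(n):
                total += i * i
            return total

        def timed(executor_cls, label):
            start = time.time()
            with executor_cls(max_workers=4) as pool:
                list(pool.map(burn, [5_000_000] * 4))
            print(f"{label}: {time.time() - start:.2f}s")

        if __name__ == "__main__":
            timed(ThreadPoolExecutor, "threads (GIL-bound)")    # runs roughly serially
            timed(ProcessPoolExecutor, "processes (parallel)")  # scales with cores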

    • Re: (Score:3, Interesting)

      It's not that we aren't aware of alternatives, it's that the alternatives are solving problems we either don't have, or do not make the risk/reward payoff. C/C++ are probably never going to be replaced.

      I respectfully disagree. C and C++ are the epitome of languages that remain as popular as they are more because of momentum than merit. The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable, they lack useful features that are widely available in other mainstream languages, and the arguments about a lack of suitable alternatives are getting weaker all the time. Will they be eliminated entirely? Presumably not, at

      • Re:Duh (Score:4, Insightful)

        by phantomfive ( 622387 ) on Saturday May 18, 2019 @11:08PM (#58616714) Journal
        Anyone who lumps C/C++ together doesn't know them both. They are quite different languages.
        • I know them both. At least to the degree anyone can really know C++. And while some source trees are meticulously clean C code, the majority I see use some merger of the two, including some aspects of C++ which would be illegal or undefined in strict C. Very few people in my line of work use pure C++, I would say primarily because, again, it solves problems they're not having.

      • I respectfully disagree. C and C++ are the epitome of languages that remain as popular as they are more because of momentum than merit. The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable, they lack useful features that are widely available in other mainstream languages, and the

        If other languages are so much better where is the correspondingly better software?

        arguments about a lack of suitable alternatives are getting weaker all the time.

        Excuses for failure to create better software because you picked a "better" language are getting weaker all the time.

        My personal view is that general purpose language selection is like selecting deck chairs for an ocean liner. While it matters, in the grand scheme of things it turns out to be quite irrelevant.

        The future is in services and DSLs backed by gargantuan sums of accumulated dead labor.

        • If other languages are so much better where is the correspondingly better software?

          One of our clients makes devices for environments where taking them out of service, for example to update firmware to patch a bug, has very serious implications.

          We have a subsystem within that firmware that does some fairly complicated calculations, and the logic is entirely customised for these devices so it's all written from scratch, no ready-made off-the-shelf libraries here. If those calculations were ever wrong, Very Bad Things could happen. Such failures would probably be very obvious, not least beca

          • So far, that subsystem has been running in the field in numerous deployments, some of them for quite a few years now, with a total of zero issues reported by customers where the root cause was a programming error.

            Same here. It's always pilot error.

            Ain't that right, c6gunner?

            • In the example I mentioned, the most common cause of failures in production is actually hardware components that don't perform according to their specifications under some combination of conditions. You can write your firmware as carefully as you like, but if the instructions it outputs are then misinterpreted by the hardware, there's not much you can do.

      • The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable, they lack useful features that are widely available in other mainstream languages

        What do you think C++ is lacking? It has almost everything. In fact, this is one of the biggest problems with it - code in C++ is well-nigh incomprehensible what with all the features that have been added to it. It is almost impossible to reason about any snippet of C++ code because it has so many different "features" that all interact in subtle ways.

        • What do you think C++ is lacking? It has almost everything.

          I don't disagree with your view that the kitchen sink approach can be a problem, particularly for a language like C++ that has inevitably picked up a lot of historical baggage over the years and sometimes struggles to reconcile the different programming models it tries to support.

          However, to give a few examples where C++ is starting to look quite under-powered compared to other languages now in relatively common use, it has a very limited type system (not even algebraic data types, pattern matching, etc.),

          • For applications programming that isn't running in a very resource-constrained or close-to-the-metal environment, some trade-offs that might have seemed silly a decade or two ago might make more sense with the system architectures we have today. If a compilation model and runtime system that incur say a 10% overhead but support some of those more powerful features are available, you have to buy slightly more/faster hardware to get the same performance.

            Or you've just added 20 cents to your BoM and now you c

            • Or you've just added 20 cents to your BoM and now you can't turn a profit. Or you're running on a supercomputer and you have the fastest computer already. Or you're trying to run on execrable low end phone and the users won't use your app if it doesn't fit in the resources. Or your program has billions of users (e.g. a web browser) so the cumulative impact of 10% is massive.

              There are plenty of cases where that 10% still matters.

              Cases certainly exist, but fewer than you imagine, looking at your list of candidates.

              The first one I agree really exists for some devices and components, but is really just the OS case, where - yes - people often need to use low level abstractions for efficiency in systems software. Similarly, browsers are still written in C++.

              But the supercomputer example is definitely imaginary. All supercomputers scale by adding processors, and so have no hard "fastest speed", and the rate of speed increase for the fast

              • The first one I agree really exists for some devices and components, but is really just the OS case

                Most of the low end MCUs come in a variety of speeds, RAM and flash amounts. For example on the super low end you have things with maybe 64 bytes of RAM and 1k word of flash. Often they don't have an OS; you just run on the bare metal, or in other cases you'd use FreeRTOS or possibly the product vendor will supply some sort of noddy callback based thing (and don't fuck up by making your callback take too long mm

            • There are plenty of cases where that 10% still matters.

              Of course. Better performance is, other things being equal, a good thing.

              However, in many of those cases, reliability and safety matter more. Personally, I'd rather wait 11 days for that big data processing job and get the right answer than have the wrong answer in 10, and I'd take a 10% performance hit in my browser in a heartbeat if it meant eliminating entire classes of security and privacy vulnerabilities in return.

    • Perl 6 has a lot of neat, well-implemented ideas. Removal of the global interpreter lock was one of the major priorities of the devs, and they succeeded. So Perl ends up being the first scripting language with first class concurrency support.

      Nobody seems too keen on moving old Perl 5 libraries over, though, and without the ecosystem it just doesn't have the same draw. If you're willing to implement a lot from scratch (or port)....

    • It's brute force, but it works extremely well.

      I did a lot of C++ early in my career, I still use C for embedded and Python for everything else.

      Not scoffing at Python anymore. It is extremely powerful.

    • by Boronx ( 228853 )

      I think we're due for a major shift in how we program, or maybe even a fundamental change in languages.

      Data flow design, data binding, dependency injection, multi-threading, these have all gone from obscure techniques to every day use. C-like languages in a text editor are not a good way to develop with these ideas.

      • Data flow design, data binding, dependency injection, multi-threading, these have all gone from obscure techniques to every day use.

        These have been common for a long time in software development.

        What sort of idea do you have, something like UML, except it works?

      • Re:Duh (Score:5, Informative)

        by goose-incarnated ( 1145029 ) on Sunday May 19, 2019 @02:41AM (#58617196) Journal

        I think we're due for a major shift in how we program, or maybe even a fundamental change in languages.

        Data flow design, data binding, dependency injection, multi-threading, these have all gone from obscure techniques to every day use. C-like languages in a text editor are not a good way to develop with these ideas.

        I must be having a stroke, because I remember doing all those things in C 15 years ago. Many of them, including stuff like data encapsulation, code isolation, etc., were all the new hotness in the 80s. That you think these things are new and need new languages means that the article was spot on.

    • AngelScript does not have a global interpreter lock, and can be used in a multithreaded fashion. It's also fast, easy to integrate, and has a simple and familiar syntax that looks a lot like C++ but does away with all the difficult parts (pointers, templates, etc.).

    • I haven't seen any major tool shift in language or compiler over 20 years.

      I don't see how you can possibly say that. Most Windows software today is written in C#, a language that didn't exist 20 years ago. Most iOS software is written in Swift, a language that didn't exist even 10 years ago. Android development is quickly moving to Kotlin, another language that didn't exist 20 years ago.

      And no tool shift in compilers? LLVM didn't exist 20 years ago.

  • But Cold Fusion? Those remaining installations need to die in a fire.

  • by pegdhcp ( 1158827 ) on Saturday May 18, 2019 @10:15PM (#58616576)
    ...was the first rule I was given at the beginning of my tenure in a computer center. I still believe this rule is the correct one. The caveat here is that "works" must be defined very carefully. Being "Trendy", as the post puts it, is not within the definition of "working". However "keeping it running" is not either. You need to have some rules based on location, organisation, and resources, for example:
    • System must be sustainable. You can keep your main financial engine in a bank running on Cobol, but you need to ensure that you have access to programmers who can manage that, systems that can run it and some kind of "emergency recovery plan" for data and operation recovery in case of a sudden demise in the primary.
    • System must be reliable. Most old/legacy systems are inherently more reliable than modern systems. There are several reasons for that: most were isolated in the first place, most have been tested over years and decades so potential problems are usually known and patched, and most are not flashy, so they are less attractive to people who love to meddle with flashy things, or as the common parlance has it, cowboys do not get involved with them.
    • There is however one serious exception to the rule (with a caveat). A system can reach legacy status for two major reasons. If the maintenance team is competent, the system is long lived because it is maintained adequately. However if the maintenance team is not competent, the system can be long lived because the team is afraid to touch it. The caveat is that most young people (trendy pups if you like) tend to disregard the cautious approaches of the older generation as incompetence, which it usually is not.

    I can but obviously would not name several internationally known organisations that use Cobol, PL-1, CAT-1 cabling, printers whose toner and/or ribbons are produced specifically for that installation on order, etc. Also I would definitely not name the giant holding company I used to work for, which rehired its HQ's network engineer one week after retiring him due to old age. Such are actual production environments, and as long as you know you can print invoices for the next four quarters, you do not care if the printer is 30 years old.
    Really, if it "sustainably" works, never touch it. And keep in mind that "you" might be the reason why the system turned into an unsustainable state.

    • by swilver ( 617741 )

      The problem often is that legacy code has 0 tests or tests that barely test anything at all -- perhaps the language doesn't even facilitate easy testing. You always need good integration and unit tests if you want a system that you can maintain and evolve. Anything less, and every change you do becomes a nightmare of manual testing and production rollbacks.
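
      For what it's worth, a minimal sketch of the kind of safety net being described: a characterization test that pins down what a legacy routine currently does before anyone touches it. The routine legacy_round_fee here is hypothetical, standing in for whatever untested code actually exists.

          import unittest

          def legacy_round_fee(amount_cents):
              # Stand-in for an old routine whose exact behaviour nobody dares change.
              return (amount_cents // 5) * 5

          class LegacyFeeCharacterization(unittest.TestCase):
              def test_current_behaviour_is_pinned(self):
                  # Record today's outputs so any future change is a deliberate one.
                  self.assertEqual(legacy_round_fee(104), 100)
                  self.assertEqual(legacy_round_fee(105), 105)
                  self.assertEqual(legacy_round_fee(0), 0)

          if __name__ == "__main__":
              unittest.main()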

      • I have some items to point out. First of all, I do not have a "deep" belief in unit testing. That being said, it has some important uses, primarily ensuring things are more or less as they were when you left them last week or last month. However I have a firm belief in testing when it comes to managing legacy systems, legacy code or databases.
        I personally love to write outside monitoring tools (while keeping in mind Turing...) for legacy systems. You are right that either language itself or system design mig
  • Grizzled programmers still work on code written for the IBM System/360 systems that were delivered from 1965-1978. Even today's most modern IBM System z mainframe can run code largely unchanged since the 1960s.

  • Some people get excited and chase the "shiny" (PHBs especially), but I was recently trying to impress upon my college intern the value of eschewing the latest framework X or language Y (which come and go head-spinningly fast) and instead focusing on getting stuff done, even if the tools you have are "old" or considered unsexy. The older you get, the more tired you become of the changing, the flux, the waste, when you just want to be productive.

    And yes, for the youngin's who are presently spending their career chasing th
    • by Boronx ( 228853 )

      I agree, but the world needs both kinds. Every now and then one of these kids will discover something really new and find themselves in clickbait for disrupting an industry.

  • by Todd Knarr ( 15451 ) on Saturday May 18, 2019 @11:13PM (#58616724) Homepage

    A lot of it is that commercial developers (the people working at companies whose business isn't building tech components) don't talk about their work because they aren't allowed to. Management considers the underpinnings of what makes their systems work to be proprietary information that they don't want competitors to have, so the developers can't discuss it without NDAs and other legal agreements in place first. They also can't publish it on Github or anything. And they aren't building or contributing to open-source projects because they're spending 90% of their time on the company's software rather than generic tools. They'll use open-source tools, but they can't allocate significant amounts of time to enhancing those tools and they aren't likely to be doing things that would trigger bugs and justify the effort of finding and fixing them.

    Plus they're likely to be using outdated tools and software by open-source standards, simply because it works well enough, everybody working on it knows it well, and since it's all internal and not Internet-facing it's nowhere near as critical that it have all the latest security patches applied to it. I know what you're about to say, but I'm talking here about stuff that runs on a virtual network that isn't even directly connected to the company's internal network, so by the time you've got enough access to try probing for vulnerabilities in the production systems, you've already got complete run of the corporate internal network, and compromise of production systems is a minor problem compared to compromise of the source code control repositories and internal security controls.

    And when the tools aren't outdated, they're likely not generally available yet. I've had questions like that about a couple of projects (door access control, RFID pay-at-the-pump tech) where the question was why I'd bothered to create them instead of just buying them in. My answer was that yeah, if I had to do them today I'd buy the solutions in, but back in '95/'96 when I originally wrote them the tech simply didn't exist (not just the software, I had to design some custom hardware for the door-control system because the existing at-door hardware of the time couldn't handle being controlled remotely and having access codes updated every few minutes).

    • by xleeko ( 551231 )

      if I had to do them today I'd buy the solutions in, but back in '95/'96 when I originally wrote them the tech simply didn't exist

      Amen! What is missing from the whole discussion is that hard problems take time to solve, which outlives the trend of the moment. I've worked on the same codebase for thirty years. We started trendy, with Brad Cox's new hotness Objective-C, and a couple of years later ported to Bjarne's new hotness, C++ (cfront 1.2, on 9-track tape from AT&T).

      At the time, C++ didn't even have multiple inheritance. Templates? Distant dream. We did it all with macros and name2 token pasting. But it worked, and has

  • If you are a business with machines running an OS or JIT that no longer gets security updates, then I have no sympathy for you, because you do not deserve it. You had years to migrate between releases, you were told about the implicit shortcomings of these systems before they were ever installed (but you ignored the warnings) and you have now put everyone at risk because you don't want to expend the resources needed to maintain your systems.

    If you are building software with cutting-edge programming languages t

    • by darkain ( 749283 ) on Saturday May 18, 2019 @11:17PM (#58616736) Homepage

      What happens when there is no upgrade path for the software? Should we just close the entire business and let everyone go, because an outside vendor either stopped supporting a specific piece of software or went out of business? These fantasy worlds where it is just a point-n-click to upgrade seem all nice n dandy in theory, but the reality of the situation is that some industrial systems need to work and be stable for 10-30 years, sometimes longer. We can't just sit here and shut down entire companies because a piece of software needs updating. There isn't always an ideal solution.

    • If you are a business with machines running an OS or JIT that no longer gets security updates, then I have no sympathy for you, because you do not deserve it.

      Programmers are cute! It's funny when they think that things with a service life of 10 years are old.

      Industrial kit is expensive and has service lives measured in decades. Stuff like that is now and has been for quite a while computer controlled. The upgrade path you're referring to is simply not viable in many cases.

  • by BobC ( 101861 ) on Saturday May 18, 2019 @11:16PM (#58616734)

    CBA: Cost-Benefit Analysis. Every tool change can slow your time to market. Or can inflate the compute needs of your product, requiring ever more and faster CPUs and RAM, increasing COGS (Cost of Goods Sold). Change can destroy the leverage inherent in the "sunk costs" of prior development.

    I advocate tech upgrades when I can show they will pay, though all too often it means we will have to endure short-term death-marches to bridge the gap. Change is tough, slow and expensive. Ironically, change is hardest when you are leading the market, and slightly less hard when you are trying to change it, to make it bend in your direction.

    But you can't advocate for an upgrade you don't know about. How best to keep a team aware of the "cutting edge" while working productively well behind it?

    I like to do this in two key areas: R&D and Manufacturing. Both are critically important, but can also be isolated as sources of failure, allowing significant risks to be taken with minimal downsides.

    One engineer I worked with wanted to make a production test fixture using Erlang, and we knew she'd make it work (because, well, she always made things work). That effort succeeded, but not well enough to use Erlang again. The lessons learned were still valuable, as the Erlang experiment encouraged architectural changes we brought forward. And that test fixture ran well for years.

    In my own R&D work, I had the problem of C/C++ code I had written for a "Proof of Concept" prototype all too often wound up in the final product, actually reducing its quality. So I switched my R&D coding to Python, which both increased my productivity while simultaneously making the product team look at things from first principles, rather than "borrowing" from my quick & dirty prototype. Plus, I learned how to make Python go fast. Very fast. With no more than a 2x performance penalty compared to optimized C++ (though with a 6x memory penalty, which is fine in R&D).

    But my favorite software innovation also caused the greatest cultural change: Our product teams started relying ever more heavily on Open Source libraries, sending our changes upstream, and shipping product faster and better because of access to the communities surrounding the code we used. (Seriously, the maintainers of Open Source code are almost always generous heroes who richly deserve to be supported.) Over three years we moved from a strictly closed-source shop to an Open Source shop with only a minimal amount of our "Special Sauce" kept private. It also makes it much easier to hire talent, since they can immediately start working with a known code base before being dipped in the Special Sauce. Seriously, before this, our hiring and on-boarding process was sheer torture, with months between the hire and productivity. But this process was gradual, step by step, library by library, based on the product and schedule demands.

    So, how best to encourage forward-looking thought with near-term application? Rotate everyone through the R&D and Manufacturing "playgrounds". Have lunch talks to share new things, with learning and sharing being ongoing professional development goals. Encourage "tolerable risk" whenever possible, but still ruthlessly eliminate risk when the schedule demands it.

  • After I sold my company, I started looking for a software job for the first time since 1996. I despaired at the state of the industry... All web-developy jargon-filled crap. But then I found a job as an embedded developer and couldn't be happier. Real programs that get stuff done with a minimum of trendiness and jargon, not to mention being able to use my EE degree to understand hardware aspects of the design. I wouldn't want to do "mainstream" software development any more; it's just not fun.
  • Not everybody using Neural Networks in their work? You don't say.

  • by lkcl ( 517947 ) <lkcl@lkcl.net> on Sunday May 19, 2019 @01:00AM (#58616986) Homepage

    this is very very simple: through github, the belief that the popularisation of a developer's output correlates with and therefore *is* a measure of the software's success is poisoning and biasing newcomers' minds.

    there's an extremely good package called "python-htmltmpl". look it up. release date: 2001! the development statistics on it are zero. it was written, it worked, it does the job, and it requires no further development. you almost certainly never heard of it until now... why?

    because the "development" statistics - the number of commits, the number of tweets, github "feel-good-about-me-me-me-the-developer-aren't-i-so-clever-for-making-yet-another-unnecessary-change" are *zero*.

    as a result it is forgotten about, and re-invented, again and again. kirbybase is another example. it's an extremely elegant very simple database engine that sits on top of a human-readable CSV file format. written in 2005, does the job, no hassle, and forgotten about.
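
    to make the idea concrete, here is a minimal sketch of a CSV-backed table in the same spirit, using only the python standard library. it is *not* kirbybase's actual API, just the general shape of the idea: human-readable file, trivially simple code.

        import csv

        class CsvTable:
            """A tiny, human-readable table stored as a plain CSV file."""

            def __init__(self, path, fieldnames):
                self.path = path
                self.fieldnames = fieldnames

            def insert(self, record):
                with open(self.path, "a", newline="") as f:
                    csv.DictWriter(f, self.fieldnames).writerow(record)

            def select(self, **criteria):
                with open(self.path, newline="") as f:
                    rows = csv.DictReader(f, self.fieldnames)
                    return [r for r in rows
                            if all(r[k] == v for k, v in criteria.items())]

        # the file stays readable (and editable) in any text editor
        books = CsvTable("books.csv", ["title", "author"])
        books.insert({"title": "The Mythical Man-Month", "author": "Brooks"})
        print(books.select(author="Brooks"))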

    the development pace (the number of commits), the number of times the developer wiped their backside or posted how much coffee they drank today is *not* and *cannot* be a measure of the *quality* of the software, and the number of downloads does *not* correlate directly with its usefulness or its suitability for a task.

    far from being the "saviour" of software, github - and to a lesser extent sourceforge (which is rescued by its ability to encourage team development) - has actually done us a massive disservice, by providing misleading signals based on the naive belief that social media drives our lives instead of intelligence and common sense.

  • Most applications just get info from/to an RDBMS, apply some domain logic, and squirt it to/from HTML. You don't need to turn bicycle science into rocket science with microservices, async crap, mobile-friendly screens when everyone in biz still uses PC's, pulsating JavaScript widgets, 3D date pickers, etc.

    They forgot the key principle: K.I.S.S. Too many parts make shit break and make it expensive to repair.
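
    A minimal sketch of that "RDBMS in, domain logic, HTML out" shape, using only the Python standard library (the table and the review rule are made up for illustration):

        import sqlite3
        from html import escape

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?)",
                         [("Acme", 1200.0), ("Globex", 80.0)])

        def needs_review(total):
            # The entire "domain logic": a plain rule, no framework required.
            return total > 1000

        print("<table>")
        for customer, total in conn.execute("SELECT customer, total FROM orders"):
            flag = "review" if needs_review(total) else "ok"
            print(f"<tr><td>{escape(customer)}</td>"
                  f"<td>{total:.2f}</td><td>{flag}</td></tr>")
        print("</table>")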

    As far as some orgs still using ColdFusion, my ColdFusion apps still work after 16 years. (Some converted in

  • Seems to me that unless you have a somewhat high staff turnover or are a young company, then chances are that your department is stuck using the non-trendy stuff. By the numbers, I'm sure most 'dev' work is done in IT departments for monolithic corporations or government or utilities... they've got legacy apps that were created 20 years ago and do a reasonable job. Budgets are low and timescales are tight for new projects - this does not promote risk-taking with new languages, and probably not the training
  • "For better or worse, the world runs on Excel, Java 8, and Sharepoint, and I think it's important for us as technology professionals to remember and be empathetic of that."

    well shit, now i know what i've been doing wrong all this time. must schedule some time to replace my debian GNU/Linux OS and 20-years-in-the-making hand-crafted fvwm2 setup with the latest windows, redo 20 years' worth of python programs in excel and java, and convert my customer's website database to use java and sharepoint. then just maybe i can get a job! w00t!

  • Java 8 is too hard to migrate to.

  • Internet guy realizes he lives in a bubble.

  • There is a buzzword: "Minimum Necessary Technology". 8-)

    Ask any Engineer...

  • I do web. See it every day. Best example: The "virtual DOM" fad. Yeah, it's notably faster when you're updating a webchat window. Yes, SPAs are neat. Yes, doing relational trail resolution in the client is neat and GraphQL is its newest brainchild. But it's exactly as laid out in the article: Toolstacks are so complicated and have a staggering amount of external dependencies, they fall apart every odd month with some obscure bug deep down in npm's abysmal dependency tree. Sucks big time to debug that shit, I'

  • The reason COBOL still hangs around in data centers is because you cannot make a valid business case to replace it. To wit: let's spend $$$$ to rewrite everything in a new language/platform/architecture and not change one single bit of functionality.

    If the software is working correctly, it's not worth replacing. You can only make a business case to replace broken stuff.
