Are Trendy Developers Ignoring Tradeoffs and Over-Engineering Workplaces? (github.io)
An anonymous reader shares an article titled "Does IT Run on Java 8?"
"After more than ten years in tech, in a range of different environments, from Fortune 500 companies, to startups, I've finally come to realize that most businesss and developers simply don't revolve around whatever's trending on Hacker News," argues one Python/R/Spark data scientist: Most developers -- and companies -- are part of what [programmer] Scott Hanselman dubbed a while ago as the 99%... "They don't read a lot of blogs, they never write blogs, they don't go to user groups, they don't tweet or facebook, and you don't often see them at large conferences. Lots of technologies don't iterate at this speed, nor should they.
"Embedded developers are still doing their thing in C and C++. Both are deeply mature and well understood languages that don't require a lot of churn or panic on the social networks. Where are the dark matter developers? Probably getting work done. Maybe using ASP.NET 1.1 at a local municipality or small office. Maybe working at a bottling plant in Mexico in VB6. Perhaps they are writing PHP calendar applications at a large chip manufacturer."
While some companies are using Spark and Druid and Airflow, some are still using ColdFusion... Or telnet... Or Microsoft TFS... There are reasons updates are not made. In some cases, it's a matter of national security (like at NASA). In others, people get used to what they know. In some cases, the old tech is better... In some cases, it's both a matter of security, AND IT is not a priority. This is the reason many government agencies return data in PDF formats, or in XML... For all of these reasons and more, the majority of companies that are at the pinnacle of success in America are quietly running Windows Server 2012 behind the scenes.
And, not only are they running Java on Windows 2012, they're also not doing machine learning, or AI, or any of the sexy buzzwords you hear about. Most business rules are still just that: hardcoded case statements decided by the business, passed down to analysts, and done in Excel sheets, half because of bureaucracy and inaction, and sometimes, because you just don't need machine learning. Finally, the third piece of this is the "dark matter" effect. Most developers are simply not talking about the mundane work they're doing. Who wants to share their C# code moving fraction-of-a-cent transactions between banking systems when everyone is doing TensorFlow.js?
In a footnote to his essay, Hanselman had added that his examples weren't hypothetical. "These people and companies all exist, I've met them and spoken to them at length." (And the article includes several tweets from real-world developers, including one which claims Tesla's infotainment firmware and backend services were all run in a single-location datacenter "on the worst VMware deployment known to man.")
But the data scientist ultimately asks if our online filter bubbles are exposing us to "tech-forward biases" that are "overenthusiastic about the promises of new technology without talking about tradeoffs," leading us into over-engineered platforms "that our companies don't need, and that most other developers that pick up our work can't relate to, or can't even work with...
"For better or worse, the world runs on Excel, Java 8, and Sharepoint, and I think it's important for us as technology professionals to remember and be empathetic of that."
"After more than ten years in tech, in a range of different environments, from Fortune 500 companies, to startups, I've finally come to realize that most businesss and developers simply don't revolve around whatever's trending on Hacker News," argues one Python/R/Spark data scientist: Most developers -- and companies -- are part of what [programmer] Scott Hanselman dubbed a while ago as the 99%... "They don't read a lot of blogs, they never write blogs, they don't go to user groups, they don't tweet or facebook, and you don't often see them at large conferences. Lots of technologies don't iterate at this speed, nor should they.
"Embedded developers are still doing their thing in C and C++. Both are deeply mature and well understood languages that don't require a lot of churn or panic on the social networks. Where are the dark matter developers? Probably getting work done. Maybe using ASP.NET 1.1 at a local municipality or small office. Maybe working at a bottling plant in Mexico in VB6. Perhaps they are writing PHP calendar applications at a large chip manufacturer."
While some companies are using Spark and Druid and Airflow, some are still using Coldfusion... Or telnet... Or Microsoft TFS... There are reasons updates are not made. In some cases, it's a matter of national security (like at NASA). In others, people get used to what they know. In some cases, the old tech is better... In some cases, it's both a matter of security, AND IT is not a priority. This is the reason many government agencies return data in PDF formats, or in XML... For all of this variety of reasons and more, the majority of companies that are at the pinnacle of succes in America are quietly running Windows Server 2012 behind the scenes.
And, not only are they running Java on Windows 2012, they're also not doing machine learning, or AI, or any of the sexy buzzwords you hear about. Most business rules are still just that: hardcoded case statements decided by the business, passed down to analysts, and done in Excel sheets, half because of bureacracy and intraction, and sometimes, because you just don't need machine learning. Finally, the third piece of this is the "dark matter" effect. Most developers are simply not talking about the mundane work they're doing. Who wants to share their C# code moving fractions of a cent transactions between banking systems when everyone is doing Tensorflow.js?
In a footnote to his essay, Hanselman had added that his examples weren't hypothetical. "These people and companies all exist, I've met them and spoken to them at length." (And the article includes several tweets from real-world developers, including one which claims Tesla's infotainment firmware and backend services were all run in a single-location datacenter "on the worst VMware deployment known to man.")
But the data scientist ultimately asks if our online filter bubbles are exposing us to "tech-forward biases" that are "overenthusiastic about the promises of new technology without talking about tradeoffs," leading us into over-engineered platforms "that our companies don't need, and that most other developers that pick up our work can't relate to, or can even work with...
"For better or worse, the world runs on Excel, Java 8, and Sharepoint, and I think it's important for us as technology professionals to remember and be empathetic of that."
Yes (Score:5, Insightful)
Too many abstraction layers. Virtual Python environments running inside filesystem images. Nobody but the author can compile anything because it needs a dozen obscure libraries, so they just make a container. Sounds a lot like another operating system people always bitched about: DLL hell, I believe.
Re: (Score:3)
The most successful organizations take a long term view.
Re: (Score:3)
Yep. Ease of development may have little long term value. Any organization worth anything would focus on paying for the logic and not any particular implementation. Pay less now for code using the language-of-the-day, then pay someone else much more later to maintain that code base. Or pay a bit more to develop in something well established, and reap the rewards because it's easier to maintain in the future.
I think that's a bit of an oversimplification; rapid implementation isn't only about cost, it's also about adapting to consumer demand and shifting competition, and meeting deliverables and deadlines. Sometimes taking a bit longer and "doing it right" is a pretty big deal, and is often why large, slow enterprises are outmaneuvered by small, nimble contenders. But if you're constantly grabbing the latest tool from the toolbox you're leaving behind a wake of legacy solutions that all have to be maintained. If you'r
It used to be "But it ran on my machine!!" (Score:3)
.... and now it is "But it runs in my container!!"
Nothing changes, it just gets wrapped in more indirection
Re:It used to be "But it ran on my machine!!" (Score:2)
FTFY.
Re: (Score:2)
The containers are a pet peeve of mine. A developer constructs something with so many undocumented interlocking dependencies that even he can't get it to install in a clean environment, so he just crams the whole kitchen sink into a container and declares that the one and only distribution method. If it's open source, throw the source in a tarball with build instructions that don't stand a chance of working and hope nobody ever tries.
Re: (Score:2)
Eh, it's not that much better elsewhere. For example on the embedded side we keep around old versions of the IDE and toolchain, sometimes even check the installers into Git. Most of the decent IDEs let you have multiple versions of the toolchain, libraries and header files installed so you can build software that hasn't been touched for a decade and no longer compiles on the latest versions.
The biggest problem is that the toolchains stop working on newer versions of the OS, so you end up keeping a Windows X
Re: Yes (Score:2)
Ok Don Quixote, settle down.
Re: (Score:2, Offtopic)
I've used Linux since 1996 and then got away in the early 00s. In the meantime my personal servers all ran *BSD of some sort, so I had no idea what direction Linux was taking. Being on that hiatus allowed me to see how drastic these changes were. AC is also correct about documentation. It's either very outdated or non-existent. Stack Overflow shouldn't be your guide. That's why I really enjoy Slackware. Good docs and no extra bullshit.
Duh (Score:5, Insightful)
I haven't seen any major shift in language or compiler tooling in over 20 years. It's not that we aren't aware of alternatives; it's that the alternatives solve problems we either don't have, or don't justify the risk/reward tradeoff. C/C++ are probably never going to be replaced.
The only major shift I've seen in my line of work is scripting. Perl gave way to Python, Tcl gave way to Ruby. I do not think any of these are "end-game" languages like C. The one thing I haven't seen a clean implementation of in these languages is threading; the global interpreter lock seems to haunt all of them. This doesn't seem like an interesting problem to the developers of these languages -- their heads are way too far up the ass of computer science philosophy and idiomatic coding -- but the problem is there to be solved.
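To make the complaint concrete, here is a minimal sketch (assuming CPython) of the GIL in action: CPU-bound threads effectively serialize, while worker processes actually run in parallel. Not any particular project's code, just an illustration:

    # CPython's global interpreter lock (GIL) serializes CPU-bound threads,
    # so a thread pool gains little here; a process pool actually parallelizes.
    import time
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def burn(n):
        # CPU-bound busywork: sum of squares
        return sum(i * i for i in range(n))

    def timed(executor_cls):
        start = time.perf_counter()
        with executor_cls(max_workers=4) as pool:
            list(pool.map(burn, [2_000_000] * 4))
        return time.perf_counter() - start

    if __name__ == "__main__":  # guard needed for multiprocessing on some platforms
        print("threads:   %.2fs" % timed(ThreadPoolExecutor))   # roughly serial
        print("processes: %.2fs" % timed(ProcessPoolExecutor))  # roughly 4x faster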
Re: (Score:3, Interesting)
It's not that we aren't aware of alternatives; it's that the alternatives solve problems we either don't have, or don't justify the risk/reward tradeoff. C/C++ are probably never going to be replaced.
I respectfully disagree. C and C++ are the epitome of languages that remain as popular as they are more because of momentum than merit. The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable; they lack useful features that are widely available in other mainstream languages, and the arguments about a lack of suitable alternatives are getting weaker all the time. Will they be eliminated entirely? Presumably not, at
Re: (Score:2)
I know them both. At least to the degree anyone can really know C++. And while some source trees are meticulously clean C code, the majority I see use some merger of the two, including some aspects of C++ which would be illegal or undefined in strict C. Very few people in my line of work use pure C++, I would say primarily because, again, it solves problems they're not having.
Re: (Score:2)
I respectfully disagree. C and C++ are the epitome of languages that remain as popular as they are more because of momentum than merit. The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable; they lack useful features that are widely available in other mainstream languages, and the
If other languages are so much better, where is the correspondingly better software?
arguments about a lack of suitable alternatives are getting weaker all the time.
Excuses for failure to create better software because you picked a "better" language are getting weaker all the time.
My personal view: general-purpose language selection is like selecting deck chairs for an ocean liner. While it matters, in the grand scheme of things it turns out to be quite irrelevant.
The future is in services and DSLs backed by gargantuan sums of accumulated dead labor.
Re: (Score:2)
If other languages are so much better, where is the correspondingly better software?
One of our clients makes devices for environments where taking them out of service, for example to update firmware to patch a bug, has very serious implications.
We have a subsystem within that firmware that does some fairly complicated calculations, and the logic is entirely customised for these devices so it's all written from scratch, no ready-made off-the-shelf libraries here. If those calculations were ever wrong, Very Bad Things could happen. Such failures would probably be very obvious, not least beca
Re: (Score:2)
Same here. It's always pilot error.
Ain't that right, c6gunner?
Re: (Score:2)
In the example I mentioned, the most common cause of failures in production is actually hardware components that don't perform according to their specifications under some combination of conditions. You can write your firmware as carefully as you like, but if the instructions it outputs are then misinterpreted by the hardware, there's not much you can do.
Re: (Score:2)
A variant of Haskell, in this particular case.
Re: (Score:2)
The cost to the world of bugs in C and C++ programs that simply couldn't happen if the software had been written in a better language is incalculable; they lack useful features that are widely available in other mainstream languages
What do you think C++ is lacking? It has almost everything. In fact, this is one of the biggest problems with it - code in C++ is well-nigh incomprehensible what with all the added features to it. It is almost impossible to reason about any snippet of C++ code because it has so many different "features" that all interact in subtle ways.
Re: (Score:2)
What do you think C++ is lacking? It has almost everything.
I don't disagree with your view that the kitchen sink approach can be a problem, particularly for a language like C++ that has inevitably picked up a lot of historical baggage over the years and sometimes struggles to reconcile the different programming models it tries to support.
However, to give a few examples where C++ is starting to look quite under-powered compared to other languages now in relatively common use, it has a very limited type system (not even algebraic data types, pattern matching, etc.),
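For readers who haven't met them, here is roughly what algebraic data types plus pattern matching buy you, sketched in Python 3.10+ rather than C++ (which, as the poster says, has no direct equivalent; std::variant/std::visit only approximate it):

    # A sum type ("a Shape is either a Circle or a Rect") plus pattern
    # matching, sketched with dataclasses and Python 3.10's match statement.
    from dataclasses import dataclass

    @dataclass
    class Circle:
        radius: float

    @dataclass
    class Rect:
        w: float
        h: float

    Shape = Circle | Rect  # a closed set of alternatives, by convention

    def area(s):
        match s:
            case Circle(radius=r):
                return 3.14159 * r * r
            case Rect(w=w, h=h):
                return w * h
            case _:
                raise TypeError("not a Shape: %r" % (s,))

    print(area(Circle(2.0)), area(Rect(3.0, 4.0)))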
Re: (Score:2)
For applications programming that isn't running in a very resource-constrained or close-to-the-metal environment, some trade-offs that might have seemed silly a decade or two ago might make more sense with the system architectures we have today. If a compilation model and runtime system that incur, say, a 10% overhead but support some of those more powerful features are available, you just have to buy slightly more/faster hardware to get the same performance.
Or you've just added 20 cents to your BoM and now you c
Re: (Score:2)
Or you've just added 20 cents to your BoM and now you can't turn a profit. Or you're running on a supercomputer and you have the fastest computer already. Or you're trying to run on execrable low end phone and the users won't use your app if it doesn't fit in the resources. Or your program has billions of users (e.g. a web browser) so the cumulative impact of 10% is massive.
There are plenty of cases where that 10% still matters.
Cases certainly exist, but fewer than you imagine, looking at your list of candidates.
The first one I agree really exists for some devices and components, but is really just the OS case, where - yes - people often need to use low level abstractions for efficiency in systems software. Similarly, browsers are still written in C++.
But the supercomputer example is definitely imaginary. All supercomputers scale by adding processors, and so have no hard "fastest speed", and the rate of speed increase for the fast
Re: (Score:2)
The first one I agree really exists for some devices and components, but is really just the OS case
Most of the low end MCUs come in a variety of speeds, RAM and flash amounts. For example on the super low end you have things with maybe 64 bytes of RAM and 1k words of flash. Often they don't have an OS, you just run on the bare metal, or in other cases you'd use FreeRTOS or possibly the product vendor will supply some sort of noddy callback based thing (and don't fuck up by making your callback take too long mm
Re: (Score:2)
There are plenty of cases where that 10% still matters.
Of course. Better performance is, other things being equal, a good thing.
However, in many of those cases, reliability and safety matter more. Personally, I'd rather wait 11 days for that big data processing job and get the right answer than have the wrong answer in 10, and I'd take a 10% performance hit in my browser in a heartbeat if it meant eliminating entire classes of security and privacy vulnerabilities in return.
Re: (Score:2)
C/C++ isn't a language, and as someone who has actually written code that was compiled for many different targets (hardware/OS/compiler combinations) I am always wary about claims of cross-platform portability. It is true that there is a C compiler for almost every platform you can imagine (C++, not so much) but we're moving towards a world where tools like LLVM will potentially become as important for the success of a new platform as having a simple C compiler, and then any language that compiles to LLVM i
Re: (Score:2)
Old is bad, because newer languages masking bugs is better.
It's not about masking bugs, it's about not admitting those failure modes in the first place. There is no need for about 99% of new software to be written in languages that have pointers with arithmetic and null values by default, unsafe memory models, unsafe enumerations (and no more powerful algebraic data types), unsafe type aliases and awkward syntax for defining types more generally, silent type coercion that is sometimes lossy, a standard library full of functions that are literally never safe to call
Re: (Score:2)
High performance widgets or precious use of little resources drive C.
This is true, but even in the embedded space, a lot of systems are now actually running on something like an ARM chip and Linux under the hood, even if it's a low-power chip and a minimal build instead of a major distro. You do have quite a few potentially useful options other than C at that point.
It's more tricky if you really are working close to the metal, either embedded or other systems programming like the OS, device drivers, and so on. This is where C and to some extent C++ do still have surprisingly
Re: (Score:2)
Perl 6 has a lot of neat, well-implemented ideas. Removal of the global interpreter lock was one of the major priorities of the devs, and they succeeded. So Perl ends up being the first scripting language with first-class concurrency support.
Nobody seems too keen on moving old Perl 5 libraries over, though, and without the ecosystem it just doesn't have the same draw. If you're willing to implement a lot from scratch (or port)....
ZMQ + Python for multithreading (Score:3)
It's brute force, but it works extremely well.
I did a lot of C++ early in my career, I still use C for embedded and Python for everything else.
Not scoffing at Python anymore. It is extremely powerful.
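For the curious, the pattern being described is roughly this: a minimal sketch assuming the pyzmq package. Worker processes sidestep the GIL entirely; ZMQ just moves the work around.

    # PUSH/PULL pipeline: one producer fans work out to worker *processes*.
    import multiprocessing as mp
    import zmq

    ADDR = "tcp://127.0.0.1:5557"

    def worker():
        ctx = zmq.Context()
        pull = ctx.socket(zmq.PULL)
        pull.connect(ADDR)
        while True:
            n = pull.recv_pyobj()
            if n is None:          # sentinel: shut down
                break
            print(sum(i * i for i in range(n)))

    if __name__ == "__main__":
        procs = [mp.Process(target=worker) for _ in range(4)]
        for p in procs:
            p.start()
        ctx = zmq.Context()
        push = ctx.socket(zmq.PUSH)
        push.bind(ADDR)
        # PUSH round-robins across connected workers, so four sentinels
        # reach four workers (good enough for a sketch).
        for job in [100_000] * 8 + [None] * 4:
            push.send_pyobj(job)
        for p in procs:
            p.join()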
Re: (Score:2)
I think we're due for a major shift in how we program, or maybe even a fundamental change in languages.
Data flow design, data binding, dependency injection, multi-threading: these have all gone from obscure techniques to everyday use. C-like languages in a text editor are not a good way to develop with these ideas.
Re: (Score:3)
Data flow design, data binding, dependency injection, multi-threading: these have all gone from obscure techniques to everyday use.
These have been common for a long time in software development.
What sort of idea do you have, something like UML, except it works?
Re:Duh (Score:5, Informative)
I think we're due for a major shift in how we program, or maybe even a fundamental change in languages.
Data flow design, data binding, dependency injection, multi-threading: these have all gone from obscure techniques to everyday use. C-like languages in a text editor are not a good way to develop with these ideas.
I must be having a stroke, because I remember doing all those things in C 15 years ago. Many of them, including stuff like data encapsulation, code isolation, etc., were all the new hotness in the 80s. That you think these things are new and need new languages means that the article was spot on.
Re: (Score:2)
AngelScript does not have a global interpreter lock, and can be used in a multithreaded fashion. It's also fast, easy to integrate, and has a simple and familiar syntax that looks a lot like C++ but does away with all the difficult parts (pointers, templates, etc.).
Re: (Score:2)
I haven't seen any major tool shift in language or compiler over 20 years.
I don't see how you can possibly say that. Most Windows software today is written in C#, a language that didn't exist 20 years ago. Most iOS software is written in Swift, a language that didn't exist even 10 years ago. Android development is quickly moving to Kotlin, another language that didn't exist 20 years ago.
And no tool shift in compilers? LLVM didn't exist 20 years ago.
I’m fine with most of this (Score:2)
But Cold Fusion? Those remaining installations need to die in a fire.
If it works, do not touch it... (Score:3)
I could, but obviously will not, name several internationally known organisations that use COBOL, PL/1, CAT-1 cabling, and printers whose toner and/or ribbons are produced to order specifically for that installation. I would also definitely not name the giant holding company I used to work for, which rehired its HQ's network engineer one week after retiring him due to old age. Such are actual production environments, and as long as you know you can print invoices for the next four quarters, you do not care if the printer is 30 years old.
Really, if it "sustainably" works, never touch it. And keep in mind that "you" might be the reason the system turned into an unsustainable state.
Re: (Score:2)
The problem often is that legacy code has zero tests, or tests that barely test anything at all -- perhaps the language doesn't even facilitate easy testing. You always need good integration and unit tests if you want a system that you can maintain and evolve. Anything less, and every change you make becomes a nightmare of manual testing and production rollbacks.
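One practical first step with such code is a characterization ("golden master") test: pin down what the legacy code currently does before touching it. A minimal sketch with pytest; legacy_pricing and the recorded JSON files are hypothetical stand-ins:

    # Characterization test: assert the legacy function's behavior is
    # *unchanged*, not that it is *correct*. All names here are hypothetical.
    import json
    import pytest

    from legacy_pricing import quote   # the untested legacy function

    CASES = json.load(open("recorded_inputs.json"))    # captured from production
    GOLDEN = json.load(open("recorded_outputs.json"))  # what it returned then

    @pytest.mark.parametrize("case, expected", zip(CASES, GOLDEN))
    def test_quote_matches_recorded_behavior(case, expected):
        assert quote(**case) == expected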
Re: (Score:2)
I personally love to write outside monitoring tools (while keeping in mind Turing...) for legacy systems. You are right that either the language itself or the system design mig
Re: (Score:2)
Elegance in code is overrated. Elegance in behavior is where it's at. Correct code that performs well is often ugly, bulky and boring.
Re: (Score:2)
You also use global variables; you just don't realize it because of the 20 layers of abstraction and 20MB of code that you have npm'd into your repository, that you have no way to audit, and that you pull in from repos that will mostly be gone in 10 years.
I bet you couldn't write a semaphore, write a linked l
Grizzled IBM programmers still work on 1960s code (Score:2)
Grizzled programmers still work on code written for the IBM System/360 systems that were delivered from 1965-1978. Even today's most modern IBM System z mainframe can run code largely unchanged since the 1960s.
The Shining (Score:2)
And yes, for the youngin's who are presently spending their career chasing th
Re: (Score:2)
I agree, but the world needs both kinds. Every now and then one of these kids will discover something really new and find themselves in clickbait for disrupting an industry.
Not allowed to talk about it (Score:4, Insightful)
A lot of it is that commercial developers (the people working at companies whose business isn't building tech components) don't talk about their work because they aren't allowed to. Management considers the underpinnings of what makes their systems work to be proprietary information that they don't want competitors to have, so the developers can't discuss it without NDAs and other legal agreements in place first. They also can't publish it on Github or anything. And they aren't building or contributing to open-source projects because they're spending 90% of their time on the company's software rather than generic tools. They'll use open-source tools, but they can't allocate significant amounts of time to enhancing those tools and they aren't likely to be doing things that would trigger bugs and justify the effort of finding and fixing them.
Plus they're likely to be using outdated tools and software by open-source standards, simply because it works well enough, everybody working on it knows it well, and since it's all internal and not Internet-facing it's nowhere near as critical that it have all the latest security patches applied. I know what you're about to say, but I'm talking here about stuff that runs on a virtual network that isn't even directly connected to the company's internal network, so by the time you've got enough access to try probing for vulnerabilities in the production systems, you've already got complete run of the corporate internal network, and compromise of production systems is a minor problem compared to compromise of the source code control repositories and internal security controls.
And when the tools aren't outdated, they're likely not generally available yet. I've had questions like that about a couple of projects (door access control, RFID pay-at-the-pump tech) where the question was why I'd bothered to create them instead of just buying them in. My answer was that yeah, if I had to do them today I'd buy the solutions in, but back in '95/'96 when I originally wrote them the tech simply didn't exist (not just the software, I had to design some custom hardware for the door-control system because the existing at-door hardware of the time couldn't handle being controlled remotely and having access codes updated every few minutes).
Re: (Score:2)
if I had to do them today I'd buy the solutions in, but back in '95/'96 when I originally wrote them the tech simply didn't exist
Amen! What is missing from the whole discussion is that hard problems take time to solve, which outlives the trend of the moment. I've worked on the same codebase for thirty years. We started trendy, with Brad Cox's new hotness Objective-C, and a couple of years later ported to Bjarne's new hotness, C++ (cfront 1.2, on 9-track tape from AT&T).
At the time, C++ didn't even have multiple inheritance. Templates? Distant dream. We did it all with macros and name2 token pasting. But it worked, and has
No sympathy for practitioners of self-destruction. (Score:2)
If you run a business with machines running an OS or JIT that no longer gets security updates, then I have no sympathy for you, because you do not deserve it. You had years to migrate between releases, you were told about the inherent shortcomings of these systems before they were ever installed (but you ignored the warnings), and you have now put everyone at risk because you don't want to expend the resources needed to maintain your systems.
If you are building software with cutting-edge programming languages t
Re:No sympathy for practitioners of self-destruction (Score:4, Insightful)
What happens when there is no upgrade path for the software? Should we just close the entire business and let everyone go because an outside vendor either stopped supporting a specific piece of software or went out of business? These fantasy worlds where upgrades are just point-and-click seem all nice and dandy in theory, but the reality of the situation is that some industrial systems need to work and be stable for 10-30 years, sometimes longer. We can't just sit here and shut down entire companies because a piece of software needs updating. There isn't always an ideal solution.
Re: (Score:2)
If you run a business with machines running an OS or JIT that no longer gets security updates, then I have no sympathy for you, because you do not deserve it.
Programmers are cute! It's funny when they think that things with a 10-year service life are old.
Industrial kit is expensive and has service lives measured in decades. Stuff like that is now and has been for quite a while computer controlled. The upgrade path you're referring to is simply not viable in many cases.
If you ship hardware, it's all about CBA. (Score:3)
CBA: Cost-Benefit Analysis. Every tool change can slow your time to market. Or can inflate the compute needs of your product, requiring ever more and faster CPUs and RAM, increasing COGS (Cost of Goods Sold). Change can destroy the leverage inherent in the "sunk costs" of prior development.
I advocate tech upgrades when I can show they will pay, though all too often it means we will have to endure short-term death-marches to bridge the gap. Change is tough, slow and expensive. Ironically, change is hardest when you are leading the market, and slightly less hard when you are trying to change it, to make it bend in your direction.
But you can't advocate for an upgrade you don't know about. How best to keep a team aware of the "cutting edge" while working productively well behind it?
I like to do this in two key areas: R&D and Manufacturing. Both are critically important, but can also be isolated as sources of failure, allowing significant risks to be taken with minimal downsides.
One engineer I worked with wanted to make a production test fixture using Erlang, and we knew she'd make it work (because, well, she always made things work). That effort succeeded, but not well enough to use Erlang again. The lessons learned were still valuable, as the Erlang experiment encouraged architectural changes we brought forward. And that test fixture ran well for years.
In my own R&D work, I had the problem that C/C++ code I had written for a "Proof of Concept" prototype all too often wound up in the final product, actually reducing its quality. So I switched my R&D coding to Python, which both increased my productivity and made the product team look at things from first principles, rather than "borrowing" from my quick & dirty prototype. Plus, I learned how to make Python go fast. Very fast. With no more than a 2x performance penalty compared to optimized C++ (though with a 6x memory penalty, which is fine in R&D).
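The poster doesn't say which techniques they used; one common route to that kind of speed is pushing inner loops into NumPy, along these lines:

    # The same reduction two ways: a pure-Python loop vs. a vectorized call.
    import time
    import numpy as np

    xs = np.random.rand(10_000_000)

    start = time.perf_counter()
    total = sum(x * x for x in xs)    # pure-Python loop over 10M floats
    t_loop = time.perf_counter() - start

    start = time.perf_counter()
    total_np = float(np.dot(xs, xs))  # same math, inner loop runs in C
    t_np = time.perf_counter() - start

    print("loop: %.2fs  numpy: %.3fs  speedup: ~%.0fx"
          % (t_loop, t_np, t_loop / t_np))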
But my favorite software innovation also caused the greatest cultural change: Our product teams started relying ever more heavily on Open Source libraries, sending our changes upstream, and shipping product faster and better because of access to the communities surrounding the code we used. (Seriously, the maintainers of Open Source code are almost always generous heroes who richly deserve to be supported.) Over three years we moved from a strictly closed-source shop to an Open Source shop with only a minimal amount of our "Special Sauce" kept private. It also makes it much easier to hire talent, since they can immediately start working with a known code base before being dipped in the Special Sauce. Seriously, before this, our hiring and on-boarding process was sheer torture, with months between the hire and productivity. But this process was gradual, step by step, library by library, based on the product and schedule demands.
So, how best to encourage forward-looking thought with near-term application? Rotate everyone through the R&D and Manufacturing "playgrounds". Have lunch talks to share new things, with learning and sharing being ongoing professional development goals. Encourage "tolerable risk" whenever possible, but still ruthlessly eliminate risk when the schedule demands it.
Glad to be in the embedded world (Score:2)
Reads like a 15-year-old wrote it. (Score:2)
Not everybody using Neural Networks in their work? You don't say.
github / sourceforge "trending" statistics (Score:5, Insightful)
this is very very simple: through github, the belief that the popularisation of a developer's output correlates with, and therefore *is*, a measure of the software's success is poisoning and biasing newcomers' minds.
there's an extremely good package called "python-htmltmpl". look it up. release date: 2001! the development statistics on it are zero. it was written, it worked, it does the job, and it requires no further development. you almost certainly never heard of it until now... why?
because the "development" statistics - the number of commits, the number of tweets, github "feel-good-about-me-me-me-the-developer-aren't-i-so-clever-for-making-yet-another-unnecessary-change" are *zero*.
as a result it is forgotten about, and re-invented, again and again. kirbybase is another example. it's an extremely elegant very simple database engine that sits on top of a human-readable CSV file format. written in 2005, does the job, no hassle, and forgotten about.
the development pace (the number of commits), the number of times the developer wiped their backside or posted how much coffee they drank today is *not* and *cannot* be a measure of the *quality* of the software, and the number of downloads does *not* correlate directly with its usefulness or its suitability for a task.
far from being the "saviour" of software, github - and to a lesser extent sourceforge (which is rescued by its ability to encourage team development) - has actually done us a massive disservice, by providing misleading signals based on the naive belief that social media drives our lives instead of intelligence and common sense.
Collect resume buzzwords like baseball cards (Score:2)
Most applications just get info from/to an RDBMS, apply some domain logic, and squirt it to/from HTML. You don't need to turn bicycle science into rocket science with microservices, async crap, mobile-friendly screens when everyone in biz still uses PCs, pulsating JavaScript widgets, 3D date pickers, etc.
They forgot the key principle: K.I.S.S. Too many parts make shit break and make it expensive to repair.
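To make that concrete, the entire shape of such an application -- RDBMS in, a little domain logic, HTML out -- fits on a page. A sketch with Flask and sqlite3; the orders.db schema is hypothetical:

    # GET /orders/<id>: read a row, apply one business rule, emit HTML.
    import sqlite3
    from flask import Flask

    app = Flask(__name__)

    def open_db():
        # hypothetical schema: orders(id, customer, total)
        conn = sqlite3.connect("orders.db")
        conn.row_factory = sqlite3.Row
        return conn

    @app.route("/orders/<int:order_id>")
    def show_order(order_id):
        with open_db() as db:
            row = db.execute("SELECT customer, total FROM orders WHERE id = ?",
                             (order_id,)).fetchone()
        if row is None:
            return "<p>No such order.</p>", 404
        # The "domain logic": flag big orders. No microservices required.
        flag = " (priority)" if row["total"] > 1000 else ""
        return "<p>%s: $%.2f%s</p>" % (row["customer"], row["total"], flag)

    if __name__ == "__main__":
        app.run()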
As far as some orgs still using ColdFusion, my ColdFusion apps still work after 16 years. (Some converted in
How do companies change? (Score:2)
the world runs on... (Score:2)
"For better or worse, the world runs on Excel, Java 8, and Sharepoint, and I think it's important for us as technology professionals to remember and be empathetic of that."
well shit, now i know what i've been doing wrong all this time. must schedule some time to replace my debian GNU/Linux OS and 20-years-in-the-making hand-crafted fvwm2 setup with the latest windows, redo 20 years worth of python programs in excel and java, and convert my customer's website database to use java and sharepoint. then just maybe i can get a job! w00t!
The world runs on Java 6 (Score:2)
Java 8 is too hard to migrate to.
News at 11 (Score:2)
Internet guy realizes he lives in a bubble.
Re: (Score:2)
Haha exactly.
Minimum tech (Score:2)
There is a buzzword: "Minimum Necessary Technology". 8-)
Ask any Engineer...
Buzzword-Driven-Development (Score:2)
I do web. See it every day. Best example: the "virtual DOM" fad. Yeah, it's notably faster when you're updating a webchat window. Yes, SPAs are neat. Yes, doing relational trail resolution in the client is neat, and GraphQL is its newest brainchild. But it's exactly as laid out in the article: toolstacks are so complicated and have a staggering number of external dependencies that they fall apart every odd month with some obscure bug deep down in npm's abysmal dependency tree. Sucks big time to debug that shit, I'
The business case for COBOL (Score:2)
If the software is working correctly, it's not worth replacing. You can only make a business case to replace broken stuff.
Re:Hacker news (Score:4, Insightful)
Perhaps, but I'm guessing a lot of them are doing more interesting and much better paid work than you.
Hanselman isn't wrong about the unseen majority, the "dark matter developers", who are just getting stuff done. And he's not wrong that the young and oh-so-painfully-trendy crowd who tend to frequent places like HN and the programming and Web subreddits often know way too many buzzwords but have way too little actual skill and experience to back up their strident opinions.
At the same time, [HN founder] Paul Graham also wasn't wrong when he wrote about beating the averages [paulgraham.com] and "Blub programmers". Using tools and techniques that are actually better can be an advantage.
One of the most important skills in developing software, of whatever kind, is recognising good tools and techniques, while not getting sucked in by the hype. Most of these "fast-moving" industries are just reinventing the wheel every few months because those same young and oh-so-painfully-trendy developers don't know any better, but every now and then something worth knowing about does come along too.
Re: (Score:2)
The company I work for LOATHES change. They don't want change. They want .NET 4.0. None of that 4.5.2 bullshit.
And web? OH NO, NO WEB. "Real companies use Windows and desktop apps".
The problem is that they struggle to find Winforms developers. And the client doesn't feel the same way. I presented very valid use cases for web reports but they refused. "Clients don't want stupid giant screens with dashboards, that's just stupid. A decent SSRS report works for everyone. Now go back to coding Winforms, stupid k
Re: (Score:2)
There is a happy medium there. A point where you avoid ossification and at the same time avoid the constant unproductive churn. The location of that medium varies by industry.
Nobody wants to step on the brake pedal and have the fail whale appear on the driver's display.
Re:Hacker news (Score:5, Interesting)
The company I work for LOATHES change. They don't want change. They want .NET 4.0. None of that 4.5.2 bullshit.
And web? OH NO, NO WEB. "Real companies use Windows and desktop apps".
The problem is that they struggle to find Winforms developers. And the client doesn't feel the same way. I presented very valid use cases for web reports but they refused. "Clients don't want stupid giant screens with dashboards, that's just stupid. A decent SSRS report works for everyone. Now go back to coding Winforms, stupid kid".
Guess what. The client hired the competition. The competition had giant dashboard screens. Now the company is asking me if there is a way to run the Winforms reports in a "Smart TV".
Companies use old technology because they don't know any better. It's perfectly valid not to jump on the latest fad. But it's also really stupid to cling to .NET 4.0 "because some clients may have Windows XP machines we have to support" (psst.. none of our clients do. we have the analytics.).
I can beat that. I work in a place where the two lead devs (been there for 20+ years) originally wrote the program for an embedded device (tiny printer, small 200 1-bit screen, limited network functionality, Linux OS and proper BusyBox shell).
Over the years they've refused to let go of this code and instead ported it to each new hardware revision, so the most recent hardware, which runs Linux and has Bluetooth, WiFi and a full-color 480p screen, still runs this ancient code. However, since they both hate Linux with a passion, they "wrapped" all the Linux-isms so that the entire stack resembles Win32.
The program doesn't use libcurl (they wrote their own network stack to resemble Win32, then wrote a set of classes for HTTP and FTP), doesn't use openssl/gnutls (uses their own set of classes), etc. The text-only interface doesn't use curses; they have their own routines for UI.
So we have these very nice platforms with a full Linux stack, on which absolutely none of the libraries are being used, and none of the advantages are being taken; it still looks and acts (and crashes) like a DOS-era program for dedicated hardware even though the manufacturer delivers the device with QT.
So now a port to Android is required, because the newest revision of the platform from the manufacturer runs Android, so they are "porting" the program by painstakingly porting all of their homegrown libraries. The interface is being ported by writing a wrapper around Android calls so that the sources will still look like a Win32 program. The display is being ported by emulating their existing display calls so that bits are written to a framebuffer which is then blitted to screen in one single Android call. This means that even though the customers asked for Android, and the hardware manufacturer obliged, what they get is going to look and behave exactly the same as the one they have now.
Oh yeah, they're still actively maintaining a Win CE port because "it's technically superior to Linux and will eventually dominate the market", even though the company has never had a manufacturer provide WinCE as an option nor has any customer ever requested it. Pull requests that include calls to libcurl, or any POSIX API are declined with the reason of "Non-portable: doesn't work on WinCE!".
Luckily the company itself is great to work at, and as long as they remain great to work at, I'll remain with them :-)
Re: (Score:2)
Oh yeah, they're still actively maintaining a Win CE port because "it's technically superior to Linux and will eventually dominate the market"
Wow, I didn't realize WinCE still existed. That's the biggest dog of a system I've ever had to work on.
Re: (Score:2)
Oh yeah, they're still actively maintaining a Win CE port because "it's technically superior to Linux and will eventually dominate the market"
Wow, I didn't realize WinCE still existed. That's the biggest dog of a system I've ever had to work on.
I don't think that it does exist. You can't really argue with fanboys of a particular system, and microsoft fanboys are no different. These are the type of folk who complain that gcc is buggy because their UB code "works" on VS.
So, yeah, there are programmers out there who will do anything to continue programming on Windows, including emulating a very large software stack on Linux just to ensure that they touch as little "Linux" code as possible (because libcurl is too "linuxy").
Re: (Score:2)
They want .NET 4.0. None of that 4.5.2 bullshit.
This really bothers me because it betrays a complete misunderstanding of what the costs and benefits of upgrading the framework version are, to the extent that I'd refuse to work for that kind of brain-dead management. Same thing with targeting newer versions of the language. Both provide pretty big benefits to developers willing to spend the tiny bit of time to become familiar with the changes while costing near zero dollars and introducing near zero risk.
Should everyone be cutting edge and be on 4.8 alr
Re: (Score:2)
The culture has permeated the developers, though.
When I showed async/await to a coworker, she brushed it off saying "I do the same with delegates, I don't need that".
Wow.
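For anyone who shares the coworker's reaction: the anecdote is about C#, but the construct is the same idea everywhere. A minimal sketch in Python of what delegates/callbacks don't give you -- sequential-looking code whose waits overlap:

    # Three one-second "requests" complete in about one second total, not
    # three, and the code still reads top-to-bottom with no callback plumbing.
    import asyncio

    async def fetch(name, delay):
        await asyncio.sleep(delay)   # stands in for a network call
        return name + " done"

    async def main():
        results = await asyncio.gather(
            fetch("a", 1.0), fetch("b", 1.0), fetch("c", 1.0))
        print(results)

    asyncio.run(main())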
Re: (Score:2)
Perhaps, but I'm guessing a lot of them are doing more interesting and much better paid work than you.
interesting is relative: i'm the lead developer behind the Libre RISC-V SoC (an entirely libre hybrid CPU / VPU / GPU, the first of its kind in the world) - http://libre-riscv.org/3d_gpu/ [libre-riscv.org] and we're sponsored by NLnet. it doesn't get more interesting than that [note: not a single line of code of the project is hosted on github].
better paid: you just demonstrated my point, in that the social media driven statistics are what people see (including companies), and on that (very dangerously misleading) bias, man
Re: (Score:2)
The real programmers are writing code, not articles and blog posts.
That's a false dichotomy. There's nothing that says a good programmer has to write about it and share their experience with others, and certainly there's nothing that says someone who writes about programming is necessarily good at doing it, but there is some overlap. The earliest example I can think of from the Internet age is probably John Carmack's .plan files, which were legendary (and highly informative for other programmers) long before social media and Stack Overflow were on the scene.
Re: (Score:2)
You will never find a greater hugbox of NPCs and self-congratulatory liberal douches.
You obviously haven't watched very many TED talks.
Re: (Score:3)
You CAN run your entire business off an Athlon K7 in a midtower stuffed in a closet running Windows XP SP1; I've been at a company where that was a thing.
True, but just because that's a problem doesn't mean hardware is always going to mature at the same pace, nor that software demands will scale linearly. For instance, time was that a 2 year old desktop couldn't play modern video games, right? Here I am with a 3 year old desktop doing just that. Sure, the bells and whistles aren't set at 11, but it's entirely playable.
Same with server hardware. Time was 5 years was a painful wait to update your server. CPU/disk speeds matured at such a rate, and service
Re: (Score:2)
The company I work for has 10 year old servers. The demand began 10 years ago when facebook was "new".
The execs don't understand why those servers "were running fine 10 years ago and nothing has changed, but we have problems EVERY DAY now, and the system is really slow!!!". Wow.
Re: C/C++ are fucked: Why? Ok proof... apk (Score:2)
Borland used to be the compiler of choice, then Visual C++ 2.0 shipped. Nail in the fucking coffin. Ha!
Re: C/C++ are fucked: Why? Ok proof... apk (Score:2)
But it is true. Pascal (and, in particular, Modula and Oberon) were always much nicer to work with than C. Both languages have a great feel to them and are faster than C.
Add to this that Pascal languages also have a faster stack calling convention than C
Re: (Score:3)
C has a slightly heavier calling convention to support variable arguments, something you cannot do with Pascal calling conventions.
As to the "language" being faster than C: now you're just making shit up.
Or maybe you're not. I've always thought English is faster than French.
Re: (Score:2)
Add to this that Pascal languages also have a faster stack calling convention than C ...
Nope. Both are absolutely equally fast.
There is nothing faster in one or the other; they only push the arguments in different order, and in Pascal the called routine clears the stack, while in C the caller does.
Re: C/C++ are fucked: Why? Ok proof... apk (Score:2)
Ignorance. The calling convention is defined by the ABI and has absolutely nothing to do with the language.
Re: C/C++ are fucked: Why? Ok proof... apk (Score:5, Interesting)
Which really is a shame. Borland/Inprise/CodeGear/Embarcadero screwed the pooch on this. Delphi and Object Pascal are very capable languages and development systems. But somebody got greedy and started charging excessive amounts while the rest of the world moved towards OSS and free:
Delphi and C++Builder had a thriving community of loyal developers who created amazing, usable components. That all ended when Microsoft released .NET and Visual Studio for free, and Inprise thought the future was Delphi on Linux via Wine (aka Kylix) but had a stupid licensing model.
Embarcadero still charges a lot for Delphi and RAD Studio. But the Community Edition is free, and you can use CE for free until you hit a $5K limit on sales.
You CAN use Delphi to create native apps for Windows, macOS, iOS and Android, and you can often do it with the same code base. Linux services can be created with the Enterprise edition (not free). You do need a Mac to build macOS and iOS apps; this is an Apple thing. The Delphi and RAD Studio IDEs are still Windows only.
FreePascal, a Delphi 7 syntax clone, can target many more environments (and it's free). Lazarus is a Delphi-like IDE for FPC.
The latest incarnation of Delphi supports many of the "modern" paradigms and is not as obsolete as some would have you believe.
Smart Pascal can transpile to ECMAScript.
Embarcadero also purchased Sencha. It's available in the Architect edition, I believe. They still haven't mastered the web, but it does the other stuff really well.
Re: C/C++ are fucked: Why? Ok proof... apk (Score:4, Interesting)
That all ended when Microsoft released .Net and Visual Studio for free and Inprise thought the future was Delphi on Linux via Wine (aka Kylix) but had a stupid licensing model.
I'm not sure that they ever thought that. Borland/Inprise did, however, decide to chase the enterprise segment with very little success. They built their business selling to home/hobbyist/solo devs (hence their product had to be cheap). It's when they decided to ignore that segment and increased their prices by a factor of almost 10 (because "enterprises pay that!") that their revenue went off a cliff.
Re: C/C++ are fucked: Why? Ok proof... apk (Score:3)
I would have to disagree with you here. Turbo Pascal was developed and sold to the hobbyist or individual developer at $49.99.
At the time, Pascal compilers generated p-code and cost hundreds of dollars. Turbo Pascal generated native code for DOS and CP/M.
When Delphi was first released, it was intended to be a VB killer. It generated fast, 16-bit code, was truly object oriented, supported exception handling, and provided access to databases using the BDE (Paradox for anything but Enterprise). The Pr
Re: (Score:2)
What I would love to see, however, is for Embarcadero to take the Delphi IDE approach with another language such as Swift
Wow, way to mess up something good.
Re: (Score:2)
I am a contract programmer / DevOps engineer with over 25 years' experience working in Silicon Valley. Lots of big sites -- eBay, PayPal, Intuit, StubHub -- are running their sites on Java JSPs and servlets on Tomcat. Say all you want about Java: it's a horrible language, it's slow, it's big, yatta, yatta, yatta. The fact remains that Java is all over the place and not going away. And companies pay good money for experienced Java back-end developers.
Re: (Score:2)
The same could be said about COBOL.
The fact that companies use it doesn't mean it's good. It just means the architecture is crap that doesn't leave the door open to change.
Like shitty DB backends without abstraction layers, laced with vendor lock-in. Oh, that very convenient SELECT TOP... oopsie, you're now locked into MSSQL forever.
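That is exactly what a thin abstraction layer buys you. A sketch with SQLAlchemy Core (table and column names hypothetical): write .limit() once and let the dialect emit TOP for MSSQL or LIMIT for everything else:

    # One query definition, two vendors' SQL.
    from sqlalchemy import Column, Integer, MetaData, String, Table, select
    from sqlalchemy.dialects import mssql, sqlite

    meta = MetaData()
    users = Table("users", meta, Column("id", Integer), Column("name", String))

    query = select(users).limit(10)   # no vendor-specific syntax in app code

    print(query.compile(dialect=mssql.dialect(),
                        compile_kwargs={"literal_binds": True}))   # SELECT TOP 10 ...
    print(query.compile(dialect=sqlite.dialect(),
                        compile_kwargs={"literal_binds": True}))   # SELECT ... LIMIT 10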
Re:I'm part of the 99% (Score:5, Insightful)
The same could be said about COBOL.
The fact that companies use it doesn't mean it's good. It just means the architecture is crap that doesn't leave the door open to change.
The fact that companies use it, and have been using it for the past 30 years, means it is at least good for one thing -- letting your business-critical programs run for 20-30 years. Not like the trendy junk that expects you to rewrite everything to the latest fad every 3-5 years.
Say what you will about COBOL, the fact is there are loads of COBOL programs written 20-30 years ago still running unchanged in the world.
If you are in a real business, not some fad-chasing startup trying to get rich quick in 5 years, you want your core business programs to continue to run 10, 20, even 30 years after you invested millions to write them.
THAT's the reason most developers don't tweet, blog, or go to user groups (of the latest fads). They are busy doing their job and earning their pay, rather than trying to drum up interest in the latest fad.
Re: (Score:2)
Your "30 years unchanged" program has only been running for 30 years because your company has paid millions every year to support the hardware AND software lock ins from IBM.
Your program has been running for 30 years because IBM has maintained their infrastructure for 30 years with upgrades to let you run your crappy cobol. It's not running in a mainframe anymore. It's hosted in a VM now.
Your 30 year old code isn't "good". Someone's newer code is just making it think it's good.
Don't fool yourself. I know yo
Re: (Score:2)
Your "30 years unchanged" program has only been running for 30 years because your company has paid millions every year to support the hardware AND software lock ins from IBM.
Sure, now go and name another company that supports running your programs unchanged for 30 years for merely a few million a year.
If paying a few million a year to run your core business is "too expensive", your company is too small and your workload isn't that important. Go ahead and rewrite the whole thing every few years; it wouldn't cost much even if the migration is botched.
But for a real business that will lose millions because their program stopped working for a few hours, paying a few million a year to avo
Re: (Score:2)
But for a real business that will lose millions because their program stopped working for a few hours, paying a few million a year to avoid those lost service hours is worth every penny.
At a certain company I once worked at -- you have heard their name and most likely used their services -- I was told by the CTO that for every minute the site was down they were losing $100,000.00. The entire site was Java JSPs and servlets running on Tomcat.
Re: (Score:2)
Say what you will about COBOL, the fact is there are loads of COBOL programs written 20-30 years ago still running unchanged in the world.
They aren't unchanged, most of them are being actively maintained.
Re: (Score:2)
Hard to see why the hackers have left that treasure trove of garbage collection data alone.
Re: (Score:2)
Indeed, since the garbage being collected by the garbage collectors sometimes contains sensitive stuff.
Re: (Score:2)
Simple is the easiest to exploit.
Sorry, but this just isn't true. Simple is far easier to understand and verify than a complicated mess of frameworks and dependencies.
Re: Ignoring security and privacy (Score:2)
Do you count the fact that all iPhones run on *BSD as failure?
Re:Me & My KIND are a dying breed... apk (Score:4, Funny)
Me & My KIND are a dying breed
Sadly, no. We seem to have more dumbfucks every single generation.