
How Is Computer Programming Different Today Than 20 Years Ago? (medium.com)

This week a former engineer for the Microsoft Windows Core OS Division shared an insightful (and very entertaining) list of "some changes I have noticed over the last 20 years" in the computer programming world. Some excerpts (a quick code sketch of the first point follows the list):

- Some programming concepts that were mostly theoretical 20 years ago have since made it to the mainstream, including many functional programming paradigms like immutability, tail recursion, lazily evaluated collections, pattern matching, first-class functions and looking down upon anyone who doesn't use them...

- 3 billion devices run Java. That number hasn't changed in the last 10 years though...

- A package management ecosystem is essential for programming languages now. People simply don't want to go through the hassle of finding, downloading and installing libraries anymore. 20 years ago we used to visit web sites, download zip files, copy them to the correct locations, add them to the paths in the build configuration and pray that they worked.

- Being a software development team now involves all team members performing a mysterious ritual of standing up together for 15 minutes in the morning and drawing occult symbols with post-its....

- Since we have much faster CPUs now, numerical calculations are done in Python which is much slower than Fortran. So numerical calculations basically take the same amount of time as they did 20 years ago...

- Even programming languages took a side on the debate on Tabs vs Spaces....

- Code must run behind at least three levels of virtualization now. Code that runs on bare metal is unnecessarily performant....

- A tutorial isn't really helpful if it's not a video recording that takes orders of magnitude longer to understand than its text.

- There is StackOverflow which simply didn't exist back then. Asking a programming question involved talking to your colleagues.

- People develop software on Macs.
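
A quick sketch of the concepts in that first excerpt, in modern Python (3.10+ for match); an illustration added for reference, not anything from the original article:

    from dataclasses import dataclass
    from itertools import islice

    @dataclass(frozen=True)              # immutability: instances cannot be mutated
    class Point:
        x: int
        y: int

    def squares():                       # a lazily evaluated collection: a generator
        n = 0
        while True:
            yield n * n
            n += 1

    def describe(p: Point) -> str:       # structural pattern matching (Python 3.10+)
        match p:
            case Point(x=0, y=0):
                return "origin"
            case Point(x=0) | Point(y=0):
                return "on an axis"
            case _:
                return "somewhere else"

    handler = describe                   # first-class functions: assigned like any value
    print(list(islice(squares(), 5)))    # [0, 1, 4, 9, 16] -- computed on demand
    print(handler(Point(0, 3)))          # "on an axis"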

In our new world where internet connectivity is the norm and being offline the exception, "Security is something we have to think about now... Because of side-channel attacks we can't even trust the physical processor anymore."

And of course, "We don't use IRC for communication anymore. We prefer a bloated version called Slack because we just didn't want to type in a server address...."
  • by Rigodi ( 1000552 ) on Monday January 13, 2020 @04:13AM (#59614588)

    One major thing that changed in code writing is that memorizing the language and APIs (parameter order, prototypes, etc.) is nearly useless now. Code editors are here to remind us how to call every method or function, snitching on parameter types, default values, variations, etc. Imagine what it was like to do this with PHP in 2001. Code jungle!
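
    A minimal sketch of the introspection behind those editor hints, using only Python's standard library (the function here is hypothetical):

        import inspect

        def greet(name: str, punctuation: str = "!") -> str:
            return f"Hello, {name}{punctuation}"

        # Roughly the information an editor tooltip surfaces:
        print(inspect.signature(greet))   # (name: str, punctuation: str = '!') -> str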

    • by Z00L00K ( 682162 )

      What you do in programming today, for good and bad, is utilize pre-built packages without really knowing how they work internally.

      This leads to issues, such as not understanding the best strategy for indexing data.

    • by Anonymous Coward on Monday January 13, 2020 @05:07AM (#59614662)

      Yes, a 'smart' editor where you can just hover over and see the parameters is very useful, but they did exist in 2000 too, albeit not for every language under the sun.

      The biggest improvement to me is documentation and how easy it is to keep open: now I just have a dual-monitor setup. Monitors cost virtually nothing and you don't need a dual-GPU setup to drive them. To me that's the biggest improvement: 2x widescreen HD now vs. 1x 1024x768 (or lower) then.

      • I had dual monitors 20 years ago as well. It took up a bit more desk space before the advent of LCDs, though. Find a cheap, basic PCI (not PCI Express) graphics card and an old 14-inch monitor that somebody didn't want anymore, and that was it. Even as a poor student it was relatively affordable to have a dual-monitor setup. I don't find widescreen to be that much of a help, and I almost wish there were more monitors around designed to be used in portrait mode.

      • by skids ( 119237 ) on Monday January 13, 2020 @01:45PM (#59616208) Homepage

        The biggest improvement to me is documentation

        Really? Because I remember a day when I could look up a function and have its behavior described in plain language, including all corner cases, with caveats about its potential misuses mentioned.

        Nowadays much of the documentation lacks this level of detail, so being able to hyperlink around the documentation may be nice, but if the contents are crap, it's not very useful compared to actually writing detailed descriptions of what the function/type/data-structure does.

        Partly this is due to dynamic "memory-safe" languages not producing SEGVs when you feed bad data in anymore. But that doesn't fix the problem of what the function went and did with that bad data.

        • by dwywit ( 1109409 )

          I remember the day our first AS400 was delivered from IBM, circa 1989. The machine didn't even have a CD drive; we loaded and backed up with tape.

          There were boxes and boxes of user and reference manuals, programming references, guides, and so on. So much detail.

          • IBM's programming docs were always pretty good. Some DB2 SQL manual or other was the best-written intro to SQL I've read, and it came free with the $2M mainframe.
            • by dwywit ( 1109409 )

              I pretty much taught myself AS400 sysadmin and RPGIII programming from those manuals.

              My early code wasn't elegant or efficient, but it worked. Thankfully, it became more elegant and efficient as time passed.

              I recently re-keyed a process monitor program from 1996 into a source file at a shared account at PUB400.com, and it compiled first time. I was quite proud of that.

    • by Viol8 ( 599362 ) on Monday January 13, 2020 @06:27AM (#59614802) Homepage

      Sure, you don't need to know this day to day, but in interviews you're (unfortunately) expected to know this sort of stuff, and you will get called out if you don't at least know the basic usage of, for example, a container class.

    • by BAReFO0t ( 6240524 ) on Monday January 13, 2020 @07:15AM (#59614856)

      RAD environments were a thing 20 years ago. I used to use Delphi, and it was great fun.

      And we always had the reference help open in the other window (or literally as a book in front of us), much earlier than that. Which amounts to the same thing, just a bit slower. Not that much slower, though; pretty usable after some experience. I could easily get to any not-too-rare function reference I wanted within, subjectively, three page turns. (The first one got me within 10-20 pages.)

      Nowadays you still do the same for less mainstream languages.

  • by lucasnate1 ( 4682951 ) on Monday January 13, 2020 @04:15AM (#59614592) Homepage

    There was IRC; you used to get quick and useful answers there, at least for beginner questions like the ones on SO.

  • 20 years ago there was a vibrant asm community. Today, except for some remnant pockets of diehards, asm is dead.
    • Re:asm (Score:5, Insightful)

      by NoMoreACs ( 6161580 ) on Monday January 13, 2020 @04:58AM (#59614654)

      20 years ago there was a vibrant asm community.

      Today, except for some remnant pockets of diehards, asm is dead.

      I saw this being posited by the creators of the Hi-Tech C Compiler about 15 years ago.

      The answer now remains the same as then: Assembly is still useful for some (very specific) functions in real-time embedded applications.

    • Or game developers.

      Or embedded developers.

      It is very far from dead. In fact, you could not use any computer or microcontroller-based device without somebody having written some asm for it.

      • by mobby_6kl ( 668092 ) on Monday January 13, 2020 @12:18PM (#59615848)

        I'm sure there are some edge cases where asm is used in embedded applications, but almost everything can be programmed in C nowadays.

        And honestly, the embedded space is probably the biggest change. Sure, you've got Java or .Net frameworks out the ass nowadays, but 20 years ago Joe Blow had no chance of building a complex microcontroller-driven device. Nowadays a decent understanding of C and an account with a Chinese PCB maker are enough to design, program and get your product made. Having been a hobby-level programmer for literally decades now, this is the closest I've actually come to delivering a real product.

    • by Entrope ( 68843 )

      Intrinsics in some languages (C and C++ are examples) basically let programmers use assembly instructions from a high-level language. Go doesn't have them on the theory that someone who cares that much about performance will just write a whole function in assembly. (Unfortunately, Go's designers also decided to use a novel dialect of assembly language, so people have to learn a third syntax.)

      Those are hardly "remnant pockets of diehards".

  • - Code must run behind at least three levels of virtualization now. Code that runs on bare metal is unnecessarily performant....

    My boss once told me that we did not need the performance of writing in C++, so we wrote in Python.

    • Re:True (Score:5, Insightful)

      by lucasnate1 ( 4682951 ) on Monday January 13, 2020 @04:26AM (#59614614) Homepage

      Funny, am I the only one who feels that for working in large teams, C++ is actually EASIER to code in than Python? I think that the combination of reassignment (i.e. you can change a variable after it has been assigned) and dynamic typing is a dangerous trap.

      • Re:True (Score:5, Interesting)

        by _merlin ( 160982 ) on Monday January 13, 2020 @06:08AM (#59614778) Homepage Journal

        No, you're not the only one. Python is just too damn dangerous to refactor when it gets above a certain size. If you change an interface, in C++ you'll get a compile error for anything that doesn't implement new methods or hasn't updated to new signatures. You'll also get errors if any code using the interface isn't updated when the interface changes. With Python, good fucking luck. You need to carefully ensure that every single path through the code is exercised to make sure nothing is going to blow up.

        There are issues with the scoping rules as well - if you're expecting a name to refer to a global, but someone makes a local in an intermediate scope with the same name, the code will break in really non-obvious ways. Languages with strict name resolution, scoping and type systems are far safer and easier when you've got more than one person working on a project, you need to do any kind of refactoring, or you return to a project after it's left your short-term memory.
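
        A minimal sketch of that trap (hypothetical names): rename a method in Python and nothing complains until the broken path actually runs.

            class Store:
                def save_record(self, record):      # renamed from save(); callers not updated
                    print("saved", record)

            def nightly_job(store):
                store.save("report")                # no error at import time...

            nightly_job(Store())                    # ...AttributeError only when this line runs

        In C++ the equivalent rename would fail at compile time, which is the point being made above.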

        • by BAReFO0t ( 6240524 ) on Monday January 13, 2020 @07:25AM (#59614878)

          It is a really nice scripting language. But a scripting language.

          Don't use it for something it isn't for.

          If they had designed it for bigger projects, it would have strict typing and offer automatic type inference at compile time like, e.g., Haskell, with optional declaration of types too. But it would be hard to use as a scripting language then.

          As it is, Python is great for one-off scripts, shell-scripting-like glue, and as a game scripting language (think GDScript), and would be great for replacing JS in anything but big and complex web apps (which should not exist anyway, but be normal programs).
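
          For what it's worth, Python did later grow optional type declarations, roughly as described; a sketch (hypothetical names) that an external checker like mypy flags before anything runs:

              def total(prices: list[float]) -> float:
                  return sum(prices)

              # mypy: list[str] is not list[float]; CPython itself only fails
              # at runtime, when sum() chokes on the strings.
              total(["3.50", "4.25"])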

          • by _merlin ( 160982 )

            Sure, I use it all the time for build system glue, code generators, loading stuff into databases, etc. It's great if you need fast turnaround time and the script isn't too big to understand in its entirety. It really sucks for building big applications though.

          • Replace JS with TypeScript, please. If you are going to go to a more sane implementation, don't step from one frying pan to another. At least with TypeScript, your code still transpiles down to JS and you don't have to bring another interpreter/runtime into the mix.

            I've also felt that if you have to write a throwaway script, you'd probably be better off using perl or bash. Perl for the performance, bash for the ubiquity. Don't get me wrong, Python works well (for the most part), but Perl is at least an order of magnitude faster.

        • I know this is the wrong crowd, but Visual Studio is IMHO the best IDE out there. Yes, it's basically specific to .NET on Windows, but It Just Works. I haven't found a free IDE that works as well as VS or its little brother, VS Code. Yes, I might be biased because I use VS at work, but I have had my fair share of Eclipse, NetBeans, JDeveloper and many others. VS still wins. Yes, with all its quirks, crashes, and the need to "clean and rebuild" for random errors that make no sense.

          I had to refactor a "mockup g

          • Re: True (Score:4, Interesting)

            by _merlin ( 160982 ) on Monday January 13, 2020 @07:48AM (#59614914) Homepage Journal

            Eclipse's refactoring functionality works really well if you're working on a Java project and doing things the "Eclipse way", much like Visual Studio does for C# or C++ when you're doing things the way it wants you to. The problem I have is that all IDEs fall over when a project gets too big. Visual Studio's scalability has improved quite a bit, but there are still projects that are big enough to bring it to its knees. I just use good old vim these days.

          • Re: True (Score:4, Interesting)

            by lurcher ( 88082 ) on Monday January 13, 2020 @08:49AM (#59615098) Homepage

            "Yes, it's specific for .NET on Windows"

            Actually, it's not. I use VS for developing embedded C for Atmel processors, using gcc as the compiler and uploading and debugging code via an AVR Dragon.

            https://visualgdb.com/tutorials/avr/

            Works well, and has all the libavr/target-device context-sensitive help.

      • I've never warmed up to Python. My thinking just isn't compatible with it. Besides that, I find that any Python project consisting of more than one file is already hard to maintain. I realize that whole herds of people think it's great; I'm just not one of them.

  • CoC (Score:4, Insightful)

    by AHuxley ( 892839 ) on Monday January 13, 2020 @04:23AM (#59614608) Journal
    People who can't code have political review demands for the people who do all the work.
  • by lucasnate1 ( 4682951 ) on Monday January 13, 2020 @04:24AM (#59614610) Homepage

    Frameworks, medium*-level fool-proof programming languages, and faster hardware/data-centers are making it easier and easier for dumber people to program, thus making programmers less unique. Despite the constant hype, I suspect that we are gradually approaching the "programmer is nothing but a clerk and is treated as one" phase.

    (* - I don't use the expression "high-level programming language" for JS/Java because they are still lower than Prolog/Haskell/Erlang, which provide higher abstraction but require more mathematical, formal thinking, unlike Java and JS, whose main focus is not fast development or efficiency but being "good enough" and fool-proof.)

    • Re:Simplification (Score:5, Insightful)

      by AmiMoJo ( 196126 ) on Monday January 13, 2020 @06:40AM (#59614822) Homepage Journal

      Programming is stratified into many levels now:

      - Basic web devs, HTML/CSS and a bit of JS
      - Web app developers
      - High level language + framework developers
      - Database devs
      - People who write the frameworks and database engines
      - OS devs
      - Embedded devs

      etc.

      That's why I'm always sceptical of articles saying there is no demand for developers and training kids is setting them up for low wages, etc. It depends on what kind of coder you are: in embedded there is high demand and very few people who are even halfway good at it.

      • One further level: (Score:5, Interesting)

        by BAReFO0t ( 6240524 ) on Monday January 13, 2020 @07:29AM (#59614884)

        Hardware developers. The kind who code in VHDL and Verilog.

        Right next to that are electronic engineers. The ones who design the bare metal.

        After that, there is only physics and research.

        • by skids ( 119237 )

          This whole topic, along with the other one a few nodes up about the "limits of computing power", really got under my skin... I abandoned EE/CSE after a bachelor's degree because I saw that chip development was outstripping software development and general technology adoption, and decided it was better to apply technology to real-world problems than to slave away 70 hours a week at an IBM-ish corporation eking another 10% performance increase out of the silicon. Well, that and I didn't have to bother too much

      • Re:Simplification (Score:4, Interesting)

        by mr.morbo ( 6346556 ) on Monday January 13, 2020 @07:55AM (#59614930)

        I haven't met many "web devs" but the ones I have met were all pretty clueless at programming, preferring a brute-force "type as much as you can and it'll eventually work - kind of" approach to programming that results in unmaintainable spaghetti.

        And that kind of "developer" then graduates into far more important lower layers of the stack and fucks it up for everyone.

    • I may be a dumb programmer who uses libraries and frameworks. But if you want to, you can go ahead and write your own crypto libraries from scratch. See how that goes.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Monday January 13, 2020 @04:25AM (#59614612)

    Some more details:
    That standing thing is not a mysterious ritual. It actually makes sense.
    As for cushy scripting environments making programming easy but hard on memory, eating up many advantages of hardware advancements: that is soon going away, with physics catching up on hardware, energy becoming more pricey, and cloud/microservice latency, network load and reliability coming back to bite everyone blindly hopping onto this fad. ... This is actually a good time to warm up those system-level development skills of yesteryear once again, and with things like Rust available these days that might also be fun too.

    • by Ambassador Kosh ( 18352 ) on Monday January 13, 2020 @04:42AM (#59614630)

      I like the direction that HPC and machine learning are moving. You have things like tensorflow where all the hard work is done but the setup can be done in Python. From what I can tell, in my tensorflow stuff 1% of the time is spent in the Python code, but writing the Python code in C++ would take a lot more time for no real gain.

      There are lots of nice libraries in numpy, scipy, sklearn etc. which are implemented in low-level languages and just called from a higher-level language.

      Even the simulator we use here was written in C++ for speed, but we normally control it from Python. Writing all the control and processing stuff in C++ would be a marginal gain but at a huge cost in terms of writing the code.
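
      That division of labor in miniature, as a sketch: Python as glue over compiled code, via the standard library's ctypes (the library path assumes Linux):

          import ctypes

          libm = ctypes.CDLL("libm.so.6")          # the C math library, compiled code
          libm.cos.restype = ctypes.c_double
          libm.cos.argtypes = [ctypes.c_double]

          print(libm.cos(0.0))                     # 1.0, computed in C, driven from Python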

  • by UPi ( 137083 ) on Monday January 13, 2020 @04:33AM (#59614618) Homepage

    ...but I've been thinking it for a while now.

    We (developers) have cultists (cough agile cough), huge inefficiencies, and generally much greater convenience. I'm not sure I'm entirely comfortable with the way our industry has gone, even if I love having packaged libraries, hate agile, hate npm, love Kotlin and the new functional-infused language features, and dread daily meetings.

    Curious what the next 20 years will bring (hopefully my retirement!)

    • by gtall ( 79522 ) on Monday January 13, 2020 @09:52AM (#59615304)

      "Curious what will the next 20 years bring" Electric Meeting Technicians (EMTs). These valuable bots will go to meetings for you, nod appropriately in the correct spots during a meeting, spout the latest in buzzword bingo to satisfy any management present, and generally make your life quite enjoyable. They will return from a meeting, plug themselves into your computer, and dump the contents of the meeting they have collected into your trash AND (this is the best part) automatically empty your trash for you.

  • by dehachel12 ( 4766411 ) on Monday January 13, 2020 @04:34AM (#59614622)
    business requirements are still not final.
  • by melted ( 227442 ) on Monday January 13, 2020 @04:44AM (#59614632) Homepage

    Numerical computing isn't done "in Python". Python is merely glue over high-performance, low-level C++ libs in this case. The former Microsoft engineer is talking out of his ass.

    • by Viol8 ( 599362 ) on Monday January 13, 2020 @05:34AM (#59614708) Homepage

      You could argue that for a lot of Python: most of it just calls C libraries or code. Unfortunately there's enough overhead in the interpreter to make it slow as hell.

      • by jabuzz ( 182671 )

        In fact, most numerical computing is still done in compiled languages. The vast majority is done with software packages, whether commercial or open source. If you want to do some CFD you don't write your own code, you use a preexisting software package to do it. It might be open source like OpenFOAM, or commercial like Ansys or StarCCM, but it is rarely stuff you wrote yourself, and if it is, it's in a compiled language.

        Hell, we don't let you run Matlab .m code on our compute nodes, you can jolly well c

    • At the large financials (think quantitative modeling), lots and lots of numeric computing is done in Python.
      • by Ambassador Kosh ( 18352 ) on Monday January 13, 2020 @08:57AM (#59615122)

        They are probably using numpy, scipy, etc., probably with numpy linked to a high-performance BLAS library. Most of those calls are going to be to high-performance lower-level code. It is actually pretty common to define a problem in Python and have it solved by a high-performance library.

        Defining a neural network in C++ would take a lot longer and give no performance advantage compared to Python. You definitely want to run it in a low-level language, though.
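
        A sketch of what "linked to a high-performance BLAS" means in practice; the @ below dispatches to a compiled BLAS routine rather than Python loops (assuming numpy was built against one, as the usual binary wheels are):

            import numpy as np

            a = np.random.rand(500, 500)
            b = np.random.rand(500, 500)
            c = a @ b            # GEMM in the linked BLAS library, not interpreted code

            np.show_config()     # reports which BLAS/LAPACK this numpy build uses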

  • As of 40 years ago, I was programming on what would be considered a hobby computer - the RCA VIP (though there were others). As originally sold it came with 2K of RAM (which we had upgraded to 8K by then), a hex keypad (we'd managed to get hold of an old teletype and convert it to ASCII), video output to a standard TV with a resolution of 64x32 (pixels - and monochrome at that), and you were able to save or load programs from standard audio cassettes. It came with a small language good for writing your own
    • Me too! Except that it was a Finnish clone of the VIP, called TELMAC. It had 2k of memory, and room for another 2k, "if someone could find use for so much memory", as it said in the manual. Later we got a text-only display card with 16 lines of 64 characters, and extra 8k of memory, and a full keyboard.

      In those days I knew the instruction set by heart, and dreamed directly in hex. I wrote a lot of software for the 1802, some games, an assembler (the first version didn't have names for the instructions, sinc

  • by hcs_$reboot ( 1536101 ) on Monday January 13, 2020 @04:57AM (#59614652)
    Competence got diluted by the number of I-do-it-because-it-pays-well programmers spawned by the many I-teach-it-because-it-makes-money schools. 20, let alone 30 or 40, years ago, people were programming because they liked it.
  • All of these only apply to Microsoft developers.
    Elsewhere, virtualization has been around for more than 20 years,
    functional programming as well.
    Programming practices will eventually catch up inside MS as well...

    • Yeah, functional programming? Pattern matching?

      There are some real head-scratchers there for sure. Maybe the only one that is accurate is Slack vs IRC. Though I've never used Slack so I'm just presuming that text chat is text chat.

      • Slack is much more. Slack enables you to easily share syntax highlighted snippets of code. Slack makes it easy to use web hooks for all sorts of things. Most companies have an info channel showing all commits to git, new issues in their issue tracker, new deployments, etc.

        Slack makes it easy to share files with teams. It'll happily store those files indefinitely too.

        Audio and video conferencing is integrated in Slack.

        Slack is resilient to bad (mobile) connections. IRC requires a rock-stable connection.

        In sh
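
        For illustration, the webhook integration described above is little more than an HTTP POST; a minimal sketch using only the standard library, with a placeholder webhook URL:

            import json
            import urllib.request

            WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

            def notify(text: str) -> None:
                req = urllib.request.Request(
                    WEBHOOK_URL,
                    data=json.dumps({"text": text}).encode("utf-8"),
                    headers={"Content-Type": "application/json"},
                )
                urllib.request.urlopen(req)

            notify("new commit pushed to main")   # shows up in the team channel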

    • by Cederic ( 9623 )

      Yeah.

      Immutability? Been doing that since the mid-90s.
      Tail recursion? Been doing that since the 80s. In BASIC on a Commodore 64, ffs.
      Lazily evaluated collections? Been doing that since the late 90s.
      Pattern matching? Regexps were a thing when I was learning how to program.
      First-class functions? I didn't personally encounter good implementations of this until the mid-2000s, although that's only because I avoided Javascript like the plague.
      Looking down upon anyone who doesn't use them? Yeah, arrogance has been a co

    • It's about the useless *duplicate layers* that do nothing but eat performance!

      It's about completely insane idiots who think the inner-platform effect is a good thing. Not like a VM. But like ... browsers.

  • Git (Score:5, Interesting)

    by Professeur Shadoko ( 230027 ) on Monday January 13, 2020 @05:15AM (#59614676)

    Source code control used to suck.

  • I would say the main differences are:

    1) Tools: 20 years ago was even before SVN; we used CVS. Which tools are used depends on which language each programmer is using.
    2) Multi-core processors: we all have to have some idea of concurrent programming now, but 20 years ago everything was single-threaded (a minimal sketch follows below).

    There are other differences as well, scripting languages can be used for a lot more than glue between binary blobs, but I would say those two are the main changes.
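
    A minimal sketch of point 2, fanning work out across cores with Python's standard library (the workload here is hypothetical):

        from concurrent.futures import ProcessPoolExecutor

        def crunch(n: int) -> int:
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            # one worker per core by default; the map runs in parallel
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(crunch, [10**6] * 8))
            print(len(results))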

    • Scripting languages were never just glue, that was never an era.

      Maybe it was just a phase of somebody's career?

      • 20 years back, when Python did not support if statements outside a loop, it was unfeasible to prototype my numerical simulations in Python. The machines were too slow and the interpreter was slower than the current ones, so I would implement the code directly in C++.
        Nowadays, prototyping numerical code in a scripting language is done all the time, and not just by me.

        • by narcc ( 412956 )

          20 years ago, no one in their right mind would have used Python for anything (if they even knew about it). And no, not everything was single-threaded 20 years ago. How old are you? What do you imagine when you picture the year 2000? But none of this has anything to do with the point you're trying to make.

          See, 20 years ago, the king of so-called "scripting languages" was Perl, followed closely by VBA. Real work got done in both. See, at the time, everyone believed that computers were so much faster and had se

    • I once annoyed a program manager by insisting I needed a second Pentium for testing some Windows code since there were such systems deployed. This goes back to (at least) Windows NT 4, so 1997, or so.

      One of my personal computers was fitted with a motherboard that supported dual Celerons (Abit BP6 with 366 MHz chips overclocked to 550), because I couldn't really afford dual Pentiums, but still needed MP.

      I LIKED CVS!

  • by Viol8 ( 599362 ) on Monday January 13, 2020 @05:39AM (#59614718) Homepage

    "programming paradigms like immutability, tail recursion, lazily evaluated collections, pattern matching"

    Tail recursion - hardly cutting edge stuff. You can do it in C. Perhaps a novel concept at MS.

    Pattern matching - anyone who's used *nix for any length of time has used regular expressions (or even just shell wildcards). This is a long way from new.

    First class functions - even C can pass around function pointers as variables and use them just like standard function calls.

    Sorry, amusing though this is, a lot of it reads like it was written in the 1950s.
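
    For what it's worth, the tail-call transformation at issue, sketched in Python (which, unlike modern C compilers, does not perform it automatically):

        def gcd_rec(a: int, b: int) -> int:
            # tail call: nothing is left to do after the recursive call
            return a if b == 0 else gcd_rec(b, a % b)

        def gcd_loop(a: int, b: int) -> int:
            # the loop a tail-call-optimizing compiler effectively produces
            while b != 0:
                a, b = b, a % b
            return a

        print(gcd_rec(48, 18), gcd_loop(48, 18))   # 6 6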

    • "it can be done" is something entirely different from "It's done routinely by everybody".

      • by Viol8 ( 599362 )

        Regular expressions are pretty routine in the *nix world and function pointers are used all the time in any serious C development.

    • by _merlin ( 160982 )

      C compilers couldn't optimise tail recursion 20 years ago. You end up with an extra stack frame for each tail call. Since then C compilers have learned to turn tail calls into loops. And if you think C function pointers are first-class functions, you're missing a lot of functional programming.
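
      As a sketch of the difference: a closure capturing mutable state, which a bare C function pointer cannot express on its own:

          def make_counter():
              count = 0
              def bump() -> int:
                  nonlocal count        # captures and mutates the enclosing variable
                  count += 1
                  return count
              return bump

          c = make_counter()
          print(c(), c(), c())          # 1 2 3 -- the state travels with the function value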

      • by Viol8 ( 599362 )

        They're not first-class functions, but they do most of what most people ever need. And of course C++ has lambdas.

        Anyway, functional programming is overrated; it's just procedural programming with a different style. Passing around anonymous functions or assigning data to functions to create closures doesn't do anything that couldn't be done in plain old procedural code with hardly any extra code, and in some cases a lot more efficiently, as the latter doesn't tend to burn up stack space like it's going out of

        • by _merlin ( 160982 )

          Sure, you can achieve anything with a procedural language. You can also achieve anything with a Turing-tarpit language like Brainfuck. We invent higher-level languages because they're more expressive and can make us more productive. When used properly, functional programming techniques can make code easier to understand and statically analyse.

          Of course, functional programming is more than two decades old. Ericsson was rolling out Erlang-based phone switches in the late '90s, and LISP machines were around ye

          • by Viol8 ( 599362 )

            "When used properly, functional programming techniques can make code easier to understand"

            Recursion is rarely easier to understand than iteration. Even simple recursion often requires a lot of head-scratching to figure out exactly where it'll stop and return, and complex recursive code can be a virtual black box to anyone except the author.

    • You think emulating a half-assed version of them is the same as the real thing integrated at the core.

      By your logic, you can do that in asm too, because you could code the frameworks that enable that in other languages yourself.

      But C coders also still haven't realized the error-proneness of constantly watching memory integrity instead of automating it away like ... like a *programmer*. So what do they know.

  • by sonamchauhan ( 587356 ) <sonamc.gmail@com> on Monday January 13, 2020 @05:44AM (#59614724) Journal

    - The laptop is the default development device. Meant to enable 'work from home', it also enables the 'support from home' now expected for important releases.

    - The cloud is the default deployment environment. Not as cheap or transparent as envisaged, it does make it easy to rent additional compute, data-transfer and storage.

    - Service management (ticket and change management) is hosted with cloud services now. Changing providers implies some information loss.

    - Version control software (Sourcesafe, CVS, Subversion and friends) is mostly superseded by git.

    - Monitors are now much wider than tall. As a consequence, side-to-side glancing and lateral mouse movement have increased. Mouse mats have mostly disappeared.

    - Devops seems to be a 'developers with pagers' initiative designed to reduce the need for IT and network admins.

    - The concept of 'work phones' (mobile and wired) and 'business cards' is slowly disappearing.

    - The era of paid classroom training has mostly ended. Nowadays you are expected to learn from vendor-provided online resources -- nowhere near as effective as in-person training.

  • No respect (Score:5, Insightful)

    by syousef ( 465911 ) on Monday January 13, 2020 @05:47AM (#59614730) Journal

    Back then, if you wrote clean code, or code that was clever and earned or saved the company money or got customers, you were respected. Now they just wonder why you haven't been outsourced yet and lump you with enough red tape to drown in.

  • ...no one has mentioned the 200 lb elephant in the room yet? OK... I'll say it.

    Porn.. it's much better.

    [tell me you weren't thinking it]

  • The more things change, the more they stay the same.

    • - Management still trying to control and micromanage programmers like they are simple spreadsheets. They even took over agile to go to the exact opposite of what it was intended for
    • - Lots of tools and strategies that supposedly make programming faster, simpler and cheaper, yet somehow software development costs keep rising and software is more complicated than ever
    • - Cycle of insourcing/outsourcing and centralizing/decentralizing still cycling as ever
    • - Still e
    • - Despite endless effort to standardize interfaces, integrating different software systems is still a disaster

      What, you mean things like XML and JSON don't magically enable software to talk to each other, and more importantly understand each other?

      Perhaps we need something new and hyped, like blockchains! And AI!

      I mean, API has both an A and an I in it, so there must be something there!

      A significant amount of my time is spent moving and comparing/cleaning data between separate systems...

  • One silver lining (Score:5, Insightful)

    by NoNonAlphaCharsHere ( 2201864 ) on Monday January 13, 2020 @06:39AM (#59614820)
    Nobody (deliberately) starts new projects in Perl anymore.
  • by cowdung ( 702933 ) on Monday January 13, 2020 @07:09AM (#59614848)

    One thing that is interesting about the Python/numpy trend, though, is that vectorization is a big deal. Since Python can't handle loops very quickly, vectorization is no longer optional, so code becomes performant through a different, not-necessarily-sequential model of thinking.

    That is an advance in the right direction, since people are getting used to programming in non-sequential ways (good for parallelism).

    (But then again, maybe that's how it was in Fortran, I don't know. Python and AI programming are making that sort of programming more popular as well.)
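
    The vectorization point in brief; a sketch of the same arithmetic written both ways:

        import numpy as np

        xs = np.arange(1_000_000, dtype=np.float64)

        def slow(values):
            out = np.empty_like(values)
            for i, x in enumerate(values):   # Python-level loop: interpreter overhead per element
                out[i] = x * x + 1.0
            return out

        fast = xs * xs + 1.0                 # vectorized: one expression, compiled inner loops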

    • Yes, Fortran vectorizes code. It's one of the main reasons that Fortran outperformed C before the restrict keyword (and Fortran still outperforms C in some situations, because many programmers don't use the restrict keyword and the optimizer takes a cautious approach).

  • All I can add is that each level of virtualization requires its own full-blown OS. Which, of course, must be shittier than the one below.

    Also, 20-40 years ago, HTML, plain-text config files and using a CLI shell were normal *user* things (and I think they still are), not programmer things. Users were expected to use their brains, not twist a clickwheel and drool "WANT!" to express wanting the dumbest common denominator they were told they want.

  • A sign of disaster (Score:5, Insightful)

    by TheDarkMaster ( 1292526 ) on Monday January 13, 2020 @07:24AM (#59614874)
    In my humble opinion, when a programming language needs a package manager to be used properly, it is a sign that something is very, very wrong.
  • by bradley13 ( 1118935 ) on Monday January 13, 2020 @07:28AM (#59614882) Homepage

    People use more and more frameworks and external libraries. In the best case, this massively increases your productivity, because you don't have to implement everything yourself. In the worst case, those external dependencies bring in other dependencies, which bring in even more dependencies, and your cute little application becomes a behemoth, almost impossible to maintain (because *something* in those dependencies is always changing), and with a completely unknown set of vulnerabilities.

    Web development lets me answer TFS very directly: I last did web development about 20 years ago, when you coded everything yourself. PHP, HTML, CSS (v2) and maybe just a smidgen of JS. I am, just now, being asked to do a website "for fun" (read: no pay). I look at web development now and... jeezum... how much external Javascript am I supposed to use? Sure, you get a lot of stuff for free, but... do I trust the code? How stable is it? How long does it take to learn to use it? How often is it updated? How often will those updates break something? After considering all of that over the long term: is it *really* easier and more productive? I'm not entirely convinced...

    So, yeah: What's changed in 20 years is the amount of external crud that people tie into their software, thinking that it saves them time and effort.

    As far as some of the points in TFA: "immutability, tail recursion, lazily evaluated collections, pattern matching, first class functions". Those are all great, but they certainly aren't being used. The current LTS version of Java, for example, still doesn't implement tail recursion correctly. Lambdas are a kludge. Functional programming paradigms just do not belong in a fundamentally imperative language. If you want functional programming, write in Scala - it runs in the JVM and can be mixed with Java as needed. Fake functional programming in imperative languages is just a dumb idea. Oh, and get off my lawn.

  • ... and what was a craftsmanship has turned into a sort of religion.

  • Lazy. (Score:5, Insightful)

    by UID30 ( 176734 ) on Monday January 13, 2020 @07:38AM (#59614900)
    In general, people who learned to program 20+ years ago do not believe in the words "unnecessarily performant". For the most part we look at the wasteful code as it exists now on most "popular" systems and just shake our collective heads. A typical "app" now starts up and consumes 10 MB of memory for "hello world", while the computer that landed the first astronauts on the moon had 64 KB of RAM... total. Powerful hardware has made us lazy.
  • "A package management ecosystem is essential for programming languages now. People simply don’t want to go through the hassle of finding, downloading and installing libraries anymore. 20 years ago we used to visit web sites, downloaded zip files, copied them to correct locations, added them to the paths in the build configuration and prayed that they worked."

    While Windows & macOS still don't have one.
    If you want to install software on those OSes, you still need to visit web sites, download zip files, copy them to the correct locations, and pray that they work.
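
    The modern workflow the quote describes, sketched for Python's ecosystem ('requests' is a real package; the rest of the sketch is illustrative):

        #   $ pip install requests      <- one command replaces the zip-file hunt
        import requests                 # resolved from PyPI, no manual path setup

        r = requests.get("https://pypi.org")
        print(r.status_code)            # 200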

  • by SharpFang ( 651121 ) on Monday January 13, 2020 @07:54AM (#59614926) Homepage Journal

    The paradigm of "not reinventing the wheel" has been taken to an unhealthy extreme.

    Even if your function would take 2 lines of code to get the work done, you're not supposed to write it if it already exists in some library. Instead, you import the library, set up its instance through a factory (often a good screenful of code), massage your input into a format digestible by the function (say you have it in a std::string while the function takes UTF-16), then you run the function and massage its output back into the shape you need. And so, within 3 screens of code and about 4 libraries (one to do the job, two to handle the conversions, and one to generate an abstraction layer to hide everything under the hood), an extra 5 seconds of compile time, roughly 8000% the runtime load, 30x the RAM usage, and about 15 new dependencies (did you think these libraries don't use any libraries themselves?!), you get the job done, taking only about 3 days to write what would have taken 1 minute 20 years ago... but you didn't reinvent the wheel! You used libraries meant to do your job!

    • by Junta ( 36770 )

      I think the Javascript ecosystem *particularly* suffers from this (an odd mix of Javascript being very much batteries-not-included and npm craziness).

      But yes, in general there are a lot of folks deathly afraid of doing simple things if there exists a library that purports to 'take care of it for you'. Never mind how bug-ridden that library is, or how it was designed for someone else's requirements...

  • by Junta ( 36770 ) on Monday January 13, 2020 @11:11AM (#59615536)

    The OSI 7 layer model has been superseded by the HTTP 1 layer model.

    (Yes, I know the OSI 7-layer model was never perfectly relevant, but it is a succinct expression of the general thinking about networking then versus now.)

  • Another one: even a lot of systems programmers have never coded assembly language and don't know how processors work. And, a lot don't need to.

    Which is IMO too bad.
