Agile is Killing Software Innovation, Says Moxie Marlinspike (theregister.com) 184

There's a rot at the heart of modern software development that's destroying innovation, and infosec legend Moxie Marlinspike believes he knows exactly what's to blame: Agile development. Marlinspike argued that Agile methodologies, widely adopted over the past two decades, have confined developers to "black box abstraction layers" that limit creativity and understanding of underlying systems.

"We spent the past 20 years onboarding people into software by putting them into black box abstraction layers, and then putting them into organizations composed of black box abstraction layers," Marlinspike said. He contended this approach has left many software engineers unable to do more than derivative work, lacking the deep understanding necessary for groundbreaking developments. Thistle Technologies CEO Window Snyder echoed these concerns, noting that many programmers now lack knowledge of low-level languages and machine code interactions. Marlinspike posited that security researchers, who routinely probe beneath surface-level abstractions, are better positioned to drive innovation in software development.
  • Comment removed based on user account deletion
    • by Z00L00K ( 682162 )

      AI is just going to create new, even more bizarre and inexplicable bugs.

      • by will4 ( 7250692 ) on Friday August 09, 2024 @10:29PM (#64693942)

        Agile is not the top thing killing software innovation.

        Speculations on higher-ranked ones:
        - Packages by the dozens in modern applications, packages of unknown quality, uncertain maintenance levels, and uncertain lifespans
        - Vendors pushing major technologies with a 3-year full lifecycle and a too-early end of life
        - Documentation and examples on the web being little more than how to connect up a glob of possibly working software with above mentioned libraries
        - Decrepit protocols (HTTP), legacy languages (JavaScript) used for the basis of web apps along with 12 year old security protocols for web apps
        - Overloading of topics taught in computer science college degrees, leading to little deep knowledge and just a 1 inch deep survey of a large set of ideas

        - Fragmentation, cloud shortcuts, and non-programming used for an increasingly large set of computing tasks, ensuring that new developers in the field are much less likely to work on a system of significant size with a large code base. Configuring a pipeline of small scripts plus cloud does not give large-code-base development experience.

  • by OffTheLip ( 636691 ) on Friday August 09, 2024 @01:46PM (#64693086)
    A good programmer using the C language had all the tools necessary to build systems for many purposes without the black box abstraction. Of course with this power comes responsibility, and top skills are required. Businesses frequently don't want to pay for such skills. Incoming down mods expected.
    • by Darinbob ( 1142669 ) on Friday August 09, 2024 @02:38PM (#64693208)

      To the point of the topic, I interview candidates with a fellow employee, and his standard opening question is to get the candidate to describe, at a high level, how one of their projects works (or submodules, etc.). Without giving away any proprietary information, of course, and a project that is indeed listed and highlighted on their resume. Over time, so many candidates were just completely unable to do this. They appeared to have no idea how their product actually worked. They were apparently stuck inside a black box themselves, with different developers isolated from each other. Possibly not actually isolated, but in an environment where you never had to talk to anyone else or understand your role in the big picture.

      Biggest source of this appeared to be a certain large router manufacturer that rhymes with Frisco. Ask how data gets from their CPU to the next CPU, what sorts of synchronization techniques they use, etc., and the answer is invariably "I just call the API". And yet, their resume on the surface describes something cool and interesting! "Was a member of the 2024 US Olympic decathlon team", but you find out they just carried water bottles. So they just worked on "configuration manager"; then you ask about that and what it is, and apparently they just worked on the interface of configuration manager; then you ask how the interface worked, and it's just an API and they translated the requirements into slightly extending the API. Have to ask yet again, "This is what you feel was your biggest contribution to Frisco in the eight years you were there?"

      It feels kind of depressing to turn programming and design into a gray dreary 9 to 5 job...

      • Meh. It depends on what you are hiring them for. You obviously wanted people who could work very low level, maybe even to the point of inserting assembly into the C code. Not all C programming is at that level. There are many places still around that create code addressing business issues in C. Maybe your issue is that you didn't specify the level of coding you were hiring for.
        • If he's working for a large company, he's probably not able to choose who he interviews. There can be several layers of HR recruitment between the candidate and the job, and that's ignoring that the candidate also might be trying to change their own career direction in this case.
      • by Daina.0 ( 7328506 ) on Friday August 09, 2024 @09:48PM (#64693910)

        My career started in real-time processing. I was given a subroutine specification and an expected time to complete. The design was outside my knowledge, but that was okay. I didn't have to know anything outside the subroutine. The approach was difficult for the lead programmers/designers because they had to know everything and give out the assignments. It actually worked well and produced excellent results.

        Since then I've worked with 5 different companies that did Agile/Scrum of some sort. Every company does it differently, and different groups within the same company do it differently. Some use it as a bludgeon to drive employees forward and some use it as a mild framework to guide product development. No one does it right. Some do it horribly wrong and drive developers away. I had one company that started tracking story points and claimed they would never use them for employee performance evaluations. Guess what? Within two years they were used for performance evaluations.

        "If you ever use metrics to punish or reward, you’ll never see the truth again." -- Sally Elatta, founder, AgilityHealth

    • by kwalker ( 1383 ) on Friday August 09, 2024 @02:40PM (#64693212) Journal

      The problem is actually a bit wider than that.

      First off, a lot of corporate Developers are not high-achievers to begin with. Oftentimes they started out wanting to make video games as kids, but somewhere along the line discovered how hard that is to do in a relatively small, highly competitive arena. Those who don't like that arena quite frequently end up developing corporate applications.

      Second, agile tends to prioritize short-term goals. Sprints are 2 weeks most places; that is 80-90 hours at most (again, we're not talking about crunch time at EA or Rockstar Games, we're talking about some random Sprint at a company that is using the software they're writing, not licensing it to others), but quite often much less than 80 hours (meetings, status updates, company/department ceremonies, rabbit holes and side quests given by your boss...). Still, the tasks are expected to be done within the Sprint, and Management does not like it if things slip.

      Agile is supposed to be developer-driven, in theory. In practice, the Company has their own priorities that get communicated to the Developers, who then have to deliver within some arbitrary time frame, usually not set by them.

      All this tends to push the Developers into finding the least-time-intensive ways of solving whatever Task they have been given.

      Also, for those who are a little more motivated, the Company's short-term needs outweigh any sort of innovation (quality of life improvement, stability fix, framework overhaul, etc) the Developer(s) would like to see.

      None of this is language or framework specific.

      • by jmccue ( 834797 ) on Friday August 09, 2024 @04:41PM (#64693460) Homepage

        Often times they started out wanting to make video games as a kid

        Not me, when I was a kid, I wanted to write a personal finance application on DOS. This was before Quicken came out. But when I discovered blackjack and hookers, that dream went away.

        • I actually wrote a business chart app and spreadsheet as a kid. Won a prize too. Honestly, this was 1987, and the fun part was writing the device driver and low-level graphical primitives for the 16-color graphics card of the time.

      • At a business, software is supposed to solve business problems. The user stories are supposed to come from the end users/business. They set the priority on what needs to be done. You see, if the business can't do business, the business won't be in business. And the developers won't have a job. And businesses need the software that helps them do business, in a time frame that allows them to do business planning, and so can't be completely at the mercy of developers. Things take an actual length of time to build…
      • Second, agile tends to prioritize short-term goals.

        I was dropped into a complex government program that was 3 years into a 5-year disaster that was absolutely destroyed by the endless need for each team to demo "something" every sprint. The goals were a lofty mountain, but every sprint it was easier to go downhill for each team. Even worse, there were multiple contracts with teams working for different companies, so it was in no one's interest to make other teams look good. It eventually was the di…

    • A good programmer using the C language had all the tools necessary to build systems for many purposes without the black box abstraction. Of course with this power comes responsibility and top skills are required. Business frequently don't want to pay for such skills. Incoming down mods expected.

      Down mod? Nah. I've actually held a similar opinion... that languages like Java were the start of this downfall because while they were great for product delivery speed, they actually hurt the ability to program because they took away the need to understand efficient program statements and ways to minimize resource usage while maximizing speed. If I were 20 years younger, I could probably write a doctoral dissertation on this.

      • by egilhh ( 689523 )

        If I were 20 years younger, I could probably write a doctoral dissertation on this.

        No need. The paper "A Principled Approach to Software Engineering Education, or Java Considered Harmful" was written 16 years ago; see https://www.adacore.com/uploads/techPapers/principled_approach.pdf [adacore.com]

      • Idiot very much? "they actually hurt the ability to program because they took away the need to understand efficient program statements"????

        Efficient code in Java, or C, or C#, or C++, or Ada, or Pascal is exactly the same.

        Java, C#, and to a degree Ada, free you from the burden of memory management, in a certain sense, and that is all.

        C++ on the other hand has neat tricks - but the learning curve for the syntax alone is quite steep.

        If I were 20 years younger, I could probably write a doctoral dissertation

    • "A good programmer using the C language had all the tools necessary to build systems for many purposes without the black box abstraction. "

      That sounds cool, but doesn't help you if you ever get to be on a project big enough that it requires more than one person.

    • by nmb3000 ( 741169 ) on Friday August 09, 2024 @03:28PM (#64693350) Journal

      A good programmer using the C language had all the tools necessary to build systems for many purposes without the black box abstraction. Of course with this power comes responsibility and top skills are required. Business frequently don't want to pay for such skills. Incoming down mods expected.

      This feels like a non-sequitur. Suppose you're a genius C programmer on a team tasked with creating a tool for your customer. The tool is required to run in a web browser so that it easily runs anywhere, desktop or mobile, etc. It also needs an API over HTTP so that other systems can make use of it. This is non-negotiable and is very common today.

      What do you do? I don't care how great you are at C - it's absolutely the wrong tool for most of this job. You can deride the website, or mobile apps, but for better or worse, these are expected and common in modern software. You might decide that something like Electron is too bloated (and jesus, ain't that the truth) but that doesn't mean the best alternative is the lowest-level language available.

      I respect C, both historically and in modern use, but anyone claiming that "this one tool can do everything" is either trying to sell you something, deluding themselves, or stuck so far in the past that the scope of the topic was narrow enough that one tool really could do most of it.

      • Exactly - and the top 5% will be handing it to the rest of the industry to maintain.

        Do you want to burden a guy with a 1-yr programming certification with being perfect with pointer dereferencing?

        Back in the day I would do some prototyping work and I wrote most everything in Java with only the essential bits in JNI (C). My goal was always to minimize every line of C possible. "Can I push that up to Java?"

        I wasn't going to maintain that code but the guys who were weren't the prototyping type.

        THIS IS FINE.

        • I really hate the "everybody who isn't at my level is shit and ignorable" type of vulnerable narcissist.

          Especially when these same narcissists sneer at users, who only operate at the top level of the software system, the UI, and have no understanding of what runs underneath their buttons. Nerds who think that non-computer people are subhuman idiots are the problem here.

          Example: WhatsApp. I have a set of nerds in my company who sneer at WA users to prove how dumb they are, because they do not…

        • Only one of us on the planet doesn't have somebody better than them.

          TRUTH

      • by sjames ( 1099 )

        The problem there starts when the people are siloed. Once that happens, discussions of the API between the silos get bogged down in management when it should have been a text chat or water cooler conversation. It works much better if the front end people have some understanding and input of the back end and vice versa.

      • While I agree ...
        To nitpick: C runs under WebAssembly (or however that is called atm) just fine in the web browser.
        For a guy who seriously only knows C, and he has to win a competition to write, as fast as possible, a tic-tac-toe game that has to run in a web browser: it might work out.

    • by znrt ( 2424692 ) on Friday August 09, 2024 @03:35PM (#64693362)

      the reason agile spread like wildfire in the business isn't technical; it's that it provides plausible deniability in the face of failure at every management level, and the only thing management loves more than that is money.

      the tragedy was watching actual developers pushing for it. most of them were wannabe managers, but many weren't. they got sold on some of the concepts that were good ideas in isolation (pair programming, extensive testing, an iterative approach, and continuous refactoring (though this could easily become a ticking bomb)). it was a bad deal, though: it not only negatively affected software quality but also programmers' quality of life:

      see, when something goes wrong in an agile project, you can't blame the design and specification process because it doesn't nominally exist (it's just built up one user story at a time, and that's gospel), nor the project management, because as long as it fulfills the ritual (meetings, sprints, retros, whatever) it's assumed to be infallible too. so the only conclusion left is poor team performance, expressed in whatever way, and then ... it's crunch time! what else?

      it's effectively a way for management to push responsibility all the way down onto developers (who are powerless), and to plausibly deny any shortcomings all the way up the chain right to the top (who are clueless). so guess what happens in business when you let *all* people with decision power in the process be unaccountable. what could possibly go wrong?

      i'm happy this is slowly coming out now. i've been saying this since about the 2000s to no avail. i've been programming for most of my life, and it's been a blast to witness every new breakthrough and technology, but the whole agile bullshit has been the one thing that sucked the soul out of a profession i loved.

      then again, i'm not at all optimistic about what bullshit will replace agile in the age of generators. i really feel like i was born into a golden age of programming, a unique moment in history that will never return.

      • My experience was that a lot of developers embraced Agile to push the Waterfall Model out.

        • Exactly. Waterfall had so many failures that PM philosophies were bubbling up everywhere. Gated Phases, ITIL, and a host of more/less waterfallesque variants were breaking the large project horizontally and chronologically. However, most felt they didn't need the rigor of a Space Mission in specification or QA, so as deadlines slipped, quality went down. Businesses rarely have an appetite for months-long projects without a change in features.

          Agile was the concept of feature delivery in a fuzzy-then-crysta…

          • Agile isn't more than an umbrella term for a set of behaviors that say "we allow you to change your mind throughout the project". It doesn't replace anything else in architectural design, goal tracking or testing.
            Completely correct, but most /.ers who got butt-hurt in an "agile project" that was not in fact agile simply do not grasp it.

      • Another idiot hater?

        it provides plausible deniability in the face of failure at every management level

        The point of agile is to fail early and fail fast.

        You fail three sprints in a row? Then there are two possible reasons:
        a) you are simply bad at judging how much time a task takes, and hence cannot finish everything in a sprint
        b) you are simply not capable of doing the tasks because you lack the expertise, and hence do not finish everything

        First: you learn from a) and get better at estimating what you ca…

        • by znrt ( 2424692 )

          Another idiot hater?

          lol, ofc you had to be a believer ... enjoy your agile with those bananas, and good luck with that evangelism.

    • C programmers, take a jump in the lake. Yes, you know how to work with low-level hardware, etc., etc. You also know how to screw everyone else over with your terrible memory management. You can't take the high points of the job without understanding how much you've screwed everyone else over too.
      • by MinusOne ( 4145 )

        As an old school C programmer, this comment could not be more correct. C/C++ should be on the trash heap of programming languages. Unsafe data access, craptastic memory management, far too much craptastic legacy code, etc etc. I've spent my entire career working with systems in these languages and know in detail how messed up the whole ecosystem is.

      • The fun fact is: on low-level hardware, all the magic of C is not really used.
        It is as straightforward as writing something in Pascal or Ada.
        It is just ordinary code, usually no pointers, or certainly no moving pointers.
        You have a little bit of global memory, depending on the problem.

        And the code is exactly what super-big projects cannot have: the global memory is accessed at random times, with random writes or reads from just about everywhere.

        And? Why does that work? Because a washing machine only has 1k of…

    • Abstraction and encapsulation are what let large systems work at all. They're also what bogs them down eventually, but that doesn't mean they aren't necessary. C just doesn't do abstractions very well. It's possible, but it isn't what the language is good at. Get your abstraction right, and your code becomes obvious and your productivity skyrockets. Get it wrong, and your project will bog down. Using C minimizes the risk of getting it wrong, but it also makes it nearly impossible to get it right.
  • Go for it! (Score:5, Interesting)

    by timeOday ( 582209 ) on Friday August 09, 2024 @01:54PM (#64693106)

    Marlinspike posited that security researchers, who routinely probe beneath surface-level abstractions, are better positioned to drive innovation in software development.

    Fun fact, All versions of WordPerfect up to v5.0 were written directly in x86 assembly language. Didn't help them in the market though...

    But seriously, I agree that a depth of knowledge will allow insights that aren't otherwise possible. However I don't think that's very important for most run-of-the-mill feature additions and bugfixes, and those are a majority of all paid programming work.

    • Fun fact, All versions of WordPerfect up to v5.0 were written directly in x86 assembly language. Didn't help them in the market though...

      WordPerfect dominated the market. Whether assembler helped or hurt is perhaps impossible to say, though the performance and responsiveness of WP were probably enhanced by it, but WordPerfect was incredibly successful. They faded eventually due to management missteps and a failure to make the transition to Windows and WYSIWYG smoothly... which they did at the same time they moved from assembler to C. If you looked at only sales numbers and programming language, you might even conclude that they lost mar…

      • I agree a distinction must be drawn between short and long-term. It was the right decision at the time, and it made them tons of money compared to losing out by waiting around for a good C compiler to come along.

        But pertinent to the article, they clearly switched horses for a reason when the time came - the control wasn't worth the overhead.

      • Re: Go for it! (Score:5, Insightful)

        by RightwingNutjob ( 1302813 ) on Saturday August 10, 2024 @08:34AM (#64694428)

        You know, WP in the 80s was written on bare metal and screamed on a 20MHz 386. Word 365 is written in a pile of abstractions running on a pile of abstractions and is slow as fuck on a multicore 3GHz machine with 100GB of memory and a gigabit wireline connection.

    • by jbengt ( 874751 )

      Fun fact, All versions of WordPerfect up to v5.0 were written directly in x86 assembly language. Didn't help them in the market though...

      One of the big selling points of Wordperfect was that it was built to work with as many OSs and platforms as possible. Writing in assembly was not sustainable, though.
      Also, Wordperfect 5.x was the market leader. A lot of people think that Wordperfect 5.1 was the best version (I personally liked 6.0 the best). This was also around the time that Microsoft started building in undocumented calls into MS Word to make it work better than Wordperfect.

      • by czth ( 454384 )

        ... This was also around the time that Microsoft started building in undocumented calls into MS Word to make it work better than Wordperfect.

        Which undocumented calls were these?

      • Your viewpoint is biased from the future. Wordperfect was a DOS program, the idea of building software to work with as many OS and platform combinations as possible was not a practical or business goal in those days, unless by OS you meant DOS 2.x,3.x,4.x,5.x,DR-DOS,etc.

        One of the keys to Wordperfect's success was that they had printer drivers for virtually all printers on the market. That was a competitive advantage.

        The transition to Windows was disastrous; the printer drivers were standardised in Wind…

      • by hawk ( 1151 )

        It wasn't the system calls that got Word past being a distant third-place also-ran (other than on Mac, which was dominant, and an *entirely* different product with the same name).

        It was machines shipping with hard disks.

        MS already had the contacts, and was already collecting a royalty on every machine.

        They extended that deal to include Office on every machine, at a fraction of the individual cost.

        This changed the cost of WordPerfect, etc., rather than the price *difference* (in whichever direction) to be instead…

  • The stack is deep (Score:5, Insightful)

    by bradley13 ( 1118935 ) on Friday August 09, 2024 @01:55PM (#64693108) Homepage

    The thing is: the software stack is a lot deeper than it used to be. When I started, you wrote code in a language that compiled to assembler, which then executed on a processor. Any serious developer understood those three levels, at least to some extent.

    Today, a developer may work primarily with a framework - let's take Spring as an example. They then understand Java, which is the next layer down. What do they know about byte code? How the JVM works? Where the code turns into machine code? What actually runs on the processor, which now uses caches, speculative execution, and other complex things that affect how the code actually runs?

    • by dfghjk ( 711126 )

      "The thing is: the software stack is a lot deeper than it used to be."

      Maybe it shouldn't be, but so what? It's not relevant.

      The relevant problem with Agile is that it is predicated on a bottom-up mindset. Bottom-up design simply does not work, and that is a fundamental reason why Agile approaches have the problems the article mentions. It isn't that there are "black boxes"; it's that these abstractions get set in stone before they are sufficiently understood, and Agile performance metrics ensure that there are cat…

    • This is what I do. I write in a language that compiles to assembler, occasionally writing assembler as well, and then it runs on a bare board. Yes, there's an operating system, but I also maintain and fix the operating system. So the stack is small... But when interviewing candidates this often seems far beyond their capabilities, or when some are hired they keep wanting to make things more abstract. They select a third-party library that takes up 95% of available code space, and when asked if they can ju…

      • by dfghjk ( 711126 )

        You're a real programmer, doing work a lot like the majority of all programming work. /. doesn't understand programmers like you.

        For all the script kiddies who fancy themselves AI geniuses and draw half-million-dollar salaries at OpenAI, there are hordes of processor designers and real programmers who design instruction sets and executable code that make those enormously intensive tasks run. /. thinks the Python jockey is the programmer, where in reality it's the guy in the back room looking at executable…

    • by PPH ( 736903 )

      The thing is: the software stack is a lot deeper than it used to be.

      True. But once you understand the concepts of encapsulation and abstraction, they are applicable at all levels. Understanding one particular stack is quite a bit different than understanding the concepts and applying them to various implementations.

      And if you understand Spring, Java knowledge could be helpful. But if you find yourself worrying about virtual machines, byte code, execution, etc. then I contend that your stack is garbage. I understand the 7400 TTL IC family. I know how to hook them together.

    • Machine language and assembly are two different things. Assembly is machine language made readable for human beings; a processor cannot execute it. Machine language is what the processor finally executes. Don't confuse the two.

    • All these geologically deep layers of cruft were made precisely to remove the visibility of what's going on with the system.

      You just define a magical virtual machine that just works and cleans up your mess after you're done; then you can forget about how to read a character, print it on screen, or save it to a bunch of sectors on a complex rotating magnetic thingie.

      We've just taken this approach to its logical conclusion: all levels of software can be made like Russian dolls, if you have the spa…
      • by sjames ( 1099 )

        However, an unfortunate number of frameworks have gotten so tangled up in themselves that it becomes easier to bypass parts of them. Some are doubly unfortunate in that they make bypassing parts of them more complicated than replacing them outright.

    • Given that UCSD Pascal's "P-Code" virtual machine already existed in the seventies, the JVM isn't actually all that significant for "today" ;-) Then again, I don't disagree with the "software stack" being "a lot deeper" now. In today's projects, we rather connect existing libraries and frameworks instead of doing basic programming. It's a natural thing, though, if we don't want to busy ourselves with reinventing wheels. Whether it's as rewarding is a completely different question.

      • by sjames ( 1099 )

        p-code is one reason I took the Java hype with a few pounds of salt. The trade rag articles would have us believe Java invented bytecode. "Only" twenty-something years later.

        • UCSD P-code and Java byte code are nearly the same.

          No idea what pissed you off. Perhaps there was bad journalism?

          But at your age and my age it is pretty obvious that the JVM inventors basically copied UCSD and added some things for method calling and monitor entry/exit: and that was it.

    • by sjames ( 1099 )

      But much of that stack is self-inflicted. They should choose parts they actually do understand in depth, and reach for a framework only when functional requirements make it the least complex option to meet them.

  • by Thelasko ( 1196535 ) on Friday August 09, 2024 @02:13PM (#64693140) Journal
    This doesn't sound like a problem with Agile. A single Agile team can create really great products. This is a problem with scaling Agile. Scalability is really Agile's biggest problem.
    • by Junta ( 36770 ) on Friday August 09, 2024 @02:20PM (#64693160)

      Agile's biggest problem is that it is nothing and anything, but it is at least one thing: the go-to buzzword for leadership teams that would inflict bad management but look to "Agile" as an appeal to authority to say they are, in fact, good leaders regardless of how they bungle things.

      Some people want to white-knight for "Agile", but the reality is that reality has diluted any meaning, and all that remains is a hollow buzzword used to suppress conversation around reforming processes, because the current process is "Agile" and thus can't possibly be flawed.

      • by dfghjk ( 711126 ) on Friday August 09, 2024 @02:56PM (#64693260)

        I appreciate and share the contempt, but cannot agree. Agile does mean something specific, whether it is implemented well or not. The problem is that what Agile means is fucking terrible.

        Agile is generally implemented terribly and management justifies it for terrible reasons, but even if those were not true, Agile would remain at its core a terrible idea. Problems should be well understood, solutions should be well designed and right-sized, then implemented with care. Agile is an insult to these fundamental goals. With Agile, problems are never understood, can change at any time, and solutions are never invested in because they will be abandoned before they are ever complete.

        Agile software is a skim coat of feces on a software-stack dung heap that no one ever knows, understands, or is accountable for. And then there's Agile's idiot partner in crime, CI/CT. The perfect way to pretend the garbage you produce works.

        • by bsolar ( 1176767 )

          Problems should be well understood, solutions should be well designed and right-sized, then implemented with care. Agile is an insult to these fundamental goals. With Agile, problems are never understood, can change at any time, and solutions are never invested in because they will be abandoned before they are ever complete.

          Nowhere does Agile state that problems should not be well understood, that solutions should not be well designed, or that they should be badly sized or implemented without care. Whether problems "can change at any time" is not a question of software methodology but of accepting reality: if a problem changes, it does; it doesn't matter which methodology you are working with. What might be different is how you deal with the problem changing, but whatever methodology you are using, Agile or not, it better be able to deal with it because re

  • by Frobnicator ( 565869 ) on Friday August 09, 2024 @02:16PM (#64693150) Journal

    Better headline: Longtime security evangelist giving talk at Black Hat security researcher conference says only security researchers understand the full stack, audience loves it.

    Yeah, he knows a bit about security and has founded some companies, but really, he's preaching to the choir at a conference of like-minded colleagues, in a keynote speech celebrating Black Hat attendees. He's not citing research or presenting studies showing any correlation or causation or anything like that. This was a "Go Team!" talk.

    In the talk he discussed what is often called the "Agile Industrial Complex", and he evangelized about how only security researchers are poised to understand the full stack, from the abstract ideas down to the hardware. I'd say the talk was not quite pandering to the crowd at Black Hat, but very close. "I think you, the people in this room, have actually inherited their Earth. ... You people are the ones who've been sitting in the library, learning the spells, and actually understanding how all of this works." It's basically a talk designed to feed the conference attendees' egos and stoke them up.

    • by sinij ( 911942 )
      Empirically [computerweekly.com], this security evangelist is correct.
    • But this shouldn't be a rarity. You'd kind of want the entirety of the senior R&D staff to understand the whole product. Maybe not all the details of course, but enough that they could draw out a good diagram of all the pieces and their fundamental design. It shouldn't be just the security experts.

      The bigger issue with Agile I think, is not putting people into black boxes that limit the scope of what they need to think about, but putting people into a tiny period of time, where they only need to thin

    • //TODO: Think of witty sig statement

      There goes Frobnicator again, shipping production Slashdot posts with unfinished code... How did this post get past peer review?

  • by Junta ( 36770 ) on Friday August 09, 2024 @02:17PM (#64693152)

    Whether it's Waterfall or Agile or a various 'dialect' of Agile, the fundamental problem is just pervasive mismanagement.

    One facet of mismanagement is an appeal to some authoritative "methodology" to fix their lack of competence. The "winner" of that honor falls to "Agile" for the last couple of decades. Any follow-up to "Agile" is doomed to be just as disastrous and just as obnoxious.

    • > Whether it's Waterfall or Agile or a various 'dialect' of Agile, the fundamental problem is just pervasive mismanagement.

      Isn't that the whole point of agile though?

      What it comes down to, at the heart, is less management. To not waste time predicting what cannot be predicted, planning instead of doing, or imagining non-existent overhead into existence where none was needed. To have fewer and fewer ceremonies and meetings, until you reach the state where you have essentially none at all.

      Heck, the engineer

      • by dfghjk ( 711126 )

        Waterfall approaches can have the same flattened hierarchy. That is not unique to Agile. What is confusing you is the HR aspects of management. Agile disrespects the individual and has no need to provide HR services because the programmer is just a replaceable asset. You consider this a virtue, which is ironic because this is inherently a management perspective.

        "The key problem with "Agile" is that people arent doing it."
        If only!

        "...their version of "agile" is just waterfall with extra steps..."
        So shit

      • To not waste time ... planning instead of doing. ... The key problem with "Agile" is that people aren't doing it.

        This is the one thing that has always bothered me about how people portray Agile as a "good" thing. Doing instead of planning is anathema to every other engineering discipline known to humanity. You don't build a bridge then test it to see if it works. You don't just start cutting lumber and putting it together to build a house, figuring that you can just "add in" the plumbing and wiring later,

        • by bsolar ( 1176767 )

          This is the one thing that has always bothered me about how people portray Agile as a "good" thing. Doing instead of planning is anathema to every other engineering discipline known to humanity. You don't build a bridge then test it to see if it works.

          You sure do, or at least did. The reason we know how to build bridges that work is because we have actually built bridges and learned what works and what doesn't. There were plenty of bridges that would work *on paper* but turned out not to work as well in reality, and we definitely learned from that.

          Even today if we are to engineer some new kind of bridge we would definitely not only reason over it on paper but we would also build models to test it in reality.

          Agile has never meant "do without planning":

      • If nobody does it right, maybe that's an indication that it can't be done right? I don't mean to blame the people who initially pushed it; rather I'm wondering if the corporate world is able to implement it.

    • by dfghjk ( 711126 ) on Friday August 09, 2024 @03:02PM (#64693274)

      Agile IS PERVASIVE MISMANAGEMENT. Waterfall is not, it's just a stupid name for responsible engineering and development. Waterfall can be done poorly, Agile can only be acceptable when it is done poorly, if at all.

      The mythical man-month has been understood longer than most anyone here has worked professionally, yet the fallacy it describes is the CORNERSTONE of Agile's approach. That's how fundamentally dumb Agile is.

    Yup, if your code is fine, and you follow the process correctly, then you keep your job. So just point to Agile and claim you got all your sprints done on time with proper unit tests (but very little non-test code), and then the fault for the project failure lies with the stakeholders and management!

      Also anytime management comes down and tries to speed up the team and points to looming deadlines, just point to Agile or the Kanban board and say "we can't do that!"

  • and get paid for it. Then with $$ received you can eventually finish it. Nope, then you add more features with more bugs but you never fix all the bugs.
  • Is he Mark Shuttleworth's kid?
  • Take the creativity out of the software development cycle, and it is easier for AI to step in and replace developers.
  • Agile's fundamental premise is that you never have clarified, accurate, absolute requirements or documentation until you're done. To prove that point, Scrum, XP, and other concepts come into play to give Agile a usable support structure.

    Another core aspect of Agile is moving fast: if you're thinking or planning, it's not Agile, because you need to be doing. Both Scrum and XP have planning built in, because Agile doesn't. This leads to a lot of developers, especially the more jr developers, grabbing l
    • In most places I've worked, the daily Scrum standup meeting can take 4-6 hours every day, where the Scrum master, often a PM from marketing, demands to know why each dev hasn't done everything on their plate; the dev points to someone else (usually Ops) and says they are blocked, the other party claps back, and then it goes on to the next entries and the next dev. Hours later, usually everyone gets lunch, and it takes some time to mentally prepare for any work after that session of kangaroo court.

      With the daily standup meeti

  • by Schoenlepel ( 1751646 ) on Friday August 09, 2024 @03:06PM (#64693286)

    x86, arm, risc-v, etc. Doesn't matter.

    The thing assembler teaches you is what really happens under the hood. With assembler there is no hand-holding whatsoever, so you're quickly taught to be careful when programming. There are also things you can do in assembler that you cannot do in any other programming language.

    • by ffkom ( 3519199 )
      x86 or RISC does matter, as x86 assembler is translated into so much microcode that to really understand what's going on you need to go one level deeper still. And I have met programmers who understood assembler but honestly believed there are no costs incurred by "atomic operations", because they had never heard of cache coherency protocols and how those can slow down execution.
      • Yup, the days when you could understand everything passed us by a few decades ago. But for a hardware designer making not-too-complicated stuff, it is still doable. Personally I learned a lot writing assembler for an Atmel microcontroller. Very easy to do. Switched to C when the compiler became free. The output was... horribly inefficient. Eye opener.
  • I read the article. The summary given of (ugh) Marlinspike's speech was vague. But it doesn't sound like his complaints had too much to do with agile. Agile is when you try to make a change that takes only two weeks to release, instead of working for an entire quarter before releasing. He seems to mostly be complaining about "siloing", which is more a symptom of the microservice craze than of agile methodologies.

    But whatever. What he really means to say, like all us old gits, is "future bad".

  • Yeah sure okay (Score:4, Interesting)

    by DarkOx ( 621550 ) on Friday August 09, 2024 @03:26PM (#64693346) Journal

    Go to an infosec event and tell a bunch of infosec people they are smarter than the guys making stuff. That is obviously a message the audience is going to lap up.

    I would counter, having worn both hats for a long time, that at least the offensive side of infosec is a lot easier than development. I can write code that makes me puke to look at it, ignores all corner cases and safety checks, and gives no thought to maintainability at all. It's great: I break a lot of stuff, grab the screenshot or csv file, and make people go 'Oh my goodness' in meetings. I also get to exploit things that rely on faulty assumptions, because at the place in the stack where the security decision is made, the procedure has to work with whatever data flows into it and whatever dependency injection is going on. Maybe an object that has not yet been persisted to the database should not be passed in, but it can be under certain code paths. I can find and abuse that more easily than the poor dev can fix it, more easily than the architect can correct a bad assumption made early on.

    Moxie might not be wrong that a lot of developers don't see the big picture of what they are working on, and there may be something to be said about architects having a lot of hubris to think they can design effectively without ALL the requirements known up front. It sounds true, but then we find CVEs in stuff that is a lot older than Agile as well. I suspect the problem is less Agile than any software development strategy that seeks to 'deskill' the practice, which I think you can argue Agile does, but so do many others.

    • From my experience, Agile, especially Agile + Scrum, in some implementations turns building a product into small, siloed tasks, where nobody really cares about the whole they are making. They just want their little part to pass validation checking so it can be moved out of their swim lane, and so they don't have to explain that they are still working on it during the 4-6 hour daily standup meeting. There is no reward in Agile/Scrum development for making the entire gestalt work, because if one does anything other th

  • Pied piper like John McAfee talking about something that's old, dead, and buried.
  • by RogueWarrior65 ( 678876 ) on Friday August 09, 2024 @04:08PM (#64693416)

    It's straight out of the Machiavelli school of business management. If you want to ensure that employees are beholden to the company, compartmentalize them. They will only have the skills needed to work that specific job. They won't be threatening to people of higher rank because they won't have a broad understanding of why they are doing what they are doing. If they ask for training, give them "team" training instead of transferable skills. Nobody will hire someone who has had a lot of "team" training but doesn't know how to produce anything.

  • it is increasingly hard to make a blip of difference in this world - and it is demoralising. Innovation is sucked away by corporate theft protected by the incredible costs involved in court cases, leading to a global hegemony of the super-rich - who themselves are able to play governments against each other for their sport.
  • by sdinfoserv ( 1793266 ) on Friday August 09, 2024 @05:14PM (#64693524)
    A missing component here is the "everyone needs to learn to code" mantra that pushes lazy phucks, under-achievers, and those who don't care or question anything into development. We're missing a generation of highly creative, highly intelligent people who really enjoy solving problems - sure, some are there, and those people are great programmers. But the notion that everyone should learn it is utter nonsense. I'm completely fine with not everyone learning brain surgery, just like I've had to debug code from people who create more problems than they solve - and I'm completely comfortable stating these people should find a different career and a couch to dent.
    • Maybe it would help if people learned how to code, and could take pride in coding, in limited environments. A lot of people learned how to code in environments that have at most kilobytes of RAM, so being able to code in that is a lot more important than just writing some basic stuff in Python in environments where resource allocation isn't really a thing. Coding is great, but coding where the environment is tight is what is important, as well as defensive/secure coding.

      • I just finished a personal project where I designed a 16-bit RISC CPU from scratch, including VGA and sound, and wrote a game in my own assembly language. And I made a point of cramming it onto a Lattice UP5K FPGA, which has 128K of RAM and is pretty small. That really teaches you the meaning of the word 'limits'.

        • This is what should be taught, IMHO. Something like what is mentioned above with a RISC CPU on a FPGA can be compared to writing a book or a novel, while "learning coding" can be compared to just making a word salad that means nothing.

  • I'm not sure about innovation, but I'm very glad I retired recently from software development. Being forced to do "agile" would have killed the last remaining enjoyment I had from software development.

    I maintain a bunch of hobby projects still, and doing that gives me joy, especially since I do it on my own and don't need to use any specific "methodology".

  • The upside for the developer, in theory, is a limited scope of concern, just like a hamburger flipper doesn't worry about where the beef comes from. The upside for the owner, in theory, is that you can put anyone in that role. But people aren't cogs. Creativity and innovation can't come out of a machine; if you design your organization to work as a machine, then you'll get only something within the limits of that machine.
  • Maybe they aren't the most "innovative" car company, but they do make cars that are exceedingly reliable.

    Maybe "innovative" isn't always the most important goal for a car company, or a dev team.

  • Back in the day, Skype was the king of messaging apps. They followed agile to a tee. All the processes, retros, ceremonies, celebrations, docs, stories, the works. Ran it on a fancy cluster of app servers.

    One day the Skype guys realized they had been overtaken and blown away in the market by a small team that was decidedly not using agile methodologies and ran hundreds of millions of users on 2 or 3 FreeBSD boxes with Erlang. That product was WhatsApp.

  • To me, it's not a problem that coding is less about innovation. If you build a house, it's also not about innovation, and it's not about milling your own lumber, or building your own nail gun. It's about mounting pre-fabricated parts together to get this one house. And to nail your two-by-sixes together, you don't even need to know much about joinery, but still, your house will be fully functional.

    Agile coding is about solving customer problems. And those problems are not a research project. So don't get

  • Blaming black-box development on Agile is just ridiculous. Agile doesn't say anything about having to use black-box stuff; it's just a way to go about developing software, which can just as well mean low-level assembly code. It's actually the use of all the 'complete' frameworks that keeps developers from understanding how it exactly works under the hood. But then again, does everybody need to know that, or should we just leave it up to the people who are really interested in it? Yes, I too love working on low
  • @timeOday [slashdot.org]: “Fun fact, All versions of WordPerfect up to v5.0 were written directly in x86 assembly language. Didn't help them in the market though..

    Bill Gates Oct 1994 [edge-op.org]: “I have decided that we should not publish these extensions .. We can't compete with Lotus and Wordperfect/Novell without this
  • The problem is trying to use developers as a replaceable serviceable commodity rather than having them own, design and build things.

Algebraic symbols are used when you do not know what you are talking about. -- Philippe Schnoebelen

Working...