
Do Strongly Typed Languages Reduce Bugs? (acolyer.org) 456

"Static vs dynamic typing is always one of those topics that attracts passionately held positions," writes the Morning Paper -- reporting on an "encouraging" study that attempted to empirically evaluate the efficacy of statically-typed systems on mature, real-world code bases. The study was conducted by Christian Bird at Microsoft's "Research in Software Engineering" group with two researchers from University College London. Long-time Slashdot reader phantomfive writes: This study looked at bugs found in open source Javascript code. Looking through the commit history, they enumerated the bugs that would have been caught if a more strongly typed language (like Typescript) had been used. They found that a strongly typed language would have reduced bugs by 15%.

Does this make you want to avoid Python?

  • Bug Conservation (Score:5, Insightful)

    by ThosLives ( 686517 ) on Saturday September 23, 2017 @04:25PM (#55251637) Journal

    I suspect that there is something like a "law of conservation of bugs" in software - you take away one vector for bugs to originate and you just move them into another place.

    Dynamic languages do have an easy way to introduce bugs - especially languages like JavaScript that simply create new variables if you have a typo.

    But there is the old adage in statically typed compiled languages "Hey, my code compiles! Now I get to find out where all my bugs really are."

    This also applies to other aspects of programming languages. Consider the arguments about manual vs automatic memory management. Managed code still has bugs, just not memory management bugs.

    • by gweihir ( 88907 ) on Saturday September 23, 2017 @04:38PM (#55251685)

      I fully agree. Bugs just get more destructive and harder to find the less permissive a language is. Also, if you cannot make type errors, then any random person can write type-error-free code. Type errors simply cease to be a quality metric in that case. That does not mean that the code is better in any way, though.

      • by vux984 ( 928602 ) on Saturday September 23, 2017 @06:30PM (#55252221)

        That does not mean that the code is better in any way though.

        It doesn't mean it is good, but of course it's "better". A certain class of bugs is eliminated.

        All the subtle bugs and shitty coding practices may remain, but at least one issue is taken care of.

        I despise JavaScript because in code of any complexity, a stupid typo in an infrequently used if-then-else clause explodes at runtime, possibly weeks or even months later. Sure, it's simple and easy to fix, but holy shit, how can anyone want to use a language that doesn't just catch this stuff for you before it even tries running it?
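
        For illustration, here is that failure mode in Python (a minimal sketch; the helper names are hypothetical). A static checker would flag the typo before the code ever ran:

        def rarely_used_helper(event):
            return "handled " + event

        def handle(event):
            if event == "rare":
                # Typo: should be rarely_used_helper. Nothing complains until
                # this branch actually executes; then it's a NameError.
                return rarely_used_helpr(event)
            return "ok"

        handle("common")  # runs fine for weeks or months...
        # handle("rare")  # ...then: NameError: name 'rarely_used_helpr' is not defined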

        • by Ichijo ( 607641 )

          in code of any complexity, a stupid typo in an infrequently used if-then-else clause explodes at runtime, possibly weeks or even months later. Sure, it's simple and easy to fix

          Sure, if you ignore the cost of re-testing and redeploying, plus the cost of lost business caused by the bug and so on. The earlier a bug is detected, the cheaper it is to fix [wikipedia.org], by orders of magnitude.

      • Re:Bug Conservation (Score:5, Informative)

        by jeremyp ( 130771 ) on Saturday September 23, 2017 @08:22PM (#55252551) Homepage Journal

        But the destructive, hard-to-find bugs are still there with the weakly typed languages. It just takes you longer to get around to dealing with them.

    • Re:Bug Conservation (Score:5, Interesting)

      by alvinrod ( 889928 ) on Saturday September 23, 2017 @04:51PM (#55251751)
      That seems to imply that the number of bugs is (relatively) constant. Adding automatic bounds checking to a language prevents a whole category of bugs, particularly ones that can be quite nasty. Sure, if you had only skillful and exceptionally careful programmers you wouldn't get those types of bugs either, but that's wishful thinking in most cases.

      If there is some truth to what you say, I suspect it's for tangential reasons. I'd hypothesize that languages that solve many of those categories of errors have much lower barriers to entry, and therefore you get more cowboys using them who make other kinds of mistakes because they're idiots, not because a programmer must make some fixed number of mistakes somewhere. Those people wouldn't be trusted with languages that didn't prevent them from introducing certain types of bugs, because they would be counterproductive for their team and create more problems than they solve; but languages that cut down on the potential types of mistakes that can be made put entire categories of bugs out of their reach, so they can probably be productive.

      It's another old adage: "Make something foolproof and the universe will invent a better fool."
      • by gweihir ( 88907 )

        That is pretty much how the argument goes. And if you look at the abysmal state of competence of most modern "web coders", you can see a very nice example of this idea in practice.

    • Re:Bug Conservation (Score:5, Interesting)

      by swilver ( 617741 ) on Saturday September 23, 2017 @04:56PM (#55251777)

      And I suspect there is no such law at all, as it seems based on nothing.

      Better languages do result in fewer bugs. They may open vectors for a different type of bug, but that does not mean those are as frequent or as dangerous.

      Take memory management. One misstep and your program gets killed or crashes because it reads or writes memory that does not belong to it. That's very easy to do in some languages, while other languages completely eliminate this class of bugs. Of course, poor memory management can then still result in memory leaks, but it won't crash, and not all such problems become memory leaks.

      Some languages also practically eliminate arbitrary code execution and injection attacks, making them inherently more secure, at the cost of having bounds checks compiled into your program. This again eliminates a whole class of problems and introduces no new problems (maybe performance problems, but those are not bugs).
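
      For example, in a bounds-checked language the mistake surfaces immediately as a catchable error instead of corrupting memory; a minimal Python sketch:

      buf = [0] * 4
      try:
          buf[7] = 1          # out of bounds: raises IndexError at once,
      except IndexError:      # rather than silently scribbling over memory
          print("caught the out-of-bounds write")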

      Then there is code maintenance and refactoring, something which is near impossible with loose languages. This is often why it is not even attempted, leaving you with a code base that, if not planned in excruciating detail, will wither away and die. Just look at all those JavaScript frameworks that rise and fall and don't even try to maintain backwards compatibility... I think it is because they all turn into unmaintainable piles of spaghetti code that no sane programmer wants to keep working on for long.

      But there is the old adage in statically typed compiled languages "Hey, my code compiles! Now I get to find out where all my bugs really are."

      It happens often enough that people write hundreds of lines of new code in a compiled language and actually find it runs correctly the first time. Perhaps you haven't experienced it yet... I have on many occasions.

      • by gweihir ( 88907 )

        It is funny how code written in your "inherently more secure" languages gets exploited at the same or higher rates these days. What actually happens when languages get "safer" is that coder competence drops and bugs just move to a higher level, without being any less destructive. If you cannot see that happening over the last few decades, then you seem to be blind to what is going on.

        • by swilver ( 617741 ) on Saturday September 23, 2017 @06:15PM (#55252155)

          Incorrect. What actually happens is that with safer languages, more people can write software, and the *average* competency drops. It does not, however, affect the competency of already-competent coders when they use more secure languages.

          You'd think that would be obvious, but please keep convincing yourself.

    • I suspect that there is something like a "law of conservation of bugs" or something in software

      I know, with the same degree of certainty that there is an objective universe that exists independently from my perception, that what you just said is bullshit.

      There is correct code, and there is flawed code. It is possible to write software that doesn't crash, that can't be exploited, and that does precisely what it is supposed to do for all possible inputs. The only thing standing in the way of that is incompetence.

    • by jeremyp ( 130771 ) on Saturday September 23, 2017 @08:19PM (#55252541) Homepage Journal

      This is not true. Sure, strongly typed languages won't eliminate all the bugs and the ones that are left may be quite tricky to find, but it is absurd to suggest that the ability to run a program where you typed a variable name wrong means that those tricky bugs are magically not there.

  • A friend of mine likes to say: "If you want to do something robust, code it in Ada. If you want to do something quick, code it in Python".
  • by asackett ( 161377 ) on Saturday September 23, 2017 @04:46PM (#55251729) Homepage

    This observation doesn't make me wish to ditch a programming language, but it does make me glad I do test-driven development.

  • Well duh (Score:4, Insightful)

    by DrXym ( 126579 ) on Saturday September 23, 2017 @04:53PM (#55251767)
    The more you catch at compile time, the less there is to bite you on the ass at runtime. It's also cheaper in terms of development effort to fix bugs before customers report them.
  • by heretic108 ( 454817 ) on Saturday September 23, 2017 @05:02PM (#55251807)
    Easy enough to add strong typing in Python by adding a type check decorator to each function and method.
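
    A minimal sketch of such a decorator, using the function's annotations (this is runtime enforcement, so strictly it adds strong checking rather than static typing; the names here are hypothetical):

    import functools
    import inspect

    def typechecked(func):
        """Reject calls whose annotated arguments have the wrong type."""
        sig = inspect.signature(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            for name, value in bound.arguments.items():
                expected = func.__annotations__.get(name)
                if isinstance(expected, type) and not isinstance(value, expected):
                    raise TypeError(f"{func.__name__}: {name} must be "
                                    f"{expected.__name__}, got {type(value).__name__}")
            return func(*args, **kwargs)
        return wrapper

    @typechecked
    def scale(values: list, factor: int) -> list:
        return [v * factor for v in values]

    scale([1, 2, 3], 2)    # fine
    scale([1, 2, 3], "2")  # TypeError -- without the check it would silently
                           # return ['2', '22', '222']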
  • by Jane Q. Public ( 1010737 ) on Saturday September 23, 2017 @05:15PM (#55251863)
    The title of this thread incorrectly conflates "strongly typed" with "statically typed".

    They are two completely different things.
  • Makes sense to me (Score:4, Informative)

    by blindseer ( 891256 ) <blindseer@noSPAm.earthlink.net> on Saturday September 23, 2017 @05:19PM (#55251873)

    As someone who has had to program in a number of languages, I can say that strongly typed languages can catch a lot of trivial bugs quickly. One example is an if/then statement that allows non-Boolean arguments. If I mistype a comparison in an if/then statement, I should expect an error at compile time. If I type an assignment "if (foo = bar)" instead of a comparison "if (foo == bar)", I expect this to get flagged, but some languages don't see this as a problem.

    I prefer strongly typed languages because they can catch a lot of typographical errors and sloppy logic. They can also be frustrating at times, since they can mean nesting type conversions to near absurdity. VHDL comes to mind here. It can also be frustrating to be trying to do something quickly and have the compiler complain about what I would think is a pretty obvious implied type conversion.

    It's interesting to see someone try to get an idea on how many errors strongly typed languages would catch. I'm not sure this makes an argument for one language over another. It might make an argument for testing, coding style, and such though.
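
    As a side note, Python is one language that simply refuses to parse the assignment-for-comparison mistake (a minimal sketch):

    foo, bar = 1, 2
    # if foo = bar:    # SyntaxError: assignment is not an expression in Python
    if foo == bar:     # the comparison that was intended
        print("equal")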

    • by holophrastic ( 221104 ) on Saturday September 23, 2017 @11:41PM (#55253031)

      "if (foo = bar)" isn't a bug in the code. It's only a bug in your brain.

      bar = <some value>;
      if( foo = bar )
      {
            foo += 2;
      }

      so foo is bar+2 if bar is true; otherwise foo is the same false as bar, be it undef, zero, null, or blank. And if you add some local scoping, being able to manipulate foo without manipulating bar often makes a lot of sense, especially with complex objects, and especially with functional logic like if( foo = dclone(bar) ) -- or the much more routine if( record = dbgetrow(statement) ), which I'm absolutely certain you've done more than 100 times.

      The bug in your brain is actually not a programming one. It's a visual one. Why are "=" and "==" so visually similar when they are functionally different? I might suggest using <=> instead of == in perl, although the boolean would be reversed. At least cmp covers you for strings.

  • by Waffle Iron ( 339739 ) on Saturday September 23, 2017 @05:19PM (#55251877)

    Have they also looked at the bugs that typically plague statically typed languages but that dynamically typed languages usually don't suffer from?

    For example, many statically typed languages do little or nothing to help you avoid integer overflows, which can result in severe crashes and vulnerabilities. Many dynamically typed languages, such as Python, gracefully switch to big integer types as needed.
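
    A quick Python sketch of that contrast; the helper below just simulates what a C-style signed 64-bit integer would do with the same value:

    def as_int64(n):
        """Reinterpret n as a two's-complement signed 64-bit value."""
        n &= (1 << 64) - 1
        return n - (1 << 64) if n >= (1 << 63) else n

    big = 2**63 - 1           # largest signed 64-bit value
    print(big + 1)            # 9223372036854775808: Python just grows the int
    print(as_int64(big + 1))  # -9223372036854775808: the fixed-width wrap-around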

  • I've been using Janson and Bookman lately. Futura for sans-serif. What was the question, again?

  • by david.emery ( 127135 ) on Saturday September 23, 2017 @05:34PM (#55251949)

    In my experience, type errors are a lot more likely for scalars than for composite objects, i.e. I'm less likely to "add apples to oranges" than I am to add "count of apples" to "count of oranges". (Or horizontal pixels to vertical pixels, a real mistake I made once.)

    I suppose it's possible to do typed scalars in C++; I'm not sure about Java (without tool extensions). But making a scalar into a full 'class' is probably overkill (with runtime impacts), as the sketch below illustrates.
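
    Roughly what that "full class" approach looks like in Python, for what it's worth (the type names are hypothetical); a checker like mypy flags the mixed addition statically, and it fails at runtime too:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class HPixels:
        n: int
        def __add__(self, other: "HPixels") -> "HPixels":
            if not isinstance(other, HPixels):
                return NotImplemented       # mixing scalar types is an error
            return HPixels(self.n + other.n)

    @dataclass(frozen=True)
    class VPixels:
        n: int

    total = HPixels(640) + HPixels(2)    # fine
    # HPixels(640) + VPixels(480)        # TypeError: unsupported operand type(s)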

    The combination of typed scalars and named parameter associations that languages like Ada have can catch a lot of errors at compile time (with good quality diagnostic messages). And this supports refactoring by making it easier to find the impacts when an interface changes: for instance, if you go from a single type 'pixel location' to separate types for 'horizontal pixel location' and 'vertical pixel location', you just find and work off the type errors reported by the compiler. (Been there, done that.)

    Of course, it's not Politically Correct to favorably mention Ada (so often a technology is panned by those without substantial experience using it.)

  • by rossz ( 67331 ) <ogreNO@SPAMgeekbiker.net> on Saturday September 23, 2017 @06:26PM (#55252203) Journal

    After the initial introduction to computing via BASIC, I picked up Pascal (Turbo, of course) and fell in love with it. I prefer a language where you have to declare all your variables and that won't let you assign one type to another without type casting or properly converting. I also want it to be case insensitive, because I've seen code where the idiot used "foo", "Foo", and "FOO" as different variables.

    I haven't touched Pascal in years, but I miss it occasionally.

  • Preference. (Score:3, Interesting)

    by Anonymous Coward on Saturday September 23, 2017 @06:35PM (#55252239)

    I prefer strongly, statically typed programming to programming in e.g. Python. Even Java, which is in many ways a horrible language, is more understandable for me. You need to understand the importance of the first two and last two words in that sentence, though.

    Why is it better for me to catch stuff at compile time? Because I prefer it that way.
    Why is it preferable for Pythonistae to do whatever they do? Because they prefer it that way.

    I have a very intelligent and productive friend who loves Python and has no problem with the way it handles types. I find Python annoying and avoid it, partially for that reason. But what would it even mean for him to be "wrong", or for me to be "wrong", or for either of us to be "right"? Nothing, that's what.

    Stop bitching at each other and having stupid arguments like the "other side" is trying to steal your toys or come in your ice-cream.

    Oblig: "LambdaConf 2015 - Dynamic vs Static Having a Discussion without Sounding Like a Lunatic" https://www.youtube.com/watch?v=Uxl_X3zXVAM

  • by xgerrit ( 2879313 ) on Saturday September 23, 2017 @06:59PM (#55252301)

    The question isn't really "Does a strongly typed language reduce bugs?", because the obvious answer is: Yes, it does. If you went to the logical extreme and created a language that only had 3 commands, you could eliminate whole classes of bugs. The more strict the rules, the harder it is to do the wrong thing.

    But the question is really: Would developers spend more time fighting against the type system in a strongly typed language or against type related bugs in a dynamic one?

    The answer to that question seems much murkier, and I don't think a study looking at the types of bugs checked in on GitHub can answer it.

  • by doctorvo ( 5019381 ) on Saturday September 23, 2017 @08:52PM (#55252611)

    The article talks about static typing, not strong typing; the two are different concepts. Strong typing means that type errors are always caught; static typing means that if type errors are caught, they are caught at compile time. JavaScript is both weakly typed and dynamically typed; weak typing is probably the bigger problem. In any case, whatever conclusions you derive about type systems from experimenting on a particular language really only apply to that language. TypeScript is nice for JavaScript; that doesn't mean that adding static typing to Python would be as useful.
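
    The distinction is easy to demonstrate. Python, for instance, is strongly but dynamically typed, so the classic JavaScript coercion becomes an error -- though only at runtime (a minimal sketch):

    try:
        print(3 + "0")     # JavaScript would happily coerce this to "30"
    except TypeError as e:
        print(e)           # unsupported operand type(s) for +: 'int' and 'str'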

    In addition, there is a price to pay for static typing: software becomes more complex, people tend to implement their own "dynamic type light" libraries, etc. So, even when static typing reduces bugs, it's not clear that it results in a better product at a lower cost, which is what you ultimately care about.

  • by holophrastic ( 221104 ) on Saturday September 23, 2017 @11:32PM (#55253007)

    The article shows JavaScript as buggy and untyped with 3 + "0" = "30" -- the classic stupid example.
    Everyone likes to say that Perl is weakly typed, because "1.5" + 1 = 2.5.

    Everyone is incorrect. Perl is very very very very strongly typed. Not the variables, the operators. Not the nouns, the verbs. Developer says "add these two variables", and perl adds them. Because the developer said so. If the developer said "concatenate these two variables" then perl will concatenate them. Every time.

    That sounds strongly-typed to me. "3" + "1" = 4; 3 . 1 = "31". Every time.

    I dare you to find out what perl does with 3 + "information". Go ahead. I dare you.

    My point is this. Human beings don't care about type. An apple is an apple, and an orange is an orange, and I can eat them together, cook them into a sauce, bake a pie, cut them up, juice them, or put them into a basket. Whatever I tell you to do with them, you'll never respond with "but apples can't do that because they aren't oranges". Just bake the g.d. pie, because I told you to.

    Now, if you can explain to me why any language would ever use the same symbol for "add" and "concatenate", then you're smarter than I am. For the life of me, I've spent 30 years trying to understand that one. What idiot makes one symbol do two things, and then builds the language to guess which one to do based on the values themselves, at the language-level no less? Idiotic.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Sunday September 24, 2017 @07:25AM (#55253787)

    I basically program exclusively in PLs with sloppy typing and no compiling and stuff: Python, PHP, JavaScript, the works. The speed at which you get stuff done is notable, especially compared to classic "Type Nazi" languages. However, the trade-off is as clear as can be: write critical code beyond a certain scale in sloppy-typed PLs and you're asking for trouble. Type Nazi languages force you to think before typing ... errrm ... hitting the keys ..., and that is a good and useful thing if the use case isn't trivial scripting stuff that you can debug and modify on your smartphone whilst sitting on the bus.

    Sloppy-typed PLs have some stopgaps (code standards, frameworks, hacks, transpiled dialects such as JavaScript's TypeScript), but those are things intended to cover the gap and cater to specific needs.

    Long story short: Nazi typing is more work up front but prevents lots of trouble downstream in large non-trivial projects, including a specific subset of bugs.

The "cutting edge" is getting rather dull. -- Andy Purshottam

Working...