C# Memory Leak Torpedoed Princeton's DARPA Chances 560
nil0lab writes "In a case of 20/20 hindsight, Princeton DARPA Grand Challenge team member Bryan Cattle
reflects on how their code failed to forget obstacles it had passed. It was written in Microsoft's C#, which isn't supposed to let you have memory leaks. 'We kept noticing that the computer would begin to bog down after extended periods of driving. This problem was pernicious because it only showed up after 40 minutes to an hour of driving around and collecting obstacles. The computer performance would just gradually slow down until the car just simply stopped responding, usually with the gas pedal down, and would just drive off into the bush until we pulled the plug. We looked through the code on paper, literally line by line, and just couldn't for the life of us imagine what the problem was.'"
I'll show you mine if you.. (Score:4, Funny)
Re:I'll show you mine if you.. (Score:5, Insightful)
This just tells us once again that our wonderful editors on /. don't even try to understand what's behind an article; they just grab some sensationalistic title (the more anti-MS, the better) and they're done. This results in more comments of the type "See, M$ id teh SuCkS" or "thanks god for my Linuzzz", so they get more profit from their /. ads (which, oh the irony, are often MS ads, BTW).
Yellow press... yes, I know, /. isn't supposed to have any credibility, just like any other parasite news site, but anyway...
Re:I'll show you mine if you.. (Score:5, Insightful)
However, I take exception to your use of the word "Editor". Slashdot does not have Editors. They have guys who accept submissions.
They don't read The Fucking Articles, they don't check links, they don't edit submissions...
Slashdot *does* have editors. When I submitted... (Score:4, Interesting)
Re:I'll show you mine if you.. (Score:5, Interesting)
The Slashdot editor who posted it moved the link so it looked like I was linking to the original study, not the article about the study. It's like they felt compelled to make a change, so they made one even if the change didn't improve the quality of the article.
I will say that the rest of the text remained unchanged, and really the only problem with the submission is that people who thought they were going to a study were actually going to a newspaper article about a study, but the point is Slashdot editors *do* make changes all the time.
Re: (Score:3, Insightful)
"We set a timer to restart the computer after 40 minutes".
Wait. What? Really? That's classic programming by accident. "I added one and it started working, so I moved on."
Re: (Score:3, Interesting)
Since every other garbage collected language, from every other company, would have had the same problem, how does it show that?
Re:I'll show you mine if you.. (Score:5, Informative)
I wonder if MS could sue Slashdot for slander?
in the URL (Score:5, Insightful)
http://www.codeproject.com/showcase/IfOnlyWedUsedANTSProfiler.asp
"IfOnlyWedUsedANTSProfiler"? That didn't raise any flags?
Of course, I'm trying to assume good faith and not just conclude that the editors knew this was an advertisement, but they sure are making that difficult.
Actually it's probably our fault (Score:3, Insightful)
I suspect it is the fault of the Slashdot user base as much as the editors. I bet a lot of users were in the firehose, saw the sensationalist title, etc., and rated it highly. The editor comes in, sees it has a sensationalist title and is now colored red, meaning users really think it is great, and posts it.
Re: (Score:3, Insightful)
There was a memory leak but it was due to their code, not with the Microsoft
Re:I'll show you mine if you.. (Score:5, Funny)
Re:I'll show you mine if you.. (Score:5, Insightful)
...
O.K., I'll bite: what part of that line do you not understand? If /. billed itself as a "blog" then I'd understand your point. However, /. is a corporate-based, self-billed "News" site. Whether reader-submission driven or not, it is a news site. You are wrong. But, thanks for your thoughts.
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
Slashdot not a news source? Agree! (Score:5, Interesting)
People complain that Slashdot sucks: the headlines are sensationalistic, the editors get commissions based on the number of dupes they post, and articles about 6-month-old events get posted as "news".
So why do I even bother visiting Slashdot? The answer is two things: the community of posters, and Slashcode moderation.
The value of Slashdot is in its community. You and I, dear Slashdotters. Our collective mind will pick through the various articles, point out their flaws, expose sensationalist FUD for what it is (and, surprisingly, will do this equally for anti-Linux and anti-MS FUD), debate various trends, and provide a significantly international (though heavily USA-centric) perspective.
This value is enhanced by Slashdot's moderating system, so that information and insight can bubble to the top among the mass of inane posts. Metamoderation limits the amount of crack that the moderators can be on.
So, Slashdot editors, take note! *WE* are the reason we are here. *YOU* are not. Many of us don't even bother to read the articles any more, preferring to soak up the collective wisdom of techies from varying age groups and fields. If you piss us off, and the collective community of Slashdot deteriorates, then there's no reason for me (or others) to keep coming back.
Think about it.
Re:I'll show you mine if you.. (Score:5, Informative)
Re: (Score:3, Insightful)
There is no leak in C# per se. They kept a reference to the objects, so the CLR wasn't even supposed to delete them.
Morons shouldn't be making car software.
Re: (Score:3, Informative)
The next version of the
Perpetual motion machine vendors (Score:5, Funny)
I was working at a coal-fired power plant which needed a new pollution control device before 2010. There, I would dig through the literature, and try to find suitable products and operating conditions for this device. Anyway, this involved a lot of meetings, conference calls, and business lunches with the suppliers in question.
Then there was Joe.
Joe was our Alstom sales rep: portly, humorless, slow to speak and slower to understand. He was also a devout Utahn.
Well, one day, we were killing time while waiting on a conference call, my supervisor left the room, and we started talking about universities. Then he dropped the bomb:
"In my Senior year, I worked on developing perpetual motion machines."
My supervisor then reentered the room, and we got back to work. I felt like I'd just seen a dancing frog.
Re: (Score:3, Interesting)
A moderately experienced programmer would recognize the problem very easily by, say, noticing that a listener method is getting called 100,000 times for each event.
Re:I'll show you mine if you.. (Score:5, Informative)
Re:I'll show you mine if you.. (Score:5, Interesting)
Re: (Score:3, Interesting)
If you're trying to imply that errors can be made in any language, you're right, but the big difference is that leaks in a manually allocated language like C++ are a heck of a lot easier to find and fix than leaks in a language that tries to be smart and "help" you avoid leaks.
If you're failing to dispose of an object, look at the places where it should be freed and make sure that it is. Generally, there aren't a lot of these places. If you have a dangling reference, it will show up in the form of a cra
Well, there's your problem! (Score:5, Funny)
Re:Well, there's your problem! (Score:5, Funny)
Re:Well, there's your problem! (Score:5, Funny)
Re:Well, there's your problem! (Score:4, Insightful)
It obviously doesn't work in situations like this where the bug is in the runtime and not the application.
Re:Well, there's your problem! (Score:5, Informative)
there are very clear constructs in place in the language/runtime to allow any object to unregister itself from event registrations it initiated.
this was VERY MUCH a bug in the end-user software, not the runtime (i've written code almost IDENTICAL to this and blew lots of time having made this same mistake).
the only thing the runtime could do to protect the idiot developer (myself included) is automagically make all event references WEAK references, but that has plenty of undesirable side-effects too... in clr, you can do this yourself if you're so inclined... (just like in a JVM)
cheers.
Peter
Re: (Score:3, Insightful)
Re:Well, there's your problem! (Score:5, Informative)
Re: (Score:3, Informative)
Re:Well, there's your problem! (Score:5, Interesting)
I first ran into this sort of problem in 1983 when working on a CDC mainframe. The only way to find the bug was the line by line analysis method since even compiling the code with debug caused it to run slower and the nature of the problem changed. That's as much detail as I remember.
I expect to see a lot more of these kinds of errors pop up as multi-core CPUs become more prevalent (true parallel execution) and people continue to assume that they can just crank out code without taking the time to understand the design. I'd also expect the prevalence of multi-core processors to create a demand for more parallelism. If you don't take advantage of the additional cores, your program will only be as fast as if it were on a single core system. If the competition can create a program that uses the additional cores, your program will seem slow.
Cheers,
Dave
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
I think there's a rite of passage in programming. First, you see a bug and immediately assume your code is right and the compiler is wrong. With a little experience, you learn that it's actually a darn good bet that your code is the thing with the problem. With a lot of experience, you learn that while it's a darn good bet, on rare occasions the compiler really does get it wrong, but that it usually takes a horrendously long, assembly-level debugging session to prove it.
Much the same is true of the standa
Re:Well, there's your problem! (Score:4, Funny)
Slashvertisement (Score:5, Informative)
Re:Slashvertisement (Score:5, Interesting)
Re:Slashvertisement (Score:4, Insightful)
Re:Slashvertisement (Score:5, Insightful)
But I do believe that articles written by companies pretending to be written by end-users are not terribly useful and probably shouldn't end up on
I mean, the article clearly states at the top "By Red Gate Software.".
So where did the "Bryan Cattle reflects on
Seriously.
"One of our team members downloaded the 14-day trial of ANTS Profiler"
"To our amazement, it was only minutes before we realized that our list of detected obstacles was never getting garbage collected"
"If Only We Had Used It Earlier..."
ANTS Profiler helped us fix a problem in minutes that would have taken us weeks to track down. If only we'd thought of it before the competition, we would most likely have finished the entire race and had a chance at the top prize money.
All this stuff sounds either very naive or very marketing. You choose.
Don't Slashvertise. Ever. (Score:5, Insightful)
You may think you're pulling one over on the editors, and maybe you are. But you aren't pulling one over on us, and I think after all these years, the editors know this. So, just don't. Unless your product or service is absolutely bulletproof people here are more likely to shoot it full of holes than rush out and buy it.
Re: (Score:3, Insightful)
Kinda scary when they start writing systems for medical applications, industrial controllers and power supply chain management, let alone nationwide air traffic monitoring or emergency services interactions management.
"Hang on, we have to reboot our systems every 6 hours in order to manage this natural disaster - You c
Re:Slashvertisement (Score:5, Insightful)
Some old Visual Basic programmer jokes come to mind when I read this article. People used to make fun of Visual Basic programmers because it was too easy to write programs in VB. They thought it would produce sloppy code with errors, and other similar things. To some extent, it appears the same case could be made for C#, and to a lesser extent Java. In the end this is simply a case of not stress testing their event stack.
Re:Slashvertisement (Score:5, Insightful)
Agreed. I was a good VB programmer. But my VB experience was an eight-month interval between C++ jobs. I've knocked out minor MS Office applications when needed since then, but that's it. I'm sorry to say that most of the VB programmers I've worked with were very poor engineers. Admittedly I've not worked in VB for wealthy companies whilst I have in C++, so that colours things somewhat. I don't doubt that there are some good VB programmers out there. But in the cases of most of those I worked with, I could very clearly see how VB led them to be poor engineers. The amount of shortcuts and wizards and instances where they would start their program by dragging a form object onto the design panel and dumping form objects onto it was obviously a leading reason for their poor skills. VB *led* them to take this approach. It works for small Excel apps, barely for database front-ends and not well at all for large projects.
As is common with the lower end of Microsoft products, the selling point is that they make it very easy to do what they think you want to do. The ability of VB to knock out an interactive form with near-zero knowledge of programming has encouraged a lot of colleges to sell people the idea that a ten-week course of dragging and dropping text box objects is programming whilst a lot of cheap or ignorant employers have taken the graduates at their word and plunged them in over their heads.
VB is a poor language in many ways and not, imo, suited to a large or sophisticated project. But you can find good VB programmers (I was one). It's just that it encourages bad ones.
Oh, the summary is also wrong. C# hasn't started springing leaks. The programmers missed a reference to objects that they were creating, and the garbage collection therefore never kicked in to deallocate the memory. I don't doubt that it's not easy to automate a vehicle to drive even 9 miles, but this could have been detected with more thorough debugging. At any rate, the article submitter and overseeing editor should be ashamed of twisting this into an anti-Microsoft jab. I'm a Linux programmer. I can tell you that Linux can compete happily without sinking to the level of lies and misinformation.
Re: (Score:3, Insightful)
I'm sure you're a much better programmer than I am, so I have to ask... why does using the form designer make someone write bad database access? I'm also curious about what makes VB a poor language. I hear this pretty frequently, but have never gotten an honest explanation of why. Personally, I don't think I've ever encountered a task accomplished, or bit of code written in C#, that I couldn't easily translate to VB, and vice-versa. Someone once told me there was some limitation on utilizing System.Refle
Re: (Score:3, Insightful)
An interesting observation. Java is a lot harder to write bugs in and is easier to debug (with more runtime information), so coders spend a lot less time in the debug cycle and therefore get less experience doing it. So should we design our languages to be obtuse and hard to read (ie. C++) or continue trying to design languages that make it harder to get things wrong?
Java has made some wron
Re: (Score:3, Insightful)
Re:Slashvertisement (Score:4, Informative)
The company I worked for, in the efforts to get something out the door, deployed a product to a customer site that had a similar flaw (but, not a
In my own work, I wrote NT services that HAD to run 24x7 and were not allowed to crash - especially due to memory leaks. The components we purchased and used, contrary to their marketing ploy, often had memory and resource leaks - we won't even begin to talk about the runtime library that shipped with the compiler.
I used a variety of freely available memory managers and commercial QA tools to track down most of the "leaks" and fixed them. If I didn't have source to the component in question, I replaced them or rewrote them from scratch taking time to make sure it didn't leak. Guess what? It worked and those applications/services run 24x7 (well, until they restart server for some other reason).
Moral of the story - if something is critical - take the time to profile your code and use QA tools to find other potential problems BEFORE you deploy.
RD
Stupid Slashdot headline (Score:5, Interesting)
It's not C#'s fault. The team had references to the obstacle list (event handlers), which prevented garbage collection. The
Re: (Score:3, Interesting)
Maybe so. But if they explicitly call delete to invoke the garbage collection of an object, would it not be better for the system to destroy the object and then throw an exception when it tried to send an event notification to a non-existing object?
Furthermore, if delete is called and the garbage collector does not delete the object because it realizes that the object is registered on certain events, would it not be just as easy to then un-register the object for the event? Or at least report it? After al
Re:Stupid Slashdot headline (Score:5, Interesting)
Re:Stupid Slashdot headline (Score:5, Insightful)
I think you're getting hung up on the method name. There is no standard "delete" function that marks something as unused (Dispose, on the other hand, sort of gets there). The article itself is unclear, but I would assume that they were simply deleting the collision objects from a collection of potential hazards. Whilst that would remove the object from the collection itself, it is *not* a delete. As references to the object existed elsewhere, the object still exists (look ma, no null pointer exceptions) and no delete happens. You cannot specifically say to the GC "We're done with this, delete it"; the GC sweeps on a regular basis looking for objects with no references.
Would you really want the GC deciding that just because an object is no longer part of a collection it's safe to unsubscribe it from events and delete it? I know I wouldn't.
Re:Stupid Slashdot headline (Score:4, Insightful)
Re:Stupid Slashdot headline (Score:5, Interesting)
Now, if you have control of the implementation of the object who accepts Listeners you can store them internally in a weak collection, which allows them to be garbage-collected. This would work but may not be what the programmer intends. Actually in a language like Java I'd hazard that usually the programmer wouldn't want that at all: consider an application that listens to UI events. As a programmer I want to be able to stick listeners wherever they are needed and leave them there permanently. If I don't need a pointer to the object, I don't want to keep it around, and thus may not have a reference to the listener EXCEPT in the event-management collection. That's the advantage of GC languages: as soon as the object which creates those events (say, a dialog box) goes away, the objects it refers to have one fewer pointer and may be eligible for GC.
Anyway, lots of code has issues like this: we had a problem at my work where an Apache taglib was caching some compilation in a cache that would grow for ever. It was a simple code fix to solve that problem, but there was no way for us to even SEE the problem until we ran our application under load in a profiler. Fun fun fun.
Re:Stupid Slashdot headline (Score:4, Insightful)
What is interesting is to see that garbage collection changes one class of bugs (forgetting to explicitly deallocate memory) to another one: unintentionally keeping objects around. Princeton's "obstacle object" lifetime policy was stepped upon by a Dotnet library; Java has similar problems in its libraries. For the Princeton car software, an explicit deallocation routine (like in C/C++) would have been easy to implement.
Problem is that both C/C++ style memory leaks and C#/Java hidden reference bugs usually remain hidden until the system crashes or thrashes after some time. It makes them hard to find in the course of ordinary testing.
Re:Stupid Slashdot headline (Score:5, Insightful)
Decent programmers might understand that, but let's be honest, it's not like Java (and other GC languages) haven't been presented as if memory leaks were a thing of the past.
As a matter of fact, some people will probably still claim that it's technically not a memory leak, but instead an object life-span issue.
What surprises me is that outspoken proponents of managed languages so often tout garbage collection as a selling point, as if now you can be a sloppier programmer and get away with it.
In reality you have to identify/control the lifespan of objects anyway, so I personally never understood what the big deal is about freeing memory manually. Not to mention that memory leaks in say, C++ code, really aren't that hard to find. The tools have become pretty freakin decent.
And also not to mention that garbage collection might be handy for memory, but memory is only one of a plethora of resources that can be leaked. And since for many resources it isn't nearly as appropriate to 'lazy' free them, as a programmer you still have to be aware of the allocate/free paradigm. (As just one silly example, it would suck if you weren't able to explicitly close a file, because you can't delete it before it's closed.)
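That's exactly what using/IDisposable gives you in C#: deterministic release of the non-memory resource, independent of when the GC runs. A minimal sketch (my own toy example, not from the article):

    using System.IO;

    class LogCopier
    {
        public static void Copy(string sourcePath, string destPath)
        {
            using (var input = new FileStream(sourcePath, FileMode.Open))
            using (var output = new FileStream(destPath, FileMode.Create))
            {
                input.CopyTo(output);
            }   // both file handles are closed right here, not whenever the GC gets around to it
        }
    }

Which is, of course, exactly the allocate/free discipline you were talking about, just with different syntax.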
In other words, you are right. Of course you can have memory leaks in garbage collected languages. And I wish people would stop using GC as an argument why languages as Java are so much better to use than C++.
Re:Stupid Slashdot headline (Score:4, Interesting)
I must apologise in advance if this is a bit of a rant. I have a graduate degree in, well, programming language design, and I find some things close to my field just very upsetting. You wrote:
Perhaps you write very C++-adapted, boilerplate code. The reason garbage collection is essential in a programming language is that without it (a) you cannot provide a safe implementation of first-class functions, since they implicitly grant indefinite lifespan to arbitrary objects; and (b) you cannot build an abstract data type, whose implementation is hidden from the user, since no matter what other features the language may have, you can always tell whether the type a library has handed you is an automatically managed 'atomic' object, or a 'reference type.'
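To make point (a) concrete in the language at hand, a tiny C# sketch of my own (nothing to do with the article's code): the dictionary must live exactly as long as the returned function, and only a garbage collector can know when that is.

    using System;
    using System.Collections.Generic;

    static class Closures
    {
        public static Func<string, int> MakeCounter()
        {
            var counts = new Dictionary<string, int>();   // lifetime is now tied to the closure
            return word =>
            {
                int n;
                counts.TryGetValue(word, out n);          // read the old count (0 if unseen)
                counts[word] = n + 1;                     // the closure keeps 'counts' alive
                return n + 1;
            };
        }
    }

With manual management you would have to know when the last copy of that delegate is gone; the closure itself gives you no such hook.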
But why get so upset about weird advanced programming techniques not coming out quite right?
Because the kicker is that, to those of us who grew up with garbage collected languages, first class functions and abstract data types are elementary programming techniques. They are the bricks and mortar of which everything else is made. "Data structures + Algorithms," you see. Sure, C++ programmers consider it rocket science and discuss ad nauseam their clever smart pointer techniques and their baroque fifty-line function object implementations (or, if they advocate Boost, their two line function object implementation that requires a five thousand line header file and employs a completely different syntax from everything else they do). That's because they're now used to getting through life with no arms and artificial legs.
The sense in which garbage collected languages make memory leaks a thing of the past is this: that if you received a non-C++-adapted education, focussed on data structures and algorithms and not the fifty-three (or five thousand and six - they make money, let's invent more) Programming Patterns that help you evade the design flaws of the One True Language, and so you are in the habit of thinking and coding using callbacks, strategy functions, abstract types, state encapsulators - all those basic things that (unless the goal is avoiding the shortcomings of C++) are taught in school, and, indeed, all those things that both functional programming and object oriented programming were invented to make notationally direct, then you can just go ahead and code what you think, and you won't be bitten on the bum. The abstract model of computation comes reasonably close to matching the reality. Without it, you're still tracing through the execution in your mind at every step, because relying on the abstraction itself will get you burned.
Yes, a competent programmer can adapt. Yes, a competent programmer can think at the level of assembly language and either work out exactly the lifetime of the data, or do a second explicit computation, woven in with the main one, to determine it dynamically. A competent programmer can also deal with a language having divergent notations for data, expressions, statements, type expressions, templates, and type expressions within templates; or to phase of the moon dependent name resolution (templates again!); or to notational 'abstractions' requiring manual instantiation in real implementati
Re:Stupid Slashdot headline (Score:4, Insightful)
It is easy to leak memory in C++ if you don't know what you're doing. It's easy to leak memory in C#/Java if you don't know what you're doing. That the language makes it easy for you to avoid becoming a better programmer is NOT a good thing.
I have a good analogy - Firefox. I use FF a lot, I like it, but it does tend to increase its memory usage over time, and has been rightly criticised for it. Now, I'm sure the 'bug' is an aspect of its design and not a programming bug (and I don't want to start a FF memory discussion - I'm only using it as a real-world example) but just imagine if *every* program was like FF - slowly using more and more RAM over time until you restarted it.
This is what you're asking for when you say that it's OK for the average programmer to use C# because the app will have fewer leaks: you'll get fewer leaks. Not none. If the app leaked like a sieve, it'd be spotted in development/test and fixed; if it leaks slowly enough that it passes an hour's testing in certification, then it'll be shipped - and it's you that'll get to find the bug as you use it.
Re:Stupid Slashdot headline (Score:4, Informative)
"NO! It is not a good thing, if a program slowly leaks memory then it just makes it harder to find the bug. If you have to reboot the app every week because it has a little leak, no-one's going to be bothered (except the users who see it slowly getting slower). If it has to be restarted daily then you're going to be looking to fix the bug."
Actually the good companies do debug the slow memory leaks, and the bad ones don't. Besides, any memory leak in a Java program is possible in a C app, so you are eliminating a class of leaks, not replacing them with harder-to-find leaks. Thus your entire argument is moot. Furthermore, where are you getting the idea that Java memory leaks are going to be slow while C memory leaks are going to be fast? I've seen slow C memory leaks and fast Java ones. I can think of nothing regarding the nature of garbage collection that would affect the speed of the leak.
"I have a good analogy - Firefox. I use FF a lot, I like it, but it does tend to increase its memory usage over time, and has been rightly criticised for it. Now, I'm sure the 'bug' is an aspect of its design and not a programming bug (and I don't want to start a FF memory discussion - I'm only using it as a real-world example) but just imagine if *every* program was like FF - slowly using more and more RAM over time until you restarted it."
First, that's not an analogy, that's an example. Second, Firefox is not an application, at least not in this day and age. Today it's a platform for web applications, which are just as vulnerable to memory leaks as any other. If that cool new javascript app that is running on the page you are loading leaks memory, there really isn't a whole lot Firefox can do.
Third, I fail to see your point. Do memory leaks suck? Of course. Is it best to get rid of them? Of course. Will garbage collection get rid of memory leaks? Of course not. Will they make the problem any worse? No, any code that leaks in Java will also leak in C. Will it make it better? Of course, there are types of leaks which simply are not possible in Java. Those will be eliminated, resulting in fewer leaks (though it is of course impossible to eliminate them completely).
no one claims C# prevents wasting memory... (Score:4, Informative)
Re: (Score:2)
I personally prefer using C++ with Valgrind not some fancy language with garbage collection.
This is not a C# memory leak! (Score:5, Informative)
Though we thought we had cleared all references to old entries in the list, because the objects were still registered as subscribers to an event, they were never getting deleted.
So references were held to the objects in two places - the list of encountered obstacles, and the list of event subscribers. They were being removed from the list of encountered obstacles, but not being unsubscribed from the event.
How do you think event subscription works? Something has to hold a reference to the objects that are subscribed to the event! That thing is going to hold a reference until you unsubscribe the object - it neither knows nor cares about any other list of references you may be maintaining separately, how could it?
This is a coding error. A subtle, non-obvious one perhaps, but a bug nevertheless. It is not an error in the CLR, and in fact the article never paints it as such. That particular bit of spin is wholly down to the submitter.
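For anyone who hasn't been bitten by this yet, the shape of the bug is roughly the following (my own invented names, not the team's actual code):

    using System;
    using System.Collections.Generic;

    class Sensor
    {
        public event EventHandler Updated;
        public void Tick() { if (Updated != null) Updated(this, EventArgs.Empty); }
    }

    class Obstacle
    {
        public void OnSensorUpdated(object sender, EventArgs e) { /* re-check position */ }
    }

    class ObstacleTracker
    {
        private readonly Sensor sensor;
        private readonly List<Obstacle> obstacles = new List<Obstacle>();

        public ObstacleTracker(Sensor sensor) { this.sensor = sensor; }

        public void Add(Obstacle o)
        {
            obstacles.Add(o);
            sensor.Updated += o.OnSensorUpdated;   // second reference: the event's invocation list
        }

        public void Forget(Obstacle o)
        {
            obstacles.Remove(o);                   // drops one reference...
            sensor.Updated -= o.OnSensorUpdated;   // ...and this is the line that gets forgotten
        }
    }

Leave out the -= and every "forgotten" obstacle stays reachable through the event, which is exactly what the team described.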
Hard/weak references for event handlers (Score:4, Interesting)
So I guess the real question here is whether event handlers should be hard-referenced (as they are here), or just soft/weak referenced...
From a developer perspective it's quite natural to think that, as long as his code doesn't hold any reference to an object, it should be garbage collectable. If registerEvent() is going to hard-reference handlers, documentation should be *very* explicit about it (and about the need to unregister a handler for GC to work on it).
On the other hand, if handlers are not hard-referenced you can no longer register anonymous class event handlers...
Re:Hard/weak references for event handlers (Score:5, Interesting)
The poster of the article was trolling, and not only trolled with the post, but managed to get a troll posted on top of a slashvertisement which was not even trolling itself.
Impressive on the part of the person who submitted it, but disappointing considering Taco's comments a few weeks back about articles that are truly nothing but advertisements.
Slashvertisement at its best (Score:3, Insightful)
Re: (Score:2)
Pretty useless point to make though. People aren't using C# for memory management. They are using it because Microsoft basically no longer does any worthwhile C/C++ gui development.
Re: (Score:3, Insightful)
Or better yet, in C++ and use the RAII idiom. I.e. utilize the power of deterministic destruction, that C# and Java lack, to arrange it so that resources, including but not limited to just memory, are auto-freed. (You *can* run into this same kind of problem using reference-counted smart pointers in C++, but happily much of the time they aren't needed.)
Categorization of Resource Management Issues (Score:4, Funny)
Background:
There are more types of resource leaks than just memory leaks. A memory leak is when your program keeps hold of memory it's not using. An object leak is when your program keeps hold of objects it's not using. A file descriptor leak is when your program keeps hold of descriptors for files it's no longer using and will not reopen. Many other types of leaks could be considered.
Exercises:
1. Determine which issue this scenario describes.
2. Figure out which issue can be handled by automatic memory management.
3. Discuss whether, and if so why, the answers to Exercises 1 and 2 mean there is some conceptual discord between the wording of the scenario and the use of the term "memory leak".
stop programming hardware, please! (Score:2)
c#? (Score:3, Interesting)
They were using massive cooling systems and having very thorough code reviews, sounds like a perfect reason to use C over C#.
Re: (Score:3, Insightful)
Why MS Windows? (Score:2)
The immediate problem here was evidently a programming error, not a bug in C#, but I do wonder why they are using C# for this. That forces everybody involved to use MS Windows, and eliminates the possibility of hacking the system if they need to, as well as using the source as documentation. If they want a C-level language with garbage collection, why not Java or D or any of several others?
only 10KLOC? (Score:4, Interesting)
Of course memory leaks can happen with garbage collected languages, but these leaks are a little easier to find....
Maybe they should have coded in a higher-level language like OCaml or Haskell.
And yes, I'm sure most of an autonomous vehicle's software is not low-level drivers, but planning & perception tasks. On such tasks, higher-level languages definitely make sense.
I also don't understand what kind of libraries these teams are using.
I'm also surprised that it is apparently so easy to get funding with only 10KLOC inside the car!
Re:only 10KLOC? (Score:4, Insightful)
Re:only 10KLOC? (Score:5, Insightful)
Seriously, the performance of these cars is amazing, a huge step from just a few years ago. The hard part of this project was certainly not the programming, but the concepts behind the obstacle detection and such. This is not an implementation exercise, but more of an academic experiment to test out new ideas.
(Nice work on mentioning Haskell to guarantee an upmod btw.)
Nothing new here move along....OH Wait? (Score:2, Funny)
don't these kids learn anything anymore? (Score:5, Interesting)
(2) You are particularly supposed to test your software if you send $200k and 1 ton of hardware careening through the street on autonomous real-time control.
(3) Garbage collectors do not prevent memory leaks.
(4) Garbage collected systems can be good for building real-time systems, but you need a real-time garbage collector or you need to treat the system as if it didn't have a garbage collector at all.
What "ruined their chances" was not that they overlooked a memory leak, what ruined their chances was that they didn't know what they were doing.
I think they just learnt something (Score:3, Informative)
They also didn't pick a very good hack because it didn't leave the car in a safe state when the software broke.
Lack of practical experience I'd say. A few more events
Another similar Problem... (Score:4, Informative)
However, I had another memory 'leak' problem where the garbage collector simply didn't collect in time, which caused my application to use more and more memory until it reached the system limit and crashed... I found that simply calling
GC.Collect();
GC.GetTotalMemory(true); // passing true forces a full collection before this returns
once would fix this problem... I thought I needed to call it every minute or so, but calling it just once did SOMETHING that prevented the problem from occurring again... no idea exactly what, but it works.
Swing (Score:4, Informative)
Designed for safety ? (Score:3, Insightful)
Yikes. So these guys have the smarts to make a computer drive a car on its own, but managed to forget some basic safety mechanisms, such as a watchdog and other failsafes?
Geez guys - real world engineering 101: Do not let a computer control anything that might have a remote chance of harming someone without appropriate safety mechanisms.
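Even a purely software watchdog is only a few lines. A hypothetical sketch (names and thresholds invented, and a real vehicle obviously needs hardware interlocks on top of this):

    using System;
    using System.Threading;

    class Watchdog
    {
        private long lastPetTicks = DateTime.UtcNow.Ticks;
        private readonly TimeSpan timeout;
        private readonly Action emergencyStop;
        private readonly Timer timer;

        public Watchdog(TimeSpan timeout, Action emergencyStop)
        {
            this.timeout = timeout;
            this.emergencyStop = emergencyStop;
            this.timer = new Timer(Check, null, TimeSpan.Zero, TimeSpan.FromMilliseconds(100));
        }

        // The control loop calls this on every healthy iteration.
        public void Pet() { Interlocked.Exchange(ref lastPetTicks, DateTime.UtcNow.Ticks); }

        private void Check(object state)
        {
            var last = new DateTime(Interlocked.Read(ref lastPetTicks), DateTimeKind.Utc);
            if (DateTime.UtcNow - last > timeout)
                emergencyStop();   // e.g. cut throttle and apply brakes
        }
    }

If the control loop bogs down the way theirs did, the watchdog fires and the car stops instead of wandering into the bush with the throttle open.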
This article is badly Slashdot-spun. (Score:3, Informative)
- "so it wasn't a memory leak per se"
- "It was the closest thing to a memory leak that you can have in a "managed" language. "
- "Unfortunately, our system was seeing and cataloging every bit of tumbleweed and scrub that it could find along the side of the road."
So they just goofed up.
The objects didn't get deleted in time, because there were always ( literally
Bad Slashdot. Bad Slashdot.
Wow, how embarrassing (Score:3, Interesting)
Criticisms of the team aside, I would like to say that neither Java nor C# has taken any steps to remedy problems like this, which seem to be all too common with inexperienced developers. Both Java and C# need to support attaching to events with "weak" handlers. That is, the handler will not hold onto the object which defines the handler (and will automatically deregister itself sometime after the object has been collected). In many cases, there is a need for an object to listen to and handle an event from another object, but only whilst the object that is listening is still referenced (with the exception of the reference held by the object firing the event).
In C#, the (admittedly ugly) way to implement this is to use an anonymous method and a weak reference: the "closure" that is created for the anonymous method does not hold a reference to "this", as it does not access any of "this"'s fields or methods except through the WeakReference.
The code has a flaw where the event handler (only a few bytes to hold the closure) will never be deregistered or collected unless the event is fired sometime after the owner object has been collected. This can be fixed by using a NotifyingWeakReference (a weak reference that raises an event when its target has been collected).
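Roughly what that pattern looks like, as a sketch of my own with made-up names (ObstacleDetector, Listener and WeakSubscriber are not real APIs):

    using System;

    class ObstacleDetector
    {
        public event EventHandler ObstacleSeen;
        public void RaiseObstacleSeen()
        {
            if (ObstacleSeen != null) ObstacleSeen(this, EventArgs.Empty);
        }
    }

    class Listener
    {
        public void OnObstacle(object sender, EventArgs e) { /* react to the obstacle */ }
    }

    static class WeakSubscriber
    {
        // Subscribes 'listener' without keeping it alive: the closure holds only a WeakReference.
        public static void Subscribe(ObstacleDetector source, Listener listener)
        {
            WeakReference weak = new WeakReference(listener);
            EventHandler handler = null;
            handler = delegate(object sender, EventArgs e)
            {
                Listener target = (Listener)weak.Target;
                if (target != null)
                    target.OnObstacle(sender, e);    // still alive: forward the event
                else
                    source.ObstacleSeen -= handler;  // collected: clean up the subscription
                                                     // (this only runs when the event next fires,
                                                     // which is the flaw described above)
            };
            source.ObstacleSeen += handler;
        }
    }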
Bad Bad article title (Score:3, Insightful)
if the moderator read the article he would have noticed that the article was an advertisement for the profiler product, not just a review of it (it was written directly by Red Gate).
Second, the article itself says that they found that the error was in how they coded the application, because they left some references around, so the garbage collector didn't throw away the objects.
This is a really bad article and bad information.
The right answer to memory management (Score:3, Informative)
There's actually an accepted safe way to do memory management - reference counts and weak references. That's what both Perl and Python have settled on, and it's worth noting that programmers in those languages seldom have serious memory management problems. In C and C++, one has to obsess on memory management issues, and even in Java and C#, which are garbage collected, it takes more attention than it should.
Reference counts have the advantage of repeatability - deletion will occur at predictable times. This allows the use of destructors. You can safely use destructors to manage other assets, like windows, open files, network connections, and such.
Destructors in systems with garbage collection make for an unhappy marriage. Calling a destructor or finalizer from the garbage collector is essentially equivalent to calling it at some random time from another thread. So race conditions are possible. Check out Microsoft's "managed C++" for an attempt to get all the cases for this right. It's not pretty.
The classic complaint about reference counts is "what about cycles"? There's a simple answer - cycles, that is, loops of strong pointers, are errors. This isn't a severe restriction; it just requires some data structure design. With trees, for example, links towards the leaves are strong pointers, and links towards the root are weak. (I've revised Python's BeautifulSoup HTML parser to work that way; "down" and "forward" links are strong, while "up" and "backwards" links are weak. It took about 20 lines of code and eliminated annoying problems in programs dealing with HTML trees.)
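The same shape sketched in C# (a tracing GC doesn't strictly need it, since cycles get collected anyway, but it shows the strong-down/weak-up idea):

    using System;
    using System.Collections.Generic;

    class TreeNode
    {
        private readonly List<TreeNode> children = new List<TreeNode>();
        private readonly WeakReference parent = new WeakReference(null);

        // May return null once the parent is no longer strongly referenced anywhere.
        public TreeNode Parent { get { return (TreeNode)parent.Target; } }

        public void Add(TreeNode child)
        {
            children.Add(child);          // strong "down" link: parents keep children alive
            child.parent.Target = this;   // weak "up" link: children don't keep parents alive
        }
    }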
If you really need a symmetrical circular list, which might happen in, say, a window library with many links between widgets, there's a simple solution. Have all the objects owned by some collection, then use weak pointers between them. When the collection is dropped, all the bits and pieces go away, in a well defined order.
In Python, you can turn off garbage collection while leaving reference counting active, then list any orphaned cycles at program end for debugging purposes. This is a practical way to program without leaks or garbage collection. It's generally easy to find cycles, because cycles are created by data structure design, not by bugs. So if a program has cycles, it will probably have them every time, and thus they can be found early in debugging. With better language support for debugging, cycles could be caught at the moment of creation, which would make it easy to eliminate them.
Now if we could get this into a hard-compiled language, we'd have the problem solved. Repeated attempts to bolt reference counting onto C++ via templates have resulted in fragile systems. The fundamental problem is that C++ still requires access to raw pointers to get anything done, and this puts a hole in the protection provided by the reference counting system. It takes language support to make this work right.
Bad summary. but obvious FREE solution (Score:5, Insightful)
Just because you *can* do something doesn't mean you should.
Brett
Re: (Score:2, Insightful)
Re:As a C kernel programmer... (Score:5, Funny)
Re: (Score:2)
Re: (Score:3)
Ahahaha! (Score:3, Interesting)
This kind of thing makes me so happy. Sure, it's not really a bug in C#, but this is even better, a perfect demonstration of how GC does next to nothing to prevent this type of bug, and instead fools people into complacency while making the bug much more subtle.
In my opinion there is a proper language level for nearly any task. For kernel programming, drivers, or RT stuff, C. User-level stuff is usually better in C++. Well, I'm a big fan of C++ and more comfortable there so I'll usually extend its range do
Re:Ummm... (Score:4, Insightful)
It's the programmer and the language. Give the world's best carpenter a ball-peen hammer and ask him to build you a beautiful armoire, see what happens.
You can say now that they'll be much further next year, but until then "Which means that the language did the job very nicely" should be "Which would mean that the language did the job very nicely." If you put in a reminder of some sort to come back and say I told you so, I'd be more than happy to eat my words if they continue using C# and place in the top 33%. Hell, I'd even concede that you might be right if they manage the top 50%.
I say, however, that there is a right language for the job. Sure, there's overlap, but you don't implement your FFT in Perl when the problem is that you need the fastest FFT possible, you don't write a word-processor in assembly, and you don't write anything in Brainfuck even though they're all Turing-complete. Anyone who says you can do anything in any language is trying to justify using their favorite language for absolutely everything.
Re: (Score:3, Insightful)
It's my experience with it that apps written using it are poor too, though. I've been headhunted twice now by companies that rewrote their old apps in this cool, new language only to find that it performed so badly they couldn't sell it. (I used to be a performance
Re: (Score:2, Insightful)
Re: (Score:2)
More than likely. Basically they're saying "C# didn't garbage collect objects that were still in use!", which is quite obvious. I've been caught out a couple of times by not thinking through which objects have references to other objects, and ended up leaking memory. Other times I've come across bugs in the JDK where a class refuses to let go of an object, causing a leak (in fact, it was something to do with the focus manager IIRC, keeping a reference to a GUI component that I had disposed of).
Regards
elFarto
Re: (Score:3, Informative)
This problem is actually les
Re:C# Garbage... (Score:5, Informative)
The main problem with garbage collectors (I like GCs, so this isn't a diatribe against them) is that far too many mediocre programmers assume the GC has a magical ability to know precisely what they want their code to do. The reality of course is that GCs use algorithms to decide what should be collected, when it should be collected, and how it should be collected, and those who are unfamiliar with the particular strategies that their GC uses can therefore not only write code with more than a few memory leaks, but also code that results in the GC being used so inefficiently that it does vastly more work than would be necessary if the same functionality was implemented in a slightly different way.
There are plenty of articles about Java memory leaks that can be found by Googling "java memory leaks". Googling "java GC tuning" will produce some useful links to articles containing tips on ensuring that it's not used inefficiently.
Re:Friends do not let Friends use Windows and Driv (Score:5, Funny)
Re: (Score:3, Informative)
I just checked, and C# apparently uses reference-count garbage collection.
erm... no. the CLR implements a mark-sweep-compact generational GC pretty much like Java's.
these don't have any problems with circular reference structures - if it can't be reached from a root and marked, it'll get collected.
still just a blunder, as you say.
this article should be binned - misleading title and nothing but a puff-piece for a profiler. i much prefer YourKit, incidentally:-)
Re:Reference counting (Score:5, Informative)
Actually, C# doesn't reference count at all, it 'Reference Traces' :)
Please, let me explain; it's quite sad how often people don't get this ...
.Net has its block of managed memory, called the Managed heap. It's separated into 3 'generations'. This heap has 2 areas, free space and reserved space, from top to bottom.
When you allocate an object on the heap using the new keyword (object o = new object();), there is a set of rules that have to be enforced.
The GC manages reference tracing, and this doesn't happen when an object goes out of scope; it actually happens when the heap is full and you attempt to allocate a new object.
In something called 'the sweep', the GC goes through each object in the heap to see if it's reachable. To do this it starts with so-called 'roots'. It then traces to see which objects are referenced by these roots.
A root identifies a storage location which refers to an object on the managed heap, or which is set to null. For example, all of an application's global and static objects are considered to be its roots (hence the reason that all C# apps have a static void Main).
When the sweep starts, it assumes that all objects are garbage. So for each root object, it builds up a graph of the objects that root references, and marks them as being live.
However, if it finds an object that's already in the graph, it stops traversing that path. This is to (massively) increase performance by not scanning the same object twice, and more importantly, it stops you getting into an infinite loop when scanning a circular list.
The upshot is that it prevents the circumstance you mentioned! :)
Because the strong reference to the linked circular list is gone, the circular list isn't attached to a root object, so it gets collected. If you don't want it to get dropped unless there's a memory shortage, the C# GC also supports something called weak references, but I'm not going to go into those here as it's head-hurting stuff.
So once all the roots have been checked and we've got a nice graph of all the objects that are referenced by the live parts of the application somehow, the second stage of GC happens.
Any objects that haven't been touched by the walk are of course still marked as garbage. The GC now walks up the heap linearly, looking for contiguous groups of garbage, which are now considered to be free space. The GC looks for the next live object and moves it to the start of this free space with a good old memcpy :)
This of course invalidates all the existing references, so the GC then updates the pointers in the root objects.
:)
So now, we've got rid of all the garbage and our heap is pleasantly compacted; Take that Heap Fragmentation, Kerpow!!
But, that's not all she wrote of course
Now we're freed and compacted, the 'nextObjPtr' is moved to the top of the heap. At this point the new object creation that triggered the collection is performed, and the new object appears at the top of the heap.
This is a dramatic over-simplification and I've not attempted to explain finalization or weak references, but it's still good to know this stuff, it helps us as .Net programmers to consider how to write our code properly :)
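If you want to watch the sweep happen from code, something like this rough sketch works (behaviour varies a little between debug and release builds, since the JIT may keep locals alive until the end of the method):

    using System;
    using System.Runtime.CompilerServices;

    static class SweepDemo
    {
        [MethodImpl(MethodImplOptions.NoInlining)]
        static WeakReference Allocate()
        {
            var data = new byte[1024];
            return new WeakReference(data);   // a weak reference is not a root
        }

        static void Main()
        {
            WeakReference wr = Allocate();
            Console.WriteLine(wr.IsAlive);    // almost certainly True: no sweep has happened yet

            GC.Collect();                     // force a sweep: the array is unreachable...
            Console.WriteLine(wr.IsAlive);    // ...so this is now (normally) False
        }
    }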
The other thing I've not explained is how the Generations work: