Pattern Hatching: Design Patterns Applied

Tal Cohen, one of our best book reviewers, thought that the Slashdot audience would appreciate Pattern Hatching: Design Patterns Applied, a book written by John Vlissides. This is not one of the many new pattern catalogs; instead, it shows how patterns are applied in real life -- and it also includes some interesting guidelines for would-be pattern writers.
Pattern Hatching: Design Patterns Applied
author: John Vlissides
pages: 172
publisher: Addison-Wesley, 06/1998
rating: 7/10
reviewer: Tal Cohen
ISBN: 0-201-43293-5
summary: Not one of the many new pattern catalogs, this book shows how patterns are applied in real life -- and also includes some interesting guidelines for would-be pattern writers.
Along with all the hoopla surrounding patterns these days comes more than a little confusion, consternation, and misinformation. This is partly a reflection of how new the field is to mainstream software developers even though it's not, strictly speaking, a new field. It's a fast-moving field too, creating vacuums of facts. And yes, we pattern proponents deserve some of the blame for not educating people as thoroughly as we'd like, though not for lack of trying.

John Vlissides, a member of the "Gang of Four" -- the authors of Design Patterns -- presents another book on the subject. In part, this book presents some important aspects of using patterns; hence the subtitle "Design Patterns Applied" is right on the mark. Vlissides shows when certain patterns are used in real life, and even quotes a lot of e-mail from other users, showing pattern advantages and, notably, weaknesses. As Vlissides states, "This book conveys something very important that's missing in the more academic and polished Design Patterns book -- that patterns are the result of real, working programmers who don't get everything right the first time, and who struggle with the pragmatics of recurring design practices" (page x). Herein lies the book's greatest strength. Another part is aimed at pattern writers, rather than users. Sadly, there is no clear separation between these two aspects of the book.

The book is basically a collection of Vlissides's columns from C++ Report, and this is arguably its weakest point: if you're not intimately familiar with the little details of C++, you'll miss much of the action. This is a pity, because the original Design Patterns book was careful to present patterns in a relatively language-independent manner, in both C++ and Smalltalk. Here, the view is strongly biased towards C++, in ways that will sometimes annoy (or amuse) users of other languages, such as Smalltalk or Java.

Consider, for example, Chapter 3: "Themes and Variations". A large portion of the chapter deals with the de-allocation problem of Singleton objects. Any programmer who uses a garbage-collected language will be simply amused by the entire hoopla. The only conclusion from the entire discussion, one that the author himself admits (p. 69 and elsewhere), is that the lack of garbage collection greatly increases design complexity, in addition to program complexity. A design that had the potential of being elegant is turned into an ugly monster by the need to manually manage memory deallocation. Not to mention the suggested use of "delete this" (p. 41). The author is "not sure why, but [...] people wince at delete this". Hint, John: objects sometimes reside on the stack. In which case delete this can have very... interesting consequences. (True, in the context of a Singleton, where allocation is controlled and cannot occur on stack, this specific problem is not present, and yet "delete this" is such an evil construct, with such horrible possible results, that I'd rather avoid it even in such controlled cases.)
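
To see why people wince, here is a minimal sketch of the hazard (my example, not the book's; Widget is a made-up class):

    #include <iostream>

    class Widget {
    public:
        void finish() {
            std::cout << "done\n";
            delete this;    // legal only if *this was allocated with new
        }
    };

    int main() {
        Widget* heapW = new Widget;
        heapW->finish();    // fine: the object came from new

        Widget stackW;
        stackW.finish();    // undefined behavior: delete of a stack object
    }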

And sadly, the entire discussion of the use of Singletons ignores what I've come to conclude is the pattern's worst, and most common, misuse: Singletons, in my experience, are a tempting (and in practically all cases wrong) replacement for global variables. A designer who finds a need for a global variable simply replaces it with a Singleton, and pronounces that not only is he not using globals, he's using a known pattern -- which must mean his design is good.
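
For readers who haven't seen the misuse, a hypothetical sketch of mine -- a "Singleton" that is really a global in costume:

    class Config {
    public:
        static Config& instance() {    // the global access point...
            static Config theConfig;
            return theConfig;
        }
        int verbosity;                 // ...to what is really global state
    private:
        Config() : verbosity(0) {}
    };

    // Any code anywhere can now write Config::instance().verbosity = 3 --
    // exactly the hidden coupling a plain global variable would create.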

Chapter 2 is a tour-de-force of using "classical" patterns, those presented in the original Design Patterns book. The examples look a little forced, and make life seem too simple (every problem has a corresponding pattern that fits it like a glove), and yet it is a most educational read. For one relatively simple problem, the author comes up with an elegant solution that uses no fewer than six patterns (Proxy, Composite, Template Method, Visitor, Mediator and Singleton) -- and suggests an enhancement using a seventh (Observer). This discussion alone could be worth the effort of reading the book, as it exemplifies the correct and practical use of patterns in real life. The end of the chapter presents a very nice kind of diagram, invented by Erich Gamma, which could be very useful indeed. This "pattern:role annotation", as Gamma calls it, allows people who are sufficiently fluent in the pattern language of design to immediately grasp the roles of different objects, even in a large design.

Chapters 4 and 5 are aimed mainly at pattern developers, though this is not clear when one begins to read them. They include an example of the Gang of Four's elaborate discussions about pattern design, with lots of quotes from e-mail messages, and so on. This is interesting only to a certain extent. The discussion, basically, is about whether Multicast should be presented as a pattern in its own right, or as an adaptation of the Observer pattern. Most of the discussion seems to deal with minute nuances, but the conclusion (the presentation of a whole new pattern, "Typed Message") is a beautiful one.

The concluding chapter (5) is titled "Seven Habits of Effective Pattern Writers". The name says it all. Though some of the advice presented (e.g. number 5, "Presenting Effectively") is trivial, some of it (e.g. number 1, "Taking Time to Reflect") is very instructive. (Don't be misled by that simple title, "Taking Time to Reflect": Vlissides provides some really important tips under this seemingly obvious one.)

In conclusion, if you do a lot of object-oriented design, and use (or consider using) design patterns, reading this book can be a good use of your time; but unless you spend time on designing patterns, too, you might wish to simply skim the last two chapters.

Pick this book up at ThinkGeek.

For additional book reviews, please visit http://www.forum2.org/tal/books.

Comments:
  • Excellent review, which almost forced me to buy the book. The finest thing about the patterns books is that we can all finally put the same names to practices we've used for years.
  • by maroberts ( 15852 ) on Friday February 04, 2000 @05:30AM (#1305879) Homepage Journal
    If you didn't understand a word of this review [and I've got to admit I came close] then perhaps this web site will help

    Design Patterns [hillside.net]

    This is the homepage of the previous book in the series.

  • by Dacta ( 24628 ) on Friday February 04, 2000 @05:38AM (#1305880)

    I love patterns. I suspect that many /. readers are going to say "patterns - and all that OO stuff is just another software development technique that will die in five years".

    It's not - and those of you who code large systems that are underfunded and have too-short deadlines know this.

    Patterns - and, to a certain extent, OO design - allow software development to become engineering, with all the advantages that brings.

    In particular, reuse of subsystems has made (non-software) engineering a profession that produces reliable products most of the time.

    If you compare that to software engineering, we are still in the early years of the last century. Before standardised engineering components (girders, bolts, whatever), everyone had to make them for themselves. Now you can just go and buy them and know how reliable they are.

    Patterns allow us to develop systems in repeatable ways that are known (proven) to work. Combined with component reuse, I'm convinced they are the only way that we are going to be able to keep up with the increase in complexity of the systems we build.

    The other advantage is that Patterns give us a shared vocabulary. Yesterday, I was discussing with another programmer how we could allow various parts of the system we are building to get notification of user logons and log offs.

    He starts trying to explain this complex system of events and callback registration, and I go "The Observer Pattern?", he goes "Yep", and we both know exactly what we are doing. You can't beat that.
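
    For anyone who hasn't seen it, here is a minimal sketch of that design in C++; SessionManager and LogonObserver are made-up names:

        #include <cstddef>
        #include <vector>

        // Anyone who cares about logons implements this interface.
        class LogonObserver {
        public:
            virtual ~LogonObserver() {}
            virtual void onLogon(int userId) = 0;
            virtual void onLogoff(int userId) = 0;
        };

        // The subject keeps a list of observers and notifies them all;
        // it never needs to know what the observers do with the news.
        class SessionManager {
            std::vector<LogonObserver*> observers;
        public:
            void attach(LogonObserver* o) { observers.push_back(o); }
            void logon(int userId) {
                for (std::size_t i = 0; i < observers.size(); ++i)
                    observers[i]->onLogon(userId);
            }
            void logoff(int userId) {
                for (std::size_t i = 0; i < observers.size(); ++i)
                    observers[i]->onLogoff(userId);
            }
        };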

    I don't know about this book - I'm not a big C++ fan anyway, but if you haven't read about patterns get yourself the Gang of Four's book and do yourself a favour.

  • by duplex ( 142954 ) on Friday February 04, 2000 @05:39AM (#1305881)
    I'm not impressed by this review. Having read Design Patterns from the GoF, "A System of Patterns" by some five guys, and this little book, this one is definitely the most down-to-earth of them. The subject of destroying Singletons is a very important and overlooked issue, and the fact that Singletons can be misused is in no way detrimental to Pattern Hatching. The examples from the book often touch on problems I encountered myself when coding.

    The fact that the book is focused on C++ stems from its aim, which is to show the patterns applied. If he had settled for some abstract or rarely used lingo, would that be showing design patterns applied?!

    I think the discussion on the Observer is the best I have ever read; the treatment is solid, and all the tradeoffs are discussed. It's a pleasure to read, and a very enlightening approach to the Most Misused Pattern in the World(tm).

    By all means ignore this review and pick up a copy. It's a masterpiece from a guy who really knows what he's on about. If you understood the GoF book, you'll find this a great addendum. This book deserves a place next to books like Writing Solid Code and Code Complete.

    I'm embarrassed to see Slashdot giving this book such a poor review. The book deserves all the praise, and I strongly suspect that the reviewer never tried applying design patterns. Pattern Hatching shows you how it all fits together. It doesn't discuss individual patterns, but brilliantly shows you how they all interact. Buy it!

  • Hey, look -

    10010110110100101110110100100111001101001
    Looks surprisingly like...
    1001010110110001011010111011011010001001

  • I found a hole a while back that would allow you to post images via the poll.pl script. They fixed that, and the url in my post didn't work anymore (because /. stripped the encoded HTML from it automatically). Probably the same thing happened to you.

    I didn't get -5'ed though - in fact, I think I ended up +1 (after a huge moderation war). I've never seen -5 before.

    BTW, there is still a hole that lets you post images - not on the story, though. That was a cool hack.

  • I've read Design Patterns and agree that it is one of the most useful books I've ever read (right up there with books by Scott Meyers, Sedgewick and Bjarne Stroustrup)... Click on the Amazon link to read several comprehensive user reviews [amazon.com] of the book, just in case the link above seemed rather dry to you.

    PS: Please do not buy this book on Amazon. Go to Addall [addall.com] and find a vendor. Please keep up the boycott...thank you.
  • I'm seriously beginning to think that they are bringing in victims of severe head trauma to moderate posts. Somehow my original post is 0 (off-topic) and yet your reply gets modded to 2? That makes a whole lot of sense.

    Morons.
  • by Tower ( 37395 ) on Friday February 04, 2000 @05:54AM (#1305888)
    Why does the "ThinkGeek" link point to http://slashdot.org????????
  • I found the reviewer's bias toward languages with built-in garbage collection a little annoying. For some projects it's appropriate to use a language with garbage collection (web scripts), and sometimes it's not (operating systems). While he may be "amused" by the need for manual deallocation of memory, he should understand that it's generally far more efficient.

    Unfortunately, I suspect the reviewer doesn't have enough experience with low-level system programming to understand these trade-offs. Until he does, he should resist making judgements.

    P.S. This is not to defend C++ as a language...



  • I agree, patterns are beautiful! I've developed in C++ for 6 years. The most influential book in my development career is Design Patterns.

    Right now I am working on an underfunded project. We are developing a realtime system to monitor many (150+) separate subsystems. In general, one update a second from each subsystem is good enough, but there are times when we will need updates from a couple of subsystems 1000 times a second for a couple of seconds. The subsystems can communicate across a couple of different communication lines, and each of these lines uses a different communication protocol. Right now we are using RS-485 and CAN bus 2.0B. The development effort involved is hard to imagine. The patterns that I have found most useful are the Singleton and the Factory. So much detail is abstracted by the Factory that I can add new subsystems/communications to the system by touching only 5% of the code. Another pattern that I am using (I think it is Actor; it's been a while since I read the book and it isn't handy right now) has abstracted the details of communicating with different subsystems so much that development is moving along almost too smoothly. On top of all this, there are multiple, different GUIs that are to interact with the subsystems, and whatever one GUI does, the other GUIs (if applicable) are to respond accordingly, either by displaying the info or acting on the change. The GUIs are to be able to connect and disconnect from the main system at will, and any actions they performed are not to be lost.

    We are using Linux as our development and deployment platform, and I am making heavy use of System V IPC. If it wasn't for "Design Patterns" I would not have known where to begin. It opened my eyes to many aspects of OOA, OOD and OOP where other books simply fell short. "Pattern Hatching: Design Patterns Applied" will be the next book that I add to my library. For the GUI I am using GTK; its OO approach fits so nicely that I couldn't pass it up.

    I'm sorry if the above text is a little confusing, but I cannot divulge any details, due to management's paranoia about competitors (which in this industry is well founded).

  • His post started as a 2, it wasn't moderated up. You can see the moderation done to a comment by clicking on the comment #.
  • Absolutely right. It is one point in a sliding scale of tradeoffs.
  • Oops.

    My brain tends to freeze up when I haven't had my cup of coffee and I have to start fixing NT boxes. :(
  • by SEWilco ( 27983 )
    I should look in the current SlashCode and fix something myself. Sometimes there isn't time for all trivial reported problems to be fixed.
  • Must just be /. trying to increase its hits. :-)
  • > Where did you hide the last few years ?
    Well, that comment was a little thoughtless, but if you must know...

    I've been busy supplying performance improvements to css-auth [you know, a certain Linux DVD decryption program]. Also, I occasionally contribute to KDE; not often enough, as I have a real job [telecomms s/w design] which surprisingly often involves C & C++.

    Unfortunately it's no longer possible to read everything that comes out in the field of software design. Design patterns have not even been seen on my reading list and are not required reading for telecoms software either.

    I actually came away thinking this is another design fad, like Yourdon, Shlaer-Mellor, SSADM, UML... the list is long and tedious.

  • Is this an American thing?

    No, it is not.

    I found patterns useful because they are solutions to problems... problems that recur in many, many software projects. So instead of having to find my own way of solving said problem (again), I can pick up my catalog of design patterns, see if any of them fits the problem I have at hand, and check what forces and counter-forces (for example, scalability vs. efficiency) a particular pattern may be affected by.

    Patterns are solutions that have gone through a process of extensive peer review, therefore they tend to be elegant and simple, and just save me a ton of time and several headaches.

    YMMV
  • 8^) Yeah... well, I work on an AIX (OB-JFS-PLUG) box now, so my nutri-grain bar and mountain dew breakfast combo gets me all sorts of ready to go in the morning. Mmmm... forbidden Donut.

  • It does not matter how much effort goes into something like this; there will be no substitute for thought. This, like much of the OO philosophy, is (in my view anyway) an attempt to reduce thinking to something you can do mechanically. Just follow the right instructions, and you'll write great programs.

    Mind you, I'm not saying that this is all bad. It might help some people sometimes, but most of the time stuff like this just takes up space in people's brains and gets in the way of reasoning. Instead of following their own thoughts, they try to remember and live up to the wise sayings of GodFather(tm) Suchandsuch... and invariably get it all wrong.

    Besides, I think it is a questionable practice to face a problem with an "in what pattern would it fit" attitude.

    My view, anyway,

    rmstar
  • I have this book. I haven't read it cover to cover (yet), but have absorbed sections as appropriate.

    I like it. It seems down to earth. I agree that, occasionally, the examples seem a wee bit forced, but I'm not bothered by the C++ focus, as that is my preferred language.

    I'd agree that the book is a good read if you are into patterns. If you aren't, you should check out the patterns web site, and the original Design Patterns book.

    Also, I went nuts and bought the original Christopher Alexander architectural patterns books. They look fantastic!

  • i.e. program it for the deadline and then draw lots of sticks and boxes to justify your design
    Um, not to sound harsh, but it sounds pretty darned idiotic to justify a design AFTER you finish implementing it. The point of the design is so that it can be reviewed to make sure it meets the requirements (otherwise, you won't even get funding for the implementation phase) and to keep programmers on the same page during implementation, so that the project can be effectively divided amongst a team of programmers without the communications overhead getting out of control.

    The point of using 'patterns' is so that people who design systems for a living can speak the same lingo during design, independent of the language being used for implementation. Same with UML. Their complexity characteristics are already well understood, and the metaphors being used ("observer," "factory", etc.) are also high-level enough that the overall design of a large component can be held in the head all at once. (You know, that 7+/-2 thing?)

    If you design large projects using an object-oriented approach (which doesn't exclude using C or even Perl), I'd think you'd find patterns essential. And yes, people had been doing OOD without using "patterns" as such for quite a while; but more often than not, the same general ideas were still being used, except that any metaphors used to explain them were basically ad hoc.

    -NooM
  • I wouldn't say this review was poor. The reviewer had an issue with the author's emphasis on C++, and I'd agree about that. There are several OO languages in reasonably common use (C++, Java, Delphi, even VB, with fairly large amounts of code in Smalltalk, Eiffel, etc. still in existence), and among object-oriented languages as a group, C++, while possibly the most commonly used (although newer projects tend to be in Java), is distinctly eccentric in several ways.

    If the author intended to make his advice generally applicable, it was possibly unwise to pick a language whose features and terminology are so unusual.
  • Not to mention the suggested use of "delete this" (p. 41). The author is "not sure why, but [...] people wince at delete this". Hint, John: objects sometimes reside on the stack. In which case delete this can have very... interesting consequences. (True, in the context of a Singleton, where allocation is controlled and cannot occur on stack, this specific problem is not present, and yet "delete this" is such an evil construct, with such horrible possible results, that I'd rather avoid it even in such controlled cases.)

    Seems that much the same things were said about "goto", quite a few years ago: It's bad, many times you should not use it, definitely not without understanding the consequences, use only in certain controlled situations...

    Well, okay, but personally I think that applies to just about any programming construct. Use it where it belongs, don't use it if it doesn't belong.

    The review appears to fall into the trap of "He wrote a book, but it wasn't what I would have written, so lots of it must be wrong." No, I don't think so.

  • At the risk of descending into techie wibble, I'd take issue with a couple of the reviewer's comments.

    First, Singleton is _not_ just a misguided substitute for globals. It's an answer to the so-called "static initialization order fiasco" - the compiler doesn't guarantee that globals in different obj files are constructed in any particular order, so if the constructor for one such global requires that another one already be initialized, you either take your chances or you adopt an initialize-on-first-use idiom, i.e. Singleton. It isn't necessarily good enough to say "but there aren't any dependencies"; there might be in the future, and this is a sufficiently subtle problem that the designer should be considering it from the start.
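
    For anyone who hasn't met the idiom, a minimal sketch (the Logger name is mine):

        // Construction happens the first time instance() is called, so
        // it can never run before a dependency it pulls in is ready.
        class Logger {
        public:
            static Logger& instance() {
                static Logger theLogger;   // initialized on first use
                return theLogger;
            }
        private:
            Logger() {}
        };

    Contrast that with a plain global "Logger theLogger;", whose constructor may run before or after globals in other translation units - the order across files is unspecified.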

    Second, I wouldn't agree that "delete this" is primarily bad because it screws up objects on the stack. If you're using it, your class shouldn't allow stack allocation in the first place. The main problem is that it invalidates a pointer in the caller's scope without giving the caller any clue that this has happened.

    Rant over. It is a good book; take a look.

  • Try the Learning Guide to Design Patterns [industriallogic.com].

    If you are interested in Human Computer Interaction and Object Oriented Techniques (like I am :) ) take a look at uidesign.net [uidesign.net]. There is section about patterns for UI.


    --Ivan, weenie NT4 user: bite me!

    Chapters 4 and 5 are aimed mainly at pattern developers, though this is not clear when one begins to read them.

    And who should not be developing patterns? (or who might read the book, but not those chapters?)

    Anyone who designs or codes programs finds himself continually seeking solutions to different problems. Each solution represents a pattern for solving similar problems, so either the designer/coder is using existing patterns or he is inventing new (or at least undocumented) ones. So I think every designer/coder is using and/or creating patterns. Consequently, every designer/coder who wants to improve his work should seek to improve his use/creation of patterns, and could benefit from all of the book.

    Just my $2e-2.

  • I found the reviewer's bias toward languages with built-in garbage collection a little annoying.

    The one thing I find myself wishing for most often with garbage collection is a feature that is available in most Lisp implementations: a function to explicitly trigger the garbage collector. This gives me all the power that I usually want in a language with explicit storage allocation and deallocation. I often just want the ability to say when it is convenient. That is a bit of knowledge that programmers almost always have and the runtime library frequently doesn't. So right before a big task that is going to take a while and chew up memory, or right after a big response gets handed to the user:

    (garbage-collect) ; Emacs Lisp dialect
  • Oh, you must be joking about Singletons being mere replacements for global variables... in projects I've worked on recently (such as Litestep), the use of global variables has made a great deal of the code completely unportable. We've had to throw out massive amounts of work because there simply was no way to disconnect the code from the global variables. Globals make code far more coupled than it ever needs to be, and certainly do not belong in a solid OO application.

    The concerns about 'delete this' are certainly understandable, but generally unfounded. In well-behaved code, an object will not delete itself unless all clients have released their references to it, just as in a garbage-collected environment. Certainly a pointer to the object may reside on the stack, but there's no harm done in passing or manipulating a pointer value with nothing attached to it. As long as nobody uses the original object anymore, there's not a problem.
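
    In code, the well-behaved form looks something like this COM-style sketch (RefCounted is a made-up name):

        class RefCounted {
            int refs;
        public:
            RefCounted() : refs(1) {}
            void addRef() { ++refs; }
            void release() {
                // "delete this" is safe here: every instance came from
                // new, and a zero count means no client still holds it.
                if (--refs == 0)
                    delete this;
            }
        protected:
            // A protected destructor makes stack instances a compile
            // error, so "delete this" can never hit a stack object.
            virtual ~RefCounted() {}
        };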

    The memory management issues under C++ are certainly heinous, I agree. C++ with garbage collection would likely prove to be the most useful OO language around...but Java's getting close to that anyway. Following certain rules for memory allocation and deallocation in C++ can usually iron out the troubles.

  • Equally frank: Bullshit.

    First of all, if you are having to use reference counts, then you've probably screwed up the design of your memory management (there are cases where they're useful, but not many).

    But moving on: it's trivially provable that garbage collection is slower, because you have the additional overhead of tracking references on EVERY allocation, versus manual memory management, where you don't (in a properly designed system, that is).

    But that's not even the full story. The other problem with GC is that suddenly everything is allocated using dynamic allocation, rather than being able to store temporary objects on the stack. Dynamic allocation is WAY slower, and creates significant problems with memory fragmentation.

    The other "human" factor I could bring up is that "allocation abuse" happens much more often in languages with GC, because it's so easy to create/destroy and throw around objects/memory willy-nilly. Unfortunately, there is such a buffer zone between the programmer and "what's really going on" that they don't the realize the inefficiency of what they're doing.

    Now, this is not to say that I'm always against GC. In many cases, the inefficiency is worth it to gain the advantage of fewer memory allocation bugs. For example, in embedded systems this is a great thing.

    If you haven't finished smoking whatever you're smoking, perhaps you'd care to give an example of this "magic" garbage collection that is faster than straight malloc/free or new/delete?



  • The reviewer was just using the GC issue as an example of how patterns in the book tended to be C++-centric. This is a problem in the pattern community in general - patterns are presented as language-independent chunks of OO design knowledge, but in reality most of them are just the standard ways of getting around limitations of popular OO languages (which tend towards the lowest common denominator, like most popular things).

    Don't just take my word for it, go check out Design Patterns in Dynamic Programming [norvig.com] in which Peter Norvig shows that 16 of 23 patterns in the canonical Design Patterns book are unnecessary or much simpler in Common Lisp due to the availability of multiple dispatch, first-class types, and real macros.
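
    To put the point in C++ terms (my example, not Norvig's): a classic Strategy needs an interface plus a class per algorithm, while a language with first-class functions needs only a value.

        #include <cstdio>
        #include <string>

        std::string hexFn(int x) {               // a plain function
            char buf[16];
            std::sprintf(buf, "%x", static_cast<unsigned>(x));
            return std::string(buf);
        }

        // Classic Strategy: an interface plus one class per algorithm.
        struct Formatter {
            virtual ~Formatter() {}
            virtual std::string format(int x) const = 0;
        };
        struct HexFormatter : Formatter {
            std::string format(int x) const { return hexFn(x); }
        };

        // With first-class functions, the "pattern" collapses to a value.
        typedef std::string (*FormatFn)(int);
        std::string report(int x, FormatFn fmt) { return fmt(x); }

        int main() {
            HexFormatter h;                              // pattern machinery
            std::printf("%s\n", h.format(255).c_str());
            std::printf("%s\n", report(255, hexFn).c_str());  // just a value
        }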


  • <I>If you compare that to software engineering, we are still in the early years of last century.</I>

    Do you mean the beginning of the 20th century, or the beginning of the 19th century?

    I ask because 2000 is taken either as the first year of the 21st century (by popular belief) or as the last year of the 20th century (because of the way the calendar was constructed). So when you say "the beginning of the last century", we don't know whether you are taking the century's definition from popular belief or from calendrical fact.
  • by Anonymous Coward
    There have been studies showing that most programmers do worse doing their own memory management.
    One good example is
    Zorn, B. The Measured Cost of Conservative Garbage Collection. Technical Report CU-CS-573-92. University of Colorado at Boulder, 1992.
    I think you missed the point made by the previous poster: yes, GC improves productivity, because it removes the burden of memory management from the programmer. However, there are classes of problems where speed and/or efficiency is part of the problem space; in other words, where the solution does not only need to be correct, but the implementation also needs to run as quickly as possible (a good example would be writing the GC for a high-level language :-) ). Once you reach that point, if you understand that you're increasing development complexity (doing your own memory management) and development time (reviewing and testing to make sure you did it right) in order to meet performance requirements, then not using a GC is just a rational choice.
  • Too much of this "pattern" stuff is an attempt to codify workarounds for language problems. There are parts of C++ that are just wrong, mostly for legacy reasons. (It's not so much C legacy any more; C++ has its own legacy problems. For example, "this" ought to be a reference, but references went in too late.)

    Even worse is the "standard template library". That has become a playground for language fanatics. People are trying to implement LISP lambda-expressions using C++ templates, which is an example of using the wrong tool to implement a mediocre idea. The STL has resulted in a number of programming paradigms that are very complicated, bug-prone, and involve coordination of items far apart in the source text.

    "Patterns" are being promoted as an answer to some of these problems. Just follow the ritual, avoid the taboos, and it will work. Don't try to understand why.

    Anyone remember how the Symbolics LISP gurus messed up Common LISP, by insisting that all the crap their specialized hardware could support should go into the language and libraries? The STL crowd is doing that to C++. It's not good.

    Humor item of the day: The Symbolics machines had the "MIT Space Cadet" keyboard, with a SHIFT key, a CNTL key, a TOP key, a META key, a SUPER key, and a HYPER key, all used as shift keys and all usable together. This came from a joke attempt at MIT to outdo Stanford, where the SAIL machine had TOP and META. Symbolics shipped that keyboard standard to customers. Symbolics [symbolics.com] is out of business.

  • Samrobb has already said it well, but I'll just add that you don't really define "do worse" in your statement.

    If you are talking bug-free code, then clearly GC is a big win. If you're talking fastest code and smallest memory footprint, then GC is a big lose.



  • by Anonymous Coward

    Sure, it's called generational garbage collection. It relies on the fact that in practice, most objects are very short-lived, and that a small fraction have longer lifetimes.

    The memory is broken up into several (usually 3 or 4) sections, called "generations". When an object is first created, it is created in the generation called the "nursery". Each time a sweep of a generation is done, the surviving objects are moved from the generation they currently reside in to the next older one. And sweeps of the older generations are done less frequently than those of the younger.

    For functional and OO languages (which tend to have high object-creation rates) a well-tuned generational gc has performance that blows away malloc()/free() and is on par with stack allocation.

    This is because the generational gc gives you enormously better locality of reference than say malloc can. Since almost all objects are short-lived, most will spend their entire lifetimes in the nursery, without ever being moved into the older generations, and the copying of long-lived objects to older generations prevents the nursery from becoming highly fragmented.

    As a result, for the common case there are many fewer cache misses, and thus fewer pipeline stalls during execution. Hence: much better performance than with manual memory management. It works well enough that you can afford to heap-allocate even things like execution frames, and thus cheaply implement things like restartable exceptions and first-class continuations.
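
    A toy illustration of why nursery allocation is cheap (this is only the allocation half; a real collector also scavenges and promotes):

        #include <cstddef>

        // Toy nursery: an allocation is one pointer bump plus a bounds
        // check - essentially the same work as pushing on the stack.
        static char        nursery[1 << 20];
        static std::size_t used = 0;

        void* nurseryAlloc(std::size_t n) {
            if (used + n > sizeof(nursery))
                return 0;        // a real GC would collect/promote here
            void* p = nursery + used;
            used += n;
            return p;
        }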

    HTH. HAND.

  • It sounds interesting, but where has this been demonstrated in real world implementations?

    Since objects can be moved around by the memory manager, you have to have "handles" with double pointers, locking, all kinds of nonsense.

    It sounds good in theory, but I would be interested to know if it's ever been used in a real application outside the lab.



  • Hello again. You quoted my comment above questioning your knowledge of DP. Frankly, I don't question your coding cred, but I think you just didn't get Vlissides's book. This book tried to show how the stuff they discussed in the GoF fits together, and frankly I reckon John did a very good job of that.

    As the title implies, the book was meant to show how the patterns are applied. When reviewing it, you only managed to point out several minor issues, which were actually well rebutted by the comments here - especially the delete this stuff. I still don't buy your argument that it's evil all the time, every time. The garbage collection argument was commented on by somebody else and needs no repeating. And even though you didn't find anything fundamentally wrong with the book, you criticized it on the basis of those minor issues that were not to your liking.

    I replied to your review quite compulsively because I reckon that you have done a very bad thing here. You discouraged the purchase of a book that could otherwise enlighten a lot of people about proper code design. Listen: patterns are not some fancy academic theory. They work in practice. People apply them on a daily basis (consciously or otherwise) to solve real programming issues. Pattern Hatching is one of the more easily digestible patterns books, but your silly review may deter a lot of readers from this book, and possibly from the topic of patterns altogether. Again, I think your criticism of this book was quite groundless, and you got the comments you deserved.

    Thank you.

  • What you're missing is that the phenomenon you're referring to is simply the maturing of the software engineering discipline. EVERY engineering discipline has gone through this. Look at typical hi-tech consumer products today. You have cell phones and PDAs, devices that take tens or hundreds of people to design and build. A few hundred years ago, watchmaking was probably considered hi-tech, and could be done by one person. The Wright brothers built an airplane, but now you have thousands of people working on space shuttles, and you have to deal with safety standards and other "bureaucratic gobbledygook", and there's no way one person can do the job. Software engineering is still a very young industry, which is just beginning to mature. In 20 years, I think you'll be wondering how we ever managed to produce large-scale software at all in the 20th century with such primitive processes.
  • I find Linux geeks' bias towards operating systems with preemptive multitasking a little annoying. For some projects it's appropriate to use an operating system with preemptive multitasking (trivial GUI wrappers), and sometimes it's not (realtime IO). While linux geeks may be "amused" by the need to explicitly hand off control to other threads, they should understand that it's generally far more efficient.

    Unfortunately, I think many Linux geeks don't have enough experience with realtime IO programming to understand these trade-offs. Until they do, they should resist making judgements.

    P.S. This is not to defend MacOS as an operating system...
  • I agree. I have had a lot of success developing solid OO designs simply by analyzing the problem and breaking it down into subproblems that can be solved by existing patterns. Then it's usually just a matter of implementing the specific details. I no longer tear my hair out trying to decide what the best way is to do something - the best solution always involves one or more of the GoF patterns, so I just go through them until I find the one that fits. It is remarkably simple and elegant.

    Now, I AM simplifying things a LITTLE bit. It's essential to have some experience, and a sound understanding of the principles of OO design. You can't just pull Joe Programmer off his terminal and turn him into a great designer instantly by supplying him a copy of the GoF book. But it's certainly a great start. It was after reading that book that I really started to fully understand what OO was all about. Buying "Design Patterns: Elements of Reusable Object Oriented Software" is probably the best $50 I've ever spent. I probably will sound like the biggest geek ever by saying it, but that book changed my life. :)
  • Note that you don't need to use "handles", because in all of these languages there's no such thing as explicit pointers, so it's possible for the memory manager to know where all the pointers are and update them transparently.

    Well, OK, maybe you're not using handles at the language level, but so what? There is still all that logic under the hood, and that is a big performance drain.

    All the languages you have named (including (notoriously) Java) are not practical for any sort of large-scale application where performance is a concern. You haven't really given any evidence to support that GC is practical for a performance-critical major application.

    In fact, I asked if this GC method had ever left the lab, and it could be argued that Lisp and Smalltalk have never left the lab. :)



  • I don't know what you're talking about with all this static initialization whatchamacalit. My compiler has no such limitations. Oh, I see, you are talking only about C++. Well, you should have said so :-).

    [I agree that the reviewer is off his rocker about delete this, though.]
  • I think you make an interesting point. Given how many people defend explicit, manual memory allocation/deallocation because of its efficiency over garbage collection, it seems strange that so many of the same people aren't rabidly defending explicit, manual processor allocation/deallocation. They seem very similar to me. Letting the processor take care of context switching usually ends up making less sense than letting the compiler/runtime take care of memory deallocation, at least in the real-time programs I write for a living.

    I'm not sure the problem is with Linux geeks, so much as with /. geeks. Given the preponderance of posts about PHP and Perl and Apache and whatnot it is hard to escape the impression that many posters on /. don't spend much time doing hard real time programming.

    Oh well, I guess it just means that in the same way the Unix fascists lord their superiority over Windows peons, real time programmers can lord their superiority over the lowly web hackers that seem to populate /. ;-)
  • A couple of weeks ago John Vlissides took part in a debate at the POPL conference in Boston (Principles of Programming Languages), and although the academic community favours functional languages and is thus biased, a strong view was expressed that design patterns are prompted by the lack of appropriate language features (tools).
    Alternatively, you could view it as "getting round" the disadvantages of the object-oriented programming paradigm.
    Although I do not expect there to be any functional programmers around on /., I would be interested to hear arguments for straight C, i.e. procedural programming.
    Thanks
  • Just out of curiosity, which side are you on? :)



  • Agreed, to an extent.

    However, "goto" used improperly will cause your code to spaghetti. Very rarely will it cause memory corruption.

    "delete this" will certainly cause memory corruption if the object is allocated on the stack. The more special rules you impose on an object's use, the more likely it will be that those rules will be broken.

    Just my $0.02.
    Educational sig-line: Choose rhymes with lose. Chose rhymes with goes. Loose rhymes with goose.
  • No, I mean you don't have handles at the implementation level. And the advantage of putting that logic into the compiler is that you only have to get it right once, by the compiler authors. This is sometimes called code reuse. ;)

    Well, I suppose you could keep a list of all the references to the object, and update all of them when the object moves. That does mean you need all sorts of locking nonsense if you want preemptive multithreading.

    I must admit you've given some food for thought about whether a practical GC could be done. I suppose the logical question is why hasn't it been done for a C++ compiler? If ever a language cried out for GC, it's C++.

    Common Lisp is very much a practical language...

    Well, I've never programmed a phone switch, so I can't say about that one. And I'm sure it may have snuck in as a scripting language in a game or two, and maybe some rogue programmer did something in an operating system.

    But the question is whether it should have been. You can't deny that C is by far the most dominant language for doing high-performance system programming (leaving out numerical programming in FORTRAN), and there is a reason for it.

    As for AI, don't cite that to me as a "practical" application, or I'll laugh at you. (AI is to Science as "1,2,3,Many" is to Mathematics).



  • I think the hope is not that you won't have to think, but that what you think about moves on to other things. You probably don't worry about where your next meal will come from; you're pretty comfortable going to the grocery store every week. You also probably don't think about how to divide two binary numbers; someone else has thought about that a lot, and you can use the fruits of their labors. You still have to know what not to eat and what kinds of things go together, but you have less to worry about.

    Patterns always take some thought to apply; most of them will tell you the situations in which they will not be very good. You need to think about whether what you have learned is appropriate.

    You'd also never say "in what pattern would (this problem) fit". You'd say, "I have a problem; what patterns could help me solve it?" It takes thought.
  • Interesting comment. Check out the new extensions to the Haskell type system, though - some of these are pretty powerful. The disadvantage is that it takes quite a lot of type-systems background to understand and program with these type systems. I do not want to start the typical comp.lang.functional static-vs-dynamic type checking debate, but it's worth keeping in mind the benefits of both.
    Cayenne is also an interesting language to look at.
    Coming back to design patterns, though - the opinion that design patterns are nasty hacks is worth spreading; it will yield better languages, better tools, more productivity and domain-specific languages... After all, we don't want to program *everything* in C. (ooh, controversial here on /.) Looking forward to your reply, aas23
  • Anyone ever see how much code is in a Linux kernel now?

    Perhaps you would prefer the Windows family of Object-Oriented operating systems?

  • Runs the garbage collector.

    Calling the gc method suggests that the Java Virtual Machine expend effort toward recycling unused objects in order to make the memory they currently occupy available for quick reuse. When control returns from the method call, the Java Virtual Machine has made a best effort to reclaim space from all discarded objects.
    --
    Whether you think that you can, or that you can't, you are usually right.

  • C++ is hobbled by C pointers. You *can* get a C++ garbage collector, but it only works provided that you don't do evil things with the pointers. This is also the reason why Fortran can do much more loop unrolling etc.
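
    For example, the Boehm-Demers-Weiser conservative collector bolts GC onto C/C++ today (assuming the library is installed; link with -lgc):

        #include <gc.h>        // Boehm-Demers-Weiser collector
        #include <stdio.h>

        int main() {
            GC_INIT();
            for (int i = 0; i < 100000; ++i) {
                // Never freed explicitly; the collector reclaims the
                // block once no reachable pointer refers to it.
                int* p = (int*)GC_MALLOC(16 * sizeof(int));
                p[0] = i;
            }
            printf("GC heap: %lu bytes\n",
                   (unsigned long)GC_get_heap_size());
            return 0;
        }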
    --
    Whether you think that you can, or that you can't, you are usually right.
  • Smalltalk has been using this for years, if not decades. If you look at the way Java has been developing, you'll see a lot of similarity to what Smalltalk has had for a long time. Generational garbage collection just makes sense (once you see how it works - the hallmark of a great idea).
  • I'll let you try and work that one out for yourself.
  • 'hobbled'? lol
    If a programmer understands the environment he's programming in, he shouldn't be hobbled by anything...
    Pointers don't cause any problems if you're careful, experienced, and well-planned.
    And nobody would want any haphazard, inexperienced, or poorly-planned programmers anyway...
  • Depends on the situation, naturally. :-)

    I think that garbage collection is like many other easy-features. It usually makes life easier, but there are always going to be cases where it makes life impossible. The classic example of such an easy-feature is, of course, the concept of an HLL itself. I'd much rather spend my time working in C than assembler, but when trying to write a kernel, a fast blitter, certain drivers, and that sort of thing, there's really no choice - one must use assembly.

    I think automatic garbage collection is a bit like that. Most of the time, it does make life easier - it's one less thing to worry about and one less thing to get wrong, leaving that much more time to focus on whatever it is that makes the program worthwhile. But there are, and will always be, situations in which you just can't get there from here without manual allocation and deallocation with no interference.

    Which side am I on? Both, naturally. :-) If I were writing (say) a RAID driver, I sure wouldn't want to worry about a memory manager kicking in and destroying my realtime video stream. Same for just about anything where speed is critical. But if I were writing a web browser, a word processor, a graphic editor - garbage collection would be a godsend. So I want a hybrid system where I can allocate and deallocate at will, or let the memory manager clean up for me, or perhaps both at once. And nothing says it all has to be done in one language.

    -Mars
  • This is what I love about many of you people: You're so classy and gracious.

    Unlike you, I was not born knowing all human knowledge, and I even occasionally like to learn something. My ego is not so fragile that I can't admit when I learn something.

    My "pontification" on the performance is based on actual experience. There are a lot more factors than just performance that come into play when using a GC language. I find it interesting that there are so potential solutions to the historical performance problems, but this does not mean that it's appropriate for all problems.

    I might also point out that I am FAR from convinced that using Lisp or Smalltalk (which have notoriously bad performance compared to lightweight languages such as C) as an example is proof of anything.

    The fact that you think these problems are so simple that you just "RTFM" instead of actually using your brain and experience tells me you have very little of either.



  • Other than granting that GC could utilize stack allocation in some cases (my second statement), what is wrong with my first statement? GC will be slower than manual allocation, because there is overhead in doing "something" (tracking references) rather than "nothing" (not having to track references).

    The burden of proof is on you to demonstrate how a method of GC can be as fast or faster than manual allocation.

    Incidentally, I'll even grant you that the reason Smalltalk and Lisp are slower relative to C is not entirely GC, but simply the nature of the languages. Still, that does not prove that GC is as fast as manual memory management.

    The other thing is that I feel I'm getting a bad rap here. I'm not against GC in languages, but there are cases where you don't want the unpredictability of GC and manual memory management makes sense. Are you trying to say that GC is appropriate for all problems?



  • Stop "lying"? Look up lying in the dictionary. Apparently you don't know what it means.

    Oh, OK. In my endless quest to educate the unwashed like yourself, let me help you out. "Lying" is the intentional attempt to mislead another person by putting forth knowingly untrue statements. In my case, assuming I don't know what I'm talking about, I would be "ignorant", since I'm arguing from an incorrect set of facts.

    So the proper sentence would be "We'll be 'gracious' when you stop being ignorant". But even that has problems. First of all: "We'll". You look foolish when you presume to speak for others. It would be better to say "I'll be 'gracious' when you stop being ignorant."

    Better. But still not correct. The statement is rather contradictory, since the time to be gracious is when educating somebody out of ignorance. But I don't think I can fix that personality problem of yours in this post. Maybe next time.



  • You said that it would be "trivial" to prove that GC is slower than hand management. If it's trivial, it should be easy for someone of your broad knowledge of computer science to provide the proof. If you're going to make that kind of statement, the burden of proof is on you.

    You're not paying attention. I gave you the proof. Someone with your awesome abilities should be able to blow it down in one whiff.

    CMU floating point benchmarks

    LOL! A prime example of "lies, damn lies, and benchmarks". A trivial floating point benchmark proves absolutely nothing. Of course that type of problem is going to lend itself well to compiler optimization. I could write similar things in APL, and it would probably be very fast, but that would say nothing about APL's viability for general problem domains.

    You would be a bit more credible if you gave me a non-trivial program written in both languages. Say, a C++ compiler and optimizer. Or how about a Photoshop-like graphics manipulation tool. Or heck, how about a version of Quake in Lisp?

    What is your theory on why people don't use Lisp for any non-trivial project? Is my apparent level of ignorance so rampant in the industry?

    You must be in academia.

    Quote from Harlequin: "By supporting a variety of manual policies and tuning, Harlequin delivers significantly better performance than can be achieved with other vendors' existing 'one size fits all' allocators."

    Apparently they suffer under the same delusions that I do.


