
Has Software Development Improved? 848

earnest_deyoung asks: "Twenty-five years ago Frederick Brooks laid out a vision of the future of software engineering in "No Silver Bullet." At the time he thought improvements in the process of software creation were most likely to come from object-oriented programming, off-the-shelf components, rapid prototyping, and cultivation of truly great designers. I've found postings on /. where people tout all sorts of design tools, from languages like Java, Perl, Python, Ruby, and Smalltalk to design aids and processes like UML and eXtreme Programming. I'm in a Computer Science degree program, and I keep wondering what "improvements" over the last quarter century have actually brought progress on the key issue: more quickly and more inexpensively developing software that's more reliable?"
This discussion has been archived. No new comments can be posted.

  • by tres3 ( 594716 ) on Tuesday November 26, 2002 @10:37AM (#4758461) Homepage
    Open and free source and collaboration will become the ruling way that software is developed in all but the most vertical markets: things like controlling the amount of radiation released from a medical device, that sort of thing. Everything that has a large consumer base will shift to open collaboration on free software with all of the freedoms still intact.
  • Functional languages (Score:5, Interesting)

    by nick255 ( 139962 ) on Tuesday November 26, 2002 @10:37AM (#4758467)
    Not too sure if it's an improvement, but I know some people use languages like ML [napier.ac.uk] in which programs can be proven to work. Of course, if you actually want to write a program which *does* something, it is probably not for you.
  • by vasqzr ( 619165 ) <vasqzr@noSpaM.netscape.net> on Tuesday November 26, 2002 @10:37AM (#4758468)


    I can say I honestly don't like Java.

    Nowadays we've got great tools like Flash, scripting languages like VB Script, and markup languages like HTML.

    Not every programmer these days is an old COBOL nerd, ASM coder, or C junkie.

    I yearn for the days when Borland was great. The Turbo C++ and Turbo Pascal products probably got half of the programmers of the '80s and early '90s started.

  • While I'm only in my mid 20's and I'm no veteran by any stretch, it seems like there have been huge leaps in programmer productivity made possible by things like OOP and off-the-shelf components.

    However, I think they're equally balanced out by even greater demands on programmers. Once it's realized that a programmer can do 2, 3, or 10 times as much work by using more efficient methods, management is quick to pile on 2, 3, or 10 times as much work!

    This isn't really unique to programming either. I think it's universally applicable to any area where technology permits greater productivity.

    For example, look at all those ads from the 50's. Things like the microwave, the vacuum, and the dishwasher were supposed to usher in a new era of leisure. Do we have more leisure? No, we have less, as those luxuries become necessities and we cram more activities into our new-found time in order to stay competitive.
  • The Software Process (Score:2, Interesting)

    by vikstar ( 615372 ) on Tuesday November 26, 2002 @10:41AM (#4758494) Journal
    Probably the biggest improvement is due to the creation of software processes, whether it is the legacy waterfall or the latest XP.
  • by exhilaration ( 587191 ) on Tuesday November 26, 2002 @10:42AM (#4758499)

    I'm in a Computer Science degree program

    If you're in the U.S., GET OUT WHILE YOU STILL CAN - the tech job market (and the economy in general) are in horrible shape. My friends are coming out of college with CS/IS/MIS degrees and finding NOTHING!

    And if you stay in CS, may God have mercy on your soul.

  • by MosesJones ( 55544 ) on Tuesday November 26, 2002 @10:45AM (#4758526) Homepage
    Now I'm sure that some people out there will rave about how great XP is, but reading The Mythical Man-Month and working on any large, or even medium-scale, project with a long-term life-span will tell you that while some elements of XP are good, those are the ones that existed before.

    1) Write your test cases up front... this is ages old, and XP isn't as rigorous as approaches that also say "and make sure other people write them".

    2) Pair Programming works for two people of equal ability. The two-headed surgical team from The Mythical Man-Month is a much more effective way of using two heads.
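    Point 1 above, tests written before the code exists, can be sketched in a few lines; the function and its behaviour here are invented purely for illustration:

```python
import unittest

# The tests are written first, before parse_price exists.
class TestParsePrice(unittest.TestCase):
    def test_parses_dollars_and_cents(self):
        self.assertEqual(parse_price("$3.50"), 350)

    def test_rejects_garbage(self):
        self.assertRaises(ValueError, parse_price, "free!")

# The implementation is written second, until the tests pass.
def parse_price(text):
    """Convert a string like '$3.50' to an integer number of cents."""
    if not text.startswith("$"):
        raise ValueError("expected a leading '$'")
    dollars, _, cents = text[1:].partition(".")
    if not (dollars.isdigit() and cents.isdigit()):
        raise ValueError("not a price")
    return int(dollars) * 100 + int(cents)
```

    Run with `python -m unittest` to see both tests pass; the point is the ordering, not the parser.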

    Basically things like XP sum up how long computing has to go to become an engineering discipline. In every other engineering subject there are critical elements:

    Requirements
    Design
    Testing and approval of design
    Implementation
    Testing of implementation (throughout implementation)
    Delivery
    Maintenance

    For a construction project all of these elements are mapped out well in advance, which is why the construction industry can work on lower margins.

    Becoming better requires not a "Silver Bullet"; as Brooks says, the technology won't make the improvement. It's actually about people applying the rules _rather_ than looking for the Silver Bullet. Some projects succeed, others fail; there are reasons for the failures and the successes. But rarely do we learn from either.

    XP is the embodiment of the non-engineering approach to computing that pervades this marketplace: the idea that you can build it wrong and change it later, skip design in favour of "code and check", and have a unit test written by a bad coder to check his own bad code.

    Brooks is right. At the end of the day, computing success comes down to soft skills allied to technical talent.

    If you have 10 brilliant people leading 100 average people... fire the 100 and support the 10 to do the delivery effectively. Make sure they follow a process, and make sure that the requirements are defined and change as little as possible. Make sure designs are verified, make sure code is reviewed.

    Sure, it's less exciting than "just coding", but in the end it takes less time, costs less to maintain, and delivers what the customer wants.

    Engineering is a discipline; XP is just glorified hacking. Only by becoming disciplined will software improve.
  • by fungus ( 37425 ) on Tuesday November 26, 2002 @10:46AM (#4758530)
    You must be kidding.

    Of course it is now easier to create software than before.

    First of all, source management software wasn't widely available 25 years ago. Try creating a huge piece of software without any way to roll back changes, share the same source tree with other developers, etc... (cvs/sourcesafe/starteam/etc)

    Second, profiling tools. Hey, you want to know where that memory leak is? Where that CPU bottleneck is? That was pretty hard to do when you were coding in COBOL many years ago... Doing the same is way easier now with OptimizeIt and tools like that.

    I could go on and on but I must leave for work =)
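    The profiler workflow the parent describes exists in most runtimes today; a minimal sketch with Python's built-in cProfile (standing in here for OptimizeIt-style tools), profiling a deliberately wasteful function:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately wasteful: allocates a one-element list per iteration.
    total = 0
    for i in range(n):
        total += sum([i])
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_sum(10_000)
profiler.disable()

# Report the hottest functions by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

    The report pinpoints slow_sum and the built-in sum as the time sinks, which is exactly the "where is the bottleneck?" question answered without guesswork.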
  • One day... (Score:5, Interesting)

    by pubjames ( 468013 ) on Tuesday November 26, 2002 @10:46AM (#4758533)

    I've always believed that one day some bright college kid is going to come up with a completely different style of computer language and interface, and when that happens we will all slap our heads and go "D'oh! So that's how we should be doing it! Obvious!"

    Like the web and P2P, the most influential ideas are often quite simple, and "obvious".
  • Here's a thought... (Score:3, Interesting)

    by fireboy1919 ( 257783 ) <rustyp AT freeshell DOT org> on Tuesday November 26, 2002 @10:48AM (#4758554) Homepage Journal
    Off the shelf components have helped a LOT.

    perl is ugly to code in, and perl OOP is obviously a hack. I had a graphics/OOP professor who said, "nobody likes to program in perl, but it gets the job done." Obviously he lives in the land of language theory, where perl doesn't, but it gets the idea across...

    perl gets the job done because of its massive collection of components.

    I think I'd go further to say that the big improvement there is in repositories where you can get massive collections of components, as there have been languages like perl (in terms of having lots of stuff - PL/1 comes to mind) in the past.

    Places like CPAN, Sourceforge and Freshmeat really make the difference. So ultimately, the internet is the means through which software development has sped up (at least when you're not talking about RAD-GUI development, which is another thing entirely).
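    The parent's point about component repositories echoes what standard libraries already show; a small sketch (Python's csv module standing in for a CPAN-style component) of what reuse buys over a hand-rolled parser:

```python
import csv
import io

# A quoted field with an embedded comma: exactly the case
# that hand-rolled split(",") parsers get wrong.
raw = 'name,title\n"Brooks, Fred","The Mythical Man-Month"\n'
record = raw.splitlines()[1]

naive = record.split(",")                     # wrongly splits inside quotes
proper = next(csv.reader(io.StringIO(record)))  # the shared component gets it right

print(len(naive), len(proper))  # 3 2
```

    The hand-rolled version yields three mangled fields; the off-the-shelf component yields the two intended ones. Multiply that by every edge case a mature component already handles.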
  • by voudras ( 105736 ) <voudras@sw3.1415 ... yer.org minus pi> on Tuesday November 26, 2002 @10:49AM (#4758559)
    I've been using CVS for some time now - even bought a _great_book_ [slashdot.org] to assist me in understanding it better.

    I was recently discussing this sort of thing with some friends and got into what I would love to start as a project - something to the effect of "fvs" or function versioning system - which would allow me to keep tabs on "just-a-box" functions which i use throughout my programs.

    I think any programmer who sees the benefits of CVS would understand where I'm going with this concept. We all have functions we use again and again, and realizing that there is a potential flaw in a given function at one point is always followed by exasperation, because one realizes that the function needs to be changed in X number of programs.
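    A rough sketch of the "fvs" idea; everything here (the fingerprint scheme, the function names) is invented for illustration. Fingerprinting each copy of a shared function lets you detect which programs carry a stale version:

```python
import hashlib

def fingerprint(func):
    """Hash a function's compiled body so copies can be compared across projects."""
    code = func.__code__
    blob = code.co_code + repr(code.co_consts).encode()
    return hashlib.sha1(blob).hexdigest()[:12]

# The same helper pasted into three "programs"; one copy has drifted.
def slugify(text):
    return "-".join(text.lower().split())

def slugify_copy(text):
    return "-".join(text.lower().split())

def slugify_old(text):
    return "_".join(text.lower().split())

# Matching fingerprints mean the copies are in sync; a mismatch means
# some program is carrying a stale version of the shared function.
print(fingerprint(slugify) == fingerprint(slugify_copy))  # True
print(fingerprint(slugify) == fingerprint(slugify_old))   # False
```

    A real function versioning system would track these fingerprints in a registry and flag every program whose copy no longer matches the canonical one.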

  • Object Technology.. (Score:3, Interesting)

    by eastshores ( 459180 ) on Tuesday November 26, 2002 @10:51AM (#4758579)
    In my opinion, object technology has done more to allow developers to re-use effort than any other paradigm in development. That isn't to say that the potential of OOP has even been approached. It is rare to find large development shops that have a consistent base of knowledgeable OOP developers, and it isn't always necessary; it is very important, though, to have OOP concepts influence the architecture of any system that maintains re-use as a design objective.

    Design patterns also play an important role in allowing for a given design to be re-used, consumed, whatever the case may be. OOP related technologies such as UML, Corba, and now many XML based solutions are beginning to mature the field.

    I am not as experienced as I would like to be with OOP, but I can say that I have been in the procedural world long enough to realize that there seems to be a divine power in OOP. It makes you *think* entirely differently about problems, breaking a very large, very complex problem into approachable components, allowing not only a single developer to build more complex systems, but systems that are well suited for re-use in future or existing systems.
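    That decomposition can be sketched in a few lines; the billing domain and names here are invented. Each class owns one small responsibility, and callers compose them rather than manipulating raw data:

```python
class LineItem:
    """One charge on an invoice, in integer cents to avoid float rounding."""
    def __init__(self, description, cents):
        self.description = description
        self.cents = cents

class Invoice:
    """Knows how to total itself; callers never touch the item list directly."""
    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def total_cents(self):
        return sum(item.cents for item in self._items)

invoice = Invoice()
invoice.add(LineItem("consulting", 150_00))
invoice.add(LineItem("support", 25_00))
print(invoice.total_cents())  # 17500
```

    The payoff is that a tax rule or a discount policy can later be added as another small object without rewriting the callers.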
  • Observations (Score:4, Interesting)

    by GeckoFood ( 585211 ) <geckofood@nosPAM.gmail.com> on Tuesday November 26, 2002 @10:53AM (#4758596) Journal

    What I have observed in the course of software development, in various companies, is that management dictates that "thou shalt follow said process" and there will be the obligatory design reviews, spiffy diagrams, and all the huffing and puffing. But when the smoke clears, it still comes down to 9 months of work compressed into 3 months of actual work time, and everyone shifts into a hack 'n slash mode. The processes in place fail because of a lack of adequate time and the inflexibility of deadlines.

  • by wiredog ( 43288 ) on Tuesday November 26, 2002 @10:55AM (#4758614) Journal
    What sort of programming do you do? I've done machine tool programs and XML-to-Oracle data converters, and I spend more time developing algorithms and data structures than I do writing code.
  • Patterns... (Score:2, Interesting)

    by Alphaloc ( 227011 ) on Tuesday November 26, 2002 @10:59AM (#4758641)
    have made my OO life a lot easier, providing developers with a basic terminology that all other engineering disciplines already had, and making it possible to quickly discuss what solution you're going to use instead of how you might try to solve a problem. And those patterns prove to be a nice example of how to think "out of the box" and how even the hardest problems can and should have a nice design.
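    As a sketch of what that shared vocabulary buys: say "Observer" and the shape below comes to mind without further explanation (the names here are invented):

```python
class Subject:
    """Observer pattern: listeners register callbacks, the subject notifies all."""
    def __init__(self):
        self._observers = []

    def attach(self, callback):
        self._observers.append(callback)

    def notify(self, event):
        for callback in self._observers:
            callback(event)

log = []
build_status = Subject()
build_status.attach(lambda event: log.append(f"email: {event}"))
build_status.attach(lambda event: log.append(f"pager: {event}"))
build_status.notify("tests failed")
print(log)  # ['email: tests failed', 'pager: tests failed']
```

    In a design discussion, "make the build server a Subject and the notifiers Observers" communicates all of the above in one sentence.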
  • by sphealey ( 2855 ) on Tuesday November 26, 2002 @10:59AM (#4758646)
    While I'm only in my mid 20's and I'm no veteran by any stretch, it seems like there have been huge leaps in programmer productivity made possible by things like OOP and off-the-shelf components.
    Hmmm. In 1980 we had ASK ManMan. Written in COBOL and FORTRAN. A full distribution (IIRC) was around 5 megabytes, which really put a strain on available storage. The function of ManMan was to provide accounting and manufacturing management support for manufacturing companies. It performed this function very well, and some orgs out there are still using the 1980s versions.

    Today we have JDEdwards OneWorld. Written in C++ and other state-of-the-art tools. The distribution runs about 10 GB, with a working set for development purposes of 1.5 GB. Its function? To provide accounting and manufacturing management support for manufacturing companies. How well does it do? Is it any better than ManMan? I will leave that for you to decide. But hey - you can cut-and-paste directly from OneWorld to Excel. That's a gain I guess.

    sPh

  • by Annoyed Coward ( 620173 ) on Tuesday November 26, 2002 @11:15AM (#4758792) Homepage Journal
    I would be very cautious with the word improved.

    Software (design and) development is all about passion. Only the mechanisms to express the road to success have changed (and improved). Experts look for species like good software developers, analyse them, and define a new process. And most of them are not wrong.

    There are hardcore believers in the waterfall model, and the same goes for extreme programming. Configuration management is a luxury (and a waste of time, sometimes) for some, and may not be the same for others. The best part is, we have methods suitable for each individual's nature.

  • by bfwebster ( 90513 ) on Tuesday November 26, 2002 @11:18AM (#4758821) Homepage
    Sad to say, there has been little such progress in the last 30 years. One of the things I do for a living is act as an expert witness in litigation [bfwa.com] over failed IT projects. In my research, I reviewed 120 such lawsuits that took place over a 25-year period and found (a) that they all fall into one (or two or three) of half a dozen fact patterns, and (b) the root causes are all the same. (I wrote a white paper summarizing my findings [bfwa.com]). The simple fact: we make the same mistakes over and over again, and these are mistakes that have been well-known and well-documented for 30 years.

    Brooks, in the "No Silver Bullet" essay referenced above, stated that there is both essential and accidental complexity in software development, and because of that there never would be a "silver bullet" to slay the software "monster". However, there are fundamental practices that increase the likelihood of success and fundamental pitfalls that every project faces. And, in the end, the root causes of most failed IT projects are human factors; in fact, you could just cite the "seven deadly sins"--pride, envy, gluttony, lust, anger, greed, sloth--and probably hit the nail on the head.

    In conjunction with that, far, far too many practitioners in the IT field lack one or more of the following:

    • Talent [byte.com]
    • Sufficient (or any) education in software engineering (or even computer science)
    • Any familiarity with the literature from the past 30+ years. I'm not talking about IEEE/ACM Transactions, I'm talking about standard classic works such as _The Mythical Man-Month_ (Fred Brooks), _The Psychology of Computer Programming_ (and everything else by Gerry Weinberg), _Principles of Software Engineering Management_ (Tom Gilb), _Peopleware_ (and everything else by Timothy Lister and/or Tom DeMarco), _Assessment and Control of Software Risks_ (and anything else by Capers Jones), _Death March_ (and anything else by Ed Yourdon), _Journey of the Software Professional_ (Luke Hohmann), and any of the 100 or so texts on software engineering on the bookshelf behind me.

    To quote George Santayana (who is often misquoted):

    Progress, far from consisting of change, depends upon retentiveness...Those who cannot remember the past are condemned to fulfil it.

    Software engineering is hard enough--with all the human issues--without further handicapping ourselves with ignorance of all that has been already discovered and documented. Yet that is exactly what most IT workers do. Until we find a way to solve _that_ problem, the failure rate for IT projects will remain high indeed. ..bruce..

  • Opinion: Yes. (Score:3, Interesting)

    by 4of12 ( 97621 ) on Tuesday November 26, 2002 @11:20AM (#4758834) Homepage Journal

    I think the ingredients you mention have made production of software a better process over the past 25 years. Software applications can do more and can be built more quickly as a result of those improved tools available to developers.

    However, you still see a great deal of unreliable, bloated, and inefficient code, because developers are trying to do much more than they did 25 years ago.

    If all we needed to do was re-create the applications of 25 years ago, then the benefits of new techniques would be more evident. But people demand more and programmers want to create works up to their full personal potential and exceed what is currently possible.

    Ragged-edge software is manifest evidence that we still are constantly crossing the barrier of human potential, that place where what is barely possible becomes what doesn't work. It's a good sign of innovation. And, it provides added impetus to keep trying to find more ways of improving the software development process.

  • by eweu ( 213081 ) on Tuesday November 26, 2002 @11:28AM (#4758905)
    does software really require engineering?

    Yes. Good software does. In my opinion, this is the most significant problem with software today. There are too many "software engineers" that don't follow any sort of engineering principles at all. Quality and documentation are afterthoughts. Design consists of a conversation or two before implementation (coding).

    If this sounds like you, please leave the industry now. We've suffered enough.
  • by BigTom ( 38321 ) on Tuesday November 26, 2002 @11:28AM (#4758911) Homepage
    We don't know enough to do software engineering yet. If materials in the physical world were as poorly understood, and changed as fast as they do in the software world they couldn't do it there either.

    If requirements were as poorly understood and changed as fast in the physical world as they do in the software world construction would cost a fortune and most big buildings would never get finished (or would never be fit for purpose).

    People who say things like "Make sure the requirements don't change" are living in a fantasy world where they want to blame their inability to deliver on someone else.

    The rules haven't changed: get a high-quality small team, get good access to a user who knows what they want, and grow a system from small beginnings, checking at each stage that it all works and that quality is high.

    It's all there in Brooks.

    It's no surprise that the guys pushing the agile methodologies were all very successful designers and developers anyway.
  • Re:Absolutely! (Score:3, Interesting)

    by Black Perl ( 12686 ) on Tuesday November 26, 2002 @11:45AM (#4759070)
    If Perl had an IDE that was as easy to use it would dominate the world. (more than it already does).

    There are many. Others have mentioned Komodo [activestate.com] and the Visual Perl plugin [activestate.com] for Visual Studio.NET.

    There's also Perl Builder [solutionsoft.com], which people rave about but I have not tried. They claim to be the most popular Perl IDE.

    I would also like to add that an Open Source one, Open Perl IDE [sourceforge.net] is decent. I use it at home.

    I use Komodo at work (because it can act as an IDE for other languages like XSLT) and really like the Perl and Regular Expression debuggers.

    -bp
  • by Communomancer ( 8024 ) on Tuesday November 26, 2002 @11:51AM (#4759130)
    A top-notch staff and a world-class leader, I'm guessing, is significantly more expensive than your average software development team. Therefore, it ain't exactly cheap.

    On the other hand, it's _probably_ as cheap as good and fast is gonna get.
  • by Coz ( 178857 ) on Tuesday November 26, 2002 @11:52AM (#4759138) Homepage Journal
    Incredible stuff, few engineers, short amount of time - and absolutely no control over their budgets. Read Ben Rich's biography - the Skunk Works has historically been a money pit. They'd figure out how to do a job, quickly and well, and then turn it over for production, but their price tag was enormous.
  • by Anonymous Coward on Tuesday November 26, 2002 @12:02PM (#4759203)
    You are right that the key to good software is not methodology (though it helps) nor good process (though that helps too). The key is great people (or taking good people and making them great). In software engineering the difference in productivity between an average programmer and a good one can be as much as 1000%, and between a good and a great programmer another 1000%. That means that a great programmer can do more in a week than an average programmer in a year.

    That also means that if you have to pay the great programmer $1000 an hour, then it is a bargain. However, great programmers are much cheaper than that. Usually, pay them third quartile, a fat bonus, as much computer equipment as they want, and give them interesting, challenging work in a nice work environment, and they will do what comes naturally to them.

    One other thing: it is better, IMHO, to get fast programmers than good (quality) programmers. People who produce very high quality code slowly are quite hard to train to become quick at doing so. Fast programmers can generally be easily taught how to produce good quality code without impacting their productivity. (Great programmers usually hack macros to do the boring stuff for them.)

    For our student friend, I would say that if you want to become a great programmer, listen to all they are teaching you, but far more important is to code, code, code. Write lots of programs, way more than the dumb-ass homework assignments they give you. Also, read as much great source code as you can get your hands on. There is lots of it available under the GPL.

    Learn from the masters; practice is the key to a skill like computer programming. It won't matter at all if you forget the seven layers of the ISO communications model, but if you don't know the key idioms of great code to the point that they are muscle memory, you'll never be a great coder.
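    The parent's arithmetic roughly holds together; a back-of-the-envelope check, reading each "1000%" as a 10x gap (an assumption, since the figures above are anecdotal):

```python
avg_to_good = 10            # "1000%" read as a 10x productivity gap
good_to_great = 10
avg_to_great = avg_to_good * good_to_great  # the gaps compound: 100x

working_weeks_per_year = 48
# One week of a 100x programmer, measured in average-programmer years:
years_of_average_work = avg_to_great / working_weeks_per_year
print(avg_to_great, round(years_of_average_work, 1))  # 100 2.1
```

    So under the stated multipliers, a great programmer's week is worth about two average-programmer years, which is consistent with (in fact stronger than) the week-versus-year claim.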

  • not a money pit... (Score:4, Interesting)

    by smagoun ( 546733 ) on Tuesday November 26, 2002 @12:06PM (#4759234) Homepage
    Actually, that isn't correct. I've read Ben's bio, Kelly's bio, etc, and one of the things that struck me was that Kelly insisted on giving money back to the government when a project ran under budget (which happened more than once). Hell, even Have Blue (the precursor to the F-117) only cost $14 million/unit. Compare that to the "low cost" F-16, which is about $20m, or the $185 million "Tacit Blue," which was Northrop's first foray into world of stealth. $14m vs. $185m. Think about that.

    Johnson and Rich were also proponents of having Lockheed engineers maintain their aircraft instead of having military personnel do it. Using Lockheed engineers would save a ton of money because they aren't rotated and therefore don't need to be retrained. Sure, Lockheed would get some money out of the deal, but not as much as the gov't would save.

    The Skunk Works wasn't perfect, but overall they were pretty good with money under Kelly and Ben.

  • Leadership and a bunch of intelligent ambitious developers are all you need.

    They don't have to be the best developers on the planet IMHO, just competent. I work for a company where the leadership has gone from a vapourware-touting salesman to a management still slightly too ready to believe that the developers are all wrong, because now the major powerhouse is the support/customer service organization.

    What really would have helped was STRONG capable development management. None of the people I work with are stupid, they've just been made to believe they are for so long, some of them have bought into it.

    They badly want to rearchitect core products, but are continually dragged in several directions through mismanagement.

    Anyone is capable of brilliance, but they need an opportunity to fend off the naysayers while they accomplish it ;)
  • what about AOP? (Score:1, Interesting)

    by Anonymous Coward on Tuesday November 26, 2002 @12:10PM (#4759273)
    Not sure if it was invented by a college kid, but AOP seems pretty promising:

    aosd.net [aosd.net]
  • by Randolpho ( 628485 ) on Tuesday November 26, 2002 @12:15PM (#4759312) Homepage Journal
    Once you boil down analysis and modeling into the steps necessary to do the job, you've essentially written your program. What you're suggesting is a universal programming language that doesn't look like code and instead looks like some modeling or analysis diagrams.
  • Re:Best improvements (Score:4, Interesting)

    by sql*kitten ( 1359 ) on Tuesday November 26, 2002 @12:16PM (#4759316)
    Now you wouldn't think of developing on UNIX with anything but GCC and the associated build tools.

    Actually a lot of people would. The quality of code generated by the SUNpro and MIPSpro compilers on SPARC and MIPS processors respectively leaves GCC in the dust. GCC really only comes into its own on x86, because Linux (or *BSD) on x86 is the platform that it gets used most on. GCC is portable, yes, but it isn't built for compiling high-performance code. So you need to ask yourself whether getting binaries that execute 2x as fast is worth using a slightly less well known compiler for.
  • Literate Programming (Score:3, Interesting)

    by WillAdams ( 45638 ) on Tuesday November 26, 2002 @01:02PM (#4759726) Homepage
    I am still mystified that a discussion like this can take place and the system which Donald E. Knuth created to enable him to write TeX (www.tug.org, see the book, _TeX: The Program_ for the pretty-printed source) and METAFONT (_METAFONT: The Program_) is almost never mentioned.

    DEK wrote an entire book on the concept a decade ago (_Literate Programming_, a CSLI series book), but one seldom sees source provided this way.

    There are some really cool example programs which are quite interesting (and educational) to read, for example:

    Will Crowther's game Adventure - available here: http://sunburn.stanford.edu/~knuth/programs/advent.w.gz (with an offer of a $2.56 reward check if one can find a bug), or as a document to just read here: http://www.literateprogramming.com/adventure.pdf

    Or a CWEB version of the RPN calculator for K&R's C Book: http://www.literateprogramming.com/krcwsamp.pdf

    Probably what really needs to happen is a way to post a program as a web page, then to click on a link on it, to automagically compile and run it....

    William
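    Knuth's WEB weaves prose and code into one source. As a very loose stand-in (not a real literate-programming tool), Python's doctest at least lets the explanatory text carry runnable examples:

```python
def gcd(a, b):
    """Greatest common divisor by Euclid's algorithm.

    The examples below are documentation and tests at once:

    >>> gcd(12, 18)
    6
    >>> gcd(7, 5)
    1
    """
    while b:
        a, b = b, a % b
    return a

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # silent on success; run with -v to watch the examples execute
```

    It is nowhere near tangle/weave, but it shares the core idea that the prose explaining the code should be checkable against the code.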
  • by Minna Kirai ( 624281 ) on Tuesday November 26, 2002 @01:53PM (#4760160)
    This could be a sign that today's programmers use "objects" or some other abstraction instead of "algorithm" as their conceptual building-block of software development. That scary, Arabic word is unpopular with guys who just want to "call methods on object classes" or whatever the new jargon is.

    From one point of view, only a tiny minority of today's programmers ever need to create "new algorithms"- so much documented study has gone on in that area, you'd only be reinventing the same wheels.

    That can be a sign of maturity: the field has evolved to the point where specialists can do their thing, and not force everyone else to understand the trickier aspects.

    Looking at your examples - data converters particularly - I don't see much room for new algorithms. From a restrictive, Computer Science standpoint, nearly everything you do will be isomorphic to a known existing algorithm (modulo "trivial", "cosmetic" variations, of course).

    Naturally, people who don't understand algorithms will have difficulty selecting the ones they should re-use, and are at risk of using them wrongly. But that's a consequence of newer, user-friendly development tools- with a lower barrier to entry, less competent persons can enter the profession, and still muddle by.
  • by sbrown123 ( 229895 ) on Tuesday November 26, 2002 @02:07PM (#4760279) Homepage
    loss of performance with the current obsession on over-inheritance and Java-style interpreted/P-code software overall.

    Most companies see that the small performance loss from using object-oriented languages does not compare to the loss in man-hours maintaining non-OOP code. OOP code scales better when that small application gets larger. Code reuse is simpler and can aid in quicker development of other applications with similar functionality in the future.

    "Vanilla unix" does not exist anymore, so it's not really an issue. The original idea of byte-code compatibility was lost when unix commercialized. This spurred the development of a language like Java, which allows code to be cross-platform. Writing in ANSI C would be another way of doing this, if only it were implemented the same across the Unixes and the hardware remained a constant. That will never happen.

    Add to this GPL/OS that slashes meaningful business value from well engineered software components

    I think that there are many well-engineered software components under the GPL, and I could give plenty of examples. These components required the experience and time of many programmers. By sharing this codebase, IT departments do not have to hire dozens of programmers to create a similar product.

    You may be a "professional" developer, but I think you should avoid any position where you would make decisions on the direction of technology used within your company until you realize the economic savings of GPL code and the reasoning behind object-oriented languages. These subjects are generally covered in computer science programs at most universities.
  • by SlySpy007 ( 562294 ) on Tuesday November 26, 2002 @02:07PM (#4760283)
    ...and I'm surprised no one mentions it too often. Things like generative programming, aspect oriented programming, domain engineering, the list goes on. They are all headed towards filling in the gaps where OOA/D falls short (and yes, it DOES fall short. WAY short) and creating flexible, general programs that are highly reusable and highly correct with a minimum of change. It's really great stuff, straight out of Star Trek. And if you think it's just theory, it's not. Do some research and find out that big (and I mean BIG) companies are going this way.
  • by CrashVector ( 568050 ) on Tuesday November 26, 2002 @03:02PM (#4760802)
    Hmmm,

    Of people and projects and the use of tools and technologies:

    Let's see... Two contracts ago I was on a team of developers who had one hammer in their toolbox: COM. I pleaded with them to limit their use of COM but they went on a rampage. They wrapped ADO in a COM based object/relational DB abstraction. Then they used the COM based DB wrapper in their COM based business tier to talk to COM based Active-X controls in the UI. I pleaded with them to use MTS, distributed transactions, and stored procedures as the system was a distributed client/server app; but I was told that transactions and stored procedures would add too much overhead. COM, COM everywhere and no transactions in sight! Result: A slow, giant, buggy, leaky, unstable 4 tier COM based pig of a system that has major database issues. I finally couldn't take it anymore and so I gave notice after 1.5 years on the project. I left there 2 years ago and today they have sold the thing to one customer - who is extremely unhappy...

    My last contract was an aircraft carrier combat system. 220 developers had been playing in Rational Rose for 1.5+ years when I showed up. They had modelled over 5,000 classes and had not modelled anything called "Weapon", "Sensor" or "theShip". They used Rose to generate their C++ code for HPUX and VxWorks, and they used TCL/TK to do their UI. Frankly, if there was a fork in the road these guys took the hard way every time. Result: the system is an impossibly huge nightmare that runs on dying platforms and depends on a dead UI language. The company has run out of money to finish the system. The staff has been cut from 220 to 50 and no deadlines have been moved. None of my friends who remain on this project work 40 hour weeks...

    Currently I'm on contract to do a UI by X-Mas. I just fought a 2 week pitched battle with pig-headed engineers over whether we should use C#, VB, or MFC to do the UI. I finally ended the battle last week when I walked into a meeting with working screens coded in C#. The UI is mostly finished and all are very impressed with my work. But I have been unable to convince unwilling, MFC-loving, pre-.NET-era engineers of what "Managed C++" is, and so I'm currently coding an unnecessary Managed-to-Unmanaged code layer so that C# can talk to unmanaged C++ on the back end...

    I guess what I'm saying is that the biggest thing I've run into in the last few years is people who have very real design decision making power but who don't have a good background in what tools and technologies are available. I see projects blunder into Death March situations because the people working on them are unwilling to keep up with technology or are unaware of what tools are out there...

    I work like heck to keep up with the times. I wish more people would read books like "The Pragmatic Programmer" and take them to heart...

    --Richard
  • by mobil1 ( 629542 ) on Tuesday November 26, 2002 @05:19PM (#4762211)
    For the last 20 years, commercial software development has churned through imperative programming languages that do the same thing with slightly different syntax and either Spartan or complete sets of standard libraries (Cobol, Fortran, Pascal, Java, C, C++, Perl, C#). So we have wasted investors' money, customers' money, and many man-hours by telling them that when we use the new programming languages, things are going to be much better (that self-satisfying process is now called the 'commercial software industry').
    In addition to manually converting existing software to a different language's syntax (and fixing some bugs along the way), we spent millions of dollars on what we call 'Design Tools', which are mere Lego-like packages that let us break our tasks down into components so small and disjoint that a developer who was not part of the 'team-that-did-the-breakdown' would need to spend weeks finding out which set of 'components' (an important word in our vocabulary) enables that button. (It turns out that a row in a database is read by the 'database layer'; then there is code that checks whether that row has all the necessary data and sets some boolean flag (we call that the business logic layer); then the button checks that flag to decide whether it is enabled or disabled (and that we call our GUI layer).)
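    The button-enabling flow the poster describes can be made concrete in a few lines. This is a hypothetical sketch (the table, names, and layers are invented for illustration), showing how a single boolean ends up threaded through three "layers":

    ```python
    # 'Database layer': pretend fetch of a record by id.
    def read_row(row_id):
        fake_table = {1: {"name": "Alice", "email": "a@example.com"},
                      2: {"name": "Bob", "email": None}}
        return fake_table.get(row_id)

    # 'Business logic layer': validates the row and produces the flag.
    def row_is_complete(row):
        return row is not None and all(row.values())

    # 'GUI layer': the button merely checks the flag.
    def button_enabled(row_id):
        return row_is_complete(read_row(row_id))

    print(button_enabled(1))  # complete row -> button enabled
    print(button_enabled(2))  # missing email -> button disabled
    ```

    Three functions for one button is exactly the point being made: once each step lives in a separate "component," a newcomer has to chase the flag across all of them just to learn why the button is greyed out.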

    In the last 10-15 years, we came up with the role of a 'Software Architect' -- which probably had a noble goal, but really degenerated into the task of downloading the latest 'new' technology from the internet, setting it up for some irrelevant test, and then running around the company telling everyone we should 'upgrade to the latest technology' for some great benefit that no one can understand.

    Then, in the past 20 years, we convinced our customers that they cannot really expect bug-free software from us, but that instead we can deliver something that does most of what they want and fix the rest later. So the customers simply stopped believing that we can manufacture something that works; the only thing customers can do is ask us to deliver something by some date (hence the unreasonable (but always movable :-) deadlines) -- and in fact, we got accustomed to it:
    Look at the press releases of public software companies: they do not mention what functionality their products have or how fault-tolerant they are... instead they mention what tools they used to build their stuff (XML, EJB, Java, OO, UML, web-enabled and wireless-ready). And for a while the stock market rewarded just that. But have you ever seen a profitable hardware design and manufacturing company bragging in its press releases about what compiler or VLSI design tool it uses, instead of highlighting the functionality of its product?
    -----------
    We also convinced ourselves that there are no 'methods' to design software without bugs, and that we'd rather rely on QA to test it -- because we think software is like nothing else: it cannot be VERIFIED for CORRECTNESS by us.
    And that is fundamentally what needs to change: we must be able to use mathematically sound algorithm verification techniques, and we should ask our tool vendors NOT to deliver a syntax change every 3 years, but to build us tools that do data-flow analysis at the program-logic level, that simulate timings and parallel execution for concurrency-enabled applications (threading, etc.), and so on.
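    To make "data-flow analysis on the program logic level" less abstract, here is a toy liveness analysis -- one of the classic data-flow analyses -- over straight-line three-address code. The instruction format is invented for the sketch: each instruction is a pair of (target variable, list of used operands). Walking backwards, a variable is live if a later instruction uses it before it is redefined; a real tool would run this kind of pass to flag dead stores or uninitialized reads.

    ```python
    def live_after_each(instructions):
        """Backward liveness pass: returns, for each instruction,
        the set of variables live immediately after it."""
        live = set()
        result = []
        for target, uses in reversed(instructions):
            result.append(set(live))   # variables live *after* this instr
            live.discard(target)       # a definition kills liveness
            live.update(uses)          # uses generate liveness
        return list(reversed(result))

    prog = [
        ("a", []),          # a = const
        ("b", ["a"]),       # b = f(a)
        ("c", ["a", "b"]),  # c = g(a, b)
    ]
    print(live_after_each(prog))  # [{'a'}, {'a', 'b'}, set()]
    ```

    Extending this to branching control flow means iterating the same transfer function over a control-flow graph until the live sets stop changing -- which is precisely the machinery the poster wants vendors to ship instead of new syntax.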

    -----
    Of course there has been a lot of progress in computer science and the mathematics that supports it, but most of it did not make it into the commercial app development world.
    Compilers are better, language parsers are better, OSes are better -- but those are the tools where the science was used to design algorithms (like graph coloring for register allocation, or other optimization techniques).
    And hardware, of course, is much better --
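    The "graph coloring for register allocation" idea mentioned above is easy to sketch. This is a simplified illustration (the graph and register names are made up): variables are nodes, an edge means two variables are live at the same time, and colors are registers. A greedy heuristic like this one is non-optimal but captures the idea; production compilers use more sophisticated variants.

    ```python
    def greedy_color(interference, registers):
        """Assign each variable the first register not already used
        by an interfering neighbor (classic greedy coloring)."""
        assignment = {}
        for var in sorted(interference):
            taken = {assignment[n] for n in interference[var] if n in assignment}
            free = [r for r in registers if r not in taken]
            if not free:
                raise RuntimeError(f"{var} must be spilled to memory")
            assignment[var] = free[0]
        return assignment

    # a interferes with b and c; b and c are never live together,
    # so they may share a register.
    graph = {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}}
    print(greedy_color(graph, ["r0", "r1"]))
    ```

    Running this assigns "a" its own register while "b" and "c" share one -- three variables squeezed into two registers, which is the entire economic point of the technique.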

    Commercial software 'looks' like it has made leaps and bounds -- but it is really some of the tools that we use underneath that make us look better.
