QA != Testing

gManZboy writes "Stu Feldman, original author of Make and IBM researcher, has written an overview of what should be (but sadly perhaps is not) familiar ground to many Slashdotters: Quality Assurance. He argues that QA is not equivalent to 'testing,' and also addresses the oft-experienced (apparent) conflict between QA advocates and 'business goals.'"
  • Quality? (Score:5, Insightful)

    by ceeam ( 39911 ) on Wednesday March 02, 2005 @08:32AM (#11822198)
    IME, Quality = Knowledgeable_Staff_On_Good_Salary + No_Deadlines. Unfortunately, this formula is largely incompatible with the business world. So, in the meantime, customers should be grateful if software has been _somehow_ tested and mostly works.
  • Re:Requirements? (Score:5, Insightful)

    by PepeGSay ( 847429 ) on Wednesday March 02, 2005 @08:39AM (#11822216)
    This is a sign that there was no quality assurance during requirements gathering, which probably means you were not actually starting your "QA process" but were actually starting "testing".
  • Re:Quality? (Score:3, Insightful)

    by z80x86 ( 864095 ) on Wednesday March 02, 2005 @08:44AM (#11822231)
    I do not know if I agree with that strict formula you proposed. I have always felt that approaching software with a clear plan for its future is the best way to ensure a quality final product. While a system may often appear to grow organically, its evolution must be controlled so that it does not deviate far from what was originally expected of it.
  • Re:Quality? (Score:5, Insightful)

    by patrixx ( 30389 ) on Wednesday March 02, 2005 @08:44AM (#11822236)
    Funny, I would say Quality = Knowledgeable_Staff_On_Good_Salary + Deadlines

    BOVE'S THEOREM: The remaining work to finish in order to reach your goal increases as the deadline approaches.

    PARKINSON'S LAW: Work expands to fill the time available for its completion.

  • QA != Testing (Score:5, Insightful)

    by coolcold ( 805170 ) on Wednesday March 02, 2005 @08:46AM (#11822240) Homepage
    and clever != good marks in exams
    testing doesn't make the software any better, but it does find bugs the developers missed. Quality assurance is about making sure the software is of good enough quality before release; testing confirms that this is the case.
  • Re:Quality? (Score:3, Insightful)

    by the grace of R'hllor ( 530051 ) on Wednesday March 02, 2005 @08:46AM (#11822243)
    Deadlines focus the development effort. Without the need to ever finish anything, you'll never finish anything because there's *always* something you can add or improve.

    Milestones keep the development on track, and deadlines are used in project planning to determine an end state for the development project.

    Besides all this, lots o' time doesn't necessarily give you quality. Look at knowledgeable modern artists; all the time in the world, and all they produce is a pile [maximonline.com] of crap [sprynet.com].

  • by Anonymous Coward on Wednesday March 02, 2005 @08:46AM (#11822244)
    No, I'm saying that if Linux were to have a QA strategy before the next great project begins, then the product has the potential to be even better.

    Why, on this damn website, will any criticism, no matter how true or justified, be taken as the most vile bile ever spewed?

    And why is any response to criticism met by words to the effect of "And, of course, YOU think that WINBLOW$ is better? Well let me tell you something, you . . . "?
  • by rdc_uk ( 792215 ) on Wednesday March 02, 2005 @08:47AM (#11822249)
    No.

    What you have described is a large bug-hunting exercise.

    QA is a process by which errors are supposed to be PREVENTED, not FOUND OUT.

    That's why QA != Testing

    Better QA == fewer bugs to find (it assures you are building quality)

    Better Testing == more bugs found (it is, in fact, closer to quality verification)
  • by Black-Man ( 198831 ) on Wednesday March 02, 2005 @08:49AM (#11822255)
    At every software company I've worked for, the QA department was separate from Development. Then the classic mistake was always repeated... bring in QA at the tail end of a project and expect them to certify it and find all the bugs.

    Madness!!

  • by bblazer ( 757395 ) * on Wednesday March 02, 2005 @08:50AM (#11822259) Homepage Journal
    I think that the real issue here is the difference between code that works and code that meets the business rules and need. Anyone can write good code and have it compile and execute. The problem comes when that great code still doesn't fit the need it was supposed to fill in the first place. This issue has two hurdles to overcome. First, coders who have no business knowledge. Second, business pros who have no software development experience. The coders complain that they weren't given the proper details of the project, and the business guys complain that the coders know nothing about business. I think all biz people need to take a basic programming course, and all coders need to take a business class. Without that, the gulf of poor communication between the two camps is quite large.
  • by millahtime ( 710421 ) on Wednesday March 02, 2005 @08:51AM (#11822263) Homepage Journal
    If you have good quality you can cut down on testing, or cut it out completely. Think of it like a math equation: Y = f(x). Y is the output, x is the input, and f is the process. If you control the inputs so there is no variation, the output will always be the same, so there is no need to test it. Honda has done this with engines. They save money because they don't test every engine. Yet all their engines always work.
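    In code terms, a rough sketch of that idea (the function, nominal value, and tolerance here are all made up): verify f once, control x, and each individual Y needs no inspection.

        NOMINAL_X = 5.0
        INPUT_TOLERANCE = 0.01  # hypothetical spec limit on input variation

        def f(x):
            # A deterministic process: the same x always yields the same Y.
            return 3.0 * x + 2.0

        def input_in_control(x):
            # Quality is assured at the input stage, not tested at the output.
            return abs(x - NOMINAL_X) <= INPUT_TOLERANCE

        # f was validated once against its spec; after that, any x that
        # passes the input check produces a Y that is in spec by construction.
        for x in (4.995, 5.002, 5.3):
            print(x, "->", f(x) if input_in_control(x) else "rejected before processing")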
  • Money Rules it All (Score:3, Insightful)

    by millahtime ( 710421 ) on Wednesday March 02, 2005 @08:54AM (#11822273) Homepage Journal
    A way to look at it is: there is a lump sum for a project, the amount of money (time is money) to produce a product. This amount is made up of the base price for the product plus rework (which is the cost of poor quality). If you can minimize the cost of the rework, you can increase profits, cut costs (including the time to get the product out), or improve the design, since it is of higher quality.
  • by lottameez ( 816335 ) on Wednesday March 02, 2005 @08:55AM (#11822281)
    The writer talks about separating QA from the Development group. In our organization, this was a large part of the problem. First, there was a tendency for the development group to "throw it over the fence" and expect QA to find problems that the engineers couldn't be bothered to look for.

    The QA staff, on the other hand, rarely had engineers of sufficient caliber with the insight to search for and find the most insidious problems. Not only that, they (QA) occupied the no-man's land between business users and development, understanding neither area with any clarity.
  • by rdc_uk ( 792215 ) on Wednesday March 02, 2005 @09:02AM (#11822314)
    I used to work as a QA person.

    In our company QA was a separate organisation, for 3 simple reasons:

    1 - You are auditing and commenting on other people's work, not in a peer-review "did you think about doing it like this" way, but in a "that is not acceptable; redo" way. Close colleagues in a department are NOT suitable for that role; you cannot be expected to say that about the work of the person in the next cubicle, whereas a department with that as their job will be accepted when they do it.

    2 - Keeping up to date on the quality requirements, combined with performing your live QA duties for the engineering department, was a full-time job. Or at least it certainly was if the company wanted to keep its ISO9001 certification.

    3 - It's a case of the buck stopping here. In our company, project proposals, requirements and plans HAD to be signed off by QA before the funding got released for the project. At the same time, because we did telecoms work, we had a legal responsibility to sign off that the EMC conformity, physical safety and electrical safety tests had been conducted properly and passed (and that meant constantly checking updates of the various national standards to ensure the company standards used the strictest requirements in each case). A random engineer is not good enough (you have to have passed the right courses to audit each section, need to be a qualified ISO9001 auditor to do the internal audits, etc.).

    Professional QA is a full and separate job. (But I did get to play with the 20KV discharge equipment!)
  • by dallaylaen ( 756739 ) on Wednesday March 02, 2005 @09:13AM (#11822349) Homepage
    If design is flawed, what should QA do?

    If docs are poorly written and incomplete, how does one decide what's a bug and what's a feature?

    If the docs describe the program's behavior rather than define it, what can QA do? ...And for more, see F. Brooks' "The Mythical Man-Month", for example, or Allen Holub's "Enough Rope to Shoot Yourself in the Foot".

    And yes, if everything is done right from the beginning, the QA people would have enough time to do something other than testing.

    Of course, of the two ways to write bug-free programs, only the third one works...
  • This hurts (Score:3, Insightful)

    by steve_l ( 109732 ) on Wednesday March 02, 2005 @09:20AM (#11822383) Homepage
    I can see the problems this creates. Dev teams blame QA for being fussy; QA hates the dev team for releasing such junk.

    We have to embrace test-centric development more. JUnit, CppUnit, PyUnit; they make it easy to write tests. Convincing the developers to write good tests, though, is another matter. I often find it is harder to write a decent test for something complex than to implement the complex thing itself, but without that test, how do you know it works?
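    For the unfamiliar, a minimal PyUnit-style sketch (the interval function and its cases are hypothetical, but this is the whole ceremony):

        import unittest

        def overlaps(a_start, a_end, b_start, b_end):
            # True when the half-open intervals [a_start, a_end) and
            # [b_start, b_end) share at least one point.
            return a_start < b_end and b_start < a_end

        class OverlapTest(unittest.TestCase):
            def test_disjoint_intervals(self):
                self.assertFalse(overlaps(0, 5, 5, 10))  # touching ends don't overlap

            def test_contained_interval(self):
                self.assertTrue(overlaps(0, 10, 2, 3))

            def test_partial_overlap(self):
                self.assertTrue(overlaps(0, 5, 4, 10))

        if __name__ == "__main__":
            unittest.main()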
  • Re:Requirements? (Score:3, Insightful)

    by Unnngh! ( 731758 ) on Wednesday March 02, 2005 @09:20AM (#11822384)
    I've found a number of ways to work around a lack of requirements. A good specification is one, but if you are so inclined a static prototype can often achieve as much if not more from the customer's perspective. A couple hours of use case gathering with management and the customer can also achieve amazing results, without the need for lengthy documentation or weeks and months of back-and-forth.

    Then you just have to convince everyone to stick to what they've agreed to when they ask you to change everything halfway through coding!

  • Oh god yes (Score:5, Insightful)

    by Michalson ( 638911 ) on Wednesday March 02, 2005 @09:23AM (#11822400)
    Can someone please forward this to the good old folks at Mozilla. The written QA requirements (on Mozilla) are so cursory that whole features have dropped off the map simply because nobody bothered to check whether they still worked (e.g., in 1.4 multi-object drag and drop stopped working). It might also help the parsing engine, which still carries cruft from the Netscape days (like how <b<i> is interpreted as <b><i> instead of as a single broken tag to be ignored [and you can use any tags you want; the second one can even contain parameters, allowing you to really annoy people by adding extra HTML that only works in Firefox/Mozilla, not IE/Opera]; a sketch below shows how such lenient recovery can arise). Though really, as Slashdot [slashdot.org] has reported before, Mozilla could use a more robust and systematically tested parser just to avoid potential buffer-overrun exploits.

    Bottom line: OSS could get way ahead of commercial software simply by doing proper QA and unit testing (not just the UNIX "it seems to work" test, but the "are out-of-range inputs properly detected, or does the program just choke and die?" test) on par with what the best commercial developers have been doing. Just because you have to do all this paperwork and repetitive checking when working for "The Man" doesn't mean it's an evil thing that should be thrown out. Sometimes the man actually has some good ideas, even if he spends most of his time shouting about how you didn't dot your i's on that last flowchart.
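    To illustrate the <b<i> quirk mentioned above: here is a hypothetical sketch (not Mozilla's actual code) of how a lenient tokenizer can turn one malformed tag into two well-formed ones.

        def lenient_tags(markup):
            # A '<' seen while inside a tag closes the partial tag and
            # starts a new one, instead of rejecting the malformed input.
            tags, current = [], None
            for ch in markup:
                if ch == "<":
                    if current is not None:
                        tags.append(current)
                    current = ""
                elif ch == ">":
                    if current is not None:
                        tags.append(current)
                    current = None
                elif current is not None:
                    current += ch
            return tags

        print(lenient_tags("<b<i>"))  # ['b', 'i'] -- treated as <b><i>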
  • by StormReaver ( 59959 ) on Wednesday March 02, 2005 @09:24AM (#11822404)
    "I mentioned it in another post, but my dad has a good web site that deals with quality issues (IE only, unfortunately)."

    I'm sure the irony of that statement is not lost on anyone. A site, giving advice on good quality, is itself a quality disaster. You'll understand if I don't take his credentials to heart.
  • by rdc_uk ( 792215 ) on Wednesday March 02, 2005 @09:27AM (#11822416)
    " Imagine a complex program containing many code paths"

    Mmmkay.

    "QA spends a day to code a test suite which exercises every code path"

    erm... "QA spends a day"

    Yeah, right.

    You do realise that a FULL code path test suite will, perforce, be LARGER than the source code it tests?

    When doing QA, I used to start writing the test cases for software when the REQUIREMENTS document arrived, so that they were ready for use during the tail end of coding and for the unit testing. It's a BIG job.

    And you design tests from the reqs, not from the code - how will you trap a completely missing boundary case if you build tests from the source? Or from the design?

    Requirements drive source and test design, separately so that the assumptions in the former cannot pollute the latter.
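    A contrived PyUnit example of the point (the discount rule and its limits are invented): the upper boundary below appears only in the requirement, so a test suite derived from the source alone would never probe it.

        import unittest

        # Hypothetical requirement: a 10% bulk discount applies to order
        # quantities of 10 through 99 inclusive; anything else gets none.
        def bulk_discount(quantity):
            # Buggy implementation: the upper bound was forgotten entirely.
            return 0.10 if quantity >= 10 else 0.0

        class BulkDiscountTest(unittest.TestCase):
            # Boundary cases come straight from the requirement, not the code.
            def test_lower_boundary(self):
                self.assertEqual(bulk_discount(9), 0.0)
                self.assertEqual(bulk_discount(10), 0.10)

            def test_upper_boundary(self):
                self.assertEqual(bulk_discount(99), 0.10)
                # Fails, exposing a bug that code-derived tests would miss:
                self.assertEqual(bulk_discount(100), 0.0)

        if __name__ == "__main__":
            unittest.main()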
  • by glew ( 412946 ) on Wednesday March 02, 2005 @09:38AM (#11822464)
    That sounds very good. In theory.

    Having worked at a CMM 3 company for a couple of years, I have quite a different opinion of the thing: CMM, and processes in general, are a tool that managers use to offload their work onto the engineers.

    We used to spend vast amounts of time peer-reviewing all sorts of useless documents, making estimates for project planning, and so on, in addition to the architecture and coding work.

    This didn't do anything at all for quality. Deadlines slipped like always (often more, because of the time lost to irrelevant stuff). Spec documents were just as Ground-Control-To-Major-Tom-like as usual.

    It did, however, give the managers the warm fuzzy feeling that overcomes control freaks everywhere when they're sure they can track, number, file and index everything that goes on around them. Without having to do any actual work. Without even knowing the first thing about the product we were making (which, without CMM, would be a prerequisite for anyone attempting to write any sort of project plan).

    One of our line managers admitted all of this quite openly; one of his favourite sayings was "Since we have processes, I can go home at four every day". We didn't. We got to stay till 8.

    In my experience, CMM should be avoided like the plague. It's a complete and utter waste of time, and it encourages empty hierarchies.
  • by Bozdune ( 68800 ) on Wednesday March 02, 2005 @09:47AM (#11822529)
    Let's also not forget that the DoD has had a number of programs over the years that attempt to determine whether such methodologies work, and/or attempt to determine what the best methodology might be. Of course, everyone using such a methodology invariably reports that it works fantastically, either because they want the next deal, or because they want their particular methodology to be King of the Hill.

    I worked for a DoD contractor for a while, so I've seen it from the inside -- and I'd say that using DoD-funded development projects as a measure of anything is ludicrous. Years after my DoD experience, I remember interviewing candidates for a lead hardware engineer. I needed a guy who could build a Z80-based microcontroller board. I had one tech to give him, that's it. And I needed the board laid out and working in 4 months. I knew this was possible, because I had worked with plenty of hardware engineers who could do this in their sleep, with one layout and no rework. Remember, this is 4MHz, folks. Crosstalk? What crosstalk? Hell, armed with a book and help from the vendor in the form of boilerplate designs, even I could have taken a stab at it, and the last time I hacked hardware was years ago in a college course.

    Anyway, this guy was from a large defense contractor, R******n. Turns out he was PART OF A TEAM that had built a Z80 CPU board over the last 18 months. His particular responsibility had been the DRAM circuit. According to him there were 20 other hardware engineers on the project. Yup, he said TWENTY. That's right. T-W-E-N-T-Y.

    The $64,000 question is, what the heck was this guy doing for those 18 months? I was stunned. So was he, when he realized what was expected of him in the "real" world. I don't care how MIL-spec'd his board had to be, or how much vibration and radiation testing they had to do, or how many $22,000 toilets they had to flush it down to test it, 18 months and 20 people is ridiculous. Period.

    I found someone else for the position. He built the board, delivered it ahead of schedule, and it worked fine. And while he was doing that, in parallel he designed and built another board for an RF hand-held. I guess he wouldn't have fit in at R******n. Nothing against R******n, though. Largest employer in the state. Love you guys. Keep everyone working.
  • by ian13550 ( 697991 ) on Wednesday March 02, 2005 @09:53AM (#11822569)
    Cost, Time and Quality: you can only have 2 out of the 3.

    For instance, you can have a project done fast and cheap but the quality will be lacking. Or you can have a project done correctly and quickly but it will cost you a fortune!

    QA is part of that "having it done correctly" piece that most companies tend to cut out. Most companies can only grasp the Time and Cost factors and fail at the whole "Quality" component when doing pre-project analysis. I do not have enough fingers to count the projects I've witnessed that, running behind schedule (usually a BS timeframe to begin with), decided to cut time from the QA process to make the deadline.

    The reasons for "falling behind schedule" and all those other fun things that go into bad project management have been discussed before, so I won't even mention them.
  • by Digital_Quartz ( 75366 ) on Wednesday March 02, 2005 @09:54AM (#11822577) Homepage
    The problem is, there are two motives to reach CMM 3:

    a) It looks good to our customers.
    b) It reduces our cost.

    Companies that strive for motive A often will do their best to meet the requirements of CMM to the letter, without actually changing what they do on a day to day basis. "CMM says we need to have a baseline and configuration management for our code, so I want everyone to check their work into this new CVS thing, at least once a month", for example.

    It's easy to "meet the letter" of CMM without at all meeting the intent. At my company, for example, there's a core group that is trying to push "scrum" [versionone.net] as a software development methodology, and they make all kinds of bizarre claims that this is somehow consistent with CMM 3, pointing to specific wording within CMM and claiming that such-and-such is equivalent to CMM even if it doesn't quite meet it. Meanwhile, I try to envision a mission-critical system like a 767, or a space shuttle, or an ambulance dispatch service produced with scrum, and it makes me afraid to go outdoors.

    Some people are afraid of change.
  • Re:Requirements? (Score:3, Insightful)

    by ColdGrits ( 204506 ) on Wednesday March 02, 2005 @10:17AM (#11822748)
    "A good specification is one, but if you are so inclined a static prototype can often achieve as much if not more from the customer's perspective."

    In that case you have a specification! In the form of your static prototype.

    Nowhere does it say that a specification HAS to be solely a written document...
  • by rdc_uk ( 792215 ) on Wednesday March 02, 2005 @10:18AM (#11822753)
    "From what I've seen of QA, it is testing, just not in the "Testing" phase. It is having well-defined objects, interfaces, input ranges, output ranges, unit tests and so on to make sure that when you assemble everything together, it has few bugs left. Basicly, weeding them out at an earlier point in the process."

    No offense, but you missed out:

    Ensuring that the requirements the SW is built to match are complete / correct in the first place.

    Ensuring that the SW is built in a way that is suitably efficient for the project.

    Ensuring that the SW has at least been thought about in terms of being built for re-use.

    Ensuring that there was at least some thought about "is there something already here that we could re-use or modify?"

    Ensuring that the SW is built in a way that lends itself to maintenance and modification without tearing one's hair out.

    Ensuring that some form of profiling or metrics collection has been performed, in case the project as a whole needs optimisation (being able to look at the metrics for each unit speeds up that first "where to optimise" pass SOOOO much).

    Ensuring that throughout all those processes the correct feedback was fed back to the people who actually DID all those things you just ensured...

    ALL of that is part of QA for software development; very little of it actually involves testing that the software does its job right...
  • by CaptKilljoy ( 687808 ) on Wednesday March 02, 2005 @10:32AM (#11822895)
    If design is flawed, what should QA do?

    QA prevents that from happening in the first place

    If docs are poorly written and incomplete, how does one decide what's a bug and what's a feature?

    QA prevents that from happening in the first place

    If the docs describe the program's behavior rather than define it, what can QA do?

    QA prevents that from happening in the first place

    And yes, if everything is done right from the beginning, the QA people would have enough time to do something other than testing.

    Sounds like your company needs to develop a real QA process.
  • by KontinMonet ( 737319 ) on Wednesday March 02, 2005 @10:49AM (#11823033) Homepage Journal
    1) deciding what you want to build, and deciding exactly (i.e., good specs);

    Too often, I've stumbled across over-specified systems that, as a result, are delivered incredibly late. And then, because of time constraints, the whole project is de-scoped and bodged work-arounds are built so that functionality can be 'added later'.

    At the design stage, politics often slows things down. I prefer the continuous approach: When you have enough design, start coding.
  • by radtea ( 464814 ) on Wednesday March 02, 2005 @10:55AM (#11823080)
    In that case you have a specification! In the form of your static prototype.

    A prototype and a specification do not contain the same information. A prototype consists of a single concrete instance of the thing a specification describes. It contains more information than the specification in some respects (the concrete design choices the implementor has made to fill in the gaps the specification is silent on), but more importantly it is also missing information that is absolutely required in a specification.

    For example: What are the tolerances on particular processes? What are the scaling and responsiveness requirements? And so on.

    Nothing that does not capture this kind of information is a specification, and so far as I know only a document can capture this kind of information.

    Don't get me wrong: prototypes are useful. But they are not specifications, because they do not contain all of the information that a specification must contain.

    --Tom
  • by Anonymous Coward on Wednesday March 02, 2005 @11:09AM (#11823220)
    I've been on various QA/Testing (no, they are not the same, but my team performs them both) teams for the better part of three years now.

    The most severe downfalls I've seen are when the marketing teams and development teams do not adequately communicate with the QA team.

    In order for QA to be successful, the QA team must be aware of and in tune with marketing's idea of how the product is envisioned to perform.

    Nothing is more frustrating than logging a bug that is a critical flaw in the operation of a product, getting the developer response "spec discrepancy"... and seeing the defect thrown to the wind, because marketing refuses to update their first (and apparently final) draft of the specs to accommodate the new feature that would make the product much more palatable to a prospective consumer.

    I was disappointed to enter this thread and see stabs at Microsoft testing and the other mindless drivel I'm used to seeing on /. We need some more comments from people in the industry whose job titles actually include the phrase "Quality Assurance" or "Software Testing" or something similar.

    Of course going AC here, lest my co-workers lynch me. ;D
  • by Xiver ( 13712 ) on Wednesday March 02, 2005 @11:22AM (#11823394)
    Cost, Time and Quality and you can only have 2 out of the 3.

    Every time I hear this I cringe, because it is complete crap. On a small scale, or to someone who has no quality assurance practices in place, it appears to be true; but for large projects/products, quality assurance is inversely proportional to cost and time, precisely because it cuts down on the amount of time required to develop and test a product to its specifications. Quality assurance procedures help ensure that errors are caught early in the project lifecycle and establish standard procedures for project development, which may take an initial hit on timelines but in the end save time. The really nice thing about good quality assurance procedures is that they generally get easier as people grow familiar with them and time goes on.

    For instance, you can have a project done fast and cheap but the quality will be lacking. Or you can have a project done correctly and quickly but it will cost you a fortune!

    I'm assuming that when you say "quickly" here you are thinking of throwing more manpower at the solution, because as the time to complete a project increases, the cost increases, and vice versa. Quality does not mean features and flawless software. It means that your client gets what they asked for, on budget, and it works as specified.

    Cost is directly proportional to time. Quality assurance might add some time to the project, but it has been my experience that the time it adds is usually offset during testing, and by not having to add features that a client expects during or after internal testing.
  • Re:QA (Score:3, Insightful)

    by timmi ( 769795 ) on Wednesday March 02, 2005 @11:34AM (#11823538)
    In reality, Windows rose to dominance because it was an inferior product brought to market at the right time.

    Businesses chose to use Windows because they wouldn't have to rewrite all their legacy/mainframe/in-house apps, because underneath it all you still had DOS. And if your app didn't play nice with Windows' task-switching of DOS apps, you could exit Windows entirely and have essentially a clean slate for the custom app to use.
  • by natoochtoniket ( 763630 ) on Wednesday March 02, 2005 @12:12PM (#11823959)
    The major purpose of the CMM is to generate the maximum possible costs. Military contractors are paid on a cost-plus basis. They make money by finding justifiable ways to generate higher costs. They don't make money by selling finished products that are produced efficiently. Similarly, military officers don't get promoted by finishing projects ahead of schedule and under budget. They get promoted by justifying larger budgets and larger staffs, which then require higher-ranking people to run the larger organization. Both sides benefit by finding ways to justify the maximum possible cost for every project.

    Having said that, many of the ideas that are in the CMM are good ideas. Each item has some justification, and can be helpful to generate better software systems. After all, the cost-plus contracts are based on justified costs, not just every cost that the contractor can dream up. There has to be some plausible justification for every cost item.

    So, for non-military purposes, we can use the CMM model and associated literature. We just have to use them intelligently. We cannot blindly implement every detail, because that will make every project over-budget and late. And, we cannot blindly reject every CMM element just because we think the CMM is bloated and inefficient. We should instead consider which management methods and processes are actually appropriate for the current project at hand. The CMM books provide a good inventory of things that we might want to do. We just have to pick the items that are appropriate for the particular project, in the particular company and business environment.

  • by patrixx ( 30389 ) on Wednesday March 02, 2005 @12:23PM (#11824088)
    Could the reason that "testing must take up the slack" be a lack of reasonable deadlines/gates in the earlier development stages?
  • by thagrol ( 864136 ) on Wednesday March 02, 2005 @12:42PM (#11824305)
    I've been on various QA/Testing (no, they are not the same, but my team performs them both) teams for the better part of three years now.

    Try 15. (In software and in manufacturing)

    The most severe downfalls I've seen are when the marketing teams and development teams do not adequately communicate with the QA team.

    I have to disagree; the most severe downfalls I've seen are when the ship-to-customer date isn't allowed to slip but the date for development submitting to test is. Test ends up having to do a month's work in a week, and there is no time to fix bugs. So it shouldn't be a surprise when the product ships with known, unfixed critical bugs and the customer ends up cancelling the entire contract.

    In my experience, what you need to produce some sort of "quality" product is a sales team that doesn't over-promise, senior management who are aware that you can't just tack some testing on at the end and it'll be all right, clear specs, and enough time to actually do the job.

    Oh, and a test team that is committed to the role, not a bunch of wannabe programmers who see it as something to do for 3 months just to get a foot in the door.

    Good communication helps, as does a good working relationship between teams but neither is entirely essential.

    Having said the above, it's also my belief that the strongest impediment to producing high-quality software is the end-user license. As long as software producers can disclaim any and all responsibility, fitness for purpose, etc., there is no incentive for them to do it right.

  • by javaxman ( 705658 ) on Wednesday March 02, 2005 @02:22PM (#11825476) Journal
    I'm currently working on a monstrous behemoth of an application and the need exists for some way to automatically test my app out. Can anybody suggest free or trial-based tools/software packages that can help me unit-test my code before I submit it to end users from second-stage testing?

    The tools you are looking for exist, but they're not cheap, and they need someone to create test cases for them and run them. I assume it's a web app you're talking about? Really, you need to hire a professional and give them professional tools. There's no good way to test your app cheaply. What you really need is a copy of Mercury Interactive's Astra or Segue's Silk, and someone to run it for you.

    If you have to do it cheaply and by yourself, good luck writing those tests. Do a web search and have fun evaluating tools. Here [aptest.com] is the first hit I got on Google; I've never used it, but maybe you could also look into ieUnit [sourceforge.net]. (A minimal homegrown sketch follows at the end of this comment.)

    Oh, and my advice is to find a job closer to home. That commute will literally kill you. No, really, I mean it, you'll die.
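    For completeness, the promised bare-bones homegrown smoke test (modern Python; the URL is a placeholder for whatever your app serves). It only proves the app is up and returning HTML; real functional coverage still needs real test cases:

        import unittest
        import urllib.request

        BASE = "http://localhost:8080"  # hypothetical app under test

        class SmokeTest(unittest.TestCase):
            def test_front_page_responds(self):
                # Fetch the front page and check for a 200 and an HTML body.
                with urllib.request.urlopen(BASE + "/") as resp:
                    self.assertEqual(resp.status, 200)
                    self.assertIn(b"<html", resp.read(2048).lower())

        if __name__ == "__main__":
            unittest.main()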

  • by Ungrounded Lightning ( 62228 ) on Wednesday March 02, 2005 @08:43PM (#11829693) Journal
    if QA runs the show: it never ships, as there are always improvements to be made. Always.

    Then you're doing QA wrong.

    One of the big parts of having a spec and a QA process is to know WHEN TO STOP.

    When you get the function of a part right, and the tests have been run and show it's right, you MOVE ON. You only come back to it if the debugging of a later part reveals a hidden bug that the earlier tests missed (or couldn't test without the availability of the later part).

    When you've moved on from the last part you're DONE.

"God is a comedian playing to an audience too afraid to laugh." - Voltaire

Working...