Security Programming IT Technology

Increased Software Vulnerability, Gov't Regulation

PogieMT writes "An article in the New York Times (registration required) suggests that the rash of security flaws, viruses and worms is leading a push towards greater regulation by the government, which, according to the piece, has largely relied on the efforts of individual companies."
  • Like the cars (Score:4, Interesting)

    by ComaVN ( 325750 ) on Monday September 01, 2003 @07:04AM (#6843721)
    Much like car safety between the '50s and '70s. Manufacturers simply didn't care about safety, because the customer didn't care.
    • by Anonymous Coward
      They'll care when their baby pictures get wiped out, taken with that nice new digital camera
      • Or when their Quicken data or other very personal info is 'liberated', or any number of other pieces of personal information. Can you imagine how fast things would be patched if a virus/worm scanned for Quicken/QuickBooks/misc. financial data and emailed it to people in the local address book?

        eric

  • Hmmm (Score:4, Insightful)

    by RMH101 ( 636144 ) on Monday September 01, 2003 @07:08AM (#6843736)
    Is this being used to restrict individual freedoms in the same way 9/11 is?

    Call me cynical, but I don't think the US government is getting into this for the sake of safeguarding my PC from viruses...

    • Re:Hmmm (Score:5, Insightful)

      by rknop ( 240417 ) on Monday September 01, 2003 @07:14AM (#6843765) Homepage

      Call me cynical, but I don't think the US government is getting into this for the sake of safeguarding my PC from viruses...

      It's cynical, but it's also not an unreasonable fear for anybody who's been rationally observing the behavior of our government recently.

      I fully expect that we'll see increased security regulations which are ostensibly tough on companies like Microsoft, but those companies will embrace them (all the while getting good PR about "doing the right thing and making the right sacrifices") because ultimately they will only be minor inconveniences... while the regulations that show up will all but prohibit free software (at least for commercial purposes, and possibly for anybody who wants to connect to the Internet), meaning that in the long run Microsoft benefits hugely from those "minor inconveniences".

      Meanwhile, the regulations-- like a lot of what we've seen with airport security-- won't increase actual computer security one whit, but anybody who complains about them will be chastised by John Ashcroft as a whiner who won't let the government do what it needs to safeguard our homeland.

      Yeah, I'm cynical too.

      -Rob

      • Re:Hmmm (Score:3, Insightful)

        by Chexum ( 1498 )

        It's cynical, but it's also not an unreasonable fear for anybody who's been rationally observing the behavior of our government recently.

        I would oppose any regulation with all my instincts, but let's look at it this way: when was the last time an electrician or architect/house-builder handed you a paper saying that, for the money you forked over, they can only make the product *this* good, that you are responsible for any damage they may cause, not them, and then forced you to sign it?

  • Hmm.. Regulation (Score:2, Insightful)

    by dbs_flac ( 701673 )
    Who is going to pay for regulation? I can see governments passing it between them, waiting for someone else to pay. Self-regulation by software companies will not work; can you see Microsoft, SCO, Sun and Red Hat sitting down to draft a policy? I can't.
    • Why even ask who is going to pay for it? Every government initiative is funded via taxes..

      Oh, and higher prices to the consumer...

      Is anyone surprised this was going to happen? I'm only surprised the government hasn't gotten involved until now.
  • by sql*kitten ( 1359 ) * on Monday September 01, 2003 @07:12AM (#6843750)
    Regulation is not the answer - professionalism is. The government has oversight over the construction industry, for example, but engineers are accredited and the profession is run day-to-day by the professional institution; in the UK this is the Institution of Civil Engineers. Same in medicine: the government oversees, but day-to-day regulation rests with the BMA, the British Medical Association, and doctors answer to them. Same with lawyers, accountants, investment bankers... even lifeguards and hairdressers have professional bodies.

    Software development needs to become more like engineering, and software developers should be required to take a qualification like CEng (UK) or PE (US) in order to work in positions of authority and responsibility. Remember that engineering is about public safety - bridges don't often collapse, buildings don't often topple, and that's all because the people designing them have been certified by independent bodies. Programmers of safety-critical systems are already often required to be certified by the relevant body, usually that of the electrical engineers.
    • by Ed Avis ( 5917 ) <ed@membled.com> on Monday September 01, 2003 @07:23AM (#6843795) Homepage
      I'm an MEng and I've still written programs that crash... so have you. It's not a question of certification but just how much time you're prepared to take writing some code (by which I include choice of method, programming language and so on) and testing it. You can have thirty years of experience and still bang out flaky code if you're in a hurry. And if flaky code is all that's needed for the particular task, why not?

      Rather than regulation we should let the market decide. Vendors could undertake: I will pay you $100 for each crash. Sometimes this already happens, e.g. with guarantees about the number of nines in server systems. The biggest problem is deciding responsibility for any faults. If an operating system call which (according to POSIX or whatever) should not return null one day does return null and the application crashes, who should pay up? And how do you find out whose fault it was? Running the whole system in some kind of virtual machine where you can roll back the last few seconds of execution would be one answer.
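
      To make the null-return example concrete: a sufficiently paranoid application can at least refuse to crash when the "impossible" null shows up, whoever ends up paying for it. A toy C sketch (purely illustrative; localtime() is just a stand-in for any call that "shouldn't" fail):

          #include <stdio.h>
          #include <time.h>

          int main(void) {
              time_t now = time(NULL);

              /* localtime() "shouldn't" fail for a valid time_t, but POSIX
                 permits a NULL return -- so the paranoid program checks
                 anyway instead of crashing and arguing about fault later. */
              struct tm *local = localtime(&now);
              if (local == NULL) {
                  fprintf(stderr, "localtime() returned NULL unexpectedly\n");
                  return 1;
              }
              printf("Hour of the day: %d\n", local->tm_hour);
              return 0;
          }
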
      • by sql*kitten ( 1359 ) * on Monday September 01, 2003 @07:54AM (#6843923)
        I'm an MEng and I've still written programs that crash... so have you.

        Sure, it wouldn't be a perfect system - but it would be better than the situation we have now, where no one is willing to take responsibility for quality. A strong professional body granting certified status, backed by a public unwillingness to buy software that didn't have a signature on it from a qualified engineer (maybe in turn backed by a law that some software must be signed off to be sold to the public), would work wonders.
        • If there were public unwillingness to buy software that didn't have a signature on it from a qualified engineer, then people wouldn't buy software that didn't have a signature on it from a qualified engineer.

          Most of the worm/virus issues exist because of code written in C rather than in a safer high-level OOP language where you don't get buffer overflows, sloppy use of pointers, etc. (see the C sketch below).

          Most of the problems with worms/viruses, etc, are due to sloppy sysadmin practices. Of course, with better code sysadmins could
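
          To illustrate the buffer-overflow point above for the non-C crowd, here is the textbook pattern (a made-up fragment, not code from any real product):

              #include <stdio.h>
              #include <string.h>

              /* The classic C mistake: no bounds check on the copy. Input
                 longer than 15 characters overruns buf and tramples the
                 stack -- the raw material of most worm exploits. */
              void greet_unsafe(const char *name) {
                  char buf[16];
                  strcpy(buf, name);   /* overflows if name is too long */
                  printf("Hello, %s\n", buf);
              }

              /* The boring fix: bound the copy and guarantee termination. */
              void greet_safe(const char *name) {
                  char buf[16];
                  strncpy(buf, name, sizeof buf - 1);
                  buf[sizeof buf - 1] = '\0';
                  printf("Hello, %s\n", buf);
              }

              int main(int argc, char **argv) {
                  if (argc > 1)
                      greet_safe(argv[1]);
                  return 0;
              }

          Feed greet_unsafe() a 200-character name and the behavior is undefined; a crafted name can redirect execution, which is exactly how the worms get in.
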
      • I'm an MEng and I've still written programs that crash
        You can design something so marginally that if one bolt or one thread is below specified strength, the whole thing will fail. But normally you design with a safety factor, so that a number of things can be off-spec or break and the whole thing still holds together and functions properly.
        Now, how do you do a safety factor in programming? ;-)
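
        Half-seriously: the nearest software equivalent is defensive redundancy -- every layer re-checks what the layer above already promised, the way a beam is rated for several times its expected load. A hypothetical C sketch (the function and its contract are invented for illustration):

            #include <stddef.h>

            /* "Safety factor" by analogy: the interface contract says
               callers never pass NULL or n == 0, but we re-check anyway,
               so one off-spec caller doesn't bring the structure down. */
            int average(const int *values, size_t n) {
                if (values == NULL || n == 0)   /* redundant per contract */
                    return 0;

                long sum = 0;
                for (size_t i = 0; i < n; i++)
                    sum += values[i];
                return (int)(sum / (long)n);
            }
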
    • Control and regulation are what the governments of the world do best.

      The very existence of a government translates to the control of its people, and its resources.

      Them wanting to control the IT market, under the guise of 'for your safety' so the common man will accept it, is an expected maneuver.

      Not that I agree with it, but it's expected and inevitable... Just one more step towards total control of the public...
      • by Eric Ass Raymond ( 662593 ) on Monday September 01, 2003 @07:29AM (#6843823) Journal
        I'll choose a democratically elected government over a plutocratic regime of corporations (=markets) any day.

        Them wanting to control the IT market

        Not all government control over the markets is bad. It's a fact that a capitalist society cannot self-regulate - its natural growth is always towards a monopoly. This unhealthy growth cannot be curbed by some internal mechanism inherent in the markets (as libertarians like to believe), and external control is always required at some stage.

        • Name one monopoly that was achieved without the direct backing of government force, or more commonly, by exploiting an overly complex, ambiguous system of law.

          Government is at the root of monopoly, not some "natural tendency of the market". The natural tendency of the market is to promote competition -- only government can prevent or eliminate it.
    • I really wish that would work. But the problem is that the software industry is not like construction. If a house is designed poorly it could collapse and cause serious damage. There would likely be lawsuits involved, and the construction company would get bad publicity. They would lose market share and possibly fall out of the business entirely.

      If a software program is poorly designed, it crashes, Joe User restarts his machine and goes on with his life. He doesn't even bother to investigate what caused the crash because it happens so often.

      • by sql*kitten ( 1359 ) * on Monday September 01, 2003 @07:48AM (#6843903)
        If a software program is poorly designed, it crashes, Joe User restarts his machine and goes on with his life. He doesn't even bother to investigate what caused the crash because it happens so often.

        But it is possible to write reliable software. Aircraft, for example, run on extremely reliable software. The way it works in civil engineering is, if you can't get a CEng to sign off on the plans, you can't go ahead with the project. A CEng won't sign unless he's sure, because if it fails, he's responsible and he'll likely never work again. The fact that he's an employee is neither here nor there; he answers to the ICE, not the company. A similar approach could be taken with software - make the senior programmer on a team personally responsible, and give them the authority - independent of the company employing them - to say yes or no.
        • Aircraft, for example, run on extremely reliable software.

          An aircraft is a pretty simple environment to run software in, compared to computers. No, really. Computers are expected to run arbitrary combinations of software, to perform equally well as a platform for gaming, a database server, and a network firewall. Computers come in zillions of hardware configurations and are used for zillions of different applications, and a piece of software that happens to be run on one of them is expected to run smoothly.

        • You can write reliable software for a plane, that's true... but a plane is a relatively simple, relatively isolated system. It does a limited number of things in a limited number of ways, and therefore it can be tested exhaustively and completely. That's simply not the case on a desktop computer, however. You can't exhaustively test a consumer operating system because you can't possibly know everything that will be done to it. The same is true of a word processing suite, a web browser, various server programs
    • Certifying the developers won't help if the management is still pushing to ship software with inadequate testing. Micro$oft already hires many of the best and brightest programmers in the world, and yet their security still sucks. Therefore the problem must be more systemic; simply put, their corporate culture and procedures must not reward designing and implementing secure software. Even after the "Trustworthy Computing" initiative, this still appears to be true. Imagine civil engineers working f
      • Certifying the developers won't help if the management is still pushing to ship software with inadequate testing.

        It will if those developers are personally responsible for the work, accountable to a supervisory professional body, and liable to lose their professional status and hence livelihood if they make a serious mistake. All the managers in the world won't get a known bad product out the door at that point, because every professional developer will tell them where to go. It's like unionisation

    • by awol ( 98751 ) on Monday September 01, 2003 @07:45AM (#6843892) Journal
      Of course regulation is the answer. But the implications are horrible. If you have any doubt that we are living in the "wilds" of the post-revolution explosion, just consider the issues of industrial safety immediately after the industrial revolution. It was a disaster; people were killed and maimed hourly. Look at software: thankfully few people are actually harmed, but some of what we "professionals" produce is just crap.

      Professionalism is an answer to nothing in this case. Regulation comes in many forms. Pick your jurisdiction and even your industry and you will find a litany of standards and regulations to which a product must conform before it can be sold: fire safety for clothes, building materials, electrical safety standards, etc. One recurring theme is that most of these standards relate to safety, or, to paraphrase, to reducing the human cost of substandard products. Having never worked in the industry, I do not know, but I can imagine that the standards required for medical equipment software (pacemakers et al) and things like nuclear power stations are much higher. This is not a question of the qualifications of the people who do the work but of the output of their work, and that is regulation, plain and simple.

      Personally I think that the market is the right tool for many of these regulations, but that requires better information, and we all know how companies are about disclosing the true nature of their products at the moment, but I digress. The other point is that whilst I am comfortable with my ability to choose the prudent or safe product, I don't trust the vast majority of morons out there to do the same, and if they drive a crappy car they can kill me, so I am happy to have regulated standards.

      Software, ah yes, software. Well, for starters, with most software the worst thing a crash or defect will do is cost you money (or make you late for a date), so I am not so sure that I want so much regulation. Secondly, due to the nature of the process, software is more art than engineering, and that is nothing to do with the professionalism of the people writing it. Now, it is true that the baseline at which the process turns from art into engineering is increasingly high (I am comfortable relying on my compiler to turn my arty C code into engineered machine language, and that the hardware will interpret this in a way that is engineered, whereas thirty years ago that was not so much the case), and in future that boundary will be higher still. However, it is not a question of the "capabilities" of the industry participants that currently determines that level, and getting us to a point where it is will take a long time and a number of really astounding revolutions in the tools at our disposal.

      Having said all that, I would love to see a "BS01232 - Computer Operating Systems" that defined a minimum standard of performance, but such a thing is a logistical nightmare to define, let alone to actually implement, so in the meantime I will just run the OSes for which my tasks are best suited and grin and bear the pain.
      • Having never worked in the industry, I do not know, but I can imagine that the standards required for medical equipment software (pacemakers et al) and things like nuclear power stations are much higher.

        I _have_ worked in medical equipment software; not implantable devices but other life-critical stuff. Let's just say that neither the FDA nor any other regulatory organization looked at a single line of code; there were documents they did look at, but not code. Nor was any certification required to work

    • You're right... professionalism is the answer. But professionalism is not something that can be easily mandated: certification means absolutely nothing if the people being certified aren't fundamentally trustworthy. Being able to pass a test is no indicator whatsoever of the kind of human being you are and the quality of work you will choose to perform; it just means that you can pass a test. And you simply destroyed your case by using doctors, lawyers, accountants and investment bankers as examples.
    • You want me to get a fucking license to program and toady up to a bunch of self-important people on the "Board of Professional Programmers" in order to practice my profession? That's regulation by another name, just by a different body. No thanks; leave the strangling over-regulation and "professionalism" to fields which are pretty much mature, static, and stagnant... though perhaps that's the _reason_ they're stagnant.
    • Who is going to pay these costs? The costs for certification and increased education are high, and will undoubtedly increase the cost of developing software.

      Why not regulate that everything has to be perfect? Some M&M candies are a little lopsided, so let's pass some regulations and make the employees get a license. That way nobody will have to eat another lopsided M&M, they'll all be made to a 10nm spec.

      How about instead we just let consumers decide what's acceptable, and what prices they're willing to pay
  • by jafo ( 11982 ) on Monday September 01, 2003 @07:12AM (#6843752) Homepage
    Regulation may or may not work. What would really work would be if the government (Microsoft's biggest customer, I've heard) stopped buying their products in favor of others that are more secure. Re-evaluate that when Microsoft's products have less of an issue.

    I know that all systems have some security problems or another. I don't recall any of them having sent me a thousand e-mail messages every day, though. And it's not like this is the first time.

    Let the government talk with its money and people will listen.

    Personally, I don't really like my tax money going so much to Microsoft. For one thing, I don't like that the privacy of my information and security of the systems relies on something that seems to have so many problems.

    Sean
  • retro posting (Score:4, Interesting)

    by segment ( 695309 ) <sil@poli[ ]x.org ['tri' in gap]> on Monday September 01, 2003 @07:14AM (#6843762) Homepage Journal

    I tried to submit something similar before as an article, but it was denied... and I sincerely thought it was very relevant to this. According to the NSA's "Statement on Cybersecurity [nsa.gov]" paper released earlier this year, there are a few people who are spooked, as the government seems to want to either backdoor or somehow control software under the guise of 'tougher security':
    A significant cybersecurity improvement over the next decade will be found in
    enhancing our ability to find and eliminate malicious code in large software applications. Beyond the matter of simply eliminating coding errors, this capability must find malicious software routines that are designed to morph and burrow into critical applications in an attempt to hide. There is little coordinated effort today to develop tools and techniques to examine effectively and efficiently either source or executable software. I believe that this problem is significant enough to warrant a considerable effort coordinated by a truly National Software Assurance Center. This center should have representatives from academia, industry, federal government, national laboratories and the national security community all working together and sharing techniques to solve this growing threat.
    And to add insult to injury for MS, a letter was sent to Tom Ridge [politrix.org] asking the Dept. of Homeland Security to limit or stop its use of MS products due to insecurity.

    Personally, I would stop using machines if it were possible for there to be some form of monitoring of my actions without my authorization. Aside from that, it's not a secret that the NSA has been accused of corporate espionage [apc.org], so I would hope large corporations would think twice about giving them any form of say when it comes to code for commercial software.

  • Trends? (Score:2, Insightful)

    by NtwoO ( 517588 )
    Isn't it strange how there has been a marked surge in software control in the past few months, with Microsoft's main competitor being an OS that is built with relatively low centralized control?
    • I would run out of moderator points just when an actually insightful post appears.

      Regulation / certification / etc. have always been tools of large corporations to keep smaller players out of the game. Can you think of an easier way to marginalize Linux than shoving it in the box of, "it's not made by certified programmers; it's untrustworthy"?

  • by Alien Being ( 18488 ) on Monday September 01, 2003 @07:21AM (#6843786)
    Gates is probably telling Bush "see, this is why we need trusted computing." Bush will declare that either you are with him, or you are with the terrorists.

  • Blame the user (Score:2, Informative)

    by madsen ( 17668 )
    In just about every report on worms or virus attacks, the user is blamed for propagating the problem. In the article, Scott Charney (MS security chief) tells users to get antivirus software and keep it up to date.
    That wouldn't be necessary if the user did as his third suggestion says: patch the system.
    And that wouldn't be necessary if the system were built more securely from the start.

    A good idea for MS would be to not make their stuff so user-friendly that it automatically executes every virus attachment that it comes across, but instead would warn the user by default.

    • Re:Blame the user (Score:3, Informative)

      by dzym ( 544085 )
      A good idea for MS would be to not make their stuff so user-friendly that it automatically executes every virus attachment that it comes across, but instead would warn the user by default.
      The default behavior of Outlook and Outlook Express has already been changed to do this. It is certainly not Microsoft's fault that a select subset of individuals aren't patching, or are smart enough to purposefully circumvent the attachment protections but dumb enough to run attachments anyway.
  • While regulation of software might sound like a good idea to the anti-Microsoft crowd, consider how it would affect free software developers. Imagine if you couldn't release any software that hasn't been vetted by some government agency - that would be the end of 99% of the open-source projects out there.

    And even if there were some exemption for not-for-profit developers, what about distribution companies like Red Hat? They would be out of business in seconds...

    • Imagine if you couldn't release any software that hasn't been vetted by some government agency


      No, imagine if you couldn't sell any software that hasn't been vetted by some government agency...

  • Reprint of the story (Score:3, Informative)

    by Florian Weimer ( 88405 ) <fw@deneb.enyo.de> on Monday September 01, 2003 @07:24AM (#6843800) Homepage
    Here. [nybor.com]
  • In my opinion the easiest way to cope with this threat is to make software companies responsible for their products - see article by Declan McCullagh [com.com].
    Of course this regulation has to be done carefully - we should deem liable for damages only those companies that charge MONEY for the product: for instance, when you install a free version of Red Hat Linux, Red Hat (or anybody else) is not responsible for the damage, yet if you pay for the distro then Red Hat _shall_ be responsible - they can simply buy an insurance policy
  • by goon america ( 536413 ) on Monday September 01, 2003 @07:29AM (#6843819) Homepage Journal
    People accept the low level of software quality simply because the thought has never entered their heads that things could be any different. MS can get away with it, much like the old AT&T of yore, because it knows that switching to an alternative is costly enough, if only cognitively, that people will be willing to accept a level of frustration up to the cost of switching before doing so.

    Regulating computer safety makes these guys exactly like the AT&T of yore. And don't we all know what happened with that?

    So let some damned competition into the market. The only reason to trust these guys in any other situation is to simply not understand the idea of a world without them, and sadly that seems to be the way most people think.

  • A few years ago there was a push on to provide home users with "safe" connections with the ISP running a firewall and virus scanning. What ever happened to this? While this would not fix everything it would help a lot, especially for inexperienced users. The current situation is kind of like making people do their own water purification at home.
    • A few years ago there was a push on to provide home users with "safe" connections with the ISP running a firewall and virus scanning. What ever happened to this? While this would not fix everything it would help a lot, especially for inexperienced users. The current situation is kind of like making people do their own water purification at home.


      Because human stupidity is extremely difficult to firewall.
    • " Why isn't security the ISP's responsibility? "

      You don't want the ISP to firewall for you. For this extra "service" you'd pay more. To open an extra port (to play quake for example) you'd have to pay extra. This would lead to every application using port 80 so they can get through the firewall, and then another mechanism (MS SOAP or whatever) to run other stuff through that port. At that point nothing is different except things are more complicated, and you gave up some freedom. Not to mention it makes t

  • by Anonymous Coward
    You don't buy a car for $20, $50 or even $399. Nor do we build bridges for anything near that cost. Realize that adding regulation will not significantly change the security issues and will cost end users tremendously.

    You thought software prices in the 80's were horrible, wait until it costs you $70,000 for a text editor (that's been "certified").. that's where we're headed.

    Software "Engineering" is still in its infancy. It's like civil engineering was back hundreds of years ago. In order to create more s
    • heck, you sometimes can't trust software that costs $50k and upwards either; and actually, call me a monkey and spank my ass, but the more the software costs, the more probable it is that it was designed just for you (or just a very small group) and thus has gotten way less testing than widely adopted cheap software.

      but yes, software engineering is in its infancy, much more so than other fields of engineering were over 2000 years ago.

      who knows, maybe there will be new ways to develop programs that are mo
  • by penguin7of9 ( 697383 ) on Monday September 01, 2003 @07:35AM (#6843848)
    I think regulation is the wrong solution. A better solution is to hold companies responsible for security breaches.

    Everybody keeps passing the buck: businesses blame the software company, software companies blame hackers, and ultimately the taxpayer and customer end up paying for the incompetence and poor choices of the businesses.

    Businesses should be primarily responsible for the harm that arises from the software they choose. If they want to pass on the risk of their choice to the software company, that should require an explicit contractual agreement.

    And the government should get out of trying to regulate how software is written, and the government should get out of trying to catch "hackers".
    • If business is to be held responsible for its actions or inactions, it is not something that you want for IT only. When an accountant is held personally responsible for the integrity of money transfers, it will be the end of money laundering. We would see many more white-collar criminals. There is a huge interest in NOT making people and companies accountable for their actions.

      When you are to be held responsible for the security of your systems, you need management tools that enable you to do your job; to patch
  • Yes! Yes! (Score:4, Insightful)

    by moehoward ( 668736 ) on Monday September 01, 2003 @07:35AM (#6843849)
    Any user who does not patch daily and harms another due to not being patched should be punished. Here is how I think it should work....

    A few big ISPs should simply start cutting service to those who have been backdoored and are zombies, have opened virus-laden e-mails, or are otherwise infected and causing others problems -- for example, no firewall on an open, always-on connection. Especially cable modem ISPs and DSL providers should do this. It should be VERY heavily marketed... "If you don't patch and change your behavior, we cut you off without warning."

    My feeling is that by doing this, people will finally start learning how to patch and how to not open e-mail attachments. People will get firewalls and AV software ASAP.

    I have seen the threat of this work on a small scale. ISPs are dimwitted morons for not requiring this in the first place. How stupid to give a bunch of newbies loaded guns and then deny responsibility. Buy stock in firewall and AV companies!
    • Re:Yes! Yes! (Score:3, Insightful)

      by MadKeithV ( 102058 )
      "Any user who does not patch daily and harms another due to not being patched should be punished"

      95% of the people I have to go visit to solve serious computer-related problems wouldn't even know what the word "patch" means.

      To me, requiring the average joe user to be on top of his patches is like asking the average joe driver to stay on top of advancements in electronic motor-management technology. I just want to drive the damn car, fill up the water reservoir every now and then, and take the car in for r
      • I completely disagree. If my wife can click the "Install Patch" button when Windows XP pops it up, anyone can.

        How hard is that?

        Do your users have problems finding the "On" button? How do they possibly get any usefulness out of a computer if they are such complete dolts?

        The patch thing is not hard.

        If they are that idiotic, just turn on the switch for them so the patches install automatically. You are just being difficult and somehow want to turn your completely unrealistic experience into some jihad t
    • When I had DSL service, with a static IP address, in Allen, TX (a suburb of Dallas -- damn I miss living in Texas), my ISP made it clear to me that I should damn well get a firewall. Of course, (a) I knew this, (b) I let them know that, and (c) I told them I appreciated their efforts to educate their customers.

      Now, this wasn't perfect, as they didn't require me to use a firewall, but it was better than nothing.

      My ISP in Whitby, ON (Canada) (a suburb of Toronto -- damn I miss living in Texas), went a bit further: the

  • Solution - simple (Score:3, Interesting)

    by ajs318 ( 655362 ) <sd_resp2NO@SPAMearthshod.co.uk> on Monday September 01, 2003 @07:42AM (#6843879)
    The idea of software being distributed without warranty dates all the way back to the first ever spreadsheet. The software company's lawyers were worried that if someone used the programme to design a suspension bridge, and it later collapsed and investigation proved that it was due to a flaw in the software, they might get sued. Furthermore, it would have been a physical impossibility to test the software in all circumstances. These were the days of 2MHz 8080 processors, lest we forget.

    The sane response would have been "let them try, we'll never have what they're asking for and you can't be sued for what you've not got." Instead, that company explicitly disclaimed any warranty on their software, and the situation has persisted since. Today, one company is responsible for a lot of software, and they could easily afford to pay for several suspension bridge failures. But the law has not caught up with reality. The solution is simple and everyone will like it except the distributors of substandard software.

    My proposed solution is to require all software to be guaranteed to perform substantially as indicated on the packaging. If you buy any other product, and it doesn't do what the literature said it was going to do, then you are entitled to a refund.

    The only exception to the requirement for a guarantee would be where the source code is available for scrutiny. IMHO, reading the source code before deploying a mission-critical application is just Due Diligence. It has been stated by some that this is a lot of work to expect people to do. It is, but there is nothing to say independent bodies could not audit software for a fee. The GPL does not seek to prohibit anyone from making money out of their own work; only from misappropriating other people's work.

    Whilst stopping short of my Ultimate Ideal, I think this is a fair compromise. Most goods are required to be guaranteed, why should software be any different? But Open Source software is more like self-assembly furniture: you {or a suitably qualified person in your pay} can examine the pieces {source code} before they are put together {compiled and installed}, determine suitability for your application, and make a decision: use as-is, use slightly-modified or reject outright. You only get your money back on kit-built stuff if there are actually any pieces missing; everyone understands that circumstances of deployment are beyond the control of the supplier.
  • by Cooper_007 ( 688308 ) on Monday September 01, 2003 @07:46AM (#6843896)
    Here you have a company that goes out and makes some cool software that scratches an itch that a big chunk of the world also feels. The program is an instant hit, becomes #1 on CNET's download page, and the main coder gets his picture on the cover of Time (one can always dream).

    Suddenly a bug is discovered which will give others full control of your system. Acting quickly, the company creates a patch, puts a fixed version online, and posts warnings to all the regular places.

    Several weeks later an exploit program is seen in the wild, attacking systems owned by CLUELESS USERS who either never knew of the problem or were too lazy/overworked to fix it. The damage is immense, and in the current fingerpointing society most people blame this company even though they did everything that could reasonably be expected of them.

    And now a growing group of people feel the government should be breathing down this company's neck for not making secure software?

    Replace "company" with "group of OSS developers", and tell me how things should be different for this case, and why.

    Mirrors suck, huh?

    • The group of OSS developers have released the source code for anyone to examine and thereby determine its suitability for deployment in their application. If it does something you didn't expect, well, you should have read the source code, or paid an independent expert to read it for you.

      The closed source company, by refusing to show you the source code, have taken this responsibility entirely upon themselves.

      Also, when you write Open Source software, you are conscious that any mistake you make will be
  • by Korgan ( 101803 ) on Monday September 01, 2003 @07:49AM (#6843907) Homepage

    Get rid of the whole regulation issue. That's not necessary. It would be far better to make the software publisher liable for any faults or flaws in the software that led to an incident such as MSBlaster, Slammer or any number of other worms out there.

    Virii like SoBig.F are not something that can be avoided, because the vulnerability there is the user themselves. The only way to sort out virii like that is to educate users not to open email they are not expecting or don't recognise. Even then it's still a risk.

    If Microsoft were liable for the damages caused by the worms such as MSBlaster and Slammer because their software was vulnerable, don't you think their culture would change very rapidly? Instead of having the worst security reputation, they'd suddenly have the very best. Win2k3 is a good start in the right direction by disabling everything by default. I applaud that. Now they need to sort out their coding practices so that these sorts of issues are a non-event.

    Governments don't need to regulate anything. All they need to do is make it illegal for a company not to take responsibility for faulty products, regardless of the product. It worked in the automobile industry, it's worked in the medical industry, it's worked in the engineering industry.

    If my car explodes because of a fault in the fuel line at manufacturing, I'm perfectly within my rights to sue that company. If my computer becomes completely unusable because a vulnerability allowed someone to damage it or similar, why shouldn't I sue the publisher of that software? I'd also reserve the right to sue the person that exploited that vulnerability and caused the damage.

    Don't need regulation, just liability and a warranty of suitability for a purpose. 'This OS is guaranteed to perform to XXXXXXX level and is considered suitable for XXXXXXXXXX purpose.'

    • by sql*kitten ( 1359 ) * on Monday September 01, 2003 @08:05AM (#6843964)
      If Microsoft were liable for the damages caused by the worms such as MSBlaster and Slammer because their software was vulnerable, don't you think their culture would change very rapidly?

      Well, given that Microsoft had released patches for both of the vulnerabilities exploited by those two viruses long before the viruses were ever released, I'm not sure it even should be liable. Nothing helps if the sysadmins don't stay on top of things.
    • by ctid ( 449118 ) on Monday September 01, 2003 @08:25AM (#6844061) Homepage
      Get rid of the whole regulation issue. That's not necessary. It would be far better to make the software publisher liable for any faults or flaws in the software that led to an incident such as MSBlaster, Slammer or any number of other worms out there.

      This wouldn't work, because then no one could use (e.g.) Debian Linux, as there is no one company behind it. The right way to prevent security problems is to make sure that there is fair and open competition in the OS market. This way a company whose products are proven over and over to be unreliable and insecure (naming no names) would simply be overtaken by its competitors. Once the company saw the writing on the wall, it might decide to focus properly on security, or run the risk of being driven out of the market. To achieve this, companies who sell OSes and applications should be forced to open up their secret protocols and file formats to ensure that competition is fair. This will have the additional effect of allowing a more varied ecosystem of OSes on the internet, making it far more difficult for virus and worm writers to hit a majority of machines.


      Although these ideas would be good for competition and good for security and good for the economy, they won't happen because that is not how democracies work any more. Certain companies will buy political influence to prevent this happening. We are already seeing Microsoft claiming that it's "impossible" to create a secure computing platform without secure hardware. This sort of madness is likely to be the result of government intervention.

    • If my car explodes because of a fault in the fuel line at manufacturing, I'm perfectly within my rights to sue that company. If my computer becomes completely unusable because a vulnerability allowed someone to damage it or similar, why shouldn't I sue the publisher of that software? I'd also reserve the right to sue the person that exploited that vulnerability and caused the damage.

      That line defuses your whole argument. This is not a problem in the computer that rendered it dangerous without anything h
    • How would vendor liability affect open source software?
    • "The government has essentially relied on the voluntary efforts of industry both to make less-buggy software and make systems more resilient," says Michael A. Vatis, former director of the National Infrastructure Protection Center at the Federal Bureau of Investigation. "What we're seeing is that those voluntary efforts are insufficient, and the repercussions are vast."

    Wrong. It is not the voluntary developers (of Open Source) but the salaried developers at Microsoft that have the problems.

    The volunta

    • "What we're seeing is that those voluntary efforts are insufficient, and the repercussions are vast."

      I think that here "voluntary efforts" refers to businesses' efforts to handle security without regulations and laws forcing them to (i.e. 'voluntarily'), and doesn't refer to Open Source developers.

      Have a nice day.

      --
      Simon

  • by Anonymous Coward
    When M$ Windoze becomes fully warrantied (M$ can afford it), and most OSS coders don't dare accept liability for their software... "Why should we be using Linux for our company systems? It doesn't even come with a guarantee! On with the Windoze installation!"
  • Part of the charm of the internet has always been its lack of regulation. It has been the last frontier that we can still explore. There were parts of it that should have been labelled on the map, "Here be monsters and sea serpents". Now, it is becoming like the cow town the railroad has finally reached: the women have arrived, and they want to civilize the place. They want to hire a sheriff and close down the saloons. They want a dry goods store and a bank. The mountain men and adventurers who first came

    • If you want to explore, you'll have to start risking your life. Is it worth it? For some, it is.

      $/pound for space flight continues to drop. Right now it's about $3000-$10000 for LEO. Give it another decade and we'll hit $1000 or even $500/lb. Then life starts to get really interesting...
  • From the article:
    "What we're seeing is that those voluntary efforts are insufficient, and the repercussions are vast."

    Did I miss a meeting? Is this not Slashdot? I'm skimming through the posts and seeing a lot of clamoring that seems to approve of regulating software. IS NOT LINUX A VOLUNTARY EFFORT?! Hand this guy a copy of Knoppix and tell him to crack it!

    The biggest problem with computers on the internet today is the number of people who ran out and bought a computer because it looked like an interactive

  • by dpbsmith ( 263124 ) on Monday September 01, 2003 @08:46AM (#6844140) Homepage
    The cause of the current problem is only partially due to insecure Microsoft software. It is very noteworthy that Windows 98 and 95 were immune to the latest round of malware (W32/Blaster, W32/Welchia, W32/Sobig.F). The main cause is monoculture--the dominance of a single operating system, Windows NT and its variants.

    What we need is a truly competitive market in which many operating systems compete and no single operating system dominates; a market that uses many operating systems therefore demands and rewards interoperability and writing software to standards rather than to a single vendor's API.

    Why don't we have it? Because Microsoft was allowed to get a monopoly and the Justice Department is not doing its job and breaking it up.

    It wouldn't be any different if IBM were the dominant company--as it was a few decades ago--or Apple, or what have you.

    The problem is not Microsoft. The problem is monopolization. And the answer is not the free market--monopolies exist only when the market has already failed.
  • by R.Caley ( 126968 ) on Monday September 01, 2003 @08:47AM (#6844146)
    The main reason worms can cause such havoc is that they find themselves in a monoculture. We are in the software equivalent of the Irish potato famine.

    What the government should do is enforce diversity. Requiring every government department above some minimum size to use systems from at least three independent sources would be a start.

    • The main reason worms can cause such havoc is that they find themselves in a monoculture.
      That makes it easy for the worms to cause havoc. The main reason worms cause so much havoc is the tendency to try to hide stuff from everybody.
      CDs use your computer to install and set up stuff so they can play themselves.
      File extensions are hidden so you can click on a presumably well-named file and have the spreadsheet show up.
      A general tendency to have to click on everything to be sure you don't miss something.
      A beli
      • The main reason worms cause so much havoc is the tendency to try to hide stuff from everybody.

        That only really applies to viruses, not worms, except in the sense that the low levels are (thankfully) hidden from users (I have no wish to have to think about kernel device drivers while I'm trying to write an email).

        In any case, if it were not for the monoculture, viruses would only be a way to screw up your own computer. That would be a Good Thing; hopefully you'd learn something. The monoculture all

  • by deafgreatdane ( 231492 ) on Monday September 01, 2003 @08:56AM (#6844188)
    "There are three major things every consumer and user of computers needs to do," Scott Charney, the security chief for Microsoft, said. "One, get antivirus software and keep it up to date. [...]"
    I think this spin is one of the biggest problems with the public perception of computer security.

    I find it appalling that we tolerate anti-virus software as a necessary solution. IMO, every virus is an exploitation of a bug in the software, and the original vendor should be responsible for fixing the hole that allowed the virus to exist.

    Why doesn't the press focus on the hypocrisy of holding software vendors more accountable for fixing their problems while, at the same time, advocating paying a third party to fix the same problems?

    I about blew my top when fixing my in-laws' machine for a case of Blaster, and MS so "conveniently" linked to one of the trusted anti-virus sites that offered removal tools. If it's Microsoft's hole, why don't they provide a cleanup method?

    (This is not to say we shouldn't have virus filters on SMTP and firewalls - there's nothing wrong with trying to block the spread of virii through multiple means)

  • The only way Linux, FreeBSD, and all of the other operating systems that have appeared over the years were possible is because of the lack of government regulation. Once the government steps in, it will only stifle creativity and limit consumer options.

    Who is best to deal with government regulations? Microsoft.

    Thanks, but no thanks. This issue will work itself out. We are in our growing stages. The government is not a solution to everything... actually, not much at all, really.
  • by SysKoll ( 48967 ) on Monday September 01, 2003 @09:26AM (#6844312)
    Typical NYT drivel. A problem pops up? The Times clamors for government regulation. Astonishingly, when faced with a dramatic, err, bug in its own journalist-monitoring activity [worldnetdaily.com], the NYT doesn't call for the gummint to create a Journalism Ethics Control Board. But these programmer guys? Yeah, they need to be kept under control.

    The gummint will be only too happy to oblige and produce several layers of inefficient, costly, slow, slightly corrupt bureaucracy that will not solve the problem but will never disappear. As usual.

    Let us put on our bureaucrat hat and see what can be done, in the immortal tradition of public service that gave us the Transportation Security Administration. Let's see. Strip-search programmers when they come to work in case they bring a copy of 2600? Have them remove their shoes? A nice start, but not enough.

    See, the problem is that scumbags are writing programs that are up to no good. No scumbag coding, no worms and viruses, eh? So let's put all compilers under lock. Let's make sure that scripting languages only accept input scripts that have been digitally signed by a new Programming Safety Authority. Let's make it a crime to use a computer without PSA-approved tools. Each program has to be certified by the PSA. Use the TCPA and Palladium chips to lock out all the bastards using non-PSA software and operating systems. Ban all non-Palladium computers and electronics. Do an FBI criminal check on each person entrusted with a compiler. And of course, recruit thousands of new civil servants to enforce all these new rules, at a low, low cost of [#insert eye-popping budget that will be overrun anyway]. There you have it: secure computing. A bit harsh, but it's for our safety, isn't it?

    If you think the above is funny, I am sorry. I meant it to be ironic in a chilling way. Because when you start involving the government in a human activity, you never know how the bureaucrats are going to warp it.

    So I'm gonna speak slowly so that even New York Times journalists can understand: KEEP GOVERNMENT OUT OF COMPUTING. Got it?

    -- SysKoll
  • by scruffy ( 29773 ) on Monday September 01, 2003 @09:33AM (#6844343)
    Software engineering is unlike a lot of other engineering in that no one can predict with much certainty what a large program is going to do. This lack of certainty is not just bad engineering; it is a mathematically proven law of software. Add to that the fact that each computer runs a slightly different set of programs and is connected to a slightly different set of peripherals, and you have an even more impossible problem.

    Software on airplanes works reasonably well because they test the hell out of it and two airplanes of the same model are pretty much the same. Also, the users of the software (airplane crews) are well-trained. The extreme testing and thorough training, though, make it very expensive. I don't think we can afford to hire a software engineer and tutor for each household.

    I would be afraid that regulation would not fully take into account the difficulties of making perfect software and dealing with untrained users.

  • by cowbutt ( 21077 ) on Monday September 01, 2003 @10:05AM (#6844517) Journal
    I do feel that, for a number of reasons, regulation will probably be the only way to make proprietary software vendors improve the quality of their products.

    But on the other hand, if other industries are examined, such regulation will only turn into a further barrier to entry for new entrants to the market and for non-commercial (i.e. Free and Open Source) software.

    I already see this when trying to sell FOSS solutions to the public sector, who invariably have successful "Common Criteria" evaluation as a "nice to have" (at least - in some cases it's mandatory).

    Getting these evaluations done is expensive, so only the big boys get to play... Ironically, the people I talk with know that FOSS solutions are usually at least as secure as the products on their approved list, but their hands are tied by regulations and auditors.


  • Now watch as... (Score:4, Insightful)

    by Kyouryuu ( 685884 ) on Monday September 01, 2003 @10:12AM (#6844553) Homepage
    Now watch as Bill Gates and his cronies push for Trusted Computing, the Palladium project. After all, it's never Microsoft's fault that the bugs exist, right? It's always those darned users, and by George we need to foolproof the system. Please. Trusted computing is a joke. It is a power play by top industry corporations to seize power and act as yet another cohesive monopoly in a so-called free market. Just like the RIAA. Just like the MPAA.

    Here's a thought. Hold the software companies responsible for their own goofups and bugs. Let the people sue. Let the people file their class action lawsuits against Microsoft for their errors. But don't let the government take control.

    I don't want the ignorant US government, or any government for that matter, looking over the Internet and infringing on it any more than they already are. Half of those farts probably don't even know what the Internet is. I can't say I'd want these clueless individuals, easily motivated by legal bribery (lobbies) and big business (Palladium), to be involved. They will only serve to screw things up, pass ridiculous laws, and tax Internet commerce to death. Let the Internet be that one place government is unable to corrupt.

    The problem is that the people who aren't on the Internet, the people who take a passive interest in computers, are ignorant of these facts. That's why I feel, unfortunately, that things like Palladium are destined to pass. Microsoft and others are going to get these bills through the door while the politicians are still ignorant about computers.

    I'd like to say we can stop them, but we don't have a $47 billion lobbyist group behind us.
  • One thing that would help is LESS homogeneity. If everyone is running the same software then a single flaw makes the entire network vulnerable. If we simply had more competition, the effects of such attacks would be less significant. If there were more competition, people would be more willing to drop the insecure products in favor of one their friends are using that doesn't have the problem. Yes, there will be more problems, but not as far reaching.

    One place regulation would help is in mandating open standards

  • I had some buffer overflow problems in my mail client. Silly me. I ran splint and considered hiring another programmer (preferably one who knows what splint is and how to use it). (See the sketch at the end of this post for the sort of thing it catches.)

    Now I've changed my mind. Instead of adding engineers who sit at computers and write C, I'm going to add bureaucrats who look over their shoulders and produce Word documents.

    Fortune 500, here I come!

    [This post is closed-captioned for the sarcasm-impaired. All of the previous was SARCASM.]
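
    (Serious footnote, and purely illustrative -- not my mail client's actual code: the canonical thing a checker like splint complains about is the unbounded read, roughly like this.)

        #include <stdio.h>

        int main(void) {
            char line[256];

            /* gets(line); -- the sort of unbounded read a checker flags:
               no limit on input length, so any long line overruns the
               buffer. */

            /* Bounded replacement: fgets() reads at most sizeof line - 1
               characters and always NUL-terminates. */
            if (fgets(line, sizeof line, stdin) != NULL)
                fputs(line, stdout);
            return 0;
        }
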

  • by hackus ( 159037 ) on Monday September 01, 2003 @11:13AM (#6844841) Homepage
    After developing applications for a wide variety of banking institutions, it became clear that:

    1) The only way to develop secure software systems is to proactively secure the systems once they are deployed.

    2) To proactively and continuously review and examine such systems, you must have the source code and build tools and access to the hardware engineering requirements of the systems involved.

    3) The only known process where this can be achieved is through Open Source.

    Closed binary proprietary software is not secure, cannot be MADE secure, is impossible TO secure, and with patent and copyright laws as written it is quite possible you could be SUED for securing the software yourself.

    Security became an extension of the software engineering process for the company I started previously, and it involved reviewing the source code and making changes, performing attacks, etc.

    Critical to this process was having as many eyes and opinions looking at the source code as possible. The more experienced professionals who had a chance to offer advice and opinions on the code, the better and more secure the code became.

    An entire portion of the software engineering process cannot even be done with proprietary software, and I personally, as a CIO, declared proprietary binary-only software sales DOA in this industry two years ago.

    -Hack
  • by Lucas Membrane ( 524640 ) on Monday September 01, 2003 @01:04PM (#6845331)
    How about revitalizing the role of government specs in government purchasing? The government is such a big customer that if it simply stopped buying system software products that presented too big a risk, the large vendors would find it advantageous to provide software that didn't.

    This worked for accessibility. When 11 state governments said that they would stop buying software with lousy accessibility for persons with disabilities, big software vendor(s) finally did something about it. Why shouldn't it also work for security???

    This approach used to bring big advantages to the private sector, as manufacturers had to learn to do the right thing on many products. It has lost its impact recently, as the government has given in to business by buying COTS, no questions asked.

"Someone's been mean to you! Tell me who it is, so I can punch him tastefully." -- Ralph Bakshi's Mighty Mouse

Working...