Web Server Stress Testing: Tutorial/Review

Posted by Hemos
from the building-the-machine dept.
darthcamaro writes "I found an interesting article on builder.com that suggests laying 'Siege' to your server to help you set up a site that can withstand /. 'One of the great fears for many Web developers, and even more so for Web server admins, is watching their sites brought down by an avalanche of traffic. The traffic may be from a DoS attack, or maybe the site just got slashdotted. Bottom line: it just isn't available.'"
  • by Anonymous Coward
    Forward /. hordes! You have a mission now!
  • No more trial and error for this guy. What I would do is purchase large amounts of traffic, tweak, then gradually work my way up to larger and larger numbers.

    Not only does this save time, but hey, it saves money.
  • DDoS (Score:3, Funny)

    by Dreadlord (671979) on Tuesday March 23, 2004 @11:18AM (#8644912) Journal
    $ siege -c25 -t1M www.mydomain.com
    ** Siege 2.59
    ** Preparing 25 concurrent users for battle.
    The server is now under siege...


    In other news, domain/hosting company mydomain.com [mydomain.com] was under a heavy DDoS attack; it's believed that the attacks were carried out by members of a geek news website called Slashdot.
  • 1. download
    2. gunzip and untar
    3. ./configure
    4. make, make install
    5. $ siege -c200 -t720M www.thescogroup.com
    ** Siege 2.59
    ** Preparing 200 concurrent users for battle.
    The server is now under siege...
  • by monkeyserver.com (311067) on Tuesday March 23, 2004 @11:38AM (#8645117) Homepage Journal
    I looked through the article; it doesn't look like much more than a slightly sophisticated wget for loop :). Seriously though, this seems similar to a few other basic stress testers out there. For the projects I've worked on, you need session management, interactive processes, and so on; hitting 5 URLs isn't gonna stress test anything of value.
    The Grinder [sourceforge.net], on the other hand, allows distributed workers following the same or different 'scripts', all controlled from a single console. It provides a slew of configuration options and all sorts of data at your fingertips. The scripts are Jython [jython.org], which is easy to learn and very flexible. If you want to stress test a complex app, especially something interactive or requiring sessions, check out The Grinder; it's a godsend.
  • stress testing tools (Score:5, Informative)

    by HaiLHaiL (250648) on Tuesday March 23, 2004 @11:54AM (#8645283) Homepage
    Another great tool for stress testing your site is Jakarta JMeter [apache.org]. Gives you a nice GUI for watching your response times plummet as your site is pummeled.

    From the article:
    Siege now supports a new proxy tool called Sproxy, which harvests URLs for testing. The premise behind Sproxy doesn't make much sense to me... Personally, I prefer Scout for getting my URLs, since it just goes through a site's links and adds them that way.
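    The scout-then-siege workflow mentioned above can be sketched as follows. This is a sketch from memory of Siege 2.x: the scout invocation and the -f/-c/-t flags may differ by version (check siege(1)), and www.example.com is a stand-in hostname:

```shell
#!/bin/sh
# Sketch: harvest a site's links with scout, then replay them with siege.
# Assumption: scout writes one URL per line to stdout.
site="http://www.example.com"
urlfile="urls.txt"

if command -v scout >/dev/null 2>&1; then
    scout "$site" > "$urlfile"       # crawl the site's links
else
    echo "$site/" > "$urlfile"       # fallback so the sketch still runs
fi

# Lay siege with 25 concurrent users for one minute, if siege is installed.
if command -v siege >/dev/null 2>&1; then
    siege -c25 -t1M -f "$urlfile"
fi
```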

    The advantage of using a browser to set up your test plan is that it better simulates real traffic patterns on your site. Microsoft's Application Test Center [c-sharpcorner.com] does this, and JMeter has a proxy server [apache.org] similar to Sproxy.

    When you're trying to replicate problems with a live site, however, it would seem more appropriate to me if you could base your test on real traffic to the site. I wrote a load testing tool once that used web logs to simulate the actual traffic patterns, but it was incomplete, mostly because web logs don't record POST data. A good stress tool could come with an Apache/IIS/Tomcat plugin that recorded traffic for use in stress testing.
    • When you're trying to replicate problems with a live site, however, it would seem more appropriate to me if you could base your test on real traffic to the site.

      Assuming a standard apache common log format, you can just install siege, then run:

      awk '{print "http://www.iddl.vt.edu" $7}' access_log > /usr/local/etc/urls.txt
      Run siege using this file, and you have it running based on actual traffic on your site. I just threw this together, and initial testing shows that it can work this way.
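    If you go this route, a slightly stricter version of the parent's one-liner may help: since web logs don't record POST bodies anyway, it can filter for successful GETs only. Field positions assume Apache common log format ($6 is the request method with its leading quote, $7 the path, $9 the status), and the hostname is a stand-in:

```shell
# Keep only successful GET requests from an Apache common-format access_log.
# $6 == "\"GET" matches the method including the leading quote; $9 is the status.
awk '$6 == "\"GET" && $9 == 200 {print "http://www.example.com" $7}' \
    access_log > urls.txt
```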
  • by Kopretinka (97408) on Tuesday March 23, 2004 @12:32PM (#8645739) Homepage
    Why can't I load the article?
  • stress testing based on concurrent users hitting a script etc. is fine,
    but there are other things to make sure of, like returning the right information

    check out the Perl module called webtest or something like that
  • by Jahf (21968)
    Any author with even remotely geek-interesting material on their site who isn't on a dedicated box with a full T3, and who doesn't reject any request referred by /., is behind the times anyway.
    • However, that cannot prevent an attack by Google. You wouldn't want to block requests referred by google.com, because you do want people to find your site, right?

      As reported in a previous story [slashdot.org], Google linked their main logo graphic to an academic information site and brought it down [swin.edu.au]. Slashdot hit it subsequently [swin.edu.au], but that didn't hold a candle to Google. Fortunately, such attacks by Google are rare. Of course, there is no way to gauge your risk of a Google attack, unlike Slashdot attacks.

      The best idea is
      • by Jahf (21968)
        Depends on if you care if people see your site ... I know one guy who takes all traffic referred by Google, /. and a couple of other sites (he occasionally publishes tech goodness) to the Google cached version of his page.

        Most people can't afford to keep their personal servers ready to handle 1% of the load that Google's image fiasco or 10% of a popular article on /. can throw at them.

        Should those people be penalized by not being able to have their own site (rather than surrendering control to a bunch of
        • Re:Yeesh (Score:2, Interesting)

          by Jahf (21968)
          btw I doubt even the referer->GoogleCache mechanism would save most sites from the inadvertent DDoS that Google provided by that image link. Just one more argument for Google and /. being better citizens.

          Perhaps /. could wait to publish a story until Google had it cached and then give the -option- in a user pref to allow links to be rewritten to the Google cache ...

          Perhaps Google could add a new piece to the stale robots.txt standard like "cache-link-only" so that Google would know the author was only inter
  • by stecoop (759508) on Tuesday March 23, 2004 @04:53PM (#8648860) Journal
    One item this article didn't explicitly look at was the network saturation percentage.

    Most servers can handle a greater load than the network connection can carry. To run a proper test, you need to generate traffic from outside your routers and firewall. This means the test machines should sit outside your local area network during testing, or at least a measured percentage of them should, for statistical purposes.

    Odds are most people are going to work within the LAN and lay Siege to their machines but forget that there is an outside world.
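    A back-of-the-envelope check makes the parent's point concrete; all three numbers below are assumptions for illustration, not figures from the article:

```shell
# Does the uplink saturate before the server does? (illustrative numbers)
avg_page_kb=50        # assumed average response size in KB
reqs_per_sec=30       # assumed sustained request rate
uplink_mbps=10        # assumed uplink capacity

needed_kbps=$((avg_page_kb * 8 * reqs_per_sec))  # KB/s -> kilobits/s
uplink_kbps=$((uplink_mbps * 1000))

echo "need ${needed_kbps} kbps, have ${uplink_kbps} kbps"
```

    With these numbers the test needs 12000 kbps against a 10000 kbps pipe, so the link saturates long before the server; a siege run from inside the LAN would never show that.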
  • I was expecting an article about stressing out a server by putting MS Exchange on it... But this is good too.
  • The best stress tester was from a company called Envive, whose focus was distributed attacks, with server time and space all over the world. You write a script, then watch the attack from a web browser. Proof positive that Siege is more popular, though: they went out of business.
  • http://www.linux.ie/pipermail/ilug/2004-January/009863.html [linux.ie]

    Takes a bit to get into the discussion though

    Relevant system has 2TB of data (6TB of space); max recorded throughput is 550Mb/s, with over 20,000 concurrent HTTP requests.
  • As a test consultant working in all areas of automated testing (as opposed to manual 'tick the box' testing), I do most of my load or stress testing using the industry-standard tool LoadRunner [mercuryinteractive.com]. I've used all the other load testing tools, and this is by far the best (albeit pretty expensive); for large-scale commercial projects, nothing even comes close.
  • ... but they couldn't sell it, so they stopped development. The MS Web Application Stress Tool ("webtool") is worth a look. It's free and does a lot: http://www.microsoft.com/technet/itsolutions/intranet/downloads/webstres.mspx
