Web Server Stress Testing: Tutorial/Review
darthcamaro writes "I found an interesting article on builder.com that suggests laying 'Siege' to your server to help you set up a site to withstand /.
"One of the great fears for many Web developers, and even more so for Web server admins, is watching their sites brought down by an avalanche of traffic. The traffic may be from a DoS attack, or maybe the site just got slash-dotted. Bottom line: It just isn't available.""
I will find immensely humorous if that site dies (Score:1, Funny)
Re:I will find immensely humorous if that site die (Score:1)
Sweet.. (Score:2)
Not only does this save time, but hey, it saves money.
DDoS (Score:3, Funny)
** Siege 2.59
** Preparing 25 concurrent users for battle.
The server is now under siege...
In other news, domain/hosting company mydomain.com [mydomain.com] was under a heavy DDoS attack; the attacks are believed to have been carried out by members of a geek news website called Slashdot.
sco'd (Score:1)
2. gunzip and untar
3. ./configure
4. make, make install
5. $ siege -c200 -t720M www.thescogroup.com
** Siege 2.59
** Preparing 200 concurrent users for battle.
The server is now under siege...
Re:sco'd (Score:2)
Re:sco'd (Score:2)
I still like The Grinder better (Score:5, Informative)
The Grinder [sourceforge.net], on the other hand, allows for distributed workers, following the same or different 'scripts', all controlled from a single console. It provides a slew of configuration options and all sorts of data at your fingertips. The scripts are written in Jython [jython.org], which is easy to learn and very flexible. If you want to stress test a complex app, especially something interactive or requiring sessions, check out The Grinder; it's a godsend.
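For what it's worth, a minimal Grinder 3 script in the shape of the simple HTTP example from its own documentation looks roughly like this (Jython; the URL and test name are placeholders, and it only runs inside a Grinder worker process, not standalone):

# Jython script in the style of The Grinder 3's simple HTTP example.
# Runs inside a Grinder worker; URL and test name are placeholders.
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

# Associate an HTTPRequest with test number 1 so the console can report on it.
test1 = Test(1, "Front page")
request1 = test1.wrap(HTTPRequest())

class TestRunner:
    # Each simulated user runs this once per iteration.
    def __call__(self):
        request1.GET("http://test.example.com/")

The console pushes the same script (or different ones) out to all the worker agents, which is where the distributed part comes in.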
stress testing tools (Score:5, Informative)
From the article:
Siege now supports a new proxy tool called Sproxy, which harvests URLs for testing. The premise behind Sproxy doesn't make much sense to me... Personally, I prefer Scout for getting my URLs, since it just goes through a site's links and adds them that way.
The advantage of using a browser to set up your test plan is that it better simulates real traffic patterns on your site. Microsoft's Application Test Center [c-sharpcorner.com] does this, and JMeter has a proxy server [apache.org] similar to Sproxy.
When you're trying to replicate problems with a live site, however, it would seem more appropriate to me if you could base your test on real traffic to the site. I wrote a load testing tool once that used web logs to simulate the actual traffic patterns, but it was incomplete, mostly because web logs don't record POST data. A good stress tool could come with an Apache/IIS/Tomcat plugin that recorded traffic for use in stress testing.
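As a rough sketch of that idea in Python (assuming an Apache common-format access_log and a placeholder target host; only GETs get replayed, since the POST bodies never make it into the log):

# Replay GET requests from an Apache common-format access log against a
# test host. The target host is a placeholder; POST bodies aren't logged,
# so POSTs are simply skipped.
import urllib.request

TARGET = "http://test.example.com"

with open("access_log") as log:
    for line in log:
        fields = line.split()
        if len(fields) < 7 or fields[5] != '"GET':
            continue                  # skip POSTs and malformed lines
        try:
            urllib.request.urlopen(TARGET + fields[6]).read()
        except IOError:
            pass                      # keep replaying even if a URL errors out

Pacing the requests to the log's timestamps would get you even closer to the real traffic pattern, but even this naive replay exercises the same URLs in the same order.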
Re:stress testing tools (Score:2)
Assuming a standard Apache common log format, you can just install Siege and then run:
awk '{print "http://www.iddl.vt.edu" $7}' access_log > /usr/local/etc/urls.txt
Run siege using this file, and you have it running based on actual traffic on your site. I just threw this together, and initial testing shows that it can work this way.
I don't know what's happening... (Score:3, Funny)
siege is only the first layer (Score:2, Insightful)
but there are other things to make sure of, like whether the right information is being returned
check out the Perl module called webtest, or something like that
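As a minimal sketch of that kind of content check (placeholder URL and marker text, not tied to any particular module):

# Don't just check that the server answers under load -- check that it
# returns the right content. URL and expected text are placeholders.
import urllib.request

URL = "http://test.example.com/"
EXPECTED = "Welcome"

body = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")
if EXPECTED in body:
    print("content check passed")
else:
    print("server answered, but the page is missing the expected text")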
Yeesh (Score:1)
Re:Yeesh (Score:1)
As reported in a previous story [slashdot.org], Google linked their main logo graphic to an academic information site and brought it down [swin.edu.au]. Subsequently, Slashdot hit the same site [swin.edu.au], but it didn't hold a candle to Google. Fortunately, such attacks by Google are rare. Of course, there is no way to determine your risk of a Google attack, unlike Slashdot attacks.
The best idea is
Re:Yeesh (Score:1)
Most people can't afford to keep their personal servers ready to handle 1% of the load that Google's image fiasco or 10% of a popular article on
Should those people be penalized by not being able to have their own site (rather than surrendering control to a bunch of
Re:Yeesh (Score:2, Interesting)
Perhaps
Perhaps Google could add a new piece to the stale robots.txt standard like "cache-link-only" so that Google would know the author was only inter
Web Testing should Include External Traffic (Score:3, Insightful)
Most servers can handle a greater load than their network connection can. To run a proper test, you need to test from outside your routers and firewall. This means the test machines should be located outside your local area network during testing, or at least a meaningful percentage of them should be, for statistical purposes.
Odds are most people are going to work within the LAN and lay Siege to their machines but forget that there is an outside world.
hmmm... not what I expected (Score:2)
used to be a great tool for this (Score:2, Interesting)
ftp.heanet.ie (Score:1)
Takes a bit to get into the discussion though
The relevant system has 2 TB of data (6 TB of space); max recorded throughput is 550 Mb/s, with over 20,000 concurrent HTTP requests.
Mercury Interactive tools (Score:1)
Something good from Microsoft ... (Score:1)