Quote:
Originally Posted by CaptainDoggles
I'm not sure that having multiple people hammering the server over and over will do anybody any good.
Well, it does a simple HTTP GET of a VERY small page once every 10 minutes... Let's say 1000 people would use this app (no way that will happen!). That would give 6000 requests per hour. I think you could host a web server on an old 386 and it would handle that load while sitting practically idle... How many requests per hour do you think this site gets?
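
Just to show the pattern I mean, here's a rough sketch (not the actual app, and the URL is just a placeholder) of one tiny GET every 10 minutes:

Code:
# Minimal sketch of the polling pattern: one small HTTP GET every 10 minutes.
# The URL below is a made-up placeholder, not the real endpoint.
import time
import urllib.request

POLL_URL = "http://example.com/status.txt"    # hypothetical endpoint
POLL_INTERVAL_SECONDS = 10 * 60               # once every 10 minutes

while True:
    try:
        with urllib.request.urlopen(POLL_URL, timeout=10) as resp:
            body = resp.read()                # a very small page, a few kB at most
            print(f"fetched {len(body)} bytes")
    except OSError as exc:
        print(f"poll failed: {exc}")          # skip this cycle, try again next time
    time.sleep(POLL_INTERVAL_SECONDS)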

A really small web server can do 100 requests per second, and that's for "real" pages with a database backend behind them, not just a static file listing a few kB in size with no database involved...
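
And the back-of-envelope math, just to show how small the numbers are (using the 100 requests per second figure for a modest server):

Code:
# Rough load estimate: 1000 users polling every 10 minutes vs. a modest
# server that handles ~100 requests per second.
users = 1000
polls_per_user_per_hour = 60 / 10                      # one GET every 10 minutes

requests_per_hour = users * polls_per_user_per_hour    # 6000
requests_per_second = requests_per_hour / 3600         # ~1.7

server_capacity_rps = 100                              # the "really small server" figure
print(f"{requests_per_hour:.0f} req/hour ~ {requests_per_second:.2f} req/s")
print(f"headroom: ~{server_capacity_rps / requests_per_second:.0f}x capacity")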
EDIT: I do understand, though, that if you don't work with this kind of stuff it could seem like quite a hammering for a web server! I remember a project 10 years ago where we built an Internet banking site that ran on two P3s with 512 MB of memory. They could handle 1500 concurrent users hammering them with almost no load on the web server side (C++/ISAPI DLLs)... The databases did start to run hot at that number of users, though (at about 100 requests per second)... On sites today we see loads of over 700 requests per second in the same kind of scenario... But of course that involves a bunch of clustered servers.