#17, 04-21-2012, 04:03 PM
mazex (Approved Member)
Join Date: Oct 2007 | Location: Sweden | Posts: 1,342

Quote:
Originally Posted by CaptainDoggles
I'm not sure that having multiple people hammering the server over and over will do anybody any good.
Well, it does a simple HTTP GET of a VERY small page once every 10 minutes... Let's say 1,000 people used this app (no way that will happen!). That would be 6,000 requests per hour. I think you could host a web server on an old 386 that would handle that load and still sit at idle... How many requests per hour do you think this site gets? Even a really small web server can do 100 requests per second, and that's for "real" pages, not just a file list a few kB in size with no database backend involved, etc...
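
Just to make the arithmetic concrete, here is a rough sketch of what that kind of polling client boils down to. The URL, interval and error handling are placeholders of mine, not the actual app's internals:

Code:
# Minimal sketch of a client that polls a small static page every 10 minutes.
# CHECK_URL and the timeout are illustrative assumptions, not the real app's values.
import time
import urllib.request

CHECK_URL = "http://example.com/filelist.txt"  # hypothetical endpoint, a few kB of plain text
INTERVAL_SECONDS = 10 * 60                      # one GET every 10 minutes per client

def fetch_file_list(url: str) -> str:
    """Do a plain HTTP GET and return the body as text."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

if __name__ == "__main__":
    # Back-of-envelope server load: 1,000 clients * 6 requests/hour each
    # = 6,000 requests/hour, i.e. under 2 requests per second on average.
    while True:
        try:
            listing = fetch_file_list(CHECK_URL)
            print(f"Fetched {len(listing)} bytes")
        except OSError as exc:
            print(f"Check failed: {exc}")
        time.sleep(INTERVAL_SECONDS)

Even if the clients aren't synchronized at all, 6,000 tiny static GETs per hour averages out to well under 2 requests per second, which is noise for any web server.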

EDIT: I do understand that if you don't work with this kind of thing, it could seem like quite a hammering for a web server! I remember a project about 10 years ago where we built an Internet banking site running on two P3s with 512 MB of memory each. They could handle 1,500 concurrent users with almost no load on the web server side (C++/ISAPI DLLs)... The databases did start to run hot at that number of users, though (around 100 requests per second)... On sites today we see loads of over 700 requests per second in the same scenario, but of course that involves a cluster of servers.

Last edited by mazex; 04-21-2012 at 04:42 PM.