From: Simon Roust
Hi, I'm currently designing a website that will show course results
to students at the end of their course. One of the options we want to
give them is the ability to download a PDF version of their "result
letter" (rather than posting it out to them, which saves us money).
The PDF will be generated on request, for those students who want it,
by calling a webservice on another server. The call will probably
take a few seconds to execute: the data has to be retrieved from the
database, suitably formatted, and run through some PDF template code
to generate the document.
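
For concreteness, here is roughly how I imagine the download page
working (just a minimal sketch). ResultLetterService and
GetResultLetter are made-up names standing in for the proxy that
"Add Web Reference" will generate against the real service, and the
student lookup is only a placeholder:

// Code-behind for a hypothetical ResultLetter.aspx page.
using System;
using System.Web.UI;

public partial class ResultLetter : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Placeholder: however we end up identifying the logged-in student.
        int studentId = int.Parse(Context.User.Identity.Name);

        // Stand-in proxy names; the real ones come from "Add Web Reference".
        ResultLetterService svc = new ResultLetterService();
        svc.Timeout = 30000;                         // don't wait forever on the remote server
        byte[] pdf = svc.GetResultLetter(studentId); // the call that takes a few seconds

        // Stream the generated PDF straight back to the student.
        Response.Clear();
        Response.ContentType = "application/pdf";
        Response.AddHeader("Content-Disposition",
            "attachment; filename=result-letter.pdf");
        Response.BinaryWrite(pdf);
        Response.End();
    }
}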

We are a bit worried about overloading our PDF generation server with
requests: we get a big peak load when we release the results, as
students are naturally anxious to see how they did, and unfortunately
we don't have the option of spreading the release out gradually.

So we'd like to build in a mechanism to throttle the number of
simultaneous webservice calls we make from the website (the exact
limit will be decided once we see what sort of impact the calls have
on the server during volume testing). It has been deemed acceptable
to show the user a "Sorry, we are busy, please try again later."
message when we run out of capacity.

Can anybody give me some advice on how best to manage the number of
concurrent requests we make to the webservice? I think I'll need
some sort of reference counting to track how many ASP.NET threads
have started a call to the webservice and how many have completed
(a rough sketch of what I have in mind is below). Or is there an
easier / better option? We don't want to simply limit the number of
ASP.NET worker threads, as we'd like to keep serving "simple pages"
to other clients even when we're "busy".
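
To make the reference-counting idea concrete, here is a minimal
sketch of what I'm picturing, using a System.Threading.Semaphore
(available in .NET 2.0) as the counter. The limit of 10 is only a
placeholder until we've done the volume testing:

using System.Threading;

// Caps the number of simultaneous calls to the PDF webservice.
public static class PdfServiceThrottle
{
    private const int MaxConcurrentCalls = 10;   // placeholder; tune after volume testing
    private static readonly Semaphore Gate =
        new Semaphore(MaxConcurrentCalls, MaxConcurrentCalls);

    // Grab a slot if one is free; give up immediately (0 ms wait) if not.
    public static bool TryEnter()
    {
        return Gate.WaitOne(0, false);
    }

    public static void Release()
    {
        Gate.Release();
    }
}

The page that calls the webservice would then wrap the call like this:

if (PdfServiceThrottle.TryEnter())
{
    try
    {
        byte[] pdf = new ResultLetterService().GetResultLetter(studentId);
        // ...stream the PDF to the response as above...
    }
    finally
    {
        PdfServiceThrottle.Release();
    }
}
else
{
    // Out of capacity: show the "Sorry, we are busy" page instead.
}

Because only the PDF page goes through this gate, the rest of the
site should keep serving normal pages even when we hit the limit.
But I'm not sure whether a static Semaphore like this is the right
tool, or whether there's something built in that I'm missing.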

The website will be developed using ASP.NET 2.0 + Visual Studio 2005
(or perhaps VS 2008 targeting the .NET 2.0 framework).

Thanks in advance for your thoughts.

Kind regards,

Simon