From: Crown Royal on 15 Sep 2010 08:17
A client has a web server, and ever since we updated .NET from 2.0 to 3.5, the
default application pool stops after a while. The users get a "Service
Unavailable" error; once the default application pool is restarted, everything
is back to normal. I've searched all over to find an answer to fix this, and
although it seems like a common problem, no one has an answer.
Can anyone provide help?
From: Brian Cryer on 15 Sep 2010 09:24
"Crown Royal" <CrownRoyal(a)discussions.microsoft.com> wrote in message
>A client has a web server, and ever since we updated .NET from 2.0 to 3.5,
> the default application pool stops after a while. The users get a "Service
> Unavailable" error; once the default application pool is restarted,
> everything is back to normal. I've searched all over to find an answer to
> fix this, and although it seems like a common problem, no one has an
> answer.
> Can anyone provide help?
A couple of thoughts:
1. Don't mix applications that target different versions of .NET in the same
pool. This might account for your problem.
2. (A variant of the above) Current wisdom is to have a separate application
pool for each application. If you move each application into its own pool,
does the problem go away?
IIS 7 creates a separate application pool for each application by default,
whereas IIS 6 doesn't.
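If the server is running IIS 7, moving an application into its own pool can be scripted with appcmd. This is only a sketch - the pool name, site name, and application path below are examples, not your actual values. Note that .NET 3.5 runs on the CLR 2.0 runtime, so v2.0 is the correct managedRuntimeVersion for a 3.5 app:

```shell
REM Create a dedicated application pool (pool name is an example)
%systemroot%\system32\inetsrv\appcmd add apppool /name:MyAppPool /managedRuntimeVersion:v2.0

REM Move the application into the new pool (site and app path are examples)
%systemroot%\system32\inetsrv\appcmd set app "Default Web Site/myapp" /applicationPool:MyAppPool
```

On IIS 6 the equivalent is done through the IIS Manager UI (right-click Application Pools > New > Application Pool, then change the pool on each virtual directory's properties).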
My guess is that if you give each application its own application pool, either
the problem will go away or you will find that one application is misbehaving
and crashing - although if that were the case, I'd expect to see some evidence
of it in the event logs.
Hope this helps.