From: John Bell on
On Tue, 6 Apr 2010 11:50:37 -0700, "tshad" <tfs(a)dslextreme.com> wrote:

>
>"John Bell" <jbellnewsposts(a)hotmail.com> wrote in message
>news:umdkr5dpvf50ft73sq09frfodtgldv7b78(a)4ax.com...
>> On Mon, 5 Apr 2010 11:32:29 -0700, "tshad" <tfs(a)dslextreme.com> wrote:
>>
>>>I am looking at changing about 10 procedures that use a Permanent table
>>>for
>>>nothing more than temporary storage to load and store various records from
>>>a
>>>select statement and run a Join on this table to update other records.
>>>
>>>It then deletes all the records it put into the table. The table is used
>>>by
>>>multiple procedures at the same time keyed by a Guid it always gets at the
>>>start of the routines.
>>>
>>>There are about 4 indexes on this table.
>>>
>>>There can be anywhere from 1 to 1000 records involved in this.
>>>
>>>I was thinking about dropping the table and just using a temp table. When
>>>the table is created I would then have to create 3 indexes - would drop
>>>one
>>>index (the Guid).
>>>
>>>Then the table would go away when I exit the procedure. I don't think
>>>there
>>>is any necessity to drop the table (and indexes) as it would go away
>>>anyway.
>>>
>>>But now I don't have to delete the records at the end or create the Guid
>>>at
>>>the beginning.
>>>
>>>Anything I am missing here?
>>>
>>>Thanks,
>>>
>>>Tom
>>>
>> Hi Tom
>>
>> Creating many temp tables can create a bottleneck on tempdb. See
>> http://support.microsoft.com/kb/328551
>>
>Also, I noticed that this mainly seems to be talking about SQL 7 and a
>HotFix to that database.
>
>Is that still the case for 2000, 2005 and 2008?
>
>Thanks,
>
>Tom
>
>> Have you looked at whether you need a table to hold this?
>>
>> John
>
The issue exists in all versions, but there has been work to reduce the
problem in more recent versions. You should still configure tempdb as
recommended.
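
By "recommended" I mean the usual tempdb guidance: several equally
sized, pre-grown data files rather than a single file left to autogrow.
A minimal sketch (the logical names, path and sizes below are only
placeholders to adjust for your server):

-- Sketch only: adjust file count, paths and sizes to the server.
-- Pre-size the existing tempdb data file so it is not constantly autogrowing.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 1024MB, FILEGROWTH = 256MB);

-- Add further, equally sized data files (the KB article above suggests one per CPU).
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb2.ndf', SIZE = 1024MB, FILEGROWTH = 256MB);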

John
From: John Bell on
On Tue, 6 Apr 2010 23:26:45 +0100, "Tony Rogerson"
<tonyrogerson(a)torver.net> wrote:

>Unless you've lots of concurrent connections CREATING temporary tables you
>won't get the blocking problem he talks about - you really need to be going
>some in order to see it.
>
>Tony
>

That depends on your definition of "going some", but it was very easy
to get this in SQL 2000.

John
From: John Bell on
>>
>> Any actual solution would be a guess as we haven't seen the code, but
>> it sounds to me like this is probably one of those common cases where
>> the solution is not set based.
>>
>It is all set based here.
>
>1) insert set of rows that is a subset of another table (or two) into the
>table
>2) use the table, usually to update another table or insert records into it
>using this table in a join.
>3) delete the records that were just added into the table (if you were the
>only one using the table at the moment - it would then be empty).
>

From this description I can't see the need for a temporary table, so
long as you use transactions correctly.
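
For example, in a rough sketch (SourceA, SourceB, TargetTable and the
column names are all invented, as we haven't seen your schema), the
subset can be joined and applied directly inside one transaction, with
no holding table at all:

-- Sketch only: every object and column name below is invented for illustration.
BEGIN TRANSACTION;

UPDATE t
SET    t.SomeColumn = s.SomeValue
FROM   TargetTable AS t
JOIN   (SELECT a.ParentID, a.ChildID, b.SomeValue
        FROM   SourceA AS a
        JOIN   SourceB AS b ON b.ParentID = a.ParentID
        WHERE  a.SomeFilter = 1) AS s
       ON  s.ParentID = t.ParentID
       AND s.ChildID  = t.ChildID;

COMMIT TRANSACTION;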

If you used the subset for multiple procedures then you would need to
create the temporary table in the parent procedure and then call the
other procedures from that because when the parent procedure goes out
of scope the temporary table would not exist... but that is not the
pattern you have described!

>> If you are looking at 1000 rows then adding an index/statistics on a
>> table may be more expensive than not having the index. Because the
>> permanent table is volatile statistics could be stale.
>>
>So then the Temp table would be better - based on what Tony said that
>statistics are created on the fly as rows are added.
>
>> But without code that is all speculation.
>>
>True. But the code is different in each case.

I have seen many systems where they have developed a poor pattern and
then propagated it!

>
>The table has 3 columns (GUID, ParentID, ChildID). One Guid per session.
>If I were to use the table and add 100 rows, I would get a new GUID and all
>the 100 rows would use that Guid.
>
>When done I do a delete using that GUID.
>
>DELETE table WHERE GUID = @GUID
>
>Tom


John
From: Gert-Jan Strik on
Tom,

> It's just confusing when in one case it says it will recompile and in
> another it says (or seems to say) that it won't.
>
> Tony says it will recompile the statement each time it sees the #Temp

I haven't seen Tony say this. He said that the use of temporary tables
causes recompiles. But not "each time".

SQL Server keeps statistics for tables, both temporary tables and
regular tables. When there have been many changes, the statistics will
be recalculated. When this happens, the query plans that use this
information are invalidated. So the next time you run "the same" query,
it will cause a recompile, because there will be no cached version
anymore.

This also happens for regular tables, but the chance that you will
encounter it is much lower, because many users will be using the same
table.
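
If those statistics-driven recompiles ever turn out to be a measurable
cost, there is a query hint that suppresses them. A small sketch (the
#work table and its columns are made up for illustration):

-- Sketch only: #work and its columns are made up for illustration.
CREATE TABLE #work (ParentID int NOT NULL, ChildID int NOT NULL);

INSERT INTO #work (ParentID, ChildID) VALUES (1, 1);  -- ... populate #work ...

-- KEEPFIXED PLAN asks SQL Server not to recompile this statement because of
-- statistics changes on #work; schema changes will still force a recompile.
SELECT w.ParentID, w.ChildID
FROM   #work AS w
OPTION (KEEPFIXED PLAN);

DROP TABLE #work;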

--
Gert-Jan
From: tshad on

"John Bell" <jbellnewsposts(a)hotmail.com> wrote in message
news:1pfor5peafkd1sqbltagb6f6p8ge2j6dlt(a)4ax.com...
>>>
>>> Any actual solution would be a guess as we haven't seen the code, but
>>> it sounds to me like this is probably one of those common cases where
>>> the solution is not set based.
>>>
>>It is all set based here.
>>
>>1) insert set of rows that is a subset of another table (or two) into the
>>table
>>2) use the table, usually to update another table or insert records into
>>it
>>using this table in a join.
>>3) delete the records that were just added into the table (if you were the
>>only one using the table at the moment - it would then be empty).
>>
>
> From this description I can't see the need for a temporary table, so
> long as you use transactions correctly.
>
> If you used the subset for multiple procedures then you would need to
> create the temporary table in the parent procedure and then call the
> other procedures from that because when the parent procedure goes out
> of scope the temporary table would not exist... but that is not the
> pattern you have described!

Sure it is.

I was describing what happens now, where the deletes seem to be slowing
down the procedure.

If I change that to:

1) Create table in procedure (in the parent)
2) Insert rows
3) call procedure that uses the table
4) drop the table, or just drop out of the procedure (out of scope), which
would drop the table for me (see the sketch below).
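
Roughly like this (usp_Parent, usp_DoWork, SomeSourceTable and the
column names are just placeholders, not the real code):

-- Sketch only: all names here are placeholders for the real procedures and tables.
CREATE PROCEDURE usp_Parent
AS
BEGIN
    -- 1) create the table in the parent; no GUID column needed any more
    CREATE TABLE #work (ParentID int NOT NULL, ChildID int NOT NULL);

    -- 2) insert the subset of rows
    INSERT INTO #work (ParentID, ChildID)
    SELECT ParentID, ChildID
    FROM   SomeSourceTable
    WHERE  SomeFilter = 1;

    -- 3) the called procedure can see #work because it was created by its caller
    EXEC usp_DoWork;

    -- 4) no DELETE needed: #work disappears when this procedure goes out of scope
END

usp_DoWork would join to #work exactly as the current code joins to the
permanent table, just without the GUID predicate.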

At the moment all is done in one procedure (or another procedure is called
to do something with the table). It then comes back and the records are
deleted.

>
>>> If you are looking at 1000 rows then adding an index/statistics on a
>>> table may be more expensive than not having the index. Because the
>>> permanent table is volatile statistics could be stale.
>>>
>>So then the Temp table would be better - based on what Tony said that
>>statistics are created on the fly as rows are added.
>>
>>> But without code that is all speculation.
>>>
>>True. But the code is different in each case.
>
> I have seen many systems where they have developed a poor pattern and
> then propagated it!
>
True.

And in this case, we have an issue where one or more of the procedures is
not deleting the records. One of the procedures jumps around the delete if
there is nothing processed (but the records are still in the table). So we
end up with 25,000 records, when there should be none, or only the ones
currently in process.
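
The obvious fix (WorkTable and the variable names below are
placeholders, not the real code) is to move that DELETE outside the
conditional so the rows for the session's GUID always get removed:

-- Sketch only: WorkTable, @RowsProcessed and @GUID stand in for the real names.
DECLARE @GUID uniqueidentifier;
DECLARE @RowsProcessed int;
SET @GUID = NEWID();       -- in the real procedure this is the session GUID created up front
SET @RowsProcessed = 0;

IF @RowsProcessed > 0
BEGIN
    -- ... the existing processing that uses the rows for this GUID ...
    PRINT 'processing';
END

-- run the cleanup unconditionally, so this session's rows always go away
DELETE WorkTable WHERE GUID = @GUID;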

Thanks,

Tom

>>
>>The table has 3 columns (GUID, ParentID, ChildID). One Guid per session.
>>If I were to use the table and add 100 rows, I would get a new GUID and
>>all
>>the 100 rows would use that Guid.
>>
>>When done I do a delete using that GUID.
>>
>>DELETE table WHERE GUID = @GUID
>>
>>Tom
>
>
> John