From: Michael C on
"Erland Sommarskog" <esquel(a)sommarskog.se> wrote in message
news:Xns9CD3F3CA6AE63Yazorman(a)127.0.0.1...
> Yes, C# stored procedure that would read the temp table into a dataset
> could probably beat the T-SQL solution with quite a margin. C# is after
> all better apt to run loops than T-SQL.

I guess C# doesn't have to worry about locks, multiple users etc., so it's not
really a valid comparison.

Michael


From: Michael C on
"Erland Sommarskog" <esquel(a)sommarskog.se> wrote in message
news:Xns9CD3F2D052927Yazorman(a)127.0.0.1...
> And the second release of SQLCLR had unlimited sizes for UDTs and
> UDAs.

That is true, but it's still a pretty silly limitation to have at all. It
took until 2005 (the 7th release?) before we could have local variables of
more than 8 KB. It just seems that SQL Server always has a much bigger list
of gotchas than other products, and this list of gotchas always makes less
sense.

>> That's all there is to the code; it's such a simple problem to solve that
>> the code is minimal. As I said, if ADO.NET can do it then LINQ can do it.
>> You stated ADO.NET can do it, so it is possible.
>
> So why don't you do it then?

I've explained several times that I can't be bothered installing 2008 just
to write less than 5 lines of code to prove something that would be very
simple to do. I don't mean to be rude, but I really think you're clutching at
straws here trying to push this very minor point, considering I've already
provided a code sample anyway. Basically, LINQ is very flexible and will have
no problem providing the data in the form that ADO.NET requires.

> Just because ADO .Net can do it, does not mean that LINQ knows that
> ADO .Net can do it, or how it does it. Even less so if you bring EF.

EF is a little different, and I have already stated that it's likely it can't
do it.

Michael


From: Erland Sommarskog on
Michael C (mike(a)nospam.com) writes:
> "Erland Sommarskog" <esquel(a)sommarskog.se> wrote in message
> news:Xns9CD3F3CA6AE63Yazorman(a)127.0.0.1...
>> Yes, C# stored procedure that would read the temp table into a dataset
>> could probably beat the T-SQL solution with quite a margin. C# is after
>> all better apt to run loops than T-SQL.
>
> I guess C# doesn't have to worry about locks, multiple users etc so it's
> not really a valid comparison.

That's true, but that is not really the main reason. C# is compiled (if only
to MSIL), and T-SQL is not.

Actually, the other day I was testing functions for turning comma-separated
lists into tables, and I found that for a list of numbers it was better to
write a special UDF that returns integers, so that the string-to-int
conversion happens in C#, than to return a table of strings and convert in
T-SQL. The difference was not big, maybe 20 ms out of a total execution time
of over 500 ms, but nevertheless.
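A minimal sketch of such a list-to-table function as a SQLCLR table-valued
UDF (the names IntsFromList and FillRow are illustrative, not the actual
function I tested):

```csharp
using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class ListSplit
{
    // Table-valued function: split a comma-separated list and do the
    // string-to-int conversion on the C# side, so T-SQL receives integers.
    [SqlFunction(FillRowMethodName = "FillRow", TableDefinition = "n int")]
    public static IEnumerable IntsFromList(SqlString list)
    {
        foreach (string s in list.Value.Split(','))
            yield return int.Parse(s.Trim());
    }

    // Maps each element produced above to the output column.
    public static void FillRow(object row, out SqlInt32 n)
    {
        n = new SqlInt32((int)row);
    }
}
```

After CREATE ASSEMBLY, you expose it with CREATE FUNCTION ... AS EXTERNAL
NAME and join against it like any other table-valued function.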


--
Erland Sommarskog, SQL Server MVP, esquel(a)sommarskog.se

Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx
From: Erland Sommarskog on
Michael C (mike(a)nospam.com) writes:
> "Erland Sommarskog" <esquel(a)sommarskog.se> wrote in message
> news:Xns9CD3F2D052927Yazorman(a)127.0.0.1...
>> And the second release of SQLCLR had unlimited sizes for UDTs and
>> UDAs.
>
> That is true but it's still a pretty silly limitation to have at all.

As I said, there is a difference between a limited object and an unlimited
object. SQL Server does not live in some nice model world which pretends
to be unlimited but crashes when memory runs out.

> It took until 2005 (the 7th release?) before we could have local variables
> of more than 8 KB.

You could have input parameters of text/ntext/image in previous versions,
but you could not assign them.

And there is a difference. Consider this script:

declare @g varchar(MAX) = ''
declare @x varchar(MAX) = replicate(convert(varchar(MAX), 'K'), 1000000)
declare @i int = 1500
while @i > 0
select @g += @x, @i -= 1
select datalength(@g)

I'm running this on a virtual machine with 516 MB of memory and some
786 MB of pagefile. How do you think it ends?

As a clue, it has been running for nearly four hours, and tempdb has
expanded from 8 MB to almost 1 GB. We don't know yet, but I'm optimistic
that the script will complete successfully.

The corresponding C# program would complete a lot faster - with an
out-of-memory exception.
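That corresponding program might look like the sketch below (hypothetical;
with ~1.5 GB of string data to build and no way to spill to disk, the Append
loop would be expected to die with OutOfMemoryException on a machine this
small rather than grind on):

```csharp
using System;
using System.Text;

class Concat
{
    static void Main()
    {
        // Same shape as the T-SQL script: a 1 MB string appended 1500 times.
        string x = new string('K', 1000000);
        var g = new StringBuilder();
        for (int i = 0; i < 1500; i++)
            g.Append(x);          // may throw OutOfMemoryException here
        Console.WriteLine(g.Length);
    }
}
```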

And this is the important thing: if you have to deal with large objects
in a server, you must also be able to spill to disk. But if everything
could spill to disk, there would be a performance penalty for that. So
it does make sense to have a division into small and large objects. And,
obviously, spilling objects to disk is more complex than keeping them
in memory only.

Then you can call that silly if you like.

> I've explained several times that I can't be bothered installing 2008
> just to write less than 5 lines of code to prove something that would be
> very simple to do. I don't mean to be rude, but I really think you're
> clutching at straws here trying to push this very minor point,
> considering I've already provided a code sample anyway. Basically, LINQ
> is very flexible and will have no problem providing the data in the form
> that ADO.NET requires.

Again: LINQ would still need to know what parameter types to use when
defining the parameter list. It's not going to sort it out magically.

And I'm sorry, it does not hold to claim that something is trivial while
refusing to prove it.

--
Erland Sommarskog, SQL Server MVP, esquel(a)sommarskog.se

Links for SQL Server Books Online:
SQL 2008: http://msdn.microsoft.com/en-us/sqlserver/cc514207.aspx
SQL 2005: http://msdn.microsoft.com/en-us/sqlserver/bb895970.aspx
SQL 2000: http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx

From: Michael C on
"Erland Sommarskog" <esquel(a)sommarskog.se> wrote in message
news:Xns9CD5F7DE521Yazorman(a)127.0.0.1...
> As I said, there is a difference between a limited object and an unlimited
> object. SQL Server does not live in some nice model world which pretends
> to be unlimited but crashes when memory runs out.

Sounds fine to me.

> And there is a difference. Consider this script:
>
> declare @g varchar(MAX) = ''
> declare @x varchar(MAX) = replicate(convert(varchar(MAX), 'K'), 1000000)
> declare @i int = 1500
> while @i > 0
> select @g += @x, @i -= 1
> select datalength(@g)
>
> I'm running this on a virtual machine with 516 MB of memory and some
> 786 MB of pagefile. How do you think it ends?
>
> As a clue, it has been running for nearly four hours, and tempdb has
> expanded from 8 MB to almost 1 GB. We don't know yet, but I'm optimistic
> that the script will complete successfully.
>
> The corresponding C# program would complete a lot faster - with an
> out-of-memory exception.

Sounds reasonable. In the *extremely* rare case that I need to store such
huge strings, I expect to send the results to disk. If this is the reason
SQL Server has these 8 KB limits, then that seems even sillier than I first
thought. A feature that is very rarely used seems to be limiting the
other 99% of cases.

> And this is the important thing: if you have to deal with large objects
> in a server, you must also be able to spill to disk. But if everything
> could spill to disk, there would be a performance penalty for that. So
> it does make sense to have a division into small and large objects. And,
> obviously, spilling objects to disk is more complex than keeping them
> in memory only.
>
> Then you can call that silly if you like.

I still do. The only difference is that they reduced the limit where it
crashes to a 1980s-era figure of 8 KB.

>> I've explained several times that I can't be bothered installing 2008
>> just to write less than 5 lines of code to prove something that would be
>> very simple to do. I don't mean to be rude, but I really think you're
>> clutching at straws here trying to push this very minor point,
>> considering I've already provided a code sample anyway. Basically, LINQ
>> is very flexible and will have no problem providing the data in the form
>> that ADO.NET requires.
>
> Again: LINQ would still need to know what parameter types to use when
> defining the parameter list. It's not going to sort it out magically.

No, of course not; you have to tell it what to convert the data to. That is
exactly what I did in the code sample I provided. You did read the code I
posted? :-)

> And I'm sorry, it does not hold to say that something is trivial, and
> refusing to prove it.

Actually, it's up to you to prove it is not possible. You are the one who
raised this point from second-hand information that you're not really sure
about (is it LINQ, LINQ to SQL, or EF that has this supposed limitation?).
It is not my responsibility to do the work for you to prove or disprove
something that you raised. As far as I am concerned this is possible from
LINQ, and I will not discuss it any further until you've actually attempted
it and can provide something a little more substantial.

Michael

