From: Göran Andersson on
Göran Andersson wrote:
> Arne Vajhøj wrote:
>> On 15-03-2010 16:37, Andrew Poelstra wrote:
>>> On 2010-03-15, Jeff Johnson<i.get(a)enough.spam> wrote:
>>>> "sahel"<nam.nam.barooon(a)gmail.com> wrote in message
>>>> news:e8722a56-827e-4d6f-a20a-876f8b4db2b0(a)x12g2000yqx.googlegroups.com...
>>>>
>>>>> i want this answer because i want to get some char from user like
>>>>> this : y=(x^2)+1 so i will put 2 in char but to write the program for
>>>>> it that it must do an math question ,program must know that 2& 1 are
>>>>> not char they are int
>>>>
>>>> Subtract 48 from each digit that you convert from char to int and
>>>> you'll
>>>> have what you want. Of course, that's the long way. There's also the
>>>> Convert
>>>> class or the int.Parse() / int.TryParse() methods.
>>>
>>> Or subtract '0' and you'll not only be clearer, your code will
>>> work outside of ASCII-based encodings.
>>
>> It will work with any encoding where the digits are in order.
>>
>> But EBCDIC is not that common in .NET programs.
>>
>> But the readability argument is still valid.
>>
>> Arne
>>
>
> A char always contains a Unicode character; it's not encoded.
>

(Well, it's kind of encoded, as strings are represented internally as
UTF-16/UCS-2 code units rather than full 32-bit Unicode code points, but
that's a different matter that's not really relevant here.)
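For the curious, here is a quick sketch of what "UTF-16 rather than full code points" means in practice. The thread is about C#, but Java's char is likewise defined as a UTF-16 code unit, so the same effect shows up there:

```java
public class Utf16CodeUnits {
    public static void main(String[] args) {
        // '0' lives in the Basic Multilingual Plane:
        // one code unit, one code point
        String digit = "0";
        System.out.println(digit.length());                         // 1

        // U+1D11E (musical G clef) is above U+FFFF, so in a UTF-16
        // string it is stored as a surrogate pair: two char values
        String clef = new String(Character.toChars(0x1D11E));
        System.out.println(clef.length());                          // 2
        System.out.println(clef.codePointCount(0, clef.length()));  // 1
    }
}
```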

--
Göran Andersson
_____
http://www.guffa.com
From: Peter Duniho on
Göran Andersson wrote:
>> A char always contains a Unicode character; it's not encoded.
>
> (Well, it's kind of encoded, as strings are represented internally as
> UTF-16/UCS-2 code units rather than full 32-bit Unicode code points, but
> that's a different matter that's not really relevant here.)

I think it is relevant. Strings and characters in .NET are documented
to be UTF-16. It's not like it's some "internal-only implementation
detail"; it's an integral part of the design of .NET, just as the
numerical formats for Int32, Double, etc. are. That is, a program can
safely rely on the specific format to accomplish things, without
worrying that it might change in the future.
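A small sketch of the kind of reliance Peter means, in Java rather than C# (Java also specifies char as a UTF-16 code unit, so the guaranteed numeric values are the same):

```java
public class CharNumericValue {
    public static void main(String[] args) {
        // Because the char format is part of the specification
        // (UTF-16 code units), these numeric values are guaranteed,
        // not an internal implementation detail
        char zero = '0';
        char nine = '9';
        System.out.println((int) zero);   // 48
        System.out.println((int) nine);   // 57
    }
}
```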

Pete
From: Arne Vajhøj on
On 17-03-2010 05:13, Göran Andersson wrote:
> Arne Vajhøj wrote:
>> On 15-03-2010 16:37, Andrew Poelstra wrote:
>>> On 2010-03-15, Jeff Johnson<i.get(a)enough.spam> wrote:
>>>> "sahel"<nam.nam.barooon(a)gmail.com> wrote in message
>>>> news:e8722a56-827e-4d6f-a20a-876f8b4db2b0(a)x12g2000yqx.googlegroups.com...
>>>>
>>>>> i want this answer because i want to get some char from user like
>>>>> this : y=(x^2)+1 so i will put 2 in char but to write the program for
>>>>> it that it must do an math question ,program must know that 2& 1 are
>>>>> not char they are int
>>>>
>>>> Subtract 48 from each digit that you convert from char to int and
>>>> you'll
>>>> have what you want. Of course, that's the long way. There's also the
>>>> Convert
>>>> class or the int.Parse() / int.TryParse() methods.
>>>
>>> Or subtract '0' and you'll not only be clearer, your code will
>>> work outside of ASCII-based encodings.
>>
>> It will work with any encoding where the digits are in order.
>>
>> But EBCDIC is not that common in .NET programs.
>>
>> But the readability argument is still valid.
>
> A char always contains a Unicode character; it's not encoded.

You are correct.

Encoding is for bytes.

Arne

From: Arne Vajhøj on
On 17-03-2010 11:53, Peter Duniho wrote:
> Göran Andersson wrote:
>>> A char always contains a Unicode character; it's not encoded.
>>
>> (Well, it's kind of encoded as strings are represented internally as
>> UTF-16/UCS-2 values rather than full 32 bit unicode code points, but
>> that's a different matter that's not really relevant here.)
>
> I think it is relevant. Strings and characters in .NET are documented to
> be UTF-16. It's not like it's some "internal-only implementation
> detail"; it's an integral part of the design of .NET, just as the
> numerical formats for Int32, Double, etc. are. That is, a program can
> safely rely on the specific format to accomplish things, without
> worrying that it might change in the future.

I think he meant that the surrogate pair problem was not
relevant for the '0'..'9' discussion.
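To tie the subthread off, a minimal sketch of why surrogate pairs don't matter for digit parsing. This is Java rather than the thread's C#, but the char arithmetic is identical in both:

```java
public class DigitConversion {
    public static void main(String[] args) {
        char c = '7';

        // '0'..'9' occupy consecutive code points (U+0030..U+0039)
        // in the BMP, so subtracting '0' always yields the digit's
        // value and no surrogate pair can ever be involved
        int value = c - '0';
        System.out.println(value);              // 7

        // Library equivalent, analogous to the Parse methods
        // suggested earlier in the thread
        int viaLibrary = Character.digit(c, 10);
        System.out.println(viaLibrary);         // 7
    }
}
```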

Arne