From: Mathias Gaunard on
On 18 jan, 19:59, Le Chaud Lapin <jaibudu...(a)gmail.com> wrote:

> What disturbs me is that, within the next few months, I will be forced
> to commit. I will have to choose wchar_t or char16_t, but as char16_t
> is not available, and there is no indication of when it might become
> available, I must use wchar_t for now, but will not be able to change
> to char16_t when it becomes available without a total recall of all
> deployed network software.

I think the types themselves have been supported since GCC 4.3 or 4.4,
which are production quality.

--
[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]

From: Le Chaud Lapin on
On Jan 19, 12:45 am, Jerry Coffin <jerryvcof...(a)yahoo.com> wrote:
> In article <20b7032a-896f-4702-b9b8-62d164ec5474
> @h9g2000yqa.googlegroups.com>, jaibudu...(a)gmail.com says...
> > { The question concerns the two C++0x types char16_t and char32_t. -mod }
>
> > Hi All,
>
> > Any idea when these two types [char16_t and char32_t] will be
> > commonly supported across major compilers?
>
> They are currently supported in gcc and the beta version of VS/VC++
> 2010. I'd expect that most compilers that don't support them already
> will probably add that support quite quickly.

Thanks Jerry.

I just had one of my engineers check whether the VS2010 beta actually
supports char16_t versus simply making it an alias for something
else, the obvious choice being wchar_t, and it appears that Microsoft,
right now, is simply making it an alias:

/* uchar PROPERTIES */
#if _HAS_CHAR16_T_LANGUAGE_SUPPORT
#else /* _HAS_CHAR16_T_LANGUAGE_SUPPORT */
#if !defined(_CHAR16T)
#define _CHAR16T
typedef unsigned short char16_t;
typedef unsigned int char32_t;
#endif /* !defined(_CHAR16T) */

#endif /* _HAS_CHAR16_T_LANGUAGE_SUPPORT */

My system relies on char16_t and char32_t being distinct types, not
aliases for anything else, so unfortunately, I will have to stay with
wchar_t for now.

Also, after a bit of musing a few weeks ago about the feasibility/
appropriateness of adding char16_t/char32_t to C++ as distinct types,
I arrived at the conclusion that it was not as trivial as it might
seem for the compiler developer.

Unfortunately, I cannot recall the exact thought process that led me
to this conclusion. I think it had to do with hard choices regarding
policy. But it does not surprise me that Microsoft has deferred, at
least for the time being, on making these bona fide distinct types.

-Le Chaud Lapin-


From: Martin B. on
Le Chaud Lapin wrote:
> On Jan 18, 8:49 am, SG <s.gesem...(a)gmail.com> wrote:
>> On 16 Jan., 20:26, Mathias Gaunard <loufo...(a)gmail.com> wrote:
>
>>> uint_least16_t and uint_least32_t work just as well for that.
>>> Sure, it doesn't have the semantic attached to it, but that's not
>>> needed to hold elements.
>> Though, there is a difference between uint_least16_t and char16_t. The
>> first is just an alias of some integer type while the second is a
>> distinct type. I'm not sure how important this is in reality but it
>> affects overloading, for example. Just wanted to mention this.
>
> Which is one of the reasons I need the real char16_t and char32_t.
> My code is type-driven, and being network-oriented, numerical type
> codes for char16_t and char32_t must remain invariant from one network
> node to another.
>

What do you mean by "numerical type codes"? Is this something compiler
specific?

cheers,
Martin


From: Le Chaud Lapin on
On Jan 20, 8:13 am, "Martin B." <0xCDCDC...(a)gmx.at> wrote:
> Le Chaud Lapin wrote:
> > Which is one of the reasons I need the real char16_t and char32_t.
> > My code is type-driven, and being network-oriented, numerical type
> > codes for char16_t and char32_t must remain invariant from one network
> > node to another.
>
> What do you mean by "numerical type codes"? Is this something compiler
> specific?

No, just my thing.

I generate a code from 1 to 16 for each of the C++ scalar arithmetic
types, and use these codes in many places [like serialization]. My
Unicode string class is currently based on wchar_t; I would rather
use char16_t, but for this to work under my model, char16_t must be a
distinct type, not simply an alias for, say, wchar_t, for then the
assigned code for char16_t would be the same as that for wchar_t,
creating confusion throughout my system.
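[The scheme described above might be sketched as follows. The template name and the code values are invented for illustration; they are not from the original system.]

```cpp
#include <cstdint>

// Hypothetical per-type numeric code, in the spirit of the scheme
// described above: each scalar type gets its own wire-stable code.
template <typename T> struct type_code;  // primary: no code assigned

template <> struct type_code<char>     { static const std::uint8_t value = 1; };
template <> struct type_code<wchar_t>  { static const std::uint8_t value = 2; };
template <> struct type_code<char16_t> { static const std::uint8_t value = 3; };
template <> struct type_code<char32_t> { static const std::uint8_t value = 4; };

// If char16_t were merely a typedef for wchar_t, the second and
// third specializations above would collide, and the wire codes
// could no longer distinguish the two encodings.
```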

-Le Chaud Lapin-


From: Martin B. on
Le Chaud Lapin wrote:
> On Jan 19, 12:45 am, Jerry Coffin <jerryvcof...(a)yahoo.com> wrote:
>> In article <20b7032a-896f-4702-b9b8-62d164ec5474
>> @h9g2000yqa.googlegroups.com>, jaibudu...(a)gmail.com says...
>>> { The question concerns the two C++0x types char16_t and char32_t. -mod }
>>> Hi All,
>>> Any idea when these two types [char16_t and char32_t] will be
>>> commonly supported across major compilers?
>> They are currently supported in gcc and the beta version of VS/VC++
>> 2010. I'd expect that most compilers that don't support them already
>> will probably add that support quite quickly.
>
> Thanks Jerry.
> ...
> else, the obvious choice being wchar_t, and it appears that Microsoft,
> right now, is simply making it an alias:
> ...
> typedef unsigned short char16_t;
> typedef unsigned int char32_t;
> ...
>
> ... I arrived at the conclusion that it was not as trivial as it might
> seem for the compiler developer.
>
> Unfortunately, I cannot recall the exact thought process that led me
> to this conclusion. I think it had to do with hard choices regarding
> policy. But it does not surprise me that Microsoft has deferred, at
> least for the time being, on making these bona fide distinct types.
>

But the standard mandates that these be distinct types?
So it's the whole "treat xyzchar_t as builtin type: Yes/No" mess all
over again? *Sigh* :-)

br,
Martin
