I would like to create a macro that converts a string into UNICODE (16-bit) characters.
Example:
#define UNICODE ("xyz") - It will return 0,'x',0,'y',0, z at compilation time.

The string passed to the macro can be any length.

Thanks for helping

Something like this?

#include <stdio.h>
#include <wchar.h>

/* Stringize the argument, then paste the L prefix onto the resulting
   literal to make it a wide string. Two macros are used because the
   order in which # and ## are applied inside a single macro is
   unspecified. */
#define WIDEN(s)   L ## s
#define UNICODE(x) WIDEN(#x)

int main ( void )
{
  wchar_t *p = UNICODE ( xyz\n );

  fputws ( p, stdout );

  return 0;
}

By the way, Unicode is only an encoding, not a storage specification. The actual storage is with multi-byte characters that happen to represent Unicode codes. Technically, you can use Unicode with single byte characters as long as you restrict yourself to the single byte Unicode values.
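
One caveat worth adding, since the requirement is specifically 16-bit units: the C standard does not fix the width of wchar_t. It is 16 bits on Windows but 32 bits on most Unix-like systems, so a quick check like the sketch below (my own illustration) will tell you what your compiler actually gives you.

#include <stdio.h>
#include <wchar.h>
#include <limits.h>

int main ( void )
{
  /* wchar_t is 2 bytes on Windows but 4 bytes on most Unix-like
     systems, so wide literals are not automatically 16-bit units */
  printf ( "wchar_t is %u bytes (%u bits)\n",
           (unsigned)sizeof ( wchar_t ),
           (unsigned)( sizeof ( wchar_t ) * CHAR_BIT ) );

  return 0;
}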

This is for a USB message, which requires 16-bit Unicode characters. We can always do this in a function, but I wonder if we can do it in a macro so the message is built at compile time and it will run faster.

Thanks a lot for your post.
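
For what it's worth, here is a rough sketch of how the compile-time approach could look for that use case, assuming the two-level macro from the reply above and a toolchain where wchar_t is 16 bits; the names product_name and product_name_len are only illustrative, not part of any USB API.

#include <stdio.h>
#include <stddef.h>
#include <wchar.h>

#define WIDEN(s)   L ## s
#define UNICODE(x) WIDEN(#x)

/* the 16-bit payload for the string descriptor is built entirely
   at compile time; no run-time conversion is needed */
static const wchar_t product_name[] = UNICODE ( My USB Device );

/* number of 16-bit code units, excluding the terminating nul */
static const size_t product_name_len =
  sizeof product_name / sizeof product_name[0] - 1;

int main ( void )
{
  printf ( "payload holds %u code units\n", (unsigned)product_name_len );

  return 0;
}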

You can use that macro anywhere you would use a string literal. You can't, however, use it with a char* variable:

// this will NOT work the way you might hope
char *hello = "Hello World";
wchar_t *thello = UNICODE ( hello ); // expands to L"hello", not the contents of hello
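
If the text really does live in a char* variable, the conversion has to happen at run time instead; a minimal sketch using the standard mbstowcs function could look like this.

#include <stdio.h>
#include <stdlib.h>
#include <wchar.h>

int main ( void )
{
  const char *hello = "Hello World\n";
  wchar_t wide[64];

  /* mbstowcs converts a narrow (multibyte) string into wide
     characters at run time, which is what a variable needs
     instead of the macro */
  if ( mbstowcs ( wide, hello, sizeof wide / sizeof *wide ) != (size_t)-1 )
    fputws ( wide, stdout );

  return 0;
}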

Your point being? There was never any assumption that the macro would be used with narrow string variables, unless you were reading a few posts that I conveniently missed. Or are you just posting pointless fluff to make yourself look smart when you can't find anything meaningful to add?

getting pretty snotty now aren't you :twisted: or are you suffering from pms today :mrgreen:

You might (or might not) be surprised at the number of people I've seen try to do what I mentioned, so it's not an unreasonable thing to warn people about.

Unicode is not an encoding; the actual storage specification is the character encoding. Unicode is more accurately described as a coded character set.

>are you suffering from pms today

Isn't that like the eighth wonder of the natural world, for us men anyway? He he.

Don't worry Narue the New Year may be more hopeful.

God bless everyone.

>getting pretty snotty now aren't you :twisted: or are you suffering from pms today :mrgreen:
That was pretty sharp-tongued, even for me. ;) I was suffering from arrogant forumgoers trying to show off their brilliance by incorrectly "correcting" me with meaningless crap. But you wouldn't do that...more than once. :twisted:

>you might (or might not) be surprised
I'm very surprised, actually, since it's immediately obvious what the macro does and that it only works with string literals. Though I shouldn't be surprised, with the rampant idiocy of many programmers. :rolleyes:

>Unicode is not an encoding
From dictionary.com:

encoding: Computer Science. To format (electronic data) according to a standard format.

From unicode.org (emphasis is mine):

Q: What is Unicode?
A: Unicode is the universal character encoding, maintained by the Unicode Consortium. This encoding standard provides the basis for processing, storage and interchange of text data in any language in all modern software and information technology protocols.

Now, it's entirely possible that the Unicode Consortium doesn't know what encoding means, or hired someone to write their FAQ who doesn't know what encoding means, or you're being far too pedantic and making mistakes in the process. :)

How about we meet halfway and call Unicode a code table? I respect you too much to flame you out of existence.

I don't think so. What is generally called a character encoding, they call a Character Encoding Scheme. See http://www.unicode.org/glossary/. And their definition of Coded Character Set would characterize Unicode nicely. Their glossary doesn't give a definition of character encoding, probably because they've avoided the term, since it's used in different ways. I would not call a numbering of a series of characters an encoding.

To say that "Unicode is just an encoding" is misleading, because "character encoding" normally refers to the storage specification, such as UTF-8 or UTF-16BE. This is how the term is usually used, and their FAQ goes against the norm. (I'm basing what I call the 'norm' from looking at the View > 'Character Encoding' menu in Firefox, the 'Encoding' menu in I.E., and just about every link I've looked at in a Google search for 'unicode character encoding'.)
