The memset function in memory.h is declared this way...

void* memset(void* dest, int c, size_t count);

and the MSDN example of its use that I have shows this for setting the first four bytes of a buffer to '*'...

memset(buffer, '*', 4);

The second parameter confuses me a bit because it's typed as an int, which is a 32-bit quantity in 32-bit Windows, yet the example above shows a char literal being used. I frequently use this function to zero out a buffer, and I do this...

memset(szBuffer, '\0', iSomeCount);

Is that correct?

What about just this...

memset(szBuffer, 0, iSomeCount);

Are these equivalent? Is one form preferable to the other? Is one or the other incorrect?


They are both correct.

In both cases, just make sure that you use sizeof() to determine the size of your buffer, like:
char buffer[1024];
size_t bufferSize = 1024*sizeof(char); //NOT just 1024.


From www.cplusplus.com:

void * memset ( void * ptr, int value, size_t num );

Fill block of memory:
Sets the first num bytes of the block of memory pointed by ptr to the specified value (interpreted as an unsigned char).
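
To tie those two points together, here is a minimal sketch (the buffer names are my own) showing a case where sizeof() genuinely matters -- a non-char buffer -- and how the value argument is reduced to an unsigned char:

#include <string.h>
#include <stdio.h>

int main(void)
{
    int values[256];                        /* a non-char buffer, so element size matters */
    memset(values, 0, sizeof(values));      /* sizeof(values) is 256 * sizeof(int) bytes */

    char buffer[8];
    memset(buffer, 0x141, sizeof(buffer));  /* the value is reduced to unsigned char 0x41, i.e. 'A' */
    printf("%c\n", buffer[0]);              /* prints A */
    return 0;
}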

Thanks Mike. Just wanted to make sure. Can't be too careful, it seems!

>>The second parameter confuses me a bit because it's typed as an int, which is a 32-bit quantity in 32-bit Windows, yet the example above shows a char literal being used

A character is just a one-byte integer (signed or unsigned, depending on the compiler). The compiler will promote it to an int of the required size.

In the C and C++ languages there really is no such thing as a character as distinct from an integer -- characters are all represented as integers. If you look at any standard ASCII chart you will see the decimal/hex value for each of the 128 standard ASCII characters. The letter 'A', for example, has a decimal value of 65. You can even do mathematical operations on characters, such as int n = 'C' - 'A' + 'B' / 'E';

>>size_t bufferSize = 1024*sizeof(char); //NOT just 1024.

sizeof(char) is guaranteed to be 1, so that statement is like saying 1024*1, which makes little sense.
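
Here is a small sketch of that promotion in action (the variable names are just for illustration):

#include <stdio.h>
#include <string.h>

int main(void)
{
    printf("%d\n", 'A');        /* prints 65 -- a char literal is just an integer value */

    int n = 'C' - 'A';          /* arithmetic on characters: n == 2 */
    printf("%d\n", n);

    char szBuffer[16];
    memset(szBuffer, '\0', sizeof(szBuffer));   /* '\0' promotes to the int 0 ...         */
    memset(szBuffer, 0, sizeof(szBuffer));      /* ... so this call does exactly the same */
    return 0;
}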

>> sizeof(char) is guaranteed to be 1, so that statement is like saying 1024*1, which makes little sense.

It may be so for char, but never assume a type "in general" has a particular size in bytes; most types have no cross-platform guarantee of that. (Program on micro-controllers for a while and you will start to put sizeof() everywhere as an instinctive reflex. On some obscure platforms and hardware there are weird things, like a char that is 16 bits wide.)

If you're serious about writing C++ programs, rather than writing C programs and using a C++ compiler to compile them, std::fill is a better choice than memset because it's type-safe and works even for class types.

commented: Thank you, finally someone pointed out the obvious! +5
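
A minimal sketch of the std::fill alternative (assuming an ordinary char array and a vector of strings, names chosen just for illustration):

#include <algorithm>
#include <string>
#include <vector>

int main()
{
    char raw[1024];
    std::fill(raw, raw + sizeof(raw), '\0');    // same effect as memset(raw, 0, sizeof(raw))

    std::vector<std::string> names(10);
    std::fill(names.begin(), names.end(), std::string("none"));  // safe for class types; memset here would corrupt the strings
    return 0;
}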

Here's a twist I thought of after my original post. Lately I've been switching a lot between ANSI and Unicode, going through code and replacing wchar_t-type stuff with the _T macros in tchar.h. When I come to lines such as this...

memset(pBuffer, '\0', iSomeCount);

I was tempted to do this...

memset(pBuffer, _T('\0'), iSomeCount);

...simply because I was going through a lot of code changing "some text" type stuff to _T("some text"). But using AncientDragon's line of reasoning I'm assuming a double null, i.e., "00", would be interpreted as just one numeric zero. Right?
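
For what it's worth, here is a minimal sketch of that reasoning (Windows-only, assuming tchar.h is available; the buffer name is just for illustration):

#include <tchar.h>
#include <string.h>

int main(void)
{
    _TCHAR szBuffer[64];

    /* _T('\0') expands to '\0' in an ANSI build and to L'\0' in a Unicode build;
       either way the value that reaches memset's int parameter is 0, so these
       two calls are identical. memset counts bytes, which is why sizeof() is
       used for the third argument even when _TCHAR is two bytes wide. */
    memset(szBuffer, _T('\0'), sizeof(szBuffer));
    memset(szBuffer, 0, sizeof(szBuffer));
    return 0;
}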
