Let's say "a" is a char and "b" a string. Why does it take longer to compute b+a than a+b? Strange, isn't it?

All 2 Replies

Maybe it's the way the compiler evaluates the addition — perhaps because the char comes first in one case and the string first in the other. It could also be due to something happening in memory where the char or string is located. Who knows.

I think it has to do with storage in memory and the fact that different types are allocated memory in different ways.
