I'm converting our code to use lexical_cast for converting between our numeric types and strings. It mostly worked, but I was surprised to see what looked like graphic characters on my screen. It turns out that when we want to display the ASCII value of a character, lexical_cast<string> doesn't work: it doesn't treat the char as a numeric type, which, when you really think about it, makes a lot of sense. Casting the char to int solves the problem. But I was just wondering how the experts would handle this.

    char c = 65;
    string s1 = lexical_cast<string>(c);         // s1 = "A"
    string s2 = lexical_cast<string>((int)c);    // s2 = "65"

Thanks!
-Jerry
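For reference, here is a complete, compilable version of the snippet above; it is a minimal sketch that assumes Boost is available and only adds the includes and main() needed to run it:

    #include <boost/lexical_cast.hpp>
    #include <iostream>
    #include <string>

    int main()
    {
        char c = 65;

        // lexical_cast streams the char, so it comes out as the
        // character 'A', not as the number 65.
        std::string s1 = boost::lexical_cast<std::string>(c);       // s1 == "A"

        // Converting to int first makes the numeric value visible.
        std::string s2 = boost::lexical_cast<std::string>((int)c);  // s2 == "65"

        std::cout << s1 << ' ' << s2 << '\n';  // prints: A 65
    }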
In article <…>, "DY, JERRY U (SBCSI)" wrote:

> string s2 = lexical_cast<string>((int)c);    // s2 = "65"

I would have written int(c) instead of (int)c, because compilers perform more useful checking on the constructor than they do on the cast, but that's all.

meeroh

--
If this message helped you, consider buying an item from my wish list:
http://web.meeroh.org/wishlist
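As a concrete illustration, here is a minimal sketch of the common spellings of this conversion (the variable names are invented for the example); static_cast is the alternative most style guides suggest today:

    #include <cassert>

    int main()
    {
        char c = 65;

        int a = (int)c;               // C-style cast
        int b = int(c);               // functional-style cast
        int d = static_cast<int>(c);  // explicit named cast

        assert(a == 65 && a == b && b == d);  // all three mean the same thing here
    }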
Miro Jurisic <…> writes:

> In article <…>, "DY, JERRY U (SBCSI)" wrote:
>
>> string s2 = lexical_cast<string>((int)c);    // s2 = "65"
>
> I would have written int(c) instead of (int)c, because compilers perform
> more useful checking on the constructor than they do on the cast, but
> that's all.

The two forms are equivalent.

--
Dave Abrahams
Boost Consulting
http://www.boost-consulting.com
In article <…>, David Abrahams wrote:

> Miro Jurisic <…> writes:
>
>> In article <…>, "DY, JERRY U (SBCSI)" wrote:
>>
>>> string s2 = lexical_cast<string>((int)c);    // s2 = "65"
>>
>> I would have written int(c) instead of (int)c, because compilers perform
>> more useful checking on the constructor than they do on the cast, but
>> that's all.
>
> The two forms are equivalent.

I have it in my head that I have seen a compiler that warned about int construction from (for example) char*, but (obviously) not about a cast of char* to int. I could be on crack, of course.

meeroh

--
If this message helped you, consider buying an item from my wish list:
http://web.meeroh.org/wishlist
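For what it's worth, the standard defines a functional-notation cast with a single operand to be exactly equivalent to the corresponding C-style cast, so a conforming compiler has to treat both spellings of the pointer case identically. A minimal sketch probing that point, using intptr_t rather than int so the pointer-to-integer conversion stays well-formed on 64-bit platforms:

    #include <cstdint>

    int main()
    {
        char buf[1] = {0};
        char* p = buf;

        // Both casts below are reinterpret_casts in disguise; neither
        // spelling receives extra checking from a conforming compiler.
        std::intptr_t a = (std::intptr_t)p;
        std::intptr_t b = std::intptr_t(p);

        return a == b ? 0 : 1;  // always returns 0
    }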
participants (3)
- David Abrahams
- DY, JERRY U (SBCSI)
- Miro Jurisic