Assigning an integer to a character variable

An answer to this question on Stack Overflow.

Question

I was trying to assign an integer to char, unsigned char, and signed char variables to understand the results. I am running this code on 64-bit macOS. I am baffled by the output and am looking for someone to help me understand it. I am not asking about the sizeof results; I only want to understand why printing the values of the variables produces weird characters.

#include <iostream>
using namespace std;

int main() {
    char a = 2;
    unsigned char b = 10;
    signed char c = 200;
    cout << "Regular char variable value " << a << "\t" << "size of the variable is " << sizeof(a) << "\n";
    cout << "Regular char variable value " << b << "\t" << "size of the variable is " << sizeof(b) << "\n";
    cout << "Regular char variable value " << c << "\t" << "size of the variable is " << sizeof(c) << "\n";
}

And the output of this code is following:

Regular char variable value 	size of the variable is 1
Regular char variable value 
             	size of the variable is 1
Regular char variable value ?	size of the variable is 1

Please note that the second line of the output is garbled, which adds to my confusion.

Answer

It is because cout << a prints a as a character rather than as a number (and likewise for b and c).

Looking at an ASCII table [![ASCII table][1]][1]

tells you that character 2 is STX (start of text) and character 10 is LF (line feed); that line feed is what breaks the second line of your output. The value 200 would be up in the extended, non-standard section of the table, but note that you assign it to a signed char, whose maximum value is 127, so the conversion actually leaves you with a negative number.

Since none of these characters has a visible glyph on the terminal, you see blanks, a stray newline from the line feed, and a placeholder like the ? you got for c.

To print the numeric values instead, cast to int, e.g.

cout << (int)a;    // or: cout << static_cast<int>(a);

[1]: https://i.sstatic.net/Pzj1J.gif