When we declare a variable of type int without telling the compiler whether it is supposed to be signed or unsigned, it is signed by default:
signed int s_number; // signed
unsigned int u_number; // unsigned
int number; // equivalent to signed int
This is true for all integer types (short, int, long, long long).
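For example, the same shorthand applies to the other integer types (the variable names below are just placeholders):
short s_count;      // equivalent to signed short
long l_count;       // equivalent to signed long
long long ll_count; // equivalent to signed long long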
But there is one exception: The char type!
The char Type is Special
According to the C Standard, whether char is signed or unsigned is implementation-defined. The standard also says that char does not collapse into signed char or unsigned char but is a distinct type of its own (although it has the same representation and range as one of the two, of course).
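We can observe this "distinct type" property directly. The following sketch assumes a C11 compiler, since it uses _Generic; it only compiles because char, signed char and unsigned char are three different types (if char were merely an alias for one of the other two, the duplicate entry in the selection list would be a compile error):
#include <stdio.h>

// Maps an expression to a string naming its type (C11 _Generic selection).
#define TYPE_NAME(x) _Generic((x),   \
    char: "char",                    \
    signed char: "signed char",      \
    unsigned char: "unsigned char",  \
    default: "something else")

int main(void)
{
    char c = 'A';
    printf("%s\n", TYPE_NAME(c));                // prints "char"
    printf("%s\n", TYPE_NAME((signed char)c));   // prints "signed char"
    printf("%s\n", TYPE_NAME((unsigned char)c)); // prints "unsigned char"
    return 0;
}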
But the most surprising thing to know about char is that it is not guaranteed to be 8 bits wide. In other words, the C Standard does not guarantee that a byte is 8 bits. There are some very old architectures where char is 9 bits wide, for example.
Fortunately for us, virtually all architectures in use today have 8-bit bytes. Still, a program that assumes char is 8 bits wide will not be portable to every platform where a C compiler is available.
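If a program genuinely relies on 8-bit bytes, one defensive option is to make that assumption explicit and fail at compile time on an exotic platform instead of misbehaving at run time. A minimal sketch using the CHAR_BIT macro from limits.h:
#include <limits.h>

// Refuse to compile on platforms where a byte is not 8 bits.
#if CHAR_BIT != 8
#error "This code assumes 8-bit bytes."
#endif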
Ask the Compiler
The characteristics of the char type (as well as of all other integer types) are exposed in the standard header limits.h, so we can write a simple C program to find out how char is implemented on our platform:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    // CHAR_BIT is the number of bits in a byte (i.e. in a char).
    printf("char bits: %d\n", CHAR_BIT);

    // If the minimum value of char is negative, char is signed.
    printf("char is: %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");

    printf("signed char min: %d\n", SCHAR_MIN);
    printf("signed char max: %d\n", SCHAR_MAX);
    printf("unsigned char min: 0\n");
    printf("unsigned char max: %d\n", UCHAR_MAX);

    return 0;
}
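On a typical x86-64 Linux machine, where char happens to be signed, the output looks something like this:
char bits: 8
char is: signed
signed char min: -128
signed char max: 127
unsigned char min: 0
unsigned char max: 255
On other platforms the second line can differ; ARM Linux is a well-known example where char is unsigned by default.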