ASCII (American Standard Code for Information Interchange) is a character encoding standard for electronic communication. It assigns a unique numerical value to letters, digits, punctuation marks, and control signals, allowing computers to store and exchange text.
Key Features
- 7-Bit Standard: The original ASCII uses 7 bits to represent 128 unique characters.
- Universal Basis: It forms the foundation for modern encodings like Unicode (UTF-8).
- Sequential Ordering: Digits and letters occupy contiguous ranges, which makes sorting and character arithmetic straightforward.
How ASCII Works
ASCII maps human-readable characters to integer values. The standard set uses 7 bits ($2^7 = 128$ characters).
Key Character Ranges
| Character Type | Range (Decimal) | Description |
|---|---|---|
| Control Characters | 0 - 31 | Non-printing (e.g., Null, Line Feed, Escape) |
| Space | 32 | The spacebar character |
| Digits (0-9) | 48 - 57 | Numerical digits |
| Uppercase (A-Z) | 65 - 90 | English capital letters |
| Lowercase (a-z) | 97 - 122 | English small letters |
| Delete (DEL) | 127 | Control character, originally used to erase characters on paper tape |
Character Groups
1. Control Characters (0-31)
These were originally designed to control hardware devices like teletype machines and printers.
- NULL (0): Represents 'nothing'. Used to terminate strings in C/C++.
- Line Feed (10): Moves the cursor to the next line.
- Carriage Return (13): Moves the cursor to the start of the line.
2. Printable Characters (32-126)
These include all the numbers, letters, and symbols you see on your keyboard.
- Digits: '0' starts at 48.
- Letters: 'A' starts at 65, 'a' starts at 97.
3. Extended ASCII (128-255)
Standard ASCII uses only 7 bits, but computers work in 8-bit bytes. "Extended ASCII" uses the 8th bit to add another 128 characters, such as mathematical symbols ($\div$, $\pm$) and accented letters (é, ñ). Note that there is no single extended set: different code pages (e.g., ISO-8859-1, Windows-1252, CP437) assign the range 128-255 differently.
Important ASCII Calculations
1. Case Conversion
The difference between an uppercase letter and its lowercase equivalent is exactly 32.
'A' (Decimal 65) = 0100 0001
+ 32 (Decimal 32) = 0010 0000
--------------------------------
'a' (Decimal 97) = 0110 0001
2. Digit vs. Value
The character '0' is not the same as the value 0. To convert an ASCII digit to its integer value, subtract 48.
char input = '5';        // stored as ASCII code 53
int value = input - 48;  // 53 - 48 = 5 (more idiomatically: input - '0')
Quick Quiz
1. How many characters are in Standard ASCII?
A) 256
B) 128 ✅
2. What is the ASCII decimal value of 'A'?
A) 97
B) 65 ✅
3. Which character represents the value 32?
A) Zero (0)
B) Space ✅