2. Representation of Text
A character is a symbol or letter on the keyboard.
The complete list of characters that a computer and its peripheral devices can
process and control is called the character set.
Each character is represented by a unique binary code.
3. ASCII
The internationally agreed code used to represent American English is the
American Standard Code for Information Interchange or ASCII.
ASCII uses 7 bits to represent each character,
from 0000000 to 1111111,
which can be used to store 128 characters.
The codes 0 - 31 represent control
characters which are special non-printing
characters
e.g. TAB and Return.
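As a quick illustration (a Python sketch, not part of the original slides), the built-in ord function returns a character's ASCII code, and format renders it as a 7-bit binary string:

```python
# Look up 7-bit ASCII codes: ord() gives the code, format() shows the binary.
for ch in ["A", "a", "\t"]:
    code = ord(ch)
    print(repr(ch), code, format(code, "07b"))
# TAB ('\t') is code 9, one of the 0 - 31 non-printing control characters.
```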
4. Extended ASCII
Extended ASCII, which was created when IBM designed its PCs, uses 8
bits to represent each character,
from 00000000 to 11111111,
which can be used to store 256
characters.
5. Unicode
The fact that ASCII can represent at most 256 characters creates
problems for international communication.
The 256 characters are based on European alphabets and do not contain
Arabic or Japanese characters.
The solution was the development of the
Unicode character set, which uses 16 bits to
represent each character.
Using 16 bits means that up to 65,536
characters can be represented.
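A short Python sketch (my own example, not from the slides) shows that characters outside the 256 extended-ASCII codes still receive a unique number, which fits in 16 bits for these examples:

```python
# Unicode gives every character a unique number (a code point),
# shown here as a 16-bit binary string.
for ch in ["A", "é", "あ"]:  # Latin, accented Latin, Japanese hiragana
    print(ch, ord(ch), format(ord(ch), "016b"))
```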
6. Error Detection
It is possible for an error to occur when data is transmitted, e.g. when
interference affects the signal.
For example, the character A has an ASCII code 65, which in binary is
01000001. What happens if there has been a transmission error that changes
the value of the signal?
7. Error Detection
In order to detect such an error, we can use an extra bit called a parity bit.
Parity can be set as either odd or even. In this example we will be using even
parity.
The data to be transmitted is 01000001. There are two bits with a value of 1,
so our parity bit needs to be set to 0. The message to be sent will have an even
number of bits set to 1:
Data to be sent:
Parity bit ASCII
0 01000001
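The even-parity calculation above can be sketched in Python (the function name is my own, not from the slides):

```python
def even_parity_bit(data: str) -> str:
    """Return the bit that makes the total number of 1s (parity + data) even."""
    return "0" if data.count("1") % 2 == 0 else "1"

data = "01000001"                   # ASCII for 'A': two bits set to 1
print(even_parity_bit(data), data)  # parity bit 0, so the total stays even
```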
8. Error Detection
When the transmission is received, the processor will count the number of bits
set to 1.
In our example, interference has caused one of the bits to be
changed from a 0 to a 1.
Data received:
Parity bit ASCII
0 01010001
There are now 3 bits set to 1, an odd number. As we are using even parity, we
can tell there has been an error and so the processor will request that the data
is retransmitted.
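The receiver's check can be sketched the same way: count the 1s in the whole message (parity bit plus data) and flag an error if the count is odd. The function name is illustrative:

```python
def even_parity_ok(message: str) -> bool:
    """True when the message (parity bit + data) has an even number of 1s."""
    return message.count("1") % 2 == 0

print(even_parity_ok("0" + "01000001"))  # as sent: True, no error detected
print(even_parity_ok("0" + "01010001"))  # after interference: False, retransmit
```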
9. Error Detection
Odd parity error checking works in the same way, except that there should be
an odd number of bits set to 1.
Data to be sent:
Parity bit ASCII
1 01000001
Data received:
Parity bit ASCII
1 01010001
There is an error because we are using odd parity but have received an even
number of bits set to 1.
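Odd parity simply flips the rule, as this sketch shows (again, the names are mine):

```python
def odd_parity_bit(data: str) -> str:
    """Return the bit that makes the total number of 1s (parity + data) odd."""
    return "1" if data.count("1") % 2 == 0 else "0"

data = "01000001"                    # two 1s, so the parity bit must be 1
sent = odd_parity_bit(data) + data
print(sent.count("1") % 2 == 1)      # True: an odd number of 1s, as required
received = "1" + "01010001"          # a bit was flipped in transit
print(received.count("1") % 2 == 1)  # False: even count, so an error is detected
```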
10. Credits
Higher Computing – Data Representation – Representation of Text
Produced by P. Greene and adapted by R. G. Simpson for the City of
Edinburgh Council 2004
Adapted by M. Cunningham 2010
All images licensed under Creative Commons 3.0
• Rainbow of Books by Dawn Endico
• Close up of a black keyboard by John Ward