ASCII
Computers must be able to represent letters and symbols as well as numbers. The idea is simple: give each character a number as a code, and store the codes and their meanings in a table.
A common code is ASCII - the American Standard Code for Information Interchange.
This uses seven bits to store each character. Seven bits are enough to code 128 different characters, because 2^7 = 128.
| Symbol | Binary   |
|--------|----------|
| A      | 100 0001 |
| B      | 100 0010 |
| C      | 100 0011 |
| D      | 100 0100 |
| E      | 100 0101 |
| F      | 100 0110 |
Using this information, write a definition of ASCII in your exercise books. You need to know what it stands for and how many bits it uses.
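You can check the codes in the table above yourself. The following is a short Python sketch using the built-in `ord()` function (character to code) and `format()` (code to 7-bit binary):

```python
# Print each letter with its ASCII code in decimal and 7-bit binary.
# ord(ch) gives the character's code; format(..., "07b") pads to 7 bits.
for ch in "ABCDEF":
    code = ord(ch)
    print(ch, code, format(code, "07b"))  # e.g. A 65 1000001
```

Running this reproduces the table: A is 65 in decimal, which is 100 0001 in binary.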
Activity 1
Use the pattern of ASCII binary codes, starting from the completed first row, to decode these letters:
| ASCII Binary Code | Decimal | Character |
|-------------------|---------|-----------|
| 110 1000          | 104     | h         |
| 110 0101          |         |           |
| 110 1100          |         |           |
| 110 1100          |         |           |
| 110 1111          |         |           |
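Once you have filled in the table, you can verify your answer with a short Python sketch. `int(code, 2)` converts a binary string to a decimal number, and `chr()` converts that code back into a character:

```python
# The five binary codes from the activity table, spaces included as written.
codes = ["110 1000", "110 0101", "110 1100", "110 1100", "110 1111"]

# Strip the spaces, parse each string as base-2, and map the code to a character.
word = "".join(chr(int(c.replace(" ", ""), 2)) for c in codes)
print(word)
```

This prints the decoded word, so you can check it against what you wrote in the table.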
Work in two groups, one researching ASCII and one researching Unicode:
https://en.wikibooks.org/wiki/A-level_Computing/AQA/Paper_2/Fundamentals_of_data_representation/ASCII_and_unicode
https://en.wikibooks.org/wiki/A-level_Computing/AQA/Problem_Solving,_Programming,_Data_Representation_and_Practical_Exercise/Fundamentals_of_Data_Representation/Unicode
Present a poster on "the difference between Unicode and ASCII" (3 groups):
http://www.differencebetween.net/technology/software-technology/difference-between-unicode-and-ascii/