
Introduction

Computers and digital circuits process information in binary format. Each character is assigned a 7- or 8-bit binary code to identify it; the character may be a numeral, a letter, or a special symbol. Example: the binary number 1000001 represents 65 (decimal) in straight binary code, the letter A in ASCII code, and 41 (decimal) in BCD code.
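To make the example concrete, here is a minimal Python sketch (the variable names are ours for illustration) that reads the same 7-bit pattern three ways:

    bits = "1000001"                     # the 7-bit pattern from the example

    # Straight binary: positional weights give 65
    as_binary = int(bits, 2)

    # ASCII: code 65 names the letter A
    as_ascii = chr(as_binary)

    # BCD: pad to whole nibbles and read each 4-bit group as one decimal digit
    padded = bits.zfill(8)               # "01000001" -> nibbles 0100 and 0001
    as_bcd = "".join(str(int(padded[i:i + 4], 2)) for i in range(0, 8, 4))

    print(as_binary, as_ascii, as_bcd)   # 65 A 41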

Codes
Codes are symbols used to represent text, numbers, and other information. Their purpose is to suit transmission through a medium and to peripherals. They have been developed as standards so that different parties can communicate with one another.

Types of Code
BCD (Binary-Coded Decimal) code
ASCII (American Standard Code for Information Interchange) code
EBCDIC (Extended Binary Coded Decimal Interchange Code) code
Gray code
Excess-3 code

BCD
Binary-coded decimal (BCD) is a digital encoding method for decimal numbers in which each digit is represented by its own binary sequence: a four-bit code represents one of the ten decimal digits 0 to 9. BCD's main virtue is ease of conversion between machine- and human-readable formats, as well as a more precise machine representation of decimal quantities.

Example: (37)10 is represented as 0011 0111 in BCD code, rather than (100101)2 in straight binary code. BCD thus requires more bits than straight binary, but it is still well suited to input and output operations in digital systems.

Note: 1010, 1011, 1100, 1101, 1110, and 1111 are INVALID codes in BCD.
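As a sketch of the conversion just described (the function names to_bcd and from_bcd are hypothetical, chosen for illustration), the following Python encodes a decimal number digit by digit and rejects the six invalid nibbles:

    def to_bcd(n):
        """Encode a non-negative decimal integer as BCD, one 4-bit group per digit."""
        return " ".join(format(int(d), "04b") for d in str(n))

    def from_bcd(code):
        """Decode space-separated 4-bit groups, rejecting the six invalid nibbles."""
        digits = []
        for nibble in code.split():
            value = int(nibble, 2)
            if value > 9:                # 1010 .. 1111 never occur in BCD
                raise ValueError("invalid BCD nibble: " + nibble)
            digits.append(str(value))
        return int("".join(digits))

    print(to_bcd(37))                    # 0011 0111
    print(from_bcd("0011 0111"))         # 37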

EBCDIC
Extended Binary Coded Decimal Interchange Code (EBCDIC) was developed in 1963 by IBM as an extension of the older Binary Coded Decimal (BCD) standard. BCD encodes the 10 digits of the standard counting system using 4 binary digits, or bits, where an individual bit can be either 1 or 0. With EBCDIC, IBM extended the approach to 8 bits, allowing encoding not only of digits but also of upper- and lower-case letters, formatting codes such as headers and carriage returns, and basic communication controls.

EBCDIC is eight bits, or one byte, wide.


Each byte consists of two nibbles, each four bits wide. The first nibble defines the class of character, while the second nibble defines the specific character within that class. For example, setting the first nibble to all ones, 1111, defines the character as a digit, and the second nibble defines which digit is encoded.
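A small Python sketch of that nibble structure (the helper name split_nibbles is ours for illustration); it uses the EBCDIC digit codes F0h through F9h, whose class nibble is 1111:

    def split_nibbles(byte):
        """Split one EBCDIC byte into its class nibble and character nibble."""
        return byte >> 4, byte & 0x0F

    # EBCDIC digits occupy F0h-F9h: the class nibble 1111 marks "digit",
    # and the low nibble carries the digit's value.
    for code in (0xF0, 0xF5, 0xF9):
        hi, lo = split_nibbles(code)
        print(format(code, "02X"), format(hi, "04b"), lo)   # e.g. F5 1111 5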

ASCII
The American Standard Code for Information Interchange is a character-encoding scheme based on the ordering of the English alphabet. ASCII codes represent text in computers, communications equipment, and other devices that use text. Most modern character-encoding schemes are based on ASCII, though they support many more characters than ASCII does.

ASCII is a 7-bit or 8-bit alphanumeric code. The 7-bit code is standard ASCII, which defines 128 characters. Standard ASCII codes run from 00h to 7Fh, where 00h-1Fh (together with 7Fh, the DEL character) serve as control characters and 20h-7Eh as printable symbols. The 8-bit code is extended ASCII, which supports 256 symbols by adding special graphics and mathematical symbols; the extended series runs from 80h to FFh. Of the 128 standard ASCII characters, 33 are non-printing control characters (now mostly obsolete) that affect how text and space are processed; 94 are printable characters, and the space is considered an invisible graphic.
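The layout above can be checked with a few lines of Python; the counts printed at the end follow directly from the ranges just given:

    # Codes for a few familiar characters
    for ch in ("A", "a", "0", " "):
        print(repr(ch), ord(ch), format(ord(ch), "02X"))

    # Classify every 7-bit code: 00h-1Fh and 7Fh are control characters,
    # 20h-7Eh are printable (the space counts as an invisible graphic).
    control = [c for c in range(0x80) if c < 0x20 or c == 0x7F]
    printable = [c for c in range(0x80) if 0x20 <= c <= 0x7E]
    print(len(control), len(printable))  # 33 95 (94 visible symbols plus space)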
