In computing, a byte is a unit of data that is eight binary digits (bits) long. It can therefore take on 2^8 = 256 different values, which can be mapped to represent letters, numbers, or symbols (for example, "a", "7", or "@"). The word was coined by Dr. Werner Buchholz of IBM in 1956.
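A minimal Python sketch of this mapping: each of a byte's 256 possible values can serve as the code point of a character, which is how the ASCII characters "a", "7", and "@" are represented.

```python
# A byte holds 2 ** 8 = 256 distinct values (0 through 255).
assert 2 ** 8 == 256

# Each value can be mapped to a character; Python's ord() and chr()
# expose the ASCII/Unicode code-point mapping directly.
print(ord("a"))   # 97
print(ord("7"))   # 55
print(ord("@"))   # 64
print(chr(97))    # a
```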
A byte is abbreviated with an uppercase "B". Computer storage space is usually measured in multiples of the byte; for example, 100 MB equals 100 megabytes (approximately 100 million bytes) of data.
Other common abbreviations include:
kB = kilobyte = 1024 bytes = approximately 1 thousand bytes
MB = megabyte = 1024 kilobytes = 1,048,576 bytes = approximately 1 million bytes
GB = gigabyte = 1024 megabytes = 1,073,741,824 bytes = approximately 1 billion bytes
TB = terabyte = 1024 gigabytes = 1,099,511,627,776 bytes = approximately 1 trillion bytes
PB = petabyte = 1024 terabytes = 1,125,899,906,842,624 bytes = approximately 1 quadrillion bytes
EB = exabyte = 1024 petabytes = 1,152,921,504,606,846,976 bytes = approximately 1 quintillion bytes
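Each multiple in the list above is 1024 (that is, 2^10) times the previous one, so the n-th multiple is 1024^n bytes. A short Python sketch reproduces the exact byte counts:

```python
# Each binary prefix is 1024 (2 ** 10) times the previous one,
# so the n-th multiple of the byte is 1024 ** n bytes.
units = ["kB", "MB", "GB", "TB", "PB", "EB"]
for n, unit in enumerate(units, start=1):
    print(f"1 {unit} = {1024 ** n:,} bytes")
```

Running this prints the same figures given in the list, e.g. 1 MB = 1,048,576 bytes and 1 EB = 1,152,921,504,606,846,976 bytes.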
Half a byte is referred to as a nibble (or nybble).
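Since a nibble is four bits, a byte splits into a high nibble and a low nibble; a small Python sketch shows the usual shift-and-mask way of extracting them:

```python
# Split one byte into its two 4-bit nibbles using shift and mask.
value = 0xA7            # one byte: 1010 0111 in binary
high = value >> 4       # upper nibble: 0xA (10)
low = value & 0x0F      # lower nibble: 0x7 (7)
print(hex(high), hex(low))

# Recombining the nibbles recovers the original byte.
assert value == (high << 4) | low
```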