Data compression is the encoding of information using fewer bits than the original representation requires for storage or transmission. Compressed data takes up considerably less disk space than the original, so more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With many of them, only redundant bits are removed, so once the information is uncompressed there is no loss of quality; this is known as lossless compression. Others discard bits that are judged less important, so uncompressing the data afterwards yields lower quality than the original; this is lossy compression. Compressing and uncompressing content consumes a significant amount of system resources, especially CPU time, so any hosting platform that performs compression in real time must have enough processing power to support the feature. A simple example of how information can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of saving the entire sequence.
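The counting trick described above is known as run-length encoding. The following is a minimal sketch of the idea in Python; the function names and the "6x1"-style output format mirror the example in the text and are illustrative, not a standard format.

```python
from itertools import groupby

def rle_encode(bits: str) -> str:
    # Replace each run of identical characters with "<count>x<char>",
    # e.g. "111111" becomes "6x1".
    return ",".join(f"{len(list(group))}x{char}" for char, group in groupby(bits))

def rle_decode(encoded: str) -> str:
    # Expand each "<count>x<char>" token back into its run of characters.
    return "".join(char * int(count)
                   for count, char in (token.split("x") for token in encoded.split(",")))
```

For instance, `rle_encode("111111")` returns `"6x1"`, and `rle_encode("1110011")` returns `"3x1,2x0,2x1"`. Because no information is discarded, decoding always reproduces the original exactly, which makes this a lossless scheme; note that it only saves space when the input actually contains long runs.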