Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data takes up considerably less disk space than the original, so more content can fit in the same amount of storage. Many different compression algorithms exist and they work in different ways. With lossless algorithms, only redundant bits are removed, so the data is restored exactly when it is uncompressed, with no loss of quality. Lossy algorithms discard bits considered unneeded, so uncompressing the data afterwards yields lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, particularly CPU time, so any hosting platform that employs real-time compression must have enough processing power to support the feature. A simple example of compression is to replace a binary sequence such as 111111 with 6x1, i.e. to "remember" how many consecutive 1s or 0s there are instead of storing the entire sequence.
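The 6x1 idea described above is known as run-length encoding. A minimal sketch in Python (the function names and the comma-separated output format are illustrative choices, not part of any standard):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string: '111111' -> '6x1'."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1  # extend the run of identical symbols
        runs.append(f"{j - i}x{bits[i]}")  # store count and symbol
        i = j
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding exactly: '6x1' -> '111111'."""
    return "".join(
        int(count) * symbol
        for count, symbol in (run.split("x") for run in encoded.split(","))
    )
```

Because decoding reproduces the input bit for bit, this is a lossless scheme: `rle_decode(rle_encode(s))` always returns the original string.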
Data Compression in Shared Hosting
The compression algorithm employed by the ZFS file system that runs on our cloud hosting platform is known as LZ4. It can boost the performance of any site hosted in a shared hosting account with us, because it not only compresses data more effectively than the algorithms used by various other file systems, but also uncompresses data faster than a hard drive can read it. This comes at the cost of considerable CPU time, which is not a problem for our platform, as it uses clusters of powerful servers working together. A further advantage of LZ4 is that it enables us to generate backups faster and with less disk space, so we keep multiple daily backups of your files and databases, and their generation does not affect the performance of the servers. That way, we can always restore any content you may have deleted by accident.
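The lossless round trip that makes such backups safe can be sketched in a few lines of Python. LZ4 bindings are not part of the Python standard library, so this sketch uses the built-in zlib module as a stand-in; LZ4 itself trades some compression ratio for much faster decompression, but the store-fewer-bytes, restore-exactly behavior is the same:

```python
import zlib

# Highly redundant data, which compresses well (zlib is a stand-in
# here since LZ4 bindings are not in the Python standard library).
original = b"ABCD" * 10_000

compressed = zlib.compress(original)
assert len(compressed) < len(original)           # far fewer bytes to store
assert zlib.decompress(compressed) == original   # lossless: exact restore
```

Real website content is less uniform than this example, so the ratio varies, but as long as the algorithm is lossless, decompression always reproduces the stored files exactly.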