Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data takes up considerably less disk space than the original, so much more content can fit in the same amount of storage. Different compression algorithms work in different ways: many remove only redundant bits, so when the data is uncompressed there is no loss of quality (lossless compression), while others discard less important bits, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content requires considerable system resources, particularly CPU processing time, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the whole sequence.
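The 111111 → 6x1 idea above is known as run-length encoding. A minimal sketch in Python (the function names and the comma-separated output format are illustrative choices, not part of any particular standard):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1                      # extend the current run
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '6x1' -> '111111'."""
    return "".join(int(count) * bit
                   for part in encoded.split(",")
                   for count, bit in [part.split("x")])

print(rle_encode("111111"))              # -> 6x1
print(rle_encode("0001111011"))          # -> 3x0,4x1,1x0,2x1
```

Because decoding reproduces the input exactly, this is a lossless scheme: it pays off on data with long runs and can actually grow the output on data without them, which is why real file systems use more sophisticated algorithms.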
Data Compression in Shared Hosting
The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm known as LZ4. It can boost the performance of any website hosted in a shared hosting account with us, since it not only compresses data better than the algorithms used by various other file systems, but also uncompresses data faster than a hard disk drive can read it. This comes at the cost of a great deal of CPU processing time, which is not a problem for our platform, as it consists of clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we keep multiple daily backups of your files and databases without affecting server performance. That way, we can always restore any content you may have deleted by accident.
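The key property described above, that compressed data is restored bit-for-bit, can be demonstrated with a short round-trip. LZ4 itself is not in the Python standard library, so this sketch uses the stdlib zlib module as a stand-in; the lossless principle is the same:

```python
import zlib

# Highly redundant data, like the repeated runs in the example above,
# compresses very well under any lossless algorithm.
original = b"AAAA" * 1000

compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original             # lossless: bit-for-bit identical
assert len(compressed) < len(original)  # redundant input shrinks substantially

print(len(original), "->", len(compressed), "bytes")
```

Real LZ4 trades a somewhat lower compression ratio than zlib for much higher decompression speed, which is exactly the property that lets it outpace disk reads on a file system like ZFS.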