Data compression is the reduction of the number of bits that have to be stored or transmitted to represent a piece of information. The compressed data occupies substantially less disk space than the original, so more content can be kept in the same amount of space. There are various compression algorithms that work in different ways: with some of them only redundant bits are removed, so when the data is uncompressed there is no loss of quality (lossless compression), while others discard bits deemed unneeded, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes a significant amount of system resources, particularly CPU time, so any web hosting platform that employs compression in real time must have sufficient processing power to support the feature. A simple example of how information can be compressed is run-length encoding, in which a binary sequence such as 111111 is replaced with 6x1, i.e. the number of consecutive 1s or 0s is stored instead of the actual bits.
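
As a toy illustration of that idea, the short Python sketch below run-length encodes a string of bits exactly as described above. It is a minimal example of the principle only, not any production compression algorithm.

from itertools import groupby

def rle_encode(bits: str) -> str:
    # Replace each run of identical symbols with "<count>x<symbol>",
    # e.g. "111111" becomes "6x1".
    return ",".join(f"{len(list(run))}x{symbol}" for symbol, run in groupby(bits))

def rle_decode(encoded: str) -> str:
    # Reverse the substitution: "6x1" expands back to "111111",
    # so no information is lost (lossless compression).
    return "".join(symbol * int(count)
                   for count, symbol in (part.split("x") for part in encoded.split(",")))

print(rle_encode("111111000011"))   # prints: 6x1,4x0,2x1
print(rle_decode("6x1,4x0,2x1"))    # prints: 111111000011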

Data Compression in Shared Web Hosting

The compression algorithm that we employ on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is used by the state-of-the-art ZFS file system that powers the platform. The algorithm outperforms the ones other file systems use because it achieves a higher compression ratio and processes data much faster. The speed advantage is most noticeable when content is uncompressed, as this happens more quickly than the data could be read from a hard drive, so LZ4 effectively improves the performance of every site stored on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate several daily backup copies of the full content of all accounts and keep them for 30 days. Not only do the backups take less space, but their generation does not slow the servers down, as often happens with other file systems.
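
If you would like to experiment with the algorithm yourself, the minimal Python sketch below uses the third-party lz4 package (installed with pip install lz4, an assumption outside our platform) to compress and restore a block of redundant data. It simply illustrates LZ4's lossless behavior and high ratio on repetitive content; it is not how ZFS invokes LZ4, which happens transparently at the file-system block level with no changes needed in your applications.

import lz4.frame  # third-party package; install with: pip install lz4

# Highly repetitive data compresses very well, much like the
# redundant bits described earlier in this article.
original = b"ABCD" * 250_000  # roughly 1 MB of repetitive content

compressed = lz4.frame.compress(original)
restored = lz4.frame.decompress(compressed)

assert restored == original  # lossless: the exact original comes back
print(f"original:   {len(original):>9,} bytes")
print(f"compressed: {len(compressed):>9,} bytes")
print(f"ratio:      {len(original) / len(compressed):.1f}x")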