The term data compression refers to reducing the number of bits needed to store or transmit data. Compression can be performed with or without loss of information: in the lossless case only redundant data is removed, so when the data is later uncompressed it is identical to the original, whereas in the lossy case unnecessary data is discarded as well, so the quality of the restored data is lower. Different compression algorithms are more effective for different types of information. Compressing and uncompressing data usually takes considerable processing time, so the server performing the operation needs ample resources to process your data quickly enough. A simple example of how information can be compressed is run-length encoding: instead of storing the actual 1s and 0s of a binary sequence, you store how many consecutive positions hold a 1 and how many hold a 0.
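To make the run-length idea above concrete, here is a minimal sketch in Python. The function names are illustrative, not from any particular library; the encoder stores each run of identical bits as a (symbol, count) pair, and the decoder reverses the process without any loss of information:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Compress a string of 0s and 1s into (symbol, run length) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same symbol as the previous position: extend the current run.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # Symbol changed: start a new run of length 1.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run length) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)
```

Encoding "0000111101" yields four pairs instead of ten symbols, and decoding the pairs restores the exact original sequence, which is what makes this scheme lossless.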
Data Compression in Shared Web Hosting
The compression algorithm that we employ on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is used by the state-of-the-art ZFS file system which powers the platform. LZ4 outperforms the algorithms that other file systems use, as its compression ratio is higher and it processes data much faster. The speed advantage is most noticeable when content is being uncompressed, since this happens faster than data can be read from a hard disk, so LZ4 improves the performance of every site stored on a server that uses this particular algorithm. We also take advantage of LZ4 in another way: its speed and compression ratio allow us to generate a couple of daily backups of the full content of all accounts and keep them for a month. Not only do these backups take up less space, but generating them does not slow the servers down the way it often does with other file systems.
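For illustration, this is how LZ4 compression is typically enabled and inspected on a ZFS dataset; the pool and dataset names below are hypothetical examples, not the actual layout of our platform:

```sh
# Enable LZ4 compression on a dataset (pool/dataset names are examples)
zfs set compression=lz4 tank/webhosting

# Confirm the property and check the achieved compression ratio
zfs get compression tank/webhosting
zfs get compressratio tank/webhosting
```

The `compressratio` property reports how much smaller the stored data is compared to its uncompressed size, which is also why LZ4-compressed backups occupy less space.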