Shrinking Big Data
If you manage ‘Big Data’ then you’ve probably run into this problem: your Big Data is getting too big. Before you know it you are moving around files that take 20 or 30 hours to transfer and many more hours to compress, decompress, or even back up. Six months after installing a new batch of backup servers they are maxed out and you have to add more. As a result you are constantly pressing users to reduce the amount of data you keep accessible for them. At Moonshadow we host and serve more than one billion records for our customers […]