We had a major data loss some time ago that also took out all backups. Our VAR did a really good job rescuing everything that could still be saved from old backups and the local caches! Unfortunately, he wrote the wrong size for the gz-compressed files into the index.xml files: he entered the compressed size, but the database expects the uncompressed size (even though the entry is named "compressed size"). As a result, the archive server now always fails to retrieve older compressed files because the entries in index.xml don't match the database. The replicated archive server hangs whenever it tries to replicate such a compressed version and has to be killed and restarted manually.
Hence my question: does anyone know of a way to bulk-rebuild the index.xml files with the file-size data from the database?
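In case it helps anyone sketch a fix: since I don't know your database schema, here is a rough, hypothetical Python approach that skips the database entirely and instead reads the true uncompressed size straight from each .gz file (per RFC 1952, a gzip file's last four bytes store the uncompressed size modulo 2^32, little-endian), then patches the size attribute in index.xml. The element name `file` and attribute names `name` / `compressed-size` are pure assumptions about the index.xml layout and would need to be adapted to the real format:

```python
import struct
import pathlib
import xml.etree.ElementTree as ET

def uncompressed_size(gz_path):
    # RFC 1952: the last 4 bytes of a gzip stream hold the original
    # (uncompressed) length modulo 2**32, little-endian (ISIZE field).
    with open(gz_path, "rb") as f:
        f.seek(-4, 2)
        return struct.unpack("<I", f.read(4))[0]

def fix_index(index_path):
    """Rewrite wrong size entries in one index.xml (assumed layout:
    <index><file name="v1.gz" compressed-size="123"/>...</index>)."""
    index_path = pathlib.Path(index_path)
    tree = ET.parse(index_path)
    changed = False
    for entry in tree.getroot().iter("file"):       # assumed element name
        gz = index_path.parent / entry.get("name")  # assumed attribute name
        if gz.suffix == ".gz" and gz.exists():
            real = str(uncompressed_size(gz))
            if entry.get("compressed-size") != real:  # assumed attribute name
                entry.set("compressed-size", real)
                changed = True
    if changed:
        tree.write(index_path, encoding="utf-8", xml_declaration=True)
```

One could then walk the archive tree with `pathlib.Path(root).rglob("index.xml")` and call `fix_index()` on each hit. Note the modulo-2^32 caveat: for files larger than 4 GiB the gzip trailer is ambiguous, so for those the size really would have to come from the database.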