CDCheck - Wish list
[Recovery] Random order retries
The best algorithm I know of for recovering a collection of problematic sectors is not to retry reading each one a fixed number of times before moving to the next. Instead, after an initial sequential pass over all sectors, maintain a list of the failed ones and repeatedly pick one at random, retrying it once or twice. When a sector is recovered, it is removed from the failed list, leaving a smaller collection to keep working on.
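For illustration, here is a minimal Python sketch of that loop; read_sector is a hypothetical callback that returns a sector's data on success and None on a read error:

import random

def recover(read_sector, sectors):
    recovered = {}
    failed = []
    # Initial sequential pass over all sectors.
    for n in sectors:
        data = read_sector(n)
        if data is not None:
            recovered[n] = data
        else:
            failed.append(n)
    # Then repeatedly retry a randomly chosen failed sector; the
    # failed list only shrinks, so the loop converges toward the
    # hardest sectors. Interrupting keeps the best result so far.
    try:
        while failed:
            n = random.choice(failed)
            data = read_sector(n)
            if data is not None:
                recovered[n] = data
                failed.remove(n)
    except KeyboardInterrupt:
        pass
    return recovered, failed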
This way, one can set a very large (or even unlimited) retry limit and leave the process running for a few hours. The less-problematic sectors are recovered first, leaving the harder ones for later. Whenever the user decides to stop, the recovered file is the best possible result for the time invested.
The sequential approach forces the user to choose whether to use a large limit and get only the beginning of the file, or a small limit and miss sectors that could have been recovered with a few more retries.
Another fringe benefit resulting from the random iteration is that problematic sectors are read after a random seek, which could increase the chance of success if the problem has something to do with head alignment. To make this work best for all sectors in the problematic list, the algorithm should occasionally seek to the disk start and occasionally to the disk end (say, one in every 5 read attempts).
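The seek trick could be sketched like so (seek_to is a hypothetical device call; last_sector is the highest addressable sector, neither of which appears in the original suggestion):

import random

def retry_with_long_seek(read_sector, seek_to, n, last_sector):
    # Roughly one attempt in five, seek to the start or the end of
    # the disk first, forcing a long head movement before the retry.
    if random.randrange(5) == 0:
        seek_to(random.choice([0, last_sector]))
    return read_sector(n)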
I have experimented with recovery methods for floppy disks, and in my experience this algorithm beats the sequential one in both success rate and user-friendliness.


cognitus (27.11.2003)
