I am experiencing huge problems with a hard disk that contains a lot of highly compressed files (at the filesystem level).
Example: The disk is 350GB, has 280GB free space and contains 400GB of data (as I said, the files are highly compressed).
Defraggler's space calculations are completely wrong, and its speed is extremely poor.
It looks like Defraggler is looking for free space for the moved file, but BASED ON THE UNCOMPRESSED FILESIZE, not based on the COMPRESSED size (i.e. the space actually required).
I cannot be sure my assumption is correct, of course, but I can see that the green bar for the new file has the length the uncompressed file would have. After the file has been moved, the space actually used is completely different.
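For anyone who wants to verify the size mismatch themselves: on Windows, the on-disk size of an NTFS-compressed file can be queried with the `GetCompressedFileSize` API (or seen in Explorer as "Size on disk"), while `st_size` / "Size" reports the logical size. As a rough cross-platform sketch of the same idea, the snippet below uses a sparse file as a stand-in for a compressed one, since both allocate less disk space than their logical size (the file name and sizes here are just illustrative):

```python
import os
import tempfile

def logical_vs_allocated(path):
    """Return (logical size, allocated on-disk size) for a file.

    st_blocks counts 512-byte units on POSIX systems; this is the
    space the file really occupies, analogous to the compressed
    size of an NTFS-compressed file.
    """
    st = os.stat(path)
    logical = st.st_size            # size applications (and Defraggler's bar) see
    allocated = st.st_blocks * 512  # space actually consumed on disk
    return logical, allocated

# Create a 100 MB sparse file: large logical size, almost no allocation.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.truncate(100 * 1024 * 1024)
    path = f.name

logical, allocated = logical_vs_allocated(path)
print(f"logical={logical} bytes, allocated={allocated} bytes")
os.unlink(path)
```

If a defragmenter reserves free space using the first number instead of the second, it will demand far more contiguous space than the move actually needs, which would explain the behavior described above.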
Additionally, the files have 5 MILLION fragments, and this huge number also seems to be a problem.
Defragging with the normal command, as well as defragging free space, is extremely slow. Defragging individual files (even with many selected) seems to be faster.
This makes Defraggler unusable with NTFS-compressed files.
Update:
Defraggler started to do some strange things after it was not able to find enough free space for a file (based on its uncompressed size). The drive map changed constantly and completely.
This program does not seem to be reliable. If it is doing inexplicable things with my valuable data, I cannot use it!