What does the Drive Wiper/Free Space thing do ?


7 replies to this topic

#1 OFFLINE WBFAir

WBFAir

    Newbie

  • Members
  • 1 posts

Posted 08 May 2011 - 04:46 AM

Hello all

First post here. I'm hardly a knowledgeable person on CCleaner, or a power user of it, so this is probably a silly question.

For a while now I have been worried about sensitive info: while I may delete it from my hard drive and run CCleaner to clean things out, there could still be traces of it on the disk that, in the right hands, could be retrieved. In other words, from what I have been told, anything you delete is not truly deleted but merely marked as free, so it no longer shows up and can be overwritten by anything.

So essentially it's not that it's not there anymore, it's just that the OS isn't recognizing it. As a result, with the right software and knowledge, it can be read and sometimes even reassembled to a degree.

Now I know about wiping a drive clean by doing a low-level reformat which rewrites all zeros to it, and that this is really the most you could do to fully get rid of everything. But the process of copying all the info I want to keep from one drive to another, doing the format, then putting all the saved info back is a pretty time-consuming task.

So is this what the Drive Wiper's Free Space option is for?

Does it take all that unused space with the leftover info on it and write zeros to it all (or in some other way remove it), so that it's 100% truly gone, while leaving all the current files and used space completely untouched?
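That is essentially what a free-space wipe does: grow a file full of zeros until the volume runs out of space, then delete it, so every previously free cluster gets overwritten while existing files are never touched. A minimal Python sketch of the general idea only (not CCleaner's actual implementation; real wipers also handle cluster tips and filesystem metadata, and the `limit_bytes` cap and `zerofill.tmp` name are added here purely for illustration):

```python
import os

def wipe_free_space(mount_point, chunk_mb=1, limit_bytes=None):
    """Overwrite a volume's free space by growing a zero-filled file
    until the disk is full (or limit_bytes is reached), then deleting it.
    Existing files are never touched: only free clusters are overwritten."""
    filler = os.path.join(mount_point, "zerofill.tmp")  # hypothetical name
    chunk = b"\x00" * (chunk_mb * 1024 * 1024)
    written = 0
    try:
        with open(filler, "wb") as f:
            while limit_bytes is None or written < limit_bytes:
                f.write(chunk)        # zeros land on previously free space
                written += len(chunk)
    except OSError:
        pass                          # disk full: every free cluster zeroed
    finally:
        if os.path.exists(filler):
            os.remove(filler)         # release the space again
```

Because the filler file only ever occupies space the filesystem reports as free, files already on the disk are left alone by construction.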

Again, I know that is probably a real newbie question, but I just thought I would ask before I played with it.

Also, for this particular product, does it really work as flawlessly as that, in terms of not messing with things I don't want removed, or are there sometimes issues with CCleaner here, so that I should look at some other product?

Not that I'm looking to knock it if it is; hey, it's free and already does an amazing job at exactly what it is supposed to do, so there really isn't any more you could ask of it. I just wanted to check before I used it.

Also, if it does work well, are there any special settings I should use, or things to look for in the program, to make sure that nothing I don't want touched gets touched?

Thanks for any help.

#2 OFFLINE kroozer

kroozer

    hi

  • Members
  • 1,542 posts
  • Gender:Female

Posted 08 May 2011 - 09:50 AM

- - take all that unused space with the leftover info on it and write zeros to it all (or in some other way remove it), so that it's 100% truly gone, while leaving all the current files and used space completely untouched?

Appears to, for me.

- - does it really work as flawlessly as that in terms of not messing with things I don't want removed?

It does for me.

Here is a tutorial for wiping free space.

These are my settings. [settings screenshot not preserved]

#3 OFFLINE ident

ident

    Needs More Cowbell

  • Members
  • 1,616 posts
  • Gender:Male
  • Location:Cambridge, UK
  • Interests:Carpentry, Programming & Athletics(most sports)

Posted 11 May 2011 - 04:04 PM

Even after multiple passes of $R data, information may still be recoverable/viewable. (after 5 passes of random data, about 3% of bits won't have cycled at all and 6% will have only cycled once). After destroying the data several passes of random data may be used to hide the fact that data was destroyed, but simply overwriting with several passes of random data doesn't guarantee that no data will be recoverable.
No fate but what we make

#4 OFFLINE Alan_B

Alan_B

    Super Hero

  • Members
  • 4,280 posts
  • Gender:Male
  • Location:Lancashire, England

Posted 12 May 2011 - 07:20 AM

Even after multiple passes of $R data, information may still be recoverable/viewable. (after 5 passes of random data, about 3% of bits won't have cycled at all and 6% will have only cycled once). After destroying the data several passes of random data may be used to hide the fact that data was destroyed, but simply overwriting with several passes of random data doesn't guarantee that no data will be recoverable.

You are being far too clever. I think you are raising needless fears.

Firstly, what did you mean by $R data?
My first thought was that you were speaking of the overwriting of $R data,
which from the '$' I understood to be a metadata file type.
Finally I realised you meant Random data. Why not say what you mean?

A single 8-bit byte can hold any one of 256 values, ranging from 0 through 255.
When you write a random byte of data, each of the 8 bits has a 50% chance of NOT being set.
A second random byte results in each bit having a 25% chance of NEVER being set,
a third 12.5%,
a fourth 6.25%,
a fifth 3.125%.

Your maths may be correct but the application is flawed.
On average out of 32 bits only 1 bit has never been set.
If the storage medium were Punched Paper Tape, then 3 bytes would have all 8 holes and only one byte would show a single position where a hole never happened.
Yes, that bit of data can be recovered, but it would not tell a wife your Swiss Bank Account number or the name of a mistress.
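The figures above can be checked directly: each random pass leaves a given bit unset with probability 1/2, so after n independent passes the chance it was never set is (1/2)^n. A quick check (the helper name `p_never_set` is just for illustration):

```python
def p_never_set(passes):
    """Probability a given bit was never set to 1 across `passes`
    independent writes of uniformly random data."""
    return 0.5 ** passes

for n in range(1, 6):
    print(n, p_never_set(n))   # 0.5, 0.25, 0.125, 0.0625, 0.03125

# On average, out of 32 bits, 32 * 0.03125 = 1 bit is never set after 5 passes.
print(32 * p_never_set(5))     # 1.0
```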

A magnetic Hard Disc Drive uses analogue values which are not completely obliterated when over-written,
so a single bit that looks like it was never set could actually represent a place where it was set a long time ago but has been cleared many times since.

By studying the analogue values one could detect the difference between a bit that was set for the last two writes and one that was set only on the last write,
excepting that drives do not write zeros and ones but encode and write high values in either direction,
and a series of data bits with the same value result in magnetic values in different directions.
I do not think this analogue information is available for normal software to recover and view on an unmodified and unopened computer.
I think it requires dismantling of the HDD - not many wives would do that!

I suspect that the greatest danger is undeleted data you are not aware of, rather than data you wiped.

#5 OFFLINE ident

ident

    Needs More Cowbell

  • Members
  • 1,616 posts
  • Gender:Male
  • Location:Cambridge, UK
  • Interests:Carpentry, Programming & Athletics(most sports)

Posted 12 May 2011 - 03:23 PM

You are being far too clever. I think you are raising needless fears.


Hi Alan. I had a few drinks when I wrote that last night and wasn't really thinking; I ended up overcomplicating things. In view of that, I will rephrase my post in a more user-friendly way.


OP: My interpretation of your question is that you are worried that, after overwriting data, information may still be recoverable.

A single pass should be perfectly sufficient when it comes to overwriting sensitive data. When overwriting the whole volume, one pass may leave a small percentage behind, but a second pass should clean this up. If that 1% of data is valuable enough to be concerned about, then the data is valuable enough for you to incinerate the drive in the first place, and as they say:

Nuke it from orbit because it's the only way to be sure
No fate but what we make

#6 OFFLINE Augeas

Augeas

    Moderator

  • Moderators
  • 3,085 posts
  • Gender:Not Telling
  • Location:Where Stuff is made, UK

Posted 12 May 2011 - 05:49 PM

Ah, I remember (just) paper tape. However, hard disks do not simply store and detect voltage peaks as ones or zeroes, but as flux reversals, which could denote either one or zero. It's the flux reversals that are interpreted as a binary stream.

Furthermore, for years the method of detecting voltage peaks, or spikes, has not been a simple threshold value (as at high-density packing rates the voltage spikes run into each other) but Partial Response Maximum Likelihood (PRML) and EPRML processes. In PRML the signal (the Partial Response) is sampled in an Analogue to Digital Converter and the samples are processed by a Maximum Likelihood Sequence Detector to produce a viable digital signal. Maximum likelihood techniques rely upon the detection of data sequences rather than bit-by-bit detection, the maximum likelihood detector often being in the form of a Viterbi algorithm.
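The flux-reversal point can be illustrated with NRZI, a simple encoding in the same family as what drives build on (real drives layer RLL and PRML on top, so this is only a toy model): a 1 is recorded as a level change, a 0 as no change, so the absolute level on the medium never directly equals the user's bit.

```python
def nrzi_encode(bits, level=0):
    """NRZI: a 1 becomes a level change (flux reversal), a 0 no change."""
    out = []
    for b in bits:
        if b:
            level ^= 1        # reversal encodes a 1
        out.append(level)
    return out

def nrzi_decode(levels, initial_level=0):
    """Recover the bits by looking only at transitions, not levels."""
    bits, prev = [], initial_level
    for lv in levels:
        bits.append(1 if lv != prev else 0)
        prev = lv
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert nrzi_decode(nrzi_encode(data)) == data
```

Note that the recorded levels `[1, 1, 0, 1, 1, 1, 0]` look nothing like the data bits; only the pattern of changes carries the information.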

Whilst I can look at a diagram and read the narrative, the whole process is extremely complex, differs from drive to drive, and is beyond my comprehension, let alone explanation.

All that complexity produces a digital data stream that then has to pass through several decoding processes to translate it into user data before sending it to the O/S. User data is never, and has never been, written to disk in its original form.

So you can't really leave bits behind during an overwrite, as they were never actually there in the first place. And even if you did it would be useless without the disk controller. A disk is an amazing device, but the controller is King.

#7 OFFLINE Alan_B

Alan_B

    Super Hero

  • Members
  • 4,280 posts
  • Gender:Male
  • Location:Lancashire, England

Posted 13 May 2011 - 04:32 AM

My first experience was an ASR33 TeleType writer with Paper Tape Punch and Reader.
The whole thing ran at 110 baud with 1 start bit, 8 data bits and 2 stop bits,
good for 10 c.p.s. (characters per second), so long as the end of line was Carriage Return then Line Feed.
The other way round, and the first characters on the next line hit the paper whilst the carriage was still thundering from right to left.
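The 10 c.p.s. figure follows from the framing: 1 start + 8 data + 2 stop = 11 bit times per character, and 110 / 11 = 10. A quick check (the 1-stop-bit setting for the later 1200 baud interface mentioned below is my assumption, since 1200 / 10 = 120 matches the stated figure):

```python
def chars_per_second(baud, data_bits=8, start_bits=1, stop_bits=2):
    """Throughput of an async serial link: baud rate divided by
    the number of bit times in one character frame."""
    return baud / (start_bits + data_bits + stop_bits)

print(chars_per_second(110))                # 10.0  (ASR33: 11-bit frames)
print(chars_per_second(1200, stop_bits=1))  # 120.0 (assuming 10-bit frames)
```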

I had to start up the system and start loading an editor program from P.P.T. into the microcomputer,
and then take a coffee break whilst a few kilobytes of source code loaded.

If I was not first in at work, a colleague would snaffle the good ASR33 and I was left with the worn-out standby,
and I had to configure my microcomputer to deliver 7 stop bits so the ASR33 printer could keep up at less than 7 c.p.s.
I cannot remember how many extra Rubout characters I had to send after the CR/LF before sending the next line of text.

Then I got an interface that ran at 1200 baud = 120 c.p.s.
Display on 12 V battery powered portable TV
Storage on Audio Cassette modulating 1200 or 2400 audio tones at 30 c.p.s.
After a few (for me) simple tweaks the Audio handled 240 c.p.s.

Then I designed a data logger with Western Digital controller for a Floppy Disc,
and got into pre-emphasis control related to track position.
It worked nicely until someone optimized the P.C.B. layout and the signals got confused.

Hard drives are a different ballgame.
It would be a challenge to my sanity.
I just sit back and look in admiration at those who can play it.
Augeas you have delved into this far deeper than I.

Regards
Alan

#8 OFFLINE nikki605

nikki605

    Advanced Member

  • Members
  • 314 posts
  • Gender:Male
  • Location:Tampa, FL
  • Interests:NASCAR

Posted 13 May 2011 - 08:58 AM

Boy, does your post take me back and make me feel old, Alan. :rolleyes: My first job in computers after graduating college was with RCA when they still had a computer division competing with IBM 360 mainframes. I worked on the old RCA 301's and 501's (later, the Spectra 70 series), both of which used paper tape readers to start their programs and for data entry. WOW! How far we have come. ;)
Win7 Pro x64 SP1 Desktop (Speccy) - Win7 Pro x64 SP1 Laptop (Speccy)