File storage

BrentB

I just copied all files from 5 different computers to one PC for easy access and needed only 120 GB for file storage after removing most of the junk. I never knew this before or thought about it, and always bought large drives. It will take some time to fill the new 320 GB HD.
 
IMHO & to corrupt an old saying,

"You can never be too rich, too thin or have too much extra space on your hard drive."

Once you start downloading & archiving video, it's nice to have plenty of space :thup
 
I was told that once a drive hits 50% capacity, performance begins to degrade as files are shifted around. Anyone know the truth of this? I know that one computer I had got cranky as I got near 80% capacity.
 
That's why you need to "defragment" it, Matt. If all the file parts are together, it should run fine.
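
Just to picture what "all the file parts together" means, here's a toy Python sketch (the disk, files, and block layout are invented for illustration; a real defragmenter works on filesystem structures, not Python lists). It counts how many pieces a file is in, then compacts the disk so each file's blocks sit side by side:

[code]
# Toy model: the "disk" is a list of blocks; each block belongs to a file or is free (None).
disk = ["A", "B", "A", None, "B", "A", None, "B"]  # file A is in 3 pieces

def fragments(disk, name):
    """Count contiguous runs of one file's blocks; 1 means unfragmented."""
    runs, prev = 0, None
    for block in disk:
        if block == name and prev != name:
            runs += 1
        prev = block
    return runs

def defragment(disk):
    """Rewrite the disk so each file's blocks are contiguous, free space at the end."""
    packed = []
    for name in sorted({b for b in disk if b is not None}):
        packed.extend(b for b in disk if b == name)
    packed.extend([None] * (len(disk) - len(packed)))
    return packed

print(fragments(disk, "A"))   # 3 -> file A is scattered
disk = defragment(disk)
print(disk)                   # ['A', 'A', 'A', 'B', 'B', 'B', None, None]
print(fragments(disk, "A"))   # 1 -> all parts together
[/code]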

Charlie
 
Matt Gurnsey said:
I was told that once a drive hits 50% capacity, performance begins to degrade as files are shifted around. Anyone know the truth of this? I know that one computer I had got cranky as I got near 80% capacity.

Nope, news to me.
I use defrag tools. The first ones were Optune and SpeedDisk, then the built-in ones, and on one PC I had Diskeeper. Hard drives are mechanical and become unreliable with age. Using Defraggler now.
 
That you don't have to defragment is only one of many advantages of Linux, or any other Unix flavor.
 
smittypaddler said:
That you don't have to defragment is only one of many advantages of Linux, or any other Unix flavor.

Including Mac OS X. :mrgreen:

Warren
 
As I understand it, fragmentation does occur in Linux flavors and Mac OS, but defragmentation is built into the file system.
 
This is one of those things I take for granted, since I've been using Linux for so many years. Linux isn't free from fragmentation. On any freshly built filesystem, if you allocate files A, B, and C, then delete B, you have a hole in the middle: fragmentation. The difference is, Linux filesystems use some very sophisticated algorithms for keeping track of free space, far beyond the simple "quickcell" concept implemented in IBM's MVT operating system 40 years ago, where blocks of free space of similar size were maintained in separate queues. These algorithms are sufficiently efficient that, unless you're changing the size of a disk partition, you never need to run a defragmentation program. The thing is, these algorithms are open source, protected under the GPL (GNU General Public License), which requires any developer who uses them to make their source code open as well. Microsoft can't avail themselves of many of these concepts without opening their source code for anyone to see. They want their OS to remain proprietary; a huge disadvantage when it comes to future development.
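
For anyone curious, here's a rough Python sketch of that "separate queues for free blocks of similar size" idea (the size classes, class name, and methods are all invented for illustration; no real filesystem does it exactly this way). Free extents are binned by size, so the allocator can grab a close-fitting hole instead of always carving up a big region and littering the disk with leftovers:

[code]
import bisect

# Hypothetical size classes, picked for illustration only: a free extent
# of N blocks is queued under the smallest class that covers it.
SIZE_CLASSES = [1, 4, 16, 64, 256]

class FreeSpaceManager:
    """Toy allocator keeping free extents in per-size-class queues,
    loosely in the spirit of the MVT-style scheme described above."""

    def __init__(self):
        self.queues = {c: [] for c in SIZE_CLASSES}  # class -> [(start, length), ...]

    def _class_for(self, length):
        i = bisect.bisect_left(SIZE_CLASSES, length)
        return SIZE_CLASSES[min(i, len(SIZE_CLASSES) - 1)]

    def free(self, start, length):
        """Return an extent to the pool, binned by its size."""
        self.queues[self._class_for(length)].append((start, length))

    def allocate(self, length):
        """Take the first free extent that fits, searching the
        closest-fitting size class first, then larger classes."""
        lowest = self._class_for(length)      # smaller classes can't possibly fit
        for c in SIZE_CLASSES:
            if c < lowest:
                continue
            for i, (start, extent_len) in enumerate(self.queues[c]):
                if extent_len >= length:
                    del self.queues[c][i]
                    if extent_len > length:   # give the leftover tail back as a smaller hole
                        self.free(start + length, extent_len - length)
                    return start
        return None  # nothing fits; a real system would grow, merge, or compact

mgr = FreeSpaceManager()
mgr.free(0, 100)             # one big free region: blocks 0-99
print(mgr.allocate(10))      # 0  (carved off the front, 90-block tail re-queued)
print(mgr.allocate(3))       # 10 (reuses the tail instead of a fresh region)
[/code]

Modern Linux filesystems like ext4 are far more elaborate (extents, delayed allocation, a buddy-style multiblock allocator), but the basic idea of organizing free space so holes get reused cleanly is the same.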
 
I saw some kind of pill advertised on TV that is supposed to help your hard drive... Anybody try it? Do you just crush the pill up and sprinkle it on the computer?
 