The Disk Defrag Difference





I have found that defrag programs are not working as I expected. They don't seem to consolidate. In order to determine which defragmenter does what, I tested them. Since my drive is 640 GB, this is a new challenge for this software, since most of it was developed in the days when HDDs were no bigger than 50 GB. Some of the defrag s/w was of little use to me because I don't need or want MP3 drives or movie drives organised by last-access methods, etc. These kinds of drives are better organised by folder/filename. True, big files don't really need to be defragged as much as small files, BUT if I do a search on a drive or update MP3 tags, then yes, it does make a difference where the MP3 files are physically placed on the volume. Large video files need the least defragging: if they are split into 10 pieces it makes no difference, but if they are split into hundreds of little pieces, then it can start to make a difference.

Windows drives, on the other hand, benefit greatly from smart file placement methods. So I expect my defrag s/w to be able to defrag both data drives and Windows drives. Most defrag s/w either does very little or specializes in Windows drive defrags. Oh yes, I am defragging FAT32 and NTFS in Windows XP. FAT32 is most compatible with my Mac, so I am forced to use it.

I still think the easiest defrag is to copy all data to another drive and then format and copy back.

Power Defragmenter 3.0

Goal: Defragment and Consolidate ALL data

Disk Defragmenter 1.0

Goal: Defragment and Consolidate ALL data

Auslogics Disk (Good Simple Program Little or No placement control)

Goal: Defragment and Consolidate ALL data

Piriform Defraggler (Good Simple Program Little or No placement control)

Goal: Defragment and Consolidate ALL data
Result: Nice, but it FRAGMENTS FILES TO FILL THE FREE SPACE, which is clever because it is fast, but stupid because it fragments more files!!!

PerfectDisk 10 (You have ONLY one choice for File Placement - SMARTPlacement Method) (TSR Ability and Boot Defrag)
 (Some Strange Results)

PerfectDisk's ability to defragment all NTFS metadata
Consolidate Free Space
Goal: Defragment and Consolidate ALL data
Result: DEFRAG FREE SPACE WORKS, but strangely it copies files to the end of the drive and leaves them there, moving files that were fine where they were.
Result: PerfectDisk is different from other programs like UltimateDefrag, since it places the least-modified files at the outside edge of the disk (which is the fastest place, according to UltimateDefrag). It makes no sense; they can't both be right.

"PerfectDisk treats files created or accessed in the last 31 to 60 days as Occasionally Modified, and places them next to Recently Modified files (but closer to the outside edge of the physical disk where access times begin to increase). Finally, PerfectDisk treats files created or accessed more than 60 days ago as Rarely Modified and places them next to Occasionally Modified ones, but even closer to the outside edge. Boot files go all the way to that edge, where they are easy to locate during boot time (and unused thereafter)."

UltimateDefrag 2008 (1st PLACE - once they fix the huge number of BUGS)
(Respects Layout.ini)
This otherwise brilliant defragger suffers from major bugs, and this makes it hard to evaluate properly. For example, when it defrags in a totally unexpected way, is that due to bugs or to configuration problems? Either way, they need to fix the bugs and change the configuration options, which are very badly laid out. It is like they made the config options as hard and as confusing as possible by splitting them into two different locations. It becomes very hard to configure because you don't know for sure which config options the defrag is using. All intuitive use of the config is lost; I found myself endlessly referring to the manual, which is something I almost never do. Of course, if you ONLY want to do a basic defrag then it is fine, but when you want fine control over what UltimateDefrag is doing, believe me, it gets confusing as hell.

"Then take the remaining 20% (the files that you use the most) and place them on the outer tracks of your drive where performance is 180 to 240% that of the inner tracks. You then consolidate them closely together (i.e. maximize seek confinement) - the result is that the data that you actually access - booting your computer, loading programs, frequently used data files are on the area of your disk that is 200% faster"
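For what it's worth, the "180 to 240%" claim in that quote follows from simple geometry: at constant RPM with zoned bit recording, bytes per track scale roughly with track radius, so the outer tracks stream faster. A back-of-the-envelope sketch (every number here — RPM, bit density, radii — is an illustrative assumption, not a measurement from any real drive):

```python
import math

# Toy model: sustained transfer rate vs. track radius at constant RPM.
# Assumes zoned bit recording (roughly constant linear bit density), so
# bytes per track scale with radius. All numbers are illustrative guesses.

RPM = 7200
BITS_PER_MM = 25_000        # assumed linear density along a track
INNER_RADIUS_MM = 20.0      # assumed innermost data track
OUTER_RADIUS_MM = 46.0      # assumed outermost data track

def transfer_rate_mb_s(radius_mm: float) -> float:
    """MB/s streamed when reading one track continuously."""
    bytes_per_track = 2 * math.pi * radius_mm * BITS_PER_MM / 8
    return bytes_per_track * (RPM / 60) / 1_000_000

inner = transfer_rate_mb_s(INNER_RADIUS_MM)
outer = transfer_rate_mb_s(OUTER_RADIUS_MM)
print(f"inner: {inner:.0f} MB/s, outer: {outer:.0f} MB/s, ratio: {outer/inner:.0%}")
```

With the assumed 46:20 radius ratio the model lands at 230%, inside the range the vendor quotes; real drives vary zone by zone, but the shape of the argument is the same.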

File/Folder with MFT and Respect High Performance ON Strict Placement (Sorted)
Goal: Defragment and Consolidate ALL data
Result: This mode goes COMPLETELY MAD at about 35%, after doing hours of work it starts UNDOING the perfect work that it has ALREADY completed!!! MAD!!!

Auto on C: with MFT and Respect
Goal: Defragment and Consolidate ALL data on C: and order with High Speed Auto
Result: This mode finishes with 25% fragmentation remaining on C:. This totally sucks, even if UltimateDefrag does not think those files need to be defragmented; the program did not do what I asked it to do. If I complete the defrag and immediately ask it to defrag C: again, it takes forever. So the question is: if it did it correctly the first time, then the second time I run it, it should have nothing to do and should finish very quickly. I can only come to one conclusion.
Result: Another time it left gaps... It is definitely badly bugged. NOT IMPRESSED, BUT THE NEXT VERSION MIGHT BE VERY COOL.

O&O Defrag (2nd BEST - Pretty Decent Program)
(Respects Layout.ini)

Goal: Defragment and Consolidate ALL data (Order Alphabetically)
Result: If you have to stop and restart, it always restarts at 26% no matter how far along you were with the defrag. Very annoying - UNDOING perfect work. But you can work around this by excluding directories that have already been done!

Diskeeper 2009 (You have ONLY one choice for File Placement -
Advanced Last Accessed Method)
(TSR Ability and Boot Defrag)

Seems to be totally focused on a most-accessed placement method on all drives. Very little choice to defrag any other way.

Does not show you the drive that it is defragging graphically (in order to save CPU usage), which is fine if you are an IT admin but totally irritating otherwise. I want to see what the drive looks like, and I want to know which files are being moved and where. Plus I want to be able to click anywhere on the drive and get info about which files are there and what their status is.

Paragon Total Defrag

Did not use it much. Uses direct disk access.



Donn Edwards

I was initially intrigued by the results published in this article, and have purchased a PC specifically to benchmark defrag programs. But in setting up this PC I have come to the conclusion that your article is bollocks.

PCMark Vantage's HDD benchmark uses the Intel RankDisk technology, where different kinds of disk activity are recorded and can then be played back on other systems, even though the software (e.g. Alan Wake) may not even be installed on the system. This is easy enough to do for the "Read" activity, and will produce identical results irrespective of the layout of the files on the test system. See the PCMark Vantage whitepaper, page 36.

For the "Write" activity, a sandbox file is used, and an attempt is made to place it as close as possible to the original file location. This is fully described in the PCMark 05 Whitepaper, page 17:

"RankDisk records disk access events using the device drivers and bypasses the file system and the operating system’s cache. This makes the measurement independent of the file system overhead or the current state of the operating system. In replaying traces, RankDisk always creates and operates on a new “dummy” file. This file is created in the same (or closest possible) physical location of the target hard disk. This allows the replaying of traces to be safe (does not destroy any existing files) and comparable across different systems. Due to the natural fragmentation of hard disks over time, they should be defragmented before running these tests."

Effectively this means that what you guys were doing was using various defrag programs to move free space around, and the resulting tests were either "better" or "worse" based purely on whether the sandbox file could be created near the start or end of the drive. It had nothing to do with how well or badly the files were optimised.
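Donn's point can be sketched with a toy model (this is not the real RankDisk internals — the disk layout and cost function are invented for illustration): a write benchmark that replays its trace into a fresh "dummy" file, placed in the nearest free space, ends up measuring where the free space is rather than how well the existing files are laid out.

```python
# Toy illustration: two layouts with IDENTICAL file fragmentation (none)
# score differently because the free space sits in different places.

def write_score(disk: list[str]) -> float:
    """Cost of writing a 4-block dummy file into the nearest free space.
    Lower is better; blocks further from the start of the (toy) disk cost
    more, mimicking inner-track slowdown."""
    dummy_blocks = 4
    free = [i for i, b in enumerate(disk) if b == "."]
    used = free[:dummy_blocks]            # closest possible free location
    return sum(1.0 + i / len(disk) for i in used)

# Same files ("A" and "B"), zero fragmentation - only the free space moved.
free_space_at_start = list("....AAAABBBB")
free_space_at_end   = list("AAAABBBB....")

print(write_score(free_space_at_start))  # dummy file lands up front: cheap
print(write_score(free_space_at_end))    # dummy file pushed to the end: dear
```

The "defrag" that happens to shuffle free space toward the fast end of the drive wins the benchmark, even though neither layout has a single fragmented file.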

In other words, the "results" in the article are completely meaningless, because the HDD tests don't measure the placement of actual files on the actual test system at all.

I'm not saying this error is deliberate, just that benchmarking fragmentation is particularly difficult.


Update: I have posted measurements on my blog which illustrates this problem.



I've seen some issues where MaximumPC has both touted defragging as making a HUGE difference in performance and as making NO difference in performance. (Search the back issues, fellow readers, and you will see why it seems Maximum PC speaks from both sides of its collective mouth about defragging.)

Which is it, MaximumPC?

The last paragraph in this "review" identifies exactly what I have been talking about! MaximumPC created this article to compare four defraggers, presumably to see which one was most effective, then it closes with a final paragraph titled "don't waste your money or your time!" If defragging is a waste of time for negligible gains in performance, why run the comparison at all? Are we comparing the slowest of the economy cars here, and if so, what's the point of such a comparison?




I was just reading and commenting on the antivirus article, saying that I would like an alternative set of utilities, one of which was a defragger. However, as stated many times by other posters, your article is a bit lacking.

Lacking in that it used a limited number of defraggers, and only one host environment - a host environment that is pretty much junk, to say the least, and has a life expectancy of about one year... tisk... tisk...

If it were I doing the test, firstly I would run it on a Win2K and a WinXP system; IMHO these are the two primary OSes in the market today (excluding Linux and Mac flavors).

Secondly, I would compare it to the native defraggers built into Windows, and to one of the oldest defrag utilities with a continuing lifespan, Norton's Speed Disk, which has been around for quite a few years and quite a few OS changes. Sadly, like all of Norton's products after about 2004, even it began to slip in desirability, but regardless, it should have been one of the above-mentioned contenders.

Also, startup and shutdown times being what they are, they are always a source of contention - but, as also mentioned, so is the normal hour-by-hour and minute-by-minute opening and closing of app after app, after utility, after music, after game, after... well... after whatever. If in the normal operation of your system you don't notice a difference, maybe it's not working and it's time to try something else. Or if, after running a defragger, you can't boot... well... been there, done that.

Regardless of what you use, I think that in addition to start and stop times, during-the-day performance needs to be considered, as well as data integrity, which wasn't even addressed... an oops? Or, in this day of redundancy, an unneeded consideration?

Well, off to scour the web and MaxPC for utility reviews.

gl all and Happy New Year




When is the defragging Linux guide going to be released?

Why do you have to have Windows to take it to the max?



This was a short-sighted test, examining only two items: defragmentation, then startup/shutdown times. Where on Earth do people get the idea that the benefits appear only in those areas? Where on Earth do people get the idea that those are the only factors to study?

It reminds me of the tests they used to do with speed limits on the roads... they looked at the speed limit and the number of deaths. AS IF the ONLY factor in determining the number of deaths was the speed limit. As if things like how closely you followed the vehicle in front of you were NOT going to change the numbers.

Fragmentation involves the files on your computer being in multiple pieces. So it seems to me the improvements would be related to how smoothly and quickly those files open after defragmentation. Run Defraggler (free, from Piriform, makers of CCleaner), view the file list, making note of the major files (Outlook's is usually one of the worst). Open Outlook and note the time it takes. Close it. Defrag, so Outlook's .ost file becomes one piece. Now open Outlook again. And, by the way, that reduces the amount of movement and work required by your hard drive, so you are adding spin cycles to the life of your drive. It has to work less.
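To put rough numbers on that (a toy model with assumed 7200 RPM-class timings, not measurements of any particular drive): each fragment beyond the first costs roughly one extra seek plus half a rotation before the drive can stream data again, which is why tens of fragments barely register but hundreds or thousands do.

```python
# Toy model of why fragment count matters for a file like Outlook's .ost.
# All timings are assumed, typical-ish figures for a 7200 RPM drive.

AVG_SEEK_MS = 9.0            # assumed average seek time
HALF_ROTATION_MS = 4.17      # 7200 RPM -> 8.33 ms per revolution
STREAM_MB_PER_S = 80.0       # assumed sustained transfer rate

def read_time_ms(file_mb: float, fragments: int) -> float:
    """Time to read a file split into `fragments` contiguous pieces:
    one seek + half a rotation per fragment, plus pure streaming time."""
    positioning = fragments * (AVG_SEEK_MS + HALF_ROTATION_MS)
    streaming = file_mb / STREAM_MB_PER_S * 1000
    return positioning + streaming

for frags in (1, 10, 100, 1000):
    print(f"{frags:>5} fragments: {read_time_ms(500, frags):,.0f} ms")
```

Under these assumptions a 500 MB file in 10 pieces reads almost as fast as a contiguous one, while 1,000 pieces roughly triples the read time - which matches the earlier poster's observation that a video split in 10 doesn't matter but hundreds of pieces do.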

I use Defraggler, Auslogics, and IOBit as my three defrag programs. I follow a certain sequence that involves daily, weekly, and bi-weekly routines using various options. I use Vista at home, and XP at work. And following my routine, NEITHER of the resident defrag programs EVER report that the hard drive needs to be defragmented. And the quick version of these defrag programs finish their work in literally seconds to a couple of moments. The longer runs (Defrag and Optimize in IOBit) usually run far shorter times than you found in your tests. And they remain short as you defrag regularly.

And by the way, the native defrag programs default to waiting until 10 percent fragmentation before they tell you that the hard drive needs the service. If only system slowdowns didn't occur until that number reached 10 percent, we would be fine...



Clearly this defragging topic deserves further study. There are just too many unanswered questions, and everyone has a different theory. I think the mag should delve deeper into this, using this article and its comments as a basis.


"There's no time like the future."


Norbert Grün

    Dear members!

  • GRRR! Your automatic formatting forced me to abuse the unordered list in order to prevent my comment being lumped into one paragraph. Every other trick you suggested refused to work. :-(

  • Under MS-DOS, applications were loaded into memory in one chunk.

  • Thus a defragger was effective.

  • Under Win 3.x, this still happened, but the DLLs were already loaded in parallel, already creating contention for the disk arm.

  • Since VAX/VMS, programs get loaded by the page fault mechanism.

  • This holds for Win32 and the NT line up to Vista, but also for 4.1bsd, the modern BSDs, Linux, and other systems such as Mac OS X; older Mac OSes played in the Win 3.x class.

  • What does this mean?

  • Well, the EXE or ELF binary's structural information will be read in sequentially, then the process's address space is set up and the first page of code will be addressed, resulting in a page fault.

  • The loading process will yield to the kernel which issues the appropriate disk command.

  • In the meantime other processes will have their turn.

  • Then the disk interrupt will fire, the kernel will read the disk block(s) into the physical address space reserved for the new process, and the process will start executing the code.

  • Since a page is usually 4 KB in size, a new page fault will occur after feeding this much code to the CPU (probably already rescheduled by preemptive multitasking); worse, it might contain a branch or call that will trigger another page fault.

  • It is very likely that the disk block following the first one will already have rotated away, or will be somewhere else on the disk in the case of a branch or, even more important, a call.

  • There is a read-ahead strategy for sequential disk access, but this is quite ineffective in this case.

  • Also, given the DLLs and the high number of parallel program starts during boot time of a GUI (Windows being much less sequential than Linux prior to starting X11), contention for the disk arm will foil any attempt to get all the code pages loaded in one run.

  • Thus your measurements of system performance improvement couldn't be as impressive as you hoped.

  • The defragging issue is a historical legacy from the old MS-DOS days, as I pointed out in the beginning.

  • Back in the old MS-DOS defragging days, Heise's c't magazine concluded, after getting not-too-impressive results even with MS-DOS, that defragging is an issue for messies who prefer watching their disks being tidied up instead of tidying up their homes.

  • For Linux, the BSDs, and other *NIXoid OSs, defragging is a non-issue.

  • The Berkeley Fast File System for 4.1bsd offered a strategy of controlled fragmentation, splitting up big files at every megabyte deliberately to spread them evenly over the cylinder groups.

  • Cylinder groups were also employed by OS/2's HPFS, and loading a big bunch of files onto NTFS4 showed a checkerboard pattern under Norton Speed Disk that hints at the same strategy.

  • Also ext2 and ext3 use cylinder-group-like strategies.

  • Unlike Windows boxes, the *NIXoid OSes mostly run as servers, hooked up to the Internet 24/7, where no wee hours occur like in an office.

  • Perhaps a static web server will allow a defragger to catch up, but mail and news servers create and delete comparatively small files in rapid succession, let alone Web 2.0 servers.

  • Databases prefer a raw disk dedicated to them. PostgreSQL splits its data into 16 MB chunks that remain intact over long periods unless you delete lots of the records in a table, which results in lots of occupied empty space. That space is freed only if you copy the records, sans indexing, into a second database and reindex there - but that requires taking the database off line and out of service for a time that is intolerable on the Internet.

  • As a result, defragging will stay a non-issue in the professional world, whereas file systems with controlled fragmentation will be the choice there in order to keep fragmentation at an acceptable level.

  • Look into my profile for my background before flaming.

  • Kind regards

  • Norbert
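Norbert's demand-paging argument can be put into a toy model (all timings are assumed for illustration, not measured): even a perfectly contiguous binary pays a re-positioning cost on many of its page faults, because competing processes move the disk arm between faults - a cost no defragger can remove.

```python
# Sketch of the argument above: streaming a contiguous binary in one
# sequential read vs. demand-paging it 4 KB at a time while other
# processes contend for the disk arm. All timings are illustrative.

PAGE_KB = 4
AVG_SEEK_MS = 9.0           # assumed: arm moved away by another process
HALF_ROTATION_MS = 4.17     # 7200 RPM
STREAM_MB_PER_S = 80.0      # assumed sustained transfer rate

def sequential_load_ms(binary_mb: float) -> float:
    """One seek, then stream the whole (contiguous) binary."""
    return AVG_SEEK_MS + HALF_ROTATION_MS + binary_mb / STREAM_MB_PER_S * 1000

def demand_paged_load_ms(binary_mb: float, contended_fraction: float) -> float:
    """Each page fault re-positions the head if another process ran
    in between (`contended_fraction` of the time)."""
    pages = binary_mb * 1024 / PAGE_KB
    per_page_stream = PAGE_KB / 1024 / STREAM_MB_PER_S * 1000
    per_page_position = contended_fraction * (AVG_SEEK_MS + HALF_ROTATION_MS)
    return pages * (per_page_stream + per_page_position)

print(f"sequential:   {sequential_load_ms(8):6.0f} ms")
print(f"demand-paged: {demand_paged_load_ms(8, contended_fraction=0.25):6.0f} ms")
```

Under these assumptions, paging in an 8 MB binary with even modest contention is dozens of times slower than one sequential read of the same contiguous file - which is why defragmenting the binary barely moves the needle during a busy boot.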



No flaming intended, your argument seems valid, but I'm just unsure what to say here.

While your explanation sounds completely true and straightforward, I'm wondering if possibly there's more to the situation than you have examined. My only reason for saying this is that I have experienced the increased speed from defragging first hand. And I've never had a 3.x OS system. Other people's tests would confirm this.

Although I agree that today, with faster hard drives, HD buffers, etc., it's MUCH less of an issue than when I had an old Win95 machine with the then-"present day" parts. Perhaps with the way XP and 95 worked it was again less of an issue than before, and while I don't deny the concept of "mass acceptance based on hype", especially in the computer world, reputable outfits have run benchmarks that say it makes (made) a difference. What would account for the before-and-after benchmark difference, if not the defragging itself?

I can't speak for Vista, as I have no experience with it.


"There's no time like the future."



So, I still don't get it (and after all the other useless comments to answer this)...





 For XP, the default defragger works fine, but it's really slow.

If you want a much faster defragger, use Auslogics. It seems to keep your computer running just as smoothly. Many people would disagree and say buy one of the others, but why pay when you have two alternatives that work just peachy?

 I used the default until I found another free app. Now I use Auslogics exclusively, and I recommend it for XP. It has one flaw, though: if you open the program, the only option is to start defragging. There's no "analyze" button saying "you are [this] fragmented". But since it works so fast, it's not much of an issue. Again, there are others that people swear by, but they cost money. This one works, and works well.

Finally, note that, as mentioned in the podcast, most defraggers use different standards and methods, so if you defrag with one and then check with another, it might say you are worse off than when you began. Just pick one and ignore that.

hope this [finally] helps.



"There's no time like the future."



Don't know if anyone will ever read this, but here's my useless comments.

While I see Murph's point totally, and XP was not the focus of the article, a paragraph like the following would have brought balance to the Force.

"Unlike XP and its predecessors, today's machines and Vista don't seem to benefit from defragging utilities. On an XP machine, XXXXX is the choice for us here at MaxPC. Let's face it, anything's more useful than the built-in XP defragger, and because of Vista's auto-defrag option, one is more likely to see a vastly more fragmented drive in XP than in Vista, and will usually see improvement after defragging. Thus a third-party defragger for XP and older OSes makes much more sense."

I am the one who called into the podcast asking about defraggers right before your article came out (Strange Karma, that...psychic moment. I hadn't gotten my issue yet, at least), and I was hoping for an answer as to which defrag util to use on XP. Not on Vista, which your article doesn't answer.

In all fairness though, I totally understand that that was by no means the goal of your article. It's something others need to come to terms with as well.

That said, I really enjoyed the article. Well written, and makes me wonder why windows never had an auto-background defragger before. ...oh wait, it's MS. They do Invention, not innovation.

There's no time like the future.



And what of O&O and UltimateDefrag2008? Did you forget?

How about testing in a game environment, say FSX for example which fragments all over the place and isn't really read sequentially?



I posted an idiotic comment before reading the article. my bad. ty for the edit feature.



I have to call BS on this. Let me list the reasons why:

1) Vista? Who uses Vista? I'd venture to guess the installed base of your readers is at least 80% XP.

2) You've been running Vista's defrag weekly? How about getting a truly fragged-up disk to run this on? How about somebody's home rig that's been installing and uninstalling games and utilities, scouring the 'net, while still running the favorite titles they installed two years back?

3) Just measuring boot-up? That's a damn tiny percentage of what I do with my PCs.

4) Why does this frost me? Because I've been using defraggers on Win9x and XP for, well, at least a decade now, I guess. And the reason I have is because it works. I let my kids (PC gamers, demo-triers, and web surfers extraordinaire, 14 & 18 years old) run their rigs for months and then walk by with, in my case, Diskeeper. I run the boot-time defragger first, then the normal defrag from the desktop. I've not measured results because I don't have to. Now, there are times I don't seem to recover much when I reboot, but apps do run much better.

Hell, in running Hellgate London through all the patching, etc. I could tell the differences after a defrag if I'd been playing a while and had one or two patches come my way.

This review is useless because it doesn't give anywhere near a decent, reader-like testbed situation.

Try again, please!

But I do love your mag - boot subscriber here...



First off, we don't give credence to subjective "experiences" and "feelings" when we can use numbers to prove points instead.

Second, you're missing the point of this article. Will you see a difference on a super-fragmented machine when using any defragmentation software on the market? Possibly. We didn't test that as the main focus of the article. But just to note, we didn't see a difference between a 10% fragmented drive and a 0% fragmented drive using Vista's built-in client.

We were actually trying to find performance differences between Vista's free built-in defragmentation client and aftermarket defragmentation software you have to pay for. Based on our testing, we saw no difference whatsoever. Stick with Vista's free client--don't waste your money.

And you don't think an HP desktop machine loaded with common software and used like a "normal" user is a "decent, reader-like testbed situation?" What would you prefer?



I understand completely not giving credence to subjective measurements and terms.

And 10% is NOT very fragmented in my experience.

I didn't read the introduction of this article as Vista-specific. And I reject the idea of Vista as part of the majority of your readers' platforms.

Your average power-user still runs XP with no default auto-defragging service. After a few months they are often heavily fragmented - at least, this is my (subjective) experience.

I don't believe the average business desktop gets as fragmented as quickly and heavily as a gaming and multimedia power-user's PC. Perhaps I misperceive the HP you mention, but it didn't sound like much of a multimedia gaming rig to me.

I appreciate the fact you've looked into Vista's defragmenter and third-party defrag products. Given your refocusing my attention to the fact this was a very specific test with a very narrow focus, I simply believe your test is useless to the majority of your power users.



I'd say it's clear that Vista's not in the majority, but neither is Linux, and I don't hear this kind of outcry when a Linux article appears on the site. Where in this entire article did it say that they were writing this piece for a majority of their readers, as opposed to a segment? Sometimes you just have to step back and say "OK, this one's not for me, I'll read the next one."



It's my turn to SHOUT!


O.K., well, my life's not even close to boring, but the truth is that I do really enjoy the podcast. It helps on the bike ride home from work, the long car trip, the craptastic commercials of FM radio, etc...

I even took to Steam, picked up Orangebox so I could play TF2. P.S. TF2 feels like I am playing in SLOW MOTION after playing UT3 & Crysis for so long. (I read MPC, it's obviously NOT the framerate...)

So even if you guys are not doing the podcast anymore, could you at least let us know, and why?

Let me switch my devotion to another PC mag if you must... but don't just leave us hanging like heroin junkies who just had the price of our favorite smack tripled like a gallon of gasoline!

On that note, skipping the podcast won't do much to lessen our dependence on foreign oil, if that were the reason you skipped a few weeks.

If it's a "side issue" kind of thing, as the poster above mentioned, charge for it! I'll pay 50 cents per week to hear a decent podcast! Do you take PayPal?

Lastly, um can I have a free soundcard... (j/k)

Oh wait, you just posted one. NVM, (except I would pay for it...)

Thanks guys!!!!

MPC set us up the bomb!

THERE ARE ONLY 11 TYPES OF PEOPLE IN THIS WORLD. Those that think binary jokes are funny, those that don't, and those that don't know binary


Talcum X

I have read plenty of reader rants on the lack of a podcast. I know we all love and enjoy the extra tidbits of news, tips, and laughs, but it is only a bonus offering - an extra to all that is the MPC universe. Lest we forget, their primary function as a group is to produce the magazine and all that goes into it (ads not included, of course). Their secondary job is this website. And thirdly, PR functions (LAN parties and the podcast included). If the first two priorities take up all the time they have (and that's not including time for family and a "life outside of work"), then the third function gets put on the back burner, as it's just an extra bonus for our benefit. So let them do their job without a bunch of complaining cuz we didn't get a cookie.
Every morning is the dawn of a new error.



On page three, the Benchmarks table shows Auslogics in the column header instead of PerfectDisk.




Ahh, thanks for the heads-up, Codepath. I will take away Butters's lunch privileges for this transgression.



My experience has been quite different on Vista (work/school laptop) as well as XP (gaming/photo-editing desktop). I use Diskeeper 2008 on both, and it does a better job than the built-in defraggers.

On XP, there is no contest whatsoever between the default defragger and DK. Diskeeper's background defrag function frees me from having to defrag at all, my 4 drives are in excellent shape, and the system is always fast and responsive. XP's defragger cannot handle heavy fragmentation, low space or system files efficiently.

Vista's defragger keeps grinding away at the disks for an eternity, and its I/O sensing is not as good as Diskeeper's, so it can bog down the system occasionally. My system runs smoother, and there is consistently less fragmentation since I installed DK. Applications and Office documents open instantly, and boot-up/shut-down times are improved. This is my personal experience (not a placebo effect either, lol), so take it FWIW.

I do agree that Auslogics is no good. In fact, on Vista and WHS you have to use a VSS-aware defragmenter like Diskeeper if the formatted cluster size is less than 16KB. Otherwise, shadow copies may be purged.



Or maybe Microsoft could actually use some code from 30 years of file system research instead of ignoring it. UFS2, ZFS, and ext3 are all far ahead of Windows' file systems.



PerfectDisk lowered the scores from a fragmented drive??? Are you trying to say the WHOLE Max PC group half-assed the testing? Excuses, excuses. Sounds like you are trying to duck responsibility on this.



You're right, Wandak. It's impossible that a product would be underwhelming if the marketing for said product deems it to be awesome.

10-Kick Ass awards for all!



Your thinly veiled sarcasm aside, your results still show that PerfectDisk and all the other third-party defraggers REDUCED the PCMark scores from a FRAGMENTED drive. Why? I'm not arguing that they are better or even necessarily needed, just that your testing is highly suspect. So either you're upset by my critique and simply making smart comments to sound clever, or you really cannot explain the possibility that your testing was flawed. Does PCMark actually even test for fragmentation? You insinuate throughout that there are all these unknowns you cannot account for - "Stranger still..." - and then there is the whole first paragraph of your conclusion. More questions than answers in this article. Sloppy.


Keith E. Whisman

Hey all you guys are scabs. We are on strike. What do we want "podcast!" When do we want it "Now!"


Keith E. Whisman

I wish I could have a job where I don't have to make a podcast every week.

Actually I do..



I hear what you and the staff are saying, but I agree that it would have been more interesting if Vista was more heavily fragmented... I have Vista's automatic defragmentation set to off and defrag every week or so with Auslogics Disk Defrag - or, since I dual-boot with XP on the E: drive and Vista on the C: drive, I use XP Pro's defrag utility on the C: Vista drive. lol.

And Man I miss the Podcast too :(



Hey The Murph,

Have you heard about DiskTrix UltimateDefrag? It not only defrags but also rearranges your files so that your most-needed files are moved to the outside of the platter for faster access. Version 1 is now available as freeware; version 2 costs money. Perhaps it deserved some consideration in this article?

Speaking of this article, I hate to say it, but it felt pretty half-assed. Right off the bat you admit that your test hard drive wasn't very heavily fragmented, which pretty much negates the rest of the article, especially the conclusion that defragmenting the drive didn't yield much performance gain. Perhaps it would have made a difference if you were defragmenting a drive that actually needed it? You blame this problem on Vista's defragmenter - so why didn't you get your test hard drive from a WinXP machine? Somebody in the building has got to have one, don't they?

This just gives the article a sort of, "eh, whatever" feel, like you didn't really care about doing it right, or you were rushed to get this thing out the door. Sloppy methodology leading to dubious results is not the MaximumPC way!



The point is this: is the Vista defragmenter *better* than the other defragmenters? Not "look at the differences between a 40% fragmentation level and a 0% fragmentation level."

Each defragmentation program will move and shift data around depending on what its individual algorithm sees as fragmentation. While a ten-to-fourteen-percent fragmentation level isn't huge, remember that Vista's is designed to run automatically every week. Thus, if you *ever* get into the 20-30 percent range, something's wrong.

And as a side note, these testing scenarios were decided by the Maximum PC staff -- no need to hang me out to dry on this one. This methodology is what we, as a whole, felt was appropriate.



I see your point. In an article pitting third-party defraggers against the Vista defragger, you could have theoretically started with a drive that was reported to be 0% fragmented, as what each program considers "fragmented" varies, and that variation likely has a direct impact on the overall effectiveness of the software. So the initial state of your drive was really not as important as the state of the drive after defragmentation. You have set me straight. Touché, The Murph. Touché.

This touches on another question, though. Your article is obviously directed at users running Vista, which has me wondering: what percentage of MaxPC readers are actually running Vista? Everybody's been so down on it since it was released, and I clearly remember what a pile of ass Windows ME was, so I have so far saved my cash and stuck with what most acknowledge is the superior version of Windows: XP. How many others have resisted the change? Are we the majority or the minority?

Oh, and I do realize that much collaboration goes on in the labs, so I was not entirely directing my ire specifically at you. If it will make you feel better, we can claim that every "you" in my second and third paragraphs of my first post were the plural "you all there in the lab", rather than "you The Murph".


PS. I, too, miss the podcast.

PPS. I love you guys.



It's all good. <3




you are alive! so where is the podcast?
