Also, thanks for the suggestions. Versioning might be next on my list, though one could achieve similar results with incremental backups, albeit in a much less user-friendly way. I may consider it for larger software development projects.
Try it and learn it on smaller stuff. I use it for everything. If you automate backups of the version control archive, you can simply treat checked-in code as backed up. Although mirroring gives a similar if not smoother feel at runtime by automatically copying files, it can overwrite a good file in the backup with broken code when it automatically backs up an incomplete change. I find the TortoiseSVN explorer extension very convenient. Many IDEs also have SVN (Subversion) integrations that may be convenient. CVS has an even larger number of integrations, but fewer features that are useful to me.
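That mirroring hazard can be seen in a toy sketch; the file name and contents here are invented for illustration:

```python
# Toy model of an automatic mirror: it copies the working file on every save,
# so a half-finished save replaces the last good backup.
# (File name and contents are invented for illustration.)
working = {"main.c": "int main(void) { return 0; }"}  # last known-good code
backup = dict(working)                                # mirror starts in sync

working["main.c"] = "int main(void) { retu"           # file saved mid-edit...
backup["main.c"] = working["main.c"]                  # ...and the mirror copies it

# The only "backup" is now the broken file.
print(backup["main.c"])
```

A version control checkout avoids this: the repository only changes when you deliberately commit.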
I use SVN over CVS primarily because it supports versioning of directories.
Valid questions! What I'm seeking to accomplish is multifold. First of all, I want to be able to create and manage partitions for multi-OS booting. This is for software development, when I need to verify compatibility under multiple OSes. It can also be used for redundancy, to keep or create a relatively clean version of the OS for quick recovery. Virtualization would also be an option, if not for a small voice in the back of my head saying that in practice there may be a small difference from the real deal (who would claim 100% error-free virtualization software?).
Who would claim 100% error-free software? You cannot guarantee someone else's stack. They will likely have a different video driver. They will likely have a different screen resolution. They will likely have a different CPU model. They will likely have a different amount of RAM. They will likely have a different amount of hard disk storage. They will likely have a different keyboard. They will likely have a different mouse. They will likely have a different DVD burner. They will likely have a different case. They will likely have a different motherboard. They will likely have a different chipset. They will likely have different drivers running for many things.
And that short list excludes many other things that could be, and likely are, different.
But even that short list yields millions? Billions? Perhaps more permutations, and we have not even gotten into installed programs and their drivers. And what about tools like anti-virus, each with different settings? There are too many permutations.
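For a rough feel of the numbers, here is a back-of-the-envelope sketch; the option counts per component are invented for illustration, not real market data:

```python
# Back-of-the-envelope: invented option counts per hardware component
# (illustrative assumptions, not real market data).
options = {
    "video driver": 50,
    "screen resolution": 20,
    "CPU model": 100,
    "RAM size": 10,
    "disk size": 15,
    "motherboard": 200,
}

total = 1
for count in options.values():
    total *= count  # every combination of choices is a distinct configuration

print(f"{total:,} configurations from just six components")
# → 3,000,000,000 configurations from just six components
```

Each additional component multiplies the total again, which is why testing every real-world configuration is hopeless.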
What is virtualization? Virtualization is simply a standardized, software-created hardware configuration. You have a software video driver, software SATA drivers, and so on.
Even when multi-booting, you have a software video driver, software SATA drivers, and so on.
You are still running the same OS. But instead of an individualized hardware model, you are getting one with software-imposed limits that is run and tested in thousands if not millions of instances daily. This is at least comparable in stability to a production run of a stock-configuration system from a big box store. The client system becomes less stable once it leaves the stock configuration. And some driver updates exist precisely because thousands or millions of users of such a stock system were hit by a hardware bug that needed a software workaround.
Software updates are part of life with consumer hardware.
With 12-16 GB of RAM, you should be able to virtualize a few systems simultaneously. You can start, stop, and save the state of virtual machines. This means you can test on a virgin install every time. Or save a virtual machine's state so that when it reopens, it is already booted with your tools open and ready to work. For most software testing on a 64-bit system with plenty of RAM, it should be a real time saver in many workflows.
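As an illustration of that start/restore/save-state cycle, here is a sketch assuming VirtualBox's VBoxManage command-line tool; the commands are printed rather than executed so the sketch stays self-contained, and the VM name "dev-vm" and snapshot name are made up:

```python
# Sketch of a snapshot-based test cycle, assuming VirtualBox's VBoxManage CLI.
# Commands are printed, not executed; pass them to subprocess.run() to use for
# real. The VM name "dev-vm" and snapshot name are illustrative.
import shlex

def plan(*args):
    cmd = ["VBoxManage", *args]
    print(shlex.join(cmd))  # show the command line that would run
    return cmd

plan("snapshot", "dev-vm", "take", "virgin-install")     # capture the clean state once
plan("snapshot", "dev-vm", "restore", "virgin-install")  # rewind before each test run
plan("startvm", "dev-vm")                                # boot the restored machine
plan("controlvm", "dev-vm", "savestate")                 # suspend with tools still open
```

The restore step is what gives you a virgin install on every run; savestate is what gives you the instantly-resumed working session.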
I should note, I have only used it in small amounts; on 32-bit I lack the RAM to run enough systems.
Beyond that, for older OSes, virtual machines provide a more stable and backwards-compatible platform. In some cases an older OS might not even run on modern hardware without virtualization, due to changes in motherboard technology.
Virtualization praise over.
The downside is the lack of bare-metal access. If you are programming against video card hardware, you may need direct access for testing, and virtualization may not work. But most non-graphics, non-video professional software I have seen does not need direct access to video card hardware.
If this is your case, then skip virtualization.
Remember, the goal of these tools is to improve your workflow and save your time.
Multiple goals are probably best served by multiple dedicated programs. The Acronis TrueImage program looks good, but it also starts to get bloated with the addition of more and more features. As complexity grows, so does the chance of bugs emerging when you least need them.
Without growing complexity, machines cannot do more for us, whether that complexity lives in how we think about our tools or in the tools themselves to make our thinking easier.
some thoughts, <smile>
Sean