For future reference, don't say "PCI slot"; it's a "PCI Express slot". It was confusing me. Anyway, the procedure would be...
- Have the current drivers accessible somewhere without internet access
- Uninstall the previous graphics drivers and, just in case, run a registry cleaner. You shouldn't have to do any more than that.
- If you're really paranoid, you can find driver-cleaner software, but I can't exactly recommend one since the only one I'm aware of is outdated all to hell.
- Turn off the computer
- Install video card
- Boot XP into Safe Mode
- Install the drivers there.
- Restart the computer, go into the BIOS, and look for the option that selects which graphics adapter to boot from
- If there's a choice called "auto", leave it. Otherwise set it to the PCI Express card.
- Plug the cable into the video card
- If you have issues, take out the CMOS battery to reset the BIOS settings and use the IGP port.
Alright, sorry about that. Anyway, I did figure it out after my last post. I just went into the BIOS and chose PCI Express in place of the original PCI option. I shut down, installed my card into the PCIe slot, and plugged my monitor into the card's DVI out. I restarted, got into Windows, and installed the Nvidia driver. It's working great.
I only had integrated GFX before, so I didn't have to uninstall anything.
And I installed the drivers after I put in the GFX card. This is what was confusing me: how can the computer give output via DVI if the driver isn't installed? I know Windows has some kind of generic VGA driver for integrated graphics. I just read that DVI carries both analog and digital signals. Is that why you can get output through a DVI card without installing drivers? If the card were HDMI-only, would I have some kind of trouble?