
GPU Assistance?


I'll post my PC specs first before anything:

Motherboard: Gigabyte GA-H61M-S2PV rev.2.2
Processor: Intel® Core™ i3-3240 (3M Cache, 3.40 GHz)
RAM: Kingston HyperX Beast 8GB kit (2 x 4GB) DDR3
OS: Windows 10 Pro

Now, I've had this PC for about 3-4 years now, roughly. When I started online classes, they sent out all the components and our first task was to build our very own PC from scratch. So this was (and is) my first and only build ever. It never gave me any problems whatsoever. However, something was always off to me whenever I would try to run games. They would have low FPS, even on low settings, for a lot of games. And I thought that was weird, because the GPU that came with the build is an EVGA GeForce GT 730 4GB DDR3 128-bit Dual DVI mHDMI graphics card. What I found out was that my PC was never using the GPU, only running the onboard graphics, which are the standard Intel HD Graphics (yay). So my problem (and question): what seems to be the problem? Did I do something wrong initially when building the PC? Is it the GPU itself, the motherboard, etc.? I'm really lost, and I'm probably going to just get a new motherboard and case soon anyway, but I want to know if there is anything I can do first. This is my baby.

The GT 430 video card is very weak by today's standards, and 3 or 4 years ago it was weak then too; I'm surprised they set you up with that hardware, although I understand it was a low-cost, hands-on build. I'd junk that video card and upgrade to something newer. A GTX 1050 can be purchased for around $100 new. As long as the display is connected directly to the video card and the driver for that video card is installed for your OS, it should work flawlessly.

*The only concern I have is that if your display is VGA-only, you will need a display with a DVI or HDMI port to connect to the better video card. I bought a GTX 1050 and tried to use a DVI-to-VGA adapter, and sadly analog DVI connections were only available on the GeForce 700 series and prior, as they dropped legacy support for VGA in everything newer. I ended up returning the video card for a refund of the $120 instead of buying a new digital-DVI monitor for it. If you want to stick with the VGA display you have because DVI or HDMI is not available, you could buy a GTX 780 video card and go that route for much better performance than the GT 430. There is also a GT 730 out there, as I have one of these too in one of my builds, but the GTX cards are far better than the GT cards, so it's worth spending a little extra for the better performance, instead of noticing a difference with the GT 730 and then, as you start playing games, finding that the frame rates are better than the GT 430 but still lagging.

Lastly, if you decide to go with a video card upgrade, the GTX 1050 comes in two flavors. One is powered entirely off the PCI Express slot's onboard power; others require a 12V connection so the power supply feeds the card directly rather than through the motherboard's PCI Express slot. The cards powered directly from the power supply generally have better performance than the cards that rely on just the wattage limit of the PCI Express slot's power (75 W). So if you decide to get a GTX 1050, be sure to get the one without the 12-volt power connection if your power supply is lacking the cable; however, if your power supply has one tucked to the side inside the case, I'd go with a better video card that takes a direct 12-volt connection. Those are clocked faster and draw more power, which equals better performance, versus a card that is underclocked to stay cool and keep power consumption within the maximum wattage draw of the PCI Express slot.
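If you end up with a card Windows recognizes, one way to see how close it runs to that power ceiling is nvidia-smi, the command-line tool that installs alongside the NVIDIA driver. A minimal sketch in Python, assuming nvidia-smi is on your PATH (older cards such as the GT 730 may report [Not Supported] for some fields):

```python
# Ask nvidia-smi for live power draw, the board power limit,
# and the current graphics clock.
import subprocess

query = "name,power.draw,power.limit,clocks.current.graphics"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={query}", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

A slot-powered card should show a power limit at or under the 75 W the slot can deliver; externally powered cards will report higher limits.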

*Note: The GeForce GPUs mentioned above, the GTX 780 and GTX 1050, are not the latest and greatest, but they are plenty for most games. The GT 730 does way better than the GT 430 that I also own, but games like Witcher 3 only play well on the GTX 780 and newer; I've tried the GTX 260 and GTX 560 with Witcher 3 and they play it, but with lag. It comes down to what games you want to play and the requirements for those games too. Witcher 3 was just an example of a heavy-hitting CPU/GPU game that I own.

Dave, unless I missed something, didn't the OP mention a 730, not a 430?

Quote

So my problem (and question): what seems to be the problem? Did I do something wrong initially when building the PC?

Maybe. You need to connect your monitor to your dedicated graphics card's port to use the dedicated graphics card; otherwise it has no way to show anything. BIOS firmware usually tries to be smart to avoid problems, so if you plug the monitor into the onboard port instead, it will decide you probably want to see something and initialize the integrated graphics.

Of course, at that point, even with the drivers installed, you cannot really utilize the dedicated card's hardware, since nothing is connected to it.

Yeah, it's definitely a 730, not a 430. And I've done that already as well. Should I do something in the BIOS first? I've tried downloading the NVIDIA drivers as well, but the process fails since the computer doesn't recognize that I have the GPU in.

There's the issue right there... the proper drivers aren't installed.

Remove the card...do a few cold boots...then remove all power and re-install the card.
Then install the drivers...it may be best to use the ones that shipped with the card...newer drivers ain't always better drivers.

Let us know.

That's the thing though. I've removed the card and booted without it 3-4 times; the PC is perfectly fine. Then, when I reinsert the GT 730 and boot, the fan starts up and runs; however, when I go into Device Manager, still nothing. Under display adapters, it still only shows Intel(R) HD Graphics. I'm going to try and install the 730 drivers right now. Also, for the record, I've already gone into the BIOS and tried changing the graphics to PCI; still nothing.

There is no need to visit the BIOS until you have installed the proper drivers... waste of time.

It's unclear what you said you had done in relation to my previous post, so apologies if I'm repeating myself, but you've so far not specifically mentioned how you have things plugged in. Is the monitor plugged into the graphics card or the motherboard?

My approach when I switch from integrated to dedicated on a Windows PC:

1. Go to BIOS settings and ensure "Init Display First" or similar option is set to PCI-E
2. Shut down and install card
3. Plug monitor into card's port
4. Boot up into Windows
5. Install drivers
6. Reboot
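After step 5, a quick way to confirm Windows actually sees the card is to list the display adapters it knows about. A minimal sketch in Python, assuming a stock Windows 10 install where the wmic utility is still present:

```python
# List every display adapter Windows currently recognizes. If the
# GT 730 is seated and its driver loaded, it should appear here
# alongside (or instead of) Intel(R) HD Graphics.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController",
     "get", "Name,Status,DriverVersion"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

If the card never shows up here even after a driver install, it is not being detected on the PCI Express bus at all, which points at the slot, the seating, or the card itself rather than the drivers.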
Oh, sorry! When I plug my monitor into the NVIDIA GPU, all I get is a blank screen. However, I will try it the way that you recommended first and see if there is any change.

OK, in that case I'd say a first step ought to be getting the system to show the POST/boot screens while the display is attached to the NVIDIA card. Generally speaking, that should be the "Init Display First" option (or similar); that option effectively dictates which display output gets used.

There is a possibility, I suppose, that the card was dead on arrival and you never noticed, too. But let's stay optimistic!

That's what I'm afraid of. So even if the fan on the GPU runs, there's a strong chance it's still dead? Awesome. I mean, I could always just buy a new GPU anyway. Just wanted to run through my options. Oh, and I tried that already in the BIOS, setting the display to initialize from the NVIDIA card first. No luck.

Quote
BC Stated ... Dave, unless I missed something, didn't OP mention a 730, not a 430?

Strange, as I could have sworn I read 430 the other day. Time for a new set of reading glasses.

If you need to replace the video card and want better performance than you had, though, the GTX 1050 suggestion is still an inexpensive upgrade for a gaming system.

Lol, it's alright, honest mistake. And yeah, quite frankly I'm tired of using Intel HD Graphics. It suffices for now, but it's time for an upgrade.

