Solve: Overclocking Asus ROG GTX 1070 Strix OC 8GB - Micron mem?
|
Answer» Greetings. I wouldn't overclock unless you really need to. I personally save overclocking for the end of my hardware's life, when I'm trying to push older hardware to do new tricks and stretch its life another 6 months or a year.

I appreciate the great effort you put into writing your reply. You are absolutely correct: mining Bitcoin itself is fruitless nowadays with Nvidia or AMD, but what I am aiming for is the other coins that are still somewhat "profitable". Granted, one 1070 earns far less than a proper mining rig, but heck, I am just curious as to how, and whether, it actually works. A good advantage for me is that the electricity bill is included in the rent, so there is no extra cost there. I have flashed the card and thankfully it did not die. The reason I flashed it is the fear of what was mentioned in the article above about the card's memory being bad even at stock speeds at times. One question, though: I have downloaded both GPU Tweak II and MSI Afterburner, and I am curious why they show different overclocking stats. The memory overclock in GPU Tweak seems to be at full, while MSI Afterburner shows 0? BTW, I tried increasing the memory clock in Afterburner and the card went haywire (checker pattern on the monitor; I had to restart). The AMD FX-8350 is water cooled (Corsair H55 v2 Quiet CPU cooler), so I believe I won't be running into issues with its heat; idle temps are ~30 C. The problem is that the motherboard is old, so I am unable to use newer CPUs in the future, like Ryzen for example.

Quote from: patio on June 27, 2017, 05:43:55 PM
Decent rig...why you wanna overclock ? ?...For 8-12% performance gain ? ?
It is my first overclockable rig, just testing things around. [attachment deleted by admin to conserve space]

When it comes to graphics cards, if you want to overclock just for the sake of overclocking, then you probably shouldn't. They tend to be far less tolerant of overclocking, largely because the coolers are designed around the factory specifications, so unless the card comes overclocked already, you are more likely to have problems; the best-case scenario is that the fans can keep up, which means the fans will die sooner and the card will be noisier. The 1070 is a rather high-end card anyway; there wouldn't be any reason to overclock it at this point. It's usually best to wait a few years until it starts to be a "lower end" offering; then you can overclock to eke a bit more life out of it. Another good reason not to overclock the graphics card in your case is that, because of your CPU, it won't have any effect: the FX-8350 bottlenecks a 1070. Which leads to the other option, overclocking the CPU. That is perhaps more useful. The 8350 is a few years old now and is at the point where overclocking it can see the biggest gains; it can't catch up to the 1070, but it can at least do better.

Quote
The memory overclock with GPU Tweak seems at full while on MSI Afterburner, it shows 0?
Well, GPU Tweak is ASUS, the same as your card. MSI Afterburner isn't.
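Each utility typically displays its own offset rather than the clocks the driver is actually applying, so a 0 in Afterburner usually just means Afterburner hasn't applied an offset of its own; it doesn't reflect what GPU Tweak (or a flashed VBIOS) has already done. For a tool-neutral view of what the card is really running at, nvidia-smi (which ships with the NVIDIA driver) can report the live clocks. A minimal sketch in Python, assuming a single GPU and nvidia-smi on the PATH:

```python
# Query the clocks the driver is actually applying, independent of which
# overclocking utility (GPU Tweak II, Afterburner, a flashed VBIOS) set them.
# Assumes a single GPU and nvidia-smi available on the PATH.
import subprocess

FIELDS = "name,clocks.gr,clocks.mem,temperature.gpu"

def current_gpu_state():
    # --query-gpu prints the requested fields as comma-separated values.
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={FIELDS}",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    name, core_mhz, mem_mhz, temp_c = [s.strip() for s in out.split(",")]
    return name, int(core_mhz), int(mem_mhz), int(temp_c)

if __name__ == "__main__":
    name, core, mem, temp = current_gpu_state()
    print(f"{name}: core {core} MHz, memory {mem} MHz, {temp} C")
```

Note that nvidia-smi reports the physical memory clock, not the "effective" (multiplied) data rate that marketing figures and some utilities show, which is one more reason two tools can show different numbers for the same card.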
Quote from: BC_Programmer on June 28, 2017, 03:44:59 AM
When it comes to graphics cards, if you want to overclock just for the sake of overclocking, then you probably shouldn't. [...]
In this case I won't be overclocking it much, or constantly; unless, that is, I water cool it, which would be much better than fans.

No need to quote every reply when replying...

Thanks for this info, BC, since I too have the FX-8350 and wasn't aware that a GTX 1070 would be bottlenecked by it. I have a GTX 570 (as seen here: https://www.newegg.com/Product/Product.aspx?Item=N82E16814130593) in my FX-8350 build, which is limited to only 4.0GHz because the motherboard doesn't support the 4.2GHz Turbo mode; I found this out after building it and wondering why it wouldn't ramp up to 4.2GHz when running a benchmark. It's just 200MHz (a 5% overclock under processing demand), but it was slightly disappointing that the $60 AM3+ Gigabyte board didn't have chipset support for Turbo mode. BC, do you know what the biggest GPU is that can be paired with the FX-8350 before the bottleneck starts to occur?

Quote
Another good reason not to overclock the graphics card in your case is that, because of your CPU, it won't have any effect: the FX-8350 bottlenecks a 1070.
Here is my $60 Gigabyte motherboard that doesn't support the 4.2GHz Turbo mode. I wasn't aware that the 760 chipset didn't support it until after the fact. https://www.newegg.com/Product/Product.aspx?Item=N82E16813128565

Quote from: DaveLembke on June 29, 2017, 08:41:25 AM
BC, do you know what the biggest GPU is that can be paired with the FX-8350 before the bottleneck starts to occur?
Something around the R9 290 and the GTX 780 is about where the GPU starts to exceed the CPU. Which isn't to say that a better graphics card would never afford you any benefits with that CPU; it'll just be an "unbalanced" pairing for most games/software.

Cool, thanks for this info; it will help both of us.

YES, resolution makes a substantial difference. With my 770 I found I had to reduce many titles to 1920x1080 (from my monitor's 2560x1440) to get playable performance (which wasn't bad, as 2560x1440 is huge). I replaced it with a 1070, and now it runs effortlessly even at 2560x1440. I think perhaps the 4770K I had "bottlenecked" the 1070 as well: when I decided to fiddle with overclocking, threw in a 212 EVO cooler, and cranked it to 4.5GHz (from 3.5GHz), performance improved in many games. This applies regardless of the game, not just the latest ones. I found Need for Speed: Shift unplayable on my old desktop, which had once played it perfectly with a slightly lesser card than it has now (it has a 9800GTX+ now; it was a 9800GT then); it was getting perhaps one frame every 5 seconds just at the menu. The reason was simple: it defaulted to the monitor's 2560x1440 resolution. Reducing it to 1920x1080 made it run at full speed again.

As far as overclocking goes, the only overclocking I've done is on the very system I'm using now, the 4770K. I replaced the stock cooler with a Cooler Master 212 EVO and cranked the clocks up to 4.5GHz (from 3.5GHz) without any issues; however, I backed it down to 4.0GHz, since 4.5GHz required raising voltages, which I didn't like. I liked the cooler so much that I actually got another one, which I installed only a few days ago (Tuesday) into the aforementioned "old desktop", as the QX6700 in it runs quite hot.

I think the most interesting part of overclocking is how easy it is now. It used to be that overclocking meant actually desoldering and replacing a crystal on the motherboard; then you could overclock by changing jumpers (e.g. Socket 7); then you could do it via BIOS settings. Now you just fire up a utility and it pretty much does it for you. Takes all the fun out of it if you ask me.

My understanding is that for any overclock one wants to verify stability: for the CPU this means running Prime95; for the GPU, something like FurMark. A few hours is usually enough, and you can verify that the temperatures don't get too high.
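To take some of the manual watching out of that, you can log the GPU temperature while FurMark (or any load) runs and flag anything that crosses a ceiling. A rough sketch in the same vein, again polling nvidia-smi from Python once per second; the 84 C limit is only an illustrative placeholder, not a spec for any particular card:

```python
# Log GPU temperature once per second during a stress test and warn
# if it crosses a chosen ceiling. Stop with Ctrl+C.
import subprocess
import time

LIMIT_C = 84  # illustrative threshold; pick one appropriate for your card

def gpu_temp_c():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip())

if __name__ == "__main__":
    try:
        while True:
            t = gpu_temp_c()
            flag = "  <-- over limit!" if t >= LIMIT_C else ""
            print(f"{time.strftime('%H:%M:%S')}  {t} C{flag}")
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```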
Cool, thanks for the info, BC... I just realized that I deleted my additional info because I thought maybe I was rambling, and you must have caught it and read the full extent before I edited. The content I edited out, in case anyone is wondering, was that I run my games at 1280x1024 on a 19" Samsung monitor over a VGA connection, and games seem to run better at this resolution because the GPU isn't having to render for a higher-definition display. I have an FX-8300 at 3.3GHz and an FX-8350 at 4.0GHz, and between the two builds with this GTX 780 of mine I saw no difference on the 700MHz-slower FX-8300. Additionally, Witcher 3, the most resource-heavy game I have, calls for a GTX 660 minimum, but I got it to run just fine with the GTX 570 in my FX-8350 build. I put the better GTX 780 into my FX-8300 build because I like running that system better: it runs way cooler and isn't fan-noisy at 3.3GHz and a 95 W TDP, versus the 125 W TDP of the 8350. I tried Witcher 3 on a GT 730 (128-bit, 2GB VRAM), and at 1280x1024 it's sluggish. I didn't try reducing the resolution any further, because I know the GT 730 isn't really a gaming-performance card; I got bitten by marketing, thinking "GT" was good for gaming and would save me money over buying a GTX. How greatly wrong I was.
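The resolution effect described in both of these posts is mostly raw pixel count: 1920x1080 has only about 56% of the pixels of 2560x1440, and 1280x1024 only about 36%, so the GPU shades roughly half to a third as many pixels per frame before any other settings change. A quick back-of-the-envelope comparison:

```python
# Compare per-frame pixel counts for the resolutions mentioned in this thread.
RESOLUTIONS = {
    "2560x1440": (2560, 1440),
    "1920x1080": (1920, 1080),
    "1280x1024": (1280, 1024),
}

base = RESOLUTIONS["2560x1440"][0] * RESOLUTIONS["2560x1440"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.0%} of 2560x1440)")
```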
Added content: the 9800GT in my wife's computer (my video card from years ago; she gets hand-me-down upgrades) actually gets better frame rates than the GT 730 in some games, though with some differences in graphics quality when it comes to shadows and water effects in games like World of Warcraft. I made sure the graphics settings were the same on both systems, and they were. My wife is running a Core 2 Quad Q6600 2.4GHz with 6GB RAM, and the good thing is she hasn't complained yet about her computer being too slow. Originally her system was a Core 2 Duo E6600 2.4GHz, and that started to lag. So I tried a Pentium E5400 2.7GHz, 300MHz faster and a newer core design, and it benchmarks slightly higher; that made her happy for a little while. Then I found Q6600 quad-cores for $15 each on eBay and bought her an upgrade to 4 cores; that Q6600 is awesome and stays cool on the heatsink that came with that HP tower.

Also, years ago there was an overclocking hack for taking a 486 DX 66MHz to 80 or 83MHz. I never did it, for fear that I would kill my computer; this was around 1995, and friends with money had Pentiums. The 486 was the best system I had; my spares for when it was down were a 386, a 286, and a couple of 8088s, with the 386SX 40MHz (8MB RAM, Windows 3.11) being the only other computer that had dial-up AOL 2.5 internet over a 14.4 modem. The last 486 I had was a 486 DX4 100MHz, and it wasn't that impressive running Windows 95 on 24MB RAM, so I am glad I never tried that hardware hack to overdrive the earlier 486 I had. The Pentiums were still beating the pants off the 486s even when the 486s were clocked higher, because of core design and many more transistors. What killed that 486 DX4 100MHz was the Y2K bug affecting its board: AOL (which might have been version 3.0 or 4.0 at the time) didn't like the system date/time differing from the actual date, since site certificates required a match with the computer's date/time.

So I gave that system away to a neighbor who was interested in offline DOS games and such, and I scored a Y2K-compliant Pentium 75MHz for $20 at a computer show/swap meet. No one wanted that poor full-height Dell tower with its empty 5.25" bays. The seller booted it up, and it had a 6.4GB drive running a healthy copy of Windows 95 and 64MB of EDO RAM, so I bought it for $20, moved a spare 4.3GB Bigfoot HDD into one of the bays, and put a 48x CD-ROM in the other. 10.7GB of storage was great back then, when most games were 600MB or smaller installs from a single CD. The computer show was pushing Linux and had teams of people installing it; the deal was bring what you have, or buy something there and then, and they would install Linux on it. I was impressed that the machine had a healthy copy of Windows 95 running on it, so by not formatting that drive and not installing Linux, I sort of scored an extra copy of Windows 95 for free that day. I ran whatever build someone else had done on it, which wasn't illegal, since ownership of the computer was transferred and it had the Windows 95 badge on it. I made sure it was free of viruses before using it for email and so on, since back then it wasn't uncommon to buy a used computer and get more than you expected.
|