1. Solve: pixel shader?
Answer» http://www.geeks.com/details.asp?invtid=188-04E40&cat=VCD Does the card in the link support pixel shader 2.0? It says "Full Microsoft DirectX 10 Shader Model 4.0 Support". Is "pixel shader" support a generic term for the technology? I've got a game that requires "Shader Model 2.0" or higher and I want a cheap card. I don't want to spend too much upgrading this eMachines computer.

Please see the chart here: pixel shader. Since the card is Shader Model 4.0 and the game only requires 2.0, it would run.

Well, thanks. It just occurred to me that "Shader Model 4.0" in DirectX supersedes "Model 2.0", so I guess I'm in the clear if I get that cheap card.

Anyone know how to use 3D Analyze? Instead of getting that card, maybe I can emulate the video card using 3D Analyze. I've tried a few suggestions on which boxes to check while running the game executable, but I always get the error: "Error while injecting dll into target process".

Emulating PS 2.0? I highly doubt you can, and even if you did, the game would run like junk. The only way you can win is to buy an actual card.
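As an aside on the Shader Model question above: the pixel shader version a card exposes can be read straight from the Direct3D 9 caps, which is essentially what a game's "requires Shader Model 2.0" check does at startup. The following is a minimal sketch, not anything from the thread; it assumes a Windows machine with the DirectX 9 headers installed and a link against d3d9.lib, and the printed messages are made up for illustration.

```cpp
// Sketch: query the default adapter's pixel shader version via Direct3D 9 caps.
// Assumes d3d9.h is available and the program links against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) {
        printf("Direct3D 9 is not available.\n");
        return 1;
    }

    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        printf("Could not read device caps.\n");
        d3d->Release();
        return 1;
    }

    // PixelShaderVersion packs the major/minor version; D3DPS_VERSION(2, 0)
    // is the minimum the game in this thread asks for. A Shader Model 4.0
    // card still reports at least 3.0 through the Direct3D 9 path, so it passes.
    printf("Pixel shader version: %u.%u\n",
           (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
           (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Pixel shader 2.0 or better is supported.\n");
    else
        printf("Pixel shader 2.0 is NOT supported.\n");

    d3d->Release();
    return 0;
}
```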
Well, I actually did buy that card I mentioned, but the one I bought has DVI-I built into it. It hasn't arrived yet, but hopefully it will be here by Christmas. I had to buy a 20-to-24-pin adapter for the power supply that eMachines skimped on, to supply the 12 volts that the card requires. It's a relatively low-wattage card, and the PSU in my machine has room for the required amperage draw on the 12 volt rail.

Quote from: EEVIAC on December 17, 2009, 12:00:36 PM
Well, I actually did buy that card I mentioned, but the one I bought has DVI-I built into it. It hasn't arrived yet, but hopefully it will be here by Christmas.
My 9800GT has two DVI connectors and it came with a converta-ma-jig (DVI to D-SUB) that I'm using now.

Quote from: BC_Programmer on December 17, 2009, 12:50:33 PM
converta-ma-jig
Lol, I think I'll add that to my own vocabulary and use it from now on.

Quote from: EEVIAC on December 17, 2009, 12:57:42 PM
Dual DVI cards usually do come with an adapter from DVI -> VGA
Yes, or in my case, two! One for DVI -> HDMI and one for DVI -> VGA.

Adapter? OH! You mean a converta-ma-jig. Heh, I've heard of watcha-ma-call-its and dilli-ma-jiggers, but converta-ma-jig is a new one.

Got the video card in the mail just now. I'm now the proud owner of an ATI Radeon X700 Pro. I'm a little disappointed because at first glance I could see smudges on the heat sink/fan and dirty spots elsewhere... it's obviously used. Next time I will read the fine print; lesson learned. It seems to work well so far, though. I'm a little annoyed that the FedEx guy set it down by our front door, knocked, and took off; I could see him walking away. Anyone could run off with it and nobody would ever know. I'm willing to bet FedEx has had many complaints about this kind of thing.

Actually, it just occurred to me that I don't need the 20-to-24-pin adapter. I should have thought it through more before I ordered it. My PSU allows 15 amps on the 12 volt rail. The CPU will use 5.4 A max, my fan motors will use 0.4 A max, the DVD/HDD motors will use 2.56 A max, and the video card will use 6.25 A max. That's 14.61 A total, assuming each device draws maximum power, which I doubt will happen. The thing is that each of those loads is drawn off totally separate PSU connectors, except for the fans, which draw power from the board. The CPU has its own 12 volt connector and the SATA drives have their own connectors, leaving the 12 volt pin on the main connector totally free to supply 6.4 amps for the card plus 0.4 amps for the fans. The 20-pin main connector uses "HCS" standard terminals, which allow 9 amps total across the 12 volt pin. The whole point of delivering power over multiple pins is to prevent the terminals from melting, which is why some high-wattage PCIe cards have power connectors on them, balancing the required amperage draw over multiple pins.
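To sanity-check the rail budget in that last post, here is the same arithmetic written out as a tiny program. The per-device figures are the worst-case numbers quoted above, taken at face value, and the 9 A figure is the HCS terminal rating mentioned for the 20-pin main connector; this is back-of-the-envelope only, not a PSU sizing tool.

```cpp
// Worked version of the 12 V rail budget described in the thread.
#include <cstdio>

int main()
{
    const double rail_capacity_a = 15.0;  // PSU 12 V rail rating
    const double cpu_a    = 5.4;          // CPU, via its own 4-pin 12 V connector
    const double fans_a   = 0.4;          // case fans, fed from the motherboard
    const double drives_a = 2.56;         // DVD/HDD motors, via their own connectors
    const double gpu_a    = 6.25;         // Radeon X700 Pro, via the main connector

    const double total_a = cpu_a + fans_a + drives_a + gpu_a;  // 14.61 A
    printf("Worst-case 12 V draw: %.2f A of %.2f A available\n",
           total_a, rail_capacity_a);

    // Only the card and the fans load the main connector's 12 V pin,
    // which the HCS terminals are rated to carry up to about 9 A.
    const double main_connector_a = gpu_a + fans_a;
    printf("Main-connector 12 V load: %.2f A (HCS pin rating ~9 A)\n",
           main_connector_a);
    return 0;
}
```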