What type of graphics cards are you guys using?

Just watched the video… it’s more biased than a political ad, really. Basically, “Wah, nvidia spends millions on development and won’t give it away… wah. AMD can’t keep up because they don’t spend that much and so open source everything so they get help from the community. Why won’t nVidia just use their open source stuff instead of continuing to push the bleeding edge? wah wah”

If I were nVidia and I had a whole team of quality software engineers, I wouldn’t want to start using someone else’s crap either… especially if I were using it for a consulting business.

It completely leaves out how AMD has held the lead on DirectX since version 10 or 11, and how nVidia has constantly had to catch up there.

nVidia aren’t saints for sure but that video is so clearly AMD-biased… and even then he had to admit how awesome nVidia tech is. (And it underscores what I was saying about AMD writing to benchmarks, too.) I wish they’d play nicer… but they are also the prime driver of every cool new thing we have. I cut them some slack.


Personally I have an old AMD Radeon. It was quite good when I bought it some years ago; now friends are recommending I get a modern GeForce GTX. It has served me well and I don’t play new games with fancy graphics, but I should replace it because I can’t profile my game: the GPU goes to 100% and I sometimes miss measurements.

I have Nvidia (two 760s) on Windows; the only problem I have is with Nvidia’s proprietary Linux drivers, specifically with multiple monitors. Though from what I’ve heard, the Radeon 4xx series of GPUs is quite good.

My GTX 760 goes to 100% all the time; I think that’s normal when it’s running stuff.

Is there a benefit to having two cards in SLI for a multi-monitor setup? It would usually be cheaper to buy a single 770, and you’d even get better performance from it than from two 760s.

They really are. My 4850 is able to play lots of modern games at somewhat decent quality. It also runs very cool; the only time the fan has to kick up in speed is when I’m using my GameCube emulator. I’d keep using it if it weren’t reaching the end of its life.

Nebula boy?


I just got two since they were covered by financial aid.


That, and pretty much any slightly more demanding non-2D game I run on it. If it isn’t at 100% the whole time, it’s somewhere in the 80–100% range for sure, all while turning my room into a sauna.

Regarding the update software: it’s called AMD Radeon Software Crimson, and it’s the replacement for AMD Catalyst Control Center.

Intel’s latest on-board graphics series is solid. In fact I’ve never really had too many problems with Intel, other than them being low on power and speed by design. I think they’re underrated: for the money you pay, you get more than a bargain. Intel chips will usually outlive every other component, and they have a broad range of compatibility.

Low temps? Bahaha, take that hunk of a nuclear reactor away from me. (I had an Nvidia 6600 (god, I’m starting to feel old) burn itself crispy two days after the warranty ended; I’ve kinda disliked Nvidia ever since.)

My power supply does that to me.

Yeah, it’s become more and more of a joke in recent years, but still: AMD had to ship the 295 X2 with stock water cooling out of the box to keep it from melting, while even a Titan or a 1080 can get by with a crap reference blower fan.

But on the other hand, with Nvidia reference coolers you need to go get some earplugs to not go deaf and preserve your sanity.

Well that is a biased video in some ways (I wouldn’t go as far as a political ad), and I do cut NV a little slack–enough to honestly consider them as a purchase option. Don’t have a problem with businesses competing, or even becoming a monopoly, if it’s for the right reasons and they’re a monopoly because the market decided it–but they aren’t there yet.

AMD has worked well enough for me for the last few years. The market is so lopsided now that I want to support the underdog as much as I can. It doesn’t hurt that they (admittedly, probably not by choice) work with the community more. NVidia hasn’t earned my trust… I believe that if they became a true monopoly in the discrete GPU market, they’d do all they could to abuse it. Not all companies do that, but their reputation and history don’t indicate that their leadership culture “gets it”.

One way to look at it: AMD is a “tiny” (6 bill market cap) company, which also makes CPUs. Unless I’m missing some big part of this equation, I think it’s amazing they’ve been able to stay as competitive as they have going up against the Nvidia and Intel giants (36 billion and 180 billion respectively).

I’m using the Intel HD4400, the one embedded in my i5. So far it’s working fine, mainly because I’m developing a game for Android, so I try to keep my game’s requirements low. However, I’ve recently tried to record some videos of my current project and it’s a pain. My plan is to get an nVidia card for Christmas, something like this: http://www.evga.com/Products/Product.aspx?pn=02G-P4-2951-KR to be able to record videos without problems and play some more demanding games.


Yup, Intel, whatever it is that’s embedded in the CPU.
I started solely using my job’s laptops as my personal computers, and even though I had one with ATI graphics, I ended up disabling it due to shitty Linux drivers.

I mostly play old games that were ported to linux or are easily emulated and I don’t do anything advanced in my hobby game projects.

I have an Nvidia GTX 770M and an i7 4700M processor in my laptop. It was a good deal back then because there were no AMD-based laptops with enough power. So far I like my setup, and it’s more capable than I initially imagined.

So far I’m with nVidia, and have been for a long time now. Not that I consider them right in every aspect. In particular, that “just press the OK button” approach for the end user means you have to install a bundle of a few hundred megabytes, and it’s called “drivers”. And for 3D Vision you need another few-hundred-megabyte bundle. I personally would prefer some checkboxes so I could download/install/configure just the things I know I need (while they could still leave the whole bundle for users who pick the “default” option). But from the hardware point of view - not benchmarks in this case, I mean the board reference designs and the chips themselves - nVidia looks a bit more solid to me, and not just on one or two boards. And yeah, I’m still impressed by the time they outperformed Intel in transistors per chip; that means something from a system engineer’s point of view. While driver (and game!) optimization may vary in quality and show different results on different tests, bare-metal layout and performance is something that is there once you’ve bought the board, and will still be there with a new driver version.

P.S. The thing that makes me really sad nowadays is that you can’t actually buy an nVidia card made by nVidia. You can’t buy an Intel motherboard anymore. You can’t buy Micron memory (at least not in my country). And many companies that cared about real hardware excellence have just disappeared - DEC, 3DLabs, Advanced Gravis… continue the list yourself. What can you buy now? Oh yeah, a shiny black motherboard with gold-plated heatsinks and 12 integrated USB 3.0 ports… who needs all those ports when the board (the vendor’s top board) is dead within half a year because of a shitty 20-cent electrolytic capacitor? I’m not saying that dropping prices is a bad thing. I’m saying the bad thing is when you can’t buy quality even when you’re ready to pay a high price…

Yep… I don’t know if it’s just mobile crapware that’s to blame for the PC industry’s collapse (hopefully it’s starting to stabilize now) or if it’s some other market trend, like people not being able to afford expensive PCs due to a lousy economy or whatever. Pour less of society’s resources into an industry and you have to sacrifice something… quality and progress, I guess.

I… I don’t profess to know much about hardware, but is it that bad? Everything I’ve bought in the last 5+ years has been budget and lasts forever. I’ve never had a motherboard die in my entire life, and I usually end up buying new hardware purely because the old stuff is so out of date.