What kind of hardware do you guys like to use for programming purposes?

Ok. Well, it used to be the other way around, I guess, or I’m just misremembering. My P-Pros and circa-2001 Xeons had small jet engines in them to keep them cool.

Either way, the AMD CPUs come with their own heatsink and fan, and the CPUs are pre-pasted… and so far it’s been fine on all 10 of my AMD-based workstations here.

I’ve had to scrape and repaste a couple of the older ones when a fan died and that was no different than what I had to do with Pentiums ‘back in the day’. It had squished and spread just like it was supposed to.

Hm, I just thought about the “many slow cores” versus “few fast cores” question.
I think there are strong arguments for the fast cores when targeting 3D games.
Isn’t it the case that jME has only one rendering thread (which is usually how engines work)?
Granted, with fewer cores you can’t run as many world-building threads and A.I. threads.
But you should get higher frames per second with “few fast cores”, like someone already mentioned.
Maybe there’s an error in my reasoning that I don’t see at the moment?

P.S.: I assume that there is a fast graphics card and fast RAM and fast SSD etc. :chimpanzee_smile:

EDIT: And I know you CAN run many threads, but not on dedicated cores.
( I feel like I need to put this here, because of missing “soft filter” )
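To make the cores-versus-threads point concrete, here is a rough plain-Java sketch (not jME API; the class and method names are made up for illustration) of sizing a background worker pool so the single render thread keeps a core to itself:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class WorkerPoolSizing {
    // Reserve one core for the (single) rendering thread and split the
    // rest among background jobs like world building and A.I.
    public static int backgroundThreads() {
        int cores = Runtime.getRuntime().availableProcessors();
        return Math.max(1, cores - 1);
    }

    public static void main(String[] args) {
        ExecutorService workers = Executors.newFixedThreadPool(backgroundThreads());
        // Submit some fake "world building" jobs; they share the leftover cores.
        for (int i = 0; i < 8; i++) {
            final int chunk = i;
            workers.submit(() -> System.out.println("built chunk " + chunk
                    + " on " + Thread.currentThread().getName()));
        }
        workers.shutdown();
    }
}
```

On a quad-core machine this leaves three threads for world building and A.I.; on a dual-core it leaves one, which is exactly the trade-off being discussed.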

Even when my games are bogged down at < 60 FPS, it’s not the CPU that’s working hard.

Usually you will have at least two threads: rendering and physics. Generally you will also have one for sound (if you use the built in sound stuff then this is automatic).

But anyway, unless you are doing crazy complicated stuff on the render thread, usually the GPU is the bottleneck. A faster CPU won’t buy much in that case.
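As an illustration of the “physics on its own thread” split, here is a minimal fixed-timestep loop in plain Java (a toy sketch, not jME’s actual physics integration; all names are made up):

```java
public class PhysicsLoop implements Runnable {
    private volatile boolean running = true;
    private int steps;

    // Advance the simulation at a fixed 60 Hz, independent of rendering.
    @Override
    public void run() {
        final long stepNanos = 1_000_000_000L / 60;
        long next = System.nanoTime();
        while (running) {
            stepSimulation(1f / 60f);  // fixed timestep keeps physics deterministic
            next += stepNanos;
            long sleepNanos = next - System.nanoTime();
            if (sleepNanos > 0) {
                try {
                    Thread.sleep(sleepNanos / 1_000_000L, (int) (sleepNanos % 1_000_000L));
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        }
    }

    private void stepSimulation(float dt) { steps++; }  // placeholder for real physics

    public void stop() { running = false; }
    public int steps() { return steps; }

    // Run the loop on its own thread for a while and report how many steps ran.
    public static int runFor(long millis) {
        PhysicsLoop physics = new PhysicsLoop();
        Thread t = new Thread(physics, "physics");
        t.start();
        try {
            Thread.sleep(millis);  // the render thread would do its own work here
            physics.stop();
            t.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return physics.steps();
    }

    public static void main(String[] args) {
        System.out.println("stepped " + runFor(200) + " times in 200 ms");
    }
}
```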

I wouldn’t say it’s a difference between fast and slow cores, either. Better to have many fast cores than a few super-fast ones, for sure. Besides, we are talking about a machine for game development and not game playing. Developers will use those cores.

Okay, I wrote “I assume that there is a fast graphics card”.
You wrote that you invest $200 in graphics cards. It’s fast, but not that fast.
:chimpanzee_closedlaugh:

Even though JME is part of it, the scope I’m looking at encompasses all different types of programming, at least for my uses: JME, NetBeans for various projects like JavaFX, or web-based programs in Java EE.

I’m not sure if this would be the most workload intensive though.

$200 is the sweet spot for graphics cards. To get much better, you have to double the price… go up again and nearly double again… but in a year that double price GPU will now be $200 and my $200 GPU will be $150. In two years, I can buy another $200 GPU.

You are only renting a spot on a nasty looking curve so you might as well stay in the middle for as long as possible. $200 every two years or so seems to be the right spot in my experience. Though honestly, I also haven’t felt the need to upgrade my GPU in a while either. Mythruna still runs at 140 FPS or so… and that’s fine for me.

Edit: and even at that, the CPU core running the rendering thread is never more than 50%… usually 30% or so.

I think the error in my logic is that, besides running some enqueued tasks to create or delete geometry, almost nothing happens in the render thread. At least that’s how it should be, because of the mantra “separate visuals (Spatial) from game objects (your custom objects)”.
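That “enqueue work for the render thread” pattern can be sketched in plain Java like this (a simplified stand-in for what jME’s enqueue mechanism does; the class here is hypothetical):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

public class RenderQueue {
    // Tasks posted by worker threads, drained once per frame by the render thread.
    private final ConcurrentLinkedQueue<Runnable> pending = new ConcurrentLinkedQueue<>();

    // Any thread may call this; the work itself runs later on the render thread.
    public void enqueue(Runnable task) {
        pending.add(task);
    }

    // Called at the top of each frame, on the render thread only.
    public int drain() {
        int count = 0;
        Runnable task;
        while ((task = pending.poll()) != null) {
            task.run();
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        RenderQueue queue = new RenderQueue();
        List<String> sceneChanges = new ArrayList<>();
        // A "world builder" thread would post scene edits instead of
        // touching the scene graph directly.
        queue.enqueue(() -> sceneChanges.add("attach chunk geometry"));
        queue.enqueue(() -> sceneChanges.add("detach far chunk"));
        // Next frame, the render thread applies them safely.
        System.out.println("applied " + queue.drain() + " tasks: " + sceneChanges);
    }
}
```

The point is that worker threads never mutate visuals themselves; they only hand small closures to the one thread that owns the scene graph.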

In Europe I would say around €200 is the sweet spot as well. My slightly older mid-range R9 card is capable enough of running Battlefront and BF4 on nearly ultra settings
(with slight adjustments, but who needs 16x anti-aliasing in full HD anyway?)

16GB of RAM is also kind of a sweet spot; the modules are cheap enough and reliable enough.

I’m personally no friend of overclocking new hardware. It’s one thing to overclock an old piece of shit to make it last a few months longer, but if you buy new, buy adequate to your needs. You can also get all kinds of nasty, hard-to-track bugs that way: rendering issues, object flickering, random application crashes, etc.

Interesting thread. I don’t think there’s any “right” answer, as it depends what work you’re doing. And it also depends if it’s “work” or “pays the bills” work. If my work machine can complete tasks I need to do faster, that can make a real difference. I can talk about the hardware I use, but it’s not “normal”, as I do a lot of data processing/visualisation.

So I prefer Xeons, as they let you address more RAM (i7s are limited to 64GB), fast PCIe SSDs (I’ve not used an HDD, even in a personal machine, for over 7 years), and two or three Titans (for compute use).

Way overkill for a bit of jME dev though! I use a powerful workstation because I have specific needs; if you don’t know whether you need a Xeon etc., then I’d say you don’t need one.

RAM-wise, buy off the QVL for the motherboard you use.
For the PSU, get a good one; don’t skimp here (as above).
A decent case makes life a lot easier.
I’d far rather have a good monitor, keyboard, mouse, and SSD than any other component in the system.

Spending a bit more on quality/quiet versions of GPUs etc. is generally always worthwhile IME.

…and if you have the option, look for one with rolled edges. Those little metal cuts can be as annoying as paper cuts only you don’t know when you get them until later when you are washing your hands or something. Ouch.

I think the OS is way more important than the hardware here. If you don’t just want to do Java development but maybe also development for embedded systems and others, I’d go for a UNIX / BSD system because of the availability of compilers for different targets. If you want to do iOS / Mac development there’s basically no way around a Mac, which also brings the advantages of the underlying BSD system. Support for compilers and development environments on Windows is abysmal. If you have to feel like you’re “saving money” by building your own hardware (and investing the time in that and in configuring the software), at least install a Linux or BSD distro on it. On OS X it’s as easy as unpacking the Mac and installing Xcode and a BSD package manager.

I like $500-600 every 5 years, especially with the current development pace of cards. Between the 600 series and the 900 series, not much has improved for 1080p. My GTX 680 can still hit over 60 FPS on The Witcher 3 at ultra settings.
You might think “well, I hit 100+ on my 980 Ti or R9 Nano/390X”, but the more common monitor supports 60Hz at 1080p. Newer cards might be worth it if you have a 144Hz monitor at 1440p. But many people buy a motor bigger than the car and leave much of the power unused.
With that said, some games support rendering at 4K and downscaling to 1080p, which makes a huge fidelity improvement. I can’t think of one off the top of my head right now; I want to say Skyrim does it, but don’t quote me.

Make sure to do research on the monitor you will be hooking up to the graphics card, and on the graphics card itself.

:raised_hands: :pray:preach it @normen

It’s mainly people hacking into their graphics card drivers to do that.
This was also the configuration that exposed the GTX 970 memory scandal.

And a GTX 680 (if you can still find one) is about $200 now; I guess it was new 3.5 years ago? My problem with the $500-$600 price point is that you pay 2x or 3x the $200 price but you don’t get twice the performance. And you are on the part of the price-to-performance curve that falls off really quickly. I did it once with a $1000+ pair of cards for an SLI monster and then was very disappointed in it less than a year later.

This is the curve that I mean (poorly drawn):
http://i.imgur.com/o6kgOWU.png

We are renting time on that curve. It won’t be very long before that $500 point slides down to the $200 one as time moves on and then both slowly slide off the back. For the same price, the $200 guy gets to upgrade more often and stay on the flat part of the curve for cheaper overall.

But yeah, needs and preferences vary. I, for one, haven’t upgraded this machine since my GTX 460, coming up on 5 years ago. Graphics are generally good enough that I can still run Mythruna at 140 FPS or so and haven’t felt the need to go higher. I’m very much on the backside of that curve by now, but we are all pretty lucky that the bottom axis of that curve is really ‘high’. These days I mostly game on the PS4, though, so that comes into play also.

(In the lab I built for my kids and me, they are running GTX 650s… they were $100 a year ago, I guess. We built 3 computers that weekend; it was lots of fun.)

Hm… so you bought 3 computers. Your lab has… 10 or so… workstations, you said earlier.
I think that’s the difference: I only have 2 computers here and might buy a MacBook.
I buy roughly one computer every 4 years.
I know that price curve and probably the others too. Some test sites calculate FPS per $ too.

I said I have 10 workstations in my house.

I have two workstations in my home office. (One is the Mythruna server, but it’s a matched pair to the one I’m typing on right now.) I have a laptop for day-job work. There is a quad-core Xeon in the closet and an older workstation that I have turned off (it’s my old dev workstation, a water-cooled SLI thing from a while ago). (There are two computers on the floor which lately have been ‘hangar queens’… one is recently bootable minus a PSU, as that’s my wife’s old computer.) On the main floor, there is my wife’s computer in the kitchen. In the basement, there are the three computers my kids and I built (and I was wrong, they are only 6-core Phenoms, not 8-core). All three (including monitors, keyboards, everything) were just under $2000 combined. Down there is also the house server and a PC hooked up to the home theater that would still play a mean Half-Life 2 if I turned it on. :wink: Oh, and there is the MAME arcade cabinet. Paul's MAME Cabinet

Post-apocalypse, I could probably also rebuild society with just the old parts I have in the ‘where things finally go to die’ closet. Anyone need a P-Pro chip or a 486 DX4? :slight_smile: Though I’d have to team up with a friend down the road who seems to similarly collect random cables… boxes and boxes of them.

Aside from the laptop, I built every one of those machines from parts.

I’ve heard nothing but horror stories about dual-GPU cards (GTX 690 FTW) and SLI/Crossfire.

I appreciate the MS Paint graph and all the thousands of hours that went into it. :smiley:
But if we start at the release of the GTX 600 series: the 680 was released at $500, the 660 at $230.
Two years later the 900 series was released. I’m still benching at the level of the $200 960 card according to synthetic benchmarks (to be honest, before the driver updates I benched higher than the 960). PassMark Software - Video Card (GPU) Benchmarks - High End Video Cards

This year you would be spending another $200 to be equal to me, so you are now at $430 against my original $500; we are $70 apart. I could argue I also got a $50 game and had it much earlier, and the GTX 960 bundle is some free-to-play garbage, but I think that’s all conjecture.
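For what it’s worth, the two upgrade strategies can be compared with a toy calculation (not from either poster, and it assumes prices stay flat, which is of course a simplification):

```java
public class GpuBudget {
    // Cumulative spend over a fixed horizon for a given upgrade cadence.
    public static int spend(int pricePerUpgrade, int yearsBetweenUpgrades, int horizonYears) {
        int upgrades = horizonYears / yearsBetweenUpgrades;
        return upgrades * pricePerUpgrade;
    }

    public static void main(String[] args) {
        // The two strategies argued about in this thread, over a 10-year window:
        System.out.println("$200 every 2 years, 10 years: $" + spend(200, 2, 10)); // $1000
        System.out.println("$500 every 5 years, 10 years: $" + spend(500, 5, 10)); // $1000
    }
}
```

Over a 10-year horizon both strategies spend the same $1000; the real difference is how often each buyer sits on the flat part of the price-to-performance curve.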

We can move to spreading thermal paste vs. squishing it if you like.
I spread, for superior contact. :smiley:

Overall lots of good posts, I appreciate it. So much info, but it’s so hard to decide what to go with…

Could go with AMD, but their new processors are coming out soon, so it would be a waste to invest in one of those now, not to mention the heat issues…

I could go Xeon, but they can get pricey, and also not too sure about them overall for my uses, plus some don’t come with integrated graphics.

Or go with Intel’s new Skylake, but there are complaints about it not being as good as older hardware, bugs and whatnot…

The thing I gather is that more cores = better, regardless (according to @pspeed).

The thing I’m interested in is whether we will be using all of the cores all of the time. I would assume that for JME we will probably use more than 1 core (depending on what’s going on), but what about other applications?

Some small projects might not even use more than 1 thread, so I’m curious about the argument of more cores vs. fewer, faster cores if we don’t really do much multi-threading, and all that jazz?
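As a concrete illustration of that question: a single application only benefits from extra cores if its work is actually split into threads. A hedged stdlib sketch (nothing jME-specific; the class name is made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    // Split a CPU-bound sum of 1..n across however many cores the machine reports.
    public static long parallelSum(long n) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        try {
            long chunk = n / cores;
            List<Future<Long>> parts = new ArrayList<>();
            for (int i = 0; i < cores; i++) {
                final long from = i * chunk + 1;
                final long to = (i == cores - 1) ? n : (i + 1) * chunk;
                parts.add(pool.submit(() -> {
                    long s = 0;
                    for (long v = from; v <= to; v++) s += v;
                    return s;
                }));
            }
            long total = 0;
            for (Future<Long> part : parts) total += part.get();
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSum(1_000_000)); // 500000500000
    }
}
```

A single-threaded version of the same loop would peg one core and leave the others idle, which is the “more cores vs. faster cores” trade-off in a nutshell.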

Thanks.

Thanks for the information. This info was what brought me here, since I was curious about overclocked gear and its effects on compilation. I also hear that RAM is an issue and people prefer ECC RAM; your thoughts on this?

This will be a work machine, as well as other uses, basically an everything machine for myself.

Good info on the Xeons, and yeah I’m going to move over to SSDs asap.

JME dev, and others, but not too sure if I need all that for what I’m doing, or how deep I’ll be diving.

The list that a motherboard vendor posts isn’t always everything it supports, just the RAM they have personally tested.

PSU I’m going for a high end EVGA SuperNova G2, but open to other options.

I’m using an old NZXT case for now, and after I’m done moving I’ll probably upgrade.

Yeah, I need a new keyboard for sure, and will probably spend a bit on that.

If I go for a regular processor, I will try to get integrated graphics for now, but will upgrade in the future most likely.

Thanks!

I’ve never actually used Linux before, but a lot of people say I should check it out. I really don’t know where to begin with it, and all that jazz, but is Linux something most people here use?

Also, you mention that Windows development tools are “abysmal”. I would assume that Windows would have a lot of good options, but I guess not? Linux does?

I have a Mac in case I need to test that stuff on it; the question is just what I’m going to use for my main OS… Thanks for the tips, not really sure what to do with this.

Interesting comments on dual GPUs.

Also, FWIW, I read/saw a video saying that spreading it is better and will “eliminate air bubbles”, but I saw another video saying that this isn’t true, showing a piece of acrylic (or whatever) pushing down on the paste, and showing that spreading left more air bubbles than the “pea” method, since the pressure spreads it out anyway.

I saw another article saying that the CPU die is a thin part in the middle, and that the heat transfer happens mostly in the center, so you want to make sure that area is covered, while the rest matters less…

Arguments on both sides, but I’m curious about your thoughts? :smile:

Yes, if you shut down every other application before you run a new one then you are never doing multithreading. I guess you are also running DOS. :smile:

Having multiple cores is not about making a single application run better. (Though these days many applications do use threading.) It’s about making your day to day life nicer because you don’t have to wait as often for one app to stop tying up the limited CPU resource.

But now I’m repeating myself. I suspect everything that needs to be said has been said now.


I guess you are also running DOS.

Now we are running Linux :smile:

For sure, you mentioned it uses a few in the background, but I’m not sure how many threads AMD CPUs have. Intel seems to have 2 threads per core.

Hmmm… >( Black Friday Deals…!!!

But now I’m repeating myself. I suspect everything that needs to be said has been said now.

It’s never over >(