V 3.0 performance discussion



I’m thinking nobody has any idea.

eSim most certainly does not have banks of machines with every possible combination/permutation of graphics card and motherboard (let alone shadows settings, if aliasing is on or off, etc.).

Best thing I’d suggest is wait.

Then buy ver 3.0 (even if just a one month “rent”).

Then see how it goes with your particular setup.

Try shadows at 5 and if the machine chokes, reduce it to 4, 3, 2, 1 ... and see how you go.

Then join MP at whatever setting you chose and see how it goes in MP.

If you must have shadows at a certain level (subjective choice) then you’ll know if you need to buy a new graphics card or whatever to satisfy that requirement.


Choosing graphics cards can be tricky: their names are rocket science at times, and some cards get a boost from doubling the memory while others get almost none. In the example above, the GTX 650's 600-series name seems to imply that it is faster than an older card from the 500 range. But compare a GTX 650 to a GTX 560: both sit in the same price range nowadays, and the 560 is probably the faster card, probably by a significant margin.

Wait until SBP 3.0 is here, and see what you get with the hardware you have. If needed, then try to find a good deal on the previous generation of cards rather than the current one. Much better bang for your buck. I have done so for around ten years now, and it has saved me many hundreds of "mice". :cul: New stuff, especially gfx cards, is hopelessly overpriced, and its time at the cutting edge is extremely short. The worst bang for your buck one can imagine.

But hey, some people are so stinking rich that when they get bored, they have the engine block of their rusty, rotten old car gilded with gold leaf just to kill time, so...


I didn't mean to imply that eSim has a computer for each possible hardware combination; that would be absurd. ;) Nor do I expect every combination to be included in the sticky. I'm also refraining from buying anything until I have tested 3.0 on my rig, but in the meantime I have been doing a lot of research.

Going by what Ssnake told me in another thread and by my own research (especially on CPUBoss and GPUBoss), my CPU (Athlon X2 250, 3GHz) is most comparable to the E8600. The GTX670 is significantly faster than the 650, but both are much better than what I have now (GeForce 9600GT 1GB). I was thrilled to find out that my motherboard can accept up to Phenom X6 CPUs, but their single-core performance is not much better than what my CPU already does. What intrigued me is the E8600 and GTX650 combination being listed as "not recommended". The issue of a bottleneck comes to mind, and I wonder whether even a faster GTX670 could yield better performance. This is where I am seeking enlightenment.

The easiest solution is in fact to invest in the hardware with "great/good" performance ratings. Unfortunately the easiest solution is not a practical or possible solution for me. Instead I must eke out as much as I can from my motherboard. I traditionally buy previous-generation stuff because that's what I can afford, and that's why I'm asking about the particular combination I mentioned originally. And I definitely would like to have shadows under the vehicles!

As usual thank you for all the feedback.:thumbup:



I must confess, I'm not sure if a GTX670 would fare a lot better than a GTX650 if paired with an E8600. On paper the E8600 looks like a pretty good deal - above 3 GHz dual core - should be sufficient for a single threaded application like SB Pro. Unfortunately I somehow suspect - without being able to point my finger at specific evidence - that dual core CPUs are more of a bottleneck than what one might expect. I used to recommend dual core CPUs over quad core processors of the same price because they would usually have a higher clock speed. But there has been more going on than just an inflation of core numbers. Processors have also become more efficient, they do more work per cycle than older CPU generations do even though the nominal clock speed has stagnated over the past five or more years.


The more I look into it, the more I realize how foolish it would be to invest in any CPU by AMD. According to CPUBoss, i5 CPUs at the same clock speed remain superior to even AMD's latest 8-core FX CPUs! I should've built my current rig around an LGA-1165 socket for much better longevity! I see AMD joining Cyrix in a few years unless they become competitive again.

Thanks again everyone for all the feedback, and thank you OP for "letting me borrow" this thread.


As I compare video cards at GPUBoss I'm learning that many models come in 1GB or 2GB (sometimes 3GB!) versions. Shall I assume that the cards in the performance sticky are 2GB versions?

EDIT: Never mind. The answer I needed was already discussed in the comments of the sticky.

Edited by Scrapper_511

The more I look into it the more I'm realizing how foolish it would be to invest in any CPU by AMD.

I see AMD joining Cyrix in a few years unless they start becoming competitive again.

I have to retract a little bit now that my frustration has subsided, and give AMD credit where it's due. While AMD's single-core performance keeps getting trounced by Intel, the lower prices of AMD's chips still make them an attractive option. In my case it was just too irresistible to upgrade my Athlon X2 250 (3GHz, dual-core) to a Phenom II X4 970 BE (3.5GHz, quad-core) for $125 (boxed). Nothing by Intel comes close to that price for the performance. The most attractive thing about it is that I get to keep my existing motherboard, which saves me from buying a new one and consequently a new hard drive, etc. (totally beyond my budget and a headache I was really trying to avoid). AMD's marketing strategy does work, and it served me well.

As I look to upgrade my Geforce 9600GT (1GB) card with something with 2GB (256-bit), the Radeon 9850 has also become a very attractive alternative to the better performing and more expensive GeForce GTX series. The sub $180 price of Radeon 9850's has all but sealed the deal for me.

I also said I was going to refrain from any upgrades until I had 3.0 installed, but I've been doing some fps benchmarking with some very busy scenarios (such as "Red Horde", observer view) and have noticed some very low frame rates during arty attacks around cities with many units involved. So, my rig could use the upgrade even with 2.654. I don't expect substantial improvements but I think there will be just enough to justify the ~$310 upgrade cost, and maybe, just maybe I can get "HD low-medium performance" at Level 3-4 graphics in 3.0. This pretty much puts my rig at the end of its upgrade path and I won't be upgrading for many years unless I win the lottery.

Post-upgrade observations to follow.

P.S. Thanks for the 3.0 Performance sticky. It helped a lot during my research.

Edited by Scrapper_511

  • 3 weeks later...

So I just upgraded my Athlon X2 250 (3GHz, dual core) CPU and GeForce 9600GT (1GB) video card to a Phenom X4 970 (3.5GHz, quad core) and Radeon HD 7850 (mis-typed in previous post as 9850).

IL2 1946 and Lock-On Flaming Cliffs 2 used to get choppy on my rig, but both are now running extremely smoothly, even at higher resolutions and detail settings. I have yet to see any sign of the fps choking in even the busiest scenario. I have 4X AA and other graphics settings enabled at medium in the Catalyst application.

As I stated elsewhere, my observations for SBProPE with this upgrade have largely been limited to using "The Battle for Fulda South" (by Cobrabase) as a benchmark. After upgrading I did notice an increase in framerate overall, but in this particular scenario the framerate will drop to as low as ~7fps when the shmack is hitting the fan and there are lots of trees, buildings, smoke, and of course units on screen. Obviously this scenario is on the extreme side, otherwise the framerate is so smooth I can get dizzy panning the view around. I should add that all stock scenarios that I've tried have always run very well on my rig even before the upgrade.

Given that my benchmark scenario still manages to choke my new hardware, this doesn't bode well for my rig running SB 3.0 at shadow level 3 or up. According to the GPUs included in the Performance sticky and GPUBoss, the Radeon HD 7850 compares well with the nVidia cards that get "Good Performance". Thus, I can recommend this video card especially at under $160US.

However, when I compare my Phenom X4 970 to the setups with i5 and i7 CPUs, there is a huge disparity in single-core performance according to CPUBoss. In fact, the Phenom X4 970 is only on par with the much older E8600 as far as single-core performance is concerned (the AMD otherwise whups it). For this reason I cannot recommend this or even the latest AMD offerings. I also have to retract my previous statement that Intel doesn't offer performance in this price range: I discovered after the fact that even an older Intel dual-core with a slower clock speed (I forget which) would still run SB better than my new Phenom.

The saving grace, for me, is I spent just ~$310US for both CPU and GPU (Newegg) and I didn't have to swap out my motherboard. I would also have avoided reformatting the hard drive as a consequence, but I ended up doing exactly that after Win7 started acting weird after the upgrade (I've since rolled back to WinXP). Photoshop and Paint Shop run faster too, actually.

If you are looking to upgrade for 3.0 and run it at the higher shadow levels, I suggest you research CPUs with superior single-core performance. It won't take long to realize that even the latest and greatest AMD CPUs can be outperformed by much older, equally inexpensive previous-generation Intel CPUs.


Now that I've learned that Alt-D lets me adjust details on the fly, I've discovered that lowering the details (from default settings) improves the framerate quite a bit. That would be an obvious result, but I was wondering if it actually reveals anything about which piece of hardware is the bottleneck. Is it my Phenom or my Radeon? I have been doubting my new CPU's single-core performance, but now I'm wondering if it could be the Radeon. What do you folks think?

Edit: The improvement I observed occurred during my benchmarking when the fps dropped to about 7fps. When I dialed the details down, it improved to around 14fps. Otherwise, when things are already running smoothly, decreasing or maxing out the details doesn't make much of a difference (maybe a loss/gain of 2fps).
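One way to reason about numbers like these is to look at frame time instead of fps, and at how much the fps moves when only the graphics detail changes. Here is a minimal Python sketch of that heuristic; the `likely_bottleneck` helper and its 1.3× threshold are my own illustrative assumptions, not an eSim or GPUBoss tool.

```python
# Rough heuristic: if lowering graphics detail speeds things up a lot,
# the GPU was probably the limit; if fps barely moves, suspect the CPU.
# The threshold value is an arbitrary assumption for illustration.

def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

def likely_bottleneck(fps_high_detail: float, fps_low_detail: float,
                      threshold: float = 1.3) -> str:
    """Compare fps in the same scene before/after lowering detail."""
    speedup = fps_low_detail / fps_high_detail
    return "GPU-bound" if speedup >= threshold else "CPU-bound (or other)"

# The observation from the post: ~7 fps, rising to ~14 fps at lower detail.
print(round(frame_time_ms(7), 1))    # 142.9 ms per frame
print(round(frame_time_ms(14), 1))   # 71.4 ms per frame
print(likely_bottleneck(7, 14))      # GPU-bound
```

By this rough rule, a 7 → 14 fps jump from lowering detail suggests the graphics settings were the limit at that moment, though only systematic comparative runs can really confirm it.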


One of the things graphics card makers have been doing for some time now is selling cards with slower memory for cheap.

So, let's say a card costs $50 and advertises 2GB of RAM. Often that's 2GB of older DDR3. If you buy any graphics card, make sure it has GDDR5.

Not saying that's your issue, Oceola, but it's worth mentioning.


So how big should it be? Mine is 2GB and I do experience some stuttering.

Well, that depends in part on the level of detail that you want displayed, how much time you spend in the 3D environment, how "busy" that environment is (hundreds of tanks driving around while masses of artillery are falling place a bigger demand than 2 or 3 vehicles moving cross-country), etc.

So in part, how you play the game and what size scenarios you run influence the performance.


The saving grace, for me, is I spent just ~$310US for both CPU and GPU (Newegg) and I didn't have to swap out my motherboard. I would also have avoided reformatting the hard drive as a consequence...

Why would you have to reformat the hard drive after a hardware upgrade?

I've never found this necessary in the past.


If you switch out your motherboard, you'd better also reformat the hard drive. You need to remove all the drivers for the old MB, and install the new ones. Otherwise, you can run into all kinds of problems. Just changing your graphics card, upgrading the CPU or similar minor things is different, of course.


Why would you have to reformat the hard drive after a hardware upgrade?

I've never found this necessary in the past.

I've never had to either (unless I was also upgrading the motherboard), but right after the upgrade Win7 stopped working properly. Windows Update wouldn't work and the Windows Experience Index refused to refresh. I doubt it was the hardware upgrade itself that caused the problems, but rather something in the software. System Restore didn't do the trick either, and after several wasted days of troubleshooting I resorted to tinkering with the registry. When that didn't work, I just decided to wipe the hard drive clean and go back to WinXP Pro.

As far as video memory goes, my Radeon HD 7850 is 256-bit and GDDR5.



Depending on the scenario, either the CPU or the graphics card can become the bottleneck. It is impossible to say exactly when which one will be the limiting factor without comparative runs: either iterations of the same scenario with different graphics settings, or variations of a scenario that differ in complexity/number of combatants (and non-combatants) while keeping the same graphics settings.
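The comparative-run idea above can be sketched as a simple run log in which only one variable changes per comparison. Everything here (scenario names, shadow levels, fps numbers) is made up purely for illustration:

```python
# A made-up log of benchmark runs; vary ONE thing per comparison.
runs = [
    {"scenario": "small", "shadows": 5, "avg_fps": 58},
    {"scenario": "small", "shadows": 1, "avg_fps": 60},
    {"scenario": "huge",  "shadows": 5, "avg_fps": 9},
    {"scenario": "huge",  "shadows": 1, "avg_fps": 11},
]

def compare(runs, fixed_key, fixed_val, varied_key):
    """Return sorted (varied value, avg fps) pairs for runs where
    fixed_key equals fixed_val, so only varied_key differs."""
    return sorted((r[varied_key], r["avg_fps"])
                  for r in runs if r[fixed_key] == fixed_val)

# Same scenario, different shadow levels: isolates the graphics settings.
print(compare(runs, "scenario", "huge", "shadows"))   # [(1, 11), (5, 9)]
# Same shadow level, different scenario sizes: stresses the CPU instead.
print(compare(runs, "shadows", 5, "scenario"))        # [('huge', 9), ('small', 58)]
```

In this invented data, changing shadows barely moves the fps while changing scenario size moves it drastically, which would point at the CPU rather than the graphics card as the limit.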


Wow, I'd be really surprised if my Radeon is the bottleneck! In the sticky, the GTX670M/i7-3720 combo is rated for Medium performance at levels 4 & 5, which means no framerate below 30fps. The Radeon HD 7850 is superior to this GTX according to GPUBoss. If the particular benchmark I'm using can push framerates way below 30fps, and I compare my Radeon HD 7850/Phenom X4 970 combo to the sticky's GTX670M/i7-3720, then I'm inclined to say it's not the Radeon causing the bottleneck but my Phenom CPU. As a matter of fact, from what CPUBoss tells me, the i7-3720, even with its much lower clock speed of 2.6GHz, significantly outperforms my 3.5GHz Phenom in single-core benchmarks.

Maybe the benchmark I'm using and the one eSim used for the sticky are too apples-and-oranges to make a good basis for comparison... I don't know. The only way to know for sure is to actually run the eSim benchmark on my rig.

That raises the question: will the benchmark eSim is using actually be a scenario included in 3.0? I know it's kind of moot by the time 3.0 is actually installed on my rig, but my inquiring mind wants to know: instead of spending $125 on replacing my Athlon X2 250 3.0GHz CPU, should I have put that money into the video card budget ($310 or so would have gotten me a GTX670)?

I really should just get over the whole thing. I got my upgrade and my budget is spent and it's just a matter of 3.0 arriving so I can test performance myself. With the exception of the benchmark I've been using, SBProPE has always run very well on my rig, and even more so with my latest upgrade. I don't really play the large, super-busy scenarios much and the scenarios I hope to finish making one day will involve a much smaller scope, not a campaign-sized mission.

