Going from 100 fps to 170-180 in certain games, where the upgrade should only have been worth maybe 40-50 fps. Luckily I played at 4K and it ran fine there, so I never RMA'ed the chip or figured out what was wrong, but when I did this last CPU upgrade the difference in fps at lower resolutions was crazy.

You're welcome. What is that benchmark from? It could be motherboard related or chip related. I had issues with my 4790K at 1080p and was always way off from other 4790K results, sometimes by 30 fps or more. I would look at upgrading the CPU and RAM too if you can afford it, but I would do the CPU first.

Even the 5600X paired with a 6600 XT is pushing out 300-340 fps, and CPU and GPU usage on the other setups was low as well, around 30-40% max for both. Valorant is pretty CPU-bound, especially running at low resolution and low settings. With 3rd gen Ryzen AMD started to strike back harder, and now the 4th gen 5600 is a gaming beast; that's mainly down to the IPC increase generation over generation for AMD. These scenarios are more dependent on IPC, clock speed, and memory speed, or a combination of all three. A 3rd gen 3600, or even a 2600, versus a 1600 can be a pretty major upgrade in this type of scenario. There were some R5 3600XT benchmarks I saw at low settings, 1080p, averaging around 240 fps, and that is mainly coming from the CPU difference, with low CPU and GPU usage there as well.

The GPU, especially one this powerful, is not going to be the deciding factor for how much fps you get in this kind of scenario. Don't get me wrong, a better GPU will still give you an fps increase, but it's not as important here until you turn settings and resolution up or have a stronger CPU. If you're trying to run a game like Valorant at max fps with low settings and compare against others, you want your CPU overclocked if you can, plus fast RAM.
In the majority of games there's probably not a large performance difference between 2666 and 3000 MHz RAM, but in high-fps, low-resolution, low-settings scenarios, CPU clock speed as well as memory speed will have more of an effect. So can in-game settings like lighting, AI, grass detail levels, viewing distance, etc. So, looking at Valorant for example, at 180-200 fps, a few things: I notice your RAM is slower, and if I'm not mistaken, on 1st and even 2nd gen Ryzen it was really important to have higher-speed RAM. I think 3000-3600 MHz is the sweet spot, and 3000 could be a 10-20 fps difference versus 2666 MHz in a game like this. Slow RAM, RAM in the wrong slots, lack of boost, poor CPU cooling, single-channel RAM, slow storage, a gunked-up Windows install, badly optimized games, BIOS/AGESA incompatibilities, etc. can all cause low CPU fps output.

The issue is the amount of fps the CPU is capable of, not the GPU. The GPU already exceeds the CPU's fps at ultra, so the output doesn't change, just the detail in the picture. The 3070 is an extremely capable card, so changing detail levels did nothing; that's why lowering settings did nothing. The CPU is sending 100 fps, so at ultra the GPU can output 100 fps, but after changing to low it's still only going to output 100 fps, because that's all it gets, even if it's capable of 500 fps.

Lowering settings will only raise fps if the GPU is struggling to render everything it receives from the CPU. Say at ultra the CPU is sending 200 fps but the GPU is only outputting 100; lowering to medium gives the render process a break and the GPU can output 150 frames instead, and lowering all the way to low gets you closer to the full 200 fps. In simpler terms, even at maximum settings the GPU isn't having to use much of its VRAM, clock speed, cache, bandwidth, etc. to put every frame it gets on screen each second. The 30-40% usage is a representation of the resources needed for the GPU to render a frame, not how much of the GPU is being used.
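The bottleneck logic above boils down to a simple rule: the frame rate you actually see is capped by whichever stage is slower, the CPU preparing frames or the GPU rendering them. Here is a minimal sketch of that model in Python; all the numbers are hypothetical round figures for illustration, not measurements from any specific system.

```python
def displayed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate shown on screen is limited by the slower stage.

    cpu_fps: how many frames per second the CPU can prepare
    gpu_fps: how many frames per second the GPU can render
             at the current settings/resolution
    """
    return min(cpu_fps, gpu_fps)


# CPU-bound case (like the 3070 example above): the GPU could
# render 500 fps at low settings, but the CPU only sends 100.
# Lowering settings raises gpu_fps, yet the result stays 100.
print(displayed_fps(cpu_fps=100, gpu_fps=500))   # 100
print(displayed_fps(cpu_fps=100, gpu_fps=5000))  # still 100

# GPU-bound case: the CPU sends 200 fps, but at ultra the GPU
# only manages 100. Lowering settings raises the GPU's ceiling
# (100 -> 150 -> 300 here), so displayed fps climbs until it
# hits the CPU's 200 fps limit.
print(displayed_fps(cpu_fps=200, gpu_fps=100))   # 100
print(displayed_fps(cpu_fps=200, gpu_fps=150))   # 150
print(displayed_fps(cpu_fps=200, gpu_fps=300))   # 200
```

This is obviously a simplification (it ignores frame-time variance, driver overhead, and engine fps caps), but it captures why lowering settings does nothing once you're CPU-bound.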