Gaming preference (CPU vs GPU)

I have a PC with an NVIDIA GT 520 2 GB DDR2 graphics card and an Intel Core 2 Duo processor with 6 GB of RAM.
While playing Assassin's Creed Syndicate, the graphics settings report that it only needs 1850 MB of VRAM, yet the game lags badly. Is this due to the CPU?
I want to know how high-end games use the CPU vs the GPU.
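One way to tell whether the CPU or the GPU is the bottleneck is to log both utilizations while the game runs: a pegged CPU next to an idle GPU points to a CPU limit, and the reverse points to a GPU limit. Below is a minimal sketch, assuming the third-party psutil package is installed and nvidia-smi (shipped with the NVIDIA driver) is on the PATH; very old cards such as the GT 520 may not report utilization through nvidia-smi, in which case a tool like MSI Afterburner shows the same numbers.

    # Sample CPU and GPU utilization once per second while the game runs.
    # Assumes: pip install psutil; nvidia-smi available on the PATH.
    import subprocess

    import psutil

    def gpu_util_percent() -> str:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"])
        return out.decode().strip()

    while True:
        cpu = psutil.cpu_percent(interval=1)  # averaged over one second
        print(f"CPU {cpu:5.1f}% | GPU {gpu_util_percent()}%")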

Related

Why are the GPU utilization rates different for the same data?

On two PCs, the exact same data is exported from Cinema4D with the Redshift renderer.
Comparing the two, one uses the GPU at 100% while the other uses very little (both use about the same amount of GPU memory).
The Cinema4D, Redshift, and GPU driver versions are the same on both machines:
GPU: RTX 3060
RAM: 64 GB
OS: Windows 10
Storage: M.2 SSD
The only difference is the CPU: the machine using the GPU at 100% has a 12th Gen Intel Core i9-12900K, while the other has an AMD Ryzen 9 5950 16-core.
Why is the GPU utilization so different?
Also, is it possible to adjust the PC's settings so that it uses 100%?
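Since the only hardware difference is the CPU, one plausible explanation is that Redshift's scene-feeding thread runs slower on one machine and starves the GPU. A hedged way to check is to watch per-core CPU load while rendering: a single core pinned at 100% on the underutilized machine would support that theory. A minimal sketch using the third-party psutil package:

    # Log per-core CPU load during a render; if the GPU sits near idle
    # while one core is pegged, the renderer is CPU-bound on that box.
    # Requires: pip install psutil
    import psutil

    for _ in range(30):  # roughly 30 seconds of samples
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print(" ".join(f"{c:5.1f}" for c in per_core),
              f"| busiest core: {max(per_core):.1f}%")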

External GPU for Mac

I'd like to buy an eGPU for my MacBook Pro to use for simple deep learning tasks. My setup is:
MacBook Pro (15-inch, 2017)
Graphics: Radeon Pro 555 2 GB
Intel HD Graphics 630 1536 MB
Version: Mojave 10.14.5
I understand that deep learning (i.e. using tensorflow-gpu) is not currently supported on my Mac. Due to past disputes between Nvidia and Apple, I assume Nvidia is reluctant to offer any kind of hacky solution with their graphics cards. That said, I was recommended the NVIDIA TITAN RTX or NVIDIA Quadro® GV100, but they're quite pricey at thousands of euros/dollars apiece. At first, I just want something to play around with.
I watched this and this to see how to configure the Mac with an eGPU that is CUDA supported.
What Nvidia eGPU would you recommend for simple (i.e. not very large) data sets for DL processing? There seem to be so many models to choose from that it's not clear what would satisfy my needs. Would a GIGABYTE GeForce® GTX 1050 Ti OC 4GB suffice?
In the end, I decided to ditch the idea of using my Mac with an external Nvidia graphics card. There are apparently some hacky solutions, but after reading many forum posts and online articles I figured the best way to proceed is just to buy a new (gaming) desktop PC. One recurring theme was that an effective deep learning workstation should have at least 16 GB of RAM (32 GB or more is ideal), an Intel i7 or better, and 0.5 TB of SSD storage.
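Whichever machine you end up with, it's worth verifying that TensorFlow actually sees the GPU before training anything. A minimal sketch, assuming TensorFlow 2.x; on a setup without working GPU support the list simply comes back empty:

    # Check that TensorFlow can see a GPU at all; with no working GPU
    # support this prints an empty list and the matmul runs on the CPU.
    import tensorflow as tf

    gpus = tf.config.list_physical_devices("GPU")
    print("visible GPUs:", gpus)

    if gpus:
        with tf.device("/GPU:0"):
            x = tf.random.normal((1024, 1024))
            y = tf.matmul(x, x)
        print("matmul ran on:", y.device)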

How to program the Intel HD Graphics GPU clock rate?

I found a tool (Intel Extreme Tuning Utility, on Windows 7 Ultimate x64) that I can use to change the GPU clock on my laptop (CPU: Intel Core i5-4210U with built-in Intel HD Graphics 4400). I marked the slider for that function in red on this screenshot of Intel XTU to avoid any confusion.
I would be happy to build such functionality into my own program. It is enough if my program works at least on my own processor (the model mentioned above).
My problem is that I do not know how to access the GPU clock rate, or the absolute GPU clock, or whatever actually exists behind the scenes. Any documentation or advice would be greatly appreciated.
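I'm not aware of a public, documented Windows API for this (Intel XTU talks to its own driver), but as an illustration of the general idea: on Linux, the i915 driver exposes the integrated GPU's frequency through plain sysfs files, so a program can read the current clock and request new limits with ordinary file I/O. A minimal sketch, assuming a Linux machine where the integrated GPU is card0 (writing the limit requires root):

    # Read/set Intel iGPU clocks via the i915 sysfs interface on Linux.
    # Assumes the integrated GPU is card0; writing requires root.
    # Windows has no documented equivalent that I know of.
    from pathlib import Path

    GT = Path("/sys/class/drm/card0")

    def read_mhz(name: str) -> int:
        """Read one of the gt_*_freq_mhz files as an integer MHz value."""
        return int((GT / name).read_text())

    def set_max_mhz(mhz: int) -> None:
        """Request a new maximum GPU frequency (root required)."""
        (GT / "gt_max_freq_mhz").write_text(str(mhz))

    print("current:", read_mhz("gt_cur_freq_mhz"), "MHz")
    print("min:", read_mhz("gt_min_freq_mhz"), "MHz")
    print("max:", read_mhz("gt_max_freq_mhz"), "MHz")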

SteamVR performance test says my GeForce GTX 980 Ti is not ready for VR

I am thinking about getting a Vive and I wanted to check if my PC can handle it. My motherboard and processor are pretty old (Asus M4A79XTD EVO ATX AM3 and AMD Phenom II X4 965 3.4GHz respectively) but I recently upgraded to a GeForce GTX 980 Ti graphics card.
When I ran the SteamVR test program, I was expecting it to say that my graphics card was OK but that my CPU was a bit too slow. Actually, it's the other way round. Screenshot of SteamVR:
Your system isn't capable of rendering low quality VR and it appears to be mostly bound by its GPU.
We recommend upgrading your Graphics Card.
I've made sure I have updated my NVidia drivers.
When I look in GeForce Experience, I get the picture I was expecting to see:
GeForce Experience screenshot. It thinks my graphics card is OK but my processor doesn't meet the minimum spec.
But since the SteamVR test actually renders stuff, whereas GeForce Experience just goes by the hardware I've got, it makes me think that my GPU should be capable but something about my setup is throttling it.
I'd love to know what the problem might be. Perhaps because I'm using an NVidia card in an AMD chipset MB?
Well, I never found out exactly what the cause was but the problem is now resolved. I bought a new Motherboard, processor and RAM but kept the graphics card. After getting everything booted up, the system is reporting "high-quality VR" for both CPU and graphics card.
So, for whatever reason, it does seem like the MB/processor was throttling the graphics card in some way.
SteamVR only tests whether your rig can keep steady frames over 75 fps. I can run VR on my laptop, and it's only got a GTX 960M; my CPU is a little more up to date (i7 6700K, 16 GB of DDR4). I also have a buddy able to run VR on a 780 Ti.
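To put that 75 fps figure in perspective: at 75 fps each frame has a budget of roughly 1000 / 75 ≈ 13.3 ms, and either the CPU or the GPU blowing that budget is enough to drop frames. A small sketch of the arithmetic (the frame times below are made-up example numbers, not measurements):

    # Frame-budget check: at 75 Hz each frame gets 1000/75 ms; any
    # frame slower than that is a dropped frame.
    TARGET_FPS = 75
    budget_ms = 1000 / TARGET_FPS  # about 13.3 ms per frame

    # Hypothetical frame times in milliseconds.
    frame_times_ms = [11.8, 12.4, 13.1, 16.9, 12.0, 14.2]

    dropped = [t for t in frame_times_ms if t > budget_ms]
    print(f"budget {budget_ms:.1f} ms/frame; "
          f"{len(dropped)}/{len(frame_times_ms)} frames over budget")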

Varying RAM speeds / brands / frequencies

I have just built a new PC and am getting intermittent BSODs reporting a memory management error.
I have the following at my disposal:
Slot #1 Module G.Skill 4096 MB (DDR4-2137) - XMP 2.0 - P/N: F4-3000C15-4GVR
Slot #2 Module Crucial Technology 4096 MB (DDR4-2400) - XMP 2.0 - P/N: BLS4G4D240FSC.8FBD
Slot #3 Module G.Skill 4096 MB (DDR4-2137) - XMP 2.0 - P/N: F4-3000C15-4GVR
Slot #4 Module Crucial Technology 4096 MB (DDR4-2400) - XMP 2.0 - P/N: BLS4G4D240FSC.8FBD
*Taken from CPU-Z
The real question is this: the modules installed in slots 1 & 3 are advertised as DDR4-2400 MHz, but from testing I suspect they are running slightly slower. With the full 16 GB of RAM installed, it runs at 1198.3 MHz (ratio 1:18); with just the 8 GB of Ballistix RAM, it runs at 1197.3 MHz (ratio 1:16).
I use my PC for live streaming and gaming. Can somebody please explain what to do in this situation and why?
Other specs
M/B: MSI H270M BAZOOKA (MS-7A70)
CPU: i5 7600K
GPU: GTX 1050ti SC
PSU: 750W
Run one set or the other. I have the same G.Skill modules and run the XMP 2.0 profile at DDR4-3733 speed. Timings are 17-19-19-39, but if you enable the XMP profile in your BIOS it should let you load the profile. On my board I have to set a higher boot voltage on the RAM to get it to train when I push it up to 3900.
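One note on the numbers in the question (my reading, not from the answer above): CPU-Z reports the real memory clock, while DDR4 is marketed by its effective transfer rate, which is double the clock because data moves on both edges of each cycle. A quick sketch of the conversion:

    # DDR = double data rate: the advertised speed (MT/s) is twice
    # the real clock that CPU-Z reports.
    def effective_rate(real_clock_mhz: float) -> float:
        return real_clock_mhz * 2

    for clock in (1198.3, 1197.3):  # values from the question
        print(f"{clock} MHz real clock ~ DDR4-{effective_rate(clock):.0f}")
    # 1198.3 MHz * 2 = 2396.6, i.e. the modules are effectively at 2400.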