PhantomJS low CPU utilization

I'm spawning multiple phantomjs.exe processes in parallel from C# to generate PDFs from pages served by my local IIS. With wkhtmltopdf.exe I can see that my CPU (Core i7-4790K) utilization is consistently high (>90%), but with PhantomJS it stays at only 30-70%.
As a result, PhantomJS takes longer to generate all my PDFs than wkhtmltopdf does. I would like to use PhantomJS because it renders more nicely. How can I fix this?
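For reference, a minimal sketch of the kind of fan-out described above, assuming the rasterize.js example script that ships with PhantomJS; the executable path and URLs are placeholders. Allowing more concurrent processes than cores can raise utilization, since each phantomjs.exe renders largely single-threaded and spends much of its time waiting on IIS:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class PdfBatch
{
    static void Main()
    {
        // Placeholder URLs served by the local IIS site.
        string[] urls = { "http://localhost/report/1", "http://localhost/report/2" };

        // Oversubscribe the cores: each render is mostly single-threaded
        // and blocks on network I/O while IIS builds the page.
        var options = new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount * 2 };

        Parallel.For(0, urls.Length, options, i =>
        {
            var psi = new ProcessStartInfo
            {
                FileName = @"C:\tools\phantomjs.exe",                  // placeholder path
                Arguments = $"rasterize.js \"{urls[i]}\" out{i}.pdf",  // rasterize.js is from the PhantomJS examples
                UseShellExecute = false,
                CreateNoWindow = true
            };
            using (var p = Process.Start(psi))
            {
                p.WaitForExit();
            }
        });
    }
}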

Related

Docker Desktop Windows - Abysmal performance on an AMD system?

I've recently assembled a new AMD desktop to replace an older Dell Latitude E7540 laptop.
The AMD desktop:
Ryzen 3 3100 @ 3.8 GHz (4C/8T), 32 GB DDR4 3600 CL17 RAM, Corsair P600 Gen4 SSD
The Dell laptop:
Dell Latitude E7540: Intel i7-5600U @ 2.6 GHz (2C/4T), 16 GB DDR3 1600 RAM, Samsung mSATA PM851
On the new AMD Desktop, when executing a docker build command, two situations occur:
The performance is dreadful: even building a simple image, it takes a long time for the command to start, and after starting, it takes a very long time to complete (when it completes at all).
The build window crashes almost 50% of the time.
The benchmarks indicate that the new AMD Desktop is 3.5x faster at single core, and 6x faster at multicore.
As such, I was expecting a much better performance with the new AMD Desktop.
Unfortunately, that's not the case, and for the same Dockerfile (which generates a very big image):
The Dell starts faster
The Dell completes faster (10m vs 30m)
On the Dell, the build window never crashes.
The only difference between the two systems is that the old one is an Intel platform and the new one is an AMD Ryzen 3.
Environment Details are the same on both machines:
Windows Version: Windows 10 Ent. 19049
Docker Desktop Version: Docker 3.0.0
What can explain this abysmal performance on Docker-Desktop on the new AMD system?
After a few troubling days, I can confirm that the problem is not AMD-related.
The culprit is the antivirus: when it is on, it scans the files used by Docker, which causes all the problems I've described.
The Docker documentation explains how to exclude Docker-related files from antivirus scanning:
https://docs.docker.com/engine/security/antivirus/
When antivirus software scans files used by Docker, these files may be locked in a way that causes Docker commands to hang.
One way to reduce these problems is to add the Docker data directory (/var/lib/docker on Linux, %ProgramData%\docker on Windows Server, or $HOME/Library/Containers/com.docker.docker/ on Mac) to the antivirus’s exclusion list. However, this comes with the trade-off that viruses or malware in Docker images, writable layers of containers, or volumes are not detected. If you do choose to exclude Docker’s data directory from background virus scanning, you may want to schedule a recurring task that stops Docker, scans the data directory, and restarts Docker.
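For example, assuming the antivirus is Windows Defender, the Windows data directory quoted above can be excluded with a single elevated PowerShell command (adjust the path if your Docker data root differs):

Add-MpPreference -ExclusionPath "$env:ProgramData\docker"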

Why are Ninja and MSBuild unable to utilize more than roughly 50% of the CPU?

I have a Lenovo Z51-70 laptop (Windows 10). It came with 8 GB of RAM and an SSHD. When I compiled large projects (around 20K C++ files), Task Manager always showed 90-100% CPU utilization. A week ago I upgraded the SSHD to an SSD and the 8 GB of RAM to 16 GB to speed up compilation. But build time hasn't improved (it is almost the same), and Task Manager now always shows roughly 50% CPU utilization. Why can't it get anywhere near 90-100%? And why did the same build on the SSHD and 8 GB of RAM always consume roughly 90-100% CPU? It is not specific to a particular build system; I have tried MSBuild and Ninja, and all build systems show the same CPU utilization. I have also compiled different projects to rule out anything project-specific.
Any thoughts?

wkhtmltopdf is extremely slow in a different environment

I am developing a PHP web application with the CakePHP 3.4 framework, and I am using wkhtmltopdf 0.12.4 to output dynamic content to a .pdf file. I currently use three different environments to develop and test my application:
In my local environment (XAMPP 32-bit for Windows), wkhtmltopdf works great: it takes ~1 second to render a .pdf file.
In a remote testing environment (CentOS 7 64-bit with Apache 2 and 4 GB of memory), it works great too.
In my third remote testing environment (another CentOS 7 64-bit machine with similar CPU specs to the second and 4 GB of memory), wkhtmltopdf takes up to 20 seconds to render the same .pdf file.
What could be causing this behavior in the third environment? How can I monitor or debug the wkhtmltopdf process to identify why .pdf rendering is so slow?
Remove rgba() and set border-radius to 1px in your CSS files (or remove these properties entirely if they are not needed). That should speed up the PDF generation process.
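As a sketch of that advice (the .card selector is made up for illustration):

/* reported to be slow in wkhtmltopdf's Qt WebKit renderer */
.card { background: rgba(0, 0, 0, 0.5); border-radius: 8px; }
/* faster: opaque color, minimal radius */
.card { background: #808080; border-radius: 1px; }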
Another reason can be your current default printer on Windows; see this similar issue:
https://github.com/wkhtmltopdf/wkhtmltopdf/issues/4891
If your default printer is a network printer and you are using the generic Windows driver for it instead of the manufacturer's driver, wkhtmltopdf is really slow.

Virtualize a CPU without AES-NI

I have an application compiled with AES-NI support, but it supposedly selects the implementation at runtime based on CPUID. I want to test whether it really functions correctly on an old CPU without these dedicated instructions. VirtualBox cannot help because the guest sees the same CPU as the host. How can I run such a test without access to an old CPU?

VMware Player VM - 1 core CPU limitation

I'm using a VM with VMware Player to write and compile code.
As my current program is huge, compilation takes a while (up to 5 minutes),
using 25% of my host's 4-core CPU, i.e., 100% of one core.
It seems that the VM is limited to a single core.
Is there a way to increase the number of cores the VM can use?
I'd like to use 50% or 75% of my 4-core CPU.
Thanks
It sounds like you're limited by the number of parallel build tasks you run, not by the VM's CPU configuration: by default, make runs one step at a time. Try running several steps in parallel, e.g., make -j4, or the equivalent for your build system.
On a separate note, a VM may carry more overhead than you'd like; consider using Docker to host your development environment.