How can I test webpage/app rendering on a slow-speed connection?
Use Fiddler web debugger - it has a feature to simulate slow modem speeds.
What you're concerned with is what is actually going over the wire. A tool like Fiddler will help you understand how much data is being transmitted. You can then work to whittle it down to as little as possible.
Firebug's net console will tell you what each page is downloading, how much, and how long it's taking. Tools like YSlow and Google Page Speed will give you suggestions on how to speed up both page loading and rendering.
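If you want to script the same idea these tools use, the core of a bandwidth throttle is just metering bytes onto the wire: send the response in small chunks and sleep between them. A minimal Python sketch (the rate and chunk size are illustrative, roughly approximating a 56k modem, not taken from any particular tool):

```python
import time

def throttled_chunks(data: bytes, bytes_per_sec: int = 7000, chunk_size: int = 1400):
    """Yield `data` in chunks, sleeping so throughput stays near bytes_per_sec.

    7000 B/s roughly approximates a 56k modem; 1400 bytes is close to a
    typical TCP segment payload. Both numbers are illustrative.
    """
    delay = chunk_size / bytes_per_sec  # seconds per chunk at the target rate
    for start in range(0, len(data), chunk_size):
        yield data[start:start + chunk_size]
        time.sleep(delay)

# Example: "transfer" 14 kB at modem speed and time it.
payload = b"x" * 14000
t0 = time.time()
received = b"".join(throttled_chunks(payload))
print(f"transferred {len(received)} bytes in {time.time() - t0:.1f}s")
```

Feeding a real response body through a loop like this (e.g. in a local proxy) gives a rough feel for how the page behaves on a slow link, though it won't model latency or packet loss the way a dedicated tool does.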
Our development team is scratching our heads wondering why Google App Engine latency tends to go through the roof from time to time with very little predictability or warning. When latency jumps like this we start to see db timeouts between our app and the database. CPU utilization is pretty flat across the instances at these times, which also makes this hard to understand.
We are using the Flex environment to host a .NET Core API. We like AppEngine for its PaaS feel and its always on feature. We are thinking about looking at Cloud Run as an alternative to test with since we can't figure this out.
Any suggestions on where to look or how we could troubleshoot this?
Here's the latest spike in latency from last night. Plenty of Cloud SQL db timeouts and other exceptions happening here due to this spike as well.
There are a couple of possible reasons for this. This page has all the possible causes and the debugging steps. Mostly, check your monitoring for memory usage (there might be a memory leak), and also check the autoscaler configuration.
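For the Flex environment specifically, the autoscaler is configured in app.yaml. A sketch of the kind of settings worth reviewing, with illustrative placeholder values (not recommendations for any particular workload):

```yaml
# app.yaml (App Engine Flexible) -- illustrative values only
automatic_scaling:
  min_num_instances: 2        # keep warm capacity so spikes don't hit cold instances
  max_num_instances: 10
  cool_down_period_sec: 120   # how long to wait after instance start before scaling decisions
  cpu_utilization:
    target_utilization: 0.6   # scale out before CPU saturates
resources:
  memory_gb: 2                # raise if monitoring shows memory pressure or a leak
```

If latency spikes while CPU stays flat, a too-low `max_num_instances` or a long cool-down period can keep the scaler from reacting; memory limits are worth checking too, since instances under memory pressure can stall without showing CPU load.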
I am using face-mesh for one of my POCs, and the landmark detection is absolutely superb.
But what I found is that the tracking lags slightly while the face is moving. Is there any configuration available to sort this out? When I move quickly, it takes even longer for the landmarks to land in the right place. I tried both WASM and WebGL; performance-wise there is absolutely no problem, but the tracking delay is not giving the expected result.
Is there anything I'm missing to get this right? Any help on this is highly appreciated.
A small delay is to be expected, as inference will probably take tens of milliseconds at least, in my experience.
What FPS are you getting? It will very much depend on your hardware of course. As WASM improves hopefully things will get faster too for free on CPU.
On my machine (desktop with a GTX 1070) I get around 50 FPS in WebGL, and 25 FPS on the i7 CPU via WASM.
From what you have said, I do not think you have missed anything. If you could post a screen recording of the situation and a live demo, e.g. on CodePen or Glitch, that would help me compare with what I am seeing to confirm this.
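One way to tell whether the lag is inference cost or something else is to measure per-frame latency and FPS directly. The bookkeeping is language-agnostic; here it is sketched in Python with a sleep standing in for model inference (the 20 ms figure is made up for the example, not a face-mesh measurement):

```python
import time

def measure_fps(process_frame, n_frames: int = 30):
    """Run `process_frame` n_frames times; return (avg_latency_ms, fps)."""
    start = time.perf_counter()
    for _ in range(n_frames):
        process_frame()
    elapsed = time.perf_counter() - start
    return (elapsed / n_frames) * 1000, n_frames / elapsed

# Stand-in for real inference: pretend each frame takes ~20 ms.
avg_ms, fps = measure_fps(lambda: time.sleep(0.02))
print(f"avg latency: {avg_ms:.1f} ms, ~{fps:.0f} FPS")
```

If measured latency is in the tens of milliseconds, the "trailing" landmarks during fast motion are simply the model reporting where the face was one or two frames ago, rather than a configuration problem.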
Does anyone know a faster JPEG compression library for development in Objective-C for iOS/OSX? I'm trying to stream a series of images between devices and the built-in routine isn't fast enough. Basically it would look like a video on the receiving end, but the origin would be images, and I want as little latency as possible. Or is there something better I can use than compressing each frame as a JPEG?
Try iResize for Mac.
I've used it for a while now and am really impressed with it so far! Definitely a good program.
http://www.macupdate.com/app/mac/13039/iresize
I am using PhantomJS and CasperJS for screen scraping and related tasks. The issue I am facing is that it uses a lot of CPU, which makes me feel it might not be that scalable. Are there any ways to reduce CPU usage? Some that I can think of are:
1) Disable image loading
2) Disable js loading
Also, I want to know if Python is lighter (in terms of CPU usage) than PhantomJS for scraping purposes.
Why CasperJS / PhantomJS only? Are you scraping websites that load content with JavaScript? Any tool that doesn't run a full webkit browser will be more lightweight than one that does.
As mentioned in the comments, you can use wget or curl on linux systems to dump webpages to files / stdout. There are many libraries that can handle & parse raw HTML such as cheerio for NodeJS.
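To make the "dump, then parse" approach concrete, here is a minimal sketch using only Python's standard library; the inline HTML string stands in for a page fetched with wget/curl or urllib, and cheerio would play the same role in NodeJS:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href attributes from <a> tags in raw HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stands in for raw HTML dumped by wget/curl.
html = '<p>See <a href="/docs">docs</a> and <a href="https://example.com">this</a>.</p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # → ['/docs', 'https://example.com']
```

Parsing static HTML like this costs a tiny fraction of the CPU of a full webkit browser; the trade-off is that content rendered by JavaScript simply won't be in the dump.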
Still want some form of scripting? Because you mentioned python, there is a tool called Mechanize that does just that without running webkit. It's not as powerful as Casper / Phantom, but it allows you to do a lot of the same things (filling out forms, clicking links, etc) with a much smaller footprint.
After 5 and a half years I don't think you are having this issue anymore, but if anyone else stumbles across this problem, here's the solution.
After finishing scraping, quit the browser by calling browser.quit(), browser being the name of the variable you set.
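The important detail is making sure quit() runs even when the scraping code throws, otherwise orphaned browser processes accumulate and eat CPU and memory. A sketch of the pattern, using a hypothetical stand-in class (substitute your real WebDriver/headless-browser object):

```python
class Browser:
    """Hypothetical stand-in for a real WebDriver/headless browser object."""
    def __init__(self):
        self.running = True  # a real driver would spawn a browser process here

    def quit(self):
        self.running = False  # a real driver kills the browser process here

browser = Browser()
try:
    pass  # ... scraping work goes here; may raise ...
finally:
    browser.quit()  # always runs, even if the scraping raised

print("browser stopped:", not browser.running)  # → browser stopped: True
```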
I now have 8GB of RAM in my server, so bear that in mind when making recommendations on how much to raise settings.
Basically, Apache won't load more than one page at a time. What the hell could be causing this? It causes real problems: when I execute a page that takes a long time to load, no other pages will load.
Total idiot here, so advice desperately needed!
Thanks guys, must be something quite simple.
Problem solved: I changed the memory usage in my scripts.
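For anyone else hitting requests being served one at a time, if it turns out Apache itself is starved of workers because each process is heavy, the prefork MPM limits are the place to look. A hedged fragment using real Apache directive names, with values that are only illustrative for an 8GB box, not recommendations:

```apache
# mpm_prefork settings -- illustrative values only
<IfModule mpm_prefork_module>
    StartServers             5
    MinSpareServers          5
    MaxSpareServers         10
    MaxRequestWorkers      150   # max concurrent requests (named MaxClients before Apache 2.4)
    MaxConnectionsPerChild 1000  # recycle children so per-script memory growth stays bounded
</IfModule>
```

Size `MaxRequestWorkers` so that worker count times per-process memory stays comfortably under available RAM; if it's set to 1, or memory exhaustion keeps Apache from forking, requests will serialize exactly as described.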