I can find out the page load time with Scrapy + Splash but is it possible to get the Time To First Byte (like on Chrome's network tab)?
I want to measure how fast the server rendered the page.
I found this; the wait timing of the last response is the time spent waiting for the first byte (TTFB). As a complete Splash script:

function main(splash)
    assert(splash:go(splash.args.url))
    local entries = splash:history()
    local last_response = entries[#entries].response
    return last_response.timings.wait
end
My video src is an AWS presigned URL that expires after x amount of time. The video starts playing just fine in video.js. For large video files, once the brief URL expiration time has passed, moving the seek bar causes a network error because the original src link has expired. How do you refresh the src with another unexpired presigned URL without restarting the video from the beginning? I don't want the video to go back to the beginning.
So far I have found that you can capture changes of the seek bar by listening for the 'timeupdate' event and testing for e.manuallyTriggered on the passed event.
I had this same issue today. I'm using Plyr instead of video.js, so I'm not sure if you can do this exact thing, but my solution was to bind an error handler to the player for when the link has expired and someone tries to play/seek, and then in the handler:
1. store the current time of the video
2. send an AJAX request to my server to get an updated signed URL
3. update the source of the player
4. set the current time of the video to the previously stored time
It's kind of slow/clunky, but it's the best fix I could come up with at the moment, aside from loading the entire video before allowing playback (which didn't seem like great UX).
Update: this does work with video.js... but it doesn't work with either player in Safari, which apparently doesn't fire the error event at all.
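The steps above can be sketched roughly like this against the video.js API. The fetchSignedUrl callback (and any endpoint it would hit) is an assumption for illustration, not from the original post:

```javascript
// Rough sketch of the four steps above, written against the video.js API.
// fetchSignedUrl is a placeholder: it should resolve to a fresh presigned URL.
function refreshSource(player, fetchSignedUrl) {
  var resumeAt = player.currentTime();              // 1. store the current time
  return fetchSignedUrl().then(function (url) {     // 2. get a fresh signed URL
    player.src({ src: url, type: 'video/mp4' });    // 3. update the source
    player.one('loadedmetadata', function () {
      player.currentTime(resumeAt);                 // 4. jump back to the stored time
      player.play();
    });
  });
}

// Wiring it to the error handler might look like (endpoint name is made up):
// player.on('error', function () {
//   refreshSource(player, function () {
//     return fetch('/signed-url').then(function (r) { return r.text(); });
//   });
// });
```

As noted in the update, this relies on the error event firing, which Safari apparently does not do.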
I am new to Sencha and I am evaluating it.
Every time I try to open a browser and visit the API page at docs.sencha.com/whatever ... it takes forever to load up.
I mean, what the hell is it doing at the back end? Is it taking that long to load up all the necessary Ext JS app files? Or is it loading the whole API library while I am only trying to see one page?
So far, I have gone through a couple of examples, and I like Sencha a lot. However, I have a concern about the loading speed in production, because the speed at which the API docs load scares me.
If you are experienced with Sencha, could you tell me what is going on at the back? Please don't say "the API is 20MB, it takes time to load...", because I only want to see one page per visit; I believe it is wrong to load the whole API just to initialize one page.
UPDATE ------------------------------------
I face this loading screen for 20-30 seconds every time I open the browser. IE only. Chrome and FF are fast??
UPDATE 2 ----------------------------------
I did some profiling in IE. Between http://projects.sencha.com/auth/session?... and /architect/2guides/intro/README.js?... IE went to sleep for over 20 seconds doing nothing (as you can see from the highlighted blank gap between the two rows in the picture), then suddenly came back and finished loading the rest of the page!
I copied those links one by one and loaded them up in a new IE window. They all individually came up in milliseconds, so there is no speed issue with the requests themselves. It is just that IE went to sleep for 20+ seconds (the CPU monitor shows no activity). Strange!
Since you're just starting out, you should definitely build your application using Sencha Cmd. This lets you build a version of the Ext JS file that includes only the components you actually use.
But as a side note. I use the full sencha api and it takes me less than 2 sec to load the whole API. I use the production version ext-all.js and ext-all.css and gzip everything. After the zipping, I get a file size of less than 500KB, which is like nothing actually.
EDIT:
I checked the API docs page. The total download size is less than 1 MB, and even that is mostly because there are a lot of icons that aren't combined into sprites, so the browser spends a lot of time requesting the icons individually. That's why the page is slow.
As for IE, well, Sencha can't do much about it. The browser itself is slow, and any web page you load will suffer from the same problem, not just Sencha's. The page speed will improve if you do some optimizations. The size of the API isn't the problem.
While running the rasterize.js example provided with PhantomJS, I find I have to wait 20 seconds or more until a web page image is produced.
Is there any possible way to speed this up without consuming a lot of resources? I am basically looking to rapidly produce a series of sequential images captured from web pages loaded with PhantomJS. It would be really great if I could even pipe PhantomJS's output to a video stream somehow.
For now I would settle for something that just takes a screenshot of a web page within the 1-2 second range with PhantomJS. If there's already a project or library that accomplishes this, that would be great too.
If your image URLs are hardcoded into the HTML response, then you can do the following:
1. Get the HTML body
2. Parse it and extract your image URLs
3. Render them with something like PhantomJS or anything else WebKit-based
You can take a look at this sample: https://github.com/eugenehp/node-crawler/blob/master/test/simple.js
Like:
var Crawler = require("../lib/crawler").Crawler;

var c = new Crawler({
    "maxConnections": 10,
    // "timeout": 60,
    "debug": true,
    callback: function (error, result, $) {
        console.log("Got page");
        $("img").each(function (i, img) {
            console.log(img.src);
        });
    }
});

c.queue(["http://jamendo.com/", "http://tedxparis.com"]);
My requirement is to test a video played on a particular web page. I need to capture its screen at particular intervals, and get its start time, end time, duration, etc. How do I do that?
The application I test also has video requirements. Rather than trying to get Selenium to do something it really isn't made for, I created my own methods that parse the URL of the video and make sure it gets a 200 HTTP response, just to make sure there's a video there.
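A minimal sketch of that URL check, with httpHead standing in for whatever HTTP client you use (the helper names are made up for illustration):

```javascript
// Sketch of the "does the video src return 200?" check described above.
// httpHead is a placeholder for your HTTP client; it should resolve to the
// status code of a HEAD request against the given URL.
function videoLooksPresent(httpHead, videoUrl) {
  return httpHead(videoUrl).then(function (status) {
    return status === 200;   // anything else: missing, expired, or broken src
  });
}
```

This only proves a video resource exists at the URL pulled from the markup; as the answer says, duration and timing still have to come from the markup itself.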
Otherwise, I don't believe there are any ways to get the duration of the video without relying on what's in the markup. If you do use what's there, the Java API allows you to take a screenshot as follows:
File scrFile = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE);
I'm trying to get screenshots from a web page with multiple browsers. Just experimenting with Selenium RC, I wrote code like this:
var sel = new DefaultSelenium(server, 4444, target, url);
sel.Start();
sel.Open(url);
sel.WaitForPageToLoad("30000");
var imageString = sel.CaptureScreenshotToString();
This basically works, but in most cases the screenshot is of a blank browser window, because the page is not yet ready for display. It kind of works if I add a sleep just after the WaitForPageToLoad, but that slows down the fast browsers and/or may be too short for the slower browsers (or under load).
A typical solution for this seems to be to wait for the presence of a certain element. However, this is meant as a simple generic solution to get a screenshot of a local web page with as many browsers as possible (to test the layout) and I don't want to have to enter certain element names or whatever. It's a simple tool where you just enter the Selenium Server URL and the URL you want to test, and get the screenshots back.
Any advice?
I use Selenium-RC to capture screenshots of remote pages where the waiting time is variant. In such cases, checking the title of the page and using time.sleep(n seconds) usually does it for me.
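A generic alternative to a fixed sleep is to poll the page's document.readyState until it reports 'complete'. Sketched here with evalInBrowser as a stand-in for your driver's JavaScript-evaluation call (e.g. GetEval in Selenium RC); the function name is an assumption:

```javascript
// Generic "wait for page ready" poll: no page-specific element names needed.
// evalInBrowser stands in for the driver's JS-evaluation call and should
// return the result of evaluating the given expression in the page.
function waitForReadyState(evalInBrowser, timeoutMs, intervalMs, done) {
  var deadline = Date.now() + timeoutMs;
  (function poll() {
    if (evalInBrowser('document.readyState') === 'complete') return done(null);
    if (Date.now() > deadline) return done(new Error('page never became ready'));
    setTimeout(poll, intervalMs);
  })();
}

// e.g. against the C# RC client shown above, something along the lines of:
// waitForReadyState(js => sel.GetEval(js), 30000, 250, function (err) {
//   if (!err) { var imageString = sel.CaptureScreenshotToString(); }
// });
```

Fast browsers pass the check almost immediately, while slow ones get the full timeout, so no single sleep value has to fit every browser.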
Maybe you can make use of the browser's status bar to verify whether the page has loaded fully or not. I think this is the best solution.