Front-end test automation

I recently had to upgrade from jQuery 1.3 to 1.10. As a result, a lot of jQuery code was changed and plugins were upgraded, which in turn meant many hours of manual testing.
This got me thinking that there should be a better way of testing/validating pages after any JS/CSS change across my site.
I would like to do the following.
Crawl all pages on my site.
Check that all links on each page work correctly.
Check for basic HTML tag validation.
Check for any JS and CSS errors.
Check that the code is compatible with the new jQuery version.
Any recommendations for tools that will allow me to perform all of the above tests?
Thanks

Free tools exist for quite a few of those tasks. There are both web-based validators and browser plugins. I don't know of a single tool that would do all of the below in one or two steps, but these should at least get you started.
For crawling the site and checking all links you can use a tool from the World Wide Web Consortium (W3C):
http://validator.w3.org/checklink
The results page also provides links for validating the HTML and CSS of each page.
If you want to validate a file directly, you can use
http://validator.w3.org/
and
http://www.css-validator.org/
For validating JavaScript syntax, you can use JSLint.
Finally, if you want more confidence that your JavaScript is working correctly, write unit tests, for example with QUnit.
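If you would rather script the crawl-and-check part yourself than rely on the W3C service, here is a minimal sketch in Go; it uses only the standard library plus golang.org/x/net/html, and the start URL is a placeholder. It checks the links on a single page; a real crawler would also follow internal links recursively.

```go
// linkcheck.go - fetch one page, extract its links, and report any that
// return an error status. A starting point, not a full crawler.
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"os"

	"golang.org/x/net/html"
)

// extractLinks returns the href values of all <a> tags in an HTML document.
func extractLinks(resp *http.Response) []string {
	var links []string
	doc, err := html.Parse(resp.Body)
	if err != nil {
		return links
	}
	var walk func(*html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, a := range n.Attr {
				if a.Key == "href" {
					links = append(links, a.Val)
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(doc)
	return links
}

func main() {
	start := "https://example.com/" // placeholder: replace with your site
	resp, err := http.Get(start)
	if err != nil {
		fmt.Fprintln(os.Stderr, "fetch failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	base, _ := url.Parse(start)
	for _, href := range extractLinks(resp) {
		ref, err := url.Parse(href)
		if err != nil {
			continue
		}
		abs := base.ResolveReference(ref)
		if abs.Scheme != "http" && abs.Scheme != "https" {
			continue // skip mailto:, javascript:, etc.
		}
		r, err := http.Head(abs.String()) // HEAD is cheaper than GET for a status check
		if err != nil {
			fmt.Println("broken:", abs, err)
			continue
		}
		r.Body.Close()
		if r.StatusCode >= 400 {
			fmt.Println("broken:", abs, r.Status)
		}
	}
}
```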

Related

JMeter Screenshotting

I am able to get a screenshot in my JMeter test scripts using the Selenium Chrome WebDriver. This works, but now I want to get screenshots following an authentication request. Is there a way to capture the screen as it would be displayed for the HTTP request?
I don't think there is. Theoretically you could try libraries like this one or this one from JSR223 Test Elements using the Groovy language, but I don't think you will get what you want.
The main reason is given on the JMeter project's main page:
JMeter is not a browser, it works at protocol level. As far as web-services and remote services are concerned, JMeter looks like a browser (or rather, multiple browsers); however JMeter does not perform all the actions supported by browsers. In particular, JMeter does not execute the Javascript found in HTML pages. Nor does it render the HTML pages as a browser does (it's possible to view the response as HTML etc., but the timings are not included in any samples, and only one sample in one thread is ever displayed at a time).
The HTTP Request sampler downloads only the HTML; you won't get any images, scripts, styles, fonts, etc., so even if you try to use the aforementioned libraries you will end up with a "screenshot" that has nothing in common with how the page actually looks.
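To make that concrete, here is a tiny Go sketch (the URL is a placeholder): a plain HTTP GET, which is roughly what the sampler does at protocol level, hands you only the raw HTML bytes. Nothing is executed or rendered, so there is no "screen" to capture.

```go
// Roughly what a protocol-level tool sees: the raw HTML of the page,
// with no JavaScript executed and no subresources (images, CSS) fetched.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	resp, err := http.Get("https://example.com/") // placeholder URL
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	// All we have is markup; turning it into pixels is the browser's job.
	fmt.Printf("status %s, %d bytes of HTML\n", resp.Status, len(body))
}
```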

How to automatically test application web pages looking for errors?

Is there an easy way to test web pages for errors? I mean, usually while doing web testing we want to check the display, which can be very difficult to maintain.
So I wanted to know whether, as an alternative strategy, there is some practice for testing the URLs of a website in bulk and just looking for any kind of error, including JS errors in the console.
I think it is possible using a framework like Selenium, but wouldn't that be a bit overkill?
The idea would also be to do this on the production server, in addition to the test server.
The website has some authentication, so just hitting the URLs will not work.
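For reference, a bulk check like the one described can be scripted against a running Selenium server. The Go sketch below uses the tebeka/selenium bindings; the Selenium URL, page URLs, selectors and credentials are all placeholders, and reading console errors through the browser log is a Chrome-specific capability, so treat this as an assumption-laden starting point rather than a recipe.

```go
// Sketch: log in once, then visit a list of URLs and print any browser
// console messages at SEVERE level. The Selenium URL, page URLs,
// selectors and credentials are placeholders.
package main

import (
	"fmt"

	"github.com/tebeka/selenium"
	slog "github.com/tebeka/selenium/log"
)

// find panics if the CSS selector does not match; fine for a sketch.
func find(wd selenium.WebDriver, sel string) selenium.WebElement {
	el, err := wd.FindElement(selenium.ByCSSSelector, sel)
	if err != nil {
		panic(err)
	}
	return el
}

func main() {
	caps := selenium.Capabilities{"browserName": "chrome"}
	caps.AddLogging(slog.Capabilities{slog.Browser: slog.All}) // ask Chrome to expose console logs

	wd, err := selenium.NewRemote(caps, "http://localhost:4444/wd/hub") // assumes a running Selenium server
	if err != nil {
		panic(err)
	}
	defer wd.Quit()

	// Authenticate once; the session keeps the cookies for later navigation.
	wd.Get("https://example.com/login") // placeholder login page
	find(wd, "#username").SendKeys("tester")
	find(wd, "#password").SendKeys("secret")
	find(wd, "button[type=submit]").Click()

	// Visit each URL and dump console errors, if any.
	pages := []string{"https://example.com/a", "https://example.com/b"} // placeholder URL list
	for _, p := range pages {
		if err := wd.Get(p); err != nil {
			fmt.Println(p, "navigation error:", err)
			continue
		}
		msgs, err := wd.Log(slog.Browser)
		if err != nil {
			fmt.Println(p, "could not read browser log:", err)
			continue
		}
		for _, m := range msgs {
			if m.Level == slog.Severe {
				fmt.Println(p, "console error:", m.Message)
			}
		}
	}
}
```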

Automation in Golang - how to use browser automation like Selenium?

I am new to Golang, and I am looking to automate the signup and login processes in a web app. Please suggest a good tool like Selenium and how I can use it from Go.
I want to do the following automatically using Golang:
Start a browser. Currently, I'm using https://github.com/skratchdot/open-golang
Automatically fill in the signup page and submit the form.
Check login for the registered user. Everything needs to be done automatically for more users.
You can also use Playwright for Go, which is a wrapper around the Playwright project created by Microsoft. Playwright provides a single API to automate Chromium, Firefox, and WebKit. With it you can interact with sites, record videos, take screenshots, and emulate other browser-specific behaviour.
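A rough sketch of the signup/login flow with playwright-go might look like the following; the URLs, CSS selectors and credentials are invented for illustration and would need to match the real application.

```go
// Sketch of a signup/login flow with playwright-go. The URLs and the
// CSS selectors below are placeholders - adjust them to the real app.
package main

import (
	"log"

	"github.com/playwright-community/playwright-go"
)

func main() {
	pw, err := playwright.Run()
	if err != nil {
		log.Fatalf("could not start Playwright: %v", err)
	}
	defer pw.Stop()

	browser, err := pw.Chromium.Launch() // starts a headless Chromium
	if err != nil {
		log.Fatalf("could not launch browser: %v", err)
	}
	defer browser.Close()

	page, err := browser.NewPage()
	if err != nil {
		log.Fatalf("could not open page: %v", err)
	}

	// Sign up: fill the form and submit it.
	if _, err = page.Goto("https://example.com/signup"); err != nil { // placeholder URL
		log.Fatalf("could not navigate: %v", err)
	}
	page.Fill("#email", "user@example.com")   // placeholder selector/value
	page.Fill("#password", "s3cret-password") // placeholder selector/value
	page.Click("button[type=submit]")

	// Log in with the account just created and verify something on the page.
	page.Goto("https://example.com/login")
	page.Fill("#email", "user@example.com")
	page.Fill("#password", "s3cret-password")
	page.Click("button[type=submit]")

	title, _ := page.Title()
	log.Printf("logged in, page title: %q", title)
}
```

Depending on the playwright-go version, you may first need to install the browser binaries once (the project ships an install helper for this).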
If you are going to use Go for web automation testing, Selenium is a good option. Still, it is nothing more than a library that lets you interact with browsers, so you will need to develop your own framework or reuse one that someone has already implemented.
My advice is to consider Agouti, since it integrates with the Ginkgo BDD framework and the Gomega matcher library. Everything else is pretty much the same from an architectural perspective: you can design it like any other language binding. There are common patterns that appear over and over again in browser automation frameworks, such as:
PageObjects: A simple abstraction of the UI of your web app (see the sketch after this list).
LoadableComponent: Modeling PageObjects as components.
BotStyleTests: A command-based approach.
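As a rough illustration of the PageObject idea on top of Agouti (the URL and selectors are placeholders, and error handling is kept minimal):

```go
// Sketch of the PageObject pattern in Go with Agouti. The URL and
// selectors are placeholders; the point is that tests talk to LoginPage
// methods instead of raw selectors.
package main

import (
	"log"

	"github.com/sclevine/agouti"
)

// LoginPage hides the page's selectors behind intention-revealing methods.
type LoginPage struct {
	page *agouti.Page
}

func NewLoginPage(page *agouti.Page) *LoginPage {
	return &LoginPage{page: page}
}

// Open navigates to the login screen (placeholder URL).
func (p *LoginPage) Open() error {
	return p.page.Navigate("https://example.com/login")
}

// LoginAs fills in the credentials and submits the form (placeholder selectors).
func (p *LoginPage) LoginAs(user, password string) error {
	if err := p.page.Find("#username").Fill(user); err != nil {
		return err
	}
	if err := p.page.Find("#password").Fill(password); err != nil {
		return err
	}
	return p.page.Find("button[type=submit]").Click()
}

func main() {
	driver := agouti.ChromeDriver()
	if err := driver.Start(); err != nil {
		log.Fatalf("could not start driver: %v", err)
	}
	defer driver.Stop()

	page, err := driver.NewPage()
	if err != nil {
		log.Fatalf("could not open page: %v", err)
	}

	login := NewLoginPage(page)
	if err := login.Open(); err != nil {
		log.Fatal(err)
	}
	if err := login.LoginAs("tester", "secret"); err != nil {
		log.Fatal(err)
	}
}
```

The benefit is that a test only ever calls login.LoginAs(...) and never mentions selectors, so a UI change touches the page object in one place.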
Another good resource for building your test framework is the xUnit Patterns guide. It gives a great overview of the patterns, smells, and refactoring strategies you can use. Also look at this test frameworks tutorial; it will help you choose the most suitable solution for your case.
My guess is that you are going to need some CI server support for
"everything needs to be done automatically for more users".
Here is a good article on how to achieve this with Travis CI.
Update: you can use Selenium for Golang.

How to optimize Google Maps API integration

I evaluated my website with PageSpeed and saw that most of the warnings (leverage browser caching, combine external JavaScript, ...) come from the Google Maps API integration.
So my question is: is there a way to use it the way PageSpeed would like?
Thanks.
Probably not; the external JS files can change. If you combine them or host them locally with Expires headers, you may encounter bugs when they change.
I suggest reading this article on the Google Maps API blog: https://maps-apis.googleblog.com/2015/09/map-tips-speeding-up-page-load-times.html
Asynchronously loading JavaScript on your pages can give you huge performance wins. We have just updated all of our JavaScript Maps API samples and you can make these changes to your site, too. Read on to find out what we did and how it works, or skip straight to Unblocking scripts to see what you can start to update on your site today.

How to build an app for an Internet site without its API and schema

I was asked to build a control system for an eBay-like Finnish auction site, huuto.net.
The system would reopen closed auctions according to specific rules. It would be completely external to the main site, running on an external website.
The site is, however, unwilling to release its API and schema. I know of no way to build such a system without knowing its API.
How do you build an app for an Internet site without its API and schema?
You could try some form of automatic browsing: mechanize
Edit:
Examples here.
I think you're asking about building a site that interacts with another site without using a well-defined API. Is that right?
You can interact with an external site without using an official API - in order to do so, you need to imitate a normal site visitor and send your requests to the site frontend (in much the same way as a web crawler does). Tools like hpricot, mechanize and curl can help you parse the content of pages and send requests, but in doing so your system may be quite brittle. Any change to the target site might mean you have to rewrite portions of your system.
It might be possible to get the data you need by screen scraping the site. You could perform the operations you want by POSTing data into their forms or using a WebClient-type API to make your program act like a web browser, but that is likely to be an extremely brittle solution.
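To make the "POSTing data into their forms" idea concrete, here is a minimal Go sketch; the URLs, form field names and the "Log out" marker are invented, and there is no guarantee the target site's login works this way (it may use CSRF tokens, JavaScript, or other protections that break this approach).

```go
// Sketch of screen scraping without an API: log in by POSTing the same
// form a browser would submit, keep the session cookie, then fetch and
// inspect a page. URLs and form field names are placeholders.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/cookiejar"
	"net/url"
	"strings"
)

func main() {
	jar, _ := cookiejar.New(nil)
	client := &http.Client{Jar: jar} // the jar keeps the session cookie between requests

	// Imitate the login form submission (placeholder URL and field names).
	form := url.Values{}
	form.Set("username", "tester")
	form.Set("password", "secret")
	resp, err := client.PostForm("https://example.com/login", form)
	if err != nil {
		panic(err)
	}
	resp.Body.Close()

	// Fetch a page that is only visible when logged in and scrape it.
	resp, err = client.Get("https://example.com/my/auctions") // placeholder URL
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	// Real scraping would parse the HTML (e.g. with golang.org/x/net/html);
	// here we only check for a marker string as a stand-in.
	fmt.Println("logged in:", strings.Contains(string(body), "Log out"))
}
```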
Honestly though, without an API, there really is no good solution.
You either need access to the database or an API; otherwise there is no point in even trying.