I need to automate a scenario that verifies the connection to a URL is secure, without any SSL certificate issues.
I am using Selenium WebDriver with Java, and the browsers I need to cover are IE and Chrome. Due to security restrictions I cannot provide a URL that exhibits this issue.
Issue to be caught
I found an old question (Check if website has any ssl certificate warnings using Selenium webDriver), but it was answered only for Firefox, so it does not help in my case.
Rather than relying on browser (or, better said, browser-profile) settings, you can add an expectation for specific elements on the certificate-warning page; see the attached image for an example in IE.
In Chrome those links cannot be inspected directly, so try something like this:
driver.get(urlWithSslIssue);
WebDriverWait wait = new WebDriverWait(driver, 10); // explicit waits beat Thread.sleep (Selenium 3 style, seconds)
WebElement lnkAdvanced = wait.until(
        ExpectedConditions.elementToBeClickable(By.partialLinkText("ADVANCED")));
lnkAdvanced.click();
WebElement lnkContinue = wait.until(
        ExpectedConditions.elementToBeClickable(By.partialLinkText("Continue")));
// lnkContinue.click(); // uncomment only if you actually want to bypass the warning
I'm launching a website using Selenium with Python. When the Chrome browser loads, the ZAP proxy attaches to it and captures URLs. I have two things that need clarifying here:
How do I capture the URLs/requests when the user navigates through different links? ZAP captures them in its GUI; is there an API that gives me the full URL list?
How do I use Selenium (Python) to capture the URL? It captures only the current URL, and when I go to another link, it doesn't print the new page's URL.
The short answer is yes - ZAP has an excellent API, and you can find the documentation here. For a longer explanation, I will need some details, as I don't fully understand your question.
1 - Are you looking to get all the requests that were proxied through ZAP? You can use the following: /JSON/core/view/sites/?zapapiformat=JSON&formMethod=GET.
2 - Not sure - this seems like a Selenium question, correct?
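To make item 1 concrete, here is a minimal Python sketch using only the standard library. Assumptions not stated in the thread: ZAP is listening locally on port 8080, the /JSON/core/view/sites/ endpoint returns a JSON object of the form {"sites": [...]}, and an API key is only appended when your ZAP instance requires one.

```python
import json
import urllib.request


def parse_sites(body):
    """Extract the list of site URLs from ZAP's JSON payload."""
    return json.loads(body).get("sites", [])


def fetch_sites(api_base="http://localhost:8080", api_key=None):
    """Query the local ZAP API for every site it has proxied."""
    url = api_base + "/JSON/core/view/sites/?zapapiformat=JSON"
    if api_key:
        url += "&apikey=" + api_key
    with urllib.request.urlopen(url) as resp:
        return parse_sites(resp.read().decode("utf-8"))


# Hypothetical usage against a running ZAP instance:
# for site in fetch_sites(api_key="changeme"):
#     print(site)
```

There is also an official `python-owasp-zap` client package that wraps these endpoints, if you would rather not build URLs by hand.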
I use Cucumber with Ruby for automated browser testing, and I need to print the browser's cookies.
How do I get the browser cookies in Capybara? Manually I would:
1. Inspect element in the browser
2. Open the Application tab
3. Read the cookies there
How do I print the browser cookies in Cucumber/Capybara? I have tried
puts Capybara.current_session.driver
but it prints something like this:
#<Capybara::Selenium::Driver:0x007fcbf52e2250>
Since feature tests (which is what Capybara was/is designed for) really shouldn't be dealing with cookies (test user-visible behavior, not implementation details), there is no Capybara cookie API. This means any access is going to be driver-dependent. In your case you appear to be using Selenium, so it would be
page.driver.browser.manage.all_cookies
I tried this, and it solves the problem:
Capybara.current_session.driver.browser.manage.cookie_named("browser_id")[:value]
What are the ways of getting the cookies from a website, other than the following?
Inspecting the browser
Using the Selenium WebDriver driver.get_cookies() command
You can also run JavaScript via Selenium and read document.cookie.
It returns the cookies as a single string:
key=value;key=value
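A small Python sketch of that approach. The parsing helper is plain Python; the driver call at the bottom assumes a live Selenium session and is shown only as hypothetical usage.

```python
def parse_cookie_string(raw):
    """Turn document.cookie's 'a=1; b=2' format into a dict."""
    cookies = {}
    for pair in raw.split(";"):
        key, sep, value = pair.strip().partition("=")
        if sep:  # skip empty fragments (e.g. from an empty cookie string)
            cookies[key] = value
    return cookies


# Hypothetical usage with a live driver:
# raw = driver.execute_script("return document.cookie;")
# print(parse_cookie_string(raw))
```

One caveat worth knowing: document.cookie never exposes HttpOnly cookies, whereas driver.get_cookies() does, so the two approaches are not fully equivalent.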
Using PHP Codeception and the Facebook WebDriver PHP wrapper, is it in general possible to get the environment variables of the actual page request made by PhantomJS or a real browser?
Maybe it is just my misunderstanding of the technology behind acceptance tests, but given that a testing framework like Codeception requests a page using PhantomJS or a real browser like Chrome or Firefox, I would expect to have access to e.g. the $_SERVER global variable. Unfortunately, I cannot find any methods providing this in the Codeception WebDriver module or the Facebook PHP WebDriver wrapper.
Specifically, I have a page that is supposed to use SSL only, so a 301 redirect is expected to happen when visiting the page.
I need an acceptance test case in Codeception to check just that, and checking the $_SERVER['HTTPS'] global variable should do it.
First I tried to match the URL against 'https://', but the WebDriver wrapper method _getCurrentUrl() delivers only the URI part, without protocol and host.
Then I tried to get the $_SERVER variable inside a custom Helper action, but the one accessed directly looks like it comes from the CLI environment, not a browser request.
No, you can't access $_SERVER in acceptance tests, because $_SERVER lives on the server side and all you have is a client.
If you want to check the complete URL, you can use the getCurrentURL() method of the WebDriver instance; it can be accessed in the same way as the _getCurrentUri() method in your helper:
public function checkUrl()
{
    $url = $this->getModule('WebDriver')->webDriver->getCurrentURL();
    // do your checks here, e.g. assert the URL starts with https://
}
If you are already using the WebDriver module:
$currentUrl = $I->executeJS('return jQuery(location).attr("href");');
(plain 'return location.href;' works too, without the jQuery dependency)
We have a bunch of redirects in our Apache configuration. I would like to automate testing the redirects with Selenium, which leads me to some problems:
Calling a URL, but asserting on the redirected page
Checking the URL in the browser after the redirect
Checking the response header to determine the type of redirect (301 vs. 302)
Maybe Selenium is not the best solution for this. Any other suggestions?
Selenium RC has a traffic-capture mode, enabled via selenium.start("captureNetworkTraffic=true"), that lets you capture HTTP responses, including redirects and error codes.
Here is an excellent resource on how to capture and process/format this information once retrieved. It uses Python, but it should give you a start.
For checking the browser's URL, you can use selenium.getLocation().
In the Python WebDriver implementation:
from selenium import webdriver

driver = webdriver.Firefox()
print(driver.current_url)
http://selenium.googlecode.com/svn/trunk/docs/api/py/webdriver_remote/selenium.webdriver.remote.webdriver.html#module-selenium.webdriver.remote.webdriver
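Since the question also asks about distinguishing 301 from 302, a browser isn't strictly needed for that part. Here is a sketch using only Python's standard library; the URL at the bottom is a placeholder for one of your redirecting URLs. The custom handler refuses to follow redirects, so the original 3xx status surfaces instead of the final page.

```python
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the 301/302 surfaces as an HTTPError."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def check_redirect(url):
    """Return (status_code, location_header) for a redirecting URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(url)
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")
    return None, None  # the URL did not redirect at all


# Hypothetical usage against one of your Apache redirects:
# print(check_redirect("http://example.com/old-path"))
```

This complements the Selenium check: use check_redirect for the status code and Location header, and driver.current_url to assert where the browser actually lands.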