I developed a web app in VueJS, but it seems to be having some issues with CloudFlare.
When a user is presented with the captcha verification screen and completes the verification, the page loads, but resources such as my JS files are blocked.
However, if I, say, whitelist a certain country and access the site from there, the page works as intended.
I've also had issues where, after changing certain elements, the page doesn't update correctly even after purging the cache. I feel this issue is related to the one above.
(I'm also using SSL if that makes any difference.)
Out of curiosity, I've recently seen a lot of pages using a “newer” 5-second protection-style page rather than a captcha. I was wondering whether CloudFlare offers anything like this, as I feel the captcha gets very annoying for my users.
Thanks and any help is appreciated.
Situation:
I am writing test automation for a website. At one point there is a link button on my website; clicking it redirects me to an external website, where I have to log in. As soon as I do, I am redirected back to my original web page, which contains some 'connections' that I need.
Problem:
As soon as Cypress clicks the redirection button, it goes to a blank page.
Ideal solution:
I would like to automate the entire scenario; if that's not possible, then at least a workaround.
As suggested in the Cypress docs, you should really be using cy.request() to log in. You don't control the 3rd-party site, and that makes your test very flaky.
For example, a lot of login pages are constantly changing and are A/B tested for the purpose of preventing bots from logging in, including testing bots. The data:, URL is probably the result of an HTTP redirect.
Thankfully, using cy.request() you can 'fake' logging in by making a request to the server directly from code (which doesn't change as often), and you never have to leave your app to log in.
Here's a recipe for Single Sign-On for example.
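To give a rough idea of the shape of such a login (the /api/login endpoint, the credential environment variables, and the page path below are hypothetical placeholders, not your backend's actual API):

```js
// Log in programmatically before each test instead of driving the
// third-party login UI. NOTE: the endpoint, payload and cookie behaviour
// below are assumptions -- adapt them to whatever your backend expects.
beforeEach(() => {
  cy.request({
    method: 'POST',
    url: '/api/login',                      // hypothetical login endpoint
    body: {
      username: Cypress.env('USERNAME'),
      password: Cypress.env('PASSWORD'),
    },
  }).then((resp) => {
    expect(resp.status).to.eq(200);
    // The Set-Cookie header on the response establishes the session,
    // so subsequent cy.visit() calls are already authenticated.
  });

  // Now visit the page that normally requires the login redirect.
  cy.visit('/connections');
});
```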
Hope that makes sense!
I have administrated a community site for quite a while, and over time a handful of people have told me that their browsers got infected with a "malware add-on" that randomly inserts ad banners into websites they visit.
While they say some websites don't seem to "allow" such foreign insertions, my forum did "allow" the browser ads to be added. (For example, a piece of malware called "OnlineBrowserAdvertising".)
Is there a way (through HTML, JavaScript, ...) to prevent browsers from adding something to my site? I am 100% sure that neither my site nor my webspace is infected; it's the visitors' browser add-ons messing with my page.
You cannot prevent browsers (or their add-ons) from injecting code, at least not from your end: the data is retrieved from your server and stored locally (at least temporarily) on the user's device. From the moment the data leaves your server, it is essentially out of your control.
This is purely a client-side issue that can be rectified by following standard security practices: tell your users to keep their software up to date, run an antivirus/firewall solution, and avoid visiting or downloading from suspicious websites.
This issue began last week. Prior to that, I was not having any problems, and I am not aware of any changes to my site or platform.
Now, when I want to share a new blog post or article to Google+, it fails. Typically I would use the embedded +1 button on a specific post and use the expanded box to create my post. The box comes up, but instead of a nice title and image, it now shows only the URL for the page, and sometimes adds extra characters to the end of the URL. If I take the URL from the address bar and try to compose a new update directly on Google+, I get the same issue. If I enter the URL into the Link field of a status update, it usually comes back with "could not load website."
Here's a sample
You'll find Google+ sharing buttons above and below the article. Sharing to every other network works as expected.
My site is a Drupal site that has been operating for 10 months. I am a Drupal developer, but have never encountered an issue like this.
It appears that requests from the Google crawler are being rejected by your server. I tried testing the microdata with the Structured Data Testing Tool, and it runs into problems connecting to your site. Other sites work fine.
If you have access to your site's Apache access logs, I would check those for problems with requests to that URL. You can narrow your search by looking for the crawler's user agent: Google (+https://developers.google.com/+/web/snippet/)
My guess is that something changed in your server's configuration, and that is the cause. Start with the logs and see what's there.
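If it helps, here is a minimal sketch of that log check as a Node script; the log path and the exact user-agent match are assumptions, so adjust them to your own server setup:

```js
// Minimal sketch: scan an Apache access log for hits from the Google+
// snippet crawler so you can inspect the status codes it received.
// The log path below is an assumption -- point it at your own access log.
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
  input: fs.createReadStream('/var/log/apache2/access.log'),
});

rl.on('line', (line) => {
  // The crawler identifies itself with this URL in its user agent.
  if (line.includes('developers.google.com/+/web/snippet')) {
    console.log(line); // check the HTTP status code (403/500/etc.) on each hit
  }
});
```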
I am currently assessing the best options for integrating multiple sites with a single sign-on system. The ambition is to have a unified header with shared assets across the sites. Currently it operates as a separate login page from which the user is redirected back to the page they were on before, similar to Google accounts.
There has been a proposal for an iframe or a popup iframe.
The benefit of this appears to be entirely for the user, in that they do not have to leave the page they are on. My concerns with this approach are:
if we make changes to the login page itself, we will need to make changes to the iframe, which could require redeploying all the sites at the same time
a regular iframe would be tightly tied to each site's design and will create problems across browsers
pop-up iframes are problematic on mobile devices
if a user has scripting disabled, they will be unable to log in
a user may have a pop-up blocker in place
Does anyone have any other arguments for or against using iframes for an SSO system? Any critique of the points I have already raised is also greatly appreciated.
Thanks!
This is just a quick question really for my own peace of mind more than anything.
When accessing an app hosted on Heroku through https://myapp.herokuapp.com I get a warning in Google Chrome and Firefox (but not Safari) along the lines of:
You have requested an encrypted page that contains some unencrypted information. Information that you see or enter on this page could easily be read by a third party.
I don't really want to pay $20 a month for the SSL Endpoint add-on, and I was just wondering why these warnings appear, whether there is a way around them, and how to find the content that is unencrypted.
I have tried both with and without config.force_ssl = true.
Any suggestions? Any help would be appreciated.
Turns out that this was an issue with a request from Google Maps.
As far as I can remember, they changed their API to allow access over https:// after we began development.
It was simply a case of following the new process documented on the Google API page.
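For anyone else trying to track down which resources trigger the warning, a quick sketch (run from the browser console on the https:// page) is to list every sub-resource that was loaded over plain http://:

```js
// Run in the browser console on the https:// page: lists sub-resources
// that were fetched over plain http://, i.e. the "unencrypted information"
// the mixed-content warning is about.
performance.getEntriesByType('resource')
  .map((entry) => entry.name)
  .filter((url) => url.startsWith('http://'))
  .forEach((url) => console.log('Insecure resource:', url));
```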