On some websites that I access frequently, I get a Google checkbox saying I am not a robot. On some occasions it asks me to pick pictures of rivers or trees as an additional step. Is it possible to bypass this step? I am a human being trying to access the site, not a computer programme. What can I do to convince Google of this?
There could be several reasons for this captcha preventing you from accessing websites. You may have some malware/virus/trojan installed on your PC, or there may be unusual traffic coming from your network. Check your system and your network. If you want to "bypass" it, you can try a proxy site to access these websites.
Please tell me why there is such a big difference between the "Total Blocking Time" reported on https://developers.google.com/ and a local Lighthouse run for a mobile device?
I am not able to give you the address and show you the site; there is just text and a picture. The site is made with Nuxt.js and is generated completely statically.
My computer is a MacBook Air M1. Does Google use less powerful emulated hardware, and therefore report lower numbers?
This official page explains it well: https://web.dev/performance-scoring/
Common problems:
A/B tests or changes in ads being served
Internet traffic routing changes
Testing on different devices, such as a high-performance desktop and a low-performance laptop
Browser extensions that inject JavaScript and add/modify network requests
Antivirus software
Otherwise, this article offers another answer: https://www.debugbear.com/blog/why-is-my-lighthouse-score-different-from-pagespeed-insights
Google PageSpeed uses a "combo" of lab and real-world data (historical field data for your website), whereas Lighthouse uses lab data only (tested locally on your machine) to build its report.
Note: you should trust PSI metrics over lab data alone.
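If you want to see both the lab and the field numbers side by side, you can query the PageSpeed Insights v5 REST API yourself. Below is a minimal TypeScript sketch (Node 18+, which has a global fetch), assuming the documented v5 endpoint and response fields; verify both against the current API docs before relying on them:

// Compare the lab TBT (a single Lighthouse run in Google's datacenter)
// with aggregated real-user field data (CrUX) for the same URL.
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function fetchTbt(url: string): Promise<void> {
  const query = new URLSearchParams({ url, strategy: "mobile" });
  const res = await fetch(`${PSI_ENDPOINT}?${query}`);
  if (!res.ok) throw new Error(`PSI request failed: ${res.status}`);
  const data = await res.json();

  // Lab data: one Lighthouse run on Google's (throttled) hardware.
  const labTbt = data.lighthouseResult?.audits?.["total-blocking-time"]?.numericValue;
  console.log(`Lab TBT (Lighthouse): ${labTbt} ms`);

  // Field data: real-user metrics; may be missing for low-traffic sites.
  console.log("Field metrics:", data.loadingExperience?.metrics ?? "no field data");
}

fetchTbt("https://example.com").catch(console.error);

Running this against your URL and comparing the lab figure with your local Lighthouse run makes the hardware/throttling difference visible: your M1 is much faster than the emulated device PSI uses.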
I have got a DNN website and would like to test the site on multiple devices. I currently use Google Chrome, but its device emulation is not always accurate. Is it possible to use Xamarin Test Cloud or any other software? My company does not want to spend money on a device board.
Have you tried https://www.browserstack.com?
It allows you to test the site across many devices and browsers, but it does have some limitations: it only gives you a screenshot of the page, and the page must be publicly viewable.
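If you want to script those screenshots rather than click through the UI, BrowserStack also exposes a Screenshots REST API. The endpoint, auth scheme, and payload shape in this TypeScript sketch are assumptions drawn from their public docs rather than tested code, so verify them against the current documentation first:

// Hypothetical sketch: request screenshots of a public URL across several
// device/browser combinations via BrowserStack's Screenshots API.
const AUTH = Buffer.from("YOUR_USERNAME:YOUR_ACCESS_KEY").toString("base64");

async function requestScreenshots(url: string): Promise<void> {
  const res = await fetch("https://www.browserstack.com/screenshots", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${AUTH}`,
    },
    body: JSON.stringify({
      url,
      // Combination names must match BrowserStack's published device list.
      browsers: [
        { os: "Windows", os_version: "10", browser: "chrome", browser_version: "latest" },
        { os: "ios", os_version: "16", device: "iPhone 14" },
      ],
    }),
  });
  console.log(await res.json()); // job id plus the state of each screenshot
}

requestScreenshots("https://your-public-site.example").catch(console.error);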
We are currently experiencing diminished performance with one of our customers at our main production site. All subpages and resources seem to be affected as well.
The customer reports a completely broken experience, with the site not working correctly at all, mostly due to assets not loading correctly.
We have already started investigating and have found that - so far - nothing seems to be wrong with the site itself.
Quick rundown:
The production site has a Cloudflare layer, and almost all of its assets are delivered either via CDNJS or Amazon's CloudFront (behind Cloudflare); all assets are reachable via HTTP as well (a quick reachability check is sketched after this rundown)
The site uses SSL and enforces it (the dynamic cert from Cloudflare)
We managed to capture a HAR for a request to one of our sites; the request times are extremely long. If you would like to look at it, use an online HAR viewer and be sure to uncheck validation of the file.
The customer uses Internet Explorer 8 and Chrome (39). While the site is not optimized for IE8, it should run fine in Chrome; in fact, it runs just fine in most browsers above IE9 for all of us.
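For reference, here is a small TypeScript sketch (Node 18+) of the kind of reachability check mentioned above: it times a HEAD request to each asset URL over both schemes. The sample URLs are placeholders; substitute the real asset URLs from the site:

// Probe each asset over http:// and https:// and report status plus latency.
const assetPaths = [
  "cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js", // placeholder asset
  "fonts.googleapis.com/css?family=Roboto",                    // placeholder asset
];

async function probe(scheme: "http" | "https", path: string): Promise<void> {
  const url = `${scheme}://${path}`;
  const start = Date.now();
  try {
    const res = await fetch(url, { method: "HEAD" });
    console.log(`${res.status} ${Date.now() - start}ms ${url}`);
  } catch (err) {
    console.log(`FAILED ${url}: ${(err as Error).message}`);
  }
}

async function main() {
  for (const path of assetPaths) {
    await probe("https", path);
    await probe("http", path);
  }
}

main();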
Notes
We already ruled out:
Virtual delivery problems (there could be physical limitations we are not aware of)
General faultiness of our setup (We tried three different open VPNs to verify this)
Being on the customer's blacklist by accident (although we cannot be entirely sure of this)
SSL Server Name Indication (SNI) problems
(Potentially) a general problem with the customer's network; the customer does not report any problems with "the rest of the internet".
The customer will not give us access to their VPN or disclose security details, so we cannot really test the situation ourselves. We suspect that the customer uses an internal proxy that might be causing the problems described, but we are not sure.
Questions
My questions here are:
Is there any known problem caused by internal networking in conjunction with our setup that can cause this behaviour?
Are there potential problems on our end that we could have overlooked, or things that we do differently from other sites?
It seems the connection is being made (or routed) through a low-bandwidth, high-latency link (or a very congested one). Most of the DNS lookups and connects seem to be taking ~10s.
In the HAR you can see that it affects fonts.googleapis.com and cdnjs.cloudflare.com, and https://www.google-analytics.com/analytics.js has no data captured. To me, the claim that the customer has no problems with "the rest of the internet" seems rather dubious, seeing that in this HAR the browser could not load the analytics script and access to common CDNs is very slow.
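If you want to pull those numbers out of the HAR without eyeballing a viewer, a small script over the HAR JSON does it. A TypeScript sketch, with field names per the HAR 1.2 spec (a value of -1 means that phase did not apply to the entry):

// Print per-entry timing phases from a HAR file passed as the first argument,
// to confirm where the ~10s per request is actually going.
import { readFileSync } from "node:fs";

interface HarTimings {
  blocked: number; dns: number; connect: number;
  ssl: number; send: number; wait: number; receive: number;
}
interface HarEntry {
  time: number;
  request: { url: string };
  timings: HarTimings;
}

const har = JSON.parse(readFileSync(process.argv[2], "utf8"));
const entries: HarEntry[] = har.log.entries;

for (const e of entries) {
  const { dns, connect, ssl, wait } = e.timings;
  console.log(
    `${Math.round(e.time)}ms total | dns ${dns}ms, connect ${connect}ms, ` +
    `ssl ${ssl}ms, wait ${wait}ms | ${e.request.url}`
  );
}

If the dns and connect columns dominate, the problem is in name resolution and routing on the customer's side, not in your application.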
My guesses (pick one or more):
they are testing on a machine different from the one that has no problems with "the rest of the internet"
this machine is very, very slow
it has some kind of content filtering, antivirus, or similar software filtering the web (perhaps with an SSL certificate installed in order to forge & inspect HTTPS traffic)
the access is done through a congested route, or a low-bandwidth, high-latency link
Two hotspots:
CDN points of presence can sometimes be inconsistent; I spent a lot of time understanding this issue. How? In a live session with the client, opening each loaded resource one by one, I realised there were differences between CDN access points (mine in eastern Europe, his in central Europe). The CDN host was one of the biggest US players in the world; anyhow, we fixed this by invalidating (deleting) all files from the CDN so that new/correct ones were loaded. A sketch for comparing edge responses follows below.
You need a CDN that supports serving files over HTTPS, then use that CDN for the SSL requests.
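To check which edge node each side is hitting, compare the CDN response headers from your network against the customer's. The header names in this TypeScript sketch (cf-ray for Cloudflare, x-amz-cf-pop for CloudFront, x-cache in general) are common but provider-specific, so treat them as a starting point rather than a guarantee:

// HEAD an asset and print the headers that usually identify the serving edge.
async function inspectEdge(url: string): Promise<void> {
  const res = await fetch(url, { method: "HEAD" });
  const edgeHeaders = ["cf-ray", "x-amz-cf-pop", "x-cache", "age", "via"];
  console.log(url);
  for (const h of edgeHeaders) {
    const v = res.headers.get(h);
    if (v) console.log(`  ${h}: ${v}`);
  }
}

// Run the same check from your machine and from the customer's network
// (e.g. via their browser devtools), then compare the edge identifiers.
inspectEdge("https://cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js")
  .catch(console.error);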
We have an issue where we have a website for test and an equivalent website for live. What we are finding is that, due to carelessness, our testers are using the wrong site (e.g. testing on the live site!).
We have total control over both sites, but since the test site is for acceptance testing by the users, we don't want to make them different in any way. The sites must look the same, and there is also a layer of management that will kick up a storm if the test and live sites differ in any way.
How have other people solved this problem? I am thinking of a browser plugin to make the browser look different somehow (e.g. changing the colour of the location bar when on the test website). Does anyone know of a plugin or a technique that would work? (We primarily use Firefox and Chrome)
Thanks,
Phil
UPDATE
We eventually settled on a programme of: different credentials for the test and live sites (this was not popular!) and making a series of plugins available for those who wanted them (colourize tabs for Chrome and Firefox users; we never did find a good plugin for IE).
Thanks to those who answered.
In our company we use different site names:
www.dev.site.com - for developers
www.qa.site.com - for QA's
www.site.com - production site
Another good practice is to use different user credentials for the dev/QA and prod sites.
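If management insists the sites themselves stay byte-identical, the visual distinction can live entirely in the tester's browser. A minimal userscript-style sketch in TypeScript (e.g. compiled for Tampermonkey/Greasemonkey) that overlays a coloured banner on the non-production hostnames above; the hostnames and colours are just examples:

// Inject a fixed banner on dev/QA hosts; production gets no banner at all,
// so the sites themselves remain untouched.
const envColours: Record<string, string> = {
  "www.dev.site.com": "#c0392b", // red for dev
  "www.qa.site.com": "#e67e22",  // orange for QA
};

const colour = envColours[window.location.hostname];
if (colour) {
  const banner = document.createElement("div");
  banner.textContent = `Environment: ${window.location.hostname}`;
  banner.style.cssText =
    `position:fixed;top:0;left:0;right:0;z-index:99999;` +
    `background:${colour};color:#fff;text-align:center;` +
    `font:bold 13px sans-serif;padding:4px;`;
  document.body.appendChild(banner);
}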
I have bought an iPad website, and it has been moved to my server.
Now I have tried to create an addon domain, but it does not work on my first hosting account.
On my second hosting account it works, but on that server there is another iPad website, so I don't think it is smart to do this because they would share the same IP address.
So adding an addon domain does not work, and the site is down now!
I have filed a service ticket, but I think it will take at least 8 hours before I get an answer.
Can anyone tell me how bad this is for my SERP position in Google?
The website has always been on the first page.
Will this 404 error hurt my site? Or is it better to place the site on the same server as the other iPad website?
It is not ideal to serve 404s/timeouts; however, your rankings should recover. You mentioned that the sites are different. Moving the site to a different server/IP shouldn't matter too much as long as you can minimise the downtime of the move (and moving should probably be preferred over downtime, if possible). To make sure this is communicated: do NOT serve site #2 as site #1 in the short term, as you will run into duplicate content issues.
If you don't already have one, you might open a Google Webmaster Tools account. It will provide you with some diagnostics about your outage (e.g. how many attempts Google made, the returned response codes, etc.) and, if something major happens (which is unlikely), you can request re-inclusion.
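Once the addon domain is sorted, it is worth confirming that your key pages respond with 200 again rather than 404s or timeouts. A quick TypeScript sketch (Node 18+); the URL list is a placeholder, so substitute your own important pages:

// Fetch each URL and report its HTTP status without following redirects,
// so an unexpected 301/302 is visible too.
const urlsToCheck = [
  "https://www.example.com/",
  "https://www.example.com/important-page",
];

async function checkStatus(url: string): Promise<void> {
  try {
    const res = await fetch(url, { redirect: "manual" });
    console.log(`${res.status} ${url}`);
  } catch (err) {
    console.log(`UNREACHABLE ${url}: ${(err as Error).message}`);
  }
}

Promise.all(urlsToCheck.map(checkStatus));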
I believe it is very bad if the 404 is the result of an internal link.
I cannot tell you anything about which server you should host it on, though, as I have no idea whether that scenario is bad. Could you possibly host it on the one server, and then, when the other is up, host it from there?