How to stop spam bot from accessing site using htaccess? - apache

I have an Apache server running WordPress, and recently I noticed heavy traffic from a spam bot, more specifically bot-traffic.xyz, which shows up in the "Top Referrals" section of Google Analytics. My question is: since I don't know the source IP address, how do I block the spam bot using .htaccess?
I have found a post (https://moz.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data) describing the process, but I'm not sure it still applies since it's from 2015.
The post says to do something like this:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*domain1\.com/ [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*domain2\.com/ [NC]
RewriteRule ^(.*)$ - [F,L]
If this is correct, how would I block bot-traffic.xyz?
Can someone explain what the above code does?
Thanks,

These bots are not hitting your site at all; they take your Google Universal Analytics (UA) tracking code, hit Google directly, and push their own website URL in the page variable. Adding a rewrite rule won't help. Check your Apache log file and you won't find any of this traffic.
These URLs all go back to the same website, run by someone selling fake traffic as a service. Trying to filter them one by one is a game of whack-a-mole, so the best way to get rid of this is to set up view filters in Google Analytics Universal: Admin icon (lower left) -> Filters (in the right-most column) -> Add Filter -> choose Custom filter, Exclude, on Request URL, then build a regex that removes the offending sites.
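For bots that actually hit your server (unlike the ghost referral spam described above, which never touches Apache), the rule from the question can be adapted by matching the Referer header. A minimal sketch, assuming the requests arrive with bot-traffic.xyz as the referrer:
RewriteEngine on
# Match the Referer header against the spam domain (NC = case-insensitive)
RewriteCond %{HTTP_REFERER} bot-traffic\.xyz [NC]
# F returns 403 Forbidden, L stops processing further rules
RewriteRule ^ - [F,L]
For ghost referral spam this rule never fires, because no request reaches Apache; the Google Analytics view filter above remains the actual fix.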

Related

How to move opencart site http to https

I have already installed an SSL certificate on my OpenCart site, but while some pages work fine with https, the category pages do not. Do I need to change all the URLs in the database as well? In the config file, I have already set https.
Some of these may not apply to your particular installation but in the interest of creating a comprehensive answer, I've tried to cover all the bases here:
Note: you might need to adjust the table names depending on your store's table prefix if they don't begin with oc_
Open config.php and admin/config.php and change all of the constant URL declarations to https - make sure to include HTTP_SERVER and HTTP_CATALOG.
In your admin panel go to System > Settings, click Edit, and in the Server tab set Use SSL: to Yes.
In your database, update the store_url column in the oc_order table so that all links are https. This is important because updating orders can fail if the API attempts to access the http version of your site. You can use this query: UPDATE oc_order SET store_url = REPLACE(store_url, 'http:', 'https:')
If you have any hard-coded images or links in your description tables, you should replace those as well. SSL will still work, but the browser will show a mixed-content warning in the address bar. This includes oc_product_description, oc_category_description, and any other tables where you might have created HTML content.
The same goes for your theme files. It's fairly common to find hard-coded http:// links and images in footer.tpl and header.tpl, for starters. You can simply browse your site to see whether any pages are not showing the green lock icon in the browser and take it from there.
Another culprit that can break https is third-party extensions, which can exist both as files and, in OC2, as ocmods in the oc_modification table.
Finally, create a redirect in .htaccess to gracefully let traffic know that your pages can now be found on https. I've excluded robots.txt and any connections for the openbay routes because, in my experience, redirecting the eBay webhooks broke things, and they seem to be http-only by default. I suspect this is a shortcoming in how openbay handles those requests, or possibly a configuration issue, but I was unable to find a workaround that didn't break openbay, so for now I'd recommend leaving those requests untouched. I am using this in .htaccess:
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !/robots\.txt$
RewriteCond %{QUERY_STRING} !^route=ebay/openbay/*
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
That should do it!

How to integrate Google OAuth 2.0 with a Subdomain

I have found some similar questions on the website, however I couldn't find the proper answer that works for my website.
So far, OAuth looks hard for me to implement; Facebook was much easier.
I am trying to integrate OAuth into one of my websites. The problem is that I am using a subdomain for it, and I'm getting an error when I press the Google login button:
The redirect URI in the request: http://a.example.com/auth/google did not match a registered redirect URI
In the Google Developers Console, I didn't add anything to the JavaScript Origins, but I added the following to the Redirect URIs:
https://a.example/auth/google_oauth2/callback
I also found this .htaccess code and used it, but it doesn't seem to change anything:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^google\.
RewriteCond %{QUERY_STRING} state=([a-z0-9]+)
RewriteRule ^(.*)$ http://%1.example.com/$1 [L]
Anyone knows what needs to be done to integrate OAuth with a subdomain?
The answer is in the question: "http://a.example.com/auth/google" doesn't match "https://a.example/auth/google_oauth2/callback".
The redirect URI your application sends must match a registered redirect URI character for character - same scheme (http vs https), same host, and same path.

.htaccess Redirect based on HTTP_REFERER being empty

I'm trying to set up a redirect on a WP blog installation that will detect anyone coming in from nowhere (i.e. not from another site). The idea is to trap some of the spambots that plug pre-constructed URLs into the system to create comments/posts. I figure if they don't have a referrer site, I can pop them back to the homepage (www.domain.com/index.php or just www.domain.com), which should mess with the bots but not with real people.
I understand that the referrers can be forged but hopefully it'll stop the stupids, at least.
I have very little clue about .htaccess rewrite rules (I apologise for being a noob), but I couldn't find one that does this in existing answers or anywhere else online, despite several searches. Either no one's done it or I'm not phrasing it correctly.
Any help appreciated. :)
I'd advise against this. By doing it, you may annoy and alienate some of your potential users: for example, my browser is set not to report referrer information, and others use anonymity networks. The dumb bots you can catch by matching their reported user-agent string (as seen here).
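As a sketch of that user-agent approach - the agent names below are placeholders, so substitute whatever actually shows up in your access log:
RewriteEngine On
# Deny requests whose User-Agent matches known spam bots (placeholder names)
RewriteCond %{HTTP_USER_AGENT} (EvilBot|SpamCrawler|FakeAgent) [NC]
RewriteRule ^ - [F,L]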
Otherwise it's simple: match against the HTTP_REFERER environment variable in a RewriteCond:
RewriteCond %{HTTP_REFERER} ^$
RewriteRule .* http://example.com/
The RewriteCond checks whether the referer is an empty string; the RewriteRule redirects everything to the http://example.com/ root. Because the substitution is an absolute URL this is a hard (external) redirect, and by default the server issues a 302 response; add the R=301 flag if you want a permanent redirect. If you just want to sneakily serve another resource, use a soft redirect (an internal rewrite) by specifying a relative URL, like RewriteRule .* index.php. However, it may be kinder to people who don't send referrer information to redirect them to a page saying something like "You should enable referrer reporting if you want to read this page".
For more examples of this kind of thing, see the manual; there's a very similar prevent-hotlinking recipe there.

Prevent users from accessing files using non apache-rewritten urls

May be a noob question but I'm just starting playing around with apache and have not found a precise answer yet.
I am setting up a web app that uses URL rewriting heavily, to show nice URLs like [mywebsite.com/product/x] instead of [mywebsite.com/app/controllers/product.php?id=x].
However, I can still access the page by typing the URL [mywebsite.com/app/controllers/product.php?id=x]. I'd like to make that impossible, i.e. redirect people to an error page if they do so, and allow them to access the page via the "rewritten" syntax only.
What would be the easiest way to do that? And do you think it is a necessary measure to secure an app?
In your PHP file, examine the $_SERVER['REQUEST_URI'] and ensure it is being accessed the way you want it to be.
There is no reason why this should be a security issue.
RewriteCond %{REDIRECT_URL} !^/app/controllers/product\.php$
RewriteRule ^app/controllers/product\.php$ /product/x [R,L]
RewriteRule ^product/(.*)$ /app/controllers/product.php?id=$1 [L]
The first rule redirects any request for /app/controllers/product.php that has no REDIRECT_URL variable set back to the clean URL. The internal rewrite in the last rule sets that variable when it calls the real page, so rewritten requests are not redirected.
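A variant I've seen used (not part of the answers above, so treat it as an assumption) tests THE_REQUEST instead, which holds the original request line sent by the client and is not changed by internal rewrites, and answers a direct request with a 403 rather than a redirect:
RewriteEngine On
# THE_REQUEST still contains the path the client actually asked for,
# so this condition only matches direct requests, never internal rewrites
RewriteCond %{THE_REQUEST} \s/app/controllers/product\.php [NC]
RewriteRule ^ - [F,L]
RewriteRule ^product/(.*)$ /app/controllers/product.php?id=$1 [L]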

Can Apache configuration check cookies?

My situation:
We have a mobile version of our website, and want to start redirecting mobile users to it. The plan is to do this in Apache httpd.conf or .htaccess, using something like this:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (iPhone|Blackberry|...)
RewriteRule (.*) mobile/$1
However we want there to be a way for users to override our default action of redirecting them. One way we thought to do it was to show a link on the mobile site directing back to the regular site, and store a cookie when they use that link.
Could the Apache configuration file check a cookie before redirecting?
Is there a better way?
The HTTP_COOKIE server variable contains the cookies passed from the client to the server. You can test it in a RewriteCond to find out whether a particular cookie has been set by a script or module.
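A minimal sketch of how that could look combined with the redirect from the question - the cookie name fullsite=1 is hypothetical and would be set by the "back to the regular site" link:
RewriteEngine On
# Skip the mobile redirect when the opt-out cookie is present
RewriteCond %{HTTP_COOKIE} !fullsite=1 [NC]
# Extend the agent list as in your original rule
RewriteCond %{HTTP_USER_AGENT} (iPhone|Blackberry) [NC]
RewriteRule (.*) mobile/$1 [L]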