Ignore specific URLs from Cloudflare DDoS protection (API)

Is it possible to exclude specific URLs (example.com/oauth/*, example.com/api/*) from Cloudflare DDoS protection?

You'll want to use Page Rules to apply specific settings to individual URLs (including wildcard paths):
https://support.cloudflare.com/hc/en-us/articles/218411427-Understanding-and-configuring-Cloudflare-Page-Rules-Page-Rules-Tutorial-
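As a rough sketch, using the paths from the question (the exact setting names and the number of rules available depend on your Cloudflare plan), you'd create one Page Rule per pattern and relax the security setting for it:

example.com/oauth/*  ->  Security Level: Essentially Off
example.com/api/*    ->  Security Level: Essentially Off

Note that this lowers protection for those paths rather than bypassing Cloudflare entirely, so other zone-wide features may still apply to them.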

Related

Set Base URL using .htaccess

I'm setting up a clients area so my customers can review their site during development. I want to set it up so the URL is http://clients.mydomain.com/clientname/
Is there a way in the .htaccess file to set that as the base URL? I'm using the leading-slash format for my URLs in the page (i.e. /about/ or /css/), which is fine locally and when I deploy to production, but doesn't work in the scenario outlined above.
The proper way would be to use relative links in your HTML; it's unreliable to try to track the Referer and rewrite every subsequent request to shove the /clientname/ back in as a prefix.
If you make a subdomain for each customer and develop sites there, you don't have to change the base URL. This will also prevent other .htaccess rules from breaking when you deploy to the live server.
So use:
http://clientname.mydomain.com
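As a sketch, assuming Apache with mod_vhost_alias enabled, a wildcard DNS record for *.mydomain.com, and each client's files under a hypothetical /var/www/clients/<clientname> directory:

# Map clientname.mydomain.com to /var/www/clients/clientname
# (%1 expands to the leftmost label of the requested hostname)
<VirtualHost *:80>
    ServerAlias *.mydomain.com
    # Let SERVER_NAME come from the Host header, as mod_vhost_alias recommends
    UseCanonicalName Off
    VirtualDocumentRoot /var/www/clients/%1
</VirtualHost>

With this, adding a client is just a matter of creating a directory, and the leading-slash URLs (/about/, /css/) resolve correctly without any rewriting.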

Is it better to 301 redirect requests to a www. subdomain or non-www?

A lot of people talk about 301 redirecting incoming requests to one canonical URL for SEO or other purposes.
This can be useful. Suppose, for example, that a search engine ranked URLs and unfortunately treated a URL with a www. prefix differently from one without. Facebook, for example, is reachable at http://facebook.com, http://www.facebook.com (and even more, like https://).
I guess my question would be: if there is a difference, would it be better overall to redirect to the URL with the www. subdomain, or without? Reasoning would be really appreciated. Thank you.
Often, shorter is better (e.g. use domain.com instead of www.domain.com), since your URLs will be shorter and thus your HTTP requests and responses will also be smaller.
However, one thing to keep in mind is that if your site uses cookies, and you set a cookie on domain.com, that will get sent to all subdomains. If you want to keep a "cookieless" subdomain for performance reasons (e.g. requests for images at images.domain.com don't carry the cookies) then you should consider using the "www." prefix in the canonical URLs if you need cookies to be sent in the requests for pages but not for sub-resources.
You'll also want to use the "www." form if your domain is of a funky (ccTLD) format like XX.YY, because cookies work properly on www.XX.YY but you'll have problems with older browsers if you try to set cookies on XX.YY.
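Whichever form you choose, the redirect itself is a single 301 rule. A minimal .htaccess sketch, assuming Apache with mod_rewrite and a hypothetical example.com (swap the hostnames around to canonicalize to the www form instead):

RewriteEngine On
# 301 all www requests to the bare domain, preserving the path
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]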

Preventing any external links from the site

I am using DokuWiki, and since we've tried to secure it as much as possible, the best protection for us is to keep its location on our server secret. We therefore want to make sure that no link on any page can be clicked that would reveal the location of our infrastructure. Is there any way to configure this restriction in DokuWiki, or are there known ways to pass URLs through a third party?
Have you tried protecting the site with .htaccess and .htpasswd? That's a good way to keep others from getting into your site.
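For example, a minimal .htaccess sketch using HTTP Basic Auth (the AuthUserFile path is hypothetical; create the password file with the htpasswd utility):

AuthType Basic
AuthName "Restricted area"
# Password file created with: htpasswd -c /etc/apache2/.htpasswd someuser
AuthUserFile /etc/apache2/.htpasswd
Require valid-user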
And if the site is online, you should include a robots.txt to stop crawlers from indexing it:
User-agent: *
Disallow: /
Hope this helps.

Will not having a www redirect affect Google/Bing SEO?

If my website only responds to www.example.com, and not example.com, does this affect search rankings at all? I haven't found anything to confirm or deny this for any major search engine, and I'm curious.
I was reading an article on this a while back from Scott Guthrie that relates to the IIS SEO Toolkit; the main points are as follows:
4 Really Common SEO Problems Your Sites Might Have
Below are 4 really common scenarios that can cause your site to inadvertently expose multiple URLs for the same content. When this happens, external sites linking to yours will end up splitting their links across multiple URLs, and as a result you'll have a lower page ranking with search engines than you deserve.
SEO Problem #1: Default Document
IIS (and other web servers) supports the concept of a "default document". This allows you to avoid having to explicitly specify the page you want to serve at the root of the web site/application or within a sub-directory. This is convenient, but it means that by default this content is available via two different publicly exposed URLs (which is bad). For example:
http://scottgu.com/
http://scottgu.com/default.aspx
SEO Problem #2: Different URL Casings
Web developers often don’t realize URLs are case sensitive to search engines on the web. This means that search engines will treat the following links as two completely different URLs:
http://scottgu.com/Albums.aspx
http://scottgu.com/albums.aspx
SEO Problem #3: Trailing Slashes
Consider the below two URLs – they might look the same at first, but they are subtly different. The trailing slash creates yet another situation that causes search engines to treat the URLs as different and so split search rankings:
http://scottgu.com
http://scottgu.com/
SEO Problem #4: Canonical Host Names
Sometimes sites respond both with a leading "www" hostname prefix and with just the bare hostname. This causes search engines to treat the URLs as different and split search rankings:
http://scottgu.com/albums.aspx/
http://www.scottgu.com/albums.aspx/
The full article is at http://weblogs.asp.net/scottgu/archive/2010/04/20/tip-trick-fix-common-seo-problems-using-the-url-rewrite-extension.aspx
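The article fixes these with the IIS URL Rewrite module; as a rough .htaccess equivalent for problem #1 on Apache (assuming mod_rewrite, with default.aspx standing in for whatever your default document is):

RewriteEngine On
# Only act when the client literally asked for /default.aspx,
# so the server's internal default-document subrequest can't loop
RewriteCond %{THE_REQUEST} \s/default\.aspx[?\s] [NC]
RewriteRule ^default\.aspx$ / [R=301,L]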
Google treats www.example.com and example.com as two separate domains (since 'www' is technically a sub-domain). Neither is better than the other in terms of SEO, as long as you don't mix and match links - i.e. some links point to example.com while others point to www.example.com.
If you don't have any redirects from one to the other, then links into the site (and so visitor traffic) may be split between the two sub-domains, effectively meaning you're competing with yourself in search engine rankings. It's probably a good idea to pick one (either example.com or www.example.com) then set up redirects on the other domain, and/or add canonical links to pages so that search engines know that the pages should be treated as the same site.
There's plenty more reading available on canonical links and the www vs. non-www question.
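If you settle on the www form, for instance, a minimal .htaccess sketch (assuming Apache with mod_rewrite, and that DNS for the bare domain reaches the same server):

RewriteEngine On
# 301 any non-www host to its www equivalent, keeping the path
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]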

.htaccess, YSlow, and "Use cookie-free domains"

One of YSlow's measurables is to use cookie-free domains to serve static files.
"When the browser requests a static
image and sends cookies with the
request, the server ignores the
cookies. These cookies are unnecessary
network traffic. To workaround this
problem, make sure that static
components are requested with
cookie-free requests by creating a
subdomain and hosting them there." --
Yahoo YSlow
I interpret this to mean that I could experience performance gains if I move www.example.com/images to static.example.com/images.
Although this is easy to do, I would lose the handy ability within my content management system (Joomla/WordPress) to easily reference and link to these images.
Is it possible to use .htaccess to redirect all requests for a particular folder on www.example.com to a folder on static.example.com instead? Would this method also fool the CMS into thinking the images were located in the default locations on its own domain?
Is it possible to use .htaccess to redirect all requests for a particular folder on www.example.com to a folder on static.example.com instead?
Possible, but counterproductive: the client would have to make an HTTP request, get the redirect response, then make another HTTP request.
This costs a lot more than the single line of cookie data saved!
Would this method also fool the CMS into thinking the images were located in the default locations on its own domain?
No.
Although this is easy to do, I would lose the handy ability within my content management system (Joomla/WordPress) to easily reference and link to these images.
What you could try to do is create a plugin in Joomla that dynamically creates these references.
For example, you have a plugin that, when you enter {dynamic_path path} in an article, prepends 'static.example.com/images' to the path provided. So every time you need to change the server path, you just change it in the plugin. For the links that are already in the database, you can try using phpMyAdmin to update them to this structure.
It still loses the WYSIWYG ability in TinyMCE, but it is an alternative.
In theory you could create a virtual domain that points directly to the images folder, such as images.example.com. Then in your CMS (hopefully at the theme layer) you could replace any paths that point to the images folder with an absolute path to the subdomain.
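A sketch of such a virtual host, assuming Apache and a hypothetical /var/www/site/images directory that the CMS already serves from:

<VirtualHost *:80>
    ServerName images.example.com
    # Serve the CMS's existing images folder directly, no redirect involved
    DocumentRoot /var/www/site/images
</VirtualHost>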
The redirects would cause far more network traffic, and far more latency, than simply leaving things as they are.
It would redirect the request but the client would still be sending its cookies to the server, so really you accomplished nothing. You would have to directly access the files from a domain that isn't storing cookies for it to work.
What you really want to do is use staticexample.com/images instead of static.example.com/images, so that you don't pick up any cookies on the example.com domain that you may have set. If all you do is serve images from that domain with a simple Apache server or something, then you can configure that server not to return even a session cookie.
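For instance, a hedged sketch of such a separate static host in Apache (assuming mod_headers is enabled; the names and paths are made up):

<VirtualHost *:80>
    ServerName staticexample.com
    DocumentRoot /var/www/static
    # Strip any cookie the backend might try to set on this host
    Header unset Set-Cookie
</VirtualHost>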
The redirects are a very bad idea. Cookies cause some performance hits but round trips to the server such as a redirect would cause are a much more serious performance issue.
I did the following and it worked:
# Apply the session cookie domain only to non-image files
# (PCRE negative lookahead; php_value requires mod_php)
<FilesMatch "^(?!.*\.(gif|jpe?g|png)$).*$">
php_value session.cookie_domain example.com
</FilesMatch>
This sets the PHP session cookie domain for everything except image files, so requests for the images themselves stay cookie-free.