I have hosted a static webpage on GitLab Pages. The URL of the webpage is myname.gitlab.io.
I have another website hosted with HostGator which has the URL "mysecondwebsite.com". "mysecondwebsite.com" has thousands of static HTML pages hosted on various paths like "mysecondwebsite.com/charts/folder1/1.html", "mysecondwebsite.com/charts/folder1/2.html", "mysecondwebsite.com/charts/folder1/3.html" and so on.
I don't want "mysecondwebsite.com", or any of the pages in it, to be accessible directly. Hence, I've enabled hotlink protection, which works as expected. Now I also want to allow access to "mysecondwebsite.com" ONLY FROM myname.gitlab.io. That site has a list of hyperlinks which, when clicked, should open the appropriate page on "mysecondwebsite.com". To achieve this, I've entered the following in the .htaccess file on HostGator, but it isn't helping. I see 403 Forbidden:
# IP to allow
order allow,deny
deny from all
allow from gitlab.io
Current hotlink protection settings -
# DO NOT REMOVE THIS LINE AND THE LINES BELOW HOTLINKID:r2xGl7fjrh
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mysecondwebsite.com/.*$ [NC]
RewriteRule .*\.(.*|jpg|jpeg|gif|png|bmp|tiff|avi|mpeg|mpg|wma|mov|zip|rar|exe|mp3|pdf|swf|psd|txt|html|htm|php)$ https://mysecondwebsite.com [R,NC]
# DO NOT REMOVE THIS LINE AND THE LINES ABOVE r2xGl7fjrh:HOTLINKID
I am in no way an expert with web hosting. Could I please get some help to get this working?
UPDATED .htaccess
Options All -Indexes
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http(s)?://((myfirstwebsite\.com)|((www\.)?mysecondwebsite\.com))/ [NC]
RewriteRule .* - [F]
HTTP LIVE HEADER DUMP
https://mysecondwebsite.com/charts/thisfolder/thisfile.html
Host: mysecondwebsite.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:98.0) Gecko/20100101 Firefox/98.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Alt-Used: mysecondwebsite.com
Connection: keep-alive
Upgrade-Insecure-Requests: 1
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: cross-site
Sec-Fetch-User: ?1
GET: HTTP/2.0 403 Forbidden
cache-control: private, no-cache, no-store, must-revalidate, max-age=0
pragma: no-cache
content-type: text/html
content-length: 699
date: Wed, 06 Apr 2022 07:13:17 GMT
server: LiteSpeed
content-security-policy: upgrade-insecure-requests
alt-svc: h3=":443"; ma=2592000, h3-29=":443"; ma=2592000, h3-Q050=":443"; ma=2592000, h3-Q046=":443"; ma=2592000, h3-Q043=":443"; ma=2592000, quic=":443"; ma=2592000; v="43,46"
X-Firefox-Spdy: h2
---------------------
https://mysecondwebsite.com/favicon.ico
Host: mysecondwebsite.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:98.0) Gecko/20100101 Firefox/98.0
Accept: image/avif,image/webp,*/*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Alt-Used: mysecondwebsite.com
Connection: keep-alive
Referer: https://mysecondwebsite.com/charts/thisfolder/thisfile.html
Sec-Fetch-Dest: image
Sec-Fetch-Mode: no-cors
Sec-Fetch-Site: same-origin
GET: HTTP/3.0 404 Not Found
content-type: text/html
last-modified: Mon, 28 Mar 2022 13:48:20 GMT
etag: "999-6241bca4-dfd29bee5117e228;br"
accept-ranges: bytes
content-encoding: br
vary: Accept-Encoding
content-length: 911
date: Mon, 04 Apr 2022 10:11:14 GMT
server: LiteSpeed
content-security-policy: upgrade-insecure-requests
alt-svc: h3=":443"; ma=2592000, h3-29=":443"; ma=2592000, h3-Q050=":443"; ma=2592000, h3-Q046=":443"; ma=2592000, h3-Q043=":443"; ma=2592000, quic=":443"; ma=2592000; v="43,46"
X-Firefox-Http3: h3
---------------------
allow from gitlab.io doesn't work on the HTTP Referer header like you seem to be expecting. Rather, it works based on the IP address of the user making the request.
Instead you want to use something that checks the Referer and denies access when it doesn't contain myname.gitlab.io or your own website's host name. You can do that with mod_rewrite by placing the following in your .htaccess file:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http(s)?://((myname\.gitlab\.io)|((www\.)?mysecondwebsite\.com))/ [NC]
RewriteRule .* - [F]
This would allow referrers from your GitLab site, and would then allow those pages to fetch further resources such as images, JS, and CSS. In this rule:
RewriteEngine on - turns on rewrites; this needs to be specified once in your .htaccess and is shared between all the rewrite rules and conditions
RewriteCond - specifies a condition for the next rewrite rule
! says that the following regular expression should be negated (not matched)
^ is the beginning of the regular expression
NC is "no case", meaning that this rule is case insensitive and will work for both upper-case and lower-case input
RewriteRule is the actual rule
.* says that it matches all URLs (in this case the condition specified above it is what matters)
- means that there is no destination URL
F says that it should return the "Forbidden" status as opposed to redirecting or internally changing the URL.
The problem with this approach is that it will forbid some requests that actually are referred from GitLab: not all browsers send a Referer header in all circumstances.
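If you want to be lenient towards those browsers, one variation (shown here only as a sketch) is to also allow requests that send no Referer at all, at the cost of weaker protection:
RewriteEngine on
# Permit requests with an empty Referer (browsers or privacy tools that strip it)
RewriteCond %{HTTP_REFERER} !^$
# Otherwise require the Referer to be the GitLab site or this site itself
RewriteCond %{HTTP_REFERER} !^http(s)?://((myname\.gitlab\.io)|((www\.)?mysecondwebsite\.com))/ [NC]
RewriteRule .* - [F]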
Could you please share the exception rule you have in mind?
This is just an alternative to @StephenOstermiller's excellent answer...
You could instead keep your existing "hotlink protection" script unaltered, as generated by your control panel GUI (and make any changes through the GUI as required). But include an additional rule before your hotlink protection to make an exception for any domains you need to give access to.
# Abort early if request is coming from an "allowed" domain
RewriteCond %{HTTP_REFERER} ^https://myname\.gitlab\.io($|/)
RewriteRule ^ - [L]
# Normal hotlink-protection follows...
This prevents the hotlink protection from being processed when the request is coming from the allowed domain, so access is permitted.
This does assume you have no other directives that should be processed after this rule.
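Putting it together, the top of the .htaccess would look something like this (a sketch; the hotlink-protection block is simply the control-panel-generated one from the question, left unchanged):
# Abort early if request is coming from an "allowed" domain
RewriteCond %{HTTP_REFERER} ^https://myname\.gitlab\.io($|/)
RewriteRule ^ - [L]
# DO NOT REMOVE THIS LINE AND THE LINES BELOW HOTLINKID:r2xGl7fjrh
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mysecondwebsite.com/.*$ [NC]
RewriteRule .*\.(.*|jpg|jpeg|gif|png|bmp|tiff|avi|mpeg|mpg|wma|mov|zip|rar|exe|mp3|pdf|swf|psd|txt|html|htm|php)$ https://mysecondwebsite.com [R,NC]
# DO NOT REMOVE THIS LINE AND THE LINES ABOVE r2xGl7fjrh:HOTLINKID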
Can someone please help me with setting rules such that I get only the data which is being posted using POST? I have a form where I am submitting a name and email ID. I want just that part to be saved in the log file. In my scenario I just want the below data in my log file:
--29000000-C--
name1=ssn&email1=ssn%40gmail.com
--29000000-F--
HTTP/1.1 200 OK
X-Powered-By: PHP/7.2.4
Content-Length: 16
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=UTF-8
My present mod_security configuration looks like:
<IfModule security2_module>
#Enable the module.
SecRuleEngine On
SecAuditEngine on
#Setup logging in a dedicated file.
SecAuditLog C:/wamp64/bin/apache/apache2.4.33/logs/website-audit.log
#Allow it to access requests body.
SecRequestBodyAccess on
SecAuditLogParts ABIFHZ
#Setup default action.
SecDefaultAction "nolog,noauditlog,allow,phase:2"
#Define the rule that will log the content of POST requests.
SecRule REQUEST_METHOD "^POST$" "chain,allow,phase:2,id:123"
SecRule REQUEST_URI ".*" "auditlog"
</IfModule>
I found a solution to my question. We can set the below field as per our requirement:
SecAuditLogParts ABIFHZ
In my case I set the field as:
SecAuditLogParts C
However, it will display as:
--84670000-A--
[29/Aug/2018:14:49:58 +0200] W4aWdqHJuCcOQzTIgCiEqAAAAD8 127.0.0.1 60735 127.0.0.1 80
--84670000-C--
name1=red&email1=red%40yahoo.com
--84670000-Z--
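For reference, the only line I changed from my original configuration is SecAuditLogParts (everything else stays as posted above; this is just a sketch, and the A and Z boundary parts are mandatory, which is why they still show up in the output):
<IfModule security2_module>
    SecRuleEngine On
    SecAuditEngine On
    SecAuditLog C:/wamp64/bin/apache/apache2.4.33/logs/website-audit.log
    SecRequestBodyAccess On
    # Request only part C (the request body); the A and Z boundaries are always written
    SecAuditLogParts C
    SecDefaultAction "nolog,noauditlog,allow,phase:2"
    SecRule REQUEST_METHOD "^POST$" "chain,allow,phase:2,id:123"
    SecRule REQUEST_URI ".*" "auditlog"
</IfModule>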
I used .htaccess to limit which IPs can connect to the admin and user login pages.
RewriteCond %{REMOTE_ADDR} !=127.0.0.1
RewriteRule (admin|user)$ http://redirect_example.com [R=301,L]
But the problem here is that I used a 301 redirect, which means I can't change the URL http://redirect_example.com to another URL; it has already been cached. My curl -I http://example.com/user result:
HTTP/1.1 301 Moved Permanently
Date: Fri, 17 Feb 2017 03:46:19 GMT
X-Content-Type-Options: nosniff
Cache-Control: max-age=1209600
Expires: Fri, 03 Mar 2017 03:46:19 GMT
Content-Length: 313
Content-Type: text/html; charset=iso-8859-1
Location: http://redirect_example.com
Age: 251965
X-Cache: HIT
X-Cache-Hits: 56
Connection: keep-alive
How do I change http://redirect_example.com to another URL?
This is caused by Varnish. You should ban the cached content by running the command:
varnishadm
Then ban the cache by domain:
ban req.http.host ~ "redirect_example.com"
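Depending on your Varnish version, you can also issue the ban non-interactively in one go, for example:
varnishadm "ban req.http.host ~ redirect_example.com"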
It's just cached in your browser. Clear your browser cache. You can use 302 redirects while testing to help with this.
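For example, while testing, the rule from the question could use a temporary redirect instead (a sketch; switch back to 301 once the destination URL is final):
RewriteCond %{REMOTE_ADDR} !=127.0.0.1
# A 302 (temporary) redirect is not cached as aggressively, so the target can still be changed later
RewriteRule (admin|user)$ http://redirect_example.com [R=302,L]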
I am using Apache as the backend server and nginx as the frontend server. I need to make PDF files downloadable (at the moment they open in a browser window).
Here's a link:
link
Here's what I have tried so far in my .htaccess file:
<FilesMatch "\.(pdf)$">
ForceType application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
Didn't work; it just opens the file in a browser.
AddType application/force-download pdf
Didn't work.
AddType application/octet-stream .pdf
Didn't work.
UPDATE
Tried: wget --server-response -O /dev/null http://domain.com/files/teltomat.pdf
And got response:
HTTP request sent, awaiting response...
HTTP/1.1 200 OK
Server: nginx
Date: Wed, 24 Sep 2014 17:40:54 GMT
Content-Type: application/pdf
Content-Length: 3116445
Last-Modified: Wed, 24 Sep 2014 13:28:07 GMT
Connection: keep-alive
Keep-Alive: timeout=60
ETag: "5422c6e7-2f8d9d"
Expires: Thu, 31 Dec 2037 23:55:55 GMT
Cache-Control: max-age=315360000
Accept-Ranges: bytes
Length: 3116445 (3,0M) [application/pdf]
Saving to: ‘/dev/null’
You could try the HTML5 solution of adding a "download" attribute instead of "target":
link
It looks like the server's end is doing the right thing (by making the disposition "attachment"), but maybe the browser is deciding on its own that it can handle PDFs inline and opens a new window instead.
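A minimal example of that markup, using the file path from your question (link text is just a placeholder):
<!-- the download attribute asks the browser to save the file instead of rendering it -->
<a href="/files/teltomat.pdf" download>Download the PDF</a>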
I have set up Apache2 with Django and mod_wsgi on Debian Wheezy. I enabled mod_mem_cache with this configuration:
<IfModule mod_mem_cache.c>
CacheEnable mem /
MCacheSize 400000
MCacheMaxObjectCount 100
MCacheMinObjectSize 1
MCacheMaxObjectSize 500000
CacheIgnoreNoLastMod On
CacheIgnoreHeaders Set-Cookie
</IfModule>
based on the fact that MCacheMaxStreamingBuffer is the smaller of 100000 or MCacheMaxObjectSize as stated in the docs.
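For what it's worth, that buffer can also be set explicitly instead of relying on the default (I haven't tried this yet; the value below just mirrors MCacheMaxObjectSize from the snippet above):
<IfModule mod_mem_cache.c>
    # Streamed responses without a known length (e.g. chunked) are cached only if they fit in this buffer
    MCacheMaxStreamingBuffer 500000
</IfModule>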
When I try hitting a page with size 3.3 KB I get these response headers in Firebug:
Connection Keep-Alive
Content-Encoding gzip
Content-Type text/html; charset=utf-8
Date Wed, 27 Aug 2014 14:47:39 GMT
Keep-Alive timeout=5, max=100
Server Apache/2.2.22 (Debian)
Transfer-Encoding chunked
Vary Cookie,Accept-Encoding
and the page isn't served from cache. In the page source, however, there is the correct header 'Cache-Control: max-age=300,must-revalidate', but it doesn't show up in Firebug.
In the Apache log I see only the following, which looks correct:
[info] mem_cache: Cached url: https://83.212.**.**/?
With another test page that I created outside of Django, which doesn't have chunked encoding as a header, caching works fine. Why is the page not served from cache? Has anyone seen something similar?