How do I prevent my page, say 'index.php', and all other web pages in different folders on the server from being accessed by anyone who types the path into the browser's address bar, like www.kkweb.com/web/index.php? Kindly help.
If you have PHP correctly installed and running, visitors get the parsed site. That is, they can open the website, but they cannot read your source code. If you want to prevent even that, you can implement access protection. Google .htpasswd and .htaccess for that.
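For illustration, here is a minimal sketch of that kind of protection, assuming your host allows .htaccess overrides; the file path and the realm name are made up:

    # .htaccess in the directory you want to protect
    AuthType Basic
    AuthName "Restricted area"
    # Hypothetical path; keep the password file outside the web root
    AuthUserFile /home/youruser/.htpasswd
    Require valid-user

You would create the password file once with the htpasswd tool, e.g. htpasswd -c /home/youruser/.htpasswd someuser, and the browser will then prompt for those credentials before serving anything from that directory.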
Following a migration, I am trying to reconfigure TYPO3.
The backend is accessible by its URL and I manage to log in. But the home page, for example, gives me the following error: The page did not exist or was inaccessible. Reason: No site configuration found.
None of the pages are accessible via their normal URLs, but they are reachable through query parameters, like /index.php?id=2&L=0.
I have already tried replacing the .htaccess with the basic one provided by TYPO3, and I also checked the Apache configuration; everything seems to be OK. I am not familiar with the TYPO3 CMS, so I don't know whether some of this configuration is done directly in its files. To me it looks like a mod_rewrite problem, but I can't find it.
Does anyone have a solution, or at least a lead for tracking down this problem?
As stated in the error message you've posted, the site configuration is missing. In the TYPO3 backend you can create a new site configuration via Site Management => Sites.
Please check the site handling documentation for details.
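For reference, creating a site there writes a config.yaml under config/sites/<identifier>/ in your project. A minimal sketch of its contents, with a made-up root page ID and domain:

    rootPageId: 1
    base: 'https://www.example.com/'
    languages:
      -
        title: English
        enabled: true
        languageId: 0
        base: /
        locale: en_US.UTF-8

Once a site configuration exists, the human-readable URLs should resolve again instead of only the /index.php?id=2&L=0 form, since the site configuration is what drives URL routing in TYPO3 v9 and later.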
We have a web site in IIS 7 that has a default document (index.asp) set. The entire site uses basic authentication except the index.asp page. This works correctly if I enter the domain with the actual page name (www.mysite.com/index.asp). However, without the page name (www.mysite.com), I am prompted for credentials, even though it is the same page.
I've tried various web.config changes, and I've tried it with the anonymous user set to both the app pool identity and the IUSR account; it doesn't work either way. I'm thinking maybe I need to enable and then configure URL rewriting for this site, but I'm not positive.
If you need any other info let me know.
Thanks!
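For context, this is the kind of web.config change I have been experimenting with; a sketch that tries to exempt just the default document from basic authentication (as far as I understand, the authentication sections are locked by default, so this only applies if they are unlocked in applicationHost.config):

    <location path="index.asp">
      <system.webServer>
        <security>
          <authentication>
            <!-- Allow anonymous access to index.asp only -->
            <anonymousAuthentication enabled="true" />
            <basicAuthentication enabled="false" />
          </authentication>
        </security>
      </system.webServer>
    </location>

My guess is that the request for the bare domain is authorized against the path "/" before the default document mapping to index.asp is applied, which would explain the prompt.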
Good morning everyone,
I am developing an app for QNAP which also has a web interface. In my qpkg.conf I set QPKG_WEBUI and QPKG_USE_PROXY, and I can see the web interface correctly inside the QNAP interface once I am logged in. It seems perfectly integrated with the QNAP interface, BUT I can also reach it by typing the right URL in my web browser, even when I am not logged in to the QNAP and have cleared all possible caches/cookies.
I want to give access to my web interface only to valid users. Unfortunately, I do not know how to do it. I tried to write a .htaccess to deploy with my application, but without any success (obviously I cannot modify the standard Apache configuration, and with the standard configuration I was not able to do it).
The only thing I found inside the Apache folder is a pwauth executable that lets me ask for a username/password (even though I do not want to ask; I only want to check whether the user is ALREADY logged in). Anyway, with the standard Apache configuration the external module is not loaded, so I cannot use pwauth from the .htaccess. Maybe I could create some custom CGI program that calls it, but I would prefer to avoid a custom solution; I would really like to follow a "standard" way to do it, there should be one...
I would like to know if there is some QNAP variable to set in the qpkg.conf file, or some configuration to put in a .htaccess, that does what I want: grant access only if the user is ALREADY logged in.
Thanks very much, everyone; I could not find anything on Google or in the official documentation.
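For reference, this is the direction I was experimenting with: a sketch of how pwauth is normally wired up through mod_authnz_external, which does not work here precisely because that module is not loaded in the stock QNAP Apache (and the DefineExternalAuth line would have to live in the server configuration, not in a .htaccess):

    # Server configuration (not .htaccess); hypothetical pwauth path
    DefineExternalAuth pwauth pipe /usr/local/apache/bin/pwauth

    # .htaccess in the web interface directory
    AuthType Basic
    AuthName "QNAP users"
    AuthBasicProvider external
    AuthExternal pwauth
    Require valid-user

And even if it worked, it would prompt for credentials rather than detect an existing QNAP session, which is not what I want.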
Question:
How can I include both https: and http: results from a single domain in a Google custom search engine but display any such result in an iframe with a secure parent window?
How It's Structured:
My Google custom search engine currently searches "mydomainname.com/directory/" with the option to "Include all pages whose address contains this URL". It operates on a specific page of the website to search pages within the specified directory. The Link Target set in Websearch Settings is an iframe on the same page as the search bar.
The browser window and the iframe src are both on the same secure domain. And since the search results all come from a directory within the site structure, they are on this same domain as well.
Currently, some results appear as "https://..." and some appear as "www...". Obviously, this creates a mixed-content error when the browser window is https:// and an attempt is made to display an http:// search result in the iframe.
The results that are http:// will, of course, also work as https:// URLs. I do not know what makes a page or file appear in the search results as "www." versus "https://" when they all originate from a single secure domain.
The "http://" results appear even if I specify the site to be searched as https://www.mydomainname.com/directory/. I don't want to exclude these results, but I want them to be displayable when browsing the site securely.
The Objective:
So the bottom-line rule I need to work around is that insecure pages or files cannot be loaded into an iframe on a secure web page. I obviously want users to be able to use the https:// site, but then I need the search to function in a way that allows all possible search results for those users.
The reason the results' target needs to be this iframe is that this frame displays all the content of the web page. The search results work in harmony with the organization of other information, such that choosing a link from a category in the page's navigation and choosing a search result from the custom search both display the chosen content in the same location, the iframe.
What I've Tried:
I've tried designating https:// specifically in the Google custom search engine (CSE) settings, and removing : 'http' from the script line gcse.src = (document.location.protocol == 'https:' ? 'https:' : 'http:') + '//cse.google.com/cse.js?cx=' + cx;.
I looked in the script file that it links to, http://cse.google.com/cse.js?cx=012685392925564329750:ghl2znnfada, but I can't decipher what might need to be changed in it.
In the console's error log I don't see much that seems relevant, except for the expected inability to load insecure pages while browsing securely. But there is this, which looks like it might be relevant, though I could be completely wrong because I can't really decipher it either:
    Mixed Content: The page at 'https://mydomainname.com/directory/index.php'
    was loaded over HTTPS, but requested an insecure script
    'http://www.google.com/jsapi?key=ABQIAAAAdCtw6Xq1Q31YAr7VSQOSvxS5g7WKqCWUBuUdhz3-rUOumR2saRSPGvey2WjYALW7f5_JzakSL3lAEg'.
    This request has been blocked; the content must be served over HTTPS.
Insecure Script from Error Message:
http://www.google.com/jsapi?key=ABQIAAAAdCtw6Xq1Q31YAr7VSQOSvxS5g7WKqCWUBuUdhz3-rUOumR2saRSPGvey2WjYALW7f5_JzakSL3lAEg
Proposed Paths to a Solution:
I am open to any solution methods that may be possible. I have considered several routes but am not sure how to properly execute them or have failed in my attempts to execute them.
Some solutions I thought may work are:
Show all results as https:// links (without excluding any) so that they can be accessed whether on a secure connection to the site or not.
Redirect any links clicked without https:// to be loaded into the iframe as https://
Change something about the pages and files on the server so that they only appear in the search results as https://
Change something about Google's search engine script so it parses all found results as https://
Somehow show links as http:// if browsing non-secure, and https:// if browsing secure *
*I don't know how viable or efficient this would be
The most robust solution is to migrate your entire website to https:
use a 301 (permanent) redirect from http to https
and activate HSTS (if possible with includeSubDomains)
Google will take a little time to update its index, but HSTS will automatically replace http with https, so you should avoid any mixed-content issues.
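A sketch of both steps in a .htaccess, assuming mod_rewrite and mod_headers are available on your server:

    # 301-redirect every http request to its https equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} !=on
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    # HSTS: tell browsers to use https for the next year
    # (browsers only honor this header on an https response)
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"

Once a browser has seen the HSTS header, it rewrites http:// links on your domain to https:// before the request is even sent, which is what makes the mixed-content errors for the search results go away.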
I need to protect a site that has a ton of static .html files. The standard .htaccess scheme doesn't meet the requirements.
Is there a way to specify an .htaccess style of password protection with a custom handler? That is, I need to write the code that determines whether the user is allowed or not, but I don't want to modify a million .html files all over the place.
Thanks!
Maybe. It depends on what modules are loaded on your web server. Your options range from keeping a simple list of users in a flat file to keeping them in a database and customizing the queries.
http://httpd.apache.org/docs/2.2/howto/auth.html
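For instance, the database-backed end of that range could look like this, assuming mod_dbd and mod_authn_dbd are loaded; the connection parameters and table layout are made up:

    # Server configuration: the database connection
    DBDriver mysql
    DBDParams "host=localhost dbname=site user=webauth pass=secret"

    # In the protected directory or vhost: a custom lookup query
    AuthType Basic
    AuthName "Members only"
    AuthBasicProvider dbd
    # The password column holds htpasswd-compatible hashes
    AuthDBDUserPWQuery "SELECT password FROM users WHERE username = %s"
    Require valid-user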
Another option - just brainstorming here - is to use something like mod_rewrite to redirect calls for the physical file to something like a PHP script that can handle the user/password authentication for you and, if the user is authenticated, go out and load the HTML file that was requested. So calls to www.some.com/10203.html actually get directed to www.some.com/auth.php?10203.html, which would control access to that underlying HTML file. That would of course require mod_rewrite to be installed, which is pretty common even in shared hosting environments.
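A sketch of that idea; the rewrite rule, the auth.php name, and the session check are all made up for illustration. The .htaccess part:

    RewriteEngine On
    # Route every .html request through the gatekeeper script
    RewriteCond %{REQUEST_URI} !^/auth\.php
    RewriteRule ^(.+\.html)$ /auth.php?file=$1 [L,QSA]

And auth.php itself:

    <?php
    // Hypothetical gatekeeper: decide access, then stream the file.
    session_start();
    if (empty($_SESSION['authenticated'])) {
        // Replace with your own user/password logic
        http_response_code(403);
        exit('Access denied');
    }

    // Resolve the requested file and refuse anything outside the docroot
    $file = realpath(__DIR__ . '/' . ($_GET['file'] ?? ''));
    if ($file === false
            || strncmp($file, __DIR__, strlen(__DIR__)) !== 0
            || substr($file, -5) !== '.html') {
        http_response_code(404);
        exit;
    }

    header('Content-Type: text/html; charset=utf-8');
    readfile($file);

The nice part of this approach is that the million .html files stay exactly where they are; only the rewrite rule and the one script are added.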