I have a number of subdomains which use a crossdomain.xml file, and I'm looking for a simple way of managing them all, since the file gets semi-regularly updated. One way I've thought of is a PHP script which pushes and overwrites the xml file on each host. The other, which I much prefer, is an Apache redirect to a single file.
So, the question is: how would I, across multiple domains, redirect an xml request on dom1.domain.com and dom2.wirewax.com to the same crossdomain.xml file without Flash getting upset about it? i.e. not a 302 HTTP redirect, but internal file fetching.
You can write a PHP script that fetches the content from a single location (database or text file) and sends it as-is to Flash. Yes, the script itself needs to be copied to all hosts.
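A rough sketch of what such a script could look like; the shared URL, the fallback policy and the rewrite line in the comment are only assumptions:

<?php
// crossdomain.xml.php -- sketch only; the shared URL below is an assumption.
// Map /crossdomain.xml to this script on every host, e.g. via
// RewriteRule ^crossdomain\.xml$ /crossdomain.xml.php [L]
header('Content-Type: text/xml');

// Fetch the master policy from one central place (could also be a DB lookup).
$policy = file_get_contents('https://static.example.com/shared/crossdomain.xml');

if ($policy === false) {
    // Fall back to an empty (restrictive) policy if the central copy is unreachable.
    $policy = '<?xml version="1.0"?><cross-domain-policy></cross-domain-policy>';
}
echo $policy;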
If you have all websites hosted on the same webserver, perhaps mod_alias could help:
Alias /crossdomain.xml /path/to/shared/crossdomain.xml
I have not personally tested this. The reference page includes instructions to set up the shared directory so that it can be read by multiple hosts.
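Untested as well, but the full picture in the server configuration might look roughly like this (the shared path is a placeholder):

# Sketch only -- the shared path is a placeholder. In each <VirtualHost>
# (or once globally), map /crossdomain.xml to the shared copy and make sure
# Apache may read that directory:
Alias /crossdomain.xml /path/to/shared/crossdomain.xml

<Directory /path/to/shared>
    # Apache 2.4 syntax; on 2.2 use "Order allow,deny" plus "Allow from all".
    Require all granted
</Directory>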
I am building my whole application with .html files; to talk to the database I am using jQuery Ajax. I have structured the project like WordPress: for each page I have a folder, and inside that folder an index.html file.
In this structure I have created user/equipment/index.html, where all the equipment is listed. Now I want that, when the user clicks on an equipment item, the URL looks like 'domain.com/user/equipment/equipment-title' and the file user/equipment/details/index.html is served.
I believe that this can be done with a .htaccess file.
Any solution to the problem would be much appreciated.
Well, you need to store the references in that index file the way you want them to be; request rewriting (which you refer to as ".htaccess") cannot do that for you. What you can do with request rewriting, that is inside a distributed configuration file (".htaccess"), is internally rewrite the incoming requests. For that you need a mapping from request URLs to your detail pages. If the mapping is simply the name as found in the "equipment" folder (this is unclear from your question), then you can indeed implement a simple rewriting rule.
This would be such an example:
RewriteEngine on
RewriteRule ^/?user/equipment/(.*)/?$ /equipment/$1 [END]
This will deliver the content of the file /equipment/equipment-title when the URL https://example.com/user/equipment/equipment-title gets requested and that file exists.
For this to work, the rewriting module has to be enabled in your http server and, if you want to use a distributed configuration file for this, the interpretation of such files also needs to be enabled for that location in your http server. Usually the better alternative is to place such rules directly in the http server's host configuration, though.
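A rough sketch of those prerequisites, with example module and path names:

# Rough sketch -- the docroot path below is just an example.
# 1. Load mod_rewrite, e.g. on Debian/Ubuntu: a2enmod rewrite
# 2. If the rule lives in a .htaccess file, allow it for that directory:
<Directory /var/www/example.com>
    AllowOverride FileInfo
</Directory>
# Otherwise put the RewriteEngine/RewriteRule lines straight into the
# <VirtualHost> block and skip the .htaccess file entirely.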
I am working with a custom website built in PHP, running on an Apache server. The client wants to move it to a new server. I moved everything, including the .htaccess file; the homepage loads fine, but all the other URLs like site.com/register aren't working. I'm sure this is not handled by code on the old server, because I renamed everything (including .htaccess) and it still works. If I create a file like test.php on the old server, I can access it as site.com/test. It doesn't even hit the index.php file. Also, not all the URLs work like this; some are loaded through files in other folders.
So my question is: what are the possible ways Apache can let a user access site.com/test without the .php extension? It must not be using .htaccess. Also, we should be able to add exceptions so that some URLs can be loaded differently.
You can achieve the same thing in the host's configuration file if you are using a Linux (LAMP) server; you need to define the same rules in that configuration file.
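For illustration only, such a rule inside the <VirtualHost> block could look roughly like this; this is an assumption, since the old server might just as well rely on mod_negotiation's MultiViews option instead:

RewriteEngine on
# Serve /register from register.php when that file exists; anything with a dot
# or a deeper path is left alone, which is where exceptions can be added.
RewriteCond %{DOCUMENT_ROOT}/$1.php -f
RewriteRule ^/?([^/.]+)/?$ /$1.php [L]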
As a workaround for another problem, I need PHP to be able to access local files, but prevent these files from being served over HTTP by Apache.
Normally, I would just use .htaccess to accomplish this; however, due to institutional restrictions, I cannot. I also can't touch php.ini, although I can use ini_set() within PHP.
As a creative solution, I thought that if PHP executes as its own Linux user (not as apache) I could use normal chowns and chmods to accomplish this.
Again, the goal is simply to have a directory of files that Apache will not serve, but PHP can access.
I'm open to any suggestions.
Put the files outside of your web accessible root (DocumentRoot), but keep them accessible via PHP.
Suggestion:
/sites
/sites/my.site.com
/sites/my.site.com/data // <-- data goes here
/sites/my.site.com/web // <-- web root is here
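With a layout like that, a PHP script under the web root can still read the data directory through the filesystem. A small sketch, where the file name report.csv is just an example:

<?php
// Example only -- the file name report.csv and the layout above are assumptions.
// From a script inside /sites/my.site.com/web/, reach the sibling data directory:
$dataDir  = dirname(__DIR__) . '/data';            // -> /sites/my.site.com/data
$contents = file_get_contents($dataDir . '/report.csv');

if ($contents === false) {
    die('Could not read the data file');
}
// The browser can never request the file directly, because only
// /sites/my.site.com/web is inside the DocumentRoot.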
Here's a thought. Set the permissions on the files to be inaccessible even to the owner, then when PHP needs them, chmod() them, read them, then chmod() them back to inaccessible.
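A rough sketch of that idea, assuming PHP runs as the file's owner (the path is made up); note that two concurrent requests could race between the two chmod() calls:

<?php
// Rough sketch of the chmod-read-chmod idea; the path is an example.
// PHP must own the file for chmod() to succeed.
$file = '/sites/my.site.com/data/secret.txt';

chmod($file, 0600);                    // readable by the owning user only
$contents = file_get_contents($file);
chmod($file, 0000);                    // back to inaccessible for everyone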
I have a directory on my website specifically for JavaScript files. I want these JavaScript files to be hidden, so that if I type the URL to one of them it says Forbidden or otherwise disallows access, but my front-end website files can still access and execute them when needed. Is there a way to do this through an FTP client?
Cheers,
Dan
You can't do this through an FTP client. It is the task of your webserver to forbid access to certain files.
If you change the permissions, the webserver won't have access to them anymore, so this is not the way to go.
You must configure your webserver to restrict the access. If you're using Apache, you can use an .htaccess file. There are different ways of doing this; many depend on the way the webserver is configured.
The easiest is to put an .htaccess file in your scripts folder which contains only this one line:
deny from all
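If the server runs Apache 2.4, the native equivalent of that single line is:
Require all denied
(the old allow/deny syntax still works there via mod_access_compat).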
However, like peeter said, there's a good chance this will break your site, since the browser must access these files, so you can't restrict access.
Put an .htaccess file in your scripts folder containing deny from all, but this will stop your pages from accessing the scripts also (though not if you pass them through the PHP engine first).
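To illustrate that last parenthesis: a hypothetical pass-through script could look roughly like this (the name js.php, the f parameter and the scripts folder are made up). Keep in mind it only controls who may request the file; whoever is allowed to load the page still receives the JavaScript source.

<?php
// js.php -- hypothetical pass-through; pages would use
// <script src="/js.php?f=ui_control.js"></script>
$name = basename($_GET['f'] ?? '');        // strip any path components
$path = __DIR__ . '/scripts/' . $name;     // the folder protected by "deny from all"

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: application/javascript');
readfile($path);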
You're trying to hide JavaScript files that are executed on the client side. If a client (browser) cannot access the files, none of your JavaScript code will be executed.
If I understood your question correctly then you cannot achieve what you're trying to achieve.
I currently have css and javascript file calls (amongst other things) like the following:
href="/css/default.css"
src="/js/ui_control.js"
The leading / is there to make the paths relative to the root.
This works great when my page is in the root of the domain.
However, I'm currently in the middle of transferring my site to a new hosting provider and as such have a temporary URL which is: HOST-IP/~username
As such, all files are being requested from HOST-IP/css/default.css etc. instead of from within the ~username sub-folder.
Of course I can wait until the domain name servers propagate but that's beside the point.
How would I go about writing a rule in the .htaccess file that would redirect all file calls that start with a /, so that instead of going to HOST-IP/FILE-CALL they go to HOST-IP/~USERNAME/FILE-CALL?
Any ideas?
I'd suggest changing the file references in your HTML to be relative, as this will work both in a sub-folder and at the root of the domain.
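For example, with relative paths the calls from above would become:
href="css/default.css"
src="js/ui_control.js"
(pages that live in sub-folders would need the appropriate ../ prefixes instead).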
This works great when my page is in the root of the domain. However, I'm currently in the middle of transferring my site to a new hosting provider and as such have a temporary URL which is: HOST-IP/~username
How would I go about writing a rule in the .htaccess file that would redirect all file calls that start with a /, so that instead of going to HOST-IP/FILE-CALL they go to HOST-IP/~USERNAME/FILE-CALL?
Unless you can put a .htaccess at HOST-IP/.htaccess on the new server, you can't do this with .htaccess. It sounds like you're on a shared host, so any approach that'd let you do this with .htaccess would allow you to hijack everyone else's site on the server.
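Just to illustrate what it would take: a hypothetical .htaccess in the server's document root (which, as said, you normally cannot touch on shared hosting) could redirect the root-relative calls like this:

# Hypothetical rule for the server's top-level document root only:
RewriteEngine on
RewriteRule ^(css|js)/(.*)$ /~username/$1/$2 [R=302,L]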