Apache redirect to HTTPS and modify URL structure

Goal:
Redirect from http://example.com/library/page123.htm
to https://example.com/library/folder/page123.htm
I have found plenty of examples of how to perform redirects to SSL and how to add folders, but I am struggling to combine the two. Additionally, there are multiple pages under the "library" folder of the old URL structure.
Appreciate any help!
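A minimal .htaccess sketch, assuming every page sits directly under /library/ and all of them move into the same new subfolder (the folder/ segment here is a placeholder):
RewriteEngine On
# Old library pages: force HTTPS and insert the new folder segment in one hop
RewriteRule ^library/([^/]+\.htm)$ https://example.com/library/folder/$1 [R=301,L,NC]
# Everything else: force HTTPS without changing the path
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
Combining both changes into a single 301 avoids chaining two redirects for each old URL.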

Related

Resolve URL domain with external subdomain contents

I've purchased a new domain, let's call it example.ca.
I need the above URL to resolve in the browser with the contents of another site, in this case www.test.ca/othersite.
The expected behaviour is that when a user types example.ca, the site will load the contents of www.test.ca/othersite, while the URL shown in the browser remains example.ca.
I am not looking for a redirect, so I'm wondering how this can be done using DNS and Apache rules. Any help is appreciated!
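What's being described is a reverse proxy rather than a redirect: point example.ca's DNS (an A or CNAME record) at your own server, then have Apache fetch the content from the other site on each request. A minimal sketch, assuming mod_proxy and mod_proxy_http are enabled:
<VirtualHost *:80>
    ServerName example.ca
    # Fetch pages from the other site; the browser keeps showing example.ca
    ProxyPass        / http://www.test.ca/othersite/
    ProxyPassReverse / http://www.test.ca/othersite/
</VirtualHost>
Note that ProxyPassReverse only rewrites redirect headers; absolute links inside the proxied HTML will still expose www.test.ca unless the pages use relative URLs (or you add something like mod_proxy_html).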

Redirect URL for REST service

I can reach my website at a certain IP address, and I am going to implement a REST service. I have some PHP files that perform actions on a database, and I call them from the client. I am using Ubuntu Linux as the server, and so far I can do this:
http://xxx.xxx.xxx.xxx/api/create/?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete/?id=someId
I can do the above because inside /var/www/html I have a folder called api that contains another folder called create. The create folder contains the file index.php, so I can omit it and use the URLs you see above.
This works fine, but I don't think it is the proper way to do it. I am new to this, so I don't know what to do. After some research I found that my goal can probably be achieved using an .htaccess file and URL rewriting, but I am not sure.
How can I do this? Do I have to place all the PHP files in a single folder and then use an .htaccess file? (^)
(^) To be more precise: instead of having this
http://xxx.xxx.xxx.xxx/api/create/index.php?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete/index.php?id=someId
//and so on with other actions...
Do I have to create a folder like
http://xxx.xxx.xxx.xxx/files/
containing all my PHP files (create.php, delete.php, view.php...) and then use an .htaccess file to redirect?
I see that websites offer their APIs using www.domain.com/api/something/?data=Value or www.domain.com/api/something/dataAbout/. Are they doing what I described with the .htaccess file? I hope I have explained my problem well.
.htaccess:
RewriteEngine On
RewriteRule ^api/([\w-]+)/?$ files/$1.php [L,NC]
This is inside /var/www/html, and I have api inside /home/username/api.
Thanks Emma
Do it like this:
Create the PHP files in a files/ subdirectory as create.php, delete.php, view.php, etc. (by renaming each individual index.php file you mentioned).
Move the api directory somewhere outside the site root.
Once that is done, use the following .htaccess file in /var/www/html/:
RewriteEngine On
# /api/create -> files/create.php (the original query string is passed through)
RewriteRule ^api/([\w-]+)/?$ files/$1.php [L,NC]
Then use the new URLs:
http://xxx.xxx.xxx.xxx/api/create?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete?id=someId
First of all, this is not the right way to create a RESTful API. My suggestion is that you read a best-practices article.
You shouldn't create create and delete folders. You should use HTTP methods instead.
To create a new record you should use POST. For example, POST /user, passing the user's information in the request body.
As another example, you could use the same route with different HTTP methods: DELETE /user/1 to delete a user, and PATCH /user/1 to edit an existing user's information.
Hope this helps you.
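If you stay on Apache with plain PHP scripts, method-based routing can be sketched in .htaccess as well. A rough illustration, where the files/*.php script names are hypothetical:
RewriteEngine On
# POST /user -> create a user (script names below are placeholders)
RewriteCond %{REQUEST_METHOD} =POST
RewriteRule ^user/?$ files/create_user.php [L]
# DELETE /user/1 -> delete user 1
RewriteCond %{REQUEST_METHOD} =DELETE
RewriteRule ^user/(\d+)/?$ files/delete_user.php?id=$1 [L]
# PATCH /user/1 -> update user 1
RewriteCond %{REQUEST_METHOD} =PATCH
RewriteRule ^user/(\d+)/?$ files/update_user.php?id=$1 [L]
Each script can then read the id parameter from the query string as usual.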

Redirecting from example.com to www.example.com

My site uses AJAX, and it seems I have to include the full path when I use it to access a function. This is fine, as I can code it in. The problem is that I have hardcoded http://www.example.com..., but if a user has gone to http://example.com (without the www), this causes an access problem and the AJAX won't execute.
I think the easiest way to resolve this is to make sure that if a user goes to example.com, they are redirected to www.example.com.
I can find solutions online, but they all involve the .htaccess file, which my server doesn't support; I have to use rewrite.script instead.
How can I do this using rewrite.script, or is there an alternative way to approach this?
Thanks for the suggestions - I was directed to this: http://seo-website-designer.com/Zeus-Server-301-Redirect-Generator, which generated the rewrite.script file I needed.
In the .htaccess file in your document root (a plain Redirect directive can't match on the hostname, so mod_rewrite is needed):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
If there is no support for .htaccess on your server, you may have to include a meta refresh tag in the head of your page, or use JavaScript to check the URL and then redirect.

Redirecting Pages: Names to Standard Address

I have WordPress installed in the root of a website, and I recently enabled a custom permalink structure just for the sake of having good-looking page URLs (only pages are used on this website, no posts at all; it's not a blog). Unfortunately this is causing some problems with other parts of the website, outside WordPress.
So I'd like to go the manual way and redirect URLs like /my-page to /?page_id=32 for a selected set of pages. Is it possible to do that using the .htaccess file? What would the rules look like?
If you're redirecting pages from WordPress to other URLs, you can use .htaccess, but it's probably easier to use a plugin than to edit .htaccess by hand.
See the Redirection plugin in the WordPress plugin directory to easily set up redirects and log redirects, errors, and more.
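If you do want the manual .htaccess route, the rules are simple internal rewrites. A minimal sketch (the slugs and page IDs are placeholders), with the caveat that these lines must sit above the standard WordPress rewrite block so its catch-all doesn't run first:
RewriteEngine On
# Pretty page name -> WordPress page ID (placeholder values)
RewriteRule ^my-page/?$ /index.php?page_id=32 [L]
RewriteRule ^another-page/?$ /index.php?page_id=45 [L]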

Sitemap for multiple domains of the same site

Here is the situation: I have a website that can be accessed from multiple domains, let's say www.domain1.com, www.domain2.net, and www.domain3.com. The domains access the exact same code base, but depending on the domain, different CSS, graphics, etc. are loaded.
Everything works fine, but now my question is: how do I deal with the sitemap.xml?
I wrote the sitemap.xml for the default domain (www.domain1.com), but what about when the site is accessed from the other domains? The content of the sitemap.xml will contain the wrong domain.
I read that I can add multiple sitemap files to robots.txt, so does that mean that I can, for example, create sitemap-domain2.net.xml and sitemap-domain3.com.xml (containing the links with the matching domains) and simply add them to robots.txt?
Somehow I have doubts that this would work, so I turn to you experts to shed some light on the subject :)
Thanks!
You should use server-side code to send the correct sitemap based on the domain name for requests to /sitemap.xml
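That can be done without any application code. A sketch in the same spirit as the robots.txt rules below, assuming per-host files such as sitemaps/www.domain1.com.xml exist under the document root:
RewriteEngine On
# Serve the sitemap matching the requesting hostname, if such a file exists
RewriteCond %{DOCUMENT_ROOT}/sitemaps/%{HTTP_HOST}.xml -f
RewriteRule ^sitemap\.xml$ sitemaps/%{HTTP_HOST}.xml [L]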
Apache rewrite rules for /robots.txt requests
If you're using Apache as a webserver, you can create a directory called robots and put a robots.txt file in it for each website you run on that VHOST, using rewrite rules in your .htaccess file like this:
# URL rewrite solution for robots.txt for multiple domains on a single docroot
# not an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# not an existing file
RewriteCond %{REQUEST_FILENAME} !-f
# and the host-specific robots file exists
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
NginX mapping for /robots.txt requests
When using nginx as a webserver (taking yourdomain1.tld and yourdomain2.tld as example domains), you can achieve the same goal as the Apache solution above with the following map variable (place this outside your server directive):
map $host $robots_file {
    default          /robots/default.txt;
    yourdomain1.tld  /robots/yourdomain1.tld.txt;
    yourdomain2.tld  /robots/yourdomain2.tld.txt;
}
This way you can use this variable in a try_files statement inside your server directive:
location = /robots.txt {
    # $robots_file already contains the /robots/ prefix from the map
    try_files $robots_file =404;
}
Content of /robots/*.txt
After setting up the aliases to the domain-specific robots.txt files, add the sitemap to each of the robots files (e.g. /robots/yourdomain1.tld.txt) using this syntax at the bottom of the file:
# Sitemap for this specific domain
Sitemap: https://yourdomain1.tld/sitemaps/yourdomain1.tld.xml
Do this for all domains you have, and you'll be set!
You have to make sure the URLs in each XML sitemap stay within their own domain/subdomain. But if you really want, you can host all the sitemaps on one domain using "Sitemaps & Cross Submits".
I'm not an expert on this, but I had a similar situation: one domain with three subdomains, where each subdomain was served from a different directory and contained its own sitemap.xml. So I'm pretty sure a sitemap.xml can be specified for each domain individually.
The easiest method that I have found to achieve this is to use an XML sitemap generator to create a sitemap for each domain name.
Place each /sitemap.xml in the root directory of the corresponding domain or subdomain.
Go to Google Search Console and create separate properties for each domain name.
Submit the appropriate sitemap for each domain in Search Console. The submission will show as successful.
I'm facing a similar situation in a project I'm working on right now, and Google Search Central actually has the following answer:
If you have multiple websites, you can simplify the process of creating and submitting sitemaps by creating one or more sitemaps that include URLs for all your verified sites, and saving the sitemap(s) to a single location. All sites must be verified in Search Console.
So it seems that as long as you have added the different domains as properties in Google Search Console, at least Google will know how to deal with the rest, even if you upload the sitemaps for the other domains to only one of your properties.
For my use case, I then use server-side code to generate sitemaps in which the dynamic pages with English content get a location on my .io domain, and the pages with German content get a location on the .de domain:
<url>
  <loc>https://www.mydomain.io/page/some-english-content</loc>
  <changefreq>weekly</changefreq>
</url>
<url>
  <loc>https://www.mydomain.de/page/some-german-content</loc>
  <changefreq>weekly</changefreq>
</url>
And then Google handles the rest. See docs.