Sitefinity: Need to make a PDF available on a very specific URL but can't do it

I have a website on Sitefinity 4.4. I need to make a document available on a very specific URL, i.e.
http://www.example.com/reports/the-report.pdf
If I just create a directory in the root of the site, it does not work (503 error). When I try to use the 302Redirect.xml file to redirect the URL to the PDF, it does not work either (same error). The link has already been published and has to be exactly as specified. How do I solve this?
Any help would be greatly appreciated.

Sitefinity shouldn't block a folder. Adding a physical folder and dropping that report in the proper place should work, so the error probably means you'll have to check your server configuration.
Anyway, the fastest way outside Sitefinity would be to just create an IIS rewrite rule. Make http://www.example.com/reports/the-report.pdf the pattern and redirect requests to the URL of the document in the Sitefinity library (a sketch of such a rule is below).
When you upload a document to the library in Sitefinity it gives you a direct URL, something like /docs/defaultlibrary/document. You can verify the URL by going to Content >> Documents and files and choosing Embed link to this file. That gives you a pop-up with the URL.
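For reference, a minimal web.config sketch of such a rule; it needs the IIS URL Rewrite module, and the target path /docs/defaultlibrary/the-report is only a placeholder for the URL shown by Embed link to this file:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="ReportPdf" stopProcessing="true">
          <match url="^reports/the-report\.pdf$" />
          <!-- type="Rewrite" instead of "Redirect" would keep the published URL in the address bar -->
          <action type="Redirect" url="/docs/defaultlibrary/the-report" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>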

Related

How to disable the download of files in an Apache2 webserver?

I took over a website which I'm supposed to administer, and somebody brought to my attention that certain indexes and files are accessible which shouldn't be. I will be using dummy names.
It was possible to access example.com/intern before, but I changed a line in /etc/apache2/apache2.conf according to this answer: https://stackoverflow.com/a/31445273. This worked partly, as I now get a 403 Forbidden when I navigate to example.com/intern, and that's basically what I want.
However, the directory intern contains a file called file.php.bak as well as file.php. When I navigate to example.com/intern/file.php I get a blank page. I am not sure, however, whether file.php can be accessed some other way, because the page does load and I don't get a 403 like before. What is far worse, and the reason I am struggling with this: if I go to example.com/intern/file.php.bak, my browser (Firefox) offers to download file.php.bak, which I can then read in plain text. I want all files in intern to be inaccessible via the website, but I have no idea how to do this. Can anybody help?
Things I've tried:
1. Removing Indexes from the apache2.conf file as mentioned above. It only puts the 403 on the directory itself and not recursively on all the files in it.
2. Writing a .htaccess file as described here: https://fedingo.com/how-to-prevent-direct-file-download-in-apache-server/ and putting it in intern, with the same result as in 1).
3. Putting an empty index.html file in the intern directory. This leads to no more 403 on example.com/intern, but the download of example.com/intern/file.php.bak is still possible. I've also tried index.php with the same result.
File System:
The application runs from /var/www/application, which is also where the /var/www/application/index.php I want to use lives. The /var/www/application/intern directory is also there. While it isn't browsable anymore, the files in it are still accessible. /var/www/application/intern/file.php can be reached via example.com/intern/file.php, but it seems it can't be downloaded or read, as it results in a blank page. /var/www/application/intern/file.php.bak, however, can be downloaded via example.com/intern/file.php.bak.
Let's say Apache document root is set to DocumentRoot "/folder_one/folder_two"
Placing files in folder_one will prevent people from browsing your Apache server and requesting those files directly.
Place an index file in folder_two and include some code, such as PHP, to tell Apache to include whatever files you want from folder_one.
In this manner Apache will still be able to serve whatever files you want from folder_one, and people will not be able to request the files directly, as they are located in a directory above the Apache document root.
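A minimal PHP sketch of that idea, assuming index.php sits in folder_two and a hypothetical secret.txt sits in folder_one:

<?php
// /folder_one/folder_two/index.php, inside the document root, so it can be requested by URL.
// secret.txt lives in /folder_one, one level above the document root, so it can never be
// fetched over HTTP; only this script can read it from disk and decide what to output.
echo file_get_contents(__DIR__ . '/../secret.txt'); // hypothetical file name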

Block the user from viewing files in the directory via the url

I'm pretty new to developing in VB.NET. I have a web application and I want to block access to the files in a directory.
For example:
The user accesses the following page: application.com/example/page.aspx
If the user removes the page name and types "application.com/example/", he will see all the files inside the folder "example".
I want to block this possibility and, when a user tries to access the files, redirect to an error page.
I know how to solve this in PHP through .htaccess, but I have never done it in VB.NET. Any help?
If you are using IIS:
Go to Sites --> your desired site.
In Features View, select Directory Browsing and disable it.
A web.config (or .htaccess on Apache) setting will do the same.
Remember it is a security risk to have directories open for anyone to see.
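The equivalent web.config setting, which is what the IIS Directory Browsing feature toggles, looks roughly like this; place it in the site root, or in a single folder to affect just that folder:

<configuration>
  <system.webServer>
    <directoryBrowse enabled="false" />
  </system.webServer>
</configuration>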

Enable the use of SSI

Using HostGator, I can't seem to get SSI to work on my server. I'm using Dreamweaver to build the site and everything works just fine in the preview. But when I actually upload the pages to my server, any elements that come from include files don't appear. Does anyone know how I can enable SSI on my web server?
Your last comment gave me the information I needed. The issue is that footer.inc is not in the same directory as the page you're trying to include it in. Try this code:
<!--#include virtual="includes/footer.inc" -->
When using the file= parameter, the file you're including must be in the same directory. If the file you're including is not in the same directory, then you will have to use virtual. See this page for more information: SSI: The Include Command.
And here, from the source, is pretty much the rule of thumb: use file= when the included file is within the same directory as the page that wants it; use virtual= when it isn't.
EDIT: I think I've got it now. Copy and paste the above code and it should work for you. Make sure you follow these guidelines: after <!--, there is no space between the last - and #. Additionally, there is a space between the closing " and the first -. These rules must be adhered to. You can view more information here: Server Side Includes Not Working
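For contrast, a small sketch of both forms; the paths are placeholders:

<!-- footer.inc sits in the same directory as the page: -->
<!--#include file="footer.inc" -->
<!-- footer.inc sits in an includes/ folder under the document root: -->
<!--#include virtual="/includes/footer.inc" -->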

Make Indexed File Downloadable In Apache Solr

I am trying to index a PDF file into Solr, which I have done successfully using the command
curl "http://localhost:8983/solr/update/extract?literal.id=id&commit=true" -F "myfile=@filename.pdf"
I am able to see the file contents and search, but when I try to click on the file name it shows
HTTP ERROR 404
Problem accessing /solr/collection1/id. Reason:
not found
What I want is a link that allows downloading the file. I know Solr merely indexes the file and stores it. I was wondering if there is a way to add an attribute such as the location, like you have done, and proceed from there. Can you please share with me what you have done? If you want any more clarity regarding my problem, do ask.
We have the actual files hosted through a separate web application to be downloaded from, with auditing and additional security.
You can always host these files directly through an HTTP server.
If the file names carry the id, it is as easy as appending id.extension to the fixed hosted URL.
Otherwise, index the path of the file with an additional parameter, e.g. literal.url.
The url will then be a Solr field which is returned with the Solr response.
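For example, assuming the schema has a stored field named url and the files are hosted at a hypothetical http://files.example.com/, the path can be indexed alongside the content:

curl "http://localhost:8983/solr/update/extract?literal.id=report1&literal.url=http://files.example.com/filename.pdf&commit=true" -F "myfile=@filename.pdf"

The url field then comes back with each search result, and the front end can render it as the download link.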

How to use relative URL's in website with two base URL's

I have our basic corporate static HTML website installed in our web root directory and our billing software installed in /portal. I have integrated the sites to look like a single site by including the /menu.tpl Smarty template file in the /portal/header.tpl file. However, if I use relative URLs, the menu system doesn't work, because the base URL for the billing script is /portal. For example, if I create a link to faq.php in menu.tpl and load a page on the portal site, the link in the menu back to the FAQ page becomes /portal/faq.php, whereas if I load a page off the root site the link is just /faq.php, as it should be.
The obvious answer is to just use absolute URLs, but I need the site to be portable, as I have many developers who need to install and test it.
I can't find any way to resolve this. Any ideas?
I ran into the same problem as you a while ago, and after trying a lot of dead ends, I finally ended up with the following solution:
For any URL that needs to be a chameleon, i.e. change its path depending on the environment, insert a PHP function that writes out the correct URL.
If you include the PHP function from a single central file, then you can change all of the URLs in the entire site automatically, based on a setting or some pre-detected switch such as the current domain name.
Example:
<?php print_base_url_plus("/menu.php"); ?>
... where print_base_url_plus() is a function which prepends the base URL onto the given path and prints the result.
You may find that you have to change some of the pages to .php so they are preprocessed by the PHP engine, or you can alter the web server settings so that standard .htm files are piped through the PHP engine, just like .php files.
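A minimal sketch of what that central file might contain; the file name, the constant, and the per-environment value are assumptions for illustration, not the actual implementation:

<?php
// urls.php - included once by every page (hypothetical file name).
// One setting per environment: "" when the site runs in the web root,
// e.g. "/~alice/site" on a developer install.
define('BASE_URL', '');

// Prepend the environment's base URL to a site-absolute path and print it.
function print_base_url_plus($path) {
    echo BASE_URL . $path;
}

Templates then emit links as in the example above, and moving an installation to a different path only means changing BASE_URL in one place.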