Storing and retrieving files stored separately from the codebase (coldfusion, file-upload)

We currently have a site running ColdFusion 11. In an effort to improve some aspects of security, we would like to store all files uploaded by our users on a server separate from our codebase and DB servers.
I'm pretty much starting from scratch here, as I wasn't able to find much in my searches so far. What's the best practice for doing this, and which ColdFusion functions would work for storing and retrieving files from an external source?

I could use some more information to be more helpful, but let's say you have a separate server that stores all your user files on a Windows network. I would use cfcontent to serve those files, with each file being retrieved over a UNC path.
I'd recommend reading this blog entry of mine on Securely Serving Files via CFContent. Wil, also from CF Webtools, has another here: Serving File Downloads with ColdFusion
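As a minimal sketch, assuming a share like \\fileserver\uploads that the ColdFusion service account can read, and a hypothetical uploads table that stores each file's on-disk name and MIME type, a download handler might look like this:

<!--- Hypothetical handler: serve an uploaded file from a separate file server over UNC. --->
<cfparam name="url.fileID" type="integer">
<!--- Look the file up by ID so no part of the path ever comes from user input. --->
<cfquery name="qFile" datasource="myDSN">
    SELECT storedName, originalName, mimeType
    FROM uploads
    WHERE uploadID = <cfqueryparam value="#url.fileID#" cfsqltype="cf_sql_integer">
</cfquery>
<cfif NOT qFile.recordCount>
    <cfheader statuscode="404" statustext="Not Found">
    <cfabort>
</cfif>
<cfheader name="Content-Disposition" value="attachment; filename=""#qFile.originalName#""">
<cfcontent type="#qFile.mimeType#" file="\\fileserver\uploads\#qFile.storedName#" deletefile="no">

Because cfcontent reads the file server-side, the UNC share never needs to be reachable from the browser, which is the point of the separation.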

We had a similar issue when we migrated to a Unix platform. Our solution was to mount a file server on the web server. It's accessed programmatically by ColdFusion as if it were on the same server, but it's inaccessible from the web root (browser). It's worked very smoothly for us.

Related

Where and how to store files uploaded by a user using a REST API?

Currently I'm using shared storage (Azure File Storage) to store profile pictures, company logos, and some custom Python scripts uploaded by admins. My REST services run in a Docker Swarm cluster where all the nodes have access to the shared location. I'm currently saving the files to that location, creating a URL for each file, and serving it as a static resource via my nginx reverse proxy/load balancer. Are there any drawbacks to this kind of design, and how can I make it better?
There are several ways to access, store, and manipulate files in Azure File Storage using the REST API:
The Azure File service offers the following four resources: the storage account, shares, directories, and files. Shares provide a way to organize sets of files and also can be mounted as an SMB file share that is hosted in the cloud.
More info here
When it comes to the design, it will depend on what kinds of concerns your customers have: slow connectivity, whether they need the files to be available permanently, and so on.
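In practice the REST API is usually consumed through an SDK rather than raw HTTP. A rough upload sketch with the Python azure-storage-file-share package (the share name, paths, and connection string are all illustrative):

# Minimal upload sketch (pip install azure-storage-file-share).
from azure.storage.fileshare import ShareFileClient

client = ShareFileClient.from_connection_string(
    conn_str="<your storage connection string>",
    share_name="profile-pictures",
    file_path="users/42/avatar.png",
)
with open("avatar.png", "rb") as data:
    client.upload_file(data)  # creates the file on the share and uploads its contents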

Uploaded File Storage/Retrieval

I am developing a web application that needs to store uploaded files - images, PDFs, etc. I need this to be secure and to scale - I don't have a finite number of uploads to plan for. From my research, the best practice seems to be storing files on the private file system, storing paths and metadata in the database, and serving through an authenticated script.
My question is where should these files be stored?
I can't store them on the web servers because I have more than one, would be worried about disk space, and don't want the performance hit from replication.
Should they be programmatically uploaded to a CDN? Should I spin up a file server/cluster to handle this?
Is there a standard way for securely storing/retrieving a large number of files for web applications?
"My question is where should these files be stored?"
I would suggest using a dedicated storage server or cloud service such as Amazon AWS. It is secure and completely scalable. That is how it is usually done these days.
"Should they be programmatically uploaded to a CDN?" - yes, along with a matching db entry of some sort for retrieval.
"Should I spin up a file server/cluster to handle this?" - you could. I would suggest one of the many cloud storage services though.
"Is there a standard way for securely storing/retrieving a large number of files for web applications?" Yes. Uploading files and info via web or mobile app (.php, rails, .net, etc) where the app uploads to storage area (not located in public directory) and then inserts file info into a database.

Uploading an Umbraco site to a server from localhost

I've successfully created a site using Umbraco; now it's time to upload it to the hosting server.
I've searched and found one paid product for this, and I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If so, please help me do that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, and then I go through this process for deploying:
FTP over to the remote website and copy the whole website (I actually use Beyond Compare).
Connect to my local database with Management Studio and create a .bak file.
Upload the .bak file to the database server.
Restore that database (see the T-SQL sketch after this list).
Review connection strings in web.config.
Then I'm pretty much done.
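The backup and restore steps, sketched in T-SQL (the database name, disk paths, and logical file names are illustrative; check yours with RESTORE FILELISTONLY first):

-- On the local server: create the .bak file.
BACKUP DATABASE UmbracoSite
TO DISK = 'C:\backups\UmbracoSite.bak';

-- On the hosting server: restore it, relocating the data and log files.
RESTORE DATABASE UmbracoSite
FROM DISK = 'D:\uploads\UmbracoSite.bak'
WITH MOVE 'UmbracoSite_Data' TO 'D:\sqldata\UmbracoSite.mdf',
     MOVE 'UmbracoSite_Log' TO 'D:\sqldata\UmbracoSite_log.ldf';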
Once I'm "live" and have content I don't want to lose, when I want to work on the website, I bring back the live database through a .bak file, then I make my changes. They often include DB changes since the schema is basically in the database. I note all the operations I do. Once my changes are ready I manually replicate the changes on the live site as I update the files.
This is very painfull but I tried solutions like courrier and other things like that and they are not reliable enough for production I find. Manually is the only risk free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your web server, modify the web.config to point to the correct database, and your website should be up and running.
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure Websites with SQL (not sure what hosting/DB you're using):
1) Create an empty DB on Azure with the same login and user as my local DB.
2) Create an empty site on Azure connected to my DB.
3) Download the publishing profile.
4) Upload the DB the first time with the SQL Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and DB with WebMatrix.
WebMatrix uses WebDeploy or FTP; you can also use WebDeploy directly through IIS if you like, or plain FTP.

How to move a single DotNetNuke portal to a new server?

I've got a DotNetNuke system (v 5.6) that's hosting several different portals, and I'd like to move one of them to another hosting provider. What's the easiest way to do this?
Every web site I find that claims to explain how to move a DotNetNuke site essentially says "Copy the entire database over to the new system." That's great if you've only got one portal in the database, but I've got a dozen of them. I only want to move one portal, not all of them.
Exporting the site to a .template is another popular suggestion. This exports the structure of the site (all the tab definitions, for example), but it doesn't include any of the actual HTML content. As such, that's essentially worthless.
There must be a reasonable way to do this short of trying to strip one individual portal's data out of every single DNN table. Right?
When you export a site template, you can include the content of the site, as well (for the modules that support portability, which includes the standard HTML module). This is how the default site template has all of its content. When you do this, there will be a .template.resources file that you'll need, as well as the .template file.
The other option is to do a full backup and restore, and then remove the other sites once you've restored. If you have significant content in a module that doesn't support portability, I think this will be your best bet.
FYI, I did find a solution from someone over on the DotNetNuke forums.
Create a 2nd version of that install, then delete all the other portals. Move the install with the one portal. We've done this several times with installs with lots of portals and it works just fine. Yeah, there's still some noise left in the db, but it's a quick and effective way of doing things.
Edit: note that this will give you an install with 1 portal. You can't detach a portal from one install and reattach it to an existing install (well, you can, but basically you have to export the portal as a template and that isn't 100%).
This is the approach I took, and sure enough, it works.
In a nutshell:
Mirror the files for the web site to another server.
Mirror the DNN database to another server.
Log in as Host on the new setup and delete all the portals but the one you want to migrate.
Delete any module definitions that are not in use by the remaining portal.
Open up your favorite SQL tool and delete any entries in the Users and UserProfile tables that no longer have a matching row in the UserPortals table (see the SQL sketch after this list). DNN does not remove these by default, which is frustrating.
Hop into Windows Explorer and delete all of the portal folders you no longer need (e.g. /Portal/1, /Portal/2, etc.)
Back up the database using Enterprise Manager to create a .bak file
Make a .zip of the entire DNN installation folder.
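The user cleanup from the SQL step, as a sketch; the table and column names follow the description above, but verify them against your DNN version (and any object qualifier/prefix), and back up before running:

-- Profiles first (they reference Users), then the orphaned user rows themselves.
DELETE FROM UserProfile
WHERE UserID NOT IN (SELECT UserID FROM UserPortals);

DELETE FROM Users
WHERE UserID NOT IN (SELECT UserID FROM UserPortals);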
You now have a .bak that contains the database and a .zip that contains the files. Send those off to the new hosting company, and you should be all set. Just make sure to update your web.config to set the connection string properly to point to the new database server at the new hosting company.
It's just that easy. ;)

Joomla 1.5 Site Backup Strategy

I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step-by-step tutorial somewhere? I am using the latest JoomGallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload it and get it going yourself. From there, you can just export the entire database to a gzip file.
The second part is the code and uploaded files. The code base shouldn't change too often, so you could probably just make one backup of it. There are a number of ways; the simplest is to just download the entire folder via FTP, though if you're on Linux, a single command like rsync can grab just the changed files.
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
I think Joomlapack (http://www.joomlapack.net/) is what you need. I use it myself and it works like a charm, both for backups and for moving my Joomla installations from developer sites to the live site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run a batch script like
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
Edit:
You would need MySQL Server installed on your local machine, with the path to its bin directory in your PATH, to run the mysqldump command without much hassle. -p%1 takes the password from the first command-line argument, since you wouldn't want to store passwords in your batch script.
If you only have FTP access you are in a bit of a bind, as besides all the files you'll also have to back up the database. Without access to the database, a full backup won't do you any good.
Whatever backup strategy you choose, be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1' - so when the backup solution detects the database charset, characters like € or é will come out as mojibake (e.g. 'â‚¬' or 'Ã©') - not really what you want.
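If you use the mysqldump approach above, one way to sidestep the charset-detection problem is to force UTF-8 on the dump connection (extending the earlier batch one-liner):

mysqldump -hhost -uuser -p%1 --default-character-set=utf8 schema > C:\backup.sql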
I absolutely endorse using Joomlapack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine - it performs the backup and downloads it. The remote tool has a scheduler, and you can also set it to back up and download a list of sites.
Joomlapack also provides a file "kickstart.php", which you copy to your empty server account along with the backup and which automates the restore procedure. You do have to create an empty database with phpMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I ran into with this, though, is that some common components can have absolute URLs in their configuration - e.g. SOBI2, VirtueMart. It's then just a matter of finding the appropriate configuration file, editing it, and re-uploading it.
Another problem: one archive file (either ZIP or their JPA format) contained a filename with a "?" character in it (from a Linux server), and this caused trouble when installing locally on a Windows WAMP stack - the extraction of the ZIP file failed, which stopped the process from completing cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
OK, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restores are possible in the same way.
Furthermore, you can use the automatic backup option to take backups at defined intervals. Other than that, you can use the website health-check service to inform you if your website is not available.