I am trying to upload *.ttl files to a Virtuoso server using iSQL and the function ld_dir('path','file','graph').
If I run everything locally it works fine: I add the path to DirsAllowed in the virtuoso.ini config file, then from isql I run ld_dir(,,) and rdf_loader_run(), and the file gets loaded.
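Concretely, what I run locally looks roughly like this (the directory, file mask and graph URI below are just examples):

    ; virtuoso.ini: the bulk-load directory must be listed under [Parameters]
    DirsAllowed = ., /usr/local/virtuoso/vad, /data/ttl

    -- then, from isql:
    ld_dir('/data/ttl', '*.ttl', 'http://example.org/mygraph');
    rdf_loader_run();
    checkpoint;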
I would like to do the same thing but from a remote computer. How should I configure the virtuoso.ini to allow paths from a remote computer?
Thanks, and sorry for the cross-posting.
Current versions of Virtuoso do not support loading data files from remote locations. The data file must be accessible through Virtuoso's local filesystem, though it may be a remote mount (e.g., NFS), and the containing directory must be included in Virtuoso's DirsAllowed setting. Loading from remote locations is planned for a future version.
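In other words, one workaround is to make the remote files visible on the Virtuoso host's own filesystem and whitelist that directory. A minimal sketch, assuming an NFS export (hostnames and paths are placeholders):

    # on the Virtuoso host: mount the remote directory locally
    sudo mount -t nfs remotehost:/export/ttl /mnt/remote-ttl

    ; virtuoso.ini: add the mount point to DirsAllowed, then restart Virtuoso
    DirsAllowed = ., /usr/local/virtuoso/vad, /mnt/remote-ttl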
You'll generally get faster Virtuoso-specific answers by asking on the Virtuoso Users mailing list or the Virtuoso support forums.
I have installed Aerospike on Ubuntu. When I run the aql command "show namespaces", it shows the namespaces "test" and "bar". I tried to find out where they are on the hard drive, or what their exact location in Ubuntu is, but in vain. Can anyone help me?
You wouldn't see any of the namespaces directly exposed on your file system when running Aerospike.
Having said that, the "bar" and "test" namespaces are the defaults in the configuration file, and both should be configured with 'storage-engine memory', which means the data is stored in memory and not on your hard drive. Even if you switched them to 'storage-engine device', and configured the underlying storage either as a raw SSD or as a file, you would still not see any direct mention of the namespace:
When using a raw SSD, Aerospike bypasses the file system and directly manages blocks on the device.
When using a file, Aerospike likewise manages blocks inside that file, which makes the file not directly 'readable'.
You can see the existing namespaces, and define additional ones, in Aerospike's configuration file. If you have installed Aerospike on Ubuntu, look at /etc/aerospike/aerospike.conf; that is where the namespaces are declared.
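For illustration, a namespace stanza in aerospike.conf typically looks something like this (values are examples; check the file and the documentation for your Aerospike version):

    namespace test {
        replication-factor 2
        memory-size 1G
        storage-engine memory    # data lives in RAM only
    }

    # a persistent variant would use a device or a file instead, e.g.
    # storage-engine device {
    #     file /opt/aerospike/data/test.dat
    #     filesize 4G
    # }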
Currently I have a bunch of local copies of dev/production websites. Each copy contains a "files" directory, which holds files uploaded by site users. Currently I use rsync to synchronize the directory contents from the remote servers (via ssh).
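Roughly like this, once per site (host and paths are just examples):

    rsync -az --delete user@example.com:/var/www/site/files/ ./files/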
There are some annoyances:
I have to run rsync manually every time I want fresh files (this could be automated, of course, but since I have a lot of website copies it doesn't feel like a good option).
The rsync execution takes some time.
Disk space on my laptop is running out.
I think all of this could be solved if there were some kind of software that works like a proxy:
When I list files, it requests the file list from the remote server and caches the result for some (configurable) time.
The first time I request a file's contents, it retrieves the remote file and saves it locally.
When I update a file, it only gets updated locally.
When I save a new file in the "files" directory, it does not go to the remote server.
Of course, the logic of such software would have to be more complex, but I hope my idea is clear: don't waste disk space, download files on demand, and make no remote changes.
Is there any software that works like that?
Map a network drive with NFS or sshfs. Make local copies if you really need a file.
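For example, with sshfs you can mount the remote "files" directory on demand; a quick sketch (host and paths are placeholders):

    mkdir -p ~/mnt/site-files
    sshfs user@example.com:/var/www/site/files ~/mnt/site-files -o reconnect
    # ... browse and read the files as if they were local ...
    fusermount -u ~/mnt/site-files    # unmount when done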
I did not mention it in the question, but I needed this for working with Drupal. And now I have found a Drupal-only solution: the Stage File Proxy module.
It does exactly what I need: downloads files from a remote server only when they are requested.
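If you use Drush, enabling it is quick. A sketch assuming Drupal 7 (the origin URL is an example; the variable name is the one the module documents):

    drush dl stage_file_proxy
    drush en stage_file_proxy -y
    # point the module at the production site so missing files are fetched on first request
    drush vset stage_file_proxy_origin "http://www.example.com"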
We currently have a site running ColdFusion 11. In an effort to improve some aspects of security, we would like to store all files uploaded by our users on a server separate from our codebase and DB servers.
I'm pretty much starting from scratch here, as I wasn't able to find much in my searches so far. What's the best practice for doing this, and which ColdFusion functions would work for storing and retrieving files from an external source?
I could use some more information to be more helpful. But let's say you have a separate server that stores all your user files on a Windows network. I would use CFContent to serve those files with the file being retrieved over a UNC path.
I'd recommend reading this blog entry of mine on Securely Serving Files via CFContent. Wil, also from CF Webtools, posts one here: Serving File Downloads with ColdFusion
We had a similar issue when we migrated to a Unix platform. Our solution was to mount a file server to the webserver. It's accessed programmatically by ColdFusion as if it's on the same server, but it's inaccessible from the web root (browser). It's worked very smoothly for us.
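As a rough sketch of that setup (server names, export paths and mount points below are made up), the file server is mounted somewhere the ColdFusion server can read but that sits outside the web root:

    # e.g. on Linux, mount an NFS export (or a CIFS share) from the file server
    sudo mount -t nfs fileserver:/export/userfiles /mnt/userfiles
    # or: sudo mount -t cifs //fileserver/userfiles /mnt/userfiles -o username=svc_cf

ColdFusion code can then read and write /mnt/userfiles/... like any local path, while the browser has no direct route to those files.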
I have got a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may create issues if the connection gets broken.
SFTP will have the same issue as well.
Is your own computer also reachable from the internet via a public IP?
In that case you could set up a simple HTTP server on it (on Windows, just enable IIS) and then use a download manager on the dedicated server (whichever suits its OS) to download the file over HTTP (it can use multiple streams for that), or do it via torrent.
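As a concrete sketch of the HTTP route, assuming the source machine has Python installed (used here as a lightweight stand-in for IIS) and the dedicated server runs Linux:

    # on your computer, in the directory containing the file:
    python3 -m http.server 8000

    # on the dedicated server, download with resume support:
    wget -c http://YOUR_PUBLIC_IP:8000/bigfile.tar.gz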
There are trackers, like http://openbittorrent.com/, which will let you keep the file on your computer and then use a torrent client to upload it to the dedicated server.
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux, Windows, Mac OS X and DOS:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download
I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step by step tutorial somewhere? I am using the latest Joomgallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup you have to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload and get it going yourself. From there, you can just Export the entire database to a gzip file.
The second part is the code and the uploaded files. The code base shouldn't change too often, so you could probably just make one backup of it. There are a number of ways; the simplest is to download the entire folder via FTP, though if you're on Linux, I'm sure someone will know a single command line to get all the changed files (rsync?).
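If it turns out you do have SSH access (the question mentions FTP only, so treat this purely as a sketch), rsync can pull just the changed files in one line (host and paths are examples):

    rsync -avz user@yourhost.example.com:/home/youruser/public_html/ ./joomla-backup/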
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
I think this: http://www.joomlapack.net/ is what you need. I use it myself and it works like a charm, both for backups and for moving my Joomla installations from development sites to the live site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run the batch script
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
Edit:
You would have to have MySQL Server installed on your local machine, with the path to its bin directory in your PATH, in order to run the mysqldump command without much hassle. -p%1 takes the password provided on the command line, as you wouldn't want to store passwords in your batch script.
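So you would call the script with the password as its first argument, something like this (the script name is just an example):

    backup.bat MySecretPassword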
If you only have FTP access you are in a bit of a bind, as besides all the files you'll also have to back up the database. Without access to the database, a full backup won't do you any good.
Whatever backup strategy you choose - be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1' - so when the backup solution detects the database charset, characters like € or é will come out as "strange" sequences such as â‚¬ or Ã© - not really what you want.
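If you dump with mysqldump, for example, explicitly forcing the connection charset avoids that kind of mangling (host, user and schema are placeholders):

    mysqldump --default-character-set=utf8 -hhost -uuser -p schema > backup.sql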
I absolutely endorse using Joomlapack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine - it performs the backup and downloads it. The remote tool has a scheduler, and you can also set it to back up and download a list of sites.
Joomlapack also provides a file "kickstart.php" which you copy to your empty server account along with the backup, which automates the restore procedure. You do have to create an empty database with PHPMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I did run into with this though is that some common components can have absolute URLs in their configuration - e.g. SOBI2, Virtuemart. It's then just a matter of finding the appropriate configuration file, editing it and re-uploading it.
Another problem was that one archive file (either ZIP or their JPA format) got a filename with a "?" character in it (from a Linux server), and this caused a bit of a problem when trying to install it locally on a Windows WAMP stack - the extract process on the ZIP file failed, and that stopped the process from completing cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
Ok, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restore is also possible in the same way.
Furthermore, you can use the automatic backup option to take backups at defined intervals. Other than that, you can use the website health check service to let you know if your website is not available.