I've configured SymmetricDS to upload files from C:\files (a Windows machine) to /home/foo/files (a CentOS server).
SymmetricDS runs as the root user on the CentOS server, so new and changed files are created with root:root ownership. Is it possible to change the ownership of these files (say, to foo:users) just after uploading them?
I know I can use an after_copy_script (http://www.symmetricds.org/doc/3.5/html/configuration.html#filesync-beanshell), but I can't find any examples and I have no Java knowledge.
Create a cron job that periodically calls a shell script to change the ownership of all files under /home/foo/files/. No Java development needed; a sketch follows below.
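A minimal sketch of that approach (the five-minute interval is an assumption; adjust it to how often files arrive):

# /etc/cron.d/filesync-ownership: every 5 minutes, chown everything
# under the sync target to foo:users (runs as root, so chown is allowed)
*/5 * * * * root chown -R foo:users /home/foo/files

The drawback is the window between an upload and the next cron run, during which the files still belong to root.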
Solved:
Runtime.getRuntime().exec("chown foo:users " + targetBaseDir + "/" + targetFileName);
(Note that this single-string form of exec splits the command on whitespace, so it will break on file names containing spaces; the String[] overload of exec handles those safely.)
I'm developing an automated backup system for a server running PostgreSQL and Tomcat. The environment is a minimal CentOS 7 install. Long story short, a VM will download the .sql dumps and a .tar.gz archive containing Tomcat via FTP.
I had no problems setting up vsftpd; I can access the server via FTP with a custom user (ftpuser) which can currently access a specific folder (/home/ftpuser/backups/). I can compress Tomcat there so my script will fetch the backups/ folder and download it, but I can't figure out how to dump the PostgreSQL db to the /home/ftpuser/backups/ folder without resorting to ugly sudo tricks.
The postgres user doesn't have permission to write there, and I can't grant it even with chown or chmod. I added postgres to sudoers; dumping the db and then copying it with "sudo cp" to that folder works, but I can't script it that way because sudo asks for a password.
The question is: is there a way to let pg_dump write the .sql dump to the /home/ftpuser/backups/ folder?
Thanks for the replies.
pg_dump does not need to be run as the postgres user.
Run it as a user that can write to the desired folder, and pass the --username=database_user parameter to specify the desired database user. You'll probably need a .pgpass file for that user's password (unless it has been configured as trusted in pg_hba.conf).
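A minimal sketch of that, run as ftpuser; the database name mydb and the role backup_user are placeholders:

# Store the password so the dump can run unattended; the format is
# hostname:port:database:username:password, and the file must be 0600.
echo 'localhost:5432:mydb:backup_user:secret' > ~/.pgpass
chmod 600 ~/.pgpass

# Dump straight into the FTP-visible folder; ftpuser can write here.
pg_dump --host=localhost --username=backup_user mydb > /home/ftpuser/backups/mydb.sql

This works because pg_dump only needs database credentials; the output file is created with the operating-system privileges of whoever runs the command.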
I have VMware running Ubuntu 14.04 and a Windows 8 host. I've enabled shared folders and installed VMware Tools. What I want now is to run the web server from /mnt/hgfs/ProjectName.
At this point I can access the shared folder from within Ubuntu. I do not have to use sudo to create new folders or files or to edit existing ones. The folder is not mounted read-only and is not treated as read-only; however, when I try to change the read-only attribute from within Windows it reverts afterwards. Any clue as to why the web server cannot read the folder? Even mounted read-only, the web server should still be able to read the files.
It turns out the best way to run this is to name the shared folder html in VMware and then mount it under /var/www. Edits now work without problems and the server runs great, with access from both the host and the guest OS.
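A sketch of that mount, assuming the classic VMware Tools vmhgfs driver of that era (with newer open-vm-tools, vmhgfs-fuse replaces this):

# Mount all VMware shared folders under /var/www; the share named
# "html" then appears as the usual Apache docroot /var/www/html.
sudo mount -t vmhgfs .host:/ /var/www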
I need some suggestions about Joomla's files and folders ownership.
I'm working with a Joomla 2.5 website, hosted on a Linux/Apache server that doesn't have the suPHP module.
After a year of use, some files are owned by the Apache user and others by the FTP user.
I'd like to give ownership of all files to the FTP user, but then Joomla (the Apache user, after all) can't update the system, install extensions or upload images.
How can I set the ownership of the files?
Thank you
You will either need to ask your hosting provider to change the file/folder ownership for you (which they should do), or, if you are managing the server yourself, you can do it with the following SSH command:
chown -R OWNER FOLDER
-R means recursive, so it will also change the ownership of any subdirectories and files
OWNER is the user (or user:group pair) that should own the files
FOLDER is simply the name of the folder you wish to apply this change to
(If you only need to change the group, chgrp -R GROUP FOLDER works the same way.)
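For example, assuming the FTP account is called ftpuser and Apache runs in the apache group (both names are guesses; check your own), you could give the FTP user ownership while leaving the group writable, so Joomla can still update itself:

# FTP user owns everything; Apache keeps write access via the group
chown -R ftpuser:apache /path/to/joomla
chmod -R g+w /path/to/joomla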
As for a command-line tool to run this over SSH, I like to use PuTTY.
Hope this helps
Our server runs CentOS 6 and is managed through Plesk Panel 10.4.4. The folder and file structure is created by a PHP script. When we then access the server over FTP, we are unable to modify the contents of those folders. Accessing them as the Apache web user works without exception, but not over FTP. Folders and files have 755 and 644 permissions respectively. How can we enable FTP access? Thank you
EDIT: the problem is that the file owner and the FTP user are not the same, but I do not know exactly how and where to fix that.
Files and folders are owned by psacln (gid 502) with group apache (gid 503). The FTP users are different.
We added an FTP login user (also a system user) to psacln, the group owning the files and folders, using usermod -a -G psacln ftpusername. We did the same with the apache group, but the problem persists.
The problem here is probably that you run your site in mod_php mode. In this mode, scripts run with Apache's privileges, so all files and directories they create are owned by Apache. Those files cannot be modified by your FTP user unless you set 777 or 666 permissions.
I think your options are:
switch to the FastCGI mode of PHP. Depending on your Plesk account privileges, you can either do it yourself in the Plesk UI or you will have to ask your hosting provider.
This way your scripts will run under your own user's privileges (the same as the FTP user) and there will be no problem accessing the files through FTP. This option is also often considered more secure.
have the PHP script set 777 permissions on the folders and 666 on the files it creates. This lets everyone (the so-called "others") modify them, so the FTP user can modify them as well (see the sketch below for fixing files that already exist). While this may sound insecure, in practice these files can already be accessed from any other site on the system (if it is a shared hosting server), so it is hardly less secure than the current situation.
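For the files that already exist, a one-off fix from the shell would look like this (the path is a typical Plesk docroot and just a guess; substitute your own):

# Directories get 777 so FTP can create and delete entries inside them,
# plain files get 666 so FTP can overwrite them
find /var/www/vhosts/example.com/httpdocs -type d -exec chmod 777 {} +
find /var/www/vhosts/example.com/httpdocs -type f -exec chmod 666 {} +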
Regards
I have a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may cause problems if the connection breaks.
SFTP will have the same issue.
Is your own computer also reachable from the internet through a public IP?
In that case you could set up a simple HTTP server (if you have Windows, just enable IIS) and then use a download manager on the dedicated server (which one depends on its OS) to pull the file over HTTP (it can use multiple streams for that), or do it via torrent.
There are trackers, like http://openbittorrent.com/, that let you keep the file on your computer and then use a torrent client to upload it to the dedicated server.
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
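Combined with the HTTP server suggested above, the whole transfer boils down to one resumable command on the dedicated server (the IP and file name are placeholders):

wget -c http://203.0.113.10/bigfile.tar.gz

If the connection drops, rerunning the exact same command picks up from the size of the partial local file instead of starting over.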
wget binaries are available for GNU/Linux, Windows, Mac OS X and DOS:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download