I have cPanel access on two different servers and would like to transfer from one server to another. The original size of the account on the first server is close to 15GB.
Currently, the only two ways I can think of are:
1. Back up using cPanel, then restore on the second server. But this process times out; I get a "Failed - Network error" message.
2. Use an FTP app like FileZilla to log in and transfer the files. I haven't tested this, but I think it first downloads the files to my local machine (a temp folder) and then uploads them to the second server.
My problem with option 2 is that, if it really works that way, I will end up using 30GB of data transfer.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I can suggest a few tips.
1. Back up using cPanel, then restore on the second server. But this process times out. I get a "Failed - Network error" error.
The backup creation fails because of the large size of the account, so that route is not possible.
2. Use an FTP app like FileZilla to log in and transfer files. I haven't tested this but I think it first downloads the files to my local machine (temp folder) then uploads them to the second server.
You can download the whole contents to your PC, or download it folder by folder (first home, then mail, and so on) and upload it to the new cPanel account using FTP.
Could you please open a support ticket with your hosting provider? They will help you create the backup from the server backend using /scripts/pkgacct.
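For reference, a rough sketch of what the provider (or anyone with root SSH access to both servers) would typically run; the account name "username" and the destination hostname are placeholders:

# On the old server: package the whole account (files, mail, databases) into one archive
/scripts/pkgacct username
# This produces something like /home/cpmove-username.tar.gz

# Copy the archive directly to the new server, so nothing goes through your local machine
scp /home/cpmove-username.tar.gz root@new-server.example.com:/home/

# On the new server: restore the packaged account
/scripts/restorepkg /home/cpmove-username.tar.gz

This transfers the 15GB once, server to server, instead of downloading and re-uploading it.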
Thank you.
I have set up FlowForce Server on my local PC and was able to run a couple of sample jobs. Then I tried to set up an FTP store job by providing the required details, and I was able to upload a specific file to the FTP server as well.
But now I need to know whether it is possible to upload all the files in a particular folder to a target folder on the FTP server. I would appreciate it if someone with knowledge of FlowForce Server could share their thoughts on this.
I am running a file upload process to upload files to a db. The web server and the SQL server are different machines. I am attempting to use an SQL OPENROWSET to upload an Excel file, but I cannot determine how to get the file onto the other machine. Is there a way to set up a shared drive that the web server can save a file to and the SQL server can access? We have a local network set up with Active Directory.
For Example:
WebServer - Shared drive on web server under C:/inetpub/webpage/fileImport
SQLServer - Will log in with SQL auth using USERID and PASSWD. Needs to access the web server's shared drive.
What user do I share the drive on web server with so that the sql auth user will be able to access it when I run the OPENROWSET?
Any help will be much appreciated.
I am also trying the same thing, uploading the file via FTP and then trying to access it, but I haven't made any progress in the last two weeks.
I have also found other alternatives, like copying the files to another server and sharing the folder without a username and password; then we can access it by giving the path
\\folder\filename
If you find any other alternative, please share it.
You should set up a new user that belongs to the iis_users group, and then give that user security access to the file share itself.
The same should be done on the DB server, and in the folder's security settings that other user will need Read/Write/Modify permissions.
So it will look like:
(WebHost) ---- (Shared) ---- (DBHost)
Well, you would set up the folder on the SQLServer.
Create a secure user on the SQL machine.
Make the folder shared (with modify rights for the secure user)
Map the Network drive on the Windows machine, using the secure user to access it.
Your main user on the SQLServer should then be able to OPENROWSET from the local folder, whilst the IIS server accesses it remotely.
Using OPENROWSET means that SQL Server will access files using its service account. This account must be the one used to access the shared drive, as stated here: Using SQL Credential to Open a file with OpenRowSet.
I've successfully created a site using Umbraco, and now it's time to upload it to the hosting server.
I've searched and found one paid product for this, but I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If so, please help me do that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, and then I go through this process for deploying:
FTP over to the remote website and copy the whole website (I actually use Beyond Compare).
Connect to my local database with Management Studio and create a .bak file.
Upload the .bak file to the database server.
Restore that database
Review connection strings in web.config
Then I'm pretty much done.
Once I'm "live" and have content I don't want to lose, when I want to work on the website, I bring back the live database through a .bak file, then I make my changes. They often include DB changes since the schema is basically in the database. I note all the operations I do. Once my changes are ready I manually replicate the changes on the live site as I update the files.
This is very painfull but I tried solutions like courrier and other things like that and they are not reliable enough for production I find. Manually is the only risk free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your web server, modify the web.config to point to the correct database, and your website should be up and running.
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure websites with SQL; I'm not sure what hosting/DB you're using:
1) Create an empty DB on Azure with the same login and user as my local DB.
2) Create an empty site on Azure connected to my DB.
3) Download the publishing profile.
4) Upload the DB the first time with the SQL Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and DB with WebMatrix.
WebMatrix uses WebDeploy or FTP; you can also use WebDeploy through IIS if you like, or plain FTP.
I have a remote server running SQL Server 9.0.3042, trying to subscribe to a publication on a server running SQL Server 10.0.2531.
These servers are on different domains which basically hate each other and largely refuse to allow their users access to each other.
They do both communicate nicely with a third domain, and it is a user from that domain which I am using as the process owner on both servers.
I have created a shared folder on the publishing server, and I am using it as the Snapshot folder, set via Publication Properties -> Snapshot -> "Put files in the following folder" and have confirmed that the files are being published locally and can be accessed via the shared folder.
The Snapshot Agent on the publishing server runs and appears to complete successfully.
I've then created a Pull subscription on the subscribing server and told it to run with the Agent Process Account of the same user that runs the snapshot agent on the publisher.
I've redirected the snapshot location to "Alternate folder" and set that folder to be the shared folder on the remote server that I set up earlier.
The pulling server connects correctly to the publishing server, and then fails because "The process could not read file '\\[server].[domain]\[share]\[snapshot directory]\[file].pre' due to OS error 5".
I've logged into the Pull server as the executing account and manually navigated to, and opened, that file. I've done the same on the publishing server.
I'm out of ideas. What am I screwing up?
OS Error 5 is ERROR_ACCESS_DENIED. You're not being allowed to get to the folder. Check to make sure that the user account trying to get to the files has at least read access to the folder containing the files.
The decision has been taken that we're pretty much at the point where we're going to move every user in the company over to a brand-new domain, and that this problem isn't worth the time and effort of fixing.
I did find this SO question, which describes a similar problem, but I don't intend to investigate the matter any further.
Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on the server where the backups will be stored.
I would like to be able to add the website login info, and have it connect and create a zip file or similar that would then be sent back to the backup server to be saved.
It would also need to be able to run as a cron job so that it backs up at least every day.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.
It would be heavily used, and needs to have a GUI so the less techie users can use it too.
Does anyone know of anything similar to what we need?
HTTrack, the website mirroring utility.
Wget and scripts
RSync with an FTP login (or SFTP for security); see the sketch below.
Git can be used for backup and has security features and networking ability.
7Zip can be called from the command line to create a zip file.
In any case you will need to implement either secure FTP (SSH-secured) or a password-secured upload form. If you feel clever you might use WebDAV.
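For the rsync option above, a minimal sketch of a server-to-server pull over SSH; the hostname, user, and paths are placeholders:

# Pull a site's document root from the remote server over SSH,
# transferring only files that changed since the last run
rsync -avz --delete \
    backupuser@site1.example.com:/var/www/site1/ \
    /backups/site1/current/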
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backup server, make a cron script that downloads all the backups and stores them.
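A minimal sketch of that cron script, assuming each site exposes its backup ZIP at a URL like /backup/site.zip behind the .htpasswd prompt; the site names, credentials, and paths are placeholders:

#!/bin/sh
# Fetch each site's backup ZIP over HTTP basic auth and keep a dated copy
DATE=$(date +%F)
DEST=/backups/$DATE
mkdir -p "$DEST"

for SITE in site1.example.com site2.example.com; do
    curl -fsS -u backupuser:secret "https://$SITE/backup/site.zip" -o "$DEST/$SITE.zip"
done

Schedule it daily from cron, for example: 0 3 * * * /usr/local/bin/fetch-backups.sh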