I've got a problem migrating a user from one server to another.
I tried to use the Migration Manager, but when I start it the migration begins and then finishes after about 2 seconds, and nothing has actually been migrated.
What can I do? Is there anything I can try, or should I move the data manually?
You can try the Plesk Mass Transfer Script.
The Plesk Mass Transfer Script (formerly the Mass Migration Script) is designed to let providers transfer accounts from one Plesk farm to another in an automated way.
The script creates a separate migration session for each domain only if you run mmigration.php with the '--per-domain' option; by default a single migration session is created.
You can find more details and scenarios here: http://kb.sp.parallels.com/en/113283
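As a rough illustration, invoking the script from the command line might look like this (a minimal sketch; the required configuration and working directory are covered in the KB article above, so treat the exact invocation as illustrative):

```
# Run on the destination Plesk server, from the directory where the
# Mass Transfer Script package was unpacked.
php mmigration.php                 # default: one migration session for everything
php mmigration.php --per-domain    # one migration session per domain
```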
Okay, I solved the problem. I had to increase the number of free domains!
I am using Keycloak as my user management tool, and I love it.
Keycloak's data is stored for me in a Postgres database. Over time, more clients get registered and other alterations to the realms may be made. My question is: how do I properly keep track of that and automatically propagate changes between my different environments? For databases I use Liquibase for this kind of purpose, but I couldn't find anything similar for the Keycloak case.
So I wanted to ask: how are you folks out there handling this? What am I missing?
It depends on how you're doing the management of those changes. There are generally two approaches:
Using the Keycloak admin console
Using the Keycloak CLI
If you're applying your changes via the admin console, then you can either rely on the database backup or set up a scheduled pipeline in your CI tool that exports the Keycloak realm to a file and archives it somewhere.
If you're using the second approach, you can keep a git repository containing all the Keycloak CLI scripts that you run on your server (e.g. to add a client, to update a realm config, etc.). That way they can be reviewed, versioned, and then run as part of an automated pipeline, which also lets you run the same script against different environments. Of course, it comes at a price: you have to write a script for every single task that you could otherwise do in the admin console with a couple of clicks.
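As a rough sketch of what such a versioned CLI script could look like (the server URL, realm, client name, and kcadm.sh path are placeholders that depend on your installation):

```
#!/usr/bin/env bash
set -euo pipefail

KCADM=/opt/keycloak/bin/kcadm.sh   # adjust to your distribution/layout

# Authenticate once against the master realm; the password comes from the CI environment
"$KCADM" config credentials --server https://keycloak.example.com \
  --realm master --user admin --password "$KEYCLOAK_ADMIN_PASSWORD"

# Register a client in the target realm
"$KCADM" create clients -r my-realm \
  -s clientId=my-app -s enabled=true \
  -s 'redirectUris=["https://my-app.example.com/*"]'

# Update a realm-level setting
"$KCADM" update realms/my-realm -s sslRequired=external
```

For the admin-console approach, the scheduled pipeline could similarly archive a realm dump, e.g. on the Quarkus distribution something like `/opt/keycloak/bin/kc.sh export --realm my-realm --file /tmp/my-realm.json`.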
Sorry about my English. I have been looking for a good article about WHM/cPanel server migration but haven't found one yet. I hope this is the right platform.
I have one WHM/cPanel server running CentOS x86_64 (standard). Now I want to migrate everything to my new high-end machine.
Need to transfer: all accounts along with databases, Exim configuration, Tweak Settings, PHP configuration, etc.
Note: I need a step-by-step guide, which would be highly appreciated. I am not very technical and still in the learning phase, so please go easy on me.
Thanks in advance.
First you need to install and set up cPanel on your new server.
After that you can migrate all your cPanel accounts to the new server with the Transfer Tool: WHM >> Transfers >> Transfer Tool
Here are some useful docs:
https://documentation.cpanel.net/display/CKB/How+to+Move+All+cPanel+Accounts+from+One+Server+to+Another
cPanel settings are stored in the /etc/wwwacct.conf and /var/cpanel/cpanel.config files, so you can copy those settings to the new server.
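If you want to copy those two files over SSH, a minimal sketch (run as root on the new server; the old server's hostname is a placeholder, and the restart command may vary slightly between cPanel versions):

```
# Keep the new server's defaults around, then pull the settings across
cp /etc/wwwacct.conf /etc/wwwacct.conf.bak
cp /var/cpanel/cpanel.config /var/cpanel/cpanel.config.bak
scp root@old.server.example:/etc/wwwacct.conf /etc/wwwacct.conf
scp root@old.server.example:/var/cpanel/cpanel.config /var/cpanel/cpanel.config

# Restart cPanel so the copied settings are picked up
/scripts/restartsrv_cpsrvd
```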
After the server setup is complete, you will have to recreate all the accounts you wish to move on the new server. You can create the domains on your server using WHM and then restore the website contents manually using an FTP client (say, FileZilla).
First, take a backup of your domain via cPanel. Please check the URL below for information on this:
http://docs.cpanel.net/twiki/bin/view/AllDocumentation/CpanelDocs/BackupWizard#Backup your entire site
Then create the account in WHM.
Upload the contents to the newly created domain and restore them via WHM or cPanel.
To restore via WHM, please follow these steps:
Main >> Backup >> Restore a Full Backup/cpmove file
Please refer to the cPanel docs link below on how to restore an account via cPanel.
http://docs.cpanel.net/twiki/bin/view/AllDocumentation/CpanelDocs/BackupWizard#
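If you have root SSH access on both machines, the same backup/restore flow can also be driven by cPanel's standard scripts; a rough sketch (the account name and hostname are placeholders):

```
# On the old server: package one account into a cpmove archive
/scripts/pkgacct exampleuser          # typically writes /home/cpmove-exampleuser.tar.gz

# Copy the archive to the new server
scp /home/cpmove-exampleuser.tar.gz root@new.server.example:/home/

# On the new server: restore the account from the archive
/scripts/restorepkg exampleuser
```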
If you have a WHM/cPanel server and you want to transfer all of your accounts and websites to another WHM server, it's really easy and simple. All you need is root access on both servers.
Once you have set up your new high-end WHM server, log in to the new WHM as root and go to the Transfer Tool: WHM > Transfers > Transfer Tool. It will show you fields for the details of your old server. Once you provide the server's root credentials, it will fetch the list of accounts and their details. From there you can select which accounts to transfer and which to leave out; if you want to transfer all accounts, select them all and proceed with the transfer. It will transfer all your accounts with their current package details.
Try the cPanel Transfer Tool; it's easy and simple to understand and use.
https://documentation.cpanel.net/display/CKB/How+to+Move+All+cPanel+Accounts+from+One+Server+to+Another
https://documentation.cpanel.net/display/ALD/Transfer+Tool
I've successfully created a site using Umbraco, and now it's time to upload it to the hosting server.
I've searched and found one paid product for this, but I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If so, please help me do that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, and I go through this process for deploying:
FTP over to the remote website and copy the whole website (I actually use Beyond Compare).
Connect to my local database with management studio and create a .bak file.
Upload the .bak file to the database server.
Restore that database
Review connection strings in web.config
Then I'm pretty much done.
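For the database part of those steps, a minimal sketch with sqlcmd (database name, paths, server, and login are placeholders; many hosts only let you restore a .bak through their control panel instead of giving you direct access):

```
# Back up the local Umbraco database to a .bak file (Windows authentication)
sqlcmd -S localhost -E -Q "BACKUP DATABASE [UmbracoSite] TO DISK = N'C:\temp\UmbracoSite.bak'"

# ...upload the .bak to the database server (FTP, control panel, etc.)...

# Restore it on the remote SQL Server, if the host allows direct connections
sqlcmd -S remote-sql.example.com -U dbuser -P secret \
  -Q "RESTORE DATABASE [UmbracoSite] FROM DISK = N'D:\uploads\UmbracoSite.bak' WITH REPLACE"
```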
Once I'm "live" and have content I don't want to lose, when I want to work on the website, I bring back the live database through a .bak file, then I make my changes. They often include DB changes since the schema is basically in the database. I note all the operations I do. Once my changes are ready I manually replicate the changes on the live site as I update the files.
This is very painfull but I tried solutions like courrier and other things like that and they are not reliable enough for production I find. Manually is the only risk free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your webserver, modify the web.config to point to the correct database and your website should be up-and-running.
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure websites with SQL; I'm not sure what hosting/DB you're using:
1) Create an empty db on azure with the same login and user as my local db.
2) Create an empty site on azure connected to my db.
3) Download the publishing profile.
4) Upload the db the first time with Sql Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and db with WebMatrix.
WebMatrix uses Web Deploy or FTP; you can also use Web Deploy through IIS if you like, or plain FTP.
Does anyone know a SQL query that will purge a MediaWiki database of old revisions? My database has grown out of control, and I need to prune it to make it possible to download and manage.
I don't have shell access, so I need to do this with a SQL query.
I have tried the solution suggested here, but it doesn't work: http://www.mediawiki.org/wiki/Extension_talk:SpecialDeleteOldRevisions2#Deleting_only_archived_revisions
Thanks for reading :)
Nicholas
Like you, I don't have shell access to my MediaWiki, so I can't do a lot of things like maintenance.
Here is my solution: host your MediaWiki site on your own computer just to do your maintenance tasks.
Back up your database
Back up your MediaWiki folder
Set up Apache (the web server) on your computer
Set up MySQL on your computer
Restore your MediaWiki database on your computer
Put your MediaWiki folder in the Apache root folder
Finally, run the maintenance task you want from the shell. I suggest the deleteOldRevisions script.
After that, back up the folder and the database again and restore them on the remote host.
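For the maintenance step itself, a minimal sketch (run from the root of your local MediaWiki copy; the path is a placeholder, and keep a backup before deleting anything):

```
cd /var/www/mediawiki                             # your local copy

# Dry run: report how many old revisions there are
php maintenance/deleteOldRevisions.php

# Actually delete all non-current revisions
php maintenance/deleteOldRevisions.php --delete

# Optionally remove the now-orphaned text records to reclaim space
php maintenance/purgeOldText.php --purge
```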
Use the Maintenance extension and run the relevant maintenance scripts with it. Direct database manipulation is pure madness, and using a local LAMP install as suggested in the other answer is quite cumbersome.
Shell access is really required to run MediaWiki properly, but this is a common problem, so please report your experience with the extension on its talk page, or file a bug if you find any.
I work on quite a few DotNetNuke sites, and occasionally (I haven't figured out the common factor yet), when I use Microsoft's Database Publishing Wizard to create scripts for the site I've built on my dev server, after running the scripts at the host (usually GoDaddy.com) and uploading the site files, I get an error... I'm 99.9% sure it's not file related, so I'm not sure where to begin in the DB. Unfortunately, with DotNetNuke you don't get the YSOD, just a generic error, with no real way to find the actual exception that occurred.
I'm just curious whether anyone has had similar deployment issues using the Database Publishing Wizard, and if so, how they overcame them. I own the Red Gate toolset, but some hosts like GoDaddy don't allow you to connect directly to their servers...
The scripts generated by the Database Publishing Wizard usually need to be tweaked, since it sometimes gets the order of table/procedure creation wrong when dealing with constraints. What I do is first back up the database, then run the script; if I get an error, I move that query to the end of the script, restore the database, and run the script again, repeating until it works.
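That restore-and-retry loop, sketched with sqlcmd (the database, backup, and script file names are placeholders; adapt to however your host lets you run scripts):

```
# Put the database back to its pre-script state
sqlcmd -S localhost -E -Q "RESTORE DATABASE [DnnSite] FROM DISK = N'C:\backups\DnnSite.bak' WITH REPLACE"

# Run the wizard-generated script; -b stops at the first failing statement
sqlcmd -S localhost -E -d DnnSite -i publish-script.sql -b

# Note the failing statement, move it to the end of publish-script.sql, and repeat
```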
There are two areas that I would look at:
1) Are you running in the dbo schema, and was your scripted database using dbo?
2) Are you using an objectQualifier in either your dev or your production environment? (Look at your SqlDataProvider configuration settings.)
You should be able to expose the underlying error message by setting the following in the web.config:
<customErrors mode="Off" />
Could you elaborate on "and uploading the site files"? Is this a new instance of DNN, an update to an existing site, or a DNN version upgrade? If it's an upgrade or update, which files are you adding/overwriting?
Also, when using GoDaddy, can you verify that the website's identity (Network Service or the ASP.NET machine account, depending on your IIS version) has sufficient permissions to the website's file system? It should have Modify permissions, and these may need to be reapplied if you are overwriting files.
IIS6 (XP, Server 2000, 2003) = ASP.Net Machine Account
IIS7 (Vista, Server 2008) = Network Service
Test your generated scripts on a new local database (using the free SQL Express product or the full meal deal). If the script runs fine locally, then you can be confident that it will run elsewhere, all things being equal.
If it bombs when you run it locally, use the process of elimination and work your way through the script execution to find the offending code.
My hunch is that the order of the scripts could be off. I think I've had that happen before with the Database Publishing Wizard.
Just read your follow-up. In every case where I've had your problem, it was always something to do with the connection string in web.config. Even after hours of staring at it, it was always a connection string issue in web.config. Get up, take a walk, and then come back.
If you are getting one of DNN's error pages, there is a chance it may have logged the error to the EventLog table.
Depending on exactly what is happening and what DNN is showing you, you might be able to look inside the EventLog table manually, pull out the XML data stored there, and parse it to find the stack trace and detailed information about the specific error at hand.
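If you can reach the database (locally or through the host's query tool), something along these lines can pull the most recent entries; this is only a rough sketch, since the table name carries your objectQualifier/owner prefix if you set one, and column names can differ between DNN versions:

```
# Recent DNN event log entries, including the XML stored in LogProperties
sqlcmd -S localhost -E -d DnnSite \
  -Q "SELECT TOP 20 LogCreateDate, LogTypeKey, LogProperties FROM EventLog ORDER BY LogCreateDate DESC"
```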
I have found, though, that I get a much better overall experience with deployments when I use backups and restores of my database; that way I am 100% sure that all objects moved correctly, and honestly it works better in my experience.
With GoDaddy, I know another major common issue is incorrect file permissions, which prevent DNN from modifying web.config and the other files it needs to.