Azure SQL Elastic Jobs & Data Tier Application (DACPAC) - azure-sql-database

I'm trying to get a DACPAC containing schema changes deployed to a target group in an Elastic Jobs setup in Azure. I've done the tutorial and my test jobs are executing properly. Now I want to run a DACPAC against my target group, but I'm starting to realize that the documentation is for an older PowerShell module, ElasticDatabaseJobs, specifically the Set-AzureSqlJobContentDefinition command. I don't see any analogous command in the Az.Sql documentation, nor do I see any mention of it in the T-SQL tutorial.
What is the current way to associate a DACPAC file with an Elastic Job? Is this being phased out? Should I use the old PowerShell module? Am I missing something? Help!

The current Azure-hosted version of Elastic Database Jobs (https://learn.microsoft.com/en-us/azure/sql-database/sql-database-job-automation-overview#elastic-database-jobs-preview) does not support DACPAC deployment. It is not a tested/supported scenario. Please use PowerShell/T-SQL scripts for your jobs.
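[Editor's note] Since the job agent runs T-SQL, one workaround is to express the schema change as an idempotent T-SQL script and register it as a job step against the target group. Below is a minimal sketch driven from Python with pyodbc; the connection string, credential, job name, and target group name are all placeholders, while jobs.sp_add_job, jobs.sp_add_jobstep, and jobs.sp_start_job come from the Elastic Jobs T-SQL reference.

# A sketch only: register a T-SQL schema change as an Elastic Job step
# instead of a DACPAC. Server, database, credentials, job and target
# group names are placeholders.
import pyodbc

# Connect to the job database that backs the Elastic Job agent.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=jobdb;UID=jobuser;PWD=<password>",
    autocommit=True,
)
cur = conn.cursor()

# The schema change you would otherwise have packed into the DACPAC,
# written to be idempotent so re-runs against a target are safe.
schema_change = """
IF COL_LENGTH('dbo.Orders', 'Notes') IS NULL
    ALTER TABLE dbo.Orders ADD Notes nvarchar(max) NULL;
"""

cur.execute("EXEC jobs.sp_add_job @job_name = N'DeploySchemaChange'")
cur.execute(
    "EXEC jobs.sp_add_jobstep @job_name = N'DeploySchemaChange', "
    "@command = ?, "
    "@credential_name = N'myjobcred', "
    "@target_group_name = N'MyServerGroup'",
    schema_change,
)
cur.execute("EXEC jobs.sp_start_job @job_name = N'DeploySchemaChange'")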
Thanks,
-- Srini Acharya

Related

Migrating from an Astronomer Airflow setup to MWAA

We are currently using the Apache Airflow platform provided by Astronomer, with the Kubernetes executor and PostgreSQL as the backend DB. Now we want to move to MWAA, and I would appreciate help from anyone who has done the same.
How do we migrate existing DAGs and metadata to MWAA?
What do we need to take care of before the migration?
Is there any way to migrate everything from the current setup to MWAA?
Please help if anyone has any idea about this.
Here's the official migration guide for MWAA from AWS: Migrating to a new Amazon MWAA environment. The documentation covers how to migrate metadata as well as key considerations.
Please give it a read through. If you have a more specific question, feel free to post a new question.
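One concrete piece of that guide: DAG files are simply copied to the MWAA environment's S3 bucket, but variables and connections must be exported from the existing metadata DB and re-created in MWAA. Here is a rough sketch of the export half, assuming it runs inside the current Astronomer environment where Airflow's ORM session is available; the output file name is arbitrary.

# A sketch: dump Airflow variables and connections to JSON so they can
# be re-created in MWAA. Run inside the existing environment; handle
# passwords/extras via something secure (e.g. AWS Secrets Manager).
import json

from airflow import settings
from airflow.models import Connection, Variable

session = settings.Session()

variables = {v.key: v.val for v in session.query(Variable)}
connections = [
    {
        "conn_id": c.conn_id,
        "conn_type": c.conn_type,
        "host": c.host,
        "login": c.login,
        "port": c.port,
    }
    for c in session.query(Connection)
]

with open("airflow_metadata_export.json", "w") as f:
    json.dump({"variables": variables, "connections": connections}, f, indent=2)

The exported JSON can then be re-created on the MWAA side through the Airflow CLI endpoint that MWAA exposes, which the AWS guide walks through.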

How do I create a SQL connection to my app and upload it to Google Cloud

Thanks for getting back to me, and sorry for the late reply; it was bedtime here. I need to connect the Cloud SQL database that I have created to my application in App Engine. I tried to follow the online tutorials, but when I apply that information and then run gcloud app deploy, it returns a connection error. Please help.
Also, please clarify: when I execute the gcloud app deploy command, I assume it uploads my local files to Google Cloud, where I would see the entire folder and file structure of my project; but I am still seeing the old version of my project even though it has changed to the latest version locally.
One last thing: how can I link a domain name from http://domain.google.com to my app on http://cloud.google.com? Please help, I am dying of stress; I have been trying for a while.
Given that you haven't provided any information about what settings you are using or what error message you received, it is impossible to know what kind of problem you are running into.
I suggest taking a look at the "Connecting to App Engine" page here. It should answer a lot of your questions around connecting from an App Engine app.
I see two questions here.
1.
"I need to connect the Cloud SQL database that I have created to my application in App Engine. I tried to follow the online tutorials, but when I apply that information and then run gcloud app deploy, it returns a connection error. Also, when I execute the gcloud app deploy command, I assume it uploads my local files to Google Cloud, where I would see the entire folder and files of my project, but I am seeing the old version of my project."
Your problem here looks to be Cloud SQL and GAE connectivity. The steps vary depending on whether you use GAE Standard or Flex and Cloud SQL MySQL or Postgres, but the documentation is quite clear on this; a sketch of the Standard + MySQL case follows below.
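For GAE Standard with Cloud SQL MySQL, the documented pattern is to connect through the /cloudsql unix socket. A minimal sketch, assuming PyMySQL; the instance connection name, user, password, and database are placeholders:

# A sketch: App Engine Standard connecting to Cloud SQL (MySQL) over
# the /cloudsql unix socket. Instance connection name, user, password
# and database are placeholders.
import pymysql

connection = pymysql.connect(
    unix_socket="/cloudsql/my-project:us-central1:my-instance",
    user="dbuser",
    password="dbpassword",
    database="mydb",
)

with connection.cursor() as cursor:
    cursor.execute("SELECT 1")
    print(cursor.fetchone())

For Postgres, the socket path becomes /cloudsql/<INSTANCE_CONNECTION_NAME>/.s.PGSQL.5432 and a Postgres driver such as pg8000 or psycopg2 is used instead.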
2.
"Also, one last thing: how can I link a domain name from http://domain.google.com to my app on http://cloud.google.com?"
This is going to be super simple: go to the GCP Cloud Console, navigate to GAE --> Settings --> Custom Domains, and click "Add a custom domain". Enter the domain name you want to link; when you click Continue, you will be shown the steps for verifying domain ownership and pointing the DNS to GAE.
Documented properly by GCP folks at https://cloud.google.com/appengine/docs/standard/python/mapping-custom-domains
If you are using GAE Standard or Flex, a possible result of the gcloud app deploy command is: "An app.yaml (or appengine-web.xml) file is required to deploy this directory as an App Engine App". Check these links:
https://cloud.google.com/appengine/docs/flexible/python/configuring-your-app-with-app-yaml
https://cloud.google.com/appengine/docs/flexible/python/writing-application-logs
MySQL and Postgres connection:
https://cloud.google.com/sql/docs/mysql/connect-app-engine
https://cloud.google.com/sql/docs/postgres/connect-app-engine
Sometimes it is easiest to share the app.yaml so the app can be replicated correctly.

Azure Data Factory with Integration Runtime - Delete (or move) file after copy

I have an on-premises server with the Microsoft Integration Runtime installed.
In Azure Data Factory V2 I created a pipeline that copies files from the on-premises server to a blob storage.
After a successful transfer I need to delete the files on the on-premises server. I am not able to find a solution for this in the documentation. How can this be achieved?
Azure Data Factory recently introduced a Delete activity that deletes files or folders from on-premises storage stores or cloud storage stores; a sketch of wiring it after the copy follows below.
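If you define pipelines in code rather than in the portal, the Delete activity can be chained so it only runs when the copy succeeds. A rough sketch with the azure-mgmt-datafactory Python SDK; the resource group, factory, pipeline, copy activity ("CopyToBlob"), and dataset ("OnPremFiles") names are placeholders for things assumed to exist already.

# A sketch: append a Delete activity that only runs after a successful
# copy. Resource group, factory, pipeline, activity and dataset names
# are placeholders assumed to exist already.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ActivityDependency,
    DatasetReference,
    DeleteActivity,
)

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

delete_step = DeleteActivity(
    name="DeleteSourceFiles",
    dataset=DatasetReference(reference_name="OnPremFiles"),
    # Run only if the copy activity succeeded.
    depends_on=[
        ActivityDependency(
            activity="CopyToBlob", dependency_conditions=["Succeeded"]
        )
    ],
)

# Fetch the existing pipeline, append the activity, and push it back.
pipeline = client.pipelines.get("my-rg", "my-factory", "copy-pipeline")
pipeline.activities.append(delete_step)
client.pipelines.create_or_update("my-rg", "my-factory", "copy-pipeline", pipeline)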
You also have the option to call Azure Automation using webhooks, with the Web activity. In Azure Automation you can program a PowerShell or Python script with a Hybrid Runbook Worker to delete the file from the on-premises server. You can read more on this here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
Another, easier option would be to schedule a script on the server with the Windows Task Scheduler that deletes the file. Make sure you schedule the script to run after Data Factory has copied the files to the blob, and that's it!
Hope this helped!
If you are simply moving the file, then you can use a Binary dataset in a copy activity. This combination makes a checkbox setting visible that, when enabled, will automatically delete the source file once the copy operation completes. This is a little nicer because you do not need the extra Delete activity, and the file is "moved" only if the copy operation succeeds.

Plesk Migration without Migration Manager

I've got a problem migrating a user from one server to another.
I tried to use the Migration Manager, but when I start it, the migration begins and then finishes after 2 seconds, and no migration has been done.
What can I do? Is there anything I can do, or should I move the data manually?
You can try the Plesk Mass Transfer Script.
The Plesk Mass Transfer Script (formerly the Mass Migration Script) is designed to let providers transfer accounts from one Plesk farm to another in an automated way.
The script creates a migration session for each domain only if you run mmigration.php with the '--per-domain' option; by default, a single migration session is created.
You can find more details and scenarios here: http://kb.sp.parallels.com/en/113283
Okay, I solved the problem. I had to increase the number of free domains!

AppEngine Backup from one app to another

I can't seem to restore my AppEngine backups to a new app as listed in the documentation.
We are using the cron backup as listed in the documentation.
I get through all the stages to launch the restore job successfully, but when it kicks off, all the shards fail with 503 errors.
I tried this with multiple backup files and the experience is the same.
Any advice?
(Java runtime)
I'm posting this hoping it will help someone, as there is a real lack of resources in Google's documentation and on the web in general about this.
While the App Engine documentation says this can be done, I actually found the piece of code that forbids it inside the datastore_admin app.
I managed to connect through the Python remote-api shell, read an entity from the backup, and tried saving it to the datastore, but the datastore.Put(entity) operation yielded "BadRequestError: app s~app_a cannot access app s~app_b's data", so the restriction seems to live at an even lower level.
In the end, I decided to restore only a specific namespace to the same app, which was also a tedious task, but it did save the day.
I managed to pull my backup down locally through gsutil, enabled the Python remote API on my app, accessed the interactive shell, and wrote this script:
https://gist.github.com/Shuky/ed8728f8eb6187475b9a
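[Editor's note] In case the gist link ever rots, the outline of the approach is roughly the following. This is only a sketch for the legacy Python 2 runtime; the app hostname, credentials, namespace, and the entities_from_backup iterable are all placeholders.

# A sketch for the legacy Python 2 runtime: re-save entities from a
# backup into one namespace of the same app via the remote API. App
# hostname, namespace and entities_from_backup are placeholders.
from google.appengine.api import datastore, namespace_manager
from google.appengine.ext.remote_api import remote_api_stub


def auth_func():
    # Placeholder credentials for the remote API handshake.
    return ("admin@example.com", "app-specific-password")


remote_api_stub.ConfigureRemoteApi(
    None, "/_ah/remote_api", auth_func, "myapp.appspot.com"
)

# Work inside the namespace being restored.
namespace_manager.set_namespace("tenant-a")

for entity in entities_from_backup:  # placeholder: parsed backup records
    datastore.Put(entity)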
Hope this helps.
Shuky