I'm creating a new MS SQL Connector to an Azure database, following the guide at: https://azure.microsoft.com/en-gb/documentation/articles/app-service-logic-connector-sql
However, the connector is not created because the 'LOCATION' input is missing.
How can I specify this property? Does that depend on the Azure subscription used?
Location refers to the Azure region where you want your connector to reside: East US, East US 2, Central US, etc.
The view you are seeing in the tutorial uses the new Azure portal; the Location is set within the App Service Plan configuration settings.
There is an option to connect to a Cloud SQL for MySQL instance from BigQuery. I just wanted to know how we can connect a Cloud SQL for SQL Server instance to BigQuery.
SQL Server:
There are a number of third-party extensions/tools that provide this service. One of them is SSIS Data Flow Source & Destination for Google BigQuery, a Visual Studio extension that connects SQL Server to Google BigQuery data through SSIS workflows:
https://www.cdata.com/drivers/bigquery/ssis/
https://marketplace.visualstudio.com/items?itemName=CDATASOFTWARE.SSISDataFlowSourceDestinationforGoogleBigQuery
Regarding the use of SQL Server Integration Services to load data from an on-premises SQL Server into BigQuery, you can take a look at this site. You can also perform ETL from a relational database into BigQuery using Cloud Dataflow; the official documentation details how it can be done, and you might need to use Cloud Storage as an intermediate data sink.
Cloud SQL:
BigQuery allows you to query data from Cloud SQL by using federated queries. The connection must be created within the same project where your Cloud SQL instance is located. If you want to query data stored in your Cloud SQL instance from BigQuery in another project, please follow the steps listed below:
Enable the BigQuery API and the BigQuery connection API within your project.
Create a connection to your Cloud SQL instance within the project by following this documentation.
Once you have created the connection, please locate and select it within BigQuery.
Click on the SHARE CONNECTION button and grant permissions to the users who will use that connection. Please note that the BigQuery Connection User role is the only role needed to use a shared connection.
Additionally, please note that the "Cloud SQL federated queries" feature is in a Beta stage and might change or have limited support (it is not available in certain regions, in which case you must use one of the supported options mentioned here). Please remember that, to use Cloud SQL federated queries in BigQuery, the instances need to have a public IP.
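To make the steps above concrete, here is a minimal sketch of what running a federated query through a shared connection might look like with the bq CLI. The project, connection, table, and column names below are hypothetical placeholders, not taken from the original post:

```shell
# Query a Cloud SQL instance from BigQuery through a shared connection.
# "my-project.us.my-connection" (project.location.connection_id) and the
# inner SQL are made-up examples; substitute your own values.
bq query --use_legacy_sql=false '
SELECT *
FROM EXTERNAL_QUERY(
  "my-project.us.my-connection",
  "SELECT id, name FROM customers;"
);'
```

The inner SQL string is executed by the Cloud SQL instance itself, and only the result set is returned to BigQuery, so it should use the dialect of the source database.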
If you are limited, e.g. by region, one good option might be exporting the data from Cloud SQL to Cloud Storage as a CSV, and then loading it into BigQuery. If you need to, it is possible to automate this process using Cloud Composer; refer to this article.
Another approach is to extract the data from Cloud SQL (with exports) and import it into BigQuery through load jobs or streaming inserts.
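The export-and-load path can be sketched with the gcloud and bq CLIs. The instance, bucket, database, dataset, and table names below are hypothetical placeholders, and the schema is illustrative:

```shell
# 1. Export query results from a Cloud SQL instance to Cloud Storage as CSV.
#    "my-instance", "my-bucket", and "mydb" are made-up names.
gcloud sql export csv my-instance gs://my-bucket/customers.csv \
    --database=mydb \
    --query="SELECT id, name, created_at FROM customers"

# 2. Load the CSV from Cloud Storage into a BigQuery table,
#    supplying an inline schema for the columns exported above.
bq load --source_format=CSV \
    mydataset.customers \
    gs://my-bucket/customers.csv \
    id:INTEGER,name:STRING,created_at:TIMESTAMP
```

Note that the Cloud SQL instance's service account needs write access to the bucket for the export step to succeed.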
I hope you find the above pieces of information useful.
It is possible, but be warned that the feature is currently in Beta:
https://cloud.google.com/bigquery/docs/cloud-sql-federated-queries
Azure:
I created an Azure SQL server and two databases on it. Now I am trying to create a SQL elastic pool for the server. I tried creating the server in:
East US
West US
Central US
When I open Compute + storage, I get the message below:
Unable to retrieve the pricing configuration data for this region at this moment. Please retry.
My Resource Group is: Learn-xxxxxxx-xxxx-xxxx-xxxx-xxxxxx
Why does it not allow me to configure an elastic pool?
Maybe it is under maintenance in your region right now. I just tried and everything is OK. Please refresh the portal and try again after a moment.
Hope this helps.
I tried again after a couple of days, and it is working now.
I was able to create the elastic pool in the East US region.
Thanks,
V
I am trying to create a pipeline to copy some data between Azure SQL databases on different servers, but creating a Linked Service using SQL authentication fails (and gives no helpful information, just a dialog box saying it failed). I think the server VMs are in different tenants or different subscriptions (I am not sure of the distinction), so I am guessing that the one I am working in cannot see the one I want the connection to go to. Is that likely, and what needs to be done to make it work? Any advice welcome, including RTFM if you can point me at the right one and it doesn't take weeks to wade through it!
In case anyone hits the same issue: the problem turned out to be the 'encrypted' checkbox in the self-hosted integration runtime (IR). Clearing this flag allowed the IR to see the target database, and the pipeline could then be created with the new connection set to use that IR. @Leon Yue: both databases are Azure SQL instances on Azure PaaS VMs.
I am trying to create an Azure SQL database located in Australia Southeast. The problem is that when I go to create a new server, the Australian locations (and others) are missing from the location drop-down.
Here is what I see on the Azure portal. I can create other resources (Web app) in Australia without a problem.
Any help would be great
Some subscriptions have restrictions on creating Compute and SQL resources in certain regions. The behavior you noticed was expected due to the subscription type and is not a service degradation. We are working on better communicating such region restriction information to make it clearer. We apologize for any confusion this might have caused.
Disclaimer: I am a Program Manager for Microsoft, working on SQL Server
Please have a look here; even the Azure Management portal lists both Australian locations.
We'd like to create services in WSO2's DSS that query LDAP data. The Data Services Server gives the option of creating JNDI-backed data sources, but the data query definition seems to assume that all JNDI data sources use SQL (as evidenced by the query field being labeled SQL).
The old WSO2 forums suggest that it's possible... http://wso2.com/forum/thread/11109
Does anyone have an example?
thx
Liam
As far as I know, you cannot use JNDI to expose LDAP data, as LDAP doesn't expose its data as a database. If you want samples of JNDI data sources and of exposing data sources as JNDI resources, please refer to 1 and 2. If you want to expose your LDAP data and create data services, you can use the Data Services custom data source feature and implement it yourself.