Creating operations in APIM with requests and parameters throws ValidationError - azure-powershell

I am trying to migrate an APIM instance programmatically using Azure PowerShell (I can't use backup/restore in my case).
I am successfully getting APIs, revisions, policies, schemas, etc.
But when I try to create operations with requests, responses, etc., I keep getting this error:
Error Code: ValidationError
Error Message: One or more fields contain incorrect values:
Request Id: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Error Details:
[Code= ValidationError, Message= Operation references schema that does not exist.,
Target= operation 'GET'-'/contract/{contract_id}' response '200' 'application/json'
representation schema xxxxxxxxxxxxxxxxxxxxxxxx]
Basically, I get the operations and schema from the old APIM, insert the schema first, then iterate over the operations and recreate them one by one.
$Operations = Get-AzApiManagementOperation -Context $apimSourceContext -ApiId $API.ApiId
$Schema = Get-AzApiManagementApiSchema -Context $apimSourceContext -ApiId $API.ApiId
if ($Schema) {
    New-AzApiManagementApiSchema -Context $apimTargetContext -ApiId $FullApiId `
        -SchemaDocumentContentType $Schema.SchemaDocumentContentType `
        -SchemaDocument $Schema.SchemaDocument
}
foreach ($Operation in $Operations) {
    New-AzApiManagementOperation -Context $apimTargetContext -ApiId $FullApiId -ApiRevision $API.ApiRevision `
        -Name $Operation.Name -OperationId $Operation.Name -Method $Operation.Method `
        -Description $Operation.Description -UrlTemplate $Operation.UrlTemplate `
        -TemplateParameters $Operation.TemplateParameters -Responses $Operation.Responses `
        -Request $Operation.Request -Verbose -Debug
}
From what I see in the portal, the schema is created successfully and looks identical in every respect to the one in the source APIM.
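One thing I suspect but have not confirmed: the source operations reference the schema by its SchemaId, and the schema recreated in the target presumably gets a new id, so the representations point at an id that does not exist there. A sketch of preserving the id when copying the schema (assuming the object returned by Get-AzApiManagementApiSchema exposes SchemaId):
if ($Schema) {
    # Keep the source schema's id so operation representations that reference it
    # still resolve in the target APIM.
    New-AzApiManagementApiSchema -Context $apimTargetContext -ApiId $FullApiId `
        -SchemaId $Schema.SchemaId `
        -SchemaDocumentContentType $Schema.SchemaDocumentContentType `
        -SchemaDocument $Schema.SchemaDocument
}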

Related

Using a service account and JSON key which is sent to you to upload data into Google Cloud Storage

I wrote a Python script that uploads files from a local folder into Google Cloud Storage.
I also created a service account with sufficient permissions and tested it on my computer using that service account JSON key, and it worked.
Now I have sent the code and JSON key to someone else to run, but the authentication fails on her side.
Are we missing any authentication step in the GCP UI?
import shutil
import subprocess

from google.cloud import storage


def config_gcloud():
    subprocess.run(
        [
            shutil.which("gcloud"),
            "auth",
            "activate-service-account",
            "--key-file",
            CREDENTIALS_LOCATION,
        ]
    )
    storage_client = storage.Client.from_service_account_json(CREDENTIALS_LOCATION)
    return storage_client


def file_upload(bucket, source, destination):
    storage_client = config_gcloud()
    ...
The error happens in config_gcloud and says it is expecting str, path, ... but gets NoneType.
As I said, the code is fine and works on my computer. How can another person use it with the JSON key which I sent her? She stored the JSON locally and the path to the JSON is in the code.
CREDENTIALS_LOCATION is None instead of the correct path, hence the complaint about it being NoneType instead of str | Path.
Also, you don't need that gcloud call; it only matters for gcloud/gsutil commands, not for the Python client library.
And please post the actual stack trace of the error next time, not just a misspelled interpretation of it.
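As a minimal sketch of that fix (the environment variable name and fallback path are placeholders, not from the original script): resolve the key path in a way that works on the other machine and build the client directly, dropping the gcloud call.
import os

from google.cloud import storage

# Placeholder: let the other person point at her copy of the key via an env var,
# falling back to a file next to the script.
CREDENTIALS_LOCATION = os.environ.get(
    "GCS_KEY_FILE",
    os.path.join(os.path.dirname(__file__), "service-account.json"),
)

def config_gcloud():
    if not CREDENTIALS_LOCATION or not os.path.exists(CREDENTIALS_LOCATION):
        raise FileNotFoundError(f"Service account key not found: {CREDENTIALS_LOCATION!r}")
    # No `gcloud auth activate-service-account` needed for the Python client.
    return storage.Client.from_service_account_json(CREDENTIALS_LOCATION)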

How do I create a Pub/Sub to BigQuery subscription directly in GCP? I get an error: failed to create

I am trying to publish data directly from Pub/Sub to BigQuery.
I have created a topic with a schema.
I have created a table.
But when I create the subscription, I get an error: request contains an invalid argument.
gcloud pubsub subscriptions create check-me.httpobs --topic=check-me.httpobs --bigquery-table=agilicus:checkme.httpobs --write-metadata --use-topic-schema
ERROR: Failed to create subscription [projects/agilicus/subscriptions/check-me.httpobs]: Request contains an invalid argument.
ERROR: (gcloud.pubsub.subscriptions.create) Failed to create the following: [check-me.httpobs].
There's not really a lot of diagnostics I can do here.
Is there any worked-out example that shows this? What am I doing wrong to get this error?
Side note: it's really a pain to have to create the BQ schema in its native JSON format and then create the message schema in Avro format. Similar, but different, and no conversion tools that I can find.
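For reference, a rough sketch of what such a conversion could look like for flat, non-nested fields (the BQ-to-Avro type mapping here is my own assumption, not an official tool; RECORD/REPEATED fields and logical types would need more care):
import json

# Assumed mapping from BigQuery field types to Avro primitive types.
BQ_TO_AVRO = {
    "STRING": "string", "BYTES": "bytes",
    "INTEGER": "long", "INT64": "long",
    "FLOAT": "double", "FLOAT64": "double",
    "BOOLEAN": "boolean", "BOOL": "boolean",
    "TIMESTAMP": "string", "DATE": "string", "DATETIME": "string",
}

def bq_schema_to_avro(bq_fields, record_name="Message"):
    fields = []
    for f in bq_fields:
        avro_type = BQ_TO_AVRO[f["type"].upper()]
        if f.get("mode", "NULLABLE") == "NULLABLE":
            avro_type = ["null", avro_type]  # nullable column -> union with null
        fields.append({"name": f["name"], "type": avro_type})
    return {"type": "record", "name": record_name, "fields": fields}

# "bq_schema.json" is a placeholder for the table's schema file (a list of fields).
with open("bq_schema.json") as fh:
    print(json.dumps(bq_schema_to_avro(json.load(fh)), indent=2))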
If I run with --log-http, it doesn't really enlighten:
{
"error": {
"code": 400,
"message": "Request contains an invalid argument.",
"status": "INVALID_ARGUMENT"
}
}
Update: I switched to protobuf, same problem.
https://gist.github.com/donbowman/5ea8f8d8017493cbfa3a9e4f6e736bcc has the details.
gcloud version
Google Cloud SDK 404.0.0
alpha 2022.09.23
beta 2022.09.23
bq 2.0.78
bundled-python3-unix 3.9.12
core 2022.09.23
gsutil 5.14
I have confirmed that all the fields are present and in the correct format, as per https://github.com/googleapis/googleapis/blob/master/google/pubsub/v1/pubsub.proto#L639
specifically:
{"ackDeadlineSeconds": 900, "bigqueryConfig": {"dropUnknownFields": true, "table": "agilicus:checkme.httpobs", "useTopicSchema": true, "writeMetadata": true}, "name": "projects/agilicus/subscriptions/check-me.httpobs", "topic": "projects/agilicus/topics/check-me.httpobs"}
I have also tried using the API Explorer to post this, same effect.
I have also tried using the Python example:
https://cloud.google.com/pubsub/docs/samples/pubsub-create-bigquery-subscription#pubsub_create_bigquery_subscription-python
to create it. Same error, with a little more info (IP, grpc_status 3):
debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B2607:f8b0:400b:807::200a%5D:443 {created_time:"2022-10-04T20:54:44.600831924-04:00", grpc_status:3, grpc_message:"Request contains an invalid argument."}"
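For reference, a minimal sketch along the lines of that sample, using the names from the gcloud command above; note that the linked proto documents the table as {projectId}.{datasetId}.{tableId} (dots rather than a colon), which may be worth double-checking against the value used above:
from google.cloud import pubsub_v1

project_id = "agilicus"
topic_id = "check-me.httpobs"
subscription_id = "check-me.httpobs"
bigquery_table_id = "agilicus.checkme.httpobs"  # {projectId}.{datasetId}.{tableId} per the proto

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path(project_id, topic_id)
subscription_path = subscriber.subscription_path(project_id, subscription_id)

bigquery_config = pubsub_v1.types.BigQueryConfig(
    table=bigquery_table_id,
    use_topic_schema=True,
    write_metadata=True,
)

with subscriber:
    subscription = subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "bigquery_config": bigquery_config,
        }
    )
    print(f"Created BigQuery subscription: {subscription}")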

Data Factory copy pipeline from API

We use an Azure Data Factory copy pipeline to transfer data from REST APIs to an Azure SQL Database, and it is doing some strange things. Because we loop over a set of APIs that need to be transferred, the mapping in the copy activity is left empty.
But for one API the automatic mapping goes wrong. The destination table is created with all the needed columns and the correct data types based on the received metadata, but when we run the pipeline for that specific API, the following message is shown:
{ "errorCode": "2200", "message": "ErrorCode=SchemaMappingFailedInHierarchicalToTabularStage,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to process hierarchical to tabular stage, error message: Ticks must be between DateTime.MinValue.Ticks and DateTime.MaxValue.Ticks.\r\nParameter name: ticks,Source=Microsoft.DataTransfer.ClientLibrary,'", "failureType": "UserError", "target": "Copy data1", "details": [] }
As a test we did the mapping for that API manually by using the "Import Schema" option on the Mapping page; there we see that all the fields are correctly mapped. We executed the pipeline again using that mapping and everything worked fine.
But of course, we don't want to use a manual mapping, because the pipeline is used in a loop for different APIs as well.
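One way to keep the loop generic while still controlling the mapping is to pass a per-API mapping into the copy activity's translator as a parameter. A rough sketch of such a TabularTranslator mapping (the column names and JSON paths are made up for illustration, not taken from the actual APIs):
{
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "path": "$['id']" }, "sink": { "name": "Id", "type": "Int32" } },
        { "source": { "path": "$['modifiedDate']" }, "sink": { "name": "ModifiedDate", "type": "DateTime" } }
    ],
    "collectionReference": "$['value']"
}
This does not explain the error itself; the ticks message usually points at a source date value that falls outside the .NET DateTime range.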

Getting 'Minimum TLS Version' setting of Azure webapp with Az PowerShell

I have a PowerShell script that uses Az PowerShell modules to retrieve properties of all webapps within a resource group. Now, I also need to fetch the MinTlsVersion ('Minimum TLS Version') property. Can I do it using one of the Az modules?
When a call to the Get-AzWebApp command is made in the script, a request is sent to the /subscriptions/<s>/resourceGroups/<rg>/providers/Microsoft.Web/sites endpoint. The response object has the siteConfig property set to null. Is there a way to call Get-AzWebApp such that this property is not null, so I can use the minTlsVersion sub-property of the siteConfig object?
If there's no way to do the above:
I see that the client receives minTlsVersion by sending a GET request to the /subscriptions/<s>/resourceGroups/<rg>/providers/Microsoft.Web/sites/<st>/config/web endpoint. Can we hit the same endpoint using one of the Az PowerShell modules? Though I would prefer a request that can return the minTlsVersion of all webapps in a resource group in a single call.
You need to iterate through each app; try the commands below, it works on my side.
# SiteConfig is only populated when Get-AzWebApp is called for a single app,
# so query each app by name to read MinTlsVersion.
$groupName = "<resource-group-name>"
$apps = Get-AzWebApp -ResourceGroupName $groupName
$names = $apps.Name
foreach ($name in $names) {
    $tls = (Get-AzWebApp -ResourceGroupName $groupName -Name $name).SiteConfig.MinTlsVersion
    Write-Host "minTlsVersion of web app" $name "is" $tls
}

Can BigQuery report the mismatched schema field?

When I upsert a row that mismatches the schema, I get a PartialFailureError along with a message, e.g.:
[ { errors:
[ { message: 'Repeated record added outside of an array.',
reason: 'invalid' } ],
...
]
However, for large rows this isn't sufficient, because I have no idea which field is the one causing the error. The bq command-line tool does report the malformed field.
Is there either a way to configure or access name of the offending field, or can this be added to the API endpoint?
Please see this GitHub issue: https://github.com/googleapis/nodejs-bigquery/issues/70. Apparently the Node.js client library does not read the location field from the API response, so it is not able to return it to the caller.
A workaround that worked for me: I copied the JSON payload into my Postman client and manually sent a request to the REST API (let me know if you need more details of how to do it).
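If a script is more convenient than Postman, a rough sketch of the equivalent REST call (the project, dataset, table and row are placeholders); the raw response carries a location for each error, which names the offending field:
import google.auth
import google.auth.transport.requests
import requests

# Authenticate with application default credentials (placeholder approach).
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/bigquery"]
)
credentials.refresh(google.auth.transport.requests.Request())

url = (
    "https://bigquery.googleapis.com/bigquery/v2/"
    f"projects/{project}/datasets/my_dataset/tables/my_table/insertAll"
)
body = {"rows": [{"json": {"some_field": "some value"}}]}

resp = requests.post(
    url, json=body, headers={"Authorization": f"Bearer {credentials.token}"}
)
# insertErrors[].errors[] include reason, message and location (the field name).
print(resp.json().get("insertErrors"))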