How to access multiple S3 origins (in the same bucket) from a single CloudFront distribution?

I have a cloudfront distribution that is working fine with an S3 origin.
After adding a second origin, I also added a new cache behaviour, so I would get:
first.domain.com: goes to the first origin (via the default * cache behaviour path)
first.domain.com/elsewhere: goes to the new origin (via a new elsewhere/* cache behaviour path)
I feel something may be wrong or missing, but can't tell from the docs what it could be.
After reading these answers:
One
Two
I still can't figure out what is not working. I enabled the S3 logs, but they can take hours to update.
Any help is appreciated!
The error I get after hitting the second URL is:
"response": {
"status": 403,
"statusText": "",
"httpVersion": "http/2.0",
"headers": [
{
"name": "status",
"value": "403"
},
{
"name": "content-type",
"value": "application/xml"
},
{
"name": "date",
"value": "Fri, 17 Aug 2018 03:28:54 GMT"
},
{
"name": "server",
"value": "AmazonS3"
},
{
"name": "x-cache",
"value": "Error from cloudfront"
},
{
"name": "via",
"value": "1.1 275132367c30f17c9825826491390fe3.cloudfront.net (CloudFront)"
},
{
"name": "x-amz-cf-id",
"value": "Ag_JzYYNMVJLMlz9Dd8yDgS1qDCRFlihzlCauDXOE0-fojAPQLQNQQ=="
}
It would seem that the distribution has no access, but I used the same origin access identity (OAI) as with the first origin, I checked that the bucket permissions allow the OAI, and the first origin is working fine.
Maybe it's some slow propagation issue with adding an S3 origin?
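For what it's worth, here is a minimal boto3 sketch of the setup described above; the distribution id, origin id, bucket name, and OAI below are placeholders, not taken from the question. Note that CloudFront forwards the full request path to the origin, so a request for /elsewhere/x.html must map to an object that actually exists under that key at the second origin; S3 answers with a 403 rather than a 404 when the OAI lacks s3:ListBucket, which can look exactly like a permissions problem.

import copy
import boto3

cf = boto3.client("cloudfront")

DIST_ID = "EDFDVBD6EXAMPLE"  # placeholder distribution id
resp = cf.get_distribution_config(Id=DIST_ID)
config, etag = resp["DistributionConfig"], resp["ETag"]

# Add the second origin; the same bucket can back both origins.
config["Origins"]["Items"].append({
    "Id": "second-origin",  # illustrative id
    "DomainName": "my-bucket.s3.amazonaws.com",
    "S3OriginConfig": {"OriginAccessIdentity": "origin-access-identity/cloudfront/E2EXAMPLE"},
})
config["Origins"]["Quantity"] = len(config["Origins"]["Items"])

# Clone the default behaviour so the TTL/forwarding settings stay
# consistent, then point the clone at the new origin for elsewhere/*.
behaviour = copy.deepcopy(config["DefaultCacheBehavior"])
behaviour["PathPattern"] = "elsewhere/*"
behaviour["TargetOriginId"] = "second-origin"
config["CacheBehaviors"] = {"Quantity": 1, "Items": [behaviour]}

cf.update_distribution(Id=DIST_ID, DistributionConfig=config, IfMatch=etag)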

Related

GoDaddy API to create subdomain returns "The given domain is not registered, or does not have a zone file"

I'm trying to use GoDaddy's API to create a subdomain using the following http request:
PATCH /v1/domains/domainName.com/records
Host: api.ote-godaddy.com
Authorization: sso-key API_KEY:API_SECRET
Content-Type: application/json
Content-Length: 100
[
  {
    "data": "111.111.111.111",
    "name": "subdomainName",
    "ttl": 6000,
    "type": "A"
  }
]
but I get the following response:
{
"code": "UNKNOWN_DOMAIN",
"message": "The given domain is not registered, or does not have a zone file"
}
Please change the host name to https://api.godaddy.com; the request will only work against the production URL.
Please generate a production-level API key and secret.
Body: (Raw - Json Type)
[
  {
    "data": "YourServerIp",
    "name": "subdomainName",
    "port": 80,
    "priority": 10,
    "protocol": "string",
    "service": "string",
    "ttl": 600,
    "type": "A"
  }
]
Note:
This only happens when the primary domain already exists on your GoDaddy account.
I figured out that these requests only work using production authorization against the production URL; they won't work using OTE authorization against the OTE URL. Maybe the domain has to be set up as an OTE domain and not a production domain. Not sure.
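For reference, a minimal Python sketch of the same PATCH against the production endpoint (the key, secret, domain, and IP below are placeholders):

import requests

API_KEY = "prod_api_key"        # placeholder
API_SECRET = "prod_api_secret"  # placeholder

resp = requests.patch(
    "https://api.godaddy.com/v1/domains/domainName.com/records",
    headers={
        "Authorization": f"sso-key {API_KEY}:{API_SECRET}",
        "Content-Type": "application/json",
    },
    # Same record shape as in the question.
    json=[{"data": "111.111.111.111", "name": "subdomainName", "ttl": 600, "type": "A"}],
)
print(resp.status_code, resp.text)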

Xero BankTransfers won't show up

I'm using the BankTransfers endpoint to add transfers between bank accounts. It used to work like a charm in the past. I did not make any changes to my code, but transfers suddenly stopped appearing. Xero responds with a 200 code and status OK, but the transfers just won't show up. Also, the TransferID looks like this for some reason:
"BankTransferID": "00000000-0000-0000-0000-000000000000"
ValidationErrors is empty, so it seems that the transfer is accepted as valid, but it won't show up in any of the accounts involved.
The transfer body I use looks like this:
{
  "BankTransfers": [{
    "FromBankAccount": {"Code": transfer_from},
    "ToBankAccount": {"Code": transfer_to},
    "Date": transaction_date.strftime("%Y-%m-%d"),
    "Amount": amount
  }]
}
And response looks like this:
{
  "Id": "1d28fdb6-cadf-4f4c-9801-55b47567e87d",
  "Status": "OK",
  "ProviderName": app_name_hidden,
  "DateTimeUTC": "\/Date(1628847828463)\/",
  "BankTransfers": [
    {
      "BankTransferID": "00000000-0000-0000-0000-000000000000",
      "DateString": "2021-08-13T00:00:00",
      "Date": "\/Date(1628812800000+0000)\/",
      "FromBankAccount": {
        "AccountID": account_id_hidden,
        "Code": "1057",
        "Name": account_name
      },
      "ToBankAccount": {
        "AccountID": account_id_hidden_2,
        "Code": "1073",
        "Name": account_name_2
      },
      "Amount": 1000.00,
      "FromBankTransactionID": "00000000-0000-0000-0000-000000000000",
      "ToBankTransactionID": "00000000-0000-0000-0000-000000000000",
      "CurrencyRate": 1.0000000000,
      "ValidationErrors": []
    }
  ]
}
Did anyone face the same issue? Would appreciate any suggestions.
Our developers made a release that fixes this issue.
Apologies for the inconvenience caused.
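For context, posting this payload from Python might look like the sketch below, assuming OAuth2 against the standard Accounting API endpoint (the token, tenant id, and account codes are placeholders):

from datetime import date

import requests

ACCESS_TOKEN = "xero_access_token"  # placeholder
TENANT_ID = "xero_tenant_id"        # placeholder

body = {
    "BankTransfers": [{
        "FromBankAccount": {"Code": "1057"},
        "ToBankAccount": {"Code": "1073"},
        "Date": date.today().strftime("%Y-%m-%d"),
        "Amount": 1000.00,
    }]
}

resp = requests.post(
    "https://api.xero.com/api.xro/2.0/BankTransfers",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Xero-tenant-id": TENANT_ID,
        "Accept": "application/json",  # ask for JSON rather than XML
    },
    json=body,
)
print(resp.status_code, resp.json())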

Ruckus SmartZone API

I am having issues when trying to create a Zone using the API.
I can create the zone with the basic info, but as soon as I add another property (specifically "location") I get an error.
This is the dataset I use for the POST:
def id_prov = {
  "domainId": "$DomainId",
  "name": "$ZoneName",
  "login": {
    "apLoginName": "xxxxx",
    "apLoginPassword": "xxxxx"
  },
  "description": "$jira_summ",
  "version": "3.5.1.0.1010",
  "countryCode": "ZA"
  "location": "$CalledStationName_val",
}
The API creates everything until I either include the "location" property in the original POST or try a PUT or PATCH afterwards.
Result value:
{"message":["object instance has properties which are not allowed by the schema: [\"location\"]"],"errorCode":101,"errorType":"Bad HTTP request"}
Anyone come across this or have any ideas on how to get this working?
Thanks
A comma is required after "countryCode": "ZA". The POST payload should look like this:
def id_prov = {
  "domainId": "$DomainId",
  "name": "$ZoneName",
  "login": {
    "apLoginName": "xxxxx",
    "apLoginPassword": "xxxxx"
  },
  "description": "$jira_summ",
  "version": "3.5.1.0.1010",
  "countryCode": "ZA",
  "location": "$CalledStationName_val",
}
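As a side note, this class of error is easy to catch before hitting the API by running the rendered payload through a JSON parser; a minimal sketch (assuming the payload is rendered to a plain JSON string first):

import json

payload = '''{
  "countryCode": "ZA"
  "location": "some location"
}'''

try:
    json.loads(payload)
except json.JSONDecodeError as e:
    # Reports the missing comma, e.g.:
    # Expecting ',' delimiter: line 3 column 3
    print(e)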

Error loading file stored in Google Cloud Storage to Big Query

I have been trying to create a job to load a compressed JSON file from Google Cloud Storage into a Google BigQuery table. I have read/write access to both Google Cloud Storage and Google BigQuery, and the uploaded file belongs to the same project as the BigQuery table.
The problem happens when I send a POST request to the resource at https://www.googleapis.com/upload/bigquery/v2/projects/NUMERIC_ID/jobs. The body of the request is as follows:
{
  "kind": "bigquery#job",
  "projectId": NUMERIC_ID,
  "configuration": {
    "load": {
      "sourceUris": ["gs://bucket_name/document.json.gz"],
      "schema": {
        "fields": [
          { "name": "id", "type": "INTEGER" },
          { "name": "date", "type": "TIMESTAMP" },
          { "name": "user_agent", "type": "STRING" },
          { "name": "queried_key", "type": "STRING" },
          { "name": "user_country", "type": "STRING" },
          { "name": "duration", "type": "INTEGER" },
          { "name": "target", "type": "STRING" }
        ]
      },
      "destinationTable": {
        "datasetId": "DATASET_NAME",
        "projectId": NUMERIC_ID,
        "tableId": "TABLE_ID"
      }
    }
  }
}
However, I get the following error, which doesn't make sense to me:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalid",
        "message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
      }
    ],
    "code": 400,
    "message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
  }
}
I know the problem lies neither in the project id nor in the access token in the authorization header, because I have successfully created an empty table before. I also set the Content-Type header to application/json, which I don't think is the issue here, because the body content is JSON-encoded.
Thanks in advance
Your HTTP request is malformed -- BigQuery doesn't recognize this as a load job at all.
You need to look into the POST request and check the body you send.
Ensure that all of the above (which seems correct) is the body of the POST call. The JSON should be on a single line, and if you are manually creating the multipart message, make sure there is an extra newline between the headers and body of each MIME part.
If you are using some sort of library, make sure the body is not expected in some other form, like resource, content, or body. I've seen libraries that use these differently.
Try out the BigQuery API explorer: https://developers.google.com/bigquery/docs/reference/v2/jobs/insert and ensure your request body matches the one made by the API.
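Since the question posts to the /upload/ endpoint, which expects a multipart media upload, one way to rule out multipart formatting as the culprit is to insert the job through the plain jobs endpoint, where the configuration is the entire JSON body. A sketch, with the project id and token as placeholders and the schema fields abbreviated:

import requests

PROJECT_ID = "NUMERIC_ID"   # placeholder
ACCESS_TOKEN = "ya29.xxxx"  # placeholder OAuth2 token

job = {
    "configuration": {
        "load": {
            "sourceUris": ["gs://bucket_name/document.json.gz"],
            "schema": {"fields": [
                {"name": "id", "type": "INTEGER"},
                {"name": "date", "type": "TIMESTAMP"},
                # ... remaining fields as in the question
            ]},
            "destinationTable": {
                "projectId": PROJECT_ID,
                "datasetId": "DATASET_NAME",
                "tableId": "TABLE_ID",
            },
        }
    }
}

resp = requests.post(
    f"https://www.googleapis.com/bigquery/v2/projects/{PROJECT_ID}/jobs",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    json=job,
)
print(resp.status_code, resp.json())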

BigCommerce API - What is correct resource for updating an option value

I'm trying to update an option value using the BigCommerce API.
The documentation says PUT /options/values/id.json
The console says PUT options/id/values.json
I think it should be PUT options/id/values/id.json, which returns a 200 response code but does not execute the update.
Any information on what the right endpoint is and whether it works?
Basically, if you do a GET request on options:
{
  "id": 3,
  "name": "Colors",
  "display_name": "Color",
  "type": "CS",
  "values": {
    "url": "https://store-xxx.mybigcommerce.com/api/v2/options/3/values.json",
    "resource": "/options/3/values"
  }
}
The resource endpoint shows that the URL is options/id/values.json. But this gives you all the values associated with the option. If you want to retrieve a specific option value, the endpoint is something like /api/v2/options/3/values/7.json:
{
  "id": 7,
  "option_id": 3,
  "label": "Silver",
  "sort_order": 1,
  "value": "#cccccc"
}
Doing a PUT request on this endpoint (in a REST console, setting the Content-Type header to application/json and sending raw JSON data) updates the label (changed "Silver" to "silver"):
{
  "id": 7,
  "option_id": 3,
  "label": "silver",
  "sort_order": 1,
  "value": "#cccccc"
}
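From Python, the same update might look like the sketch below (the store URL, user, and token are placeholders; this assumes the legacy v2 basic-auth scheme implied by the URLs above):

import requests

STORE = "https://store-xxx.mybigcommerce.com"
AUTH = ("api_user", "api_token")  # placeholders for v2 basic auth

resp = requests.put(
    f"{STORE}/api/v2/options/3/values/7.json",
    auth=AUTH,
    headers={"Content-Type": "application/json"},
    json={"label": "silver"},  # the field being updated
)
print(resp.status_code, resp.json())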