Terraform unable to import multiple aws_s3_object keys - amazon-s3

I'm trying to import aws_s3_object keys that were created manually into Terraform, but it seems there is no way to do this when you have many folders/keys in the S3 bucket.
If anyone has experienced the same issue and found a solution, please help me out.

Related

Looking for examples on Airflow GCSToS3Operator. Thanks

I am trying to send a file from a GCS bucket to an S3 bucket using Airflow. I came across this article https://medium.com/apache-airflow/generic-airflow-transfers-made-easy-5fe8e5e7d2c2 but I am looking for specific code implementations and examples that also explain the requirements for this. I am a newbie to Airflow and GCP.
Astronomer is a good place to start; see their registry doc for GCSToS3Operator. It covers the dependencies, an explanation of each parameter, and links to examples.
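To make that concrete, here is a minimal DAG sketch rather than a definitive implementation. It assumes Airflow 2.4+ with the apache-airflow-providers-amazon package installed, GCP and AWS connections named google_cloud_default and aws_default, and placeholder bucket names. Also note the source-bucket parameter is called bucket in older provider releases and gcs_bucket in newer ones, so check the registry doc for the version you have installed.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator

with DAG(
    dag_id="gcs_to_s3_example",       # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,                    # trigger manually while testing
    catchup=False,
) as dag:
    copy_gcs_to_s3 = GCSToS3Operator(
        task_id="copy_gcs_to_s3",
        bucket="my-gcs-bucket",                    # source GCS bucket (placeholder)
        prefix="exports/",                         # only copy objects under this prefix
        dest_s3_key="s3://my-s3-bucket/exports/",  # destination S3 prefix (placeholder)
        gcp_conn_id="google_cloud_default",
        dest_aws_conn_id="aws_default",
        replace=True,                              # overwrite keys that already exist in S3
    )

The operator lists the objects under the given prefix in the GCS bucket and copies each one under the dest_s3_key prefix in S3, using the two connections for authentication.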

Read parquet data from S3 bucket using NiFi

Hi guys! I'm just starting to learn NiFi, so please bear with me and point me in the right direction. I need to read parquet data from an S3 bucket, but I don't understand how to set up the ListS3 and FetchS3Object processors to read the data.
The full path looks like this:
s3://inbox/prod/export/date=2022-01-07/user=100/
2022-01-09 06:51:23 23322557 cro.parquet
I"ll write data to sql database - I don"t have problems with it.
I tried to configure the lists3 processor myself and I think is not very good
bucket inbox
aws_access_key_id
aws_secret_access_key
region US EAST
endpoint override URL http://s3.wi-fi.ru:8080
What I would do is test the Access Key ID and Secret Access Key outside of NiFi to make sure they actually work. If they do, then it's an issue with the NiFi configuration. If they don't, getting new values that work and providing them to NiFi gives it a much better shot of working.
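One quick way to run that check outside of NiFi is a few lines of boto3 - a sketch that reuses the bucket, region and endpoint override from the question; the key placeholders are yours to fill in and the prefix is just the path shown above:

import boto3

# The same settings given to the ListS3 processor, tested directly against the endpoint.
s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    region_name="us-east-1",
    endpoint_url="http://s3.wi-fi.ru:8080",
)

# If the credentials and endpoint are good, this lists the parquet files under the prefix.
resp = s3.list_objects_v2(Bucket="inbox", Prefix="prod/export/date=2022-01-07/user=100/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

If this prints cro.parquet, the credentials and endpoint are fine and the remaining problem is in the ListS3/FetchS3Object configuration itself.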

terraform reference existing s3 bucket and dynamo table

From my Terraform script, I am trying to get hold of data for existing resources, such as the ARN of an existing DynamoDB table and the bucket ID of an existing S3 bucket. I've tried to use terraform_remote_state for S3, however it doesn't fit my requirements as it requires a key, and I haven't found anything yet that would work for DynamoDB.
Is there a solution that would work for both, or would there be two separate solutions?
Many thanks in advance.
Remote state is not the concept you need - that's for storage of the tfstate file. What you require is a "data source":
https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/s3_bucket
https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/dynamodb_table
In Terraform, you use "Resources" to declare what things need to be created (if they don't exist), and "Data Sources" to read information from things that already exist and are not managed by Terraform.
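For example, a minimal sketch - the bucket and table names below are placeholders, not values from the question:

data "aws_s3_bucket" "existing" {
  bucket = "my-existing-bucket" # placeholder name
}

data "aws_dynamodb_table" "existing" {
  name = "my-existing-table" # placeholder name
}

output "bucket_id" {
  value = data.aws_s3_bucket.existing.id
}

output "dynamodb_table_arn" {
  value = data.aws_dynamodb_table.existing.arn
}

Anything the data sources export (id, arn, and so on) can then be referenced from your own resources in the same way.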

Google BigQuery Cloud Storage path error

I am brand new to Google BigQuery, so apologies if this is obvious.
I am simply trying to test the product out right now. I am able to upload a 5 MB file without any issues.
When I move to a 10 MB+ file via Google Cloud Storage, I have no luck.
I have a bucket named teststir and a file named verify_sift.csv.
When I try to create a new data set, I select Google Cloud Storage and put:
gs://teststir/verify_sift.csv as the path.
Unfortunately the job keeps failing:
Not found: URI gs://teststir/verify_sift.csv
(I have triple checked names, tried multiple files but no luck). Am I missing something obvious? Thank you for your help!
I've encountered the same problem.
I just created a new bucket in another location and that fixed it :)
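One common cause of that "Not found" error is a location mismatch: the Cloud Storage bucket and the BigQuery dataset you load into have to be in compatible locations. A minimal sketch of how to check and line them up, assuming the google-cloud-storage and google-cloud-bigquery Python clients and a hypothetical dataset named test_dataset:

from google.cloud import bigquery, storage

# Look up where the bucket actually lives.
gcs = storage.Client()
bucket_location = gcs.get_bucket("teststir").location  # e.g. "US" or "EU"
print("Bucket location:", bucket_location)

# Create the dataset in the same location so gs:// loads can find the file.
bq = bigquery.Client()
dataset = bigquery.Dataset(f"{bq.project}.test_dataset")  # hypothetical dataset name
dataset.location = bucket_location
bq.create_dataset(dataset, exists_ok=True)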

BigQuery import from Cloud Storage issue

I have uploaded my data sets into Google Cloud Storage and am trying to import them into BigQuery tables. I get an error declaring that the location of my data is not the "path" as shown in the Google Cloud Storage browser: "55555/M04Q1%20Query.txt"
That's my bucket and my file... so something is missing.
Ideas?
I'm a little bit confused by what you're asking. The import path, if you're importing from a Google Cloud Storage path, should look like "gs://bucket/object". Can you give more information about the request you are sending, how you are sending it, and the error you are getting?
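To illustrate that format, here is a minimal load sketch using the google-cloud-bigquery Python client; the bucket, object and table names are placeholders rather than values from the question:

from google.cloud import bigquery

client = bigquery.Client()

# The source URI must use the gs://bucket/object form, not the bare
# "bucket/object" path shown in the Cloud Storage browser.
source_uri = "gs://my-bucket/my-file.csv"    # placeholder bucket/object
table_id = "my-project.my_dataset.my_table"  # placeholder destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,      # let BigQuery infer the schema for a quick test
    skip_leading_rows=1,  # assumes the file has a header row
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # wait for the job and raise if it failed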