Access Google Drive with gdrive from the terminal

I am using gdrive to upload files to Google Drive from the terminal. The first time I ran gdrive I used the token to establish the connection and successfully uploaded a file. When I tried a second time with:
$ gdrive upload /home/gigiux/backup/2016-12-25.tar.gz
I got:
Uploading /home/gigiux/backup/2016-12-25.tar.gz
Failed to upload file: Post https://www.googleapis.com/upload/drive/v3/files?alt=json&fields=id%2Cname%2Csize%2Cmd5Checksum%2CwebContentLink&uploadType=multipart: Post https://accounts.google.com/o/oauth2/token: dial tcp: lookup accounts.google.com on 127.0.1.1:53: read udp 127.0.0.1:37601->127.0.1.1:53: i/o timeout
Does this mean that every time I run gdrive I need to reset the connection with Google Drive and get a new token? Or is it a problem with the connection? However, when I run:
$ gdrive list
I get the list of files on Google Drive, so there is indeed a connection and access.
Thank you
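For what it's worth, the error above points at the local DNS resolver (127.0.1.1:53, typically a local dnsmasq) timing out while looking up accounts.google.com, rather than at the OAuth token itself. A minimal way to check name resolution from the same terminal, assuming the standard nslookup and dig tools are available, might be:
# Ask the local resolver for the OAuth endpoint named in the error
nslookup accounts.google.com
# Compare against a public resolver (8.8.8.8 is just an example)
dig @8.8.8.8 accounts.google.com +short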

Related

Keep getting permission prompts when using this AWS download command on my Mac

I was using this download command on my Mac:
aws s3 sync s3://abc .
It works and downloads my files, but it keeps popping up:
Terminal would like to access your photos, your contacts, your calendar ... etc.
Why? Thanks

SFTP To Amazon S3 is failing with error "couldn't close file"

I'm copying a file from my local machine to Amazon S3 using SFTP on the command line. The user is configured to use KMS.
The connection opens successfully and the file transfers, but at the end I get this error and the transfer fails.
The connection is opened, then I cd into the bucket and upload the file with these commands:
sftp <AWS username>@<AWS Host>
cd <s3 bucket name>/<folder>
put myFile.txt
The put transfer gets to 100%, but then the following error is logged:
Uploading myFile.txt to /myS3Bucket/myFolder
myFile.txt 100% 174 4.9KB/s 00:00
Couldn't close file: Permission denied
Do you know what could be causing this, and how to resolve it?
I resolved it with the solution from this question: Unable to read or write any files using AWS Transfer for SFTP when using KMS encryption key.
The user's policy was missing the KMS permissions; KMS encryption had recently been enabled on the bucket.
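As a rough sketch only (the exact KMS actions and the key ARN are assumptions, not taken from the linked answer), the missing piece is usually a statement along these lines in the role the SFTP user maps to:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kms:Decrypt",
        "kms:Encrypt",
        "kms:GenerateDataKey",
        "kms:DescribeKey"
      ],
      "Resource": "arn:aws:kms:<region>:<account-id>:key/<key-id>"
    }
  ]
}
Saved as policy.json, it could be attached with something like:
aws iam put-role-policy --role-name <sftp-user-role> --policy-name AllowKmsForSftp --policy-document file://policy.json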

How to create an interaction between Google Drive and AWS S3?

I'm trying to set up a connection between a Google Drive folder and an S3 bucket, but I'm not sure where to start.
I've already created a sort of "Frankenstein process", but only I can use it easily, and sharing it with my co-workers is a pain.
I have a script that generates a plain text file and saves it into a Drive folder. To upload it, I installed Drive File Stream so the folder is available on my Mac, and then I wrote a Python 3 script using the boto3 library to upload the text file to different S3 buckets depending on the file name.
I was thinking that I could create a Lambda to process the file into the S3 buckets, but I cannot work out how to create the connection between Drive and S3. I would appreciate it if someone could give me a piece of advice on how to start with this.
Thanks
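For the bucket-routing step described above, a minimal shell sketch of the idea (the Drive File Stream path, bucket names, and the prefix convention are all placeholders invented for illustration) could look like:
# Hypothetical Drive File Stream folder on the Mac
SRC="$HOME/Google Drive/exports"
# Route each text file to a bucket based on a prefix in its name
for f in "$SRC"/*.txt; do
  case "$(basename "$f")" in
    sales_*)   aws s3 cp "$f" s3://example-sales-bucket/ ;;
    billing_*) aws s3 cp "$f" s3://example-billing-bucket/ ;;
    *)         aws s3 cp "$f" s3://example-default-bucket/ ;;
  esac
done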
If you simply want to connect Google Drive and AWS S3, there is a service named Zapier which provides different types of integrations without writing a line of code:
https://zapier.com/apps/amazon-s3/integrations/google-drive
For more details, you can check out the link above.

GoReplay - Upload to S3 does not work

I am trying to capture all incoming traffic on a specific port using GoReplay and upload it directly to S3.
I am running a simple file server on port 8000 and a gor instance using this (simple) command:
gor --input-raw :8000 --output-file s3://<MyBucket>/%Y_%m_%d_%H_%M_%S.log
It does create a temporary file at /tmp/, but other than that, it does not upload anything to S3.
Additional information:
The OS is Ubuntu 14.04.
The AWS CLI is installed.
The AWS credentials are defined within the environment.
It seems the information you are providing, or the scenario you explained, is not complete. However, uploading a file from your EC2 machine to S3 is as simple as the command below:
aws s3 cp yourSourceFile s3://your-bucket
To see your file, you can use the command below:
aws s3 ls s3://your-bucket
However, S3 is object storage, and you can't use it for files that are continually being edited, appended to, or updated.
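On the original question: GoReplay's S3 output is generally expected to pick up the standard AWS environment variables, like most AWS-SDK-based tools. As a sketch with placeholder values, exporting them in the same shell that starts gor would look like:
# Placeholder credentials and region; real values come from your AWS account
export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
export AWS_SECRET_ACCESS_KEY=example-secret-key
export AWS_REGION=us-east-1
gor --input-raw :8000 --output-file "s3://<MyBucket>/%Y_%m_%d_%H_%M_%S.log"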

How to upload files directly to Amazon S3 from a remote server?

Is it possible to upload a file to S3 from a remote server?
The remote server is basically a URL-based file server. For example, http://example.com/1.jpg serves the image. It doesn't do anything else, and I can't run code on this server.
Is it possible to have another server tell S3 to upload a file from http://example.com/1.jpg?
upload from http://example.com/1.jpg
server -------------------------------------------> S3 <-----> example.com
If you can't run code on the server or execute requests, then no, you can't do this. You will have to download the file to a server or computer that you own and upload it from there.
You can see the operations you can perform on Amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html
Checking the operations for both the REST and SOAP APIs, you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request, meaning the server or computer that initiates the web request needs to have the data.
I had a similar problem in the past where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server, then upload it to Amazon S3; the whole thing took under 2 seconds. After the upload to S3 was complete, I wrote the bucket/key to a database.
Unfortunately there's no other way to do it.
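As a rough sketch of that download-then-reupload approach (the URL, bucket, and key below are placeholders, and this assumes the AWS CLI is configured on the intermediate machine), the file can even be streamed through without touching the disk:
# "-" tells aws s3 cp to read the object body from stdin
curl -s http://example.com/1.jpg | aws s3 cp - s3://my-example-bucket/1.jpg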
I think the suggestion provided is quite good: you can SCP the file to an instance and push it to the S3 bucket from there. Providing the .pem file gives you passwordless authentication, and via a PHP script you can validate the extensions; the PHP script can pass the file as an argument to the SCP command.
The only problem with this solution is that you must have your instance in AWS. You can't use it if your website is hosted with another hosting provider and you are trying to upload files straight to an S3 bucket.
Technically it's possible using AWS Signature Version 4. Assuming your remote server plays the part of the customer in that flow, you could prepare a form on the main server and send the form fields to the remote server for it to curl. Detailed example here.
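A minimal sketch of what the remote server's curl call might then look like, assuming the main server has already generated the signed POST policy fields (every value below is a placeholder):
# The "file" field must come last; all other fields come from the
# presigned POST policy prepared on the main server.
curl https://my-example-bucket.s3.amazonaws.com/ \
  -F key=uploads/1.jpg \
  -F policy=<base64-encoded-policy> \
  -F x-amz-algorithm=AWS4-HMAC-SHA256 \
  -F "x-amz-credential=<access-key>/<date>/<region>/s3/aws4_request" \
  -F x-amz-date=<timestamp> \
  -F x-amz-signature=<signature> \
  -F file=@1.jpg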
You can use the scp command from the terminal.
1) Using the terminal, go to the directory containing the file you want to transfer to the server.
2) Type this:
scp -i yourAmazonKeypairPath.pem fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:
N.B. Add "ec2-user@" before the ec2-... address you got from the EC2 website! This is such an easy error to make!
3) Your file will be uploaded and the progress will be shown. When it reaches 100%, you are done!