File not uploading to S3 bucket using yii2tech/file-storage extension

I'm trying to upload a file to my Amazon S3 bucket.
My config looks like this:
'fileStorage' => [
    'class' => 'yii2tech\filestorage\amazon\Storage',
    'awsKey' => 'dota2',
    'awsSecretKey' => 'dota2',
    'buckets' => [
        'webfiles' => [
            'fileSubDirTemplate' => '{ext}/{^name}/{^^name}',
            'region' => 'us_e1',
            'acl' => 'public',
        ],
    ],
],
My upload function looks like this:
$bucket = Yii::$app->fileStorage->getBucket('webfiles');
$bucket->copyFileIn('/var/www/html/vasttag/web/', 'samplefinal.mp4');
$result = var_dump($bucket->fileExists('samplefinal.mp4'));
echo $result;
I get the result as false. Yes, I have double-checked my path and my filename; they are perfectly fine. Does anyone know where I'm going wrong?
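For comparison, here is a minimal sketch of how copyFileIn() is typically called, under the assumption that its first argument should be the full path of the source file rather than just its directory (paths are only illustrative):
$bucket = Yii::$app->fileStorage->getBucket('webfiles');
// Pass the full source file path, then the name the file should have inside the bucket
if ($bucket->copyFileIn('/var/www/html/vasttag/web/samplefinal.mp4', 'samplefinal.mp4')) {
    var_dump($bucket->fileExists('samplefinal.mp4')); // should now print bool(true)
}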

Related

LinkedIn Campaign API Link not inserted

I am using the Marketing API to create ads with cURL in PHP.
Creating Campaign Groups and Campaigns this way works perfectly fine. The basic creation of a creative and an ad also works. The only problem is that I can't add a destination URL and some other info like the description and Call-to-Action. But the destination URL, or content Landing Page as it's called in the documentation, really is fundamental.
This is my code for the creative data:
$data = [
    "creative" => [
        "inlineContent" => [
            "post" => [
                "adContext" => [
                    "dscAdAccount" => "urn:li:sponsoredAccount:".$this->params->get('app.linkedin_my_acc_id'),
                    "dscStatus" => "ACTIVE"
                ],
                "author" => "urn:li:organization:".$this->params->get('app.linkedin_my_org_id'),
                "commentary" => $job->getCommentary(),
                "visibility" => "PUBLIC",
                "lifecycleState" => "PUBLISHED",
                "isReshareDisabledByAuthor" => false,
                "contentCallToActionLabel" => "DOWNLOAD",
                "contentLandingPage" => "https://google.com",
                "content" => [
                    "media" => [
                        "title" => "This is a test!",
                        "id" => $image
                    ]
                ]
            ]
        ],
        "campaign" => "urn:li:sponsoredCampaign:".$myCampaignId,
        "intendedStatus" => "ACTIVE"
    ]
];
The structure is the same as in the documentation (https://learn.microsoft.com/en-us/linkedin/marketing/integrations/ads/account-structure/create-and-manage-creatives-new?view=li-lms-2023-01&tabs=http#creatives-inline-schema).
So what am I doing wrong? Why is LinkedIn ignoring my contentLandingPage and contentCallToActionLabel values? The request works and I'm not getting an unpermitted-fields error, so it seems to be correct…
Can someone help? I can also provide the code for the Campaign Group and the Campaign if needed.
Not sure if this is important, but the objectiveType of my Campaign is "WEBSITE_VISIT" and the type is "SPONSORED_UPDATES".
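For context, a minimal sketch of how such a payload might be posted with PHP cURL; the endpoint path, the LinkedIn-Version header value, and the $adAccountId and $accessToken variables are assumptions based on the linked documentation, not taken from the question:
// Hypothetical request against the versioned Creatives endpoint
$ch = curl_init('https://api.linkedin.com/rest/adAccounts/' . $adAccountId . '/creatives');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_POSTFIELDS => json_encode($data),
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => [
        'Authorization: Bearer ' . $accessToken,
        'Content-Type: application/json',
        'LinkedIn-Version: 202301',
        'X-Restli-Protocol-Version: 2.0.0',
    ],
]);
$responseBody = curl_exec($ch);
$statusCode = curl_getinfo($ch, CURLINFO_HTTP_CODE); // inspect this and $responseBody when debugging
curl_close($ch);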

Access Custom S3 Metadata After Completing Multipart Upload

I want to access the custom metadata for an object uploaded via S3 multipart upload after firing off the completeMultipartUpload method.
I initiate a multipart S3 upload with some added custom metadata like so:
$response = $this->client->createMultipartUpload([
    'Bucket' => $this->bucket,
    'Key' => $key,
    'ContentType' => $type,
    'Expires' => 60,
    'Metadata' => [
        'file-guid' => $fileGuid,
    ],
]);
When I complete the multipart upload, I want to access the file-guid metadata and pass it along in my response.
$result = $this->client->completeMultipartUpload([
    'Bucket' => $this->bucket,
    'Key' => $key,
    'UploadId' => $uploadId,
    'MultipartUpload' => [
        'Parts' => $parts,
    ],
]);
$fileGuid = $result['?']; // Couldn't find the metadata in the result.
return response()->json(['file-guid' => $fileGuid]);
I've checked the S3 object after it's been uploaded and it shows the custom metadata, but I don't see how to access it. I assumed it would be part of the completeMultipartUpload response, but I'm not seeing it.
Any help would be appreciated. Thanks!
I found a solution, but it involves an additional request. If anyone knows of a way to access the metadata without making another request, that would be better.
$headObject = $this->client->headObject([
    'Bucket' => $this->bucket,
    'Key' => $key,
]);
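With that extra request, the custom metadata is exposed on the HeadObject result; a minimal sketch of pulling the value back out, assuming the same file-guid key as above:
// User-defined metadata comes back under the 'Metadata' key of the result
$fileGuid = $headObject['Metadata']['file-guid'] ?? null;
return response()->json(['file-guid' => $fileGuid]);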

When I download a file programmatically from an S3 bucket, it gives me the error "It looks like we don't support this file format"

I am downloading a file from an S3 bucket using the AWS SDK. It downloads the file, but when I open that file it says "It looks like we don't support this file format". Here is my full code; can anyone please check it and help me figure out why the image doesn't open? My whole code is in PHP. It looks like a small error, but it isn't working for me.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$bucket = '*****';
$keyname = '1560346461616.jpg';
$s3 = new S3Client([
    'version' => 'latest',
    'region' => '******',
    'credentials' => [
        'key' => '******',
        'secret' => '******',
    ],
]);
try {
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key' => $keyname,
    ]);
    header("Content-Type: {$result['ContentType']}");
    header('Content-Disposition: attachment; filename=' . $keyname);
    echo $result['Body'];
} catch (Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
Check the content type of your .jpg file stored in S3. Ensure the metadata for this object says image/jpeg. I don't think it is related to the object you downloaded; maybe the downloaded file was corrupted.
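A minimal sketch of how that check could be done in code, assuming the same $s3 client, $bucket, and $keyname as above:
// Inspect the stored object's metadata without downloading the body
$head = $s3->headObject([
    'Bucket' => $bucket,
    'Key' => $keyname,
]);
echo $head['ContentType']; // a JPEG should report image/jpeg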

How to set http timeouts for Amazon AWS SDK for PHP

I'm using the Amazon AWS SDK for PHP (namely, version 2.7.16) to upload files to an S3 bucket. How can I set a timeout for HTTP/TCP operations (connection, upload, etc.)? Although I've googled a lot, I wasn't able to find out how.
Sample code I'm using:
$awsS3Client = Aws\S3\S3Client::factory(array(
    'key' => '...',
    'secret' => '...'
));
$awsS3Client->putObject(array(
    'Bucket' => '...',
    'Key' => 'destin/ation.file',
    'ACL' => 'private',
    'Body' => 'content'
));
So I'd like to set a timeout on the putObject() call.
Thanks!
Eventually I worked it out myself:
$awsS3Client = Aws\S3\S3Client::factory(array(
    'key' => '...',
    'secret' => '...',
    'curl.options' => array(
        CURLOPT_CONNECTTIMEOUT => 5,
        CURLOPT_TIMEOUT => 10,
    )
));
It looks like the AWS SDK for PHP uses cURL internally, so network-related options are set this way.
With SDK version 3, this can be configured using the http configuration key:
$awsS3Client = new Aws\S3\S3Client([
    'version' => 'latest',
    'region' => '...',
    'credentials' => [
        'key' => '...',
        'secret' => '...',
    ],
    'http' => [
        'connect_timeout' => 5,
        'timeout' => 10,
    ],
]);
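If the timeout should only apply to a single request rather than the whole client, SDK v3 also accepts per-operation HTTP options through the special @http parameter. A minimal sketch, assuming the same client as above and placeholder bucket/key values:
$awsS3Client->putObject([
    'Bucket' => '...',
    'Key' => 'destin/ation.file',
    'Body' => 'content',
    // Override the client-level HTTP settings for just this request
    '@http' => [
        'connect_timeout' => 5,
        'timeout' => 10,
    ],
]);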

Reading a file from an S3 bucket with Laravel: getting an error

I'm trying to get a file from an S3 bucket using getObject:
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'hotel4cast',
    'Key' => $path->path,
    'SaveAs' => public_path()
));
I'm getting the error below:
Error executing
"GetObject" on "https://s3.amazonaws.com/mybucket/filename.xlsx";
AWS HTTP error: Unable to open /var/www/html/laravel/public/ using mode r+: fopen(/var/www/html/laravel/public/):
failed to open stream: Is a directory
If I take SaveAs out and dump $file, I get an object with data, body, stream, all that stuff, but I'm not sure what to do with it.
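For what it's worth, a minimal sketch of writing the response body to disk without SaveAs (the target filename is only an example):
$file = $s3->getObject(array(
    'Bucket' => 'hotel4cast',
    'Key' => $path->path,
));
// The Body is a stream object; cast it to a string and write it out
file_put_contents(public_path() . '/myfile.xlsx', (string) $file['Body']);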
I have figured it out. There seems to be a bug in the AWS SDK; I was able to get the file to save by opening a write handle in a variable before calling getObject and passing it as SaveAs:
$r = fopen(public_path() . '/myfile.xlsx', 'wb');
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'bucketname',
    'Key' => $path->path,
    'SaveAs' => $r
));
Can you tell me what exactly these equal? Then I can guide you accordingly.
$path->path = ???
public_path() = ???
Edited
Your method params should be like this: you are just passing the SaveAs path without attaching the key name. Add the key name to the SaveAs path and it will be downloaded.
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'hotel4cast',
    'Key' => $path->path,
    'SaveAs' => public_path() . "/filename.xlsx"
));
Here are examples of the code I am using for uploading a file and copying a file.
For uploading:
$result = $this->S3->putObject([
    'ACL' => 'public-read-write',
    'Bucket' => 'xyz', // REQUIRED
    'Key' => 'file.xlsx', // REQUIRED
    'SourceFile' => public_path() . "/xlsx/file.xlsx",
]);
For copying from one bucket to another:
$copy = $this->S3->copyObject(array(
    'ACL' => 'public-read-write',
    'Bucket' => 'xyz', // REQUIRED
    'Key' => 'file.xlsx', // REQUIRED
    'CopySource' => 'mybucketname/xlsx/file.xlsx',
));
But the file that exists in your S3 bucket should have read permission; otherwise it will give you an error on SaveAs, copy, etc.
There are multiple permissions, which you can see here:
'ACL' => 'private|public-read|public-read-write|authenticated-read|aws-exec-read|bucket-owner-read|bucket-owner-full-control',