laravel download excel file from s3 is not working - amazon-s3

I am using Laravel 5.1 and I need to download an Excel file from Amazon S3 storage. I have set everything up correctly: a PDF file downloads fine, but an Excel file comes down as a zip archive instead. Can anyone please help me fix this issue?
$filePath = "CIS/download/format/upload_format.xlsx";
if (Storage::disk('s3')->has($filePath)) {
    $file = Storage::disk('s3')->get($filePath);
    $getMimeType = Storage::disk('s3')->getMimetype($filePath);
    return Response::make($file, 200, [
        'Content-Type'        => $getMimeType,
        'Content-Disposition' => 'attachment; filename="upload_format.xlsx"',
    ]);
}

You can set Content-Type to the desired MIME type and Content-Disposition to 'attachment': the files are coming from S3 and you want them downloaded as an attachment.
$event_data = $this->ticket->where('user_id', $user_id)->first();
$data = $event_data->pdf;
$get_ticket = 'tickets/' . $data;
$file_name = "YOUR_DESIRED_NAME.csv";
$headers = [
    'Content-Type'        => 'text/csv',
    'Content-Disposition' => 'attachment; filename="' . $file_name . '"',
];
return \Response::make(Storage::disk('s3')->get($get_ticket), 200, $headers);
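Since .xlsx files are ZIP containers, a generic MIME lookup can report them as application/zip, which is exactly the symptom in the question. A hedged alternative is to map the extension to an explicit MIME type yourself instead of asking S3; `mimeTypeFor` below is a hypothetical helper, not part of Laravel:

```php
// Hypothetical helper: .xlsx is a ZIP container, so a generic MIME guess can
// yield application/zip. Map the extension to an explicit MIME type instead.
function mimeTypeFor($path)
{
    $map = [
        'xlsx' => 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
        'xls'  => 'application/vnd.ms-excel',
        'csv'  => 'text/csv',
        'pdf'  => 'application/pdf',
    ];
    $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
    return isset($map[$ext]) ? $map[$ext] : 'application/octet-stream';
}

// In the controller, use it in place of getMimetype():
// return Response::make($file, 200, [
//     'Content-Type'        => mimeTypeFor($filePath),
//     'Content-Disposition' => 'attachment; filename="upload_format.xlsx"',
// ]);
```

The quoted filename in Content-Disposition also matters: without `filename="..."` some browsers fall back to guessing the name and type.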

Related

When i download file programmatically from s3 bucket, It gives me error It looks like we don't support this file format

When I download a file from an S3 bucket using the aws-sdk, the file downloads, but when I open it I get "It looks like we don't support this file format". Here is my full code (all in PHP); can anyone check it and help me work out why the image won't open? It looks like a small error, but I can't find it.
try {
    $bucket = '*****';
    $keyname = '1560346461616.jpg';
    $s3 = new S3Client([
        'version' => 'latest',
        'region' => '******',
        'credentials' => [
            'key' => '******',
            'secret' => '******',
        ],
    ]);
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key' => $keyname,
    ]);
    header("Content-Type: {$result['ContentType']}");
    header('Content-Disposition: attachment; filename=' . $keyname);
    echo $result['Body'];
} catch (Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
Check the content type of your .jpg file stored in S3 and ensure the metadata for the object says image/jpeg. I don't think the problem is in the object you downloaded; maybe the downloaded file was corrupted.
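One way to verify the suggestion above without downloading the body is a headObject call (aws-sdk-php v3; `$s3`, `$bucket` and `$keyname` are the variables from the question, so this is a sketch, not tested against a live bucket):

```php
// Sketch: inspect the stored metadata only. For a real .jpg the stored
// ContentType should read image/jpeg; anything else (e.g. binary/octet-stream)
// would explain the "unsupported file format" error on open.
$head = $s3->headObject([
    'Bucket' => $bucket,
    'Key'    => $keyname,
]);
echo $head['ContentType'], PHP_EOL;
```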

AWS PHP SDK ContentType not being set

When uploading to S3 via the PHP SDK, S3 still shows application/octet-stream even after setting the correct ContentType.
$file = Aws::putObject([
    'Bucket' => 'types-jpeg',
    'Key' => 'current.jpg',
    'SourceFile' => 'uploads/current.jpg',
    'ContentType' => 'image/jpeg',
    'ACL' => 'public-read',
]);
The file uploads successfully, except that the file's metadata shows application/octet-stream instead of image/jpeg as its Content-Type.
Any ideas?
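One thing worth ruling out is the `Aws` facade wrapper itself: calling S3Client directly with the same parameters shows whether the wrapper is dropping ContentType. And for an object that is already stored with the wrong type, copyObject with MetadataDirective REPLACE rewrites the metadata in place without re-uploading the bytes. A sketch, assuming the bucket/key from the question and a placeholder region (credentials assumed to come from the environment):

```php
// Sketch (aws-sdk-php v3). 'us-east-1' is a placeholder region.
use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

// Upload with an explicit ContentType, bypassing the facade.
$s3->putObject([
    'Bucket'      => 'types-jpeg',
    'Key'         => 'current.jpg',
    'SourceFile'  => 'uploads/current.jpg',
    'ContentType' => 'image/jpeg',
    'ACL'         => 'public-read',
]);

// Repair an already-uploaded object's metadata in place.
$s3->copyObject([
    'Bucket'            => 'types-jpeg',
    'Key'               => 'current.jpg',
    'CopySource'        => 'types-jpeg/current.jpg',
    'ContentType'       => 'image/jpeg',
    'MetadataDirective' => 'REPLACE',
]);
```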

How to make php api call to CloudFlare to Purge individual files by URL from cache

This code worked well for purging all files from the cache.
Example:
$authKey = "MyKEY";
$authEmail = "myEMAIL";
$zoneId = "MYZONEID";
$endpoint = "purge_cache";
$data = [
    "purge_everything" => true
];
$url = "https://api.cloudflare.com/client/v4/zones/{$zoneId}/{$endpoint}";
$opts = ['http' => [
    'method' => 'DELETE',
    'header' => [
        "Content-Type: application/json",
        "X-Auth-Key: {$authKey}",
        "X-Auth-Email: {$authEmail}",
    ],
    'content' => json_encode($data),
]];
$context = stream_context_create($opts);
$result = file_get_contents($url, false, $context);
But I need to purge individual files by URL. How can I do that via PHP?
On the API page I found this link: https://api.cloudflare.com/#zone-purge-individual-files-by-url
but I don't know how to do it via PHP.
You need to pass an array of files, as explained in the API docs.
In your example that would be something like:
$data = '{"files":[
    "http://www.example.com/css/styles1.css",
    "http://www.example.com/css/styles2.css",
    "http://www.example.com/css/styles3.css"
]}';
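Putting the pieces together, the same stream-context request from the purge-everything example works for individual files once the body carries a `files` array; only the payload changes. A sketch with the placeholder credentials from the question (the actual send is commented out so the snippet stays side-effect free):

```php
// Placeholders, as in the question.
$authKey   = "MyKEY";
$authEmail = "myEMAIL";
$zoneId    = "MYZONEID";

// The only difference from purge-everything: list the URLs to purge.
$data = [
    "files" => [
        "http://www.example.com/css/styles1.css",
        "http://www.example.com/css/styles2.css",
    ],
];

$url  = "https://api.cloudflare.com/client/v4/zones/{$zoneId}/purge_cache";
$opts = ['http' => [
    'method'  => 'DELETE',
    'header'  => [
        "Content-Type: application/json",
        "X-Auth-Key: {$authKey}",
        "X-Auth-Email: {$authEmail}",
    ],
    'content' => json_encode($data),
]];

// Uncomment to actually send the request:
// $context = stream_context_create($opts);
// $result  = file_get_contents($url, false, $context);
```

Using `json_encode` on a PHP array avoids the quoting mistakes that hand-written JSON strings invite.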

Uploading a file to S3 in laravel

I am using Laravel 4.2 and I have added these libraries to my vendor folder: aws-sdk-php and aws-sdk-php-laravel.
When I try to upload a file to an S3 bucket, it uploads a file of only 45 bytes, and the content of that file is a string giving the file's location on the local file system. The $file object is legitimate and was constructed using $file = Input::file($field->element_name);
$s3 = App::make('aws')->get('s3');
$result = $s3->putObject(array(
    'Bucket' => 'phxdevapp',
    'Key' => $file_new_name,
    'Body' => $file
));
Any pointers?
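A likely explanation (an assumption, but it matches the symptom) is that passing the UploadedFile object as `Body` casts it to a string, which yields its temporary path, and that path string is exactly what gets stored: hence the 45-byte file containing a filesystem location. Passing the real file contents instead would look like this:

```php
// Sketch: send the bytes on disk, not the UploadedFile object itself.
// getRealPath()/getMimeType() are standard Symfony UploadedFile methods.
$s3 = App::make('aws')->get('s3');
$result = $s3->putObject(array(
    'Bucket'      => 'phxdevapp',
    'Key'         => $file_new_name,
    'SourceFile'  => $file->getRealPath(),   // or 'Body' => fopen($file->getRealPath(), 'r')
    'ContentType' => $file->getMimeType(),   // preserve a browser-friendly type
));
```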

reading a file from s3 bucket with laravel getting error

I'm trying to get a file from an S3 bucket using getObject:
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'hotel4cast',
    'Key' => $path->path,
    'SaveAs' => public_path()
));
I'm getting the error below:
Error executing "GetObject" on "https://s3.amazonaws.com/mybucket/filename.xlsx";
AWS HTTP error: Unable to open /var/www/html/laravel/public/ using mode r+: fopen(/var/www/html/laravel/public/):
failed to open stream: Is a directory
If I take SaveAs out and dump $file, I get an object with data, body, stream and all that, but I'm not sure what to do with it.
I have figured it out: there seems to be a bug in the AWS SDK.
I was able to get the file to save by opening the destination file first and passing the handle to getObject:
$r = fopen(public_path() . '/myfile.xlsx', 'wb');
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'bucketname',
    'Key' => $path->path,
    'SaveAs' => $r
));
Can you tell me what exactly these equal? Then I can guide you accordingly.
$path->path = ???
public_path() = ???
Edited
Your method params should look like this: you are passing only the SaveAs directory without attaching a file name. Append the file name to the SaveAs path and the file will be downloaded.
$s3 = AWS::createClient('s3');
$file = $s3->getObject(array(
    'Bucket' => 'hotel4cast',
    'Key' => $path->path,
    'SaveAs' => public_path() . "/filename.xlsx"
));
Here are examples of the code I am using for uploading a file and for copying a file.
For uploading:
$result = $this->S3->putObject([
    'ACL' => 'public-read-write',
    'Bucket' => 'xyz', // REQUIRED
    'Key' => 'file.xlsx', // REQUIRED
    'SourceFile' => public_path() . "/xlsx/file.xlsx",
]);
For copying from one bucket to another:
$copy = $this->S3->copyObject(array(
    'ACL' => 'public-read-write',
    'Bucket' => 'xyz', // REQUIRED
    'Key' => 'file.xlsx', // REQUIRED
    'CopySource' => 'mybucketname/xlsx/file.xlsx',
));
But the file that exists in the S3 bucket must have read permission; otherwise SaveAs, copy, etc. will give you an error.
There are multiple permissions you can use:
'ACL' => 'private|public-read|public-read-write|authenticated-read|aws-exec-read|bucket-owner-read|bucket-owner-full-control',
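For an object that was already uploaded without read permission, the ACL can be changed after the fact with putObjectAcl rather than re-uploading. A sketch, reusing the placeholder bucket/key from the answer above:

```php
// Sketch: grant read access to an existing object so subsequent
// getObject / copyObject calls stop failing with access errors.
$this->S3->putObjectAcl([
    'Bucket' => 'xyz',
    'Key'    => 'file.xlsx',
    'ACL'    => 'public-read',
]);
```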