Laravel copy and move not working with large file (10 GB) with S3 - amazon-s3

Small files (up to 1 GB) transfer successfully, but for large files both move() and copy() return false. Both functions should work for large files as well.
My code:
try {
    //dd($param);
    $s3 = Storage::disk('s3');
    $dd = $s3->move($param['source'], $param['target']);
    dd($dd);
} catch (Exception $e) {
    return [
        "status" => 0,
        "msg" => $e->getMessage(),
    ];
}

Related

pion/laravel-chunk-upload Laravel not working with large files

I am using resumable.js and Laravel chunk upload for uploading large files. It works for small files but fails for files over 500 MB: the file is chunked, but the chunks are never reassembled into the target directory.
Namespaces
use Illuminate\Http\Request;
use Illuminate\Http\UploadedFile;
use Pion\Laravel\ChunkUpload\Exceptions\UploadMissingFileException;
use Pion\Laravel\ChunkUpload\Handler\AbstractHandler;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;
use Illuminate\Support\Facades\Storage;
Controller
$receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));
if (!$receiver->isUploaded()) {
    // file not uploaded
}
$fileReceived = $receiver->receive(); // receive file
if ($fileReceived->isFinished()) { // file uploading is complete / all chunks are uploaded
    $file = $fileReceived->getFile(); // get file
    $extension = $file->getClientOriginalExtension();
    $fileName = str_replace('.' . $extension, '', $file->getClientOriginalName()); // file name without extension
    $fileName .= '_' . md5(time()) . '.' . $extension; // a unique file name
    $disk = Storage::disk('new');
    $path = $disk->putFileAs('resources/techpacks', $file, $fileName);
    // delete chunked file
    unlink($file->getPathname());
    // return [
    //     'path' => asset('storage/' . $path),
    //     'filename' => $fileName
    // ];
}
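To make the failure mode concrete: a chunk-upload library stores each chunk keyed by its index and, once all chunks have arrived, concatenates them in order. If any chunk is missing or out of order, the rebuilt file is corrupt, which is the usual reason a reassembled zip will not open. A hedged JavaScript sketch of that reassembly step (function name hypothetical; pion/laravel-chunk-upload does the equivalent in PHP):

```javascript
// Sketch of what a chunk-upload library does when "finishing" a file:
// chunks are keyed by index and concatenated in order. A missing index
// means the final file is truncated/corrupt, so we check first.
function reassembleChunks(chunks) {
  const indices = Object.keys(chunks).map(Number).sort((a, b) => a - b);
  // Verify the sequence 0..n-1 is complete before concatenating.
  indices.forEach((idx, pos) => {
    if (idx !== pos) throw new Error(`Missing chunk ${pos}`);
  });
  return Buffer.concat(indices.map((i) => chunks[i]));
}
```

In practice, a chunk silently truncated by php.ini limits (post_max_size / upload_max_filesize below the 10 MB chunkSize) produces the same symptom, so checking those limits is a good first step.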
Resumable
let browseFile = $('#browseFile');
let resumable = new Resumable({
    target: "{{ url('admin/techpack/insert') }}",
    query: { _token: '{{ csrf_token() }}' }, // CSRF token
    fileType: ['zip'],
    chunkSize: 10 * 1024 * 1024, // default is 1*1024*1024; keep this below your php.ini upload limits
    headers: {
        'Accept': 'application/json'
    },
    testChunks: false,
    throttleProgressCallbacks: 1,
});
resumable.assignBrowse(browseFile[0]);
resumable.on('fileAdded', function (file) { // triggered when a file is picked
    showProgress();
    resumable.upload(); // actually start uploading
});
resumable.on('fileProgress', function (file) { // triggered on progress updates
    updateProgress(Math.floor(file.progress() * 100));
});
resumable.on('fileSuccess', function (file, response) { // triggered when the upload completes
    // response = JSON.parse(response)
    console.log(response);
});
resumable.on('fileError', function (file, response) { // triggered on any error
    alert('file uploading error.');
});

let progress = $('.progress');
function showProgress() {
    progress.find('.progress-bar').css('width', '0%');
    progress.find('.progress-bar').html('0%');
    progress.find('.progress-bar').removeClass('bg-success');
    progress.show();
}
function updateProgress(value) {
    progress.find('.progress-bar').css('width', `${value}%`);
    progress.find('.progress-bar').html(`${value}%`);
}
function hideProgress() {
    progress.hide();
}
Chunk folder, after re-building, and after adding to the target folder (screenshots omitted).
But I can't open the final file. Please guide me.
Windows error and resumable.js error (screenshots omitted).

Livewire: What is the proper way to save multiple images in the storage link and the name of the image in the database?

I tried to upload multiple images and want to save them to storage and the image names to the database, but nothing happens next.
Dumping the data from the view gives this:
array:3 [▼
0 => Livewire\TemporaryUploadedFile {#1437 ▼
+"disk": "local"
#storage: Illuminate\Filesystem\FilesystemAdapter {#1384 ▶}
#path: "livewire-tmp/pALAWPNYcMg4pm1w8EINqvURTuYH8x-metaRmFjZWJvb2stMDAxOS5qcGc=-.jpg"
-test: false
-originalName: "pALAWPNYcMg4pm1w8EINqvURTuYH8x-metaRmFjZWJvb2stMDAxOS5qcGc=-.jpg"
-mimeType: "application/octet-stream"
-error: 0
#hashName: null
path: "C:\Windows\Temp"
filename: "pALAWPNYcMg4pm1w8EINqvURTuYH8x-metaRmFjZWJvb2stMDAxOS5qcGc=-.jpg"
basename: "phpB58F.tmp"
pathname: "C:\Windows\Temp\phpB58F.tmp"
extension: "tmp"
realPath: "C:\wamp64\www\cbcc-website\storage\app\livewire-tmp/pALAWPNYcMg4pm1w8EINqvURTuYH8x-metaRmFjZWJvb2stMDAxOS5qcGc=-.jpg"
size: 554154
writable: false
readable: false
executable: false
file: false
dir: false
link: false
}
1 => Livewire\TemporaryUploadedFile {#1438 ▶}
2 => Livewire\TemporaryUploadedFile {#1436 ▶}
]
and this is my store function
public function store() {
    dd($this->gallery_image);
    $action = '';
    $data = $this->validate([
        'gallery_image.*' => 'image|max:1024',
    ]);
    if (!empty($this->gallery_image)) {
        foreach ($this->gallery_image as $images) {
            $images->store('Gallery');
        }
    }
    if ($this->galleryId) {
        Gallery::find($this->galleryId)->update($data);
        $action = 'edit';
    } else {
        Gallery::create($data);
        $action = 'store';
    }
    $this->emit('showEmitedFlashMessage', $action);
    $this->resetInputFields();
    $this->emit('refreshParent');
    $this->emit('closeGalleryModal');
}
Is there any better way to do this? Thank you.
First of all, make sure you're using the WithFileUploads trait in your component.
It doesn't look like you're saving the files to disk anywhere in your code; all you're doing is recording the upload in the database.
This is how I'm doing it in something I just wrote. In this scenario the public property $uploaded_files is wire:bound to your image field, and the setup step is the entity that the files table references. My code calls this to actually store the files to disk. It's not complete, because I don't have thumbnail resizing in yet, but it should still give you a good idea of how to do it.
public function saveUploadedFiles() {
    if (!empty($this->uploaded_files)) {
        foreach ($this->uploaded_files as $file) {
            $this->active_file = new File;
            $this->active_file->filetype = $file->getMimeType();
            $this->active_file->filename = Str::uuid()->toString();
            $this->active_file->filesize = $file->getSize();
            $this->active_file->order = 0;
            $this->active_file->filepath = '/files/setupsteps/' . $this->active_file->filename;
            if (strpos($this->active_file->filetype, 'mage') !== false) { // 'mage' matches both 'image' and 'Image'
                $this->active_file->thumbnail = '/files/thumbnails/setupsteps/' . $this->active_file->filename;
            } else {
                $this->active_file->thumbnail = asset('images/placeholder.jpg');
            }
            $this->active_file->created_by = auth()->id();
            $this->active_file->updated_by = auth()->id();
            $this->uploadFile($file);
            $this->active_step->files()->save($this->active_file);
        }
        $this->active_step->touch();
        $this->files = $this->active_step->files;
    }
}

public function uploadFile($file) {
    $file->storeAs('files/setupsteps/', $this->active_file->filename, 'public');
    if (strpos($this->active_file->filetype, 'image') !== false) { // was ->type, which is not set anywhere
        $file->storeAs('files/thumbnails/setupsteps/', $this->active_file->filename, 'public');
    }
}

HCL Domino AppDevPack - Problem with writing Rich Text

I use the code proposed as an example in the documentation for Domino AppDev Pack 1.0.4; the only difference is that I read a text file (body.txt) into a buffer, the file containing only plain long text (about 40 KB).
When it is executed, the document is created in the database and the rest of the code returns no error.
But in the end, the rich text field is not added to the document.
Here the response returned:
response: {"fields":[{"fieldName":"Body","unid":"8EA69129BEECA6DEC1258554002F5DCD","error":{"name":"ProtonError","code":65577,"id":"RICH_TEXT_STREAM_CORRUPT"}}]}
My goal is to write very long text (more than 64 KB) into a rich text field. In the example I use a text file for the buffer, but later it could be something like const buffer = Buffer.from('very long text ...').
Is this the right way, or does it have to be done differently?
I'm using a Windows system with IBM Domino (r) Server (64 bit), Release 10.0.1FP4, and AppDevPack 1.0.4.
Thanks in advance for your help.
Here's the code:
const write = async (database) => {
    let writable;
    let result;
    try {
        // Create a document with subject write-example-1 to hold rich text
        const unid = await database.createDocument({
            document: {
                Form: 'RichDiscussion',
                Title: 'write-example-1',
            },
        });
        writable = await database.bulkCreateRichTextStream({});
        result = await new Promise((resolve, reject) => {
            // Set up event handlers.
            // Reject the Promise if there is a connection-level error.
            writable.on('error', (e) => {
                reject(e);
            });
            // Return the response from writing when resolving the Promise.
            writable.on('response', (response) => {
                console.log("response: " + JSON.stringify(response));
                resolve(response);
            });
            // Indicates which document and item name to use.
            writable.field({ unid, fieldName: 'Body' });
            let offset = 0;
            // Assume for purposes of this example that we buffer the entire file.
            const buffer = fs.readFileSync('/driver/body.txt');
            // When writing large amounts of data, it is necessary to
            // wait for the client side to complete the previous write
            // before writing more data.
            const writeData = () => {
                let draining = true;
                while (offset < buffer.length && draining) {
                    const remainingBytes = buffer.length - offset;
                    let chunkSize = 16 * 1024;
                    if (remainingBytes < chunkSize) {
                        chunkSize = remainingBytes;
                    }
                    draining = writable.write(buffer.slice(offset, offset + chunkSize));
                    offset += chunkSize;
                }
                if (offset < buffer.length) {
                    // Buffer is not draining. Whenever the drain event is emitted,
                    // call this function again to write more data.
                    writable.once('drain', writeData);
                }
            };
            writeData();
            writable = undefined;
        });
    } catch (e) {
        console.log(`Unexpected exception ${e.message}`);
    } finally {
        if (writable) {
            writable.end();
        }
    }
    return result;
};
As of AppDev Pack 1.0.4, the rich text stream accepts data in valid rich text CD format, in the LMBCS character set. We are currently working on a library to help you write valid rich text data to the stream.
I'd love to hear more about your use cases, and we're excited you're already poking around the feature! If you can join the OpenNTF Slack channel, I usually hang out there.
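The write/drain loop in the example above is standard Node.js stream backpressure; the chunking step can be isolated as a pure function, which makes it easier to reason about separately from the Domino stream. A hedged sketch (helper name and sizes are my own, not from the AppDev Pack API):

```javascript
// Sketch: split a buffer into fixed-size chunks, as the writeData() loop
// does with its 16 KB slices before handing each one to writable.write().
function chunkBuffer(buffer, chunkSize = 16 * 1024) {
  const chunks = [];
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    // slice() does not copy past the end, so the last chunk may be shorter.
    chunks.push(buffer.slice(offset, offset + chunkSize));
  }
  return chunks;
}
```

Each chunk would then be passed to writable.write(), pausing on a false return value until the 'drain' event fires, exactly as in the example.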

View a PDF in React Native

Can anyone please provide sample code for displaying a PDF in React Native, for both iOS and Android?
This is what I've tried:
render: function() {
    return <WebView
        source={require('./my.pdf')}
    />;
}
^ This results in a Red Screen of Death with the "Unable to resolve module" error.
render: function() {
    return <WebView
        source={{uri: 'my.pdf'}}
    />;
}
^ This gives an "Error Loading Page" message in the WebView. "The requested URL was not found on this server"
I know iOS is capable of showing a PDF in a UIWebView. I assume Android can do the same. There must be something in how the source is specified that I'm missing.
Okay, for future generations, here's how I solved this problem:
Updated September 13, 2017:
There is a new NPM module that makes this entire process much easier. I would suggest using it going forward instead of my original answer below:
react-native-pdf
Once installed, rendering the PDF is as easy as this:
import React, { Component } from 'react';
import { View } from 'react-native';
import Pdf from 'react-native-pdf';

export default class YourClass extends Component {
    constructor(props) {
        super(props);
        this.pdf = null;
    }

    render() {
        let yourPDFURI = { uri: 'bundle-assets://pdf/YourPDF.pdf', cache: true };
        return <View style={{ flex: 1 }}>
            <Pdf ref={(pdf) => { this.pdf = pdf; }}
                source={yourPDFURI}
                style={{ flex: 1 }}
                onError={(error) => { console.log(error); }} />
        </View>;
    }
}
Just put your actual pdf in the android/app/src/main/assets/pdf folder of your project.
Original Answer:
iOS
render: function() {
return <WebView source={{uri: 'My.pdf'}}/>
}
The trick is that you need to include My.pdf into your project in Xcode and make sure it's added to your build target.
Just copying it into your React Native project folder wasn't enough. It had to be part of the Xcode project itself.
Android
It appears that Android did not provide a native PDF viewer until 5.0 (Lollipop). To get around this, I had to make use of three key techniques:
Pull the PDF out of my APK bundle and store it in my app's files folder. This SO answer was very helpful in accomplishing this:
Android: How to copy files from 'assets' folder to sdcard?
I tweaked the code a bit so that the file goes not to the sdcard but to my app's files folder. Here's what I added to my MainActivity.java:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    AssetManager assetManager = getAssets();
    String[] files = null;
    try {
        files = assetManager.list("pdf");
    } catch (IOException e) {
        Log.e("tag", "Failed to get asset file list.", e);
    }
    if (files != null) for (String filename : files) {
        InputStream in = null;
        OutputStream out = null;
        try {
            in = assetManager.open("pdf/" + filename);
            File outFile = new File(getFilesDir(), filename);
            out = new FileOutputStream(outFile);
            copyFile(in, out);
            Log.e("tag", "Copy was a success: " + outFile.getPath());
        } catch (IOException e) {
            Log.e("tag", "Failed to copy asset file: " + "pdf/" + filename, e);
        } finally {
            if (in != null) {
                try {
                    in.close();
                } catch (IOException e) {
                    // NOOP
                }
            }
            if (out != null) {
                try {
                    out.close();
                } catch (IOException e) {
                    // NOOP
                }
            }
        }
    }
}

private void copyFile(InputStream in, OutputStream out) throws IOException {
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}
I also made sure my PDF is in the assets/pdf folder under android/app/src/main
I then utilized the react-native-fs package to get the absolute URL to my PDF, which is now in the files folder:
var RNFS = require('react-native-fs');
var absolutePath = RNFS.DocumentDirectoryPath + '/My.pdf';
With all of this in place, I used react-native-pdf-view to actually load and display the PDF:
import PDFView from 'react-native-pdf-view';

render: function() {
    var absolutePath = RNFS.DocumentDirectoryPath + '/My.pdf';
    return <PDFView
        ref={(pdf) => { this.pdfView = pdf; }}
        src={absolutePath}
        style={ActharStyles.fullCover} />;
}
Good luck!
A simple solution for this problem is to set the <WebView> source/uri to this:
https://drive.google.com/viewerng/viewer?embedded=true&url={your pdf url}
Just add style={{flex: 1}} to the opening View tag of your return and you should be fine.
An Amazon S3 presigned URL contains a query string with "?" and "&", which confuses the outer URL, so you have to encode the presigned URL before passing it to the Google Docs viewer. Like this:
var encodedUrl = encodeURIComponent(presigned_url);
var iFrameUrl = 'https://docs.google.com/gview?url=' + encodedUrl;
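The encoding step matters because encodeURIComponent escapes the inner URL's "?", "&", and "=", so they are no longer interpreted as the viewer's own query parameters. A tiny helper (name hypothetical) capturing the same construction:

```javascript
// Build a Google Docs viewer URL for an arbitrary PDF URL. The inner URL is
// percent-encoded so its own query string is not parsed as parameters of
// the viewer URL.
function gviewUrl(pdfUrl) {
  return 'https://docs.google.com/gview?url=' + encodeURIComponent(pdfUrl);
}
```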
In the case of a PDF URL, this will do:
openLink(url) {
    return Linking.canOpenURL(url)
        .then(supported => {
            if (!supported) {
                console.log("Can't handle url: " + url);
                this.showFlashMessage('Not supported on your device');
            } else {
                return Linking.openURL(url);
            }
        })
        .catch(err => console.error('An error occurred', err));
}

Stop request during file upload in NodeJs

I'm writing an image uploader, and I want to constrain the size of the image to under 3 MB. On the server side, I can check the size of the image via the Content-Length header, something like this (using Express):
app.post('/upload', function(req, res) {
    if (+req.headers['content-length'] > 3001000) { // about 3 MB
        // Do something to stop the request
        return res.send({'error': 'some kind of error'});
    }
    // Stream in data here...
});
I tried to stop the request with (and permutations of):
req.shouldKeepAlive = false;
req.client.destroy();
res.writeHead(200, {'Connection': 'close'});
res.end();
None of these really "destroys" the request to prevent more data from being uploaded.
req.client.destroy() seems to freeze the upload, but the res.send({error: ...}) is never sent back.
Help!
Throw an error and catch it. It will stop the file upload, allowing you to send a response.
try { throw new Error("Stopping file upload..."); }
catch (e) { res.end(e.toString()); }
It's a bit hackish, but it works...
Here is my solution:
var formidable = require('formidable'); // assumption: form.parse() below suggests a formidable form
var form = new formidable.IncomingForm();
var maxSize = 30 * 1024 * 1024; // 30 MB
app.post('/upload', function(req, res) {
    var size = req.headers['content-length'];
    if (size <= maxSize) {
        form.parse(req, function(err, fields, files) {
            console.log("File uploading");
            if (files && files.upload) {
                res.status(200).json({fields: fields, files: files});
                fs.renameSync(files.upload[0].path, uploadDir + files.upload[0].originalFilename);
            } else {
                res.send("Not uploading");
            }
        });
    } else {
        res.status(413).send("File too large");
    }
});
And to avoid wasting the client's upload time before it gets a response, also check the size in client-side JavaScript:
if (fileElement.files[0].size > maxSize) {
....
}
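One caveat with this approach: Content-Length is client-supplied and can be absent or wrong, so it is only a coarse pre-filter; bytes still need to be counted while streaming. A hedged sketch of the header check as a pure function (names and return convention are hypothetical):

```javascript
// Decide whether to reject an upload based on the Content-Length header.
// Returns 413 (Payload Too Large) when the declared size exceeds the limit,
// 200 when it is within bounds, and null when the header is missing or
// malformed, in which case the size must be enforced while streaming.
function checkContentLength(headers, maxBytes) {
  const size = Number.parseInt(headers['content-length'], 10);
  if (Number.isNaN(size)) return null; // unknown: enforce during streaming
  return size > maxBytes ? 413 : 200;
}
```

Returning 413 early and destroying the socket saves bandwidth, but the streaming-side count is what actually guarantees the limit.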