File upload and save file info to database in Silverlight - SQL

I know there are lots of posts out there, but I couldn't find an example of how to upload a video to a server and save the file info to a SQL database. I am looking for a Silverlight example. The video size would be around 100-200 MB.
If someone can point me in the right direction, I would appreciate it.

You have to split the file and upload it in chunks.
The steps:
In a cancelable thread: BackgroundWorker;
Split your file into chunks: FileStream.Read (just read the chunks one by one);
Send the file chunk: HttpWebRequest with an upload ID;
Wait until the chunk is sent: AutoResetEvent;
Run the next chunk (step 2).
At the end you can use a hash like MD5 to test that your file is not corrupted.
You can send many chunks at the same time with an order parameter and, on the server side, reorder the chunks.
Note: You can find a sample by reading the project Silverlight File Uploader.
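For illustration only, here is a minimal sketch of that chunking loop, written in plain Java rather than the Silverlight/C# classes named above (the flow is the same). The upload endpoint and its uploadId and index query parameters are assumptions, not an existing API; the sketch only shows the read-a-chunk, send-it, repeat pattern plus the MD5 check.
import java.io.FileInputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.security.MessageDigest;
import java.util.UUID;

public class ChunkedUploader {
    private static final int CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB per chunk

    public static void upload(String path, String serverUrl) throws Exception {
        String uploadId = UUID.randomUUID().toString();       // lets the server group the chunks
        MessageDigest md5 = MessageDigest.getInstance("MD5"); // used to verify the file at the end
        byte[] buffer = new byte[CHUNK_SIZE];

        try (FileInputStream in = new FileInputStream(path)) {
            int read;
            int index = 0;
            while ((read = in.read(buffer)) > 0) {             // read one chunk at a time
                md5.update(buffer, 0, read);
                sendChunk(serverUrl, uploadId, index++, buffer, read); // send it and wait for the response
            }
        }

        StringBuilder hex = new StringBuilder();
        for (byte b : md5.digest()) {
            hex.append(String.format("%02x", b));
        }
        // Send or compare this hash so the server can confirm the reassembled file is not corrupted.
        System.out.println("Local MD5: " + hex);
    }

    // Hypothetical endpoint: the server is assumed to store each chunk under (uploadId, index)
    // and reorder the chunks when reassembling the file.
    private static void sendChunk(String serverUrl, String uploadId, int index,
                                  byte[] data, int length) throws Exception {
        URL url = new URL(serverUrl + "?uploadId=" + uploadId + "&index=" + index);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(data, 0, length);
        }
        if (conn.getResponseCode() != 200) {
            throw new RuntimeException("Chunk " + index + " was not accepted");
        }
    }
}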

Rename filename.ext.crswap to filename.ext rather than copying

When performing this sequence:
Obtain a handle to a new file via window.showSaveFilePicker, say filename.ext
Obtain a writable file stream from the handle
Write some content into the file using the stream
Close the stream to signal completion
the File System API writes to filename.ext.crswap and, on close, copies filename.ext.crswap to filename.ext.
Is there a reason that filename.ext.crswap is not simply renamed to filename.ext rather than copied?
The reason for this behavior is to avoid partial writes:
"User agents try to ensure that no partial writes happen, i.e. the file represented by fileHandle will either contain its old contents or it will contain whatever data was written through stream up until the stream has been closed."—Spec.

What is the best practice for downloading large CSV files from S3 in Java?

I'm trying to get a large CSV file from S3 but the download fails with “java.net.SocketException: Connection reset”, which is probably due to the InputStream simply being open for too long (the download often takes more than an hour since I am doing multiple time-consuming processes on the streamed content). This is how I currently parse the file:
InputStream inputStream = new GZIPInputStream(s3Client.getObject("bucket", "key").getObjectContent());
Reader decoder = new InputStreamReader(inputStream, Charset.defaultCharset());
BufferedReader isr = new BufferedReader(decoder);
CSVParser csvParser = new CSVParser(isr, CSVFormat.DEFAULT);
CSVRecord nextRecord = csvParser.iterator().next();
...
I know I have to split the download into multiple short getObject calls with a defined offset in the GetObjectRequest, but I'm wondering how to define this offset in the case of a CSV, since I need complete lines.
Do I have to ditch the parser library and parse each line into an object myself so I can keep a count of the bytes read and use it as the offset for the next batch? That doesn't seem very robust to me. Is there any best-practice way to achieve "batch downloading" of CSV records?
I decided on simply using the dedicated getObject(GetObjectRequest getObjectRequest, File destinationFile) method to copy the entire CSV to a temporary file on disk. This closes the HTTP connection as soon as possible and allows me to get the InputStream from the local file with no problems. It doesn't resolve the question of the best way to download in batches, but it's a nice and simple workaround.
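For reference, a minimal sketch of that workaround, assuming the AWS SDK for Java v1 and Commons CSV used in the question (the bucket and key names are placeholders):
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.Charset;
import java.util.zip.GZIPInputStream;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GetObjectRequest;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

public class S3CsvDownload {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

        // Download the whole object to a local temp file first, so the HTTP
        // connection is closed as soon as the transfer finishes.
        File tempFile = File.createTempFile("s3-csv", ".gz");
        s3Client.getObject(new GetObjectRequest("bucket", "key"), tempFile);

        // The slow, record-by-record processing then runs against the local copy,
        // so a long-running loop can no longer reset the S3 connection.
        try (Reader decoder = new InputStreamReader(
                new GZIPInputStream(new FileInputStream(tempFile)), Charset.defaultCharset());
             CSVParser csvParser = new CSVParser(decoder, CSVFormat.DEFAULT)) {
            for (CSVRecord record : csvParser) {
                // ... time-consuming processing per record ...
            }
        } finally {
            tempFile.delete();
        }
    }
}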

Size limit on ContentVersion object in Salesforce

I was trying to create and insert a ContentVersion object in Salesforce Lightning (for file upload) using the following code:
ContentVersion v = new ContentVersion();
v.versionData = EncodingUtil.base64Decode(content);
v.title = fileName;
v.pathOnClient = fileName;
insert v;
This works fine for smaller files, but when I try loading a file that is just 750 KB, the above operation fails (the actual allowed size could be even less).
Is there any limit on the size of the files that can be uploaded using the above code?
This is covered by a similar question on the Salesforce Stack Exchange.
From Base Lightning Components Considerations:
When working with type="file", you must provide your own server-side logic for uploading files to Salesforce. [...]
Uploading files using this component is subject to regular Apex controller limits, which is 1 MB. To accommodate file size increase due to base64 encoding, we recommend that you set the maximum file size to 750 KB. You must implement chunking for file size larger than 1 MB. Files uploaded via chunking are subject to a size limit of 4 MB.
Base64 encoding inflates the payload by roughly a third, so a 750 KB file becomes about 1 MB of form data, pushing it past the "Maximum HTTP POST form size (the size of all keys and values in the form)" limit of 1 MB. Or at least this seems like the applicable limit here.
Instead you will need to go with an embedded Visualforce page, as used in How to Build a Lightning File Uploader Component. This gets you up to the "Maximum file size for a file uploaded using a Visualforce page" limit of 10 MB. Just remember to keep the file processing to a minimum before the heap size limit catches up with you.

MVC4 - How to upload a file partially (only the first 10 lines, for example)

ASP.NET MVC - Is it possible to upload only the first 10 lines of a file? Basically, we have some files that can range from 1-10 GB, but the data that we need is present only in the first 10 rows of the file. Using the typical web development approach, we'd upload the whole file to the server and then read the first 10 rows, but uploading a 10 GB file just to read a few bytes of data seems like a big waste of resources. Is it possible to read such a file without uploading all of it to the web server?
Solution - the File API's slice function solved this problem (thanks to Chris below). The simplified code is below for anyone interested:
var sampleFile = document.getElementById('yourfileelement').files[0];
var reader = new FileReader();
var fileData = sampleFile.slice(0, 500000); // Read the top 500000 bytes
reader.onprogress = function (evt) { /* Show progress bar etc. */ };
reader.onloadend = function (evt) { alert(evt.target.result); }; // evt.target.result contains the file data that was read
reader.readAsText(fileData);
No, but you may be able to accomplish it using the File API client-side to read just the first 10 lines and send them to the server via AJAX. However, note that the File API is only supported in modern browsers, so this won't work with IE 9 or earlier. You might be able to create a more comprehensive solution using a Flash or Java applet, but ugh.

How to move an uploaded file to another host in PHP

Can anyone help me figure out how to move an uploaded file from one server to another?
I am not talking about the move_uploaded_file() function.
For example,
if the image is uploaded to http://example.com,
how can I move it to http://image.example.com?
It is possible, right? Not by sending another POST or PUT request?
Take the uploaded file, move it to a temporary location, and then push it to any FTP account you like.
$tempName = tempnam(sys_get_temp_dir(), 'upload');
move_uploaded_file($_FILES["file"]["tmp_name"], $tempName);
$handle = fopen("ftp://user:password@example.com/somefile.txt", "w");
fwrite($handle, file_get_contents($tempName));
fclose($handle);
unlink($tempName);
Actually you don't even need the part with move_uploaded_file. It is totally sufficient to take the uploaded file and write its content to the file opened with fopen. For more information on opening URLs with fopen, have a look at the PHP documentation. For more information on uploading files, have a look at the File Upload section of the PHP manual.
[Edit] Added file_get_contents to the code example
[Edit] Shorter example
$handle = fopen("ftp://user:password@example.com/somefile.txt", "w");
fwrite($handle, file_get_contents($_FILES["file"]["tmp_name"]));
fclose($handle);
// As the uploaded file has not been moved from the temporary folder
// it will be deleted from the server the moment the script is finished.
// So no cleaning up is required here.