Upload of Large Files to Google Drive with the Google Drive API for Android (GDAA)

I realize that similar questions have been asked before. However, none of them has been answered.
My problem is the following:
To upload a file to Google Drive, you need to create DriveContents.
You either do this by creating them out of thin air:
DriveApi.DriveContentsResult driveContentsResult = Drive.DriveApi.newDriveContents(getGoogleApiClient()).await();
DriveContents contents = driveContentsResult.getDriveContents();
Or you do this by opening an already existing file:
DriveApi.DriveContentsResult driveContentsResult = driveFileResult.getDriveFile().open(getGoogleApiClient(), DriveFile.MODE_WRITE_ONLY, null).await();
DriveContents contents = driveContentsResult.getDriveContents();
You are now ready to fill the DriveContents with data. You do this by obtaining an OutputStream and writing to it:
FileOutputStream fileOutputStream = new FileOutputStream(driveContentsResult.getDriveContents().getParcelFileDescriptor().getFileDescriptor());
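For example, copying a local file into the contents might look like this (a minimal sketch; the source path is illustrative and fileOutputStream is the stream from above):
InputStream inputStream = new FileInputStream("/path/to/local/file"); // hypothetical source file
byte[] buffer = new byte[8192];
int n;
while ((n = inputStream.read(buffer)) > 0) {
    fileOutputStream.write(buffer, 0, n);
}
inputStream.close();
fileOutputStream.close();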
Now this is where the problem starts: by filling this OutputStream, Google Play services just copies the file I want to upload and creates a local copy. If you have 0.5 GB of free space on your phone and want to upload a 1.3 GB file, this is not going to work: there is not enough storage space.
So how is it done? Is there a way to upload directly to Google Drive via the GDAA that does not involve creating a local copy first and THEN uploading it?
Does the Google REST API handle these uploads any differently? Can it be done via the Google REST API?
EDIT:
It seems this cannot be done via the GDAA. For people looking for a way to do resumable uploads with the Google REST API, have a look at my example here on StackOverflow.

I'm not sure whether the GDAA can do it, but you can certainly try using the Google REST API to upload your file.
You could use a multipart upload or a resumable upload:
Multipart upload
If you have metadata that you want to send along with the data to upload, you can make a single multipart/related request. This is a good choice if the data you are sending is small enough to upload again in its entirety if the connection fails.
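A minimal sketch of such a request against the Drive v3 REST endpoint, using a plain HttpURLConnection (accessToken and fileBytes are assumptions, obtained elsewhere):
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

String boundary = "drive_upload_boundary";
URL url = new URL("https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setDoOutput(true);
conn.setRequestMethod("POST");
conn.setRequestProperty("Authorization", "Bearer " + accessToken);
conn.setRequestProperty("Content-Type", "multipart/related; boundary=" + boundary);
OutputStream out = conn.getOutputStream();
// part 1: the file metadata as JSON
out.write(("--" + boundary + "\r\nContent-Type: application/json; charset=UTF-8\r\n\r\n"
        + "{\"name\": \"video.mp4\"}\r\n").getBytes("UTF-8"));
// part 2: the media bytes; the whole payload sits in memory, hence "small enough" above
out.write(("--" + boundary + "\r\nContent-Type: video/mp4\r\n\r\n").getBytes("UTF-8"));
out.write(fileBytes);
out.write(("\r\n--" + boundary + "--\r\n").getBytes("UTF-8"));
out.close();
int status = conn.getResponseCode(); // 200 on success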
Resumable upload
To upload data files more reliably, you can use the resumable upload protocol. This protocol allows you to resume an upload operation after a communication failure has interrupted the flow of data. It is especially useful if you are transferring large files and the likelihood of a network interruption or some other transmission failure is high, for example, when uploading from a mobile client app. It can also reduce your bandwidth usage in the event of network failures because you don't have to restart large file uploads from the beginning.
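A minimal sketch of the resumable flow against the Drive v3 endpoint (again plain HttpURLConnection; accessToken, fileSize and sourceStream are assumptions). The key point for the question above: the body is streamed straight from the source, so no local copy is created.
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// step 1: initiate the session; the session URI comes back in the Location header
URL initUrl = new URL("https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable");
HttpURLConnection init = (HttpURLConnection) initUrl.openConnection();
init.setDoOutput(true);
init.setRequestMethod("POST");
init.setRequestProperty("Authorization", "Bearer " + accessToken);
init.setRequestProperty("Content-Type", "application/json; charset=UTF-8");
init.getOutputStream().write("{\"name\": \"video.mp4\"}".getBytes("UTF-8"));
String sessionUri = init.getHeaderField("Location");

// step 2: stream the bytes to the session URI; after a network failure you can
// ask the session how much arrived and resume with a Content-Range header
HttpURLConnection put = (HttpURLConnection) new URL(sessionUri).openConnection();
put.setDoOutput(true);
put.setRequestMethod("PUT");
put.setFixedLengthStreamingMode(fileSize); // stream the body instead of buffering it in memory
OutputStream out = put.getOutputStream();
byte[] buffer = new byte[256 * 1024]; // plain copy buffer; chunked resumes use Content-Range in multiples of 256 KB
int n;
while ((n = sourceStream.read(buffer)) > 0) {
    out.write(buffer, 0, n);
}
out.close();
int status = put.getResponseCode(); // 200 or 201 when the upload is complete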
You must remember, as discussed in this SO question:
The GDAA's main identifier, the DriveId, lives in the GDAA (Google Play Services) only and does not exist in the REST Api.
The ResourceId can be obtained from the DriveId only after GDAA has committed (uploaded) the file/folder.
You will run into a lot of timing issues caused by the fact that GDAA 'buffers' network requests on its own schedule (system optimized), whereas the REST Api lets your app control the waiting for the response.
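In code, the ResourceId caveat boils down to this (a minimal sketch; driveFile as obtained above):
DriveId driveId = driveFile.getDriveId();
String resourceId = driveId.getResourceId(); // null until Play services has actually committed the file to the server
if (resourceId != null) {
    // only now can the file be addressed through the REST Api
}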
Lastly, you can check this related SO question regarding tokens and authentication in HTTP requests on Android. There are also some examples by seanpj for both the GDAA and the REST API that might help you.
Hope this helps.

Related

How to stream an HTTP POST request's body in the browser

The big picture is that I want to live-upload recorded audio from the browser directly to Google Drive.
This is a pet project so I am happy to play with experimental web technologies.
I currently have the browser fetching a feed from the microphone using MediaDevices.getUserMedia() and encoding it to mp3 in a WebWorker. The encoder returns an rxjs.Observable<Int16Array> that will produce chunks of the encoded file on subscribe.
I would like to use a resumable upload to upload the file, preferably in the "single request" style. The challenge is in uploading the file as it is produced by the encoder.
I appreciate that I could probably achieve a similar result by using their "multiple chunks" style, collecting the results of the encoder into Blobs and sending them on an interval. My problem with this is that the more "live" the upload is (smaller chunks), the more POST requests I will be making.
XMLHttpRequest.send() does specify that I can provide a ReadableStream as the body, but it appears that this experimental technology does not yet support byte streams.

Saving data on phone in a Cordova app

I am making a mobile app using Cordova, and I need to save some sensitive and not-so-sensitive data on the phone. I am a bit lost as to the best way to do it.
I need to save:
A JSON web-token (for authentication).
A response from server (I save this to populate my page in case the GET request fails).
Coordinate information recorded when the user logs data in the app (for later upload to a server from within the app). These will be many separate logs, and they can be large for local storage (~5-10 MB).
Until now I have been successfully saving everything I need to local storage, but I don't think that is the correct way to do it, so I need some help deciding the best course to take from a security point of view.
Saving the server response is just for a better UI experience and is static in size, so I guess local storage is a good option for it.
But web tokens and GPS logs are sensitive information, and I don't want to keep them in local storage as it is accessible from outside the app.
What other options do I have?
Cordova still doesn't have encrypted storage.
Is saving to files a good approach? This here says that data contained inside cordova.file.applicationStorageDirectory is private to the app. So can I use it to save the logs and the token?
The plugin also lists the file systems for Android and iOS and lists which of those are private.
I am currently working with Android phones but want to extend the app to iOS later. I have never worked with file systems and caches before, so I am a bit lost.

Finding the file size of an image file in the Sony Camera Remote API

I'm writing a fairly involved application for working with Sony cameras.
I can list the contents of the camera and copy image files no problem at all, but I can't seem to figure out the size of the files before I start to download them.
I'm receiving the file list using the standard getContentList API, and finding the files using the originals array in the response. That response seems to have no file size information in it.
Is this possible? Knowing the file size before downloading is important for a good user experience, and all the other camera APIs support it.
I do get the size from the HTTP Content-Length header once I start to download, but performing HEAD requests against hundreds of URLs in a row seems very inefficient!
Unfortunately the API does not support getting the file size.
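That leaves the HEAD-request workaround the asker already mentions; a minimal sketch in Java (originalUrl is assumed to come from the originals array of getContentList):
import java.net.HttpURLConnection;
import java.net.URL;

HttpURLConnection conn = (HttpURLConnection) new URL(originalUrl).openConnection();
conn.setRequestMethod("HEAD"); // headers only, no image data is transferred
long size = conn.getContentLengthLong(); // -1 if the server omits Content-Length
conn.disconnect();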

Is it possible to upload large files to a server by creating a Firefox addon?

I want to upload a large video file, and for that I want to create a Firefox addon. Is it possible to upload large files to my server by creating a Firefox addon, or is there any other way to upload large files to a server?
Please suggest.
If you are POSTing the data to the server as application/x-www-form-urlencoded then you should base64 encode it using btoa() and include it as one of the POST parameters in the request body (i.e. the string passed to XMLHttpRequest.send()):
var postbody = "body=" + encodeURIComponent(btoa(fileContents)); // percent-encode the base64, since '+' and '=' are not safe in a urlencoded body
xhr.send(postbody);
If you are just downloading the file and uploading it right away, you might as well keep it in memory since you're presumably going to load it into memory anyway in order to base64 encode the contents.
Well, if you're reading the file into memory then you shouldn't need an nsIFile at all. You can just download it using XMLHttpRequest and use responseText, uploading it in the way I described in the answer. If you do have an nsIFile, then yes, that snippet describes how to read from it.
I assume you are wanting to upload via HTTP.
If so, the upload limit is usually decided by the server-side software. This affects both the maximum size and the length of time you have to upload it.
Without a server capable of taking an upload in chunks and reassembling it, you are limited in ways you can't get around through software.
If you want to upload via FTP on the other hand, there are a lot of options... look at FireFTP.
I have made Firefox addons for file upload. I integrated jQuery File Upload: I created a widget, in the widget I made a panel, and in the panel I created a separate web page for the file uploading; the panel then calls that page.
For more information you can mail me at chetansinghal1988#gmail.com

How to retrieve Salesforce file attachment limit via API?

The Salesforce attachment file size is officially limited to 5 MB (doc), but if requested they can increase this limit on a case-by-case basis.
My question: can I retrieve this newly allowed file size limit using the API?
Context: non-profits apply for grants via a web portal (.NET); all data is stored in Salesforce. They are asked to attach files. We read the size of the file they try to upload and send an error message if it exceeds 5 MB, as it would not be accepted by Salesforce. This avoids having them wait a few minutes for an upload only to be told that the file is too large. We would like to update our code so that it allows files bigger than 5 MB if Salesforce allows it. Can we retrieve this information via the API?
Thank you!
You can call the getUserInfo() function in the SOAP API, part of the returned data includes the field orgAttachmentFileSizeLimit (this appears to be missing from the docs, but is in the WSDL)
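For example, with the Force.com WSC partner client the lookup would look roughly like this (a sketch; the getter is generated from the WSDL field, so verify the name against your generated stubs):
import com.sforce.soap.partner.Connector;
import com.sforce.soap.partner.GetUserInfoResult;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectorConfig;

ConnectorConfig config = new ConnectorConfig();
config.setUsername(username); // credentials are assumptions, supplied elsewhere
config.setPassword(passwordPlusToken);
PartnerConnection connection = Connector.newConnection(config);
GetUserInfoResult info = connection.getUserInfo();
int limitBytes = info.getOrgAttachmentFileSizeLimit(); // the org-specific attachment limit, in bytes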
I'd recommend moving away from Salesforce for storing files, especially if you're expecting to hit limits; there is also an org-wide limit on storage space. A service like Amazon S3 would be very useful: you can attach the S3 URL to your record in case it is needed, and the file will also be available to external applications without adding to your org's API consumption.