How to retrieve the Salesforce file attachment limit via the API?

Salesforce attachment file size is officially limited to 5 MB (per the docs), but Salesforce can increase this limit on a case-by-case basis if requested.
My question: can I retrieve this newly allowed file size limit using the API?
Context: non-profits apply for grants via a web portal (.NET); all data is stored in Salesforce. Applicants are asked to attach files. We read the size of each file they try to upload and show an error message if it exceeds 5 MB, since Salesforce would reject it. This avoids making them wait several minutes for an upload only to be told that the file is too large. We would like to update our code so that it accepts files larger than 5 MB when Salesforce allows it. Can we retrieve this information via the API?
Thank you!

You can call the getUserInfo() function in the SOAP API; part of the returned data includes the field orgAttachmentFileSizeLimit (this appears to be missing from the docs, but it is in the WSDL).
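A minimal sketch of that call, shown in JavaScript for brevity; it assumes the Partner SOAP endpoint and a session id obtained from a prior login(). The envelope builder and response parser here are illustrative helpers, not an official client:

```javascript
// Build the SOAP envelope for getUserInfo() (the call takes no parameters).
// Namespace follows the Partner WSDL; sessionId comes from a prior login().
function buildGetUserInfoEnvelope(sessionId) {
  return `<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:partner.soap.sforce.com">
  <soapenv:Header>
    <urn:SessionHeader><urn:sessionId>${sessionId}</urn:sessionId></urn:SessionHeader>
  </soapenv:Header>
  <soapenv:Body><urn:getUserInfo/></soapenv:Body>
</soapenv:Envelope>`;
}

// Pull orgAttachmentFileSizeLimit (in bytes) out of the SOAP response XML.
// Returns null when the field is absent.
function parseAttachmentLimit(responseXml) {
  const m = responseXml.match(
    /<orgAttachmentFileSizeLimit>(\d+)<\/orgAttachmentFileSizeLimit>/);
  return m ? Number(m[1]) : null;
}

// Usage sketch (network call, serverUrl/sessionId are placeholders):
async function fetchAttachmentLimit(serverUrl, sessionId) {
  const res = await fetch(serverUrl, {
    method: "POST",
    headers: { "Content-Type": "text/xml", SOAPAction: '""' },
    body: buildGetUserInfoEnvelope(sessionId),
  });
  return parseAttachmentLimit(await res.text());
}
```

With the returned byte count, the portal can compare the incoming file size against the org's actual limit instead of a hard-coded 5 MB.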

I'd recommend moving away from Salesforce for file storage, especially if you're expecting to hit limits; there is also an org-wide cap on total storage space. A service like Amazon S3 would be very useful: you can attach the S3 URL to your record if needed, and the file is then available to external applications without consuming your org's API quota.

Related

Save API response to the browser?

I am fetching some data from an API and need access to the same data on another page, but I don't want to request it from the server again.
I am trying to find a more efficient and quicker way to access the data on the second request.
Should I save the first response in session storage, local storage, or a cookie?
Definitely not a cookie!
Saving data in localStorage or IndexedDB is how it's usually done.
Basically, you need to read about Service Workers: they are used for caching and many other things as well, but most tutorials start with caching, which is exactly what you need.
I would recommend starting here: Google Progressive Web Apps Training
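Before reaching for Service Workers, the localStorage route alone already covers the question. A minimal sketch with a TTL, so a second page can reuse the response without re-fetching; the key scheme and 5-minute TTL are illustrative choices:

```javascript
// Cache an API response in localStorage under its URL, stamped with a
// save time, and treat entries older than TTL_MS as cache misses.
const TTL_MS = 5 * 60 * 1000; // consider cached data fresh for 5 minutes

function saveToCache(storage, key, data, now = Date.now()) {
  storage.setItem(key, JSON.stringify({ savedAt: now, data }));
}

function readFromCache(storage, key, now = Date.now()) {
  const raw = storage.getItem(key);
  if (!raw) return null;
  const { savedAt, data } = JSON.parse(raw);
  return now - savedAt < TTL_MS ? data : null; // expired entries miss
}

// On any page: returns the cached copy when fresh, fetches otherwise.
async function getData(url, storage = localStorage) {
  const cached = readFromCache(storage, url);
  if (cached !== null) return cached; // the second page hits this path
  const data = await (await fetch(url)).json();
  saveToCache(storage, url, data);
  return data;
}
```

localStorage is synchronous and capped at a few megabytes per origin, so for large payloads IndexedDB (or a Service Worker Cache) is the better fit.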

Upload of Large Files to Google Drive with the Google Drive API for Android (GDAA)

I realize that similar questions have been asked before. However, none of them was answered.
My problem is the following:
To upload a file to Google Drive, you need to create DriveContents.
You either do this by creating them out of thin air:
DriveApi.DriveContentsResult driveContentsResult = Drive.DriveApi.newDriveContents(getGoogleApiClient()).await();
DriveContents contents = driveContentsResult.getDriveContents();
Or you do this by opening an already existing file:
DriveApi.DriveContentsResult driveContentsResult = driveFileResult.getDriveFile().open(getGoogleApiClient(), DriveFile.MODE_WRITE_ONLY, null).await();
DriveContents contents = driveContentsResult.getDriveContents();
You are now ready to fill the DriveContents with data. You do this by obtaining an OutputStream and by writing to this OutputStream:
FileOutputStream fileOutputStream = new FileOutputStream(driveContentsResult.getDriveContents().getParcelFileDescriptor().getFileDescriptor());
Now this is where the problem starts: when you fill this OutputStream, Google Play services simply copies the file you want to upload and creates a local copy. If you have 0.5 GB of free space on your phone and you want to upload a 1.3 GB file, this is not going to work: there is not enough storage space.
So how is it done? Is there a way to directly upload to Google Drive via the GDAA that does not involve creating a local copy first, and THEN uploading it?
Does the Google REST API handle these uploads any different? Can it be done via the Google REST API?
EDIT:
It seems this cannot be done via the GDAA. For people looking for a way to do resumable uploads with the Google REST API, have a look at my example here on StackOverflow.
I'm not sure whether the GDAA can do it, but you can certainly try the Google Drive REST API to upload your file.
You could use Multipart upload or Resumable upload:
Multipart upload
If you have metadata that you want to send along with the data to upload, you can make a single multipart/related request. This is a good choice if the data you are sending is small enough to upload again in its entirety if the connection fails.
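A hedged sketch of such a multipart/related request against the Drive v3 REST endpoint; the access token and metadata are placeholders, the boundary string is arbitrary, and the endpoint and uploadType parameter follow the Drive REST docs:

```javascript
// Drive multipart upload: metadata (JSON) and media in one request body,
// separated by a boundary. Suited to small files only, since a failure
// means re-sending the whole body.
const BOUNDARY = "gdrive_upload_boundary"; // any string absent from the data

function buildMultipartBody(metadata, mediaBytes, mediaType) {
  return (
    `--${BOUNDARY}\r\nContent-Type: application/json; charset=UTF-8\r\n\r\n` +
    JSON.stringify(metadata) +
    `\r\n--${BOUNDARY}\r\nContent-Type: ${mediaType}\r\n\r\n` +
    mediaBytes +
    `\r\n--${BOUNDARY}--`
  );
}

// Usage sketch (network call; accessToken is a placeholder OAuth token):
async function multipartUpload(accessToken, metadata, mediaBytes, mediaType) {
  const res = await fetch(
    "https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": `multipart/related; boundary=${BOUNDARY}`,
      },
      body: buildMultipartBody(metadata, mediaBytes, mediaType),
    });
  return res.json(); // the created file resource
}
```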
Resumable upload
To upload data files more reliably, you can use the resumable upload protocol. This protocol allows you to resume an upload operation after a communication failure has interrupted the flow of data. It is especially useful if you are transferring large files and the likelihood of a network interruption or some other transmission failure is high, for example, when uploading from a mobile client app. It can also reduce your bandwidth usage in the event of network failures because you don't have to restart large file uploads from the beginning.
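The resumable protocol can be sketched roughly as follows. The session-initiation endpoint and Content-Range header follow the Drive REST docs; the chunk size and error handling are simplified assumptions, and real code must retry and re-query the session URI after a failure rather than just looping:

```javascript
// Drive resumable upload: one POST opens a session, then the file goes
// up in PUT requests carrying Content-Range headers. Chunk sizes must be
// multiples of 256 KiB per the Drive docs.
const CHUNK = 8 * 256 * 1024; // 2 MiB per chunk

// "bytes <first>-<last>/<total>" for a chunk starting at offset.
function contentRange(offset, chunkLen, total) {
  return `bytes ${offset}-${offset + chunkLen - 1}/${total}`;
}

// data: a string or Uint8Array; accessToken is a placeholder OAuth token.
async function resumableUpload(accessToken, metadata, data) {
  // 1) Open the session; Drive answers with a session URI in Location.
  const init = await fetch(
    "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json; charset=UTF-8",
      },
      body: JSON.stringify(metadata),
    });
  const sessionUri = init.headers.get("Location");

  // 2) Send chunks. After an interruption, a real client asks the session
  //    URI which byte was last received and resumes from there.
  for (let offset = 0; offset < data.length; offset += CHUNK) {
    const chunk = data.slice(offset, offset + CHUNK);
    await fetch(sessionUri, {
      method: "PUT",
      headers: {
        "Content-Range": contentRange(offset, chunk.length, data.length),
      },
      body: chunk,
    });
  }
}
```

Because chunks are read from the source on demand, nothing larger than one chunk needs to be buffered, which is exactly what the GDAA's local-copy behaviour prevents.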
You must remember, as discussed in this SO question:
The GDAA's main identifier, the DriveId, lives in GDAA (Google Play Services) only and does not exist in the REST API.
A ResourceId can be obtained from the DriveId only after GDAA has committed (uploaded) the file/folder.
You will run into a lot of timing issues caused by the fact that GDAA 'buffers' network requests on its own schedule (system-optimized), whereas the REST API lets your app control the waiting for the response.
Lastly, you can check this related SO question regarding tokens and authentication in HTTP requests in Android. There are also some examples by seanpj for both GDAA and the REST API that might help you.
Hope this helps.

Can I use Autodesk viewing API to render local DWG (2D) files to my browser?

The main goal of my project is to read AutoCAD (DWG) drawings from my local server and output them in a web browser (Chrome).
I managed to do it with the View and Data API in Java, uploading from AutoCAD with buckets, keys, etc., but when it comes to reading offline files with this sample code from https://github.com/Developer-Autodesk/view-and-data-offline-sample, the DWG format did not work.
Do you have a suggestion, or a clue to using the offline API with DWG files?
The Autodesk View & Data API (developer.autodesk.com) allows you to display a DWG on your website using a zero-client (WebGL) viewer. You need to upload the DWG to the Autodesk server and translate it, and then either download the translation to store on your local server (as demonstrated on extract.autodesk.io) or keep it on the Autodesk server. Downloading it can be advantageous because then you don't need to implement the OAuth code on your server.
Buckets on the Autodesk server can only be accessed using the accesstoken created from your API keys, so it is secure in that only someone with your accesstoken and who knows the URN can access your translated file. However, for the viewer on your client-page to access the file, you need to provide it with your accesstoken. This does mean that someone could separately access your translated file by grabbing the accesstoken and URN from your webpage. But if you're serving up the model on a public page, then you presumably don't care about that.
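Client-side, the token-plus-URN handoff described above looks roughly like this. The viewer entry points follow the Autodesk viewer JavaScript API, while getToken, the container id, and the URN-encoding helper are illustrative assumptions:

```javascript
// The viewer identifies a translated file by the base64-encoded object id
// (e.g. "urn:adsk.objects:os.object:bucket/model.dwg"), padding stripped.
function toViewableUrn(objectId) {
  return btoa(objectId).replace(/=+$/, "");
}

// Browser-only sketch: initialize the viewer, handing it the access token
// your page obtained from your server, then load the translated document.
function launchViewer(urn, getToken) {
  const options = {
    env: "AutodeskProduction",
    getAccessToken: getToken, // your callback; supplies token + expiry
  };
  Autodesk.Viewing.Initializer(options, function () {
    const container = document.getElementById("viewerDiv"); // assumed id
    const viewer = new Autodesk.Viewing.GuiViewer3D(container);
    viewer.start();
    Autodesk.Viewing.Document.load("urn:" + urn, function (doc) {
      // pick a viewable node from doc and display it here
    });
  });
}
```

Since the page needs the token, anyone viewing the page source can read it, which is the exposure described above.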
There is a 'list' API available, but this is white-listed (available on request), so getting your accesstoken and urn for one file doesn't automatically give access to your other files - unless someone can guess the other filenames (or iterate to find them).
If you use a non-permanent bucket, then your original (untranslated file) becomes unavailable when the bucket expires, or you can explicitly delete the untranslated file (using the delete API).
Files translated via the View & Data API are not accessible via A360. They are stored in a separate area. (But I wouldn't be at all surprised if an A360 file access API became available in the near future :-).
Finally, unless you want to interact with the displayed file via the viewer's JavaScript API, you may prefer just to upload your files to A360, share the translated model, and then iframe embed them in your webpage.

Upload large file on server is this possible by create firefox addon

I want to upload a large video file, and for that I want to create a Firefox add-on. Is it possible to create a Firefox add-on that uploads large files to my server?
Or is there any other way to upload large files to a server?
Please suggest.
If you are POSTing the data to the server as application/x-www-form-urlencoded, then you should base64-encode it using btoa() and include it as one of the POST parameters in the request body (i.e. the string passed to XMLHttpRequest.send()). Note that base64 output contains "+" and "=" characters, which must themselves be URL-encoded:
postbody = "body=" + encodeURIComponent(btoa(fileContents));
xhr.send(postbody);
If you are just downloading the file and uploading it right away, you might as well keep it in memory, since you're presumably going to load it into memory anyway in order to base64-encode the contents.
Well, if you're reading the file into memory then you shouldn't need an nsIFile at all. You can just download it using XMLHttpRequest and use responseText, uploading it in the way I described in the answer. If you do have an nsIFile, then yes, that snippet describes how to read from it.
I assume you want to upload via HTTP.
If so, the upload limit is usually decided by the server-side software; this affects both the maximum size and the length of time you have to upload it.
Without a server capable of taking an upload in chunks and reassembling it, you are limited in ways you can't get around through software.
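If your server can take an upload in pieces, the client side of that chunking can be sketched like this; the endpoint, form-field names, and chunk size are all illustrative assumptions, not a drop-in add-on:

```javascript
// Slice a large Blob into fixed-size pieces and POST them one by one,
// tagging each with its index so the server can reassemble the file.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per piece (arbitrary choice)

function sliceBlob(blob, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let start = 0; start < blob.size; start += chunkSize) {
    chunks.push(blob.slice(start, start + chunkSize));
  }
  return chunks;
}

// Usage sketch (network calls; url and field names are placeholders).
async function uploadInChunks(url, blob, filename) {
  const chunks = sliceBlob(blob);
  for (let i = 0; i < chunks.length; i++) {
    const form = new FormData();
    form.append("filename", filename);
    form.append("index", String(i));
    form.append("total", String(chunks.length));
    form.append("chunk", chunks[i]);
    await fetch(url, { method: "POST", body: form }); // server reassembles
  }
}
```

Sending chunks also sidesteps per-request size and timeout limits, since each POST stays small.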
If you want to upload via FTP on the other hand, there are a lot of options... look at FireFTP.
I have made Firefox add-ons for file upload.
I integrated jQuery File Upload.
I created a widget. In the widget I made a panel, and in the panel I created a separate web page for file uploading; the panel loads that page.
For more information you can mail me at chetansinghal1988#gmail.com

Skydrive sync REST API

I have read the docs for the SkyDrive REST APIs but didn't find any API with which I can sync with SkyDrive without recursively polling the folders to check for updates.
Is there any API to get only the updates for a user's drive?
A commonplace reality of epistemology is that...
It is typically much easier to prove that something exists than to prove that it does not exist
Nevertheless, I can say with a high level of confidence that the official REST API for SkyDrive doesn't include a way of getting a list of updated documents for synchronization purposes.
Furthermore, I saw no evidence of an unsupported/unofficial API that would serve this purpose, and by observing the way the Windows client for SkyDrive interacts with the server (within the limits of fair-use reverse engineering), it appears that synchronization is done by reviewing the directory tree rather than by getting a differential list.
I believe the closest you can get is: Get a list of the user's most recently used documents
To get a list of SkyDrive documents that the user has most recently used, use the wl.skydrive scope to make a GET request to /USER_ID/skydrive/recent_docs, where USER_ID is either me or the user ID of the consenting user. Here's an example.
GET http://apis.live.net/v5.0/me/skydrive/recent_docs?access_token=ACCESS_TOKEN
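For completeness, a small sketch of that call with fetch; the access token is a placeholder obtained via the Live Connect OAuth flow, and the URL follows the quoted docs (using https):

```javascript
// Build the recent_docs URL for a given user; userId is "me" or a
// consenting user's id, per the quoted Live Connect docs.
function recentDocsUrl(userId, accessToken) {
  return `https://apis.live.net/v5.0/${userId}/skydrive/recent_docs` +
         `?access_token=${encodeURIComponent(accessToken)}`;
}

// Usage sketch (network call): returns the parsed JSON response, which
// wraps the recently used documents in a "data" array.
async function getRecentDocs(accessToken, userId = "me") {
  const res = await fetch(recentDocsUrl(userId, accessToken));
  return res.json();
}
```

Polling this endpoint and diffing the results is still polling, but it is far cheaper than walking the whole folder tree.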