How to upload a local file into SharePoint using the Graph API?

I am trying to upload a local PDF file into SharePoint using the Graph API, but the PUT method is only letting me create a new file in SharePoint; it is not letting me upload an existing PDF file from my local C:\ drive.
Here is the API that I am using:
PUT https://graph.microsoft.com/v1.0/drives/{drive-id}/root:/SharePointFilePath/file-name:/content

As per here, that is the correct endpoint; however, you need to send the file as a binary stream in the body of the request: https://learn.microsoft.com/en-us/graph/api/driveitem-put-content?view=graph-rest-1.0&tabs=http
This means that you need to read your local PDF file into a binary stream and send it in the body of the PUT request. As a test, for example, you could create a regular text file by sending the body as a string with the content type set to text.
Also, if the file is larger than 4 MB, then you need to use a different method, mainly an upload session: https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/driveitem_createuploadsession
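For the larger-file case, a minimal upload-session sketch in PowerShell might look like the following (a sketch only, assuming a valid $accessToken, a $driveId, and a hypothetical local path; every chunk except the last must be a multiple of 320 KiB):
# Rough sketch: create an upload session for the target path, then PUT the file in chunks
$localPath = "C:\Temp\report.pdf"
$bytes = [IO.File]::ReadAllBytes($localPath)
$sessionUrl = "https://graph.microsoft.com/v1.0/drives/$driveId/root:/SharePointFilePath/report.pdf:/createUploadSession"
$session = Invoke-RestMethod -Uri $sessionUrl -Method Post -Headers @{ Authorization = "Bearer $accessToken" }
# Upload in chunks of ~3.2 MB; the uploadUrl is pre-authenticated, so no Authorization header is needed on the chunk requests
$chunkSize = 320KB * 10
for ($offset = 0; $offset -lt $bytes.Length; $offset += $chunkSize) {
    $end = [Math]::Min($offset + $chunkSize, $bytes.Length) - 1
    $chunk = [byte[]]($bytes[$offset..$end])   # slow slicing, but fine for illustration
    Invoke-RestMethod -Uri $session.uploadUrl -Method Put -Body $chunk -Headers @{ "Content-Range" = "bytes $offset-$end/$($bytes.Length)" }
}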

Here is a previous test script to upload a file to SharePoint with PnP PowerShell.
https://social.technet.microsoft.com/Forums/msonline/en-US/25191b51-41dd-4f4b-93aa-a46594c9a184/uploading-a-file-to-sharepoint-online-using-microsoft-graph-api?forum=onlineservicessharepoint
# Connect with an Azure AD app registration and get a Graph access token
Connect-PnPOnline -AppId 'yourAzure app id' -AppSecret "yourAzure app secret" -AADDomain 'xxx.onmicrosoft.com'
$accessToken = Get-PnPAccessToken
# Read the local file's bytes and PUT them to the drive item path
$apiUrl = "https://graph.microsoft.com/v1.0/sites/xxx.sharepoint.com,x,x/drives/driveid/items/01AKXHS4ELSEOTLPZHZZDYYBOW57WR6HK6:/test.xlsx:/content"
$path = "C:\Lee\Script\testdata.xlsx"
$file = [IO.File]::ReadAllBytes($path)
Invoke-RestMethod -Headers @{Authorization = "Bearer $accessToken"} -Uri $apiUrl -Method Put -Body $file -ContentType "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"

Related

Attach files larger than 5 MB using the Gmail API resumable upload

I am trying to upload an attachment which is greater than 5 MB in size, hence using the resumable upload option of the Gmail API via the REST API (https://developers.google.com/gmail/api/guides/uploads#resume-upload).
In the API spec it is not clear what the request body of the resumable upload request and the request body sent to the upload URI should be.
I have tried all possible ways of adding parameters to the body of the request, but couldn't succeed in attaching the file to the email.
Has anyone been able to successfully attach files using the resumable upload API?
Also, does the attachment file need to be uploaded base64-encoded?
The developer guide doesn't provide much information, so I have been unable to get a working solution.
Can anyone provide a snippet of the resumable upload request and the request to the upload URI?
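A hedged sketch of the two-step flow in PowerShell (the documented /upload/gmail/v1/users/me/messages/send?uploadType=resumable endpoint; $accessToken and $mimePath, a file holding the complete RFC 822 MIME message with the attachment already inside it as a base64 MIME part, are assumptions):
# Sketch only: step 1 initiates the session; the body of this first request is just the (optional) Message metadata as JSON
$mimeBytes = [IO.File]::ReadAllBytes($mimePath)
$init = Invoke-WebRequest -Method Post `
    -Uri "https://www.googleapis.com/upload/gmail/v1/users/me/messages/send?uploadType=resumable" `
    -Headers @{ Authorization = "Bearer $accessToken"; "X-Upload-Content-Type" = "message/rfc822"; "X-Upload-Content-Length" = "$($mimeBytes.Length)" } `
    -ContentType "application/json" -Body "{}" -UseBasicParsing
# Step 2: PUT the raw MIME message to the session URI returned in the Location header.
# Attachments are base64-encoded inside their MIME parts, but the message as a whole is not.
$uploadUri = "$($init.Headers['Location'])"
Invoke-RestMethod -Method Put -Uri $uploadUri -ContentType "message/rfc822" -Body $mimeBytes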

Azure Logic App HTTP API response with Excel file download using Postman

I created an Azure Logic App with an HTTP request trigger; it returns a response for a normal JSON schema. However, I want to attach a SharePoint Excel sheet when I trigger the request from Postman.
1. How do I use the content type or schema to download the attached file when the Postman request is sent?
2. Is it possible to download the file when you hit the API through the Logic App?
3. The generated HTTP POST URL is working.
For your requirement, I tested it on my side. It seems we do not need to set any value for "Content-Type" in the headers of the response. Please refer to my logic app below:
Then, when you request the Logic App URL in Postman, please choose "Send and Download" instead of "Send".
After that, you can download the file when you request the URL in Postman.
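Outside Postman, the same "Send and Download" step can be scripted; here is a minimal sketch, assuming the generated HTTP POST URL is stored in $logicAppUrl and using a hypothetical JSON payload that matches the trigger schema:
# Sketch only: POST to the Logic App trigger URL and write the returned file stream to disk
$body = '{ "fileName": "Report.xlsx" }'
Invoke-WebRequest -Uri $logicAppUrl -Method Post -ContentType "application/json" -Body $body -OutFile "C:\Temp\Report.xlsx"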

How to construct the header for a REST API call to Azure Storage with SAS (shared access signature)

I am debugging a client application that calls a REST API with an Azure Storage SAS (shared access signature) to access an Azure Storage resource. However, it is not getting through: Azure throws an error stating that a mandatory header is missing and aborts the operation.
The REST API is fairly simple, although it embeds the SAS token generated by the Azure storage account. The client application uses the REST API to write data into an Azure blob.
Is there anywhere I can find a good example showing how to generate the header for the REST API (SAS)? I need to find out the exact layout of the header (such as the type of information that needs to be embedded in it).
Also, do I need to register my client application with Azure AD?
I didn't think my client application needed to be registered with Azure, since that is why we have client-side SAS. But I could be wrong, so any input will be appreciated.
Thanks in advance.
If you use a SAS token to call the Azure Blob REST API, the request URL should look like:
https://myaccount.blob.core.windows.net/<container>/<blob>?<sastoken>
For example
# Build a storage context and generate an account SAS with the needed permissions
$accountName = ""
$accountKey = ""
$containerName = "output"
$blobName = "test.txt"
$context = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
$sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object -Permission "rwdlacx" -Context $context
# Put Blob: x-ms-blob-type is the header the service treats as mandatory here
$body = "Hello"
$headers = @{"x-ms-blob-type" = "BlockBlob"; "Content-Type" = "text/plain"}
$url = "https://$accountName.blob.core.windows.net/$containerName/$blobName$sas"
Invoke-WebRequest -Uri $url -Method Put -Headers $headers -Body $body -UseBasicParsing
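For the original scenario (writing local data to a blob using nothing but the SAS token), here is a sketch of the raw call, assuming a pre-generated $sasToken string that already starts with "?" and a hypothetical local file; no Azure AD app registration is needed for this path, because the SAS token itself carries the authorization:
# Sketch only: Put Blob over raw REST with a SAS token. x-ms-blob-type is required for
# Put Blob, Content-Length is added automatically, and x-ms-version can be omitted here
# because the sv parameter inside the SAS already pins the service version.
$bytes = [IO.File]::ReadAllBytes("C:\Temp\data.json")
$headers = @{ "x-ms-blob-type" = "BlockBlob" }
$url = "https://$accountName.blob.core.windows.net/$containerName/data.json$sasToken"
Invoke-WebRequest -Uri $url -Method Put -Headers $headers -Body $bytes -ContentType "application/json" -UseBasicParsing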

Microsoft Graph v1.0 download file content not working

I am trying to download a Word document named Test.docx, located in a SharePoint document library, from within a Visio file macro, referring to the Microsoft documentation at
https://learn.microsoft.com/en-us/graph/api/driveitem-get-content?view=graph-rest-1.0&tabs=http
I am able to get my site ID and item ID, ending up with the URL below:
https://graph.microsoft.com/v1.0/sites/{site-id}/drive/items/{item-id}
Since I want to download the file content in binary and save it locally, I add /content to the URL as below:
https://graph.microsoft.com/v1.0/sites/{site-id}/drive/items/{item-id}/content
My code in the VBA environment looks as below, with an active access token:
myURL = "https://graph.microsoft.com/v1.0/sites/{site-id}/drive/items/{item-id}/content"
Set winHttpReq = New XMLHTTP60
winHttpReq.Open "GET", myURL
winHttpReq.setRequestHeader "Authorization", "Bearer " & AccessToken
winHttpReq.send
I am getting an "Access is denied." error at the line winHttpReq.send, and control exits out of the function.
Files.Read.All, Files.ReadWrite.All, Sites.Read.All, Sites.ReadWrite are the permissions already given for the API access.
Also, I did another call fetching the "@microsoft.graph.downloadUrl" download URL, but I did not see any binary content in the response.
My questions are,
Is my approach correct?
I need the file's binary data to be downloaded using the Graph API, but adding /content to the URL gives this error. How can I get the file using the API?
Where can I fetch the binary content from the response to the "@microsoft.graph.downloadUrl" of the file, so that I can download and save the file locally?
Is there any other approach to download a file using the Graph API that will work in the VBA environment?
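As a cross-check outside VBA, here is a PowerShell sketch of the downloadUrl route (assuming a valid $siteId, $itemId and $accessToken): fetch the driveItem metadata first, then download the pre-authenticated URL without a bearer token.
# Sketch only: /content answers with a 302 to a short-lived download URL; the item
# metadata exposes the same URL as @microsoft.graph.downloadUrl
$item = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/sites/$siteId/drive/items/$itemId" -Headers @{ Authorization = "Bearer $accessToken" }
# The download URL is pre-authenticated, so no Authorization header is sent here
Invoke-WebRequest -Uri $item.'@microsoft.graph.downloadUrl' -OutFile "C:\Temp\Test.docx"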

Why is a file being corrupted during a multipart upload into Express running in AWS Lambda?

I have a SPA with a Redux client and an Express web API. One of the use cases is to upload a single file from the browser to the Express server. Express is using the multer middleware to decode the file upload and place it into an array on the req object. Everything works as expected when running on localhost.
However when the app is deployed to AWS, it does not function as expected. Deployment pushes the express api to an AWS Lambda function, and the redux client static assets are served by Cloudfront CDN. In that environment, the uploaded file does make it to the express server, is handled by multer, and the file does end up as the first (and only) item in the req.files array where it is expected to be.
The problem is that the file contains the wrong bytes. For example when I upload a sample image that is 2795 bytes in length, the file that ends up being decoded by multer is 4903 bytes in length. Other images I have tried always end up becoming larger by approximately the same factor by the time multer decodes and puts them into the req.files array. As a result, the files are corrupted and are not displaying as images.
The file is uploaded like so:
<input type="file" name="files" onChange={this.onUploadFileSelected} />
...
onUploadFileSelected = (e) => {
  const file = e.target.files[0]
  var formData = new FormData()
  formData.append("files", file)
  axios.post('to the url', formData, { withCredentials: true })
    .then(handleSuccessResponse).catch(handleFailResponse)
}
I have tried setting up multer with both MemoryStorage and DiskStorage. Both work, both on localhost and in the aws lambda, however both exhibit the same behavior -- the file is a larger size and corrupted in the store.
I have also tried setting up multer as both a global middleware (via app.use) and as a route-specific middleware on the upload route (via routes.post('the url', multerMiddleware, controller.uploadAction)). Again, both exhibit the same behavior. Multer middleware is configured like so:
const multerMiddleware = multer({/* optionally set dest: '/tmp' */})
.array('files')
One difference is that on localhost, both the client and express are served over http, whereas in aws, both the client and express are served over https. I don't believe this makes a difference, but I have yet been unable to test -- either running localhost over https, or running in aws over http.
Another peculiar thing I noticed was that when the multer middleware is present, other middlewares do not seem to function as expected. Rather than the next() function moving flow down to the controller action, other middlewares will completely exit before the controller action is invoked, and when the controller invocation exits, control does not flow back into the middleware after the next() call. When the multer middleware is removed, the other middlewares do function as expected. However, this observation is on localhost, where the entire end-to-end use case does function as expected.
What could be messing up the uploaded image file payload when deployed to the cloud, but not on localhost? Could it really be https making the difference?
Update 1
When I upload this file (11228 bytes)
Here is the HAR chrome is giving me for the local (expected) file upload:
"postData": {
"mimeType": "multipart/form-data; boundary=----WebKitFormBoundaryC4EJZBZQum3qcnTL",
"text": "------WebKitFormBoundaryC4EJZBZQum3qcnTL\r\nContent-Disposition: form-data; name=\"files\"; filename=\"danludwig.png\"\r\nContent-Type: image/png\r\n\r\n\r\n------WebKitFormBoundaryC4EJZBZQum3qcnTL--\r\n"
}
Here is the HAR chrome is giving me for the aws (corrupted) file upload:
"postData": {
"mimeType": "multipart/form-data; boundary=----WebKitFormBoundaryoTlutFBxvC57UR10",
"text": "------WebKitFormBoundaryoTlutFBxvC57UR10\r\nContent-Disposition: form-data; name=\"files\"; filename=\"danludwig.png\"\r\nContent-Type: image/png\r\n\r\n\r\n------WebKitFormBoundaryoTlutFBxvC57UR10--\r\n"
}
The corrupted image file that is saved is 19369 bytes in length.
Update 2
I created a text file with the text hello world that is 11 bytes long and uploaded it. It does NOT become corrupted in aws. This is the case even if I upload it with the txt or png suffix, it ends up as 11 bytes in length when persisted.
Update 3
Tried uploading with a much larger text file (12132 bytes long) and had the same result as in update 2 -- the file is persisted intact, not corrupted.
Potential answers:
Found this https://forums.aws.amazon.com/thread.jspa?threadID=252327
API Gateway does not natively support multipart form data. It is possible to configure binary passthrough to then handle this multipart data in your integration (your backend integration or Lambda function).
It seems that you may need another approach if you are using API Gateway events in AWS to trigger the lambda that hosts your express server.
Or, you could configure API Gateway to work with binary payloads per https://stackoverflow.com/a/41770688/304832
Or, upload directly from your client to a signed s3 url (or a public one) and use that to trigger another lambda event.
Until we get a chance to try out different API Gateway settings, we found a temporary workaround: using FileReader to convert the file to a base64 text string, then submitting that. The upload does not seem to have any issues as long as the payload is text.