Google Drive API Error 403 cannotAddParent with service account - asp.net-core

Using a service account with the Google Drive API and Google Sheets API, I create a spreadsheet that I then move to a specific folder, using the following code:
public async Task<File> SaveNewSpreadsheet(Spreadsheet spreadsheet, File folder)
{
    try
    {
        Spreadsheet savedSpreadsheet = await _sheetService.Spreadsheets.Create(spreadsheet).ExecuteAsync();
        string spreadsheetId = GetSpreadsheetId(savedSpreadsheet);
        File spreadsheetFile = await GetFileById(spreadsheetId);
        File spreadsheetFileMoved = await MoveFileToFolder(spreadsheetFile, folder);
        return spreadsheetFileMoved;
    }
    catch (Exception e)
    {
        _logger.LogError(e, "An error has occurred during new spreadsheet save to Google Drive API");
        throw;
    }
}
public async Task<File> MoveFileToFolder(File file, File folder)
{
    try
    {
        var updateRequest = _driveService.Files.Update(new File(), file.Id);
        updateRequest.AddParents = folder.Id;
        if (file.Parents != null)
        {
            string previousParents = String.Join(",", file.Parents);
            updateRequest.RemoveParents = previousParents;
        }
        file = await updateRequest.ExecuteAsync();
        return file;
    }
    catch (Exception e)
    {
        _logger.LogError(e, "An error has occurred during file moving to folder.");
        throw;
    }
}
This used to work fine for a year or so, but since today the MoveFileToFolder request throws the following exception:
Google.GoogleApiException: Google.Apis.Requests.RequestError
Increasing the number of parents is not allowed [403]
Errors [
Message[Increasing the number of parents is not allowed] Location[ - ] Reason[cannotAddParent] Domain[global]
]
The weird thing is that if I create a new service account and use it instead of the previous one, it works fine again.
I've looked for info on this "cannotAddParent" error but I couldn't find anything.
Any ideas on why this error is thrown?

I have the same problem and filed an issue in the Google Issue Tracker. This is intended behavior, unfortunately. You are no longer able to place a file in multiple parents as in your example. See the linked documentation for migration.
Beginning Sept. 30, 2020, you will no longer be able to place a file in multiple parent folders; every file must have exactly one parent folder location. Instead, you can use a combination of status checks and a new shortcut implementation to accomplish file-related operations.
https://developers.google.com/drive/api/v2/multi-parenting
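For reference, a minimal sketch of the shortcut approach with the Google.Apis.Drive.v3 .NET client (assuming the same _driveService and File type as in the question; the shortcut name and Fields selection are illustrative):
public async Task<File> CreateShortcutInFolder(File file, File folder)
{
    // A shortcut is an ordinary Drive file with a special MIME type
    // and a ShortcutDetails.TargetId pointing at the real file.
    var shortcut = new File
    {
        Name = file.Name + " (shortcut)",
        MimeType = "application/vnd.google-apps.shortcut",
        Parents = new List<string> { folder.Id },
        ShortcutDetails = new File.ShortcutDetailsData { TargetId = file.Id }
    };
    var createRequest = _driveService.Files.Create(shortcut);
    createRequest.Fields = "id, shortcutDetails";
    return await createRequest.ExecuteAsync();
}
Note that a plain move (setting AddParents and RemoveParents on the same update, as MoveFileToFolder does) is still allowed; the 403 is raised only when an update would increase the parent count. In the question's code that likely happens because files.get does not return parents unless Fields requests them, so file.Parents is null and RemoveParents is never set, leaving a bare AddParents.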

Related

UWP App: `Image.SetSource` hangs computer on `StorageFiles` outside of `KnownPlaces`

This one is hard to explain, so I'll give you some actual and pseudo code:
try
{
    // If source (a string) points towards a file that is available with
    // StorageFile.GetFileFromPathAsync(), just open the file that way.
    // If that is not possible, use the path to look up an Access Token
    // and use the file from the StorageFolder gotten via that token.
    StorageFile file = await GetFileFromAccessList(source);
    if (file != null)
    {
        bitmap = new BitmapImage();
        using (IRandomAccessStream fileStream = await file.OpenAsync(FileAccessMode.Read))
        {
            await bitmap.SetSourceAsync(fileStream);
        }
    }
}
catch (Exception e)
{
    string s = e.Message;
    bitmap = null;
}
with the following method:
public async Task<StorageFile> GetFileFromAccessList(string path)
{
    StorageFile result = null;
    if (String.IsNullOrEmpty(path) == false)
    {
        try
        {
            // Try to access the file directly...
            result = await StorageFile.GetFileFromPathAsync(path);
        }
        catch (Exception)
        {
            result = null;
            try
            {
                // See if the folder this thing is in is in the access list...
                StorageFolder folder = await GetFolderFromAccessList(Path.GetFullPath(path));
                // If there is a folder, try that.
                if (folder != null)
                    result = await folder.GetFileAsync(Path.GetFileName(path));
            }
            catch (Exception)
            {
                result = null;
            }
        }
    }
    return result;
}
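(GetFolderFromAccessList is not shown in the question; presumably it redeems a token the app persisted for that folder, roughly like this sketch, where LookupTokenForPath is a hypothetical app-specific mapping:)
public async Task<StorageFolder> GetFolderFromAccessList(string folderPath)
{
    // Hypothetical: look up the token this app stored for the folder path.
    string token = LookupTokenForPath(folderPath);
    if (String.IsNullOrEmpty(token))
        return null;
    // Redeem the token for a StorageFolder with the originally granted rights.
    return await Windows.Storage.AccessCache.StorageApplicationPermissions
        .FutureAccessList.GetFolderAsync(token);
}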
The resulting bitmap is used in Image.SetSource() as an ImageSource.
Now what kills me: this call works perfectly, fast and rock solid, for files stored within the app's folder or KnownFolders. So it works like a charm whenever I don't need an access token from Windows.Storage.AccessCache.StorageApplicationPermissions.FutureAccessList.GetFolderAsync(token).
However, it breaks if I have to use an access token, just not all the time.
This code does not break immediately: it breaks when I try to open more than 5-7 source files at the same time.
Repeat that: this works if I display 5-7 images. If I try to open more, it freezes the PC. No such problem occurs when I open StorageFiles without tokens.
I can access such files using normal file operations. I can create bitmaps from them, process them, the works.
I just cannot make them a source of a XAML Image.
Any thoughts?
Ah clarity.
So it turns out that using the DataContextChanged event to refresh the bitmap through Image.SetSource() is the murder weapon.
The solution: declare a property of type BitmapSource. Bind the Image.Source to that property. Update the property with the loaded bitmap upon Image.Loaded and Image.DataContextChanged. It now works stably and fast in all conditions I was able to test.
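A rough sketch of that pattern (untested; names are illustrative, OnPropertyChanged is a hypothetical INotifyPropertyChanged helper, and the XAML would bind Image.Source to the property, e.g. <Image Source="{Binding ImageSource}" Loaded="OnImageLoaded" />):
private BitmapSource _imageSource;
public BitmapSource ImageSource
{
    get { return _imageSource; }
    set { _imageSource = value; OnPropertyChanged(nameof(ImageSource)); }
}

private async void OnImageLoaded(object sender, RoutedEventArgs e)
{
    // Load the bitmap here (and likewise on DataContextChanged) instead of
    // calling Image.SetSource() on the element directly.
    StorageFile file = await GetFileFromAccessList(source);
    if (file == null) return;
    var bitmap = new BitmapImage();
    using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.Read))
    {
        await bitmap.SetSourceAsync(stream);
    }
    ImageSource = bitmap; // the binding pushes this to the Image element
}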

AmazonS3: Getting warning: S3AbortableInputStream:Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection

Here's the warning that I am getting:
S3AbortableInputStream:Not all bytes were read from the S3ObjectInputStream, aborting HTTP connection. This is likely an error and may result in sub-optimal behavior. Request only the bytes you need via a ranged GET or drain the input stream after use.
I tried using try-with-resources, but S3ObjectInputStream doesn't seem to close via this method.
try (S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
     S3ObjectInputStream s3ObjectInputStream = s3object.getObjectContent();
     BufferedReader reader = new BufferedReader(new InputStreamReader(s3ObjectInputStream, StandardCharsets.UTF_8))) {
    // some code here blah blah blah
}
I also tried the below code, explicitly closing the streams, but that doesn't work either:
S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
S3ObjectInputStream s3ObjectInputStream = s3object.getObjectContent();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(s3ObjectInputStream, StandardCharsets.UTF_8))) {
    // some code here blah blah
    s3ObjectInputStream.close();
    s3object.close();
}
Any help would be appreciated.
PS: I am only reading two lines of the file from S3 and the file has more data.
Got the answer via another medium. Sharing it here:
The warning indicates that you called close() without reading the whole file. This is problematic because S3 is still trying to send the data and you're leaving the connection in a sad state.
There are two options here:
1. Read the rest of the data from the input stream so the connection can be reused.
2. Call s3ObjectInputStream.abort() to close the connection without reading the data. The connection won't be reused, so you take some performance hit with the next request to re-create the connection. This may be worth it if it's going to take a long time to read the rest of the file.
Following option #1 of Chirag Sejpal's answer I used the below statement to drain the S3AbortableInputStream to ensure the connection can be reused:
com.amazonaws.util.IOUtils.drainInputStream(s3ObjectInputStream);
I ran into the same problem, and the following class helped me:
@Data
@AllArgsConstructor
public class S3ObjectClosable implements Closeable {
    private final S3Object s3Object;

    @Override
    public void close() throws IOException {
        s3Object.getObjectContent().abort();
        s3Object.close();
    }
}
and now you can use it without the warning:
try (final var s3ObjectClosable = new S3ObjectClosable(s3Client.getObject(bucket, key))) {
    // same code
}
To add an example to Chirag Sejpal's answer (elaborating on option #1), the following can be used to read the rest of the data from the input stream before closing it:
S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
try (S3ObjectInputStream s3ObjectInputStream = s3object.getObjectContent()) {
    try {
        // Read from stream as necessary
    } catch (Exception e) {
        // Handle exceptions as necessary
    } finally {
        while (s3ObjectInputStream != null && s3ObjectInputStream.read() != -1) {
            // Read the rest of the stream
        }
    }
    // The stream will be closed automatically by the try-with-resources statement
}
I ran into the same error.
As others have pointed out, the /tmp space in Lambda is limited to 512 MB, and if the Lambda context is re-used for a new invocation, then the /tmp space is already half-full.
So, when reading the S3 objects and writing all the files to the /tmp directory (as I was doing), I ran out of disk space somewhere in between. Lambda exited with an error, but NOT all bytes from the S3ObjectInputStream had been read.
So, there are two things one needs to keep in mind:
1) If the first execution causes the problem, be stingy with your /tmp space. We have only 512 MB.
2) If the second execution causes the problem, then this can be resolved by attacking the root problem. It's not possible to delete the /tmp folder itself, so delete all the files in the /tmp folder after the execution is finished.
In Java, here is what I did, which successfully resolved the problem.
public String handleRequest(Map<String, String> keyValuePairs, Context lambdaContext) {
    try {
        // All work here
        return "Success";
    } catch (Exception e) {
        logger.error("Error {}", e.toString());
        return "Error";
    } finally {
        deleteAllFilesInTmpDir();
    }
}

private void deleteAllFilesInTmpDir() {
    Path path = java.nio.file.Paths.get(File.separator, "tmp", File.separator);
    try {
        if (Files.exists(path)) {
            deleteDir(path.toFile());
            logger.info("Successfully cleaned up the tmp directory");
        }
    } catch (Exception ex) {
        logger.error("Unable to clean up the tmp directory");
    }
}

public void deleteDir(File dir) {
    File[] files = dir.listFiles();
    if (files != null) {
        for (final File file : files) {
            deleteDir(file);
        }
    }
    dir.delete();
}
This is my solution. I'm using Spring Boot 2.4.3.
Create an Amazon S3 client:
AmazonS3 amazonS3Client = AmazonS3ClientBuilder
    .standard()
    .withRegion("your-region")
    .withCredentials(
        new AWSStaticCredentialsProvider(
            new BasicAWSCredentials("your-access-key", "your-secret-access-key")))
    .build();
Create an Amazon TransferManager client:
TransferManager transferManagerClient = TransferManagerBuilder.standard()
    .withS3Client(amazonS3Client)
    .build();
Create a temporary file at /tmp/{your-s3-key} into which we can download the S3 object:
File file = new File(System.getProperty("java.io.tmpdir"), "your-s3-key");
file.getParentFile().mkdirs(); // Create the parent directories of the temporary file first
try {
    file.createNewFile(); // Create the temporary file itself
} catch (IOException e) {
    e.printStackTrace();
}
Then we download the file from S3 using the transfer manager client:
// Note that in this line the S3 object is downloaded into the temporary file we created
Download download = transferManagerClient.download(
    new GetObjectRequest("your-s3-bucket-name", "your-s3-key"), file);
// This line blocks the thread until the download is finished
download.waitForCompletion();
Now that the S3 file has been successfully transferred into the temporary file that we created, we can get the temporary file's InputStream:
InputStream input = new DataInputStream(new FileInputStream(file));
Because the temporary file is not needed anymore, we just delete it.
file.delete();

Google Sheets API v4 receives HTTP 401 responses for public feeds

I'm having no luck getting a response from v4 of the Google Sheets API when running against a public (i.e. "Published To The Web" AND shared with "Anyone On The Web") spreadsheet.
The relevant documentation states:
"If the request doesn't require authorization (such as a request for public data), then the application must provide either the API key or an OAuth 2.0 token, or both—whatever option is most convenient for you."
And to provide the API key, the documentation states:
"After you have an API key, your application can append the query parameter key=yourAPIKey to all request URLs."
So, I should be able to get a response listing the sheets in a public spreadsheet at the following URL:
https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}?key={myAPIkey}
(with, obviously, the id and key supplied in the path and query string respectively)
However, when I do this, I get an HTTP 401 response:
{
  "error": {
    "code": 401,
    "message": "The request does not have valid authentication credentials.",
    "status": "UNAUTHENTICATED"
  }
}
Can anyone else get this to work against a public workbook? If not, can anyone monitoring this thread from the Google side either comment or provide a working sample?
I managed to get this working. Even I was frustrated at first. And, this is not a bug. Here's how I did it:
First, enable these in your Google Developers Console (GDC) to get rid of authentication errors:
-Google Apps Script Execution API
-Google Sheets API
Note: Make sure the Google account you used in the GDC is the same account you're using in the Spreadsheet project, else you might get a "The API Key and the authentication credential are from different projects" error message.
Go to https://developers.google.com/oauthplayground where you will acquire authorization tokens.
On Step 1, choose Google Sheets API v4 and choose the https://www.googleapis.com/auth/spreadsheets scope so you have both read and write permissions.
Click the Authorize APIs button. Allow the authentication and you'll proceed to Step 2.
On Step 2, click the Exchange authorization code for tokens button. After that, proceed to Step 3.
On Step 3, it's time to paste your URL request. Since the default method is GET, proceed and click the Send the request button.
Note: Make sure your URL requests are the ones indicated in the Spreadsheet v4 docs.
Here's my sample URL request:
https://sheets.googleapis.com/v4/spreadsheets/SPREADSHEET_ID?includeGridData=false
I got an HTTP/1.1 200 OK and it displayed my requested data. This goes for all Spreadsheet v4 server-side processes.
Hope this helps.
We recently fixed this and it should now be working. Sorry for the troubles, please try again.
The document must be shared to "Anyone with the link" or "Public on the web". (Note: the publishing settings from "File -> Publish to the web" are irrelevant, unlike in the v3 API.)
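To double-check the fix, here is a minimal sketch of the unauthenticated call (C# with HttpClient; the spreadsheet ID and API key are placeholders):
using System;
using System.Net.Http;
using System.Threading.Tasks;

class PublicSheetCheck
{
    static async Task Main()
    {
        const string spreadsheetId = "YOUR_SPREADSHEET_ID"; // placeholder
        const string apiKey = "YOUR_API_KEY";               // placeholder
        using (var http = new HttpClient())
        {
            // A plain GET with only the key query parameter; no OAuth token.
            string url = $"https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}?key={apiKey}";
            string json = await http.GetStringAsync(url);
            Console.WriteLine(json); // 200 OK with metadata once the sheet is link-shared
        }
    }
}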
This is not a solution to the problem, but I think it is a good way to achieve the goal. On the site http://embedded-lab.com/blog/post-data-google-sheets-using-esp8266/ I found out how to update a spreadsheet using Google Apps Script. That is an example with the GET method; here I will show you the POST method with JSON format.
How to POST:
Create a Google Spreadsheet, and in the tab Tools > Script Editor paste the following script. Modify the script by entering the appropriate spreadsheet ID and sheet tab name (in the openByUrl and getSheetByName calls below).
function doPost(e)
{
    var success = false;
    if (e != null)
    {
        var JSON_RawContent = e.postData.contents;
        var PersonalData = JSON.parse(JSON_RawContent);
        success = SaveData(
            PersonalData.Name,
            PersonalData.Age,
            PersonalData.Phone
        );
    }
    // Return plain text Output
    return ContentService.createTextOutput("Data saved: " + success);
}

function SaveData(Name, Age, Phone)
{
    try
    {
        var dateTime = new Date();
        // Paste the URL of the Google Sheets starting from https thru /edit
        // For e.g.: https://docs.google.com/---YOUR SPREADSHEET ID---/edit
        var MyPersonalMatrix = SpreadsheetApp.openByUrl("https://docs.google.com/spreadsheets/d/---YOUR SPREADSHEET ID---/edit");
        var MyBasicPersonalData = MyPersonalMatrix.getSheetByName("BasicPersonalData");
        // Get last edited row
        var row = MyBasicPersonalData.getLastRow() + 1;
        MyBasicPersonalData.getRange("A" + row).setValue(Name);
        MyBasicPersonalData.getRange("B" + row).setValue(Age);
        MyBasicPersonalData.getRange("C" + row).setValue(Phone);
        return true;
    }
    catch (error)
    {
        return false;
    }
}
Now save the script and go to the tab Publish > Deploy as Web App.
Execute the app as: Me (xyz@gmail.com)
Who has access to the app: Anyone, even anonymous
Then, to test, you can use the Postman app.
Or using UWP:
private async void Button_Click(object sender, RoutedEventArgs e)
{
    using (HttpClient httpClient = new HttpClient())
    {
        httpClient.BaseAddress = new Uri(@"https://script.google.com/");
        httpClient.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
        httpClient.DefaultRequestHeaders.AcceptEncoding.Add(new System.Net.Http.Headers.StringWithQualityHeaderValue("utf-8"));
        string endpoint = @"/macros/s/---YOUR SCRIPT ID---/exec";
        try
        {
            PersonalData personalData = new PersonalData();
            personalData.Name = "Jarek";
            personalData.Age = "34";
            personalData.Phone = "111 222 333";
            HttpContent httpContent = new StringContent(JsonConvert.SerializeObject(personalData), Encoding.UTF8, "application/json");
            HttpResponseMessage httpResponseMessage = await httpClient.PostAsync(endpoint, httpContent);
            if (httpResponseMessage.IsSuccessStatusCode)
            {
                string jsonResponse = await httpResponseMessage.Content.ReadAsStringAsync();
                // do something with json response here
            }
        }
        catch (Exception ex)
        {
        }
    }
}
public class PersonalData
{
    public string Name;
    public string Age;
    public string Phone;
}
The above code requires the Newtonsoft.Json NuGet package.
If your feed is public and you are using an API key, make sure you are sending an HTTP GET request. In the case of a POST request, you will receive this error.
I faced the same issue: I was getting data using Method: spreadsheets.getByDataFilter, which uses a POST request.

Custom Error Message for Event Receiver in SharePoint 2010

I want users to upload only .doc files to the document library.
To do so, I have developed an event receiver in Visual Studio 2010.
My code is as follows:
public override void ItemAdding(SPItemEventProperties properties)
{
    try
    {
        base.ItemAdding(properties);
        EventFiringEnabled = false;
        if (!properties.AfterUrl.EndsWith(".doc"))
        {
            properties.ErrorMessage = "You are allowed to upload only .doc files";
            properties.Status = SPEventReceiverStatus.CancelWithError;
            properties.Cancel = true;
        }
    }
    catch (Exception ex)
    {
        properties.Status = SPEventReceiverStatus.CancelWithError;
        properties.ErrorMessage = ex.Message.ToString();
        properties.Cancel = true;
    }
}
The code is adapted from this example.
My problem is that when I upload non-doc files, the upload is blocked, but with the system error message rather than the user-friendly one defined in properties.ErrorMessage.
How do I solve this?
Please help.
I used the same code you have provided in your question, and I get the custom error message displayed as expected.
Please provide details of the error you are getting.

Trouble Attaching File Programmatically to Email in Windows Metro App C#/XAML using Share Charm

I'm simply trying to attach a file named Document.pdf in the DocumentsLibrary to an email using the Share Charm. My code below works perfectly on the Local Machine:
private async void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    List<IStorageItem> shares = new List<IStorageItem>();
    StorageFile filetoShare = await Windows.Storage.KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
    if (filetoShare != null)
    {
        shares.Add(filetoShare);
        filetoShare = null;
    }
    if (shares != null)
    {
        DataPackage requestData = e.Request.Data;
        requestData.Properties.Title = "Title";
        requestData.Properties.Description = "Description"; // The description is optional.
        requestData.SetStorageItems(shares);
        shares = null;
    }
    else
    {
        e.Request.FailWithDisplayText("File not Found.");
    }
}
But when I run the exact same code on a Windows Surface Tablet, I get the dreaded "There's nothing to share right now." on the right in the Charms flyout area.
Here's a little more background to help:
I'm not looking to use a File Picker...I know the exact file I'm looking for
I've enabled the Documents Library Capability in the manifest
I've added a File Type Association for pdf in the manifest
and yes, the file does exist and is in the Documents Library
an email account is properly setup in the Mail App on the surface
I can successfully send text emails from the Tablet...just not emails with attachments
Like I said, this works on my Win 8 Development Machine as expected...just not on the Surface. I'm wondering if the Surface has different file or folder permissions?
Thanks for the help...this is driving me CRAZY
I finally figured it out: the problem was that my event handler was async (so that I could use await to set the StorageFile variable).
I solved it by setting the StorageFile variable earlier in my code so that it was already available when the event handler was called.
I still have no idea why it worked on my development machine but not on the WinRT Surface...
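A minimal sketch of that fix (names are illustrative; the file is resolved ahead of time, e.g. in OnNavigatedTo, so the DataRequested handler itself stays synchronous):
private StorageFile _fileToShare;

protected override async void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    // Resolve the file up front, before any share request arrives.
    _fileToShare = await Windows.Storage.KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
    DataTransferManager.GetForCurrentView().DataRequested += OnDataRequestedFiles;
}

private void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    if (_fileToShare == null)
    {
        e.Request.FailWithDisplayText("File not Found.");
        return;
    }
    DataPackage requestData = e.Request.Data;
    requestData.Properties.Title = "Title";
    requestData.Properties.Description = "Description";
    requestData.SetStorageItems(new List<IStorageItem> { _fileToShare });
}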
The handler can be an async method. In this case, it is critical to get the deferral before making the async calls. Please refer to the MSDN page specifically for this scenario. For your convenience, the code from that page is copied here:
private void RegisterForShare()
{
    DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
    dataTransferManager.DataRequested += new TypedEventHandler<DataTransferManager,
        DataRequestedEventArgs>(this.ShareStorageItemsHandler);
}

private async void ShareStorageItemsHandler(DataTransferManager sender,
    DataRequestedEventArgs e)
{
    DataRequest request = e.Request;
    request.Data.Properties.Title = "Share StorageItems Example";
    request.Data.Properties.Description = "Demonstrates how to share files.";
    // Because we are making async calls in the DataRequested event handler,
    // we need to get the deferral first.
    DataRequestDeferral deferral = request.GetDeferral();
    // Make sure we always call Complete on the deferral.
    try
    {
        StorageFile logoFile =
            await Package.Current.InstalledLocation.GetFileAsync("Assets\\Logo.png");
        List<IStorageItem> storageItems = new List<IStorageItem>();
        storageItems.Add(logoFile);
        request.Data.SetStorageItems(storageItems);
    }
    finally
    {
        deferral.Complete();
    }
}
It is critical to place the following statement before any async method is called:
DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
You only have half a second to get the whole job done (getting the file, attaching, etc.). If you miss the half-second deadline, you'll get this "driving crazy" message. Consider implementing some resumable logic and replacing the message with "the attachment is being prepared, please try again in a few seconds" (or similar).
Your WinRT device might just be slower than your development machine; the latter simply finishes the job before the deadline...