How to Upload Files to a Google Drive Folder with Full Read/Write Access

I have created a Test folder on my Google Drive. I want to upload files to this folder using its hard-coded Google Drive URL. Anyone can access this folder, and anyone can add and delete files, because I have given it full access. But the system throws the error "The remote server returned an error: (400) Bad Request" on the UploadFile call. Below is my code. Please help me resolve the issue. Thanks.
private void Upload(string fileName)
{
    var client = new WebClient();
    var uri = new Uri("https://drive.google.com/drive/folders/1yPkWZf03yhihjkejQYrhuS_SMxh9j8AP?usp=sharing");
    try
    {
        client.Encoding = Encoding.UTF8;
        client.UploadFile(uri, fileName);
        //client.Headers.Add("fileName", Path.GetFileName(fileName));
        //var data = File.ReadAllBytes(fileName);
        //client.UploadDataAsync(uri, data);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

This will never work. WebClient's UploadFile (and UploadDataAsync) performs a standard HTTP POST to the specified Uri, and your Uri isn't the kind that accepts an HTTP POST. Instead, it points to Google Drive's web app, which lets a user manage files in the folder interactively, whether that's adding or removing them.
To add files to a Google Drive folder programmatically, use the Google Drive API.
Refer to: https://developers.google.com/drive/api/guides/folder
That guide shows how to create a folder and upload files into a specified folder.
This guide: https://developers.google.com/workspace/guides/get-started shows how to get started if you have never used any of Google's APIs.
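For illustration, an upload into a specific folder with the Drive v3 .NET client (the Google.Apis.Drive.v3 NuGet package) looks roughly like the sketch below. It assumes you already have an authorized DriveService (credential setup is what the get-started guide covers) and that folderId is the ID at the end of your folder URL (1yPkWZf03yhihjkejQYrhuS_SMxh9j8AP); the method name is just for this example:
using System.Collections.Generic;
using System.IO;
using Google.Apis.Drive.v3;

private void UploadToDrive(DriveService service, string fileName, string folderId)
{
    // File metadata: display name and the parent folder to upload into.
    var fileMetadata = new Google.Apis.Drive.v3.Data.File
    {
        Name = Path.GetFileName(fileName),
        Parents = new List<string> { folderId }
    };
    using (var stream = new FileStream(fileName, FileMode.Open, FileAccess.Read))
    {
        // Media upload to the Drive v3 files endpoint, into the given parent folder.
        var request = service.Files.Create(fileMetadata, stream, "application/octet-stream");
        request.Fields = "id";
        var progress = request.Upload();
        if (progress.Status != Google.Apis.Upload.UploadStatus.Completed)
        {
            MessageBox.Show(progress.Exception?.Message);
        }
    }
}
The service itself would be built with new DriveService(new BaseClientService.Initializer { HttpClientInitializer = credential, ApplicationName = "..." }), where credential comes from whichever OAuth or service-account flow the get-started guide walks you through.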

Related

.NET Core 6 API access to SharePoint Online fails with HTTP Error 403

To download a file from SharePoint Online, I tried to access it from a .NET Core 6 Web API using CSOM; here is my sample code.
using Microsoft.SharePoint.Client;
using System.Net;
using System.Security;

private void DownloadFileFromSPO()
{
    string site = "https://domain.sharepoint.com/sites/xxx";
    string user = "userID";
    SecureString securePwd = new SecureString();
    string password = "password";
    foreach (char c in password.ToCharArray())
    {
        securePwd.AppendChar(c);
    }
    using (ClientContext context = new ClientContext(site))
    {
        context.Credentials = new NetworkCredential(user, securePwd);
        // fileUrl: URL of the document to download (not shown in the original post)
        Microsoft.SharePoint.Client.File file = context.Web.GetFileByUrl(fileUrl);
        context.Load(file);
        context.ExecuteQuery();
        //do something
    }
}
Unfortunately, when I called this method I got an HTTP error 403: FORBIDDEN. According to this article, error 403 means the server is telling you, "I'm sorry. I know who you are–I believe who you say you are–but you just don't have permission to access this resource."
So I suppose my API was trying to access the correct site with the correct user ID and password, but did not have the authorization to access it. I also found many articles suggesting that NetworkCredential(user, securePwd) be replaced with SharePointOnlineCredentials(user, securePwd). However, when I tried this, I found that my installed CSOM package did not have this class.
I also found that the MS docs say SharePointOnlineCredentials is no longer available; they recommend an "OAuth access token" approach to authentication instead.
My question is: is that the only way to authenticate against SharePoint Online? Due to company policy, going through the Azure AD application registration process is a problem for me. Is there another solution, using only code and a package, in this scenario?
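For reference, the token-based call that the docs point to ends up looking roughly like the sketch below. It assumes an access token for the tenant is already available (acquiring that token is exactly the Azure AD registration step in question, so this does not remove that requirement); the accessToken and fileUrl variables are placeholders:
using (var context = new ClientContext(site))
{
    // Attach the bearer token to every CSOM request instead of using credentials.
    // accessToken is assumed to be a valid OAuth token for https://domain.sharepoint.com.
    context.ExecutingWebRequest += (sender, e) =>
    {
        e.WebRequestExecutor.RequestHeaders["Authorization"] = "Bearer " + accessToken;
    };
    Microsoft.SharePoint.Client.File file = context.Web.GetFileByUrl(fileUrl);
    context.Load(file);
    context.ExecuteQuery();
}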

Dropbox API upload to root directory?

I'm using dropbox-php-sdk and I cannot upload to the root (team) directory, only to my account directory. Instead of uploading to home/Movies, it is uploading to home/Dave Smith/home/Movies.
//Configure Dropbox service
$dropbox = new Dropbox($app);
//Chunk Size
$chunkSize = 8000000;
$pathToLocalFile = '/home/mywebsite/public_html/assets/videos/input3.mp4';
// Automatically create stream through file path
$dropboxFile = new DropboxFile($pathToLocalFile);
$file = $dropbox->uploadChunked($dropboxFile, "/home/Movies/input3.mp4", null, $chunkSize);
My Dropbox account has full permissions to access all files/folders in the team space. The app/access token was created by a team admin account. How do I solve this?
By default, Dropbox API calls operate in the "member folder" ("/Dave Smith" in your example).
You can configure Dropbox API calls to operate in the "team space" instead though, by setting the Dropbox-API-Path-Root header. You can find information on using that in the Team Files Guide.
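For illustration, at the raw HTTP level the header is a small JSON value naming the namespace to treat as the root. A sketch (the namespace ID below is made up; check the Team Files Guide and your SDK's documentation for how to pass custom headers from dropbox-php-sdk):
Dropbox-API-Path-Root: {".tag": "root", "root": "1234567890"}
Here "1234567890" stands for the team's root namespace ID, which you can read from root_info.root_namespace_id in the /2/users/get_current_account response.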

How to load images/videos directly from server in Blazor WebAssembly?

I implemented a solution in Blazor WebAssembly that allows the user to upload an image; the file is stored on the server and the file path in the database. When I load the image on the client, I send the image bytes from the server to the client, transform them into a base64 string, and display the image. This works well, but from what I read it is not good practice, because the base64 transformation increases the file size by about 30% and the images cannot be cached on the client. I would rather let the browser download the file from the server path and display the image. My questions are the following:
How do I set the server 'root' location so the browser knows where to get the files? I'm using IWebHostEnvironment on the server to get the local path.
Is there any way to secure the files so that only authorised users can download them from the server?
Should I put the images in the same folder as the solution, or should I make a separate folder outside of the solution?
I'm currently using IIS Server with ASP.NET CORE 5 in a Blazor WebAssembly solution.
EDIT 1:
Controller method:
[HttpGet(API.GetFile)]
public async Task<FileContentResult> GetFile([FromQuery] string fileType, string encoding, string name)
{
    var path = Path.Combine(_webHost.ContentRootPath, "Media", "Images", name);
    byte[] file = await System.IO.File.ReadAllBytesAsync(path);
    return File(file, encoding);
}
Link used to access the file :
https://localhost:44339/GetFile?encoding=image/jpeg&name=3020_1_21.JPEG
Generated tag in the browser:
<img style="width:300px;height:300px" src="https://localhost:44339/GetFile?encoding=image/jpeg&name=3020_1_21.JPEG">
As I mentioned in the comments, the link works in Postman and retrieves the image, but it does not work in the browser; no file is retrieved.
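Regarding the first question above, one common way to let the browser fetch files directly from a server folder is to map that folder with a static file provider in Startup.Configure. A sketch, assuming the same Media/Images folder used by the controller (the /media prefix is made up, and authorization would still need to be layered on top, e.g. by keeping the controller for protected files):
using System.IO;
using Microsoft.Extensions.FileProviders;

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    // Serve files from <content root>/Media/Images under the /media URL prefix,
    // so the client can use <img src="https://localhost:44339/media/3020_1_21.JPEG" />.
    app.UseStaticFiles(new StaticFileOptions
    {
        FileProvider = new PhysicalFileProvider(
            Path.Combine(env.ContentRootPath, "Media", "Images")),
        RequestPath = "/media"
    });

    // ... rest of the existing pipeline (routing, endpoints, etc.)
}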

Serve git-lfs files from express' public folder

I'm using node.js (express) on Heroku, where the slug size is limited to 300MB.
In order to keep my slug small, I'd like to use git-lfs to track my express' public folder.
That way, all my assets (images, videos, ...) are uploaded to an LFS store (say AWS S3) and git-lfs leaves a pointer file (with, presumably, the S3 URL in it?).
I'd like express to redirect to the remote S3 file when serving files from the public folder.
My problem is that I don't know how to retrieve the URL from the pointer file's contents...
app.use('/public/:pointerfile', function (req, res, next) {
  var file = req.params.pointerfile;
  fs.readFile('public/' + file, function (er, data) {
    if (er) return next(er);
    var url = retrieveUrl(data); // <-- HELP ME HERE with the retrieveUrl function
    res.redirect(url);
  });
});
Also, won't it be too expensive to make express read and parse potentially all of the public/* files? Maybe I could cache the URL once it's parsed?
Actually, the pointer file doesn't contain any URL information (as can be seen in the link you provided, or here); it just keeps the oid (object ID) of the blob, which is simply its sha256 checksum.
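For reference, a pointer file is just a few lines of plain text along these lines (the hash and size here are made up):
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 12345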
You can, however, achieve what you're looking for using the oid and the LFS API, which lets you download specific oids using a batch request.
You can tell which endpoint is used to store your blobs from .git/config, which can accept non-default lfsurl settings such as:
[remote "origin"]
url = https://...
fetch = +refs/heads/*:refs/remotes/origin/*
lfsurl = "https://..."
or a separate
[lfs]
url = "https://..."
If there's no lfsurl tag then you're using GitHub's endpoint (which may in turn redirect to S3):
Git remote: https://git-server.com/user/repo.git
Git LFS endpoint: https://git-server.com/user/repo.git/info/lfs
Git remote: git@git-server.com:user/repo.git
Git LFS endpoint: https://git-server.com/user/repo.git/info/lfs
But you should work against it and not S3 directly, as GitHub's redirect response will probably contain some authentication information as well.
Check the batch response doc to see the response structure - you will basically need to parse the relevant parts and make your own call to retrieve the blobs (which is what git lfs would've done in your stead during checkout).
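The batch call itself is a small JSON POST to that endpoint. Roughly (the shape follows the batch API doc referenced above; the oid and size are copied straight from the pointer file, and the host is the example one from above):
POST https://git-server.com/user/repo.git/info/lfs/objects/batch
Accept: application/vnd.git-lfs+json
Content-Type: application/vnd.git-lfs+json

{
  "operation": "download",
  "objects": [
    { "oid": "4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393", "size": 12345 }
  ]
}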
A typical response (taken from the doc I referenced) would look something like:
{
"_links": {
"download": {
"href": "https://storage-server.com/OID",
"header": {
"Authorization": "Basic ...",
}
}
}
}
So you would GET https://storage-server.com/OID with whatever headers were returned from the batch response. The last step is to rename the blob that was returned (its name will typically be just the oid, since git lfs uses checksum-based storage); the pointer file carries the original resource's name, so just rename the blob to that.
I've finally made a middleware for this: express-lfs with a demo here: https://expresslfs.herokuapp.com
There you can download a 400 MB file as proof.
See usage here: https://github.com/goodenough/express-lfs#usage
PS: Thanks to @fundeldman for the good advice in his answer ;)

401 Unauthorized exception while reading data from a document library in SharePoint 2010

We developed a web part to read the content of a file in a document library in the site collection. I used the following code to read the content.
WebClient wc = new WebClient();
wc.Credentials = CredentialCache.DefaultNetworkCredentials;
string documenturl = siteurl+"/" + file.Url.ToString();
content = wc.DownloadData(documenturl);//documenturl is the file path of the document
But I got the following 401 Unauthorized exception:
System.Net.WebException: The remote server returned an error:(401) Unauthorized. at System.Net.WebClient.DownloadDataInternal(Uri address,WebRequest& request) at System.Net.WebClient.DownloadData(Uri address)
For your information, I already tried to download the document with the SPFile OpenBinary method, but it only works when the document is small. Please refer to the site below.
getting ComException while reading the document in SharePoint 2010
Thanks in advance.
Probably the local loopback check. (Verify by trying to access your site using the same URL whilst using IE running on your SharePoint server)
Also, is there any reason why you're using web requests to access local content rather than the SharePoint object model (Microsoft.SharePoint.dll)?
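For comparison, reading the content through the server object model looks roughly like the sketch below. It assumes the web part runs in the same farm as the site collection and reuses the siteurl and documenturl values from the question; OpenBinaryStream reads the file in chunks, which may also help with the large-file issue mentioned above:
using System.IO;
using Microsoft.SharePoint;

using (SPSite spSite = new SPSite(siteurl))
using (SPWeb web = spSite.OpenWeb())
{
    SPFile spFile = web.GetFile(documenturl);
    using (Stream stream = spFile.OpenBinaryStream())
    using (MemoryStream ms = new MemoryStream())
    {
        // Copy in small chunks instead of loading the whole document at once.
        byte[] buffer = new byte[4096];
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        byte[] content = ms.ToArray();
        //do something with content
    }
}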