How to view the changes of OneDrive children

I want to list all the changes of a file inside a folder in OneDrive for business. I can list all the changes in root directory using the below API.
/drive/root/view.delta
The documentation page I am following is view_delta.
But when I try to list the changes of a file within a folder, it throws the following error:
[error] => Array
(
[code] => notSupported
[message] => view.delta can only be called on the root.
)
The API I am calling is below. Here Sample is a directory in the root folder and Document.docx is a file within it.
/drive/root:/Sample/Document.docx:/view.delta

I found the answer. In OneDrive for Business, view.delta is only supported on the root folder, not on other folders. It also will not return the following item properties:
createdBy
cTag
eTag
fileSystemInfo
lastModifiedBy
parentReference
size
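Since the delta feed can only be requested at the root, one client-side workaround is to filter the root feed yourself. Below is a minimal sketch: `changesForItem` is a hypothetical helper, and the target id is assumed to come from a prior lookup of the item (note that filtering by path is not possible here, because parentReference is one of the properties the feed omits).

```javascript
// view.delta is only supported on the drive root in OneDrive for
// Business, and its items omit parentReference, so we filter the
// root delta feed down to a known item id on the client instead.
function changesForItem(deltaItems, targetId) {
  return deltaItems.filter(item => item.id === targetId);
}
```

The id itself would typically be fetched once, up front, by addressing the item directly by path.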


NestJS Multer: set dest to a URL

When I upload a file with NestJS and Multer, I want to set the dest in the module to a URL, but doing so gives me this error:
EINVAL: invalid argument, mkdir 'C:\Users\almak\Desktop\Chatenium2\chatenium-server\http:\localhost'
Why does this happen? Also, is there a way to prevent NestJS from renaming the file and stripping its extension (test.png => 03ebe1f47494378fee61196c0524afaf)?
Heres the code:
Module:
MulterModule.register({
dest: process.env.CDN_URL,
}),
Controller:
@Post("uploadImg")
@UseInterceptors(FileInterceptor("file"))
async uploadedFile(@UploadedFile() file) {
  return file;
}
By default, dest is a local directory that the server accesses via the fs module. If you need to upload the file from your server to another server, you should use a different storage type.
As for the renaming, that is also Multer's default: you can pass options to the FileInterceptor, per Multer's documentation, to change how the file is handled.

Can't find static assets from express/npm module

tldr;
My project is an NPM module that is used by an ExpressJS server. The server needs to specify an endpoint and my module will do the rest. How do I get my module to load the correct html page and grab the correct js/css files from the correct path?
The Problem
I'm running into a problem: using the serveIndex library I can see the site's directory structure, and all the files are in their correct directories, but when I try to load any of the files, whether from the serveIndex view or from the actual endpoint where it should load, I get nothing but 404 errors.
Here's an example if someone wanted to use this NPM module from their project.
app.js (their server)
const express = require('express')
const { adminAreaConfig } = require('express-admin-area')
const app = express()
const adminArea = adminAreaConfig(express) // my module being passed the "express" library
app.use('/admin', adminArea) // specify a URL to use my module
app.listen(3000, () => console.log('\n\nServer Online\n\n'))
Here's an image of my project's directory structure after it's been built.
Going off of a console.log(__dirname), which returns <long path string>/express-admin-area/build/src, I then tell my module, using the express reference passed by the actual server in the code above, to look in the views directory with
... import libraries etc ...
const adminAreaConfig = express => {
const adminArea = express.Router()
adminArea.use('/', express.static(__dirname + '/views')) // sets my module's views to the "http://localhost:3000/admin" path
adminArea.use('/dirs', serveIndex(__dirname)) // will get into this later
... some other stuff like exports etc ...
This then attempts to load the index.html file in the express-admin-area/build/src/views directory but fails because it can't locate the CSS and JS files inside express-admin-area/build/src/views/static/css or .../js.
First, I know it fails because instead of looking for http://localhost:3000/admin/static/css/styles.css it looks for http://localhost:3000/static/css/styles.css, so that's another problem I need to solve entirely.
Second, looking back at the small code sample above, adminArea.use('/dirs', serveIndex(__dirname)), I'm using the serveIndex library in an attempt to view the directory structure. If I go to http://localhost:3000/admin/dirs I get the correct directories and files in the browser
But now, if I try to view an actual file I'm met with the error Cannot GET /admin/dir/main.js for example if I were to go to http://localhost:3000/admin/dir/main.js, but I can continue going deeper in the directories if I wanted such as the controllers or routes directories from the image.
What I want
I need a way to get these static assets to load. If I point my module at a basic HTML page with a simple <h1>Hello, World!</h1>, then that's what I'll get, but trying to load any outside scripts/stylesheets is when I get the 404 errors and nothing loads.
I'll be answering my own question.
The solution is actually pretty simple. The view layer of this module is handled by React, CRA to be specific. CRA will look for some specific environment variables, one of them being PUBLIC_URL. All I had to do was
Create a .env file in the root directory of my CRA
add PUBLIC_URL="/admin"
Afterward, it's just a matter of rebuilding the project (yarn build) and restarting the server. CRA will then look at http://localhost:3000/admin/static/... instead of http://localhost:3000/static/... for static assets.
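To illustrate why this works, here is a rough sketch of what setting PUBLIC_URL to "/admin" effectively does to the asset URLs in the built index.html at build time. `applyPublicUrl` is an illustrative helper, not part of CRA:

```javascript
// CRA bakes PUBLIC_URL into root-relative asset URLs at build time,
// so the browser requests /admin/static/... instead of /static/....
function applyPublicUrl(html, publicUrl) {
  return html.replace(/(src|href)="\/static\//g,
    (match, attr) => attr + '="' + publicUrl + '/static/');
}
```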

AWS Lambda package-deployed functions require() of a relative path, not found

I have a zip file containing the following structure (this is the root of the archive, not nested in a top-level folder, which I understand is a common cause of errors for aws-s3-lambda deployments):
- support/
- shared.js
- one.js
- two.js
and then in one.js and two.js:
var shared = require("./support/shared");
// ...
When I run this code locally, it works. I use the aws-sdk to upload the zip file to AWS-S3 and then use aws.lambda.createFunction() to create a function with that name and handler and everything. The created function DOES show up in my Lambda dashboard, but when I test it, I get "Cannot find module './support/shared'". I have also tried var shared = require("./support/shared.js"); and that gives "Cannot find module './support/shared.js'".
This is for the node6.10 runtime. The filename cases are correct for Lambda's case-sensitive file system.
Shouldn't this work?? What's the gotcha?
Is there a way to verify the file structure that Lambda is working in to show that the additional ./support/shared.js file actually made it to the working directory or whatever it uses?
The gotcha is that a zip file created on a Windows machine can carry the wrong permissions in its entries for when AWS unpacks it. The files are there but inaccessible, and Node just gives a generic module-not-found error instead of reporting that access to the folder is denied.

Get XML comments output file location for ASP Core

I've added the Swashbuckle package to my ASP.NET Core project.
I'd like to configure Swagger to use the XML comments file auto-generated by Visual Studio.
The problem is that I can't find a way to get that location:
PlatformServices.Default.Application.ApplicationBasePath - points to the project root path
Directory.GetCurrentDirectory() - the same
Path.GetFullPath(".") - the same
IHostingEnvironment.WebRootPath - the same
The output folder is configured in <project>.xproj by the BaseIntermediateOutputPath option, but I can't get this location at runtime.
var pathToDoc = "????";
options.OperationFilter(new Swashbuckle.SwaggerGen.XmlComments.ApplyXmlActionComments(pathToDoc));
Bad solutions I can see:
add a configuration option to AppSettings.json
use a relative path from the project path (as I'm configuring the bin output path)
But I'd like to use this with Docker, CI, and localhost, so a hard-coded solution doesn't seem like the best approach.
You can try the following function to get the XML file path:
private string GetXmlCommentsPath()
{
    var app = PlatformServices.Default.Application;
    return System.IO.Path.Combine(app.ApplicationBasePath, System.IO.Path.ChangeExtension(app.ApplicationName, "xml"));
}
The XML file has the same name as the app. I am currently using this code in my project and it works fine.

How to organize folders for Laravel on a conventional web host?

If I put the entire Laravel project inside public_html, I have to go to http://domain.com/public to access it; but if I put the contents of Laravel's public folder inside public_html, the rest of Laravel's files would be looking for a folder called public, which is now called public_html.
Also, if I rename public_html to public, it won't work either.
I've tried changing 'public' => __DIR__.'/../public' to 'public' => __DIR__.'/../public_html' in /bootstrap/paths.php.
When I try to load http://domain.com it says:
Warning: require() [function.require]: open_basedir restriction in effect. File(/home/domain/bootstrap/autoload.php) is not within the allowed path(s): (/home/domain/public_html:/usr/lib/php:/usr/local/lib/php:/tmp) in /home/domain/public_html/index.php on line 21
Warning: require(/home/domain/bootstrap/autoload.php) [function.require]: failed to open stream: Operation not permitted in /home/domain/public_html/index.php on line 21
Fatal error: require() [function.require]: Failed opening required '/home/domain/public_html/../bootstrap/autoload.php' (include_path='.:/usr/lib/php:/usr/local/lib/php') in /home/domain/public_html/index.php on line 21
Changing 'public' => __DIR__.'/../public' to 'public' => __DIR__.'/../www' in /bootstrap/paths.php doesn't work either.
Support disabled the open_basedir restriction, so once again I changed the paths from public to public_html, and it works now; no .htaccess magic needed either.