How do you set a tmp folder on Vercel (with NextJS) to handle file uploads written to a tmp folder?
I've seen multiple answers on how to do this, and they all have problems.
const tmp = `${os.tmpdir()}/tmp`
// ENOENT error on Vercel when I try to write to this location

const tmp = `${process.cwd()}/tmp`
// ENOENT error on Vercel when I try to write to this location

const tmp = '/tmp', with some code that checks if the tmp dir exists and, if not, creates it:
try {
  await fs.readdir(tmp);
  console.log("Directory exists.");
} catch (err) {
  await fs.mkdir(tmp);
}
This results in the following error:
read-only file system, mkdir '/var/task/tmp'
Also, on localhost, /tmp saves the file to /private/tmp, which I don't want.
I just want the /tmp dir to BE IN THE PROJECT folder.
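For reference, here is a minimal sketch of writing under the runtime temp directory instead of a project subfolder, assuming a Next.js API route deployed to Vercel; the "uploads" subfolder and file name are illustrative only:

import os from "os";
import path from "path";
import { promises as fs } from "fs";

// A sketch, not the asker's code: on Vercel's serverless runtime the deployed
// project directory (/var/task) is read-only, while os.tmpdir() is writable.
export default async function handler(req, res) {
  // "uploads" is an illustrative subfolder name inside the writable temp dir.
  const uploadDir = path.join(os.tmpdir(), "uploads");
  await fs.mkdir(uploadDir, { recursive: true });

  const filePath = path.join(uploadDir, "example.txt");
  await fs.writeFile(filePath, "hello");

  res.status(200).json({ savedTo: filePath });
}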
Related
I'm trying to use the FileSystem API in a SPA to write an uploaded file to the local sandboxed file system.
The file is uploaded with a drop action, and I can get the File object array in the callback.
From the File I can get a ReadableStream by calling its stream() method (yes, it returns only a readable stream).
Considering that the uploaded file could be quite big, I would rather stream it than load it entirely into a Blob and then write it with the FileSystem API.
So, following the docs, the steps are:
get a FileSystem (DOMFileSystem) through the async webkitRequestFileSystem call.
get the root property, which is a FileSystemDirectoryEntry.
create a file through getFile (with the flag create: true), which asynchronously returns a FileSystemFileEntry.
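In code, those three steps look roughly like this (a sketch using the legacy callback-based API; the file name and quota size are made up):

window.webkitRequestFileSystem(window.TEMPORARY, 10 * 1024 * 1024, (fs) => {
  // fs.root is a FileSystemDirectoryEntry
  fs.root.getFile("upload.bin", { create: true }, (fileEntry) => {
    // fileEntry is a FileSystemFileEntry - this is where I get stuck below
  });
}, (err) => console.error(err));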
Now, from the FileSystemFileEntry I can get a FileWriter using createWriter, but that is marked obsolete (on MDN), and in any case it is a FileWriter, whereas I want a WritableStream so that I can use pipeTo from the uploaded file's ReadableStream.
I can see in the console that the class (interface) FileSystemFileHandle is defined, but I cannot work out how to get an instance of it from the FileSystemFileEntry. If I can obtain a FileSystemFileHandle, I can call createWritable to obtain a FileSystemWritableFileStream that I can "pipe" to from the ReadableStream.
Can anyone clarify this mess?
references:
https://web.dev/file-system-access/
https://wicg.github.io/file-system-access/#filesystemhandle
https://developer.mozilla.org/en-US/docs/Web/API/FileSystemFileEntry
You have the solution in your "references" links at the bottom. Specifically, this is the section to read. You can create files or directories like so:
// In an existing directory, create a new directory named "My Documents".
const newDirectoryHandle = await existingDirectoryHandle.getDirectoryHandle('My Documents', {
create: true,
});
// In this new directory, create a file named "My Notes.txt".
const newFileHandle = await newDirectoryHandle.getFileHandle('My Notes.txt', { create: true });
Once you have a file handle, you can then pipe to it or write to it:
async function writeFile(fileHandle, contents) {
// Create a FileSystemWritableFileStream to write to.
const writable = await fileHandle.createWritable();
// Write the contents of the file to the stream.
await writable.write(contents);
// Close the file and write the contents to disk.
await writable.close();
}
…or…
async function writeURLToFile(fileHandle, url) {
// Create a FileSystemWritableFileStream to write to.
const writable = await fileHandle.createWritable();
// Make an HTTP request for the contents.
const response = await fetch(url);
// Stream the response into the file.
await response.body.pipeTo(writable);
// pipeTo() closes the destination pipe by default, no need to close it.
}
I'm trying to run the following code, but I get this error:
{ [Error: ENOENT: no such file or directory, open '/mypath/key.json'] }
I know it has something to do with no key.json file in the directory I'm running the code from, but where can I find this file?
I've tried searching with find / -name "key.json" and using some of the paths it found, but I still get the same error. Thanks.
const BigQuery = require('@google-cloud/bigquery');
const bigquery = new BigQuery({
  projectId: 'XXXXX',
  keyFilename: 'key.json'
});
const query = `SELECT total_amount, pickup_datetime, trip_distance
FROM \`nyc-tlc.yellow.trips\`
ORDER BY total_amount DESC
LIMIT 1;`
bigquery.createQueryJob(query).then((data) => {
  const job = data[0];
  return job.getQueryResults({ timeoutMs: 10000 });
}).then((data) => {
  const rows = data[0];
  console.log(rows[0]);
}).catch((e) => {
  // handle exception
  console.log(e);
});
The key.json file is the file that you download in order to authenticate against GCP services. There are several methods of authenticating, and one of them is using a service account; key.json is the file that contains that service account's credentials.
How to create and use the key.json file is explained here [1].
This is also a good guide for creating the credentials via the Console UI [2].
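As a rough illustration (the project ID and key path below are placeholders, and this assumes a recent version of @google-cloud/bigquery that exposes BigQuery as a named export), you can point the client at the downloaded key with an absolute path, or rely on the standard GOOGLE_APPLICATION_CREDENTIALS environment variable:
// Placeholders only: substitute your own project ID and the absolute path to
// the key.json downloaded for your service account.
const { BigQuery } = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: 'my-project-id',
  keyFilename: '/absolute/path/to/key.json' // an absolute path avoids ENOENT when the cwd differs
});

// Alternatively, omit keyFilename and set
//   GOOGLE_APPLICATION_CREDENTIALS=/absolute/path/to/key.json
// in the environment; the client then picks the credentials up automatically.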
I am trying to load a PDF file from the application directory. I tried loading it from a URL and that works; now I need to load the already downloaded PDF from the application directory.
This is the package that I used to view the PDF: flutter_pdfview. Loading from a URL works fine; what I need now is to load an already downloaded PDF from the directory.
For loading the file from the directory I tried this:
var dir = await getApplicationDocumentsDirectory();
File file = File('${dir.path}/$pName.pdf');
bool fileExists = File('${dir.path}/$pName.pdf').existsSync();
if (fileExists) {
  urlPdfPath = file.toString();
  print('url pdf path $urlPdfPath');
  Navigator.push(context, MaterialPageRoute(builder: (context) {
    return PdfViewer(
      path: urlPdfPath,
      product: pName,
    );
  }));
}
When I run this I get this exception:
D/AndroidRuntime( 4565): Shutting down VM
E/AndroidRuntime( 4565): FATAL EXCEPTION: main
E/AndroidRuntime( 4565): java.lang.IllegalArgumentException: Unsupported value: java.io.FileNotFoundException: No such file or directory
In your code:
Change urlPdfPath = file.toString(); to urlPdfPath = file.path;. file.toString() returns a description like "File: '<path>'" rather than the actual path, which is why the viewer reports FileNotFoundException.
I'm simply trying to follow this tutorial on how to upload files to GCS with Node and Express. But the following error keeps causing my app to crash. Usually I am able to upload one file without a problem on the first run, but I get this error after running a few requests, even with different files. When I try to upload, say, 5 files at a time, this error crashes my app even on the first run. I see the process is trying to rename a file in the .config folder. Is this normal behavior? If so, is there a workaround?
Windows: v10.0.10586
Node: v4.3.1
Express: v4.13.1
Error: EPERM: operation not permitted, rename 'C:\Users\James Wang\.config\configstore\gcs-resumable-upload.json.2873606827' -> 'C:\Users\James Wang\.config\configstore\gcs-resumable-upload.json'
at Error (native)
at Object.fs.renameSync (fs.js:681:18)
at Function.writeFileSync [as sync]
at Object.create.all.set (C:\Users\James Wang\gi-cms-backend\node_modules\configstore\index.js:62:21)
at Object.Configstore.set (C:\Users\James Wang\gi-cms-backend\node_modules\configstore\index.js:93:11)
at Upload.set (C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:264:20)
at C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:60:14
at C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:103:5
at Request._callback (C:\Users\James Wang\gi-cms-backend\node_modules\gcs-resumable-upload\index.js:230:7)
at Request.self.callback (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:199:22)
at emitTwo (events.js:87:13)
at Request.emit (events.js:172:7)
at Request.<anonymous> (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:1036:10)
at emitOne (events.js:82:20)
at Request.emit (events.js:169:7)
at IncomingMessage.<anonymous> (C:\Users\James Wang\gi-cms-backend\node_modules\request\request.js:963:12)
[nodemon] app crashed - waiting for file changes before starting...
UPDATE:
After setting {resumable: false} as suggested by @stephenplusplus in this post, I am no longer getting the "EPERM: operation not permitted" error. But I have started running into the { [ERROR:ETIMEDOUT] code: 'ETIMEDOUT', connection: false } error while trying to upload multiple files at a time when the largest file is greater than 1.5 MB. The other files get uploaded successfully.
For more information: I am able to upload files one by one when the files are no greater than ~2.5 MB. If I try to upload 3 files at a time, I can only do so with files no greater than ~1.5 MB.
Is the "operation not permitted" issue described in the question a Windows-specific thing, and does the timeout issue happen only after I set resumable: false?
I'm using Express and multer with Node.
This is the code I'm using now:
// Express middleware that will handle an array of files. req.files is an object of files received from
// multer's fields([{ name, maxCount }]) middleware. This function should handle
// the upload process of the files asynchronously.
function sendFilesToGCS(req, res, next) {
if(!req.files) { return next(); }
function stream(file, key, folder) {
var gcsName = Date.now() + file.originalname;
var gcsFile = bucket.file(gcsName);
var writeStream = gcsFile.createWriteStream({ resumable: false });
console.log(key);
console.log('Start uploading: ' + file.originalname);
writeStream.on('error', function(err) {
console.log(err);
res.status(501).send(err);
});
writeStream.on('finish', function() {
folder.incrementFinishCounter();
req.files[key][0].cloudStorageObject = gcsName;
req.files[key][0].cloudStoragePublicUrl = getPublicUrl(gcsName);
console.log('Finish Uploading: ' + req.files[key][0].cloudStoragePublicUrl);
folder.beginUploadNext();
});
writeStream.end(file.buffer);
};
var Folder = function(files) {
var self = this;
self.files = files;
self.reqFilesKeys = Object.keys(files); // reqFilesKeys is an array of keys parsed from req.files
self.nextInQuene = 0; // Keep track of the next file to be uploaded, must be less than reqFilesKeys.length
self.finishCounter = 0; // Keep track of how many files have been uploaded, must be less than reqFilesKeys.length
console.log(this.reqFilesKeys.length + ' files to upload');
};
// This function is used to initiate the upload process.
// It's also called in the on-finish listener of a file's write-stream,
// which will start uploading the next file in the queue
Folder.prototype.beginUploadNext = function() {
// If there are still files left to upload,
if(this.finishCounter < this.reqFilesKeys.length) {
// and if there are still files left in the queue
if(this.nextInQuene < this.reqFilesKeys.length) {
// upload the file
var fileToUpload = this.files[this.reqFilesKeys[this.nextInQuene]][0];
stream(fileToUpload, this.reqFilesKeys[this.nextInQuene], this);
// Increment the nextInQuene counter, and get the next one ready
this.nextInQuene++;
}
} else {
console.log('Finish all upload!!!!!!!!!!!!!!!!!!!!!!');
next();
}
};
Folder.prototype.incrementFinishCounter = function() {
this.finishCounter++;
console.log('Finished ' + this.finishCounter + ' files');
};
var folder = new Folder(req.files);
// Begin upload with 3 streams
/*for(var i=0; i<3; i++) {
folder.beginUploadNext();
}*/
//Upload file one by one
folder.beginUploadNext();
}
I had the same issue with bower. Run the following command: bower cache clean --allow-root
If this does not solve the problem, try again after disabling your antivirus.
While publishing my AIR application (CurrentFile), I have also included chatFile.swf with the installation files.
In my AIR settings panel [AIR 3.7 for Desktop], under 'Include Files' I have the following:
CurrentFile.swf
CurrentFile-app.xml
chatFile.swf
Here is the AS3 code in my CurrentFile.swf:
import flash.net.URLRequest;
import flash.events.Event;
import flash.events.MouseEvent;
import flash.events.IOErrorEvent;
import flash.events.HTTPStatusEvent;
import flash.display.Loader;
import flash.filesystem.File;
var chatLoaderWindow:Loader;
function loadchat(m:MouseEvent):void
{
chatLoaderWindow = new Loader();
chatLoaderWindow.contentLoaderInfo.addEventListener(Event.COMPLETE, chatLoadComplete);
chatLoaderWindow.contentLoaderInfo.addEventListener(Event.INIT, chatInitLoad);
chatLoaderWindow.contentLoaderInfo.addEventListener(IOErrorEvent.IO_ERROR, chatErrorLoad);
chatLoaderWindow.contentLoaderInfo.addEventListener(HTTPStatusEvent.HTTP_STATUS, chatHttpStatus);
myclip.chatwindow.addChild(chatLoaderWindow);
var f:File = File.applicationStorageDirectory.resolvePath("chatFile.swf");
chatLoaderWindow.load(new URLRequest(f.url));
tracebox.text = "Chat URL" + f.url;
}
function chatLoadComplete(e:Event):void
{
tracebox.text = "chat loaded";
}
function chatErrorLoad(io:IOErrorEvent):void
{
tracebox.text = "chat IO Error: "+io;
}
function chatInitLoad(i:Event):void
{
tracebox.text = "chat INIT";
}
function chatHttpStatus(e:HTTPStatusEvent):void
{
tracebox.text = "chat Http"+e;
}
myclip.chatbut.addEventListener(MouseEvent.CLICK,loadchat);
/*
Output:
chat IO Error: [IOErrorEvent type="ioError" bubbles=false cancelable=false eventPhase=2 text="Error #2035" errorID=2035]
*/
EDIT: I figured it out. It was really simple.
This is not required:
var f:File = File.applicationStorageDirectory.resolvePath("chatFile.swf");
chatLoaderWindow.load(new URLRequest(f.url));
Insert this:
chatLoaderWindow.load(new URLRequest("app:/chatFile.swf"));
So now my question is:
What is the purpose of File.applicationStorageDirectory.resolvePath?
There are two directories here. One is the "application" directory, where your install files are placed. The other is the "application storage" directory, which is a convenient place to write files to at runtime. To access these directories you can either use resolvePath() on File.applicationDirectory / File.applicationStorageDirectory, or use the URI-scheme shortcuts app: and app-storage:. In your initial attempt, you were just looking in the wrong directory for your file.
File.applicationStorageDirectory.resolvePath("somefile.swf").url will equal "app-storage:/somefile.swf"
File.applicationDirectory.resolvePath("somefile.swf").url will equal "app:/somefile.swf"
The application directory is where your app was installed. The app storage directory is a folder your app can save files to.
resolvePath() returns a File object. You can use it for purposes other than getting the cross-platform URL for the file location, such as fileObj.exists and fileObj.parent.createDirectory(). fileObj.url is just the URL you would use with URLLoader to access the file in a platform-independent manner.