How to upload image(s) through SQL queries? - directus

I want to upload images through SQL queries, but I can't because the image field is a UUID type.
How can I upload them? My files are located on the server and I don't want to store them in the database, just keep them on the server and reference them with a relative URL.

To upload a file in Directus, POST to the /files endpoint using multipart/form-data as the encoding type.
The file contents have to be provided in a part called file. All other properties of the file object can be provided as parts as well.
https://docs.directus.io/reference/api/system/files/#upload-a-file
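For illustration, a rough sketch of that upload from a Node.js 18+ script, using the global fetch and FormData (the Directus URL, access token, and file path are placeholders, not values taken from the question):

```typescript
// Sketch: upload an image to the Directus /files endpoint as multipart/form-data.
// DIRECTUS_URL, ACCESS_TOKEN and the file path are placeholders.
import { readFileSync } from "node:fs";

const DIRECTUS_URL = "http://localhost:8055"; // assumed local instance
const ACCESS_TOKEN = "your-static-token";     // assumed token with permission on files

async function uploadImage(path: string): Promise<string> {
  const form = new FormData();
  // Optional file-object properties go in their own parts, before the file part.
  form.append("title", "Uploaded via API");
  // The file contents go in a part called "file".
  form.append("file", new Blob([readFileSync(path)], { type: "image/png" }), "photo.png");

  const res = await fetch(`${DIRECTUS_URL}/files`, {
    method: "POST",
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
    body: form,
  });

  const body = (await res.json()) as { data: { id: string } };
  return body.data.id; // the UUID Directus assigned to the file
}
```

The returned UUID is what the image (UUID) field on a collection stores; with the default local storage driver the file itself stays on the server and can be served through the /assets/<id> endpoint, so no binary data ends up in the database.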

Related

How to add an extension to a file copy activity with Azure Data Factory

The datasets that I ingest from a REST API endpoint do not include the .json extension on the files (even though they're JSON files). Can someone let me know where I can add a .json extension in the following scenarios?
Scenario 1: Adding .JSON to the relativeURL
Scenario 2: Adding .JSON to the SINK
Scenario 3: Adding .JSON to the SOURCE - however, I don't think this is possible
Can someone please take a look at the three scenarios and let me know if I can add a .JSON extension with any of those methods?
Thanks to @Scott Mildenberger, we can provide the name of the file and its extension from the sink dataset.
The following is a demonstration of the same. I have a file called sample without extension.
In the sink dataset, you can simply concat the extension to your filename (if it is just a single file, you can give the required name with the extension directly). I have used the following dynamic content (the fileName parameter value is req_filename).
@concat(dataset().fileName,'.json')
The following file would be generated in the sink.

Parse file and patch schema on Publish

I'm trying to use @sanity/react-hooks to create a document action that parses a file (from the draft) and patches other fields.
Example
A user adds a .txt file to a file field. When the document is published I would like to parse the file and patch some readOnly fields using data from the .txt file.
This means I need to be able to read the new file.
I've managed to make Actions work in simple ways, like accessing a string field and patching another field. But I can't seem to access the file asset.
I've followed this tutorial but it doesn't seem to work for a file asset.
Is this possible? And if so, how can I parse a file from the draft to patch another field?
Or perhaps a custom input is the way to go here?
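For context, the "simple" kind of action mentioned above (reading a string field from the draft and patching another field) looks roughly like this with Sanity v2's useDocumentOperation hook; the field names here are invented for the example:

```typescript
// Rough sketch of a simple document action: copy one field into another on publish.
// Field names (title, titleCopy) are placeholders, not from the question.
import { useDocumentOperation } from "@sanity/react-hooks";

export default function CopyTitleAction(props: {
  id: string;
  type: string;
  draft?: Record<string, any> | null;
  onComplete: () => void;
}) {
  const { patch, publish } = useDocumentOperation(props.id, props.type);

  return {
    label: "Publish & copy title",
    onHandle: () => {
      const title = props.draft?.title;
      if (title) {
        // Patching a plain string field works directly. A file field, by contrast,
        // only holds a reference to an asset document, so its contents have to be
        // fetched separately before they can be parsed.
        patch.execute([{ set: { titleCopy: title } }]);
      }
      publish.execute();
      props.onComplete();
    },
  };
}
```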

Compressing PDF to ZIP and saving it in a table inside SQL

I'm getting a Base64 file sent by some application to my application, and I need to decode it in SQL, turn that file into a *.zip, and then store it as a zip in the proper column of a table.
The decoding part is ready. Now I'm stuck on the step where I need to turn that file into a ZIP.
The question is: is there any way to do the compression using SQL only?

What's the difference between readAsStringAsync() and writeAsStringAsync() in Expo's FileSystem?

I have a function that generates a PDF file, and I want to get the URI of the file to store it locally and then send it as an attachment with MailComposer. I want to know what the difference is between readAsStringAsync() and writeAsStringAsync() in FileSystem.
Both readAsStringAsync() and writeAsStringAsync() are functions of the Expo FileSystem.
The difference is that readAsStringAsync() is used to read a local file on the device, while writeAsStringAsync() is used to write to a local file, or to create one if the named file does not exist at that location.
According to the Expo documentation:
FileSystem.writeAsStringAsync:
"Write the entire contents of a file as a string."
FileSystem.readAsStringAsync:
"Read the entire contents of a file as a string. Binary will be returned in raw format, you will need to append data:image/png;base64, to use it as Base64."

BigQuery Backend Errors during upload operation

I want to know what possible errors can arise on the BigQuery server side during the upload process, even though the .CSV file that I'm uploading contains perfect data. Can you list those errors?
Thanks.
Some of the common errors are:
Files must be encoded in UTF-8 format.
Source data must be properly escaped within standard guidelines for CSV and JSON.
The structure of records and the data within them must match the schema provided.
Individual files must be under the size limits listed on our quota/limits page.
More information is available about BigQuery source data formats. Check out our Data Loading cookbook for additional tips.
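As an illustration of where such server-side errors surface, here is a rough sketch of a CSV load job using the @google-cloud/bigquery Node.js client; the dataset, table, schema, and file path are placeholders:

```typescript
// Sketch: load a local CSV file into a BigQuery table and inspect load errors.
// Dataset, table, schema and file path are placeholders.
import { BigQuery } from "@google-cloud/bigquery";

async function loadCsv(): Promise<void> {
  const bigquery = new BigQuery();

  const [job] = await bigquery
    .dataset("my_dataset")
    .table("my_table")
    .load("./data.csv", {
      sourceFormat: "CSV",
      skipLeadingRows: 1,
      encoding: "UTF-8", // files must be UTF-8 encoded
      schema: {
        fields: [
          { name: "id", type: "INTEGER" },
          { name: "name", type: "STRING" },
        ],
      },
    });

  // Server-side problems (bad escaping, schema mismatches, size limits, ...)
  // are reported on the job status rather than as a client-side exception.
  const errors = job.status?.errors;
  if (errors && errors.length > 0) {
    console.error("Load failed:", errors);
  } else {
    console.log("Load complete");
  }
}
```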