How can I post documents using Sanity?

I'm using Sanity CMS for an application, and I have some documents stored in the database. Which query should I use to create another document in the database from my frontend?

// Create a client file (for example, in the src folder in React) and add where appropriate
// In my case, I have a token and a project ID in a .env file
import sanityClient from '@sanity/client';
import imageUrlBuilder from '@sanity/image-url';
export const client = sanityClient({
projectId: process.env.REACT_APP_SANITY_PROJECT_ID,
dataset: 'production',
apiVersion: '2020-10-21',
useCdn: true,
token: process.env.REACT_APP_SANITY_API_TOKEN,
});
const builder = imageUrlBuilder(client);
export const urlFor = (source) => builder.image(source);
//in the other file include these lines
import { client } from '../client';
//these fields should match your schema declaration
const doc = {
_id: "yout_id",
_type: 'your_doc_type',
userName:name,
image: imageUrl,
}
client.createIfNotExists(doc)
.then(() => {
console.log("document created")
})
.catch(console.error);
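If you don't want to manage the _id yourself, client.create() is an alternative that lets Sanity generate one (a short sketch reusing the same illustrative fields):
// Alternative: create() lets Sanity generate the _id for you
client.create({
  _type: 'your_doc_type',
  userName: name,
  image: imageUrl,
})
  .then((res) => console.log(`document created with id ${res._id}`))
  .catch(console.error);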


react-native (Expo) upload file on background

In my Expo (react-native) application, I want to do the upload task even if the application is in the background or killed.
The upload should be done to Firebase Storage, so we don't have a REST API.
I checked out the Expo TaskManager library, but I could not figure out how it should be done. Is it even possible to achieve this goal with Expo? Is TaskManager the correct package for this task?
There are only some Expo packages that can be registered as a task (e.g. BackgroundFetch), and it is not possible to register a custom function (in this case the uploadFile method).
I got even more confused because we should add the UIBackgroundModes key for iOS, but it only has audio, location, voip, external-accessory, bluetooth-central, bluetooth-peripheral, fetch, remote-notification, and processing as possible values.
I would appreciate it if you could at least guide me on where to start or what to search for, to be able to upload the file even if the app is in the background or is killed/terminated.
import { getStorage, ref, uploadBytes } from "firebase/storage";
const storage = getStorage();
const storageRef = ref(storage, 'videos');
const uploadFile = async (file)=>{
// the file is Blob object
await uploadBytes(storageRef, file);
}
I have already reviewed react-native-background-fetch, react-native-background-upload, and react-native-background-job. background-upload requires ejecting from Expo, background-job does not support iOS, and background-fetch is a fetching task designed for running work at intervals.
If there is a way to use the mentioned libraries for my purpose, please guide me :)
To my understanding, the Firebase Cloud JSON API does not accept files, does it? If so, please give me an example. If I can make the Storage JSON API work with file upload, then I can probably use Expo asyncUpload without ejecting.
I have done something similar to what you want: you can use expo-task-manager and expo-background-fetch. Here is the code as I used it. I hope this is useful for you.
import { useEffect, useState } from 'react';
import * as BackgroundFetch from 'expo-background-fetch';
import * as TaskManager from 'expo-task-manager';
const BACKGROUND_FETCH_TASK = 'background-fetch';
// these hooks and handlers live inside your React component
const [isRegistered, setIsRegistered] = useState(false);
const [status, setStatus] = useState(null);
// Value so that it executes on iOS
BackgroundFetch.setMinimumIntervalAsync(60 * 15);
// Define the task to execute
TaskManager.defineTask(BACKGROUND_FETCH_TASK, async () => {
const now = Date.now();
console.log(`Got background fetch call at date: ${new Date(now).toISOString()}`);
// Your function or instructions you want
return BackgroundFetch.Result.NewData;
});
// Register the task in BACKGROUND_FETCH_TASK
async function registerBackgroundFetchAsync() {
return BackgroundFetch.registerTaskAsync(BACKGROUND_FETCH_TASK, {
minimumInterval: 60 * 15, // 15 minutes
stopOnTerminate: false, // android only,
startOnBoot: true, // android only
});
}
// Task Status
const checkStatusAsync = async () => {
const status = await BackgroundFetch.getStatusAsync();
const isRegistered = await TaskManager.isTaskRegisteredAsync(
BACKGROUND_FETCH_TASK
);
setStatus(status);
setIsRegistered(isRegistered);
};
// Check if the task is already registered
const toggleFetchTask = async () => {
if (isRegistered) {
console.log('Task ready');
} else {
await registerBackgroundFetchAsync();
console.log('Task registered');
}
checkStatusAsync();
};
useEffect(() => {
toggleFetchTask();
}, []);
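To connect this to the question, the actual upload call goes where the placeholder comment sits in defineTask. A minimal sketch, assuming the question's uploadFile plus a hypothetical getPendingUploads() queue accessor:
// Sketch: run pending uploads inside the background-fetch task.
// `uploadFile` is the question's function; `getPendingUploads` is a
// hypothetical accessor for blobs queued while the app was active.
TaskManager.defineTask(BACKGROUND_FETCH_TASK, async () => {
  const pending = await getPendingUploads();
  for (const blob of pending) {
    await uploadFile(blob);
  }
  return pending.length > 0
    ? BackgroundFetch.Result.NewData
    : BackgroundFetch.Result.NoData;
});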
Hope this isn't too late to be helpful.
I've been dealing with a variety of expo <-> firebase storage integrations recently, and here's some info that might be helpful.
First, I'd recommend not using the uploadBytes / uploadBytesResumable methods from Firebase. This thread has a long ongoing discussion about it, but basically it's broken in v9. Maybe in the future the Firebase team will solve the issues, but it's pretty broken with Expo right now.
Instead, I'd recommend writing a small Firebase function that either returns a signed upload URL or handles the upload itself.
Basically, if you can get storage uploads to work via an HTTP endpoint, you can get any kind of upload mechanism working (e.g. the FileSystem.uploadAsync() method you're probably looking for here, like @brentvatne pointed out, or fetch, or axios; I'll show a basic wiring at the end).
Server Side
Option 1: Signed URL Upload.
Basically, have a small Firebase function that returns a signed URL. Your app calls a cloud function like /get-signed-upload-url, which returns the URL, which you then use. Check out https://cloud.google.com/storage/docs/access-control/signed-urls for how you'd go about this.
This might work well for your use case. It can be configured just like any httpsCallable function, so it's not much work to set up, compared to option 2.
However, this doesn't work with the Firebase storage / functions emulator! For this reason, I don't use this method, because I like to use the emulators intensively, and they only offer a subset of all the functionality.
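A minimal sketch of such a function, assuming firebase-admin is initialized and that the path and field names are illustrative:
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

// Sketch: callable that returns a V4 signed URL the client can PUT the file to.
export const getSignedUploadUrl = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first.");
  }
  const file = admin
    .storage()
    .bucket()
    .file(`uploads/${context.auth.uid}/${data.fileName}`); // illustrative path
  const [url] = await file.getSignedUrl({
    version: "v4",
    action: "write",
    expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    contentType: data.contentType,
  });
  return { url };
});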
Option 2: Upload the file entirely through a function
This is a little hairier, but gives you a lot more fidelity over your uploads, and will work on an emulator! I like this too because it allows doing the upload process within the endpoint execution, instead of as a side effect.
For example, you can have a photo-upload endpoint generate thumbnails, and if the endpoint returns a 201, then you're good! Rather than the traditional Firebase approach of having a listener on Cloud Storage that generates thumbnails as a side effect, which then has all kinds of bad race conditions (checking for processing completion via exponential backoff? Gross!)
Here are three resources I'd recommend to go about this approach:
https://cloud.google.com/functions/docs/writing/http#multipart_data
https://github.com/firebase/firebase-js-sdk/issues/5848
https://github.com/mscdex/busboy
Basically, if you can make a Firebase cloud endpoint that accepts a File within formdata, you can have busboy parse it, and then you can do anything you want with it... like upload it to Cloud Storage!
an outline of this:
import * as functions from "firebase-functions";
import * as busboy from "busboy";
import * as os from "os";
import * as path from "path";
import * as fs from "fs";
type FieldMap = {
[fieldKey: string]: string;
};
type Upload = {
filepath: string;
mimeType: string;
};
type UploadMap = {
[fileName: string]: Upload;
};
const MAX_FILE_SIZE = 2 * 1024 * 1024; // 2MB
export const uploadPhoto = functions.https.onRequest(async (req, res) => {
verifyRequest(req); // Verify parameters, auth, etc. Better yet, use a middleware system for this like express.
// This object will accumulate all the fields, keyed by their name
const fields: FieldMap = {};
// This object will accumulate all the uploaded files, keyed by their name.
const uploads: UploadMap = {};
// This will accumulate errors during the busboy process, allowing us to end early.
const errors: string[] = [];
const tmpdir = os.tmpdir();
const fileWrites: Promise<unknown>[] = [];
function cleanup() {
Object.entries(uploads).forEach(([filename, { filepath }]) => {
console.log(`unlinking: ${filename} from ${filepath}`);
fs.unlinkSync(filepath);
});
}
const bb = busboy({
headers: req.headers,
limits: {
files: 1,
fields: 1,
fileSize: MAX_FILE_SIZE,
},
});
bb.on("file", (name, file, info) => {
verifyFile(name, file, info); // Verify your mimeType / filename, etc.
file.on("limit", () => {
console.log("too big of file!");
});
const { filename, mimeType } = info;
// Note: os.tmpdir() points to an in-memory file system on GCF
// Thus, any files in it must fit in the instance's memory.
console.log(`Processed file ${filename}`);
const filepath = path.join(tmpdir, filename);
uploads[filename] = {
filepath,
mimeType,
};
const writeStream = fs.createWriteStream(filepath);
file.pipe(writeStream);
// File was processed by Busboy; wait for it to be written.
// Note: GCF may not persist saved files across invocations.
// Persistent files must be kept in other locations
// (such as Cloud Storage buckets).
const promise = new Promise((resolve, reject) => {
file.on("end", () => {
writeStream.end();
});
writeStream.on("finish", resolve);
writeStream.on("error", reject);
});
fileWrites.push(promise);
});
bb.on("close", async () => {
await Promise.all(fileWrites);
// Fail if errors:
if (errors.length > 0) {
functions.logger.error("Upload failed", errors);
res.status(400).send(errors.join());
} else {
try {
const upload = Object.values(uploads)[0];
if (!upload) {
functions.logger.debug("No upload found");
res.status(400).send("No file uploaded");
return;
}
const { uploadId } = await processUpload(upload, userId); // userId comes from your auth check in verifyRequest
cleanup();
res.status(201).send({
uploadId,
});
} catch (error) {
cleanup();
functions.logger.error("Error processing file", error);
res.status(500).send("Error processing file");
}
}
});
bb.end(req.rawBody);
});
Then, that processUpload function can do anything you want with the file, like upload it to cloud storage:
async function processUpload({ filepath, mimeType }: Upload, userId: string) {
const fileId = uuidv4();
const bucket = admin.storage().bucket();
await bucket.upload(filepath, {
destination: `users/${userId}/${fileId}`,
metadata: {
contentType: mimeType,
},
});
return { uploadId: fileId };
}
Mobile Side
Then, on the mobile side, you can interact with it like this:
async function uploadFile(uri: string) {
function getFunctionsUrl(): string {
if (USE_EMULATOR) {
const origin =
Constants?.manifest?.debuggerHost?.split(":").shift() || "localhost";
const functionsPort = 5001;
const functionsHost = `http://${origin}:${functionsPort}/{PROJECT_NAME}/${PROJECT_LOCATION}`;
return functionsHost;
} else {
return `https://{PROJECT_LOCATION}-{PROJECT_NAME}.cloudfunctions.net`;
}
}
// The url of your endpoint. Make this as smart as you want.
const url = `${getFunctionsUrl()}/uploadPhoto`;
await FileSystem.uploadAsync(url, uri, {
httpMethod: "POST",
uploadType: FileSystem.FileSystemUploadType.MULTIPART,
fieldName: "file", // Important! make sure this matches however you want bussboy to validate the "name" field on file.
mimeType,
headers: {
"content-type": "multipart/form-data",
Authorization: `${idToken}`,
},
});
}
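A hypothetical call site, assuming the newer expo-image-picker API supplies the local uri:
// Sketch: pick an image and send it through the endpoint above
import * as ImagePicker from "expo-image-picker";

const result = await ImagePicker.launchImageLibraryAsync({ quality: 0.8 });
if (!result.canceled && result.assets?.[0]) {
  await uploadFile(result.assets[0].uri);
}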
TLDR
Wrap Cloud Storage in your own endpoint, treat it like a normal http upload, everything plays nice.

How to write a mutation resolver that handles image AND form data upload stream?

So, currently I’m able to successfully upload an image through the client to an S3 bucket. However, my question is, how would I go about including additional form fields, such as an accompanying title and/or body field, with the existing image file?
The server component primarily utilizes apollo-server-express, graphql-upload, and aws-sdk, while the client just uses apollo-client.
My end goal is to essentially be able to take the user’s form data on submit that consists of multiple text fields and an image file, and then upload the contents to the s3 bucket in its own individual directory.
Would it be possible to compile the non-image-file form data as a json, and then upload both the files (the json and the image file) as a batch?
Here are some snippets for some context:
// server/typedefs.js
const typeDefs = gql`
scalar Upload
type File {
id: ID!
filename: String!
mimetype: String!
encoding: String!
}
type Mutation {
singleUploadStream(file: Upload!): File!
}
type Query {
files: [File]
}
`;
// server/resolvers.js
...
...
Mutation: {
singleUploadStream: async (parent, args) => {
const file = await args.file;
const { createReadStream, filename } = file;
const fileStream = createReadStream();
const uploadParams = {
Bucket: process.env.BUCKET_NAME,
Key: `uploads/${filename}/${filename}`,
Body: fileStream,
};
await s3.upload(uploadParams).promise();
return file;
},
}
...
...
// server/index.js
...
...
const app = express();
const server = new ApolloServer({
typeDefs,
resolvers,
uploads: false,
introspection: true,
});
app.use(graphqlUploadExpress());
server.applyMiddleware({ app });
app.listen({ port: 4000 }, () => {
console.log(`🚀 Server ready at http://localhost:4000${server.graphqlPath}`);
});
please don’t hesitate to ask if you need more info or if you’re not sure what I’m asking for!
I really appreciate any help I can get ☹️ Thank you!
My resources:
https://www.apollographql.com/blog/graphql-file-uploads-with-react-hooks-typescript-amazon-s3-tutorial-ef39d21066a2/
https://dev.to/fhpriamo/painless-graphql-file-uploads-with-apollo-server-to-amazon-s3-and-local-filesystem-1bn0
https://www.thomasmaximini.com/upload-images-to-aws-s3-with-react-and-apollo-graphql
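One way to approach this (a sketch under assumptions: a hypothetical uploadPost mutation, aws-sdk v2's s3.upload as in the question, and illustrative key names) is to pass the text fields as ordinary mutation arguments next to the Upload scalar, then write a meta.json beside the image so each upload gets its own directory:
// typeDefs: add ordinary arguments next to the Upload scalar
//   uploadPost(file: Upload!, title: String!, body: String): File!
// resolver sketch:
Mutation: {
  uploadPost: async (parent, { file, title, body }) => {
    const { createReadStream, filename, mimetype, encoding } = await file;
    const dir = `uploads/${filename}`;
    await Promise.all([
      s3.upload({
        Bucket: process.env.BUCKET_NAME,
        Key: `${dir}/${filename}`,
        Body: createReadStream(),
      }).promise(),
      s3.upload({
        Bucket: process.env.BUCKET_NAME,
        Key: `${dir}/meta.json`,
        Body: JSON.stringify({ title, body }),
        ContentType: "application/json",
      }).promise(),
    ]);
    return { id: filename, filename, mimetype, encoding }; // id is illustrative
  },
},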

Accessing Vuex store in Nuxt project from JS file

In my Nuxt project I have a file named "apiAccess.js" in the root folder. This file simply exports a bunch of functions that make Ajax calls to the server API. This file is imported in any page that needs access to the server API. I need to send a JWT token with each of these api requests, and I have stored that token in the Vuex store.
I need to access the JWT token from the Vuex store within this "apiAccess.js" file. Unfortunately, this.$store is not recognized within this file. How do I access the Vuex store from within this file? Or should I have done something differently?
Here's a snippet from the apiAccess.js file where I try to access the store:
import axios from 'axios'
const client = axios.create({
baseURL: 'http://localhost:3000/api',
json: true,
headers: { Authorization: 'Bearer ' + this.$store.state.auth.token }
})
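For the question itself, a common pattern (a minimal sketch, assuming Nuxt 2 with Vuex enabled; the $api name is an arbitrary choice) is to move the client into a plugin, because plugins receive the app context, which includes the store:
// plugins/apiAccess.js (registered in nuxt.config.js under `plugins`)
import axios from 'axios'

export default ({ store }, inject) => {
  const client = axios.create({ baseURL: 'http://localhost:3000/api' })
  // Read the token per request so it stays current after login/refresh
  client.interceptors.request.use((config) => {
    config.headers.Authorization = 'Bearer ' + store.state.auth.token
    return config
  })
  inject('api', client) // available as this.$api in components and app.$api elsewhere
}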
After I read this post, I used this generic structure:
// generic actions file
import {
SET_DATA_CONTEXT,
SET_ITEM_CONTEXT
} from '@/types/mutations'
// PAGEACTIONS
export const getDataContext = api => async function ({ commit }) {
const data = await this[api].get()
commit(SET_DATA_CONTEXT, data)
}
export const getItemContext = api => async function ({ commit }, id) {
const data = await this[api].getById(id)
commit(SET_ITEM_CONTEXT, data)
}
export const createItemContext = api => async function ({}, form) {
await this[api].create(form)
}
export const updateItemContext = api => async function ({}, form) {
await this[api].update(form)
}
export const deleteItemContext = api => async function ({}, id) {
await this[api].delete(id)
}
and in each store I used the actions from my generic file:
// any store file
import {
getDataContext,
getItemContext,
createItemContext,
updateItemContext,
deleteItemContext,
setDynamicModal
} from '@/use/store.actions'
const API = '$rasterLayerAPI'
export const state = () => ({
dataContext: [],
itemContext: {},
})
export const actions = {
createItemContext: createItemContext(API),
getDataContext: getDataContext(API),
getItemContext: getItemContext(API),
updateItemContext: updateItemContext(API),
deleteItemContext: deleteItemContext(API),
}
because I had many stores with similar features.
I did the same for mutations, using generic mutation functions.
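For completeness, those generic mutations could look like this (a sketch mirroring the actions above; SET_DATA_CONTEXT and SET_ITEM_CONTEXT are the types imported earlier, and the file path is illustrative):
// generic mutations file, e.g. '@/use/store.mutations'
import { SET_DATA_CONTEXT, SET_ITEM_CONTEXT } from '@/types/mutations'

export default {
  [SET_DATA_CONTEXT] (state, data) {
    state.dataContext = data
  },
  [SET_ITEM_CONTEXT] (state, item) {
    state.itemContext = item
  }
}
// in any store file: import mutations from '@/use/store.mutations'; export { mutations }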

How to check downloaded file name?

I wrote a test that downloads the export file, but I also need to send this file through email. The problem is that the filename is always different, and I don't know how to look it up during the test.
You can retrieve the dynamic downloaded filename from the 'content-disposition' header.
import { Selector, RequestLogger } from 'testcafe';
const url = 'https://demos.devexpress.com/ASPxGridViewDemos/Exporting/Exporting.aspx';
const logger = RequestLogger({ url, method: 'post' }, {
logResponseHeaders: true
});
fixture `Export`
.page(url)
.requestHooks(logger);
test('export to csv', async t => {
const exportToCSVButton = Selector('span').withText('Export to CSV');
await t
.click(exportToCSVButton)
.expect(logger.contains(r => r.response.statusCode === 200)).ok();
console.log(logger.requests[0].response.headers['content-disposition']);
});
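To pull the actual filename out of that header, a small helper like this works (a sketch; real-world headers may also use the RFC 5987 filename*= form):
// Extract the filename from a header such as: attachment; filename="Export.csv"
function getFilenameFromContentDisposition (header) {
    const match = /filename\*?=(?:UTF-8'')?"?([^";]+)"?/i.exec(header || '');
    return match ? decodeURIComponent(match[1]) : null;
}

// e.g. inside the test, after the click:
console.log(getFilenameFromContentDisposition(
    logger.requests[0].response.headers['content-disposition']
));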
See also: Check the Downloaded File Name and Content

React-Admin <ImageInput> to upload images to rails api

I am trying to upload images from react-admin to a Rails API backend using Active Storage.
In the react-admin documentation it says: "Note that the image upload returns a File object. It is your responsibility to handle it depending on your API behavior. You can for instance encode it in base64, or send it as a multi-part form data." I am trying to send it as a multi-part form.
I have been reading here and there, but I cannot find what I want, or at least a roadmap of how I should proceed.
You can actually find an example in the dataProvider section of the documentation.
You have to decorate your dataProvider to enable the data upload. Here is the example of transforming the images into base64 strings before posting the resource:
// in addUploadFeature.js
/**
* Convert a `File` object returned by the upload input into a base 64 string.
* That's not the most optimized way to store images in production, but it's
* enough to illustrate the idea of data provider decoration.
*/
const convertFileToBase64 = file => new Promise((resolve, reject) => {
const reader = new FileReader();
reader.readAsDataURL(file.rawFile);
reader.onload = () => resolve(reader.result);
reader.onerror = reject;
});
/**
* For posts update only, convert uploaded image in base 64 and attach it to
* the `picture` sent property, with `src` and `title` attributes.
*/
const addUploadFeature = requestHandler => (type, resource, params) => {
if (type === 'UPDATE' && resource === 'posts') {
// notice that following condition can be true only when `<ImageInput source="pictures" />` component has parameter `multiple={true}`
// if parameter `multiple` is false, then data.pictures is not an array, but single object
if (params.data.pictures && params.data.pictures.length) {
// only freshly dropped pictures are instance of File
const formerPictures = params.data.pictures.filter(p => !(p.rawFile instanceof File));
const newPictures = params.data.pictures.filter(p => p.rawFile instanceof File);
return Promise.all(newPictures.map(convertFileToBase64))
.then(base64Pictures => base64Pictures.map((picture64, index) => ({
src: picture64,
title: `${newPictures[index].title}`,
})))
.then(transformedNewPictures => requestHandler(type, resource, {
...params,
data: {
...params.data,
pictures: [...transformedNewPictures, ...formerPictures],
},
}));
}
}
// for other request types and resources, fall back to the default request handler
return requestHandler(type, resource, params);
};
export default addUploadFeature;
You can then apply this on your dataProvider:
// in dataProvider.js
import simpleRestProvider from 'ra-data-simple-rest';
import addUploadFeature from './addUploadFeature';
const dataProvider = simpleRestProvider('http://path.to.my.api/');
const uploadCapableDataProvider = addUploadFeature(dataProvider);
export default uploadCapableDataProvider;
Finally, you can use it in your admin as usual:
// in App.js
import { Admin, Resource } from 'react-admin';
import dataProvider from './dataProvider';
import PostList from './posts/PostList';
const App = () => (
<Admin dataProvider={dataProvider}>
<Resource name="posts" list={PostList} />
</Admin>
);
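On the form side, the matching ImageInput could look like this (a sketch; multiple is set so params.data.pictures arrives as an array, which is what the decorator above expects):
// in posts/PostEdit.js
import { Edit, SimpleForm, TextInput, ImageInput, ImageField } from 'react-admin';

export const PostEdit = (props) => (
  <Edit {...props}>
    <SimpleForm>
      <TextInput source="title" />
      <ImageInput source="pictures" accept="image/*" multiple>
        <ImageField source="src" title="title" />
      </ImageInput>
    </SimpleForm>
  </Edit>
);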
When using files, use a multi-part form in the React front-end and, for example, multer in your API backend.
In react-admin you should create a custom dataProvider: extend the default one or build your own. In that implementation you handle the file (or files) upload. For uploading a file or files from your custom dataProvider in react-admin:
// dataProvider.js
// this is only the implementation for a create
case "CREATE":
const formData = new FormData();
for ( const param in params.data ) {
// 1 file
if (param === 'file') {
formData.append('file', params.data[param].rawFile);
continue
}
// when using multiple files
if (param === 'files') {
params.data[param].forEach(file => {
formData.append('files', file.rawFile);
});
continue
}
formData.append(param, params.data[param]);
}
return httpClient(`myendpoint.com/upload`, {
method: "POST",
body: formData,
}).then(({ json }) => ({ data: json }));
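For reference, the httpClient used above is typically a thin wrapper around react-admin's fetchUtils.fetchJson (a sketch; the token handling is illustrative, and don't force a Content-Type header yourself so the browser can set the multipart boundary):
// sketch of the httpClient used above
import { fetchUtils } from 'react-admin';

const httpClient = (url, options = {}) => {
    if (!options.headers) {
        options.headers = new Headers();
    }
    // illustrative auth; adapt to your authProvider
    options.headers.set('Authorization', `Bearer ${localStorage.getItem('token')}`);
    return fetchUtils.fetchJson(url, options);
};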
From there you pick it up in your API using multer, which supports multi-part forms out of the box. When using NestJS, that could look like:
import {
Controller,
Post,
Header,
UseInterceptors,
UploadedFile,
} from "#nestjs/common";
import { FileInterceptor } from '#nestjs/platform-express'
#Controller("upload")
export class UploadController {
#Post()
#Header("Content-Type", "application/json")
// multer extracts file from the request body
#UseInterceptors(FileInterceptor('file'))
async uploadFile(
#UploadedFile() file : Record<any, any>
) {
console.log({ file })
}
}