I'm learning to use Redis as my backend database, and I'd like to try redis-om with Fastify. I'm not sure whether they are compatible, but I got an error.
I'm using the hosted service at app.redislabs.com.
What did I mess up, and how can I fix it?
server.js
// Fastify instance (assumed here; the original snippet omits it)
const app = require("fastify")({ logger: true });
const { createCar, createIndex } = require("./redis");
app.post("/add", async (req, res) => {
await createIndex();
const { make, model, image, description } = req.body;
const data = { make, model, image, description };
await createCar(data);
res.code(200).send('ok');
});
const PORT = 5000;
app.listen(PORT, function (err) {
if (err) {
app.log.error(err);
process.exit(1);
}
});
redis.js
const { Client, Entity, Schema, Repository } = require("redis-om");
const client = new Client();
const connect = async () => {
if (!client.isOpen()) {
await client.open("redis://default:password@localhost:6379");
} else {
console.log("CONNECTED");
}
};
class Car extends Entity {}
let schema = new Schema(
Car,
{
make: { type: "string" },
model: { type: "string" },
image: { type: "string" },
description: { type: "string" },
},
{ dataStructure: "JSON" }
);
const createCar = async (data) => {
await connect();
const repository = new Repository(schema, client);
const car = repository.createEntity(data);
const id = await repository.save(car);
return id;
};
const createIndex = async () => {
await connect();
const repository = new Repository(schema, client);
await repository.createIndex();
};
module.exports = {
createCar,
createIndex,
};
My JSON Body
You cannot call new on Repository. This is a breaking change I introduced in version 0.2.0 of Redis OM. There are a couple of others that are documented in the CHANGELOG.
Call const repository = client.fetchRepository(schema) instead, as shown here. Unfortunately, there are some videos and blogs that have the older syntax and so this crops up from time to time.
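For reference, here is a minimal sketch of how createCar and createIndex from the question could look on 0.2.x. It reuses the connect, client, and schema already defined in redis.js; the exact entity-building calls may differ slightly between 0.2.x and later releases.

const createCar = async (data) => {
  await connect();
  // fetchRepository replaces `new Repository(schema, client)`
  const repository = client.fetchRepository(schema);
  const car = repository.createEntity();
  car.make = data.make;
  car.model = data.model;
  car.image = data.image;
  car.description = data.description;
  // save() returns the generated entity id
  return await repository.save(car);
};

const createIndex = async () => {
  await connect();
  await client.fetchRepository(schema).createIndex();
};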
Thanks for using my library!
Related
So I'm trying to send an image to our server from React Native using a GraphQL mutation, and I don't know why, but it always returns an error: [CombinedError: [Network] Network request failed].
The query :
import { graphql } from '../../gql';
import { gql, useMutation } from 'urql';
const AddProfilePicture_Mutation = graphql(`
mutation AddPicture_Mutation ($userId: ID!, $picture: Upload!) {
uploadProfilePicture(input: {
userId: $userId
picture: $picture
}) {
id
}
}`);
export const useAddProfilePicture = () => {
const [{fetching, error}, execute] = useMutation(AddProfilePicture_Mutation);
return {
error: !!error,
fetching,
addProfilePicture: execute,
}
}
and the code:
const pictureHandler = async () => {
const options = {
mediaType: 'photo' as MediaType,
includeBase64: true,
selectionLimit: 1,
};
const profilePicture = await launchImageLibrary(options);
if (profilePicture.assets?.[0].fileSize && profilePicture.assets?.[0].fileSize > MAXFILESIZE) {
showError(t('profileScreen.PictureSize'));
}
if (profilePicture.assets?.[0].uri && profilePicture.assets[0].fileName && profilePicture.assets[0].type) {
// const myBlob = await fetch(profilePicture.assets[0].uri).then(r => r.blob());
const blob = new Blob([profilePicture.assets[0].base64 as BlobPart], {
type: profilePicture.assets[0].type,
});
const file = new File([blob], profilePicture.assets[0].fileName, { type: `${profilePicture.assets[0].type}`});
const {error} = await addProfilePicture(
{ userId: userId!, picture: file},
{ fetchOptions: { headers: { 'graphql-require-preflight': '' } } }
);
if (!error) {
showSuccess(t('profileScreen.PictureSuccessAdded'));
navigation.navigate('UserProfile');
} else {
console.log(error);
showError(t('profileScreen.PictureErrorAdded'));
}
};
};
I've been trying everything I found on the web: FormData, react-native-blob-util, and rn-fetch-blob. If I try sending anything other than a File, the server rejects it and says, for example:
Variable 'picture' has an invalid value: Expected type org.springframework.web.multipart.MultipartFile but was java.util.LinkedHashMap]
Update:
After long research and help from other programmers, we never found the answer. We opened a new endpoint in the backend specifically for the uploaded picture and used a regular fetch POST.
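Roughly, the workaround looks like this. This is only a sketch: the endpoint URL, field names, and error handling are placeholders, not our actual backend.

const uploadProfilePicture = async (uri: string, fileName: string, type: string, userId: string) => {
  const formData = new FormData();
  // React Native's fetch accepts { uri, name, type } descriptors as multipart parts
  formData.append('picture', { uri, name: fileName, type } as any);
  formData.append('userId', userId);
  const response = await fetch('https://example.com/api/profile-picture', {
    method: 'POST',
    body: formData,
    // No manual Content-Type header: fetch sets the multipart boundary itself
  });
  if (!response.ok) {
    throw new Error(`Upload failed with status ${response.status}`);
  }
  return response.json();
};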
So I have a project using the latest Next.js 13, React 18, urql 3, and TypeScript.
Currently, I have an issue when trying to run a urql query from the getStaticProps function. My urql request needs a guest token, and I'm storing that token in session storage (other suggestions?).
The query works fine when it runs on the client, but fails when it runs inside getStaticProps.
My suspicion is that the problem is the token read: the server cannot read the sessionStorage value.
What is the best and simplest way to make this work?
Would storing the guest token in a cookie make this work, or is it the configuration that doesn't work?
This is my current config for urql.ts
import {
createClient,
ssrExchange,
dedupExchange,
cacheExchange,
fetchExchange,
} from "urql";
import { GRAPH_URL } from "@lib/constant/env";
import type { TypedDocumentNode } from "@urql/core";
const isServerSide = typeof window === "undefined";
const ssrCache = ssrExchange({
isClient: !isServerSide,
});
const client = createClient({
url: GRAPH_URL,
exchanges: [dedupExchange, cacheExchange, ssrCache, fetchExchange],
fetchOptions: () => {
const token = sessionStorage.getItem("accessToken");
return {
headers: {
authorization: token ? `Bearer ${token}` : "",
},
};
},
});
const query = async (
query: TypedDocumentNode<any, object>,
variables?: Record<string, string | string[] | unknown>
) => {
try {
const response = await client.query(query, variables as any).toPromise();
return response;
} catch (error) {
if (error instanceof Error) console.error(error.message);
}
};
const mutation = async (
mutation: TypedDocumentNode<any, object>,
variables?: Record<string, string | string[] | unknown>
) => {
try {
const response = await client
.mutation(mutation, variables as any)
.toPromise();
return response;
} catch (error) {
if (error instanceof Error) console.error(error.message);
}
};
export { client, query, mutation, ssrCache };
And this is some of the code for the blog index page:
export const getStaticProps = async () => {
await fetchArticlesSummary();
return {
props: {
urqlState: ssrCache.extractData(),
},
revalidate: 600,
};
};
export default withUrqlClient(() => ({
url: GRAPH_URL,
}))(BlogPage);
This is the fetchArticlesSummary function:
export const fetchArticlesSummary = async () => {
try {
const {
data: { listArticles },
}: any = await query(getListArticle);
return listArticles.items;
} catch (error) {
return {
notFound: true,
};
}
};
I also do this setup in _app.tsx:
export default function App({ Component, pageProps }: AppProps) {
if (pageProps.urqlState) {
ssrCache.restoreData(pageProps.urqlState);
}
return (
<Provider value={client}>
<Component {...pageProps} />
</Provider>
);
}
Thank you.
I have followed the urql documentation on server-side configuration, and many other guides, but I still don't have a solution.
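One idea I'm considering is to guard the browser-only read and let server-side callers pass a token explicitly, since getStaticProps runs at build/revalidate time without a request. This is only a sketch (it reuses the imports and ssrCache from the urql.ts above), not something I have verified:

const getAccessToken = (): string | null => {
  // sessionStorage only exists in the browser; getStaticProps runs on the server
  if (typeof window === "undefined") return null;
  return sessionStorage.getItem("accessToken");
};

const client = createClient({
  url: GRAPH_URL,
  exchanges: [dedupExchange, cacheExchange, ssrCache, fetchExchange],
  fetchOptions: () => {
    const token = getAccessToken();
    return {
      headers: { authorization: token ? `Bearer ${token}` : "" },
    };
  },
});

const query = async (
  queryDoc: TypedDocumentNode<any, object>,
  variables?: Record<string, unknown>,
  token?: string // explicit token for server-side callers such as getStaticProps
) => {
  return client
    .query(
      queryDoc,
      variables as any,
      token
        ? { fetchOptions: { headers: { authorization: `Bearer ${token}` } } }
        : undefined
    )
    .toPromise();
};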
I am trying to upload a lot of files from S3 to IPFS via Pinata. I haven't found anything like this in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will soon; I need to code a few things first).
Is this a correct approach? Has anyone done something similar?
async uploadImagesFolder(
items: ItemDocument[],
bucket?: string,
path?: string,
) {
try {
const form = new FormData();
for (const item of items) {
const file = getObjectStream(item.tokenURI, bucket, path);
form.append('file', file, {
filename: item.tokenURI,
});
}
console.log(`Uploading files to IPFS`);
const pinataOptions: PinataOptions = {
cidVersion: 1,
};
const result = await pinata.pinFileToIPFS(form, {
pinataOptions,
});
console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
return result.IpfsHash;
} catch (e) {
console.error(e);
}
}
I had the same problem.
I found this article: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But, if I'm not wrong, the article uses an older version of the AWS SDK for JavaScript than v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript. If you are on v3, this code snippet works for me:
import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3'

export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
try {
const BUCKET = data.bucketTarget
const KEY = data.key
const client = new S3Client({
region: 'your-region',
credentials: {
accessKeyId: 'your-access-key',
secretAccessKey: 'secret-key'
}
})
const resource = await client.send(new GetObjectCommand({
Bucket: BUCKET,
Key: KEY
}))
const response = resource.Body
if (response) {
return new Response(await response.transformToByteArray()).blob()
}
return null
} catch (error) {
return null
}
}
With the previous code you get a Blob that you can wrap in a File; then this method pins it through the Pinata API and returns the gateway URL:
import axios from 'axios'

export const uploadFileToIPFS = async (file: File) => {
const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
const data = new FormData()
data.append('file', file)
try {
const response = await axios.post(url, data, {
maxBodyLength: Infinity,
headers: {
pinata_api_key: 'your-api',
pinata_secret_api_key: 'your-secret'
},
data: data
})
return {
success: true,
pinataURL: `https://gateway.pinata.cloud/ipfs/${ response.data.IpfsHash }`
}
} catch (error) {
console.log(error)
return null
}
}
I found this solution in this nice article, where you can also explore other implementations (including the Node.js side).
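For completeness, a rough sketch of how the two helpers above could be wired together on the client. The bucket name, key, filename, and MIME type are placeholders, and YourParamsType is assumed to carry the bucketTarget and key fields used above.

export const pinImageFromS3 = async () => {
  // Stream the object out of S3 as a Blob using the first helper
  const blob = await getStreamObjectInAwsS3({ bucketTarget: 'my-bucket', key: 'images/token-1.png' })
  if (!blob) return null
  // Wrap the Blob in a File so the multipart part carries a filename
  const file = new File([blob], 'token-1.png', { type: 'image/png' })
  // Pin the file and get back the gateway URL from the second helper
  return uploadFileToIPFS(file)
}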
I'm building a GraphQL API, and I want to test some resolvers and the database with Jest.
Here is my helper file, where I set up the context and the Prisma Client for testing.
import { PrismaClient } from "@prisma/client";
import { ServerInfo } from "apollo-server";
import { execSync } from "child_process";
import getPort, { makeRange } from "get-port";
import { GraphQLClient } from "graphql-request";
import { nanoid } from "nanoid";
import { join } from "path";
import { Client } from "pg";
import { server } from "../api/server";
type TestContext = {
client: GraphQLClient;
db: PrismaClient;
};
export function createTestContext(): TestContext {
let ctx = {} as TestContext;
const graphqlCtx = graphqlTestContext();
const prismaCtx = prismaTestContext();
beforeEach(async () => {
const client = await graphqlCtx.before();
const db = await prismaCtx.before();
Object.assign(ctx, {
client,
db,
});
});
afterEach(async () => {
await graphqlCtx.after();
await prismaCtx.after();
});
return ctx;
}
function graphqlTestContext() {
let serverInstance: ServerInfo | null = null;
return {
async before() {
const port = await getPort({ port: makeRange(4000, 6000) });
serverInstance = await server.listen({ port });
return new GraphQLClient(`http://localhost:${port}`);
},
async after() {
serverInstance?.server.close();
},
};
}
function prismaTestContext() {
const prismaBinary = join(__dirname, "..", "node_modules", ".bin", "prisma");
let schema = "";
let databaseUrl = "";
let prismaClient: null | PrismaClient = null;
return {
async before() {
schema = `test_${nanoid()}`;
databaseUrl = `postgresql://user:123@localhost:5432/testing?schema=${schema}`;
process.env.DATABASE_URL = databaseUrl;
execSync(`${prismaBinary} migrate up --create-db --experimental`, {
env: {
...process.env,
DATABASE_URL: databaseUrl,
},
});
prismaClient = new PrismaClient();
return prismaClient;
},
async after() {
const client = new Client({
connectionString: databaseUrl,
});
await client.connect();
await client.query(`DROP SCHEMA IF EXISTS "${schema}" CASCADE`);
await client.end();
await prismaClient?.$disconnect();
},
};
}
My test file looks like this:
import { createTestContext } from "./__helpers";
const ctx = createTestContext();
it("register user", async () => {
const testUser = {
username: "Test",
email: "test#test.com",
password: "password",
};
const registerResult = await ctx.client.request(
`
mutation registerNewUser($username: String!, $email: String!, $password: String!) {
register(username: $username, email: $email, password: $password) {
user {
user_id
username
email
}
}
}
`,
{
username: testUser.username,
email: testUser.email,
password: testUser.password,
}
);
const resultUsername = registerResult.register.user.username;
const resultEmail = registerResult.register.user.email;
const resultUserID = registerResult.register.user.user_id;
expect(resultUsername).toBe(testUser.username);
expect(resultEmail).toBe(testUser.email);
expect(resultUserID).not.toBeNull();
const users = await ctx.db.user.findMany();
const savedUser = users[0];
expect(savedUser.username).toBe(testUser.username);
expect(savedUser.email).toBe(testUser.email);
expect(savedUser.user_id).toBe(resultUserID);
expect(savedUser.first_name).toBeNull();
expect(savedUser.last_name).toBeNull();
expect(savedUser.role).toBe("USER");
expect(savedUser.password).not.toBe(testUser.password);
});
it("all events", async () => {
const eventsResult = await ctx.client.request(
`
query {
allEvents {
event_id
title
description
}
}
`
);
expect(eventsResult.allEvents.length).toBe(0)
});
When I run just one file with one test in it, everything works. But when I run multiple tests in one file, the first one runs normally and the ones after it fail. I receive this error:
The table `test_LjrcmbMjI4vLaDYM9-lvw.Event` does not exist in the current database.: {"response":{"errors":[{"message":"\nInvalid `prisma.event.findMany()` invocation:\n\n\n The table `test_LjrcmbMjI4vLaDYM9-lvw.Event` does not exist in the current database.","locations":[{"line":3,"column":7}],"path":["allEvents"],"extensions":{"code":"INTERNAL_SERVER_ERROR","exception":{"code":"P2021","clientVersion":"2.11.0","meta":{"table":"test_LjrcmbMjI4vLaDYM9-lvw.Event"}}}}],"data":null,"status":200},"request":{"query":"\n query {\n allEvents {\n event_id\n title\n description\n }\n }\n "}}
Also, when I run two tests in separate files, I get this error on every second test run:
listen EADDRINUSE: address already in use :::4200
I followed the Nexus tutorial (steps 4 and 5), where they explain how to test, but somehow it doesn't work. So please help me.
https://nexusjs.org/docs/getting-started/tutorial
I have created a repo with parallel tests for this here. The test environment setup is in the prisma folder, and a similar helper is created in the tests folder.
I'm testing a simple rule:
match /users/{userId} {
allow write, get: if isSignedIn() && userOwner(userId);
}
function isSignedIn(){
return request.auth != null
}
function userOwner(userId){
return userId == request.auth.uid
}
Here is my test:
test("read succeed only if requested user is authenticated user", async () => {
const db = await setup(
{
uid: "testid",
email: "test#test.com"
},
{
"users/testid": {},
"users/anotherid": {}
}
);
const userRef = db.collection("users");
expect(await assertSucceeds(userRef.doc("testid").get()));
expect(await assertFails(userRef.doc("anotherid").get()));
})
And the setup method:
export const setup = async (auth?: any, data?: any) => {
const projectId = `rules-spec-${Date.now()}`;
const app = firebase.initializeTestApp({
projectId,
auth
});
const db = app.firestore();
if (data) {
for (const key in data) {
const ref = db.doc(key);
await ref.set(data[key]);
}
}
await firebase.loadFirestoreRules({
projectId,
rules: fs.readFileSync("firestore.rules").toString()
});
return db;
};
It throws the following error:
FirebaseError: 7 PERMISSION_DENIED:
false for 'create' @ L5, Null value error. for 'create' @ L9
It seems that when it tries to set the mock data given in setup, it can't because of the write rule. But I don't understand: I load the rules after the database is seeded.
Any idea what's going on here?
You can try setting the rules to be fully open before you populate the database.
After the data is set, you load the rules you actually want to test.
export const setup = async (auth?: any, data?: any) => {
const projectId = `rules-spec-${Date.now()}`;
const app = firebase.initializeTestApp({
projectId,
auth
});
const db = app.firestore();
await firebase.loadFirestoreRules({
projectId,
rules:
"service cloud.firestore {match/databases/{database}/documents" +
"{match /{document=**} {" +
"allow read, write: if true;" +
"}}}",
});
if (data) {
for (const key in data) {
const ref = db.doc(key);
await ref.set(data[key]);
}
}
await firebase.loadFirestoreRules({
projectId,
rules: fs.readFileSync("firestore.rules").toString()
});
return db;
};