GraphQL schema won't import - express

I'm trying to set up an Express GraphQL server by following a tutorial. When I put the following in the server startup file like this:
// ENTIRE SCHEMA IN MAIN FILE THIS WORKS!!!
...
var graphql = require('graphql');
const RootQuery = new graphql.GraphQLObjectType({
  name: 'RootQuery',
  description: 'The root query',
  fields: {
    viewer: {
      type: graphql.GraphQLString,
      resolve() {
        return 'viewer!';
      }
    }
  }
});

const Schema = new graphql.GraphQLSchema({
  query: RootQuery
});

app.use('/graphql', graphqlHTTP({ schema: Schema }));
...
it works, returning the data 'viewer!'. But as I don't want everything in the main file, I tried to move this exact code to another file and import it like this:
//THIS DOES NOT WORK
...
var Schema = require('./build/models/graphql/schema');
app.use('/graphql', graphqlHTTP({ schema: Schema }));
...
I get the following error:
{
  "errors": [
    {
      "message": "Schema must be an instance of GraphQLSchema. Also ensure that there are not multiple versions of GraphQL installed in your node_modules directory."
    }
  ]
}
I'm not sure what I'm doing wrong. In case this has anything to do with it, I am writing in ES6 and transpiling back to ES5 in a build script. Here's the built schema file:
// TRANSPILED SCHEMA
'use strict';
Object.defineProperty(exports, "__esModule", {
  value: true
});

var graphql = require('graphql');

var RootQuery = new graphql.GraphQLObjectType({
  name: 'RootQuery',
  description: 'The root query',
  fields: {
    viewer: {
      type: graphql.GraphQLString,
      resolve: function resolve() {
        return 'viewer!';
      }
    }
  }
});

var Schema = new graphql.GraphQLSchema({
  query: RootQuery
});

exports.default = Schema;
And here is my package.json:
"express": "^4.13.4",
"express-graphql": "^0.5.3",
"graphql": "^0.6.0",
I've checked that only one graphql is in the node_modules folder. Does graphql expect the same instance across all modules, like a shared global instance? Does express-graphql use its own version? How do I check? I'm new to Node; is there a way to check the instances?

I don't think this is a GraphQL problem, but a problem with how you're using require and exports in JS. The problem is probably:
var Schema = require('./build/models/graphql/schema')
along with
var Schema = new graphql.GraphQLSchema({
  query: RootQuery
});
exports.default = Schema;
You're not importing the same value that you're exporting. Try either exporting the schema with module.exports = Schema, or importing it with Schema = require("./...").default.
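For example, either switch the schema file to a plain CommonJS export:

// at the bottom of the schema file
module.exports = Schema;

or keep the Babel-style exports.default and unwrap it where you import it (a minimal sketch, using the path from your question):

var Schema = require('./build/models/graphql/schema').default;
app.use('/graphql', graphqlHTTP({ schema: Schema }));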

Also ensure that there are not multiple versions of GraphQL installed in your node_modules directory.
As the error indicates, this most likely has to do with more than one copy of GraphQL in your node_modules directory. Have you checked that? If there is more than one copy, you might be able to solve it by running npm dedupe if you're using npm version 2. If you're using npm 3, then most likely you've installed two different versions of the graphql module.
Either way, you have to make sure that after the compile step, express-graphql and your schema both point to the same copy of the graphql module.
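A quick way to check, from the main file, whether the imported value really is a schema built with the same graphql copy that express-graphql will use (a minimal sketch, using the require path from the question):

var GraphQLSchema = require('graphql').GraphQLSchema;
var Schema = require('./build/models/graphql/schema').default;

// false here means either the wrong value was imported (the module object
// instead of its .default export) or the schema was built against a second
// copy of graphql somewhere in node_modules
console.log(Schema instanceof GraphQLSchema);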

If you find implementing & maintaining GraphQL services (mainly their schemas) cumbersome, please have a look at graphqly. Here's a sample of code to define GraphQL schemas:
import graphly from "graphqly";
const gBuilder = graphly.createBuilder();
// define types, inputs ... (in any order)
gBuilder.type("Products").implements("List").def(`
products: [Product]!
`);
gBuilder.type("Product").def(`
id: ID!
name: String!
link: String
price: Int
`);
// we're too lazy to define a separate input, so we can `extend` other structure
gBuilder.input("ProductInput").ext("Product");
gBuilder.enum("ProductOrder").def(`
PRICE_DESCENDING
PRICE_ASCENDING
NEWEST
`);

Related

SvelteKit, Supabase and Vercel (problem with Supabase when deploying to Vercel)

I'm trying to set up Sveltekit, Supabase and Vercel.
It works correctly on a local environment (SvelteKit and Supabase), but when I deploy it to Vercel there is a problem with Supabase - "Error: supabaseUrl is required" (I post a screenshot below).
If I don't use Supabase, there are no problems with deploying to Vercel.
Please share if you have encountered something similar or have a suggestion.
I finally got this to work after doing a couple of things I pieced together from a few sources.
First, I added the environment variables in Vercel just as they were in the .env file. For example, VITE_SUPABASE_URL and VITE_SUPABASE_ANON_KEY along with their values.
Next, I added some code in the svelte.config.js file. The result of the file looks like this:
import adapter from '@sveltejs/adapter-auto'

/** @type {import('@sveltejs/kit').Config} */
const config = {
  kit: {
    adapter: adapter(),
    vite: {
      define: {
        'process.env': process.env,
      },
    },
    // hydrate the <div id="svelte"> element in src/app.html
    target: '#svelte',
  },
}

export default config
I redeployed the project at Vercel, and it worked.
You should add your Supabase URL and Supabase anon key to Vercel, sticking to the naming used below (VITE_SUPABASE_URL, VITE_SUPABASE_ANON_KEY), if you have initialized the client according to the Supabase guide.
More than adding the configuration to your svelte.config.js file, you should focus on adding the environment variables to your Vercel environment if you have a file like this:
// utils/supabase.js
import { createClient } from '@supabase/supabase-js'
const supabaseUrl = import.meta.env.VITE_SUPABASE_URL
const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY
export const supabase = createClient(supabaseUrl, supabaseAnonKey)

Strapi v4 Extending Server API for Plugins does not work

I am trying to follow the Strapi v4.0.0 guide on https://docs.strapi.io/developer-docs/latest/developer-resources/plugin-api-reference/server.html#entry-file for extending the users-permission plugin to add a custom route/controller, but so far have been unsuccessful. I add the custom files as stated in the docs, but there is no change in the UI.
I managed to get this to work for a normal API, but was unable to do so for the users-permissions plugin.
In the previous version 3.6.8 this functionality was allowed through the extensions folder.
Am I missing something from the new guide? I even tried copying the files from node_modules > @strapi > plugin-users-permissions and adding a new route and method to the existing controller file, but it still does not reflect the change in the section where we assign different route permissions to roles. The users-permissions plugin still shows the original routes, with no change.
Thanks,
I ran into this thread while researching pretty much the same issue, and I wanted to share my solution.
First of all, I found this portion of the documentation more useful than the one you referenced: https://docs.strapi.io/developer-docs/latest/development/plugins-extension.html
My goal was to write a new route to validate JWT tokens, based on the comment made here: https://github.com/strapi/strapi/issues/3601#issuecomment-510810027 but updated for Strapi v4.
The solution turned out to be simple:
Create a new folder structure: ./src/extensions/users-permissions if it does not exist.
Create a new file ./src/extensions/users-permissions/strapi-server.js if it does not exist.
Add the following to the file:
module.exports = (plugin) => {
  plugin.controllers.<controller>['<new method>'] = async (ctx) => {
    // custom logic here
  }

  plugin.routes['content-api'].routes.push({
    method: '<method>',
    path: '/your/path',
    handler: '<controller>.<new method>',
    config: {
      policies: [],
      prefix: '',
    },
  });

  return plugin;
};
If you're unsure what controllers are available, you can always check the API documentation or console.log(plugin) or console.log(plugin.controllers).
After the admin server restarts, you should see your new route under the user-permissions section as you would expect, and you can assign rights to it as you see fit.
My full strapi-server.js file including the logic to validate JWT:
module.exports = (plugin) => {
  plugin.controllers.auth['tokenDecrypt'] = async (ctx) => {
    // get token from the POST request
    const { token } = ctx.request.body;

    // check token requirement
    if (!token) {
      return ctx.badRequest('`token` param is missing')
    }

    try {
      // decrypt the jwt
      const obj = await strapi.plugin('users-permissions').service('jwt').verify(token);

      // send the decrypted object
      return obj;
    } catch (err) {
      // if the token is not a valid token it will throw an error
      return ctx.badRequest(err.toString());
    }
  }

  plugin.routes['content-api'].routes.push({
    method: 'POST',
    path: '/token/validation',
    handler: 'auth.tokenDecrypt',
    config: {
      policies: [],
      prefix: '',
    },
  });

  return plugin;
};
When exporting routes you need to export the type, either content-api or admin. Look at the Strapi email plugin in node_modules for an example; change the folder and file structure in your routes folder to match that, and then you will be able to set permissions in the admin panel.
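A minimal sketch of that shape, assuming a plugin route file at server/routes/index.js (the path, handler and controller names are illustrative, not taken from the Strapi source):

module.exports = {
  'content-api': {
    type: 'content-api',
    routes: [
      {
        method: 'GET',
        path: '/my-route',
        handler: 'myController.find',
        config: { policies: [] },
      },
    ],
  },
};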
If your Strapi server is using Typescript, make sure that you name your extension files accordingly. So instead of strapi-server.js, you would need to name your file strapi-server.ts.

Nuxt.js env Property, understanding and how to use it?

Following https://nuxtjs.org/api/configuration-env,
I have been trying to set up my apiUrl in nuxt.config.js once for the whole project, like:
export default {
  env: {
    apiUrl: process.env.MY_REMOTE_CMS_API_URL || 'http://localhost:1337'
  }
}
adding this in nuxt.config.js, I'd expect (and would like) to have apiUrl accessible everywhere in the project.
In particular, it is needed for the 3 following cases:
with axios, to generate static pages from dynamic urls (in nuxt.config.js)
generate: {
  routes: function () {
    return axios.get(apiUrl + '/posts')
      .then((res) => {
        return res.data.filter(page => {
          return page.publish === true;
        }).map(page => {
          return {
            route: '/news/' + page.slug
          }
        })
      })
  }
},
with apollo, to get data via graphql (in nuxt.config.js)
apollo: {
  clientConfigs: {
    default: {
      httpEndpoint: apiUrl + '/graphql'
    }
  }
},
in every layout, page and components, as the base url of media:
<img :src="apiUrl + item.image.url" />
As you might see, the only thing I need is to 'print' the actual base URL of the CMS.
I have also tried to access it with process.env.apiUrl, with no success.
The only way I was able to make it has been to create an extra plugin/apiUrl.js file, which injects the api url, and seems wrong to me as I am now setting the apiUrl twice in my project.
I asked this question in the past, but in a much less clear way. It was suggested that I use dotenv, but from the docs it looks like an additional layer of complication that might not be necessary for a simpler setup.
Thanks.
I think dotenv module really is what you need.
This is my setup:
Project root has a .env file that contains
BASE_URL=https://www.myapi.com
require('dotenv').config() at top of nuxt.config.js
@nuxtjs/dotenv installed and added to buildModules of nuxt.config.js
env: { BASE_URL: process.env.BASE_URL} added to nuxt.config.js
axios: { baseURL: process.env.BASE_URL } added to nuxt.config.js (optional)
You should have access to your .env throughout the project. (process.env.BASE_URL)
I haven't used apollo, but you should be able to set the apollo endpoint with process.env.BASE_URL + '/graphql'
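Putting those steps together, a minimal sketch of the relevant parts of nuxt.config.js (BASE_URL is the variable name used in this answer; the apollo block assumes the same module configuration as in the question):

require('dotenv').config()

export default {
  buildModules: ['@nuxtjs/dotenv'],
  env: {
    BASE_URL: process.env.BASE_URL
  },
  axios: {
    baseURL: process.env.BASE_URL
  },
  apollo: {
    clientConfigs: {
      default: {
        httpEndpoint: process.env.BASE_URL + '/graphql'
      }
    }
  }
}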
As of Nuxt 2.13, @nuxtjs/dotenv is not required anymore. Read here
The concept that I was missing is that you set up a variable with the same name in your server / pipeline, so that you have your (always local, never pushed) .env file and a same-named variable remotely that is not added to your repo (where the value can be the same or different).

Upload images with apollo-upload-client in React Native

I'm trying out Prisma and React Native right now. Currently I'm trying to upload images to my db with the package apollo-upload-client (https://github.com/jaydenseric/apollo-upload-client). But it's not going so well.
Currently I can select an image with the ImagePicker from Expo. And then I'm trying to do my mutation with the Apollo Client:
await this.props.mutate({
  variables: {
    name,
    description,
    price,
    image,
  },
});
But I get the following error:
Network error: JSON Parse error: Unexpected identifier "POST"
- node_modules/apollo-client/bundle.umd.js:76:32 in ApolloError
- node_modules/apollo-client/bundle.umd.js:797:43 in error
And I believe it's from these lines of code:
const image = new ReactNativeFile({
  uri: imageUrl,
  type: 'image/png',
  name: 'i-am-a-name',
});
Which is almost identical to their example, https://github.com/jaydenseric/apollo-upload-client#react-native.
imageUrl is from my state. And when I console.log image I get the following:
ReactNativeFile {
  "name": "i-am-a-name",
  "type": "image/png",
  "uri": "file:///Users/martinnord/Library/Developer/CoreSimulator/Devices/4C297288-A876-4159-9CD7-41D75303D07F/data/Containers/Data/Application/8E899238-DE52-47BF-99E2-583717740E40/Library/Caches/ExponentExperienceData/%2540anonymous%252Fecommerce-app-e5eacce4-b22c-4ab9-9151-55cd82ba58bf/ImagePicker/771798A4-84F1-4130-AB37-9F382546AE47.png",
}
So something is popping out. But I can't get any further and I'm hoping I could get some tips from someone.
I also didn't include any code from the backend since I believe the problem lays on the frontend. But if anyone would like to take a look at the backend I can update the question, or you could take a look here: https://github.com/Martinnord/Ecommerce-server/tree/image_uploads.
Thanks a lot for reading! Cheers.
Update
After someone asked about the logic on the server, I have decided to paste it below:
Product.ts
// import shortid from 'shortid'
import { createWriteStream } from 'fs'
import { getUserId, Context } from '../../utils'

const storeUpload = async ({ stream, filename }): Promise<any> => {
  // const path = `images/${shortid.generate()}`
  const path = `images/test`

  return new Promise((resolve, reject) =>
    stream
      .pipe(createWriteStream(path))
      .on('finish', () => resolve({ path }))
      .on('error', reject),
  )
}

const processUpload = async upload => {
  const { stream, filename, mimetype, encoding } = await upload
  const { path } = await storeUpload({ stream, filename })
  return path
}

export const product = {
  async createProduct(parent, { name, description, price, image }, ctx: Context, info) {
    // const userId = getUserId(ctx)
    const userId = 1;

    console.log(image);
    const imageUrl = await processUpload(image);
    console.log(imageUrl);

    return ctx.db.mutation.createProduct(
      {
        data: {
          name,
          description,
          price,
          imageUrl,
          seller: {
            connect: { id: userId },
          },
        },
      },
      info
    )
  },
}
Solution has been found.
I am a little embarrassed that this was the problem I faced, and I don't know if I should even accept this answer because of how awkward I felt when I fixed the issue. But....
There was nothing wrong with my code, but there was a problem with the dependency versions. I tried to backtrack everything in my app, so I decided to start from the beginning and create a new account. I expected it to work just fine, but I got this error:
Error: Cannot use GraphQLNonNull "User!" from another module or realm.
Ensure that there is only one instance of "graphql" in the node_modules
directory. If different versions of "graphql" are the dependencies of other
relied on modules, use "resolutions" to ensure only one version is installed.
https://yarnpkg.com/en/docs/selective-version-resolutions
Duplicate "graphql" modules cannot be used at the same time since different
versions may have different capabilities and behavior. The data from one
version used in the function from another could produce confusing and
spurious results.
Then I understood that something (that I didn't think of) was wrong. I checked my dependency versions and compared them with Graphcool's example, https://github.com/graphcool/graphql-server-example/blob/master/package.json, and I noticed that my dependencies were outdated. So I upgraded them and everything worked! So that was what I had to do: update my dependencies.
Moral of the story
Always, always check your damn dependencies versions...
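For reference, the error above also suggests Yarn's resolutions field as a way to force a single graphql copy; a minimal package.json sketch (the version number is purely illustrative):

"resolutions": {
  "graphql": "^0.13.0"
}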
Crawling through your code, I have found this repository, which must be the front-end code if I am not mistaken?
As you've mentioned, apollo-upload-server requires some additional set-up, and the same goes for the front-end part of your project. You can find more about it here.
As far as I know, the problematic part of your code must be the initialisation of the Apollo Client. From my observation, you've put everything Apollo requires inside the src/index folder, but haven't included Apollo Upload Client itself.
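For illustration, a minimal sketch of what that initialisation could look like with Apollo Client 2.x, using createUploadLink from apollo-upload-client in place of a plain HTTP link (the server URL is a placeholder):

import { ApolloClient } from 'apollo-client';
import { InMemoryCache } from 'apollo-cache-inmemory';
import { createUploadLink } from 'apollo-upload-client';

const client = new ApolloClient({
  // createUploadLink sends multipart/form-data when variables contain
  // ReactNativeFile / File / Blob values, which a plain http link does not
  link: createUploadLink({ uri: 'http://localhost:4000' }),
  cache: new InMemoryCache(),
});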
I have created a gist from one of my projects which initialises Apollo Upload Client alongside some other things, but I think you'll find yourself out.
https://gist.github.com/maticzav/86892448682f40e0bc9fc4d4a3acd93a
Hope this helps you! 🙂

React Native Realm Migration

In React Native, where are you supposed to put the migration code or code to delete the realm database (ignoring migration) and for it to only run once?
I tried deleting the Realm database each time I am back on the Login screen. When I try to log in, it is supposed to save the user info into Realm and then the app proceeds as normal. However, this is not the case; it seems that because the Realm database was deleted, it has nowhere to save it. I would have thought that once I log in, saving the user info into Realm would initialize Realm and then save the user.
In debug mode, it seems that even after deleting the Realm database, everything functions normally. Debug mode is a lot slower, so is there a timing issue somewhere?
Is there a method to initialize Realm?
This is what I did to get the migration to work.
I have realm.js located in /src where I keep all my react files. When I need to use my realm I import realm from 'path/to/realm.js';
In realm.js I have my old schema and my new schema.
import Realm from 'realm';

const schema = {
  name: 'mySchema',
  properties: {
    name: 'string',
  }
};

const schemaV1 = {
  name: 'mySchema',
  properties: {
    name: 'string',
    otherName: 'string',
  }
};
Note they have the same name. Then at the bottom of my realm.js where I used to have export default new Realm({schema: [schema]});
I now have this:
export default new Realm({
  schema: [schemaV1],
  schemaVersion: 1,
  migration: (oldRealm, newRealm) => {
    // only apply this change if upgrading to schemaVersion 1
    if (oldRealm.schemaVersion < 1) {
      const oldObjects = oldRealm.objects('mySchema');
      const newObjects = newRealm.objects('mySchema');

      // loop through all objects and set the otherName property in the new schema
      for (let i = 0; i < oldObjects.length; i++) {
        newObjects[i].otherName = 'otherName';
      }
    }
  },
});
If you don't need to migrate the data, you could just open the Realm with the new schema version and new schema and it should also work.
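For that simpler case, a minimal sketch reusing the names from above, assuming no per-object data needs to be rewritten:

export default new Realm({
  // bumping schemaVersion is enough for purely added/removed fields;
  // no migration function is required
  schema: [schemaV1],
  schemaVersion: 1,
});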
If you have just added or removed fields of your schema, you can perform an empty migration. This is my realm.js file:
import Realm from 'realm';

// models
import Registros from '../models/registros';
import Local from '../models/local';

export default function getRealm() {
  return Realm.open({
    schema: [Registros, Local],
    schemaVersion: 1, // add a version number
    migration: (oldRealm, newRealm) => {
    },
  });
}