I'm trying to send an image to our server from React Native using a GraphQL mutation, and I don't know why, but it always returns an error: [CombinedError: [Network] Network request failed].
The mutation:
import { graphql } from '../../gql';
import { gql, useMutation } from 'urql';

const AddProfilePicture_Mutation = graphql(`
  mutation AddPicture_Mutation($userId: ID!, $picture: Upload!) {
    uploadProfilePicture(input: {
      userId: $userId
      picture: $picture
    }) {
      id
    }
  }
`);

export const useAddProfilePicture = () => {
  const [{ fetching, error }, execute] = useMutation(AddProfilePicture_Mutation);
  return {
    error: !!error,
    fetching,
    addProfilePicture: execute,
  };
};
and the code:
const pictureHandler = async () => {
  const options = {
    mediaType: 'photo' as MediaType,
    includeBase64: true,
    selectionLimit: 1,
  };
  const profilePicture = await launchImageLibrary(options);
  if (profilePicture.assets?.[0].fileSize && profilePicture.assets?.[0].fileSize > MAXFILESIZE) {
    showError(t('profileScreen.PictureSize'));
  }
  if (profilePicture.assets?.[0].uri && profilePicture.assets[0].fileName && profilePicture.assets[0].type) {
    // const myBlob = await fetch(profilePicture.assets[0].uri).then(r => r.blob());
    const blob = new Blob([profilePicture.assets[0].base64 as BlobPart], {
      type: profilePicture.assets[0].type,
    });
    const file = new File([blob], profilePicture.assets[0].fileName, { type: `${profilePicture.assets[0].type}` });
    const { error } = await addProfilePicture(
      { userId: userId!, picture: file },
      { fetchOptions: { headers: { 'graphql-require-preflight': '' } } }
    );
    if (!error) {
      showSuccess(t('profileScreen.PictureSuccessAdded'));
      navigation.navigate('UserProfile');
    } else {
      console.log(error);
      showError(t('profileScreen.PictureErrorAdded'));
    }
  }
};
I've been trying everything I found on the web: FormData, react-native-blob-util, and rn-fetch-blob. If I try sending anything other than a File, the server rejects it and says, for example:
Variable 'picture' has an invalid value: Expected type org.springframework.web.multipart.MultipartFile but was java.util.LinkedHashMap]
Update:
After long research and help from other programmers, we never found the answer. We opened a new endpoint in the backend specifically for the uploaded picture and used a regular fetch POST.
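For anyone who hits the same wall, this is a minimal sketch of that fallback, assuming a hypothetical /api/profile-picture endpoint; the URL and field names are illustrative, not our actual API:

import { Asset } from 'react-native-image-picker';

// Sketch of the workaround: bypass GraphQL and POST the picture as
// multipart/form-data to a dedicated REST endpoint.
const uploadViaRest = async (userId: string, asset: Asset) => {
  const form = new FormData();
  // React Native's FormData accepts a { uri, name, type } descriptor
  // instead of a File/Blob object.
  form.append('picture', {
    uri: asset.uri,
    name: asset.fileName,
    type: asset.type,
  } as any);
  form.append('userId', userId);

  const response = await fetch('https://our-server.example.com/api/profile-picture', {
    method: 'POST',
    body: form, // let fetch set the multipart boundary header itself
  });
  if (!response.ok) {
    throw new Error(`Upload failed with status ${response.status}`);
  }
};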
I have a project using the latest Next.js 13, React 18, urql 3, and TypeScript.
Currently, I have issues when trying to run an urql query from the getStaticProps function. My urql request needs a guest token, and I'm storing the token in session storage (other suggestions?).
The query has no issue when it runs on the client, but it fails when it runs inside that function.
My concern is the token: the server cannot read the session storage value.
I'm asking for the best and simplest way to make this work.
Would using cookies to store the guest token make this work?
Or is it my configuration that doesn't work?
This is my current config for urql.ts
import {
  createClient,
  ssrExchange,
  dedupExchange,
  cacheExchange,
  fetchExchange,
} from "urql";
import { GRAPH_URL } from "#lib/constant/env";
import type { TypedDocumentNode } from "@urql/core";

const isServerSide = typeof window === "undefined";

const ssrCache = ssrExchange({
  isClient: !isServerSide,
});

const client = createClient({
  url: GRAPH_URL,
  exchanges: [dedupExchange, cacheExchange, ssrCache, fetchExchange],
  fetchOptions: () => {
    const token = sessionStorage.getItem("accessToken");
    return {
      headers: {
        authorization: token ? `Bearer ${token}` : "",
      },
    };
  },
});
const query = async (
  query: TypedDocumentNode<any, object>,
  variables?: Record<string, string | string[] | unknown>
) => {
  try {
    const response = await client.query(query, variables as any).toPromise();
    return response;
  } catch (error) {
    if (error instanceof Error) console.error(error.message);
  }
};

const mutation = async (
  mutation: TypedDocumentNode<any, object>,
  variables?: Record<string, string | string[] | unknown>
) => {
  try {
    const response = await client
      .mutation(mutation, variables as any)
      .toPromise();
    return response;
  } catch (error) {
    if (error instanceof Error) console.error(error.message);
  }
};

export { client, query, mutation, ssrCache };
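As far as I understand, cookies would only help in getServerSideProps, which receives the request; getStaticProps runs at build/revalidate time with no request attached, so neither sessionStorage nor cookies are readable there. One variant I'm considering is making fetchOptions server-safe and injecting the token explicitly before querying. A sketch, where fetchGuestToken is a hypothetical helper for however the backend issues guest tokens:

let serverToken = "";

// Called from server-side code (e.g. getStaticProps) before querying.
export const setServerToken = (token: string) => {
  serverToken = token;
};

const client = createClient({
  url: GRAPH_URL,
  exchanges: [dedupExchange, cacheExchange, ssrCache, fetchExchange],
  fetchOptions: () => {
    // sessionStorage only exists in the browser; on the server, fall back
    // to the token injected via setServerToken.
    const token = isServerSide
      ? serverToken
      : sessionStorage.getItem("accessToken");
    return {
      headers: {
        authorization: token ? `Bearer ${token}` : "",
      },
    };
  },
});

getStaticProps would then call setServerToken(await fetchGuestToken()) before its first query.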
And this is some of the code for the blog index page:
export const getStaticProps = async () => {
  await fetchArticlesSummary();
  return {
    props: {
      urqlState: ssrCache.extractData(),
    },
    revalidate: 600,
  };
};

export default withUrqlClient(() => ({
  url: GRAPH_URL,
}))(BlogPage);
This is fetchArticlesSummary:
export const fetchArticlesSummary = async () => {
  try {
    const {
      data: { listArticles },
    }: any = await query(getListArticle);
    return listArticles.items;
  } catch (error) {
    return {
      notFound: true,
    };
  }
};
I'm also doing a setup in _app.tsx:
export default function App({ Component, pageProps }: AppProps) {
  if (pageProps.urqlState) {
    ssrCache.restoreData(pageProps.urqlState);
  }
  return (
    <Provider value={client}>
      <Component {...pageProps} />
    </Provider>
  );
}
Thank you.
I have followed the urql documentation about server-side configuration, and many others, but I still don't have a solution.
I have this code to get nearby places and nearby beaches from a point, with Google Maps. It is called from a Next.js component via the useSWR hook.
All the data is returned correctly, but before the first Axios call (const fetchNearbyPlaces = async (urlWithToken = null) => {...}), I'm receiving this error in the console:
API resolved without sending a response for /api/google/places/33.807501/-78.70039, this may result in stalled requests.
I can't figure out what the error is, although there may be several problems, because I'm a novice. I appreciate any suggestions.
const axios = require("axios");

const GetNearbyPlaces = async (req, res) => {
  const {
    latitude,
    longitude,
  } = req.query;
  const radius = 50000;
  const types = [
    "airport",
    "tourist_attraction",
    "amusement_park",
    "aquarium",
    "art_gallery",
    "bar",
    "museum",
    "night_club",
    "cafe",
    "restaurant",
    "shopping_mall",
    "store",
    "spa",
  ];

  function checkFunc(arr, val) {
    return arr.some(arrVal => val === arrVal);
  }

  const url = `https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=${latitude}%2C${longitude}&radius=${radius}&key=${process.env.CW_GOOGLE_MAPS_API_KEY}`;
  const beachesUrl = `https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=${latitude}%2C${longitude}&radius=${radius}&type=natural_feature&key=${process.env.CW_GOOGLE_MAPS_API_KEY}`;

  try {
    let results = [];
    let beaches = [];

    const fetchNearbyBeaches = async (urlWithToken = null) => {
      await axios.get(urlWithToken ? urlWithToken : beachesUrl).then(data => {
        beaches = [...beaches, ...data.data.results];
        if (data?.data?.next_page_token) {
          const newUrl = `https://maps.googleapis.com/maps/api/place/nearbysearch/json?key=${process.env.CW_GOOGLE_MAPS_API_KEY}&pagetoken=${data.data.next_page_token}`;
          setTimeout(() => {
            fetchNearbyBeaches(newUrl);
          }, 2000);
        } else {
          beaches.length > 5 && beaches.splice(5);
          results.length > 5 && results.splice(5);
          const finalResults = [...beaches, ...results];
          finalResults.length > 10 && finalResults.splice(10);
          return res.status(200).json({
            data: {
              results: finalResults,
            },
            success: true,
          });
        }
      });
    };

    const fetchNearbyPlaces = async (urlWithToken = null) => {
      await axios.get(urlWithToken ? urlWithToken : url).then(data => {
        results = [...results, ...data.data.results];
        if (data?.data?.next_page_token) {
          const newUrl = `https://maps.googleapis.com/maps/api/place/nearbysearch/json?key=${process.env.CW_GOOGLE_MAPS_API_KEY}&pagetoken=${data.data.next_page_token}`;
          setTimeout(() => {
            fetchNearbyPlaces(newUrl);
          }, 2000);
        } else {
          const dirtyResultsWithDuplicates = [];
          results.map(result => {
            return types.map(type => {
              if (checkFunc(result.types, type) && !result.types.includes("lodging")) {
                dirtyResultsWithDuplicates.push(result);
              }
            });
          });
          const set = new Set(dirtyResultsWithDuplicates);
          const filtered = Array.from(set);
          results = filtered.length > 10 ? filtered.splice(10) : filtered;
          return fetchNearbyBeaches();
        }
      });
    };

    fetchNearbyPlaces();
  } catch (err) {
    res.status(500).json({
      message: err.message,
      statusCode: 500,
    });
  }
};

export default GetNearbyPlaces;
The problem is with the backend application, not the frontend component.
Next.js expects a response to have been sent when the API handler function exits. If, for example, you have a databaseCall.then(sendResponse) in your API handler function, what happens is that the handler function exits before the database returns.
Now, this is not a problem if the database does return after that and sends the response, but it is if, for example, the database has an error. Because the handler function exits without a response already being sent, Next.js can't be sure that at that point there isn't a stalled request.
One way to fix this is by await-ing the DB call (or whatever other async function you call), thereby preventing the handler function from exiting before some kind of response has been sent.
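A minimal sketch of what that looks like for an API route like the one above; the paging loop is simplified, and sleep is just an illustrative helper:

const axios = require("axios");

// Awaited variant: the handler only exits after the response is sent.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const GetNearbyPlaces = async (req, res) => {
  const { latitude, longitude } = req.query;
  const baseUrl = `https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=${latitude}%2C${longitude}&radius=50000&key=${process.env.CW_GOOGLE_MAPS_API_KEY}`;
  try {
    let results = [];
    let pageUrl = baseUrl;
    while (pageUrl) {
      const { data } = await axios.get(pageUrl); // awaited, not fire-and-forget
      results = [...results, ...data.results];
      pageUrl = data.next_page_token
        ? `https://maps.googleapis.com/maps/api/place/nearbysearch/json?key=${process.env.CW_GOOGLE_MAPS_API_KEY}&pagetoken=${data.next_page_token}`
        : null;
      // Google needs a short delay before a next_page_token becomes valid.
      if (pageUrl) await sleep(2000);
    }
    res.status(200).json({ data: { results }, success: true });
  } catch (err) {
    res.status(500).json({ message: err.message, statusCode: 500 });
  }
};

export default GetNearbyPlaces;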
The solution was adding this object to my API code:
export const config = {
  api: {
    externalResolver: true,
  },
};
Documentation: https://nextjs.org/docs/api-routes/request-helpers
I am trying to upload a lot of files from S3 to IPFS via Pinata. I haven't found anything like that in the Pinata documentation.
This is my solution, using the form-data library. I haven't tested it yet (I will soon; I need to code some things first).
Is it a correct approach? Has anyone done something similar?
async uploadImagesFolder(
  items: ItemDocument[],
  bucket?: string,
  path?: string,
) {
  try {
    const form = new FormData();
    for (const item of items) {
      const file = getObjectStream(item.tokenURI, bucket, path);
      form.append('file', file, {
        filename: item.tokenURI,
      });
    }
    console.log(`Uploading files to IPFS`);
    const pinataOptions: PinataOptions = {
      cidVersion: 1,
    };
    const result = await pinata.pinFileToIPFS(form, {
      pinataOptions,
    });
    console.log(`Piñata Response:`, JSON.stringify(result, null, 2));
    return result.IpfsHash;
  } catch (e) {
    console.error(e);
  }
}
I had the same problem.
So, I found this: https://medium.com/pinata/stream-files-from-aws-s3-to-ipfs-a0e23ffb7ae5
But in the article, if I'm not wrong, a different version of the JavaScript AWS SDK is used, older than v3 (nowadays the most recent: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/index.html).
This is for the client side with TypeScript. If you have this version, this code snippet works for me:
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3"

export const getStreamObjectInAwsS3 = async (data: YourParamsType) => {
  try {
    const BUCKET = data.bucketTarget
    const KEY = data.key
    const client = new S3Client({
      region: 'your-region',
      credentials: {
        accessKeyId: 'your-access-key',
        secretAccessKey: 'secret-key'
      }
    })
    const resource = await client.send(new GetObjectCommand({
      Bucket: BUCKET,
      Key: KEY
    }))
    const response = resource.Body
    if (response) {
      return new Response(await response.transformToByteArray()).blob()
    }
    return null
  } catch (error) {
    return null
  }
}
With the previous code, you can get the Blob object, pass it to the method below, and get the resource URL from the API:
export const uploadFileToIPFS = async (file: Response) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`
  const data = new FormData()
  data.append('file', file)
  try {
    const response = await axios.post(url, data, {
      maxBodyLength: Infinity,
      headers: {
        pinata_api_key: 'your-api',
        pinata_secret_api_key: 'your-secret'
      },
      data: data
    })
    return {
      success: true,
      pinataURL: `https://gateway.pinata.cloud/ipfs/${response.data.IpfsHash}`
    }
  } catch (error) {
    console.log(error)
    return null
  }
}
I found this solution in this nice article, and you can explore other implementations there (including the Node.js side).
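For the Node.js side, here is a rough sketch of the same flow under the v3 SDK, streaming the S3 object straight into the multipart body with the form-data package instead of buffering it as a Blob (region, credentials, bucket, and key names are placeholders):

import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
import FormData from "form-data";
import axios from "axios";
import type { Readable } from "stream";

export const streamS3ObjectToIPFS = async (bucket: string, key: string) => {
  const client = new S3Client({ region: "your-region" });
  const { Body } = await client.send(
    new GetObjectCommand({ Bucket: bucket, Key: key })
  );

  // In Node.js the v3 SDK returns Body as a Readable stream, so it can be
  // appended to form-data directly without loading the file into memory.
  const form = new FormData();
  form.append("file", Body as Readable, { filename: key });

  const response = await axios.post(
    "https://api.pinata.cloud/pinning/pinFileToIPFS",
    form,
    {
      maxBodyLength: Infinity,
      headers: {
        ...form.getHeaders(), // multipart boundary from form-data
        pinata_api_key: "your-api",
        pinata_secret_api_key: "your-secret",
      },
    }
  );
  return response.data.IpfsHash;
};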
I'm having trouble uploading an image to the GetStream service; the image never uploads, and I get an error. Here is my code.
const filter = { type: 'messaging', members: { $in: [userInfo.id] } };
const sort = [{ last_message_at: -1 }];

let channels;
await chatClient.queryChannels(filter, sort, {
  watch: false,
  state: true,
}).then((response) => {
  channels = response;
});

const file = this.state.avatar;
let avatarURI;
channels[0].sendImage(file).then((response) => {
  avatarURI = response.file;
  console.log(avatarURI);
});
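In case the mixed await/.then is part of the problem, this is the same flow with plain await, so the actual upload error surfaces in one try/catch (a sketch; it assumes this runs inside an async method and that this.state.avatar is something sendImage accepts, such as a local file URI in React Native):

try {
  const channels = await chatClient.queryChannels(filter, sort, {
    watch: false,
    state: true,
  });
  const file = this.state.avatar;
  const response = await channels[0].sendImage(file);
  const avatarURI = response.file;
  console.log(avatarURI);
} catch (err) {
  // Surfaces the actual upload error instead of an unhandled rejection.
  console.log('sendImage failed:', err);
}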
I am trying to set up the front end for GraphQL file upload with apollo-boost-upload. The backend code is based on this link:
https://dev.to/dnature/handling-file-uploads-with-apollo-server-2-0-14n7
It's now reaching the resolver breakpoint after adding the following lines in the server.js file
const { apolloUploadExpress } = require("apollo-upload-server");
app.use(apolloUploadExpress({ maxFileSize: 1000000000, maxFiles: 10 }));
And after modifying the schema for the upload type
scalar Upload
Here is the Vue component
<input
  type="file"
  style="display:none"
  ref="fileInput"
  accept="image/*"
  @change="upload"
>
// Upload method
upload({ target: { files = [] } }) {
  if (!files.length) {
    return;
  }
  this.logoImage = files[0];
},

// Dispatching the action from the Vue component
this.$store.dispatch("uploadLogo", { image: this.logoImage });
// Vuex action
const uploadLogo = async (context, payload) => {
  context.commit("setLoading", true);
  try {
    const { data } = await apolloClient.mutate({
      mutation: UPLOAD_LOGO,
      variables: { file: payload.image },
      context: {
        hasUpload: true,
      },
    });
    context.commit("setLoading", false);
    console.log("Logo:", data.uploadLogo);
  } catch (error) {
    context.commit("setLoading", false);
    console.log(error);
  }
};
// Mutation
export const UPLOAD_LOGO = gql`
  mutation uploadLogo($file: Upload!) {
    uploadLogo(file: $file) {
      _id
      path
      filename
      mimetype
      user {
        _id
      }
    }
  }
`;
// Apollo client config in main.js
import ApolloClient from "apollo-boost-upload";
import { InMemoryCache } from "apollo-boost";
import VueApollo from "vue-apollo";

// Set up Apollo Client
export const defaultClient = new ApolloClient({
  uri: "http://localhost:4000/graphql",
  cache: new InMemoryCache({
    addTypename: false,
  }),
  fetchOptions: {
    credentials: "include",
  },
  request: (operation) => {
    // If there is no token in local storage, add one
    if (!localStorage.someToken) {
      localStorage.setItem("someToken", "");
    }
    // The operation adds the token to the authorization header, which is sent to the backend
    operation.setContext({
      headers: {
        authorization: "Bearer " + localStorage.getItem("someToken"),
      },
    });
  },
  onError: ({ graphQLErrors, networkError }) => {
    if (networkError) {
      console.log("[networkError]", networkError);
    }
    if (graphQLErrors) {
      for (const error of graphQLErrors) {
        console.dir(error);
        if (error.name === "AuthenticationError") {
          // Set auth error in state
          store.commit("setError", error);
          // Sign the user out to clear the error
          store.dispatch("signUserOut");
        }
      }
    }
  },
});
Here is the updated typedef (old code commented out) from the backend, if that helps to identify the issue:
const logoUploadTypeDefs = gql`
  type File {
    _id: ID!
    path: String!
    filename: String!
    mimetype: String!
    encoding: String!
    user: User
  }

  # input Upload {
  #   name: String!
  #   type: String!
  #   size: Int!
  #   path: String!
  # }

  scalar Upload

  type Mutation {
    uploadLogo(file: Upload!): File
  }

  type Query {
    info: String
    logo: File!
  }
`;
Now, the Node app crashes with the following log
I had to change "apollo-upload-server" to "graphql-upload".
Change 1:
Commented out "apollo-upload-server" and used "graphql-upload":
// const { apolloUploadExpress } = require("apollo-upload-server");
const {
  graphqlUploadExpress, // A Koa implementation is also exported.
} = require("graphql-upload");
Change 2:
And in the middleware, used this:
app.use(graphqlUploadExpress());
await apolloServer.start();
instead of the old code:
app.use(apolloUploadExpress()); // Not to be used
await apolloServer.start();
Change 3:
Also, in the resolver file, import GraphQLUpload from graphql-upload:
const { GraphQLUpload } = require("graphql-upload");
....
....
const resolvers = {
  // This maps the `Upload` scalar to the implementation provided
  // by the `graphql-upload` package.
  Upload: GraphQLUpload,
  Query: {
    ....
  },
  Mutation: {
    ....
  },
};
Refer to the Apollo docs for more details. This fixed the issue of Node crashing with the error "Maximum call stack size exceeded at _openReadFs...".
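For reference, a minimal sketch of how the three changes fit together in one server.js, assuming apollo-server-express 3 and the typeDefs/resolvers from the snippets above:

const express = require("express");
const { ApolloServer } = require("apollo-server-express");
const { graphqlUploadExpress } = require("graphql-upload");

async function start() {
  const app = express();
  const apolloServer = new ApolloServer({ typeDefs, resolvers });

  // Change 2: the graphql-upload middleware replaces apollo-upload-server.
  app.use(graphqlUploadExpress({ maxFileSize: 1000000000, maxFiles: 10 }));

  await apolloServer.start();
  apolloServer.applyMiddleware({ app });

  app.listen(4000, () => {
    console.log(`Server ready at http://localhost:4000${apolloServer.graphqlPath}`);
  });
}

start();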