I'm trying to use the useQuery function (from the package '@vue/apollo-composable').
This function doesn't return a promise, just refs to result, loading, etc., so I can't directly use this data in my store (Pinia).
Currently I have this code:
fetchArticle: function (id: string) {
  // check if the article is already in the cache
  const cache = this.articles.find(a => a.id == id);
  if (cache && !cache.partial) return cache;
  // fetch the article from the server
  const { result: article } = useQuery<{ article: Article }>(GET_ARTICLE, { id });
  // update state, but... when does `article` contain data?
},
When I'm in a store, I don't know how to wait for the request to end.
I tried to transform useQuery into a promise, but that doesn't work; Nuxt.js freezes on the server with this code:
fetchArticle: async function (id: string) {
  // check if the article is already in the cache
  const cache = this.articles.find(a => a.id == id);
  if (cache && !cache.partial) return cache;
  // fetch the article from the server
  const { onResult, result } = useQuery<{ article: Article }>(GET_ARTICLE, { id });
  const article = result.value?.article || (await new Promise(r => onResult(({ data }) => r(data.article))));
  if (!article) return;
  const data = { ...article, partial: false };
  this.articles = this.articles.map(a => (a.id == id ? data : a)) as Article[];
  // return the article
  return article;
},
Information
Store: Pinia
Versions: Nuxt 3.1.2; @vue/apollo-composable 4.0.0-beta.2
I use the apollo client like this:
// The creation of the client
// This part is in a base class that all of my Services extend
this.apolloClient = new ApolloClient({
  link: link, // the link (a combination of three links, see below)
  cache: new InMemoryCache(),
  name: "My APP name",
  version: `v${version.toString()}`,
  defaultOptions: defaultOptions // some options I customize
});
provideApolloClient(this.apolloClient);

// The query
// This part is in the Service class
return this.apolloClient.query({ query: query, variables: { id: id } }).then((result) => {
  return result.data.myData;
});
I've always used it like this and have never used useQuery. In the link I use a combination of three links: one for auth, one for errors, and one for the base URL.
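For what it's worth, tying that back to the Pinia action above: since the client's query method returns a promise, the store can await it directly instead of going through useQuery. A minimal sketch under that assumption; the apolloClient import path is hypothetical, and Article / GET_ARTICLE are the poster's own type and query:

// stores/articles.ts — a sketch, not the actual store
import { defineStore } from 'pinia';
import { apolloClient } from '~/services/apollo'; // hypothetical module exposing the client created above
import { GET_ARTICLE, type Article } from '~/graphql/articles'; // the poster's query and type

export const useArticlesStore = defineStore('articles', {
  state: () => ({ articles: [] as Article[] }),
  actions: {
    async fetchArticle(id: string) {
      // check if the article is already in the cache
      const cache = this.articles.find(a => a.id == id);
      if (cache && !cache.partial) return cache;
      // client.query returns a promise, so awaiting it is safe on the server too
      const { data } = await apolloClient.query<{ article: Article }>({
        query: GET_ARTICLE,
        variables: { id },
      });
      if (!data?.article) return;
      const article = { ...data.article, partial: false };
      this.articles = this.articles.map(a => (a.id == id ? article : a)) as Article[];
      return article;
    },
  },
});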
Related
I was trying to make an app that lists a user's repositories from GitHub using the GitHub API; however, I'm having a big problem with fetching data from all pages (so far I can only get repos from one page). I tried to fix it by using an async/await function (instead of a Promise), but it's also my first time using Vue 3 and I have no idea how to have a function inside the setup() method.
The current code is here:
https://github.com/agzpie/user_repos
My try at using async/await, which didn't work:
import ListElement from "./components/ListElement";
import { ref, reactive, toRefs, watchEffect, computed } from "vue";

export default {
  name: "App",
  components: {
    ListElement,
  },
  setup() {
    const name = ref(null);
    const userName = ref(null);
    const state = reactive({ data: [] });
    let success = ref(null);
    const userNameValidator = /^[a-z\d](?:[a-z\d]|-(?=[a-z\d])){0,38}$/i;
    const split1 = reactive({ spl1: [] });
    const split2 = reactive({ spl2: [] });
    async function myFetch() {};
    /*
     * Check for input in the form and then fetch data
     */
    watchEffect(() => {
      if (!userName.value) return;
      if (!userNameValidator.test(userName.value)) {
        console.log("Username has invalid characters");
        return;
      }
      let hasNext = false;
      state.data = [];
      do {
        async function myFetch() {
          let url = `https://api.github.com/users/${userName.value}/repos?per_page=5`;
          let response = await fetch(url);
          if (!response.ok) {
            success.value = false;
            throw new Error(`HTTP error! status: ${response.status}`);
          }
          success.value = true;
          // check response.headers for Link to get next page url
          split1.spl1 = response.headers.get("Link").split(",");
          let j = 0;
          while (j < split1.spl1.length) {
            split2.spl2[j] = split1.spl1[j].split(";");
            console.log(split2.spl2[j][0]);
            console.log(split2.spl2[j][1]);
            if (split2.spl2[j][1].includes("next")) {
              let urlNext = split2.spl2[j][0].replace(/[<>(\s)*]/g, "");
              console.log(urlNext);
              url = urlNext;
              hasNext = true;
              break;
            } else {
              hasNext = false;
            }
            j++;
          }
          // second .then
          let myData = await response.json();
          state.data.push(...myData);
          console.log("data", myData);
          name.value = "";
        }
        myFetch().catch((err) => {
          if (err.status == 404) {
            console.log("User not found");
          } else {
            console.log(err.message);
            console.log("oh no (internet probably)!");
          }
        });
      } while (hasNext);
    });
    // Sort list by star count
    const orderedList = computed(() => {
      if (state.data == 0) {
        return [];
      }
      return [...state.data].sort((a, b) => {
        return a.stargazers_count < b.stargazers_count ? 1 : -1;
      });
    });
    return {
      myFetch,
      success,
      isActive: true,
      name,
      userName,
      ListElement,
      ...toRefs(state),
      orderedList,
    };
  },
};
Any help would be highly appreciated
The call to myFetch() near the end is a call to an async function without an await, so the do/while would effectively spin without waiting for the fetch to complete (if hasNext had been initialized to true, that is; as written it isn't, so the loop simply exits after one pass before the response arrives).
You should probably change that line to await myFetch() and wrap it all with a try/catch block.
I also don't really care for the way you're directly updating state inside the async myFetch call (it could also be doing several of those at once if it looped); perhaps myFetch should return the data instead, and then you can use let result = await myFetch() and make use of that when it returns (a sketch follows).
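A minimal sketch of that shape, hedged: myFetch becomes a fetchPage that returns the parsed page plus the next URL (taken from the Link header) instead of mutating state directly; names are illustrative:

// sketch: one page per call; the caller owns the state update
async function fetchPage(url: string): Promise<{ repos: any[]; next: string | null }> {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP error! status: ${response.status}`);
  // look for a rel="next" entry in the Link header, if present
  const link = response.headers.get("Link") ?? "";
  const nextPart = link.split(",").map(part => part.split(";"))
    .find(([, rel]) => rel && rel.includes("next"));
  const next = nextPart ? nextPart[0].replace(/[<>\s]/g, "") : null;
  return { repos: await response.json(), next };
}

// usage inside watchEffect: await each page before deciding whether to continue
async function loadAll(userName: string, state: { data: any[] }) {
  let url: string | null = `https://api.github.com/users/${userName}/repos?per_page=5`;
  while (url) {
    const { repos, next } = await fetchPage(url);
    state.data.push(...repos);
    url = next;
  }
}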
Also, instead of awaiting myFetch() result, you could not await it but push it onto a requests array and then use await Promise.all(requests) outside the loop and it is one operation to await, all requests running in parallel. In fact, it should probably be await Promise.allSettled(requests) in case one of them fails. See allSettled for more.
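For the parallel variant, a hedged sketch; it assumes you already know the page count (e.g., from the rel="last" entry in the first response's Link header), since the requests are issued by page number:

// sketch: fan the page requests out, then settle them all in one await
async function loadAllParallel(userName: string, pageCount: number): Promise<any[]> {
  const requests = Array.from({ length: pageCount }, (_, i) =>
    fetch(`https://api.github.com/users/${userName}/repos?per_page=5&page=${i + 1}`)
      .then(r => {
        if (!r.ok) throw new Error(`HTTP error! status: ${r.status}`);
        return r.json();
      })
  );
  // allSettled: one failed page doesn't reject the whole batch
  const settled = await Promise.allSettled(requests);
  return settled
    .filter((s): s is PromiseFulfilledResult<any[]> => s.status === "fulfilled")
    .flatMap(s => s.value);
}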
But also I wonder why you're reading it paged if the goal is to fetch them all anyway? To reduce load on the server? If that is true, issuing them paged but in parallel would probably increase the load since it will still read and return all the data but require multiple calls.
I have a vuex store that I am pulling data from into a component. When the page loads the first time, everything behaves as expected. Yay.
When I refresh the page, the data is wiped from the store as expected and pulled into the store again as designed. I have verified this is the case by monitoring the state with the Vuex dev tools. My getter, however, doesn't pull the data into the component this time. I have tried so many things, read the documentation, etc., and I am stuck.
Currently I am thinking it might be an issue with the argument?...
If I change the argument in the getter, this.id, to an actual value (leaving the dispatch alone, no changes there), the getter pulls the data from the store. So it seems the prop this.id has the correct data, as the dispatch statement works just fine. So why then wouldn't the getter work?
this.id source: the header includes a search for the person and passes the id of the selected person as the id prop. Example data: playerId: 60
Thoughts? Appreciate any help.
This code works on initial page load, but not on page refresh.
props: ["id"],
methods: {
fetchStats() {
this.$store.dispatch("player/fetchPlayer", this.id).then(() => {
// alert(this.id);
this.player = this.$store.getters["player/getPlayerById"](this.id);
this.loading = false;
});
}
},
This code (only changing this.id to 6 in the getter) works both on initial load and on page refresh.
props: ["id"],
methods: {
fetchStats() {
this.$store.dispatch("player/fetchPlayer", this.id).then(() => {
// alert(this.id);
this.player = this.$store.getters["player/getPlayerById"](6);
this.loading = false;
});
}
},
Here is the getPlayerById getter:
getPlayerById: state => id => {
  return state.players.find(plr => plr.playerId === id);
},
Here is the fetchPlayer action:
export const actions = {
  fetchPlayer({ state, commit, getters }, id) {
    // If the player being searched for is already in the players array, there is no other data to get; exit
    if (getters.getIndexByPlayerId(id) != -1) {
      return;
    }
    // If the promise is set, another request is already getting the data; return the first request's promise and exit
    if (state.promise) {
      return state.promise;
    }
    // We need to fetch data on the current player
    var promise = EventService.getPlayer(id)
      .then(response => {
        commit("ADD_PLAYER", response.data);
        commit("CLEAR_PROMISE", null);
      })
      .catch(error => {
        console.log("There was an error:", error.response);
        commit("CLEAR_PROMISE", null);
      });
    // While data is being gathered asynchronously via Axios, we set this so that subsequent requests will exit above instead of trying to fetch the data multiple times
    commit("SET_PROMISE", promise);
    return promise;
  }
};
and mutations:
export const mutations = {
  ADD_PLAYER(state, player) {
    state.players.push(player[0]);
  },
  SET_PROMISE(state, data) {
    state.promise = data;
  },
  CLEAR_PROMISE(state, data) {
    state.promise = data;
  }
};
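One thing worth checking, given that a hard-coded 6 works where this.id doesn't: if the id prop ultimately comes from the route, it arrives as a string after a page refresh, while playerId in the store is a number, so the strict === in getPlayerById never matches. A hedged sketch of a getter that coerces the id first (this assumes playerId is always numeric):

export const getters = {
  // sketch: coerce the incoming id so a string prop still matches a numeric playerId
  getPlayerById: state => id => {
    return state.players.find(plr => plr.playerId === Number(id));
  },
};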
I am trying to upload images from react-admin to rails api backend using active storage.
In the documentation of react-admin it says: "Note that the image upload returns a File object. It is your responsibility to handle it depending on your API behavior. You can for instance encode it in base64, or send it as a multi-part form data" I am trying to send it as a multi-part form.
I have been reading here and there, but I cannot find what I want: at the very least, a roadmap of how I should proceed.
You can actually find an example in the dataProvider section of the documentation.
You have to decorate your dataProvider to enable the data upload. Here is the example of transforming the images into base64 strings before posting the resource:
// in addUploadFeature.js
/**
 * Convert a `File` object returned by the upload input into a base64 string.
 * That's not the most optimized way to store images in production, but it's
 * enough to illustrate the idea of data provider decoration.
 */
const convertFileToBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file.rawFile);
  reader.onload = () => resolve(reader.result);
  reader.onerror = reject;
});

/**
 * For posts update only, convert uploaded images to base64 and attach them to
 * the `picture` sent property, with `src` and `title` attributes.
 */
const addUploadFeature = requestHandler => (type, resource, params) => {
  if (type === 'UPDATE' && resource === 'posts') {
    // notice that the following condition can be true only when the `<ImageInput source="pictures" />` component has the parameter `multiple={true}`
    // if the parameter `multiple` is false, then data.pictures is not an array but a single object
    if (params.data.pictures && params.data.pictures.length) {
      // only freshly dropped pictures are instances of File
      const formerPictures = params.data.pictures.filter(p => !(p.rawFile instanceof File));
      const newPictures = params.data.pictures.filter(p => p.rawFile instanceof File);
      return Promise.all(newPictures.map(convertFileToBase64))
        .then(base64Pictures => base64Pictures.map((picture64, index) => ({
          src: picture64,
          title: `${newPictures[index].title}`,
        })))
        .then(transformedNewPictures => requestHandler(type, resource, {
          ...params,
          data: {
            ...params.data,
            pictures: [...transformedNewPictures, ...formerPictures],
          },
        }));
    }
  }
  // for other request types and resources, fall back to the default request handler
  return requestHandler(type, resource, params);
};

export default addUploadFeature;
You can then apply this on your dataProvider:
// in dataProvider.js
import simpleRestProvider from 'ra-data-simple-rest';
import addUploadFeature from './addUploadFeature';
const dataProvider = simpleRestProvider('http://path.to.my.api/');
const uploadCapableDataProvider = addUploadFeature(dataProvider);
export default uploadCapableDataProvider;
Finally, you can use it in your admin as usual:
// in App.js
import { Admin, Resource } from 'react-admin';
import dataProvider from './dataProvider';
import PostList from './posts/PostList';

const App = () => (
  <Admin dataProvider={dataProvider}>
    <Resource name="posts" list={PostList} />
  </Admin>
);
When using files, use a multi-part form in the React front-end and, for example, multer in your API back-end.
In react-admin you should create a custom dataProvider: extend the default one or build a custom one. In each implementation you handle the file/files upload yourself. Uploading a file or files from your custom dataProvider in react-admin could look like this:
// dataProvider.js
// this is only the implementation for a create
case "CREATE":
  const formData = new FormData();
  for (const param in params.data) {
    // 1 file
    if (param === 'file') {
      formData.append('file', params.data[param].rawFile);
      continue;
    }
    // when using multiple files
    if (param === 'files') {
      params.data[param].forEach(file => {
        formData.append('files', file.rawFile);
      });
      continue;
    }
    formData.append(param, params.data[param]);
  }
  return httpClient(`myendpoint.com/upload`, {
    method: "POST",
    body: formData,
  }).then(({ json }) => ({ data: json }));
From there you pick it up in your API using multer, which supports multi-part forms out of the box. When using NestJS, that could look like:
import {
  Controller,
  Post,
  Header,
  UseInterceptors,
  UploadedFile,
} from "@nestjs/common";
import { FileInterceptor } from '@nestjs/platform-express';

@Controller("upload")
export class UploadController {
  @Post()
  @Header("Content-Type", "application/json")
  // multer extracts the file from the request body
  @UseInterceptors(FileInterceptor('file'))
  async uploadFile(
    @UploadedFile() file: Record<any, any>
  ) {
    console.log({ file });
  }
}
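If you're not on NestJS, the same pickup with plain Express and multer is only a few lines. A minimal sketch (endpoint path, field name, and in-memory storage are illustrative choices, not prescribed by react-admin):

// upload-server.ts — a sketch of the Express + multer side
import express from 'express';
import multer from 'multer';

const app = express();
// keep uploads in memory for the sketch; use diskStorage or cloud storage in production
const upload = multer({ storage: multer.memoryStorage() });

// 'file' must match the field name appended to FormData in the dataProvider above
app.post('/upload', upload.single('file'), (req, res) => {
  console.log(req.file); // { buffer, mimetype, originalname, size, ... }
  res.json({ ok: true });
});

app.listen(3000);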
I'm in the middle of learning Cycle.js and ran into a challenge. I have a component that will get a result from an HTTP call, and I'd like to persist this response in IndexedDB. However, I feel that the request for persistence is the responsibility of another component.
The questions I have are:
Is this a use case for a custom driver that persists HTTP responses to IndexedDB?
How does another component access the response stream for a request it did not make?
When I try to select the category from the HTTP source, nothing gets logged to the console. I'm using xstream, so the streams should be hot and I expect debug to output. What's going on here?
Below is my component that makes the HTTP call:
import { Feed } from './feed'

export function RssList ({HTTP, props}, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({
      url: url,
      method: 'GET',
      category: 'rss'
    }))

  const response$ = HTTP
    .select('rss')
    .flatten()
    .map(feedAdapter)

  const vDom$ = response$
    .map(Feed)
    .startWith('')

  return {
    DOM: vDom$,
    HTTP: request$
  }
}
Here is my attempt at accessing the response at the app level:
export function main (sources) {
  const urlSource = url$(sources)
  const rssSink = rss$(sources, urlSource.value)
  const vDom$ = xs.combine(urlSource.DOM, rssSink.DOM)
    .map(([urlInput, rssList]) =>
      <div>
        {urlInput}
        {rssList}
      </div>
    )

  sources.HTTP.select('rss').flatten().debug() // nothing happens here

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP
  }
}
Selecting a category in the main (the parent) component is the correct approach, and is supported.
The only reason why sources.HTTP.select('rss').flatten().debug() doesn't log anything is because that's not how debug works. It doesn't "subscribe" to the stream and create side effects. debug is essentially like a map operator that uses an identity function (always takes x as input and outputs x), but with a logging operation as a side effect. So you either need to replace .debug() with .addListener({next: x => console.log(x)}) or use the stream that .debug() outputs and hook it with the operator pipeline that goes to sinks. In other words, debug is an in-between logging side effect, not a destination logging side effect.
Question #1: Custom HTTP->IDB driver: it depends on the nature of the project; for a simple example I used a general Cycle.js IDB driver. See the example below or the codesandbox.io example.
Question #2: Components sharing streams: since components and main share the same source/sink API, you can link the output (sink) of one component to the input (source) of another. See the example below or the codesandbox.io example.
Question #3: debug and logging: as the authoritative (literally) André Staltz pointed out, debug needs to be inserted into a completed stream cycle, i.e., an already subscribed/listened stream.
In your example you can put debug in your RssList component:
const response$ = HTTP
  .select('rss')
  .flatten()
  .map(feedAdapter)
  .debug()
OR add a listener to your main example:
sources.HTTP.select('rss').flatten().debug()
  .addListener({next: x => console.log(x)})
OR, what I like to do, is include a log driver:
run(main, {
  DOM: makeDOMDriver('#app'),
  HTTP: makeHTTPDriver(),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
Then I'll just duplicate a stream and send it to the log sink:
const url$ = props.url
const http$ = url$.map(url => ({url: url, method: 'GET', category: 'rss'}))
const log$ = url$

return {
  DOM: vdom$,
  HTTP: http$,
  log: log$,
}
Here's some example code for sending HTTP response to IndexedDB storage, using two components that share the data and a general IndexedDB driver:
function main(sources) {
  const header$ = xs.of(div('RSS Feed:'))
  const rssSink = RssList(sources) // input: HTTP select and props
                                   // output: VDOM and data for IDB storage
  const vDom$ = xs.combine(header$, rssSink.DOM) // build VDOM
    .map(([header, rssList]) => div([header, rssList]))
  const idbSink = IdbSink(sources, rssSink.IDB) // output: store and put HTTP response

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP, // send HTTP request
    IDB: idbSink.put, // send response to IDB store
    log: idbSink.get, // get and log data stored in IDB
  }
}

function RssList({ HTTP, props }, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({url: url, method: 'GET', category: 'rss'}))

  const response$ = HTTP.select('rss').flatten().map(feedAdapter)

  const idb$ = response$

  const vDom$ = response$
    .map(Feed)
    .startWith(div('', '...loading'))

  return {
    DOM: vDom$,
    HTTP: request$,
    IDB: { response: idb$ },
  }
}

function Feed (feed) {
  return div('> ' + feed)
}

function IdbSink(sources, idb) {
  return {
    get: sources.IDB.store('rss').getAll()
      .map(obj => (obj['0'] && obj['0'].feed) || 'unknown'),
    put: idb.response
      .map(feedinfo => $put('rss', { feed: feedinfo }))
  }
}

run(main, {
  props: () => ({ url$: xs.of('http://lorem-rss.herokuapp.com/feed') }),
  DOM: makeDOMDriver('#root'),
  HTTP: makeHTTPDriver(),
  IDB: makeIdbDriver('rss-db', 1, upgradeDb => {
    upgradeDb.createObjectStore('rss', { keyPath: 'feed' })
  }),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
This is a contrived example, simply to explore the issues raised. Codesandbox.io example.
I am working on a GraphQL server built using Express and attempting to support Relay.
For a regular GraphQL query, I can handle authorization in the resolve function. E.g.:
var queryType = new GraphQLObjectType({
  name: 'RootQueryType',
  fields: () => ({
    foo: {
      type: new GraphQLList(bar),
      description: 'I should have access to some but not all instances of bar',
      resolve: (root, args, request) => getBarsIHaveAccessTo(request.user)
    }
  })
});
To support Relay refetching on the back-end, Facebook's Relay tutorial instructs us to have GraphQL objects implement a nodeInterface for mapping global ids to objects and objects to GraphQL types. The nodeInterface is defined by the nodeDefinitions function from graphql-relay.
const {nodeInterface, nodeField} = nodeDefinitions(
  (globalId) => {
    const {type, id} = fromGlobalId(globalId);
    if (type === 'bar') {
      // since I don't have access to the request object here, I can't pass the user to getBar, so getBar can't perform authorization
      return getBar(id);
    } else {
      return null;
    }
  },
  (obj) => {
    // return the object type
  }
);
The refetching function that gets passed to nodeDefinitions doesn't get passed the request object, only the global id. How can I get access to the user during refetching so I can authorize those requests?
As a sanity check, I tried querying for nodes that the authenticated user doesn't otherwise have access to (and shouldn't) through the node interface, and got the requested data back:
{
  node(id: "id_of_something_unauthorized") {
    ... on bar {
      field_this_user_shouldnt_see
    }
  }
}
=>
{
  "data": {
    "node": {
      "field_this_user_shouldnt_see": "a secret"
    }
  }
}
As it turns out, the request data actually does get passed through. If we look at the source, we see that nodeDefinitions tosses out the parent parameter and passes the global id, the context (containing the request data), and the info argument from nodeField's resolve function on to the idFetcher.
Ultimately, where a resolve call would get the following arguments:
(parent, args, context, info)
the idFetcher instead gets:
(id, context, info)
So we can implement authorization as follows:
const {nodeInterface, nodeField} = nodeDefinitions(
  (globalId, context) => {
    const {type, id} = fromGlobalId(globalId);
    if (type === 'bar') {
      // get the Bar with this id if context.user has access
      return getBar(context.user, id);
    } else {
      return null;
    }
  },
  (obj) => {
    // return the object type
  }
);
https://github.com/graphql/graphql-relay-js/blob/master/src/node/node.js#L94-L102
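For completeness, here is a minimal sketch of where that context comes from in the first place, assuming an express-graphql setup; the authenticate middleware and the user field are illustrative, not part of the question:

// server.ts — a sketch of wiring the per-request context
import express from 'express';
import { graphqlHTTP } from 'express-graphql';
import { schema } from './schema'; // the schema exposing nodeField above
import { authenticate } from './auth'; // hypothetical middleware that sets req.user

const app = express();
app.use(authenticate);

app.use('/graphql', graphqlHTTP(req => ({
  schema,
  // whatever is passed as `context` here is what the idFetcher
  // receives as its second argument
  context: { user: (req as any).user },
})));

app.listen(4000);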