Storing REST response to IndexedDB with Cycle.js - cyclejs

I'm in the middle of learning Cycle.js and ran into a challenge. I have a component that will get a result from an HTTP call and I'd like to persist this response in IndexedDB. However, I feel that the request for persistence is the responsibility of another component.
The questions I have are:
Is this a use case for a custom driver that persists HTTP responses to IndexedDB?
How does another component access the response stream for a request it did not make?
When I try to select the category from the HTTP source, nothing gets logged to the console. I'm using xstream, so the streams should be hot and I expect debug to output. What's going on here?
Below is my component that makes the HTTP call:
import { Feed } from './feed'

export function RssList ({HTTP, props}, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({
      url: url,
      method: 'GET',
      category: 'rss'
    }))

  const response$ = HTTP
    .select('rss')
    .flatten()
    .map(feedAdapter)

  const vDom$ = response$
    .map(Feed)
    .startWith('')

  return {
    DOM: vDom$,
    HTTP: request$
  }
}
Here is my attempt at accessing the response at the app level:
export function main (sources) {
  const urlSource = url$(sources)
  const rssSink = rss$(sources, urlSource.value)

  const vDom$ = xs.combine(urlSource.DOM, rssSink.DOM)
    .map(([urlInput, rssList]) =>
      <div>
        {urlInput}
        {rssList}
      </div>
    )

  sources.HTTP.select('rss').flatten().debug() // nothing happens here

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP
  }
}

Selecting a category in the main (the parent) component is the correct approach, and is supported.
The only reason sources.HTTP.select('rss').flatten().debug() doesn't log anything is that this isn't how debug works. It doesn't "subscribe" to the stream and create side effects. debug is essentially a map operator that uses an identity function (it always takes x as input and outputs x), but with a logging operation as a side effect. So you either need to replace .debug() with .addListener({next: x => console.log(x)}), or take the stream that .debug() outputs and hook it into the operator pipeline that goes to the sinks. In other words, debug is an in-between logging side effect, not a destination logging side effect.
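For example (a minimal sketch of the two options just described, reusing the 'rss' category and the Feed function from the question):

// Option 1: replace .debug() with an explicit listener (a destination side effect)
sources.HTTP.select('rss').flatten()
  .addListener({next: x => console.log(x)})

// Option 2: keep .debug(), but use its output in a pipeline that reaches a sink,
// so that something actually subscribes to it
const logged$ = sources.HTTP.select('rss').flatten().debug()
const vDom$ = logged$.map(Feed).startWith('')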

Question #1: Custom HTTP->IDB Driver: It depends on the nature of the project; for a simple example I used a general Cycle.js IDB driver. See the example below or the codesandbox.io example.
Question #2: Components Sharing Streams: Since components and main share the same source/sink API, you can link the output (sink) of one component to the input (source) of another. See the example below or the codesandbox.io example.
Question #3: debug and Logging: As the authoritative (literally) André Staltz pointed out, debug needs to be inserted into a completed stream cycle, i.e., an already subscribed/listened stream.
In your example you can put debug in your RssList component:
const response$ = HTTP
  .select('rss')
  .flatten()
  .map(feedAdapter)
  .debug()
OR add a listener to your main example:
sources.HTTP.select('rss').flatten().debug()
  .addListener({next: x => console.log(x)})
OR, what I like to do is include a log driver:
run(main, {
  DOM: makeDOMDriver('#app'),
  HTTP: makeHTTPDriver(),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
Then I'll just duplicate a stream and send it to the log sink:
const url$ = props.url$
const http$ = url$.map(url => ({url: url, method: 'GET', category: 'rss'}))
const log$ = url$

return {
  DOM: vdom$,
  HTTP: http$,
  log: log$,
}
Here's some example code for sending an HTTP response to IndexedDB storage, using two components that share the data and a general IndexedDB driver:
function main(sources) {
  const header$ = xs.of(div('RSS Feed:'))
  const rssSink = RssList(sources) // input HTTP select and props
                                   // output VDOM and data for IDB storage
  const vDom$ = xs.combine(header$, rssSink.DOM) // build VDOM
    .map(([header, rssList]) => div([header, rssList]))

  const idbSink = IdbSink(sources, rssSink.IDB) // output store and put HTTP response

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP, // send HTTP request
    IDB: idbSink.put,   // send response to IDB store
    log: idbSink.get,   // get and log data stored in IDB
  }
}

function RssList({ HTTP, props }, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({url: url, method: 'GET', category: 'rss'}))

  const response$ = HTTP.select('rss').flatten().map(feedAdapter)

  const idb$ = response$

  const vDom$ = response$
    .map(Feed)
    .startWith(div('', '...loading'))

  return {
    DOM: vDom$,
    HTTP: request$,
    IDB: { response: idb$ },
  }
}

function Feed (feed) {
  return div('> ' + feed)
}

function IdbSink(sources, idb) {
  return {
    get: sources.IDB.store('rss').getAll()
      .map(obj => (obj['0'] && obj['0'].feed) || 'unknown'),
    put: idb.response
      .map(feedinfo => $put('rss', { feed: feedinfo }))
  }
}

run(main, {
  props: () => ({ url$: xs.of('http://lorem-rss.herokuapp.com/feed') }),
  DOM: makeDOMDriver('#root'),
  HTTP: makeHTTPDriver(),
  IDB: makeIdbDriver('rss-db', 1, upgradeDb => {
    upgradeDb.createObjectStore('rss', { keyPath: 'feed' })
  }),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
This is a contrived example, simply to explore the issues raised. Codesandbox.io example.

Related

Cypress: login through magic link error with cy.origin()

Devs at my startup have switched login to a magic link system, in which you get in after clicking a link in the email body.
I have set up a Mailsac email address to receive mails containing magic links, but I haven't been able to actually follow those links because of the following:
cy.request({
  method: "GET",
  url: "https://mailsac.com/api/addresses/xxxx@mailsac.com/messages",
  headers: {
    "Mailsac-Key": "here-goes-the-key",
  },
}).then((res) => {
  const magicLink = res.body[0].links[0];
  cy.origin(magicLink, () => {
    cy.visit('/')
  });
});
I wasn't able to use cy.visit() either because the magic link URL is slightly different from the baseURL in this testing environment.
So my question is:
How could I follow this cumbersome link and end up logged in at home, or is there another way to deal with magic links?
Thanks
The docs say
A URL specifying the secondary origin in which the callback is to be executed. This should at the very least contain a hostname, and may also include the protocol, port number & path. Query params are not supported.
Not sure if this means the cy.visit() argument should not have query params, or just the cy.origin() parameter.
Try passing in the link
cy.request({
  ...
}).then((res) => {
  const magicLink = res.body[0].links[0];
  const magicOrigin = new URL(magicLink).origin
  cy.origin(magicOrigin, { args: { magicLink } }, ({ magicLink }) => {
    cy.visit(magicLink)
  });
});
If that doesn't fix it, you could try using cy.request() but you'll have to observe where the token is stored after using the magicLink.
cy.request({
  ...
}).then((res) => {
  const magicLink = res.body[0].links[0];
  cy.request(magicLink).then((response) => {
    const token = response??? // find out where the auth token ends up
    cy.setCookie(name, value) // for example
  });
});
You need to pass the domain as the first parameter to origin, and do the visit within the callback function, something like this:
const magicLinkDomain = new URL(magicLink).hostname
cy.origin(magicLinkDomain, { args: { magicLink } }, ({ magicLink }) => {
  cy.visit(magicLink);
  //...
})
Reference: https://docs.cypress.io/api/commands/origin#Usage

Is there a way to substitute dynamic base url in RTK-Query with React Native?

I found one way. I can store the base URL in AsyncStorage so that the user can reload the page and still have access to that URL. But there is one problem: I can’t have asynchronous code inside RTK endpoints.
const postAuthEndpoint = api.injectEndpoints({
  endpoints: build => ({
    postAuth: build.mutation<PostAuthResponse, PostAuthRequest>({
      query: async data => {
        // throws an error:
        // Type 'Promise<{ url: string; method: string; body: PostAuthRequest; }>'
        // is not assignable to type 'string | FetchArgs'
        const baseUrl = await AsyncStorage.getItem('baseUrl');
        return {
          url: `${baseUrl}/auth`,
          method: 'POST',
          body: data,
        };
      },
    }),
  }),
});
Because of this, I decided to create a custom hook that performs an async operation to get the base URL and then passes it to the RTK-Query API hook that we hand to the custom hook, returning the wrapped mutation along with the rest of the parameters.
export const useEndpointWrapper = (endpoint: any) => {
  const [mutation, ...rest] = endpoint;

  const wrappedMutation = async (args: Request) => {
    const baseUrl = await AsyncStorage.getItem('baseUrl');
    return mutation({ baseUrl, ...args }).unwrap();
  };

  return [wrappedMutation, rest];
};
The main disadvantage here is that the TypeScript typing breaks down. This is solvable, but inconvenient.
Maybe there are some other ways to substitute the dynamic base URL in React Native?
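For what it's worth, one alternative that is commonly suggested (not covered in the post above, so treat it as a hedged sketch) is to resolve the base URL inside a custom baseQuery, which, unlike query, is allowed to be async. A rough sketch, assuming the same 'baseUrl' AsyncStorage key and a made-up fallback URL:

import AsyncStorage from '@react-native-async-storage/async-storage';
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

// The baseQuery runs per request and may be async, so the AsyncStorage
// lookup can live here instead of inside each endpoint's query.
const dynamicBaseQuery = async (args, api, extraOptions) => {
  const baseUrl = (await AsyncStorage.getItem('baseUrl')) ?? 'https://fallback.example.com';
  return fetchBaseQuery({ baseUrl })(args, api, extraOptions);
};

export const api = createApi({
  baseQuery: dynamicBaseQuery,
  endpoints: build => ({
    postAuth: build.mutation({
      query: data => ({ url: '/auth', method: 'POST', body: data }),
    }),
  }),
});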

VueJS getting "undefined" data from ipcRenderer (ElectronJS)

When trying to get a message from ipcMain to ipcRenderer (without node integration and with contextIsolation), it is received, but as undefined. Not only that, but if I reload the VueComponent (regardless of what change I make to it), the number of responses gets doubled.
For example, the first time I start my application, I get 1x undefined at a time every time I click the button. If I reload the component, I start getting 2x undefined every time I click the button. I reload again and get 4x undefined every time I click the button... and it keeps doubling. If I restart the application, it goes back to 1x.
SETUP
ElectronJS + VueJS + VuetifyJS has been set up as described here.
preload.js as per the official documentation.
import { contextBridge, ipcRenderer } from 'electron'

window.ipcRenderer = ipcRenderer

// Expose protected methods that allow the renderer process to use
// the ipcRenderer without exposing the entire object
contextBridge.exposeInMainWorld('ipcRenderer', {
  send: (channel, data) => {
    // whitelist channels
    let validChannels = ['toMain']
    if (validChannels.includes(channel)) {
      ipcRenderer.send(channel, data)
    }
  },
  receive: (channel, func) => {
    let validChannels = ['fromMain']
    if (validChannels.includes(channel)) {
      // Deliberately strip event as it includes `sender`
      ipcRenderer.on(channel, (event, ...args) => func(...args))
    }
  }
})
background.js (main process), as per the official documentation for the preload.js file. The code omitted via ... is the default project code generated upon creation.
...
const path = require('path')
const { ipcMain } = require('electron')

async function createWindow() {
  // Create the browser window.
  const win = new BrowserWindow({
    width: 800,
    height: 600,
    webPreferences: {
      // Use pluginOptions.nodeIntegration, leave this alone
      // See nklayman.github.io/vue-cli-plugin-electron-builder/guide/security.html#node-integration for more info
      nodeIntegration: process.env.ELECTRON_NODE_INTEGRATION,
      contextIsolation: true,
      preload: path.join(__dirname, 'preload.js'),
    },
    icon: 'src/assets/icon.png',
  })

  ipcMain.on('toMain', (event, data) => {
    console.log(data)
    event.sender.send('fromMain', 'Hello IPC Renderer')
    // The two lines below return 'undefined' as well in the 'ipcRenderer'
    //win.webContents.send('fromMain', "Hello IPC Renderer")
    //event.reply('fromMain', 'Hello IPC Renderer')
  })
  ...
}
...
vue.config.js file:
module.exports = {
  ...
  pluginOptions: {
    electronBuilder: {
      preload: 'src/preload.js',
    }
  }
}
main.js (renderer process) contains only the default project code generated upon creation.
VueComponent.vue
<template>
  <div id="vue-component">
    <v-btn @click="sendMessageToIPCMain()"></v-btn>
  </div>
</template>

<script>
export default {
  name: "VueComponent",
  components: {
    //
  },
  data: () => ({
    myData: null,
  }),
  methods: {
    // This works. I get 'Hello IPC Main' in the CMD console.
    sendMessageToIPCMain() {
      var message = "Hello IPC Main"
      window.ipcRenderer.send("toMain", message);
    }
  },
  mounted() {
    window.ipcRenderer.receive('fromMain', (event, data) => {
      // this.myData = data // 'myData' is not defined error
      this.$refs.myData = data;
      console.log('myData variable: ' + this.$refs.myData) // undefined
      console.log(data) // undefined
    })
  },
}
</script>
The VueComponent.vue's mounted() has been set up as described here, though if I try to send the data to a variable using this.myData = data, I get an error saying that myData has not been defined; using this.$refs.myData works, though it's still undefined.
P.S. The myData has not been defined error is not the same thing as undefined. The former is a proper error in red letters, while the latter is just the plain undefined value logged to the console.
To solve the first problem (the doubling of function calls) you have to remove window.ipcRenderer = ipcRenderer. In contextIsolation mode the approach is to use contextBridge.exposeInMainWorld() only. Using both implementations definitely causes issues.
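In other words, the preload keeps only the contextBridge part. A sketch based on the preload.js from the question, with the global assignment removed:

import { contextBridge, ipcRenderer } from 'electron'

// No `window.ipcRenderer = ipcRenderer` here: with contextIsolation enabled,
// the bridge below is the only thing exposed to the renderer.
contextBridge.exposeInMainWorld('ipcRenderer', {
  send: (channel, data) => {
    const validChannels = ['toMain']
    if (validChannels.includes(channel)) {
      ipcRenderer.send(channel, data)
    }
  },
  receive: (channel, func) => {
    const validChannels = ['fromMain']
    if (validChannels.includes(channel)) {
      // event is deliberately stripped before calling func
      ipcRenderer.on(channel, (event, ...args) => func(...args))
    }
  }
})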
For the second problem: the callback passed to receive in ipcRenderer is called with only ...args from main (no event is passed to func). See:
ipcRenderer.on(channel, (event, ...args) => func(...args)) <-- func() is called with only args
The only thing you should change is your function in mounted, to accept only data:
window.ipcRenderer.receive('fromMain', (data) => {
  console.log(data) // should log your data
})

How to add attributes to a put request in GUN?

I have the following code in my HTML page
Gun.on('opt', function (ctx) {
  if (ctx.once) {
    return
  }
  this.to.next(ctx)
  window.auth = ctx.opt.auth

  ctx.on('get', function (msg) {
    msg.auth = window.auth
    this.to.next(msg)
  })

  ctx.on('put', function (msg) {
    msg.put.auth = window.auth
    this.to.next(msg)
  })
})

var gun = Gun({
  peers: ['http://localhost:8765/gun'],
  auth: {
    user: 'mroon',
    password: 'titi'
  }
})
On the server, I simply watch the requests
Gun.on('create', function(db) {
  console.log('gun created')
  this.to.next(db);

  db.on('get', function(request) {
    // this request contains the auth attribute from the client
    this.to.next(request);
  });

  db.on('put', function(request) {
    // this request does not contain the auth attribute from the client
    this.to.next(request);
  });
});
Every time I query the graph with gun.get('someAttribute'), the request on the server contains the auth attribute.
But when gun.get('someAttribute').put({attribute: 'my new value'}) is called, the request on the server does not contain the auth attribute.
How can I add the auth attribute to the put request in such a way that all the peers will get it too?
@micha-roon you jumped straight to GUN's core/internal wire details, which is not the easiest thing to start with, but here is something I do that I'm guessing is what you are looking for:
(if not, please just comment & I'll update)
What this does is add a DBG (debug) flag to all outbound messages in GUN; you can change this to add other metadata or info.
Gun.on('opt', function(root){
  if(!root.once){
    root.on('out', function(msg){
      msg.DBG = msg.DBG || +new Date;
      this.to.next(msg);
    });
  }
  this.to.next(root);
})
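Applied to the auth question, the same 'out' hook could presumably carry the auth data instead of (or in addition to) the debug flag, since 'out' sees every outbound message, puts included. A sketch, assuming window.auth is set the same way as in the question:

Gun.on('opt', function (root) {
  if (!root.once) {
    // 'out' fires for every outbound message, including puts,
    // so the auth attribute travels with them as well
    root.on('out', function (msg) {
      msg.auth = window.auth
      this.to.next(msg)
    })
  }
  this.to.next(root)
})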
Also another good reference: https://github.com/zrrrzzt/bullet-catcher

React-Admin <ImageInput> to upload images to rails api

I am trying to upload images from react-admin to a Rails API backend using Active Storage.
In the documentation of react-admin it says: "Note that the image upload returns a File object. It is your responsibility to handle it depending on your API behavior. You can for instance encode it in base64, or send it as a multi-part form data" I am trying to send it as a multi-part form.
I have been reading here and there, but I cannot find what I want, or at least a roadmap of how I should proceed.
You can actually find an example in the dataProvider section of the documentation.
You have to decorate your dataProvider to enable the data upload. Here is an example of transforming the images into base64 strings before posting the resource:
// in addUploadFeature.js
/**
* Convert a `File` object returned by the upload input into a base 64 string.
* That's not the most optimized way to store images in production, but it's
* enough to illustrate the idea of data provider decoration.
*/
const convertFileToBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file.rawFile);
  reader.onload = () => resolve(reader.result);
  reader.onerror = reject;
});

/**
 * For posts update only, convert uploaded image in base 64 and attach it to
 * the `picture` sent property, with `src` and `title` attributes.
 */
const addUploadFeature = requestHandler => (type, resource, params) => {
  if (type === 'UPDATE' && resource === 'posts') {
    // notice that following condition can be true only when `<ImageInput source="pictures" />` component has parameter `multiple={true}`
    // if parameter `multiple` is false, then data.pictures is not an array, but single object
    if (params.data.pictures && params.data.pictures.length) {
      // only freshly dropped pictures are instance of File
      const formerPictures = params.data.pictures.filter(p => !(p.rawFile instanceof File));
      const newPictures = params.data.pictures.filter(p => p.rawFile instanceof File);

      return Promise.all(newPictures.map(convertFileToBase64))
        .then(base64Pictures => base64Pictures.map((picture64, index) => ({
          src: picture64,
          title: `${newPictures[index].title}`,
        })))
        .then(transformedNewPictures => requestHandler(type, resource, {
          ...params,
          data: {
            ...params.data,
            pictures: [...transformedNewPictures, ...formerPictures],
          },
        }));
    }
  }
  // for other request types and resources, fall back to the default request handler
  return requestHandler(type, resource, params);
};

export default addUploadFeature;
You can then apply this on your dataProvider:
// in dataProvider.js
import simpleRestProvider from 'ra-data-simple-rest';
import addUploadFeature from './addUploadFeature';
const dataProvider = simpleRestProvider('http://path.to.my.api/');
const uploadCapableDataProvider = addUploadFeature(dataProvider);
export default uploadCapableDataProvider;
Finally, you can use it in your admin as usual:
// in App.js
import { Admin, Resource } from 'react-admin';
import dataProvider from './dataProvider';
import PostList from './posts/PostList';
const App = () => (
  <Admin dataProvider={dataProvider}>
    <Resource name="posts" list={PostList} />
  </Admin>
);
When using files, use a multi-part form in the React front end and, for example, multer in your API backend.
In react-admin you should create a custom dataProvider, either extending the default one or building your own. In that implementation you handle the file or files upload. For uploading a file or files from your custom dataProvider in react-admin:
// dataProvider.js
// this is only the implementation for a create
case "CREATE":
  const formData = new FormData();
  for (const param in params.data) {
    // 1 file
    if (param === 'file') {
      formData.append('file', params.data[param].rawFile);
      continue
    }
    // when using multiple files
    if (param === 'files') {
      params.data[param].forEach(file => {
        formData.append('files', file.rawFile);
      });
      continue
    }
    formData.append(param, params.data[param]);
  }
  return httpClient(`myendpoint.com/upload`, {
    method: "POST",
    body: formData,
  }).then(({ json }) => ({ data: json }));
From there you pick it up in your API using multer, which supports multi-part forms out of the box. When using NestJS that could look like:
import {
  Controller,
  Post,
  Header,
  UseInterceptors,
  UploadedFile,
} from "@nestjs/common";
import { FileInterceptor } from '@nestjs/platform-express'

@Controller("upload")
export class UploadController {
  @Post()
  @Header("Content-Type", "application/json")
  // multer extracts file from the request body
  @UseInterceptors(FileInterceptor('file'))
  async uploadFile(
    @UploadedFile() file: Record<any, any>
  ) {
    console.log({ file })
  }
}