Refresh a function every 3 seconds in react native - react-native

I was wondering if it is okay to set the time interval to trigger a function every 3 seconds. Let's say I have 5 different screens in my application and all 5 screens have the time interval set to 3 seconds and will keep on calling a function to auto refresh the screen.
My concern is: will it cause heavy traffic on the server if multiple users are using the app at the same time, with the server constantly receiving requests from each of them?
Sample code:
componentDidMount() {
  this.interval = setInterval(() => {
    this.loadCase();
  }, 3000);
}

componentWillUnmount() {
  clearInterval(this.interval);
}

loadCase() {
  CaseController.loadCase().then(data => {
    if (data.status === true) {
      this.setState({ case: data.case });
    }
  });
}

If you have an API endpoint that you need to poll every 3 seconds and you're looking to avoid redundant calls from the app, try using setInterval in your App.js, or wherever the root of your app is, and dump the result into Redux/whatever state management solution you're using so that you can access it elsewhere.
To answer your question regarding "heavy traffic," yeah, that is inevitably going to be a lot of API calls that your server will need to handle. If it's going to cause issues with the current setup of your API server, I would look closely at your app and see if there's a way you can reduce the effects that large numbers of users will have, whether that's some sort of caching, or increasing the amount of time between API calls, or entirely reconsidering this approach.
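A minimal sketch of that root-level approach, assuming a Redux-style store; the `CASE_LOADED` action type, `fetchCase` function, and `startPolling` helper are placeholders for illustration, not from the original code:

```javascript
// Start one app-wide poll loop; every screen reads from the store
// instead of running its own interval.
function startPolling(store, fetchCase, intervalMs) {
  const id = setInterval(async () => {
    try {
      const data = await fetchCase();
      if (data.status) {
        store.dispatch({ type: 'CASE_LOADED', case: data.case });
      }
    } catch (e) {
      // swallow network errors; the next tick will retry
    }
  }, intervalMs);
  return () => clearInterval(id); // call this on teardown to stop polling
}
```

With this shape there is exactly one request in flight per tick no matter how many screens are mounted, which is the main lever for cutting the server load.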

Related

Can't call the REST APIs asynchronously

I am new to iOS threading. When I call an API on a particular screen, it doesn't return a response for 60 seconds. If I call other APIs from the same screen or other screens in the meantime, they just keep loading; only after 60 seconds does the first response show up.
We need to call the APIs asynchronously using Alamofire, but it's not working:
private let alamofireManager: Session

let configuration = URLSessionConfiguration.default
configuration.timeoutIntervalForRequest = 300 // seconds
configuration.timeoutIntervalForResource = 500
alamofireManager = Session(configuration: configuration, serverTrustManager: .none)

alamofireManager.request("sample_api",
                         method: .post,
                         parameters: parameters,
                         encoding: URLEncoding.default,
                         headers: nil).responseJSON { response in }
First, when getting started it's not necessary to customize anything. Especially when the values you're customizing aren't much different from the defaults.
Second, Alamofire 5.6 now ships with Swift async / await APIs, so I suggest you use those when getting started. You can investigate your network calls without tying them to specific screens until you understand the calls, how they work, and when they should be called.
Third, a 60 second delay sounds like a timeout, as 60 seconds is the default timeout wait. Is that expected? Make sure you have proper access to your server. You should also print your response so you can see what's happening and whether you got data back or an error.
Fourth, use a Decodable type to encapsulate your expected response; it makes interacting with various APIs much simpler. Don't use Alamofire's responseJSON, as it's now deprecated.
So you can start your experiment by using Alamofire's default APIs.
let response1 = await AF.request("url1").serializingDecodable(Response1.self).response
debugPrint(response1)

let response2 = await AF.request("url2").serializingDecodable(Response2.self).response
debugPrint(response2)

Should I close the page in a puppeteer-cluster task closure when using a long-lasting cluster

I have a cluster for which I have defined a task. As per example in the README.md, I have a closure which accepts a page instance as an argument. I navigate to the page and capture a screenshot. I don't do anything else with the page instance. In the README.md example, there's an await for idle event and then the cluster is closed. However I have a cluster which I virtually never want to close. Should I change the behaviour of my closure in that scenario to close the page?
I suspect I have a memory leak somewhere in my service, and one of the causes I am investigating is whether the cluster closes pages after I am done using them. I use the concurrency: Cluster.CONCURRENCY_CONTEXT option.
await cluster.task(async ({ page }) => {
  // ... my screenshot logic
  // do I need to do this?
  await page.close();
});

Expo: Get audio data realtime and send via Socket.IO

App I want to make
I would like to make an audio recognition mobile app like Shazam, using:
Expo
Expo AV(https://docs.expo.io/versions/latest/sdk/audio)
Tensorflow serving
Socket.IO
I want to send recording data to machine learning based recognition server via Socket.IO every second or every sample (Maybe it is too much to send data sample-rate times per second), and then mobile app receives and shows predicted result.
Problem
How do I get data from recordingInstance while recording? I read the Expo audio documentation, but I couldn't figure out how to do it.
So far
I ran two examples:
https://github.com/expo/audio-recording-example
https://github.com/expo/socket-io-example
Now I want to mix two examples. Thank you for reading. If I could console.log recording data, it would help much.
Related questions
https://forums.expo.io/t/measure-loudness-of-the-audio-in-realtime/18259
This might be impossible (to play animation? to get data realtime?)
https://forums.expo.io/t/how-to-get-the-volume-while-recording-an-audio/44100
No answer
https://forums.expo.io/t/stream-microphone-recording/4314
According to this question,
https://www.npmjs.com/package/react-native-recording
seems to be a solution, but it requires ejecting.
I think I found a good solution to this problem.
await recordingInstance.prepareToRecordAsync(recordingOptions);
recordingInstance.setOnRecordingStatusUpdate(checkStatus);
recordingInstance.setProgressUpdateInterval(10000);
await recordingInstance.startAsync();
setRecording(recordingInstance);
Above, after creating and preparing the recording, I added a callback function that runs every 10 seconds (setProgressUpdateInterval takes milliseconds).
const duration = status.durationMillis / 1000;
const info = await FileSystem.getInfoAsync(recording.getURI());
const uri = info.uri;
console.log(`Recording Status: ${status.isRecording}, Duration: ${duration}, Metering: ${status.metering}, Uri: ${uri}`);
if (duration > 10 && duration - prevDuration > 0) {
  sendBlob(uri);
}
setPrevDuration(duration);
The callback checks that the duration is greater than 10 seconds and that the difference from the previous duration is greater than 0, then sends the data over the WebSocket.
The only remaining problem is that the callback doesn't fire on the first update; it only starts running from the second one.
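That send condition can be pulled out into a small pure helper, which makes it easy to test in isolation (the function name and threshold parameter are illustrative, not part of the Expo API):

```javascript
// Decide whether a recording-status update should trigger a send.
// Durations are in seconds; the 10-second default mirrors the
// progress-update interval used above.
function shouldSendChunk(durationSec, prevDurationSec, thresholdSec = 10) {
  const hasNewAudio = durationSec - prevDurationSec > 0;
  return durationSec > thresholdSec && hasNewAudio;
}
```

Keeping the condition out of the callback also makes it easier to debug why the first update is skipped: you can log the raw duration values and check the helper's verdict separately.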

Flux without data caching?

Almost all examples of Flux involve a data cache on the client side; however, I don't think I would be able to do this for much of my application.
In the system I am thinking about using React/Flux for, a single user can have hundreds of thousands of the main piece of data we store (and one record probably has at least 75 data properties). Caching this much data on the client side seems like a bad idea and would probably make things more complex.
If I were not using Flux, I would just have an ORM-like system that talks to a REST API, in which case a request like userRepository.getById(123) would always hit the API regardless of whether I requested that data on the previous page. My idea is to just have the store expose those methods.
Is it considered bad Flux practice if a request for data always hits the API and never pulls from a local cache? Can I use Flux in a way where the majority of data retrieval requests always go to an API?
The closest you can sanely get to no caching is to reset any store state to null or [] when an action requesting new data comes in. If you do this you must emit a change event, or else you invite race conditions.
As an alternative to flux, you can simply use promises and a simple mixin with an api to modify state. For example, with bluebird:
var Promise = require('bluebird');

var promiseStateMixin = {
  thenSetState: function(updates, initialUpdates){
    // promisify setState
    var setState = this.setState.bind(this);
    var setStateP = function(changes){
      return new Promise(function(resolve){
        setState(changes, resolve);
      });
    };
    // if we have initial updates, apply them and ensure the state change happens
    return Promise.resolve(initialUpdates ? setStateP(initialUpdates) : null)
      // wait for our main updates to resolve
      .then(function(){ return Promise.props(updates); })
      // apply our unwrapped updates
      .then(function(resolved){
        return setStateP(resolved);
      }).bind(this);
  }
};
And in your components:
handleRefreshClick: function(){
  this.thenSetState(
    // users is Promise<User[]>
    {users: Api.Users.getAll(), loading: false},
    // we can't do our own setState due to unlikely race conditions
    // instead we supply our own here, but don't worry, the
    // getAll request is already running
    // this argument is optional
    {users: [], loading: true}
  ).catch(function(error){
    // `error` is the rejection reason of the getAll promise
    // `this` is our component instance here (thanks to .bind)
    console.error(error);
  });
}
Of course this doesn't prevent you from using Flux when and where it makes sense in your application. For example, react-router is used in many React projects, and it uses Flux internally. React and its related libraries/patterns are designed to help only where desired, and never to dictate how you write each component.
I think the biggest advantage of using Flux in this situation is that the rest of your app doesn't have to care that data is never cached, or that you're using a specific ORM system. As far as your components are concerned, data lives in stores, and data can be changed via actions. Your actions or stores can choose to always go to the API for data or cache some parts locally, but you still win by encapsulating this magic.

Caching best practice for mobile hybrid/bridge app development

I really need to limit any unnecessary network traffic and server trips. Solution: common sense caching. (I am not going to cache everything under the sun).
However, after reading through the Caching Files documentation and implementing a couple of quick examples: when is the best time to cache an Ajax JSON result? Sure, I can do the usual cache/no-cache check each time my view is displayed. But is there a way to perform an asynchronous load during initial application startup to prefetch remote data that I know the user is going to need? Is using the connectionStateChanged event the only way (or the closest)? Is there a way to "hook" into the splash screen (yes, I know Apple intends the splash screen mostly for transition)? window.onload?
So if I understand you correctly, you're looking for a way to asynchronously fetch remote resources once for each time the app starts up, and cache those data away?
Our request module is asynchronous by nature, so you could simply drop in a forge.request.ajax to start fetching an Ajax response, then store it away in the preferences module.
Although it's probably identical in practice, you could even wrap it in a setTimeout to make it even more asynchronous:
setTimeout(function () {
  forge.request.ajax({
    url: 'http://example.com/method.json',
    success: function (data) {
      forge.prefs.set("method.json-cache", data);
    }
  });
}, 10);
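The read side can then check the cache before going to the network. Here is a rough sketch with the cache and fetcher passed in so it isn't tied to the forge APIs (the function and key names are made up; `cache` stands in for a wrapper around something like forge.prefs):

```javascript
// Return cached data when present, otherwise fetch it and populate
// the cache for the next call.
async function getWithCache(cache, fetchJson, key) {
  const cached = cache.get(key);
  if (cached !== undefined && cached !== null) {
    return cached; // cache hit: no network trip
  }
  const data = await fetchJson();
  cache.set(key, data); // warm the cache for subsequent reads
  return data;
}
```

Combined with the startup prefetch above, views that render later mostly hit the warm cache, which is exactly the reduction in server trips being asked about.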