I am looking for a way to integrate a tensorflow.js model into my react-native application which is built using EXPO.
The model needs to **be able to access the camera** to **detect sign-language letters in real time**.
My current solution:
Train a model via Google's Teachable Machine platform and use their code.
The platform supplies a web script, which I uploaded to the cloud.
You can see the website here.
Using the 'react-native-webview' I was able to present the site inside my app.
<WebView source={{uri: 'https://whatever.tiiny.site/'}} style={{ marginTop: 20 }} />
However, it feels like cheating and doesn't look very good.
I also built my own React.js project with the sign-language model and tried to convert it to react-native, but that failed as well.
I know there are tflite-react-native and @tensorflow/tfjs-react-native packages, and I have read their documentation time and again, but I wasn't able to adapt them to my needs.
BTW:
I also found this project:
https://github.com/expo/examples/tree/master/with-tfjs-camera
which is very close to what I need, but they are using '@tensorflow-models/mobilenet' and I need to use my own TensorFlow model.
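Based on the tfjs-react-native docs, I'd expect swapping in my own model to look roughly like this (an untested sketch: the asset paths are mine, and it assumes a Teachable Machine image export, i.e. a model.json plus weights.bin loaded via bundleResourceIO):

import * as tf from "@tensorflow/tfjs";
import { bundleResourceIO } from "@tensorflow/tfjs-react-native";

// Teachable Machine image projects export a layers model: model.json + weights.bin
// (metro must be configured to treat .bin files as bundled assets)
const modelJson = require("./assets/model/model.json");
const modelWeights = require("./assets/model/weights.bin");

export async function loadSignModel() {
  await tf.ready(); // wait for the tfjs backend to initialize
  return tf.loadLayersModel(bundleResourceIO(modelJson, modelWeights));
}

The loaded model could then replace the mobilenet instance in the with-tfjs-camera example's prediction loop.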
Relevant/similar posts:
how to use teachable machine model in react native expo
Related
I am trying to build a video-calling application using the React Native framework, with Express, MongoDB, and Firebase Authentication as the backend services.
To render the image I am using the expo-camera package, and I am testing it on my Android phone, but the image that comes through is distorted.
After a lot of googling I learned about this aspect-ratio issue, and I also tried the prepareAspectRatio functions given by some of the techies, as well as getAvailableAspectRatio, but none of them are working. Please help.
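For context, this is roughly the pattern I've been trying (a sketch, assuming expo-camera's getSupportedRatiosAsync, which is Android-only; the 4:3 fallback and the 16:9 preference are my assumptions):

import React, { useRef, useState } from "react";
import { Dimensions, Platform } from "react-native";
import { Camera } from "expo-camera";

export default function RatioAwareCamera() {
  const cameraRef = useRef(null);
  const [ratio, setRatio] = useState("4:3"); // fallback ratio (assumption)

  const onCameraReady = async () => {
    if (Platform.OS !== "android") return; // the ratios API is Android-only
    const ratios = await cameraRef.current.getSupportedRatiosAsync();
    if (ratios.includes("16:9")) setRatio("16:9");
  };

  // Size the preview to match the sensor ratio so the image isn't stretched
  const { width } = Dimensions.get("window");
  const [h, w] = ratio.split(":").map(Number);
  const height = (width * h) / w; // portrait: "16:9" -> height = width * 16/9

  return (
    <Camera
      ref={cameraRef}
      ratio={ratio}
      onCameraReady={onCameraReady}
      style={{ width, height }}
    />
  );
}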
I am building a React Native application using Expo and I want to integrate 1:1 video-calling functionality into the app.
From what I have researched so far, I can either use the SDKs of various providers like Twilio, Videosdk, VoxImplant, etc. to implement that feature, or use WebRTC in a native project along with some mechanism to create rooms using socket.io and Node and then join users to that room (not completely sure about it, but something like this).
But both of these solutions require me to make changes to native files, which are not present in an Expo app by default, for which I think I have to run expo run:android and then make the required changes (correct me if I am wrong).
On the web, though, I think it's relatively easy to implement video calling using vanilla JS or React.
My question is: if I implement a webpage that has a video-calling function and open it in a WebView in my Expo React Native app, will the functionality work or not? Has someone tried this before?
As I was exploring options I came across the BigBlueButton APIs, and another question on Stack Overflow that uses a WebView to connect to the BigBlueButton APIs. Can I use this logic to implement something in an Expo app without ejecting or using any SDKs? Will it work?
What would be the best way to implement video calling in my Expo app?
Thanks
With Expo you are essentially using 'React Native for Web', and with the new Expo config-plugins functionality, almost all React Native packages with auto-linking can be made to work with your Expo app, or can more or less have config plugins created for them.
In your case the good news is that you can make it all work on the Expo managed workflow; just use expo-dev-client and the following library:
react-native-webrtc
There's an Expo config plugin for this package. So now all you have to do is use the web-based functionality of WebRTC, such as calling:
navigator.mediaDevices.getUserMedia()
navigator.mediaDevices.enumerateDevices()
navigator.*
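For example (a sketch; the constraints are illustrative):

async function startLocalStream() {
  // Once registerGlobals() has run on native, this same call works on both platforms
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: { facingMode: "user" },
  });
  return stream;
}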
And ensure it works properly on the web. Then add a platform-aware hook in your App.js / first-loaded module, before any of these WebRTC calls, that invokes the aforementioned library's initializer, e.g.:
import React, { useEffect } from "react";
import { Platform } from "react-native";
import { registerGlobals } from "react-native-webrtc";
...
useEffect(() => {
  // Hooks must run unconditionally, so do the platform check inside the effect
  if (Platform.OS !== "web") {
    registerGlobals();
  }
}, []);
...
One more thing you'd have to do: on the web, render the <video> element, and in native apps resolve it to <RTCView>. Essentially you can make a platform-specific component/module that resolves to either the web-only <video> tag or the native <RTCView>, like the example above. You could even resolve the imports per platform if you hit errors importing registerGlobals on the web, for example.
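A minimal sketch of such a component (the component name and the stream prop are illustrative; it assumes react-native-webrtc's RTCView and MediaStream.toURL()):

import React from "react";
import { Platform } from "react-native";

let VideoStream;

if (Platform.OS === "web") {
  // On the web, attach the MediaStream to a plain <video> element
  VideoStream = ({ stream }) => (
    <video
      autoPlay
      playsInline
      ref={(el) => {
        if (el) el.srcObject = stream;
      }}
    />
  );
} else {
  // On native, react-native-webrtc's RTCView renders a stream by URL;
  // require() here keeps the native module out of the web bundle
  const { RTCView } = require("react-native-webrtc");
  VideoStream = ({ stream }) => (
    <RTCView streamURL={stream ? stream.toURL() : null} style={{ flex: 1 }} />
  );
}

export default VideoStream;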
I'm building a React Native App for a Chinese Company. I'm using Expo.
I really would love to use Google Maps, but it is not allowed there...
The best solution I found was to use Baidu Maps.
I searched for SDKs and found some of them on GitHub.
I decided to use this one: https://github.com/qiuxiang/react-native-baidumap-sdk
which provides great documentation.
However, I'm having some trouble integrating it into the app. I think they don't support Expo.
Has anyone here ever had a similar problem?
Or used another map?
It would really save my life!
This library uses native (Android & iOS) SDKs and provides a React Native API on top of them. Expo does not yet support custom native modules, which means you'll have to eject to use this package (or any other that uses the Baidu Maps SDK or other native code).
I'm guessing they probably have a web-based JS SDK which you could try to integrate into your app via a <WebView /> instead?
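Something along these lines, perhaps (a sketch, not tested: it assumes Baidu's web JavaScript API, and YOUR_AK is a placeholder for your own API key):

import React from "react";
import { WebView } from "react-native-webview";

// Inline HTML page that loads Baidu's web JS API and renders a map
const html = `
<!DOCTYPE html>
<html>
  <head>
    <meta name="viewport" content="initial-scale=1.0, user-scalable=no" />
    <style>html, body, #map { height: 100%; margin: 0; }</style>
    <script src="https://api.map.baidu.com/api?v=3.0&ak=YOUR_AK"></script>
  </head>
  <body>
    <div id="map"></div>
    <script>
      var map = new BMap.Map("map");
      map.centerAndZoom(new BMap.Point(116.404, 39.915), 12); // Beijing
    </script>
  </body>
</html>`;

export default function BaiduMapView() {
  return <WebView originWhitelist={["*"]} source={{ html }} />;
}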
This is a question for professional React Native developers. My react-native application needs augmented reality in order to build complex games. I tried using viro-react, if you know this package. The developers of viro-react gave up on it:
Not supported on iOS anymore
Documentation for some components is missing or poor
Are there people here who are successfully using AR in their apps? Can you please tell me how you are doing it (a different package or native components)? If it is native components, can you please tell me in a nutshell how to use them with AR?
Thank you
Viro is actually a company that no longer exists, so don't hold your breath waiting for them. You can try this example, which works on both Android and iOS, if you really want to use Viro.
From my experience, there is no great solution for building AR apps in React Native. You can also build the AR part of your app in Unity and import it into your app using this package.
I have recently started building a react-native app, which I want to ship for both iOS and Android, using a Firebase backend and Branch.io/Firebase deep-linking services. I want to include both OAuth provided by Google and some kind of invitation link using deep linking. I have had trouble finding a way to support both.
It is worth mentioning that I am new to React and JavaScript and have no experience with them.
I tried using react-native-cli and tried to add Google OAuth via Firebase. I couldn't find enough documentation, so I followed a tutorial on YouTube, and only managed to make it work by switching to Expo instead of react-native-cli.
Then I tried implementing some kind of invitation to groups in my app using deep linking. I tried Firebase's Dynamic Links but got an error that firebase.links() is not a function, and I couldn't solve it due to the lack of docs about it.
I then found the react-native-branch library. It is in alpha in Expo, so in order to use the native module I found out that I had to detach from Expo, and I wonder if there's an easier way to do it.
I've spent several days, as expected, trying to solve these issues. Is there a relatively simple way to do it?
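For reference, this is the kind of flow I'm after (a sketch using expo-linking, assuming plain deep links would be enough instead of Branch/Dynamic Links; the 'invite' path and groupId param are made up):

import * as Linking from "expo-linking";

// Build an invite URL to share with another user
const inviteUrl = Linking.createURL("invite", {
  queryParams: { groupId: "abc123" },
});

// Handle the URL when the app is opened from the link
Linking.addEventListener("url", ({ url }) => {
  const { path, queryParams } = Linking.parse(url);
  if (path === "invite" && queryParams.groupId) {
    // look up the group in Firebase and join the user here
  }
});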