I have used react-native-autosuggest, but it is not showing results according to what the user has typed; no results are shown at all. Can you suggest an autosuggest component where filtered results appear as soon as the user types, and the list closes when the user taps an item? I want it for both Android and iOS, and I would also like to know how to use it.
react-native-autocomplete-input is one of my favorites for this purpose.
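Here is a minimal sketch of how that can look (assuming a recent version of react-native-autocomplete-input; the data, value, onChangeText, hideResults and flatListProps props are taken from its README, but double-check them against the version you install). The suggestions are filtered on every keystroke, and the list is hidden as soon as an item is tapped:

import React, { useMemo, useState } from 'react';
import { Text, TouchableOpacity, View } from 'react-native';
import Autocomplete from 'react-native-autocomplete-input';

const ALL_ITEMS = ['Apple', 'Banana', 'Cherry', 'Grape', 'Mango']; // sample data

export default function FruitSearch() {
  const [query, setQuery] = useState('');
  const [hideResults, setHideResults] = useState(false);

  // Filter as the user types; an empty query shows no suggestions.
  const filtered = useMemo(() => {
    const q = query.trim().toLowerCase();
    return q ? ALL_ITEMS.filter((item) => item.toLowerCase().includes(q)) : [];
  }, [query]);

  return (
    <View>
      <Autocomplete
        data={filtered}
        value={query}
        placeholder="Type to search"
        hideResults={hideResults}
        onChangeText={(text) => {
          setQuery(text);
          setHideResults(false); // re-open the list while typing
        }}
        flatListProps={{
          keyExtractor: (item) => item,
          renderItem: ({ item }) => (
            <TouchableOpacity
              onPress={() => {
                setQuery(item);       // put the tapped item into the input
                setHideResults(true); // close the suggestion list
              }}
            >
              <Text>{item}</Text>
            </TouchableOpacity>
          ),
        }}
      />
    </View>
  );
}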
Related
I am trying to implement react-native-autocomplete-input in my app and I am currently facing a blocker: if the item the user has entered in the auto search is not present in the list, I need to add it to the list. Is there any way to do that?
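One way this is often handled (a sketch only; addIfMissing and setItems are hypothetical names, not part of the library) is to check the typed value when the user submits it and append it to the source array if nothing matches:

// Hypothetical helper: append the typed value when it is not already in the list.
function addIfMissing(items, typed) {
  const value = typed.trim();
  const exists = items.some((item) => item.toLowerCase() === value.toLowerCase());
  return exists || value.length === 0 ? items : [...items, value];
}

// Usage, e.g. from the text input's onSubmitEditing handler:
// setItems((previous) => addIfMissing(previous, query));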
I have a long list of company names that needs to be displayed in a picker dropdown, and it's causing the app to freeze. I am aware of infinite scroll for viewing content, where you fetch a limited page of data from the server and load more as the scroll reaches the bottom, but does the same concept apply to a dropdown picker?
I am using the @react-native-picker/picker library, and given how the component behaves I have no idea how to handle this.
I haven't changed anything in my code yet. Currently it fetches the entire list from the server and dumps all of the data into the picker.
For a large select list, it's usually best practice to create a modal screen that opens when the selectbox is tapped. Put a search box at the top and show the list with FlatList, which is built for large data sets and will not freeze your screen while rendering.
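A minimal sketch of that pattern (CompanyPicker, companies and onSelect are made-up names; adjust the data shape and styling to your app): a touchable "selectbox" opens a Modal with a TextInput search box on top and a FlatList that only renders the filtered rows:

import React, { useMemo, useState } from 'react';
import { FlatList, Modal, Text, TextInput, TouchableOpacity, View } from 'react-native';

// Hypothetical data shape: [{ id: '1', name: 'Acme Inc' }, ...]
export default function CompanyPicker({ companies, onSelect }) {
  const [visible, setVisible] = useState(false);
  const [search, setSearch] = useState('');
  const [selected, setSelected] = useState(null);

  // Only the rows matching the search term are handed to FlatList;
  // FlatList virtualizes rendering, so long lists do not freeze the UI.
  const filtered = useMemo(() => {
    const q = search.trim().toLowerCase();
    return q ? companies.filter((c) => c.name.toLowerCase().includes(q)) : companies;
  }, [companies, search]);

  return (
    <View>
      {/* The "selectbox": tapping it opens the modal */}
      <TouchableOpacity onPress={() => setVisible(true)}>
        <Text>{selected ? selected.name : 'Select a company'}</Text>
      </TouchableOpacity>

      <Modal visible={visible} animationType="slide" onRequestClose={() => setVisible(false)}>
        <TextInput
          placeholder="Search companies"
          value={search}
          onChangeText={setSearch}
        />
        <FlatList
          data={filtered}
          keyExtractor={(item) => item.id}
          initialNumToRender={20}
          renderItem={({ item }) => (
            <TouchableOpacity
              onPress={() => {
                setSelected(item);
                onSelect && onSelect(item);
                setVisible(false);
              }}
            >
              <Text>{item.name}</Text>
            </TouchableOpacity>
          )}
        />
      </Modal>
    </View>
  );
}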
Check out this: issue 2850 mentions something that may help you, and to be clear, there is a solution in it posted by M1K3Yio. Please check the code using react-window in that link.
I inherited a Vue.js project. People using screen readers as assistive devices complain that their screen readers are unable to read the options in dropdown menus built with vue-search-select. Here is how you can reproduce the issue:
1. Install a screen reader such as NVDA.
2. Turn on the NVDA screen reader.
3. Go to https://vue-search-select.netlify.app/#/model
4. Tab to a search text field.
5. Confirm that a dropdown of results appears.
6. Press the down arrow key to focus on any of the search result items.
7. Confirm that NVDA says the word "Blank" instead of actually reading out the contents of the selected item.
Here is a 10-second clip that demonstrates steps 3 through 7:
https://www.youtube.com/watch?v=Nxx1k1oKETI
How do you modify vue-search-select such that in step 7, the screen reader will read out the contents of the selected item instead of reading out the word "Blank"?
Right now, as a temporary workaround, I'm trying to write a setTimeout function that automatically adds the appropriate metadata to force screen readers to read out the content, but I'm not sure how successful this approach will be. I would prefer an approach that is idiomatic to vue-search-select.
I tried adding a customAttr like so:
<model-select :custom-attr="ariaAttrs" />
// Returns a string crafted to break out of the attribute quoting and
// inject aria-label and tabindex onto the rendered option.
function ariaAttrs() {
  return function () { return '" aria-label="hello" tabindex="0'; };
}
Although the attributes do appear in my developer console's inspector, my screen reader still does not read out the options.
It seems custom-attr will not help you, as it does not let you add arbitrary attributes: whatever the function returns is simply placed as the value of the data-vss-custom-attr attribute.
Any decent Vue library with similar functionality would offer a slot to customize the rendering of menu items, but this one does not. It also does not seem to have been maintained for a long time, so it may be time to look for an alternative...
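If you still need a stop-gap while you evaluate alternatives, here is a minimal sketch of the setTimeout idea from the question: after the menu renders, copy each option's visible text into an aria-label. The '.ui.menu .item' selector and the wrapper-component shape are assumptions; inspect the actual rendered markup to confirm them.

// Sketch of the workaround described in the question (not part of vue-search-select's API).
export default {
  methods: {
    patchAriaLabels() {
      // Defer one tick so the dropdown items exist in the DOM before we touch them.
      setTimeout(() => {
        this.$el.querySelectorAll('.ui.menu .item').forEach((el) => {
          el.setAttribute('aria-label', el.textContent.trim());
          el.setAttribute('tabindex', '0');
        });
      }, 0);
    },
  },
  mounted() {
    this.patchAriaLabels();
  },
  updated() {
    // Re-apply whenever the filtered list re-renders.
    this.patchAriaLabels();
  },
};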
This is a question about Dialogflow.
In the intent settings, the response is configured to display a Google Assistant carousel card.
I've been able to display the carousel card, but I want to select one of the cards by voice and open the URL that is set for that card.
I couldn't find this in the reference documentation, so please let me know if there is a way to achieve it.
If this is not possible, is there any other way to achieve something equivalent?
My English is not good; thank you for reading.
You can use follow-up intents. When you are displaying a list or carousel card and you want the selection to work both by tap and by user utterance, you have to add two follow-up intents: one for option selection and one for the text utterance.
Consider the example below: a ShippingOption intent displays a list that can be selected by tap, and the same choice can also come from a user utterance.
app.js
intentMap.set("shippingOptionIntent", shippingOptionIntent);
intentMap.set("shippingOptionIntent - select", shippingOptionIntentSelect);
intentMap.set("shippingOptionIntentChoose", shippingOptionIntentChoose);
Now set the training phrases (utterances) for the follow-up intent that handles the spoken choice:
The other follow-up intent is responsible for the user selecting an option by tapping it; it uses the Google Assistant option event (actions_intent_OPTION).
So, in this way you can handle both types of responses.
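A hedged sketch of what the handlers wired into the intentMap above might look like, assuming a dialogflow-fulfillment WebhookClient plus the actions-on-google List helper; the intent names, option keys and the shippingOption parameter are illustrative only:

const { List } = require('actions-on-google');

// Parent intent: show the selectable shipping options.
function shippingOptionIntent(agent) {
  const conv = agent.conv(); // Actions on Google conversation object
  conv.ask('Which shipping option would you like?');
  conv.ask(new List({
    title: 'Shipping options',
    items: {
      STANDARD: { title: 'Standard', description: '3-5 days' },
      EXPRESS: { title: 'Express', description: '1-2 days' },
    },
  }));
  agent.add(conv);
}

// Follow-up intent with the actions_intent_OPTION event: fires when the user taps an item.
function shippingOptionIntentSelect(agent) {
  const conv = agent.conv();
  const selectedKey = conv.arguments.get('OPTION'); // e.g. 'EXPRESS'
  conv.ask(`You picked ${selectedKey}.`);
  agent.add(conv);
}

// Follow-up intent with training phrases: fires when the user says or types the choice.
function shippingOptionIntentChoose(agent) {
  const choice = agent.parameters.shippingOption; // assumed entity parameter name
  agent.add(`You chose ${choice}.`);
}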
I have an app where the user answers a lot of questions. E.g. he clicks "I have a car", and then I give him some new questions like "Which color?" and "Which brand?". The list is dynamic, so I never know exactly how far it is from the top to the next question.
Is it possible to do something like Flatlist.scrollToLocation('#second_question')?
Well, I know scrollToLocation does not support scroll to id, but does React Native have any other lists that I can use to achieve this?
To me it looks like scrollToLocation requires knowing the exact offsets and item lengths. Is there a list in React Native that supports "scroll to anchor / id"?
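There is no built-in "scroll to anchor" on FlatList, but a common approximation (a sketch only; QuestionList and the id/nextId fields are made-up names) is to resolve the id to an index and call scrollToIndex, with onScrollToIndexFailed as a fallback for rows that have not been measured yet:

import React, { useRef } from 'react';
import { FlatList, Text, TouchableOpacity } from 'react-native';

// Hypothetical data shape: [{ id: 'second_question', title: 'Which color?', nextId: '...' }, ...]
export default function QuestionList({ questions }) {
  const listRef = useRef(null);

  // "Scroll to anchor": resolve the id to an index, then scroll to that index.
  const scrollToId = (id) => {
    const index = questions.findIndex((q) => q.id === id);
    if (index >= 0) {
      listRef.current?.scrollToIndex({ index, animated: true });
    }
  };

  return (
    <FlatList
      ref={listRef}
      data={questions}
      keyExtractor={(item) => item.id}
      renderItem={({ item }) => (
        // Answering a question could jump to the next one, for example.
        <TouchableOpacity onPress={() => scrollToId(item.nextId)}>
          <Text>{item.title}</Text>
        </TouchableOpacity>
      )}
      // scrollToIndex only works for rows FlatList has already measured; this
      // fallback jumps to an estimated offset so a second attempt can land precisely.
      onScrollToIndexFailed={({ index, averageItemLength }) => {
        listRef.current?.scrollToOffset({
          offset: index * averageItemLength,
          animated: true,
        });
      }}
    />
  );
}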