Good day guys. I'm making a football live-score app in Flutter. I don't know how to get around this: if I use http.get, I have to refresh every time to get the most recent data. I don't know whether a StreamBuilder would work, or how to go about using it. Thanks in advance for your help.
As explained in the docs, StreamBuilder is a:
Widget that builds itself based on the latest snapshot of interaction with a Stream.
So, to use it, you first need to create a Stream that provides your data, pass it to the stream property of StreamBuilder, and then, in the builder callback, build your widget based on the snapshot data.
Here is a short example that uses Stream.periodic to fire the HTTP request every 5 seconds and yield each response:
import 'dart:async';

import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;

class PeriodicRequester extends StatelessWidget {
  /// Emits a fresh HTTP response every 5 seconds.
  Stream<http.Response> getRandomNumberFact() async* {
    yield* Stream.periodic(Duration(seconds: 5), (_) {
      return http.get(Uri.parse("http://numbersapi.com/random/"));
    }).asyncMap((event) async => await event);
  }

  @override
  Widget build(BuildContext context) {
    return StreamBuilder<http.Response>(
      stream: getRandomNumberFact(),
      builder: (context, snapshot) => snapshot.hasData
          ? Center(child: Text(snapshot.data.body))
          : CircularProgressIndicator(),
    );
  }
}
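If your score API returns JSON, the same pattern can decode each response before it reaches the StreamBuilder. A minimal sketch, assuming a hypothetical https://example.com/live-scores endpoint and a Score model with a fromJson factory (both are placeholders, not a real API):

import 'dart:async';
import 'dart:convert';

import 'package:http/http.dart' as http;

// Poll the (hypothetical) score endpoint every 5 seconds and decode each body
// into a list of Score objects before the StreamBuilder sees it.
Stream<List<Score>> liveScores() async* {
  yield* Stream.periodic(Duration(seconds: 5), (_) async {
    final response = await http.get(Uri.parse("https://example.com/live-scores"));
    final List<dynamic> body = jsonDecode(response.body);
    return body.map((item) => Score.fromJson(item)).toList();
  }).asyncMap((event) async => await event);
}

A StreamBuilder<List<Score>> would then receive an already-parsed list on each tick instead of the raw response.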
I am working on an investment tracking app that will be free for everyone. I am using .NET Core with Blazor and Blazorise for charts.
I stumbled upon a problem with rendering the charts. Following the official Blazorise documentation I added the method protected override async Task OnAfterRenderAsync(bool firstRender) (see the code below). This method should draw the charts the first time the page renders. The problem is that the method always fires twice: it renders the charts on the first pass and leaves them empty on the second (since firstRender is false the second time it fires). If I remove the if block, the charts render fine.
Furthermore, I've added a button that should refresh the data and the charts. After pressing this button the charts refresh twice (unwanted behaviour, as it distracts users) and, interestingly, the data values only change after the second pass.
Has anybody dealt with this problem before?
My html code
...
<div class="btn" @onclick="(async () => await RerenderPage())">Refresh Data</div>
...
My code
List<Models.VM.OverView> overview = new List<Models.VM.OverView>();

protected override async Task OnInitializedAsync()
{
    overview = await GetOverview(); // gets overview from the API
}

public async Task<List<Models.VM.OverView>> GetOverview()
{
    return await Http.GetFromJsonAsync<List<Models.VM.OverView>>("/api/Overview/GetOverView/" + await GetUserIdAsync);
}

protected override async Task OnAfterRenderAsync(bool firstRender)
{
    if (firstRender)
    {
        await HandleRedraw();
    }
}

async Task HandleRedraw()
{
    await pieChart.Clear();
    // this method goes into the overview object and gets data from it
    await pieChart.AddLabelsDatasetsAndUpdate(GetLabelsPieChart(), GetPieChartDataset());
}
What is the await RerenderPage() doing?
From the samples it looks like it employs the use of a bool isAlreadyInitialised flag.
bool isAlreadyInitialised; // set once the charts have been drawn

protected override async Task OnAfterRenderAsync( bool firstRender )
{
    if ( !isAlreadyInitialised )
    {
        isAlreadyInitialised = true;
        await HandleRedraw();
    }
}
I am assuming that async () => await RerenderPage() calls something that in turn calls chart.Update, so that the component knows its state changed?
I have a Flutter app with two tabs.
When I open the first tab, the app gets data from an API.
When I move to the second tab and return to the first one, the app calls the API again to fetch the data.
What is the easiest way to keep the data from the first load?
use AutomaticKeepAliveClientMixin in the first tab :
class _FirstTabState extends State<FirstTab> with AutomaticKeepAliveClientMixin {
  @override
  Widget build(BuildContext context) {
    super.build(context); // required when using AutomaticKeepAliveClientMixin
    return Container();
  }

  @override
  bool get wantKeepAlive => true;
}
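Because the state object is kept alive between tab switches, initState only runs once, so the fetch can live there and the result stays in memory. A minimal sketch, assuming a hypothetical fetchData() API call and Item model (both placeholders):

import 'package:flutter/material.dart';

class _FirstTabState extends State<FirstTab> with AutomaticKeepAliveClientMixin {
  Future<List<Item>> _dataFuture; // hypothetical Item model

  @override
  void initState() {
    super.initState();
    // Runs only once: the state is kept alive when switching tabs,
    // so initState is not called again on return.
    _dataFuture = fetchData(); // hypothetical API call
  }

  @override
  Widget build(BuildContext context) {
    super.build(context); // required by AutomaticKeepAliveClientMixin
    return FutureBuilder<List<Item>>(
      future: _dataFuture,
      builder: (context, snapshot) => snapshot.hasData
          ? ListView(children: snapshot.data.map((i) => Text(i.title)).toList())
          : Center(child: CircularProgressIndicator()),
    );
  }

  @override
  bool get wantKeepAlive => true;
}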
I would like to mock my Bloc in order to test my view.
For example, this is my Bloc:
class SearchBloc extends Bloc<SearchEvent, SearchState> {
  @override
  SearchState get initialState => SearchStateUninitialized();

  @override
  Stream<SearchState> mapEventToState(SearchState currentState, SearchEvent event) async* {
    if (event is UserWrites) {
      yield SearchStateInitialized.success(objects);
    }
  }
}
And this is the view:
class _SearchViewState extends State<SearchView> {
  final TextEditingController _filterController = new TextEditingController();

  @override
  void initState() {
    super.initState();
    _filterController.addListener(() {
      widget._searchBloc.dispatch(FetchByName(_filterController.text));
    });
  }

  TextField buildAppBarTitle(BuildContext context) {
    return new TextField(
      key: Key("AppBarTextField"),
      controller: _filterController,
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: buildAppBarTitle(context)),
      body: buildBlocBuilder(),
    );
  }

  BlocBuilder<SearchEvent, SearchState> buildBlocBuilder() {
    return BlocBuilder(
      bloc: widget._searchBloc,
      builder: (context, state) {
        if (state is SearchStateUninitialized) {
          return Container(
            key: Key("EmptyContainer"),
          );
        }
        return buildInitializedView(state, context);
      },
    );
  }

  buildInitializedView(SearchStateInitialized state, BuildContext context) {
    if (state.objects.isEmpty) {
      return Center(child: Text("Nothing found"));
    } else {
      return buildListOfCards();
    }
  }
}
Now, this is my test:
testWidgets('Should find a card when the user searches for something', (WidgetTester tester) async {
  _searchView = new SearchView(_searchBloc);
  when(mockService.find(name: "a")).thenAnswer((_) => [objects]);

  await tester.pumpWidget(generateApp(_searchView));
  await tester.enterText(find.byKey(Key("searchBar")), "a");
  await tester.pump();

  expect(find.byType(Card), findsOneWidget);
});
As you can see, I just want to test that when the user types something in the search and the object they are looking for exists, a card is shown.
If I understood correctly, you are mocking a service that is used by the SearchBloc. I personally try to design the app so that the widgets depend only on a bloc, and the bloc may depend on other services. Then, when I want to write a widget test, I only need to mock the bloc. You can use the bloc_test package for that.
There is this example on the bloc_test page for stubbing a counterBloc:
// Create a mock instance
final counterBloc = MockCounterBloc();
// Stub the bloc `Stream`
whenListen(counterBloc, Stream.fromIterable([0, 1, 2, 3]));
However, I often do not need to stub the whole bloc stream; it is enough to stub the state, like this:
when(counterBloc.state).thenAnswer((_) => CounterState(456));
Hope this helps.
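Applied to the SearchBloc from the question, that could look roughly like the sketch below. MockSearchBloc, the emitted states, objects and generateApp are taken from or modelled on the question, and the exact MockBloc generics depend on your bloc_test version, so treat this as a sketch rather than a drop-in test:

import 'package:bloc_test/bloc_test.dart';
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

// Assumed mock; the generic parameters of MockBloc differ between bloc_test versions.
class MockSearchBloc extends MockBloc<SearchState> implements SearchBloc {}

testWidgets('shows a card once the bloc emits a success state', (tester) async {
  final searchBloc = MockSearchBloc();

  // Stub the state sequence instead of mocking the underlying service.
  whenListen(
    searchBloc,
    Stream.fromIterable([
      SearchStateUninitialized(),
      SearchStateInitialized.success(objects),
    ]),
  );

  await tester.pumpWidget(generateApp(SearchView(searchBloc)));
  await tester.pump();

  expect(find.byType(Card), findsOneWidget);
});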
Have a look at a post from David Anaya which deals with unit testing with "Bloc" and Mockito.
The last version of his example is here
Sometimes widgets require a little time to build. Try with:
await tester.pumpWidget(generateApp(_searchView));
await tester.enterText(find.byKey(Key("searchBar")), "a");
await tester.pump(Duration(seconds: 1));
expect(find.byType(Card), findsOneWidget);
To mock the bloc, you can use the bloc_test package
Also, you may watch this tutorial, which covers bloc testing, including mocking a bloc, very nicely.
When I try to test Widget that contains
hintText: LocalizationResources.of(context).writecomment
I get an Exception saying:
The following NoSuchMethodError was thrown building CommentInput(dirty, state:
_CommentInputState#32224):
The getter 'writecomment' was called on null.
So, is there something that I'm missing?
The widget builds fine on a device and in the simulator.
This is what my test looks like:
....
final Widget widget = MaterialApp(
  home: CommentWallWidget(channelUid: '123456'),
  title: 'jelena',
);
await tester.pumpWidget(widget);
And LocalizationResources is just a simple l10n class:
import 'package:flutter/material.dart';
import 'package:intl/intl.dart';

import 'l10n/messages_all.dart';

/// Class containing localization logic and string getters.
class LocalizationResources {
  static Future<LocalizationResources> load(Locale locale) {
    final String name =
        locale.countryCode.isEmpty ? locale.languageCode : locale.toString();
    final String localeName = Intl.canonicalizedLocale(name);
    return initializeMessages(localeName).then((_) {
      Intl.defaultLocale = localeName;
      return LocalizationResources();
    });
  }

  static LocalizationResources of(BuildContext context) {
    return Localizations.of<LocalizationResources>(
        context, LocalizationResources);
  }
  ....
You need to wrap your test widget in a MediaQuery widget and provide your localization delegate. For a more complete example, see here.
final Widget widget = MediaQuery(
  data: MediaQueryData(),
  child: MaterialApp(
    localizationsDelegates: [LocalizationResourcesDelegate()],
    home: CommentWallWidget(channelUid: '123456'),
    title: 'jelena',
  ),
);
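One extra detail to watch for: LocalizationResources.load is asynchronous, so the localized strings may not be available on the very first frame. In a test it can help to pump again after building the widget; a minimal sketch, assuming the MediaQuery-wrapped widget from above:

await tester.pumpWidget(widget);
// Give the async localization delegate a chance to finish loading
// before asserting on widgets that read localized strings.
await tester.pumpAndSettle();
expect(find.byType(CommentWallWidget), findsOneWidget);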
I would like to implement near real-time OCR on the camera feed of my flutter app. To do this I would like to access the camera data in a speedy manner.
As far as I can tell I have two options, and have hit roadblocks with both:
Take a screenshot of the CameraPreview by putting a RepaintBoundary around it, creating a RenderRepaintBoundary, and calling boundary.toImage(). The problem with this method is that .toImage only seems to capture the painted widgets inside the boundary and not the data from the camera preview, similar to the issue described here: https://github.com/flutter/flutter/issues/17687
Capture an image with controller.takePicture(filePath) from camera 0.2.1, as in the example docs. The problem here is that it takes very long (2-3 seconds) before the image becomes available. I guess this is because the file is saved to disk on capture and then has to be read back from the file.
Is there any way that one can directly access the picture information after capture, to do things like pre-process and OCR?
For "near real-time OCR", you need CameraController#startImageStream
example code
import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';

void main() => runApp(MaterialApp(home: _MyHomePage()));

class _MyHomePage extends StatefulWidget {
  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<_MyHomePage> {
  dynamic _scanResults;
  CameraController _camera;
  bool _isDetecting = false;
  CameraLensDirection _direction = CameraLensDirection.back;

  @override
  void initState() {
    super.initState();
    _initializeCamera();
  }

  Future<CameraDescription> _getCamera(CameraLensDirection dir) async {
    return await availableCameras().then(
      (List<CameraDescription> cameras) => cameras.firstWhere(
        (CameraDescription camera) => camera.lensDirection == dir,
      ),
    );
  }

  void _initializeCamera() async {
    _camera = CameraController(
      await _getCamera(_direction),
      defaultTargetPlatform == TargetPlatform.iOS
          ? ResolutionPreset.low
          : ResolutionPreset.medium,
    );
    await _camera.initialize();
    // Frames are delivered continuously; skip new frames while the previous
    // one is still being processed so the pipeline does not fall behind.
    _camera.startImageStream((CameraImage image) {
      if (_isDetecting) return;
      _isDetecting = true;
      try {
        // await doSomethingWith(image)
      } catch (e) {
        // await handleException(e)
      } finally {
        _isDetecting = false;
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return Container();
  }
}
This functionality was merged into https://github.com/flutter/plugins but it is not well documented.
Ref:
https://github.com/flutter/flutter/issues/26348
https://github.com/flutter/plugins/pull/965
https://github.com/bparrishMines/mlkit_demo/blob/master/lib/main.dart#L43
https://youtu.be/OAEWySye0BQ?t=1460
A better solution today (2022) for real-time OCR is to use the camera in a loop with a frequency of about 500 ms and process each frame with Google ML Kit's text recognition, as sketched below.
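A rough sketch of that approach, assuming the camera and google_mlkit_text_recognition packages; names such as startOcrLoop and _busy are placeholders, not part of either API:

import 'dart:async';

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';
import 'package:google_mlkit_text_recognition/google_mlkit_text_recognition.dart';

final TextRecognizer _recognizer = TextRecognizer(script: TextRecognitionScript.latin);
bool _busy = false;

void startOcrLoop(CameraController controller) {
  Timer.periodic(const Duration(milliseconds: 500), (timer) async {
    if (_busy) return; // skip a tick if the previous frame is still being processed
    _busy = true;
    try {
      final XFile shot = await controller.takePicture();   // capture a frame
      final input = InputImage.fromFilePath(shot.path);     // wrap it for ML Kit
      final RecognizedText result = await _recognizer.processImage(input);
      debugPrint(result.text);                              // recognised text
    } finally {
      _busy = false;
    }
  });
}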