I need the real-time GPS coordinates from the AR Drone 2.0, which has the flight recorder in it, and I couldn't find any method to get the values from the GPS directly. Is there any way I could get the GPS data from the AR Drone?
Here is what's working for me:
var arDrone = require('ar-drone');
var droneClient = arDrone.createClient();
droneClient.config('general:navdata_demo', 'FALSE'); // get back all data the copter can send
droneClient.config('general:navdata_options', 777060865); // turn on GPS
droneClient.on('navdata', function(navdata) {
console.log(navdata.gps.latitude + ', ' + navdata.gps.longitude);
// do stuff with the GPS information....
});
droneClient.takeoff(); .....
This code alone did not get me the GPS information. I also had to comment out part of the code in parseNavdata.js (in ar-drone/lib/navdata). Look for 'gps': function(reader) { (~line 546 in my file).
Comment out the bottom half of that function:
'gps': function(reader) {
return {
latitude: reader.double64(),
longitude: reader.double64(),
elevation: reader.double64(),
hdop: reader.double64(),
data_available: reader.int32(),
unk_0: timesMap(8, reader.uint8, reader),
lat0: reader.double64(),
lon0: reader.double64(),
lat_fuse: reader.double64(),
lon_fuse: reader.double64(),
gps_state: reader.uint32(),
unk_1: timesMap(40, reader.uint8, reader),
vdop: reader.double64(),
pdop: reader.double64(),
speed: reader.float32(),
last_frame_timestamp: droneTimeToMilliSeconds(reader.uint32()),
degree: reader.float32(),
degree_mag: reader.float32()
// unk_2: timesMap(16, reader.uint8, reader),
// channels: timesMap(12, reader.satChannel, reader),
// gps_plugged: reader.int32(),
// unk_3: timesMap(108, reader.uint8, reader),
// gps_time: reader.double64(),
// week: reader.uint16(),
// gps_fix: reader.uint8(),
// num_satellites: reader.uint8(),
// unk_4: timesMap(24, reader.uint8, reader),
// ned_vel_c0: reader.double64(),
// ned_vel_c1: reader.double64(),
// ned_vel_c2: reader.double64(),
// pos_accur_c0: reader.double64(),
// pos_accur_c1: reader.double64(),
// pos_accur_c2: reader.double64(),
// speed_accur: reader.float32(),
// time_accur: reader.float32(),
// unk_5: timesMap(72, reader.uint8, reader),
// temperature: reader.float32(),
// pressure: reader.float32()
};
},
Other posts (https://github.com/felixge/node-ar-drone/issues/75) imply that this has been fixed and merged, but that does not seem to be the case.
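One defensive tweak of my own (not part of the original answer): navdata packets can arrive before the GPS option takes effect, so guarding against a missing gps field avoids crashes. A minimal sketch using the same client as above:
droneClient.on('navdata', function(navdata) {
  // navdata.gps may be undefined until the GPS navdata option is active
  if (navdata.gps && typeof navdata.gps.latitude === 'number') {
    console.log(navdata.gps.latitude + ', ' + navdata.gps.longitude);
  }
});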
I'm trying to provide my audioHandler in my Player class, but something weird is happening.
When I enter the screen, the StreamBuilder goes active just fine, but if I pop and navigate to the screen again, the stream connection stays on 'waiting' forever unless I play the audio. This causes some weird behaviors. What am I doing wrong?
Relevant code:
Player class
final audioHandlerProvider = Provider<AudioHandler>((ref) {
AudioHandler _audioHandler = ref.read(audioHandlerServiceProvider);
return _audioHandler;
});
class _PlayerClicVozzState extends State<PlayerClicVozz> {
@override
Widget build(BuildContext context) {
return Scaffold(
extendBodyBehindAppBar: true,
backgroundColor: Color(0xff131313),
appBar: AppBar(
automaticallyImplyLeading: false,
actions: [
IconButton(
icon: Icon(Icons.clear, color: Colors.white),
onPressed: () => Navigator.of(context).pop(),
),
],
backgroundColor: Colors.transparent,
elevation: 0,
),
body: Center(
child: Consumer(builder: (context, watch, child) {
final res = watch(audioHandlerProvider);
return StreamBuilder<MediaState>(
stream: _mediaStateStream(res),
builder: (context, snapshot) {
final mediaState = snapshot.data;
return SeekBar(
duration: mediaState?.mediaItem?.duration ?? Duration.zero,
position: mediaState?.position ?? Duration.zero,
onChangeEnd: (newPosition) {
res.seek(newPosition);
},
);
},
);
...
audio_service init
late AudioHandler _audioHandler;
final audioHandlerServiceProvider = Provider<AudioHandler>((ref) {
return _audioHandler;
});
Future<void> main() async {
_audioHandler = await AudioService.init(
builder: () => AudioPlayerHandler(),
config: AudioServiceConfig(
androidNotificationChannelId: 'com.mycompany.myapp.channel.audio',
androidNotificationChannelName: 'Audio playback',
androidNotificationOngoing: true,
),
);
...
My audio handler is exactly the same as the plugin example:
import 'package:audio_service/audio_service.dart';
import 'package:just_audio/just_audio.dart';
class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
static final _item = MediaItem(
id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
album: "Science Friday",
title: "A Salute To Head-Scratching Science",
artist: "Science Friday and WNYC Studios",
duration: const Duration(milliseconds: 5739820),
artUri: Uri.parse(
'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
);
final _player = AudioPlayer();
/// Initialise our audio handler.
AudioPlayerHandler() {
// So that our clients (the Flutter UI and the system notification) know
// what state to display, here we set up our audio handler to broadcast all
// playback state changes as they happen via playbackState...
_player.playbackEventStream.map(_transformEvent).pipe(playbackState);
// ... and also the current media item via mediaItem.
mediaItem.add(_item);
// Load the player.
_player.setAudioSource(AudioSource.uri(Uri.parse(_item.id)));
}
// In this simple example, we handle only 4 actions: play, pause, seek and
// stop. Any button press from the Flutter UI, notification, lock screen or
// headset will be routed through to these 4 methods so that you can handle
// your audio playback logic in one place.
@override
Future<void> play() => _player.play();
@override
Future<void> pause() => _player.pause();
@override
Future<void> seek(Duration position) => _player.seek(position);
@override
Future<void> stop() => _player.stop();
/// Transform a just_audio event into an audio_service state.
///
/// This method is used from the constructor. Every event received from the
/// just_audio player will be transformed into an audio_service state so that
/// it can be broadcast to audio_service clients.
PlaybackState _transformEvent(PlaybackEvent event) {
return PlaybackState(
controls: [
MediaControl.rewind,
if (_player.playing) MediaControl.pause else MediaControl.play,
MediaControl.stop,
MediaControl.fastForward,
],
systemActions: const {
MediaAction.seek,
MediaAction.seekForward,
MediaAction.seekBackward,
},
androidCompactActionIndices: const [0, 1, 3],
processingState: const {
ProcessingState.idle: AudioProcessingState.idle,
ProcessingState.loading: AudioProcessingState.loading,
ProcessingState.buffering: AudioProcessingState.buffering,
ProcessingState.ready: AudioProcessingState.ready,
ProcessingState.completed: AudioProcessingState.completed,
}[_player.processingState]!,
playing: _player.playing,
updatePosition: _player.position,
bufferedPosition: _player.bufferedPosition,
speed: _player.speed,
queueIndex: event.currentIndex,
);
}
}
MediaStateStream and QueueStateStream
Stream<MediaState> _mediaStateStream(AudioHandler audioHandler) {
return Rx.combineLatest2<MediaItem?, Duration, MediaState>(
audioHandler.mediaItem,
AudioService.position,
(mediaItem, position) => MediaState(mediaItem, position));
}
Stream<QueueState> _queueStateStream(AudioHandler audioHandler) {
return Rx.combineLatest2<List<MediaItem>?, MediaItem?, QueueState>(
audioHandler.queue,
audioHandler.mediaItem,
(queue, mediaItem) => QueueState(queue, mediaItem));
}
When you subscribe to a stream, you only start receiving new events that are emitted after the moment that you subscribe, and you may have a period of waiting for that next event.
In your implementation of _mediaStateStream you are making use of AudioService.position which only emits events when the position is changing (i.e. not paused or stalled). So even though the stream may have emitted position events in the past, if you subscribe to that stream again while paused or stalled, you will be in a waiting state until the next position event arrives which is after playback resumes again.
I would suggest wrapping your stream in rxdart's BehaviorSubject so that it retains a memory of the last event and re-emits it to new listeners. You can also seed this BehaviorSubject with the very first value to ensure there is no waiting period even for the first listener:
_mediaStateSubject = BehaviorSubject.seeded(MediaState(
handler.mediaItem.valueOrNull,
handler.playbackState.position))
..addStream(_mediaStateStream(handler));
Then you can listen to _mediaStateSubject instead of _mediaStateStream.
I am trying to make a multi-colored polyline. I was able to do it successfully before with Vue.js, but now we are adding it to a React Native app and it's not working as I expected.
I am making multiple polylines; each line (segment) has multiple points. I have a structure like this:
groups: [ { key: 'BLUE', coordinates: [] }, { key: 'GREEN', coordinates: [] } ];
Each key represents a color, and coordinates is an array of coordinates. Now I loop over it like this:
{
this.state.groups.map((group, index) => {
return (
<Polyline
key={index}
coordinates={group.Points}
strokeColor={
group.Key === "GREEN" ? "#0F0" : "#000"
// "#000"
// group.Key === "G"
} // fallback for when `strokeColors` is not supported by the map-provider
strokeColors={[
'#7F0000',
'#00000000', // no color, creates a "long" gradient between the previous and next coordinate
'#B24112',
'#E5845C',
'#238C23',
'#7F0000'
]}
strokeWidth={6}
/>
);
})
}
The problem is that it works perfectly except that it doesn't draw the last polyline, the one currently being updated. Say, for example, there are 10 segments in this polyline. After 3 are drawn and the loop is on the 4th segment, each coordinate is pushed into the last group with a delay of 30 ms (I added the delay to show it animated). Nothing is drawn on the map until all coordinates of the 4th segment have been pushed; once that's done and the 5th segment is started, the 4th segment shows perfectly, but now the 5th segment stops drawing.
I know the points are being added correctly, because I have also added a Camera and I set its center to the last point pushed into the groups/segments.
Group/Segments loop:
addPoint(group, point) {
var data = this.state.data;
if (group <= (data.length - 1)) {
var g = data[group];
// console.log('g', g);
if (point <= (g.Item2.length - 1)) {
var p = g.Item2[point];
var {groups} = this.state;
// console.log('groups,', groups);
groups[group].Points = groups[group].Points.concat({
longitude: p.longitude,
latitude: p.latitude,
});
this.MapView.animateCamera({
center: {
latitude: p.latitude,
longitude: p.longitude,
},
duration: 100,
zoom: 15,
});
point++;
setTimeout(() => {
this.addPoint(group, point);
}, 300);
} else {
point = 0;
group++;
if (group < this.state.data.length - 1) {
var key = this.state.data[group].Item1;
console.log('key', key);
var groups = this.state.groups.concat({
Key: key,
Points: [],
});
this.setState({
groups: groups,
})
}
setTimeout(() => {
this.addPoint(group, point);
}, 300);
}
} else {
console.log('last group reached');
}
}
Is there any solution for this?
I figured it out. The problem was that whenever I updated the coordinates array of any polyline, the whole thing had to re-render, which was a very poor decision performance-wise.
I solved it by making a custom polyline component that maintains its own coordinates array and implements an inner timeout function that pushes coordinates incrementally. This solved the problem, and it's now super easy to use.
You can read more about this here: multi colored gradient polyline using google maps on react native
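A rough sketch of that idea (component and prop names are illustrative, not taken from the linked post), assuming react-native-maps: the component keeps its coordinates in local state and appends points on its own timer, so each new point re-renders only this polyline rather than every segment on the map:
import React from 'react';
import { Polyline } from 'react-native-maps';

class AnimatedPolyline extends React.Component {
  state = { coords: [] };

  componentDidMount() {
    let index = 0;
    this.timer = setInterval(() => {
      // props.points holds the full coordinate list for this segment
      if (index >= this.props.points.length) {
        clearInterval(this.timer);
        return;
      }
      const next = this.props.points[index++];
      this.setState((prev) => ({ coords: prev.coords.concat([next]) }));
    }, 30);
  }

  componentWillUnmount() {
    clearInterval(this.timer);
  }

  render() {
    return (
      <Polyline
        coordinates={this.state.coords}
        strokeColor={this.props.color}
        strokeWidth={6}
      />
    );
  }
}
Each segment then becomes <AnimatedPolyline points={group.Points} color={group.Key === 'GREEN' ? '#0F0' : '#000'} /> and the parent never has to touch the coordinate arrays again.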
I'm trying to keep my circle the same size while zooming. I tried to do that with the "scale" setting, but that doesn't work either. How can I do that?
Could someone show me how to do that, maybe with an example?
<MapView.Circle
key = { (this.state.longitude + this.state.latitude).toString() }
center = {{latitude: this.state.latitude,
longitude: this.state.longitude} }
radius = { 100 }
strokeWidth = { 1 }
strokeColor = { '#1a66ff' }
fillColor = { '#1a66ff' }
/>
One way to do it is to save the current map region to the state; this expresses the coordinate span the viewer currently sees:
onRegionChange(region) {
// Update state
this.setState({ mapRegion: region });
}
<MapView
onRegionChange={this.onRegionChange.bind(this)}
/>
With this, you can use simple trigonometry to resize the radius, kind of like this:
size={radius / Math.sqrt(
  this.state.mapRegion.longitudeDelta * this.state.mapRegion.longitudeDelta +
  this.state.mapRegion.latitudeDelta * this.state.mapRegion.latitudeDelta
)}
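Putting the pieces together, a rough sketch (my own assembly, not from the original answer; it uses just the latitude span rather than the diagonal, and assumes the radius prop is in meters). One degree of latitude is roughly 111 km, so keeping the radius a fixed fraction of latitudeDelta keeps the circle a fixed fraction of the screen height at any zoom level:
onRegionChange(region) {
  this.setState({ mapRegion: region });
}

render() {
  const { mapRegion, latitude, longitude } = this.state;
  // Span ~5% of the visible map height regardless of zoom level
  const scaledRadius = mapRegion
    ? 0.05 * mapRegion.latitudeDelta * 111000
    : 100;
  return (
    <MapView onRegionChange={this.onRegionChange.bind(this)}>
      <MapView.Circle
        center={{ latitude: latitude, longitude: longitude }}
        radius={scaledRadius}
        strokeWidth={1}
        strokeColor={'#1a66ff'}
        fillColor={'#1a66ff'}
      />
    </MapView>
  );
}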
I am using navigator.geolocation.watchPosition in my React Native project to paint a path on a map while the user is moving. I noticed that the return frequency of this function is quite low. At least, I thought it was a frequency issue when I tested in the iOS simulator with the "freeway drive" mode of the GPS emulator. When I tested with "city run" instead, I could see that the position updates do not depend on a time interval but on a distance: the function returns a position once every 100 meters, no matter how long it takes for the position to change that much.
Why is it like this? Is this expected behaviour? I don't know if it has to do with the iOS simulator or with my code, but I would really like the position to be more precise; I want it to return as often as possible.
componentDidMount() {
const { region } = this.state;
navigator.geolocation.getCurrentPosition(
(position) => {
this.setState({position});
},
(error) => alert(JSON.stringify(error)),
{enableHighAccuracy: true, timeout: 20000, maximumAge: 1000}
);
this.watchID = navigator.geolocation.watchPosition((lastPosition) => {
var { distanceTotal, record } = this.state;
this.setState({lastPosition});
if(record) {
var newLatLng = {latitude:lastPosition.coords.latitude, longitude: lastPosition.coords.longitude};
this.setState({ track: this.state.track.concat([newLatLng]) });
this.setState({ distanceTotal: (distanceTotal + this.calcDistance(newLatLng)) });
this.setState({ prevLatLng: newLatLng });
}
},
(error) => alert(JSON.stringify(error)),
{enableHighAccuracy: true, timeout: 20000, maximumAge: 0});
}
There's an option you can set called distanceFilter, which specifies the minimum distance in meters the device must move before a new position is reported. It's mentioned in the documentation for geolocation, but neither its behavior nor its default value is explained there. If you take a look at the source code on GitHub, the default is set to 100 meters, which explains the behavior you're seeing.
If you want an update for every meter of movement, set the options as:
{enableHighAccuracy: true, timeout: 20000, maximumAge: 0, distanceFilter: 1}
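For reference, a minimal sketch of the watchPosition call from the question with distanceFilter added:
this.watchID = navigator.geolocation.watchPosition(
  (lastPosition) => {
    // Now fires roughly every meter of movement instead of every 100 m
    this.setState({ lastPosition });
  },
  (error) => alert(JSON.stringify(error)),
  { enableHighAccuracy: true, timeout: 20000, maximumAge: 0, distanceFilter: 1 }
);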
I'm doing something using the Geolocation API and ReactJS, and I'm storing the location as a state variable, since the component changes location as the web page moves around or as the user randomly changes location.
So the component looks like this:
var GeoComp = React.createClass({
setPosition: function(position){
var lat = position.coords.latitude;
var longitude = position.coords.longitude;
this.setState({longitude: {longitude}, latitude: {lat}});
this.setState({statusText: 'Successfully found you at ' + this.state.longitude + ',' + this.state.latitude});
},
getInitialState: function(){
return {longitude: 0, latitude: 0, placeName: '', statusText: 'Locating you....'};
},
componentDidMount: function(){
if (!navigator.geolocation){
this.setState({statusText: 'Your browser does not support geolocation...'});
}
else{
navigator.geolocation.getCurrentPosition(this.setPosition, this.errorPosition);
}
}
......
But the problem I'm running into is extracting the doubles from the longitude and latitude variables. React stores them as objects. How can I actually store the doubles and pass them to child components?
I'm sorry -- the problem was poor syntax when setting the state:
this.setState({longitude: longitude, latitude: lat});
as opposed to
this.setState({longitude: {longitude}, latitude: {lat}});
It's a n00b mistake about how React syntax works :-/
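For completeness, a corrected setPosition plus a render that passes the doubles down as ordinary props (ChildMap is a hypothetical child component, not from the original post):
setPosition: function(position){
  var lat = position.coords.latitude;
  var longitude = position.coords.longitude;
  // Store plain numbers; build the status text from the local variables,
  // since setState does not update this.state synchronously
  this.setState({
    longitude: longitude,
    latitude: lat,
    statusText: 'Successfully found you at ' + longitude + ',' + lat
  });
},
render: function(){
  return <ChildMap longitude={this.state.longitude} latitude={this.state.latitude} />;
}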