Where can I find an Envisat ASAR dataset or SAR GeoTIFF dataset for oil spill detection?

I am working on oil spill detection with ImageJ, and I am looking for a freely available dataset for testing my implementation. If anyone can help me...

Related

Monitor CPU, GPU, and memory usage in Colab (Pro)

I want to track the usage of the above resources while training a model with PySpark, and store those numbers in a txt file. Is there any built-in method in Colab for this (I have purchased Colab Pro)? The goal is to train a model to predict these values, so we need a large amount of data, and monitoring via the graphs on the right-hand side is not an option.
I have also tried using wandb but couldn't make sense of it, so if someone has a tutorial I would be grateful.
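One approach (a minimal sketch, assuming the `psutil` package is available, as it usually is in Colab) is to sample CPU and RAM usage in a loop and append the readings to a text file; GPU utilization could be added similarly by parsing `nvidia-smi` output:

```python
import time
import psutil  # typically pre-installed in Colab; samples system CPU/RAM

def log_usage(path="usage_log.txt", interval_s=5, samples=3):
    """Append timestamped CPU and RAM usage percentages to a text file.

    For GPU stats, one could additionally shell out to e.g.
    `nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv`
    and append the parsed values to the same line.
    """
    with open(path, "a") as f:
        for _ in range(samples):
            cpu = psutil.cpu_percent(interval=None)   # system-wide CPU %
            ram = psutil.virtual_memory().percent     # system-wide RAM %
            f.write(f"{time.time():.0f},{cpu},{ram}\n")
            time.sleep(interval_s)

log_usage(samples=2, interval_s=1)
```

Running this in a background thread alongside training would build up the dataset the question asks for; the file name, interval, and CSV layout here are just illustrative choices.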

Using estimators with tf xla.compile

I am using some canned estimators provided by TensorFlow, such as DNN.
The resulting models are quite big (several hundred megabytes) and latencies are pretty high (hundreds of milliseconds). I want to try AOT compilation to see if I get serving latency improvements and a model size reduction. Is there an example of how I can get xla.compile working with a canned estimator? Can you point me to some example code, or can anyone with similar experience guide me in the right direction?

Is there a way to reduce the ram usage when training with "dlib.train_simple_object_detector"?

I've been trying to use "dlib.train_simple_object_detector" to make a detector for pedestrians. After a few hours of drawing rectangles in imglab, I turned the INRIA person dataset, consisting of about 600 images, into an XML file. But when I try to train on this data I get "MemoryError: bad allocation". I have 16 GB of RAM, but I guess that's not enough here.
So I reduced the number of training images and found that training only starts working with about 100 images.
When I try this detector on test images the detection rate is pretty bad, so I would really like to train on all the images, or at least on more than 100 of them.
So my questions are:
Is there a way to reduce the RAM usage when training with "dlib.train_simple_object_detector"?
Is there an already-trained .svm file somewhere for pedestrian detection that I can use instead?
Thanks for any help you can give!

How do object-detection methods really work?

I'm in a group project at school and we are using the TensorFlow object-detection API on a Raspberry Pi 3, but we do not know how the object-detection methods, SSD (single-shot detector) and CNN (convolutional neural network), work underneath.
Can someone give a simple yet non-trivial explanation of how SSD and CNNs work, and recommend factors that might optimize the speed of the object-detection methods?
Please link us to good articles if you know any!
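The core operation inside a CNN is sliding a small kernel over the image and taking a dot product at every position; detectors like SSD stack many such layers and predict boxes from the resulting feature maps. A minimal NumPy sketch of that single building block (not the TensorFlow API, just an illustration):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: slide the kernel over the image
    and take a dot product at every position (the core CNN operation)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A vertical-edge kernel responds where pixel values change left-to-right.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 0, 1, 1]], dtype=float)
edge = np.array([[-1, 1],
                 [-1, 1]], dtype=float)
print(conv2d(img, edge))
```

Each output cell is large only where the dark-to-bright edge sits, which is how early CNN layers detect edges and textures; learned kernels generalize this to arbitrary patterns. For speed on a Raspberry Pi, the main levers are smaller input resolution, fewer/narrower layers, and quantized models, since the cost scales with the number of these multiply-accumulate operations.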

How to predict using Tensorflow?

This is a newbie question for the TensorFlow experts:
I am reading a lot of data from a power transformer connected to an array of solar panels using Arduinos. My question is: can I use TensorFlow to predict the power generation in the future?
I am completely new to TensorFlow. If you can point me to something similar I can start with that, or any GitHub repo which is doing similar predictive modeling.
Edit: Kyle pointed me to the MNIST data, which I believe is an image dataset. Again, I'm not sure if TensorFlow is the right computation library for this problem, or does it only work on image datasets?
thanks, Rajesh
Surely you can use TensorFlow to solve your problem.
TensorFlow™ is an open source software library for numerical computation using data flow graphs.
So it works not only on image datasets but also on others. Don't worry about this.
As for prediction: first you need to train a model (such as linear regression) on your dataset, then predict. The tutorial code can be found on the TensorFlow homepage.
Get your hands dirty, and you will find it works on your dataset.
Good luck.
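To make the "train a linear regression, then predict" suggestion concrete, here is a minimal stand-in using NumPy least squares instead of TensorFlow (same idea; TensorFlow's tutorial learns the same two parameters by gradient descent). The hourly power numbers are made up for illustration:

```python
import numpy as np

# Hypothetical hourly power readings (kW): a linear trend plus noise,
# standing in for the Arduino measurements from the question.
hours = np.arange(24, dtype=float)
power = 2.0 + 0.5 * hours + np.random.default_rng(0).normal(0, 0.1, 24)

# Fit y = a*x + b by least squares.
a, b = np.polyfit(hours, power, deg=1)

# Predict generation for a future hour.
forecast = a * 25 + b
print(f"slope={a:.2f}, intercept={b:.2f}, forecast={forecast:.2f} kW")
```

Once this framing is comfortable, the same features-and-targets setup carries over directly to a TensorFlow model, which becomes worthwhile when the relationship is nonlinear or involves many inputs (weather, time of day, panel temperature).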
You can absolutely use TensorFlow to predict time series. There are plenty of examples out there, like this one. And this is a really interesting one on using an RNN to predict basketball trajectories.
In general, TF is a very flexible platform for solving problems with machine learning. You can create any kind of network you can think of in it, and train that network to act as a model for your process. Depending on what kind of costs you define and how you train it, you can build a network to classify data into categories, predict a time series forward a number of steps, and other cool stuff.
There is, sadly, no short answer for how to do this, but that's just because the possibilities are endless! Have fun!
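One pattern worth knowing regardless of the network you pick: "predict a time series forward" is usually framed as supervised learning by windowing, where the last k values are the features and the next value is the target. A small NumPy sketch of that framing, with a linear map standing in for whatever network you would train:

```python
import numpy as np

def make_windows(series, k):
    """Turn a 1-D series into (X, y): each row of X holds k consecutive
    values, and y is the value that immediately follows each row."""
    X = np.array([series[i:i + k] for i in range(len(series) - k)])
    y = series[k:]
    return X, y

# Illustrative signal; in practice this would be your measured data.
series = np.sin(np.linspace(0, 6 * np.pi, 200))
X, y = make_windows(series, k=5)

# Least-squares weights w such that X @ w ≈ y; a neural network would
# replace this linear map with a learned nonlinear one.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast past the end of the data.
next_val = series[-5:] @ w
print(next_val)
```

Feeding each prediction back in as the newest input extends this to multi-step forecasts, which is exactly what the RNN examples above do with a learned internal state instead of a fixed window.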