With the impending deprecation of the Fusion Tables SQL API, I want to ensure my application continues to operate as expected.
I use the Google Visualization API to visualize and query the Fusion Tables like this:
To visualize the map:
map = new google.maps.Map(document.getElementById("map_canvas"), myOptions);
layer = new google.maps.FusionTablesLayer({
    query: {
        select: 'Latitude',
        from: table
    }
});
layer.setMap(map);
To query the Fusion Tables:
var query = "SELECT * FROM " + tableid;
query = encodeURIComponent(query);
var gvizQuery = new google.visualization.Query(
    'http://www.google.com/fusiontables/gvizdata?tq=' + query);
My question is, what URL endpoints do I have to change to ensure my application continues working?
I would really appreciate some guidance on this subject.
Only the Fusion Tables API changed; see the migration guide for details.
To visualize the map you use FusionTablesLayer; there is no API endpoint for you to change there, just make sure to use the encrypted table ID.
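For example, a minimal sketch of the layer setup, where the from value is a placeholder for your encrypted table ID:
layer = new google.maps.FusionTablesLayer({
    query: {
        select: 'Latitude',
        from: '1aBcDeFgHiJkLmNoPqRsTuVw' // placeholder encrypted table ID
    }
});
layer.setMap(map);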
For the query you use the Google Visualization API, which did not change. But if you use the Google Visualization API to get data (rather than to visualize data), you should consider using the Fusion Tables API instead.
I am trying to create a BigQuery UDF that uses Google's Geocoding service.
It seems we can import external libraries with the OPTIONS parameter, but I feel that I can't use the Geocoding service here.
Here is my function approach:
CREATE OR REPLACE FUNCTION
functions.returnGeoCode(address STRING)
RETURNS Array<String>
LANGUAGE js AS """
    var geocoder = new google.maps.Geocoder();
    geocoder.geocode({ 'address': address }, function(results, status) {
        if (status == google.maps.GeocoderStatus.OK) {
            var latitude = results[0].geometry.location.lat();
            var longitude = results[0].geometry.location.lng();
            alert(latitude);
        }
    });
"""
which complains, because it does not know google of course, saying ReferenceError: google is not defined at UDF$1(STRING) line 2, columns 23-24 when I try to use the function.
My ultimate goal is to convert addresses that I have in a BigQuery dataset to lat/longs so I can then create a heatmap in a visualization tool.
Any tips for my approach, or for something totally different? I saw some suggestions to use public BigQuery datasets (the OpenStreetMap suggestion), but I have addresses from Germany, which it does not cover well.
Also, it seems BigQuery does not support the conversion this way.
Thank you in advance!
Since the feature you need comes from Geocoding, what I could suggest is to script everything using the BigQuery API (to execute queries) and the Geocoding API (to perform the geocoding calculations), in JavaScript for example. You can perform the Geocoder calculation separately from the query, and afterwards use the value returned by the Geocoder in your query via the BigQuery API.
You can include a self-contained external JavaScript library, but that would not work with the Geocoder service: such a library would have to make external HTTP calls, which is not allowed in a JavaScript UDF.
I think the appropriate solution is Cloud Dataflow: you can include arbitrary code there, without the security and performance restrictions of UDFs, read the data out of the BigQuery table, and write the results back.
If you have a lot of data and the Geocoder service becomes expensive, OpenStreetMap can help: try to resolve addresses using the OSM tables first, then resolve the remaining addresses using the Geocoder service.
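A minimal Node.js sketch of the scripted approach, assuming the @google-cloud/bigquery client, a Geocoding API key in the GEOCODING_API_KEY environment variable, and placeholder dataset/table/column names:
const { BigQuery } = require('@google-cloud/bigquery');
const https = require('https');

const bigquery = new BigQuery();

// call the Geocoding web service for one address, resolving to { lat, lng } or null
function geocode(address) {
    const url = 'https://maps.googleapis.com/maps/api/geocode/json?address=' +
        encodeURIComponent(address) + '&key=' + process.env.GEOCODING_API_KEY;
    return new Promise((resolve, reject) => {
        https.get(url, res => {
            let body = '';
            res.on('data', chunk => body += chunk);
            res.on('end', () => {
                const result = JSON.parse(body).results[0];
                resolve(result ? result.geometry.location : null);
            });
        }).on('error', reject);
    });
}

async function main() {
    // myDataset.addresses is a placeholder table with an `address` column
    const [rows] = await bigquery.query('SELECT address FROM myDataset.addresses');
    const geocoded = [];
    for (const row of rows) {
        const loc = await geocode(row.address);
        if (loc) geocoded.push({ address: row.address, lat: loc.lat, lng: loc.lng });
    }
    // write the lat/longs back to a placeholder results table
    await bigquery.dataset('myDataset').table('addresses_geocoded').insert(geocoded);
}

main().catch(console.error);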
Please check my previous question
EMBER JS - Fetch associated model data from back-end only when required
Related to the above question, I need help with forming the API in Ruby on Rails (JSON format: jsonapi.org):
how do I form the API to sideload only students.records and link them with the data already available in the ember-data store (school and students)?
Based on the comments in the other question, I think you want something like:
GET /api/students?include=records
But you need that filtered to a school, which is where application-specific code comes in, as { json:api } does not dictate how filtering should happen.
I've used Ransack (https://github.com/activerecord-hackery/ransack) with much success.
So, your new query would be something like:
GET /api/students?include=records&q[school_id_eq]=1
to get all students and their records for the school with id 1
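The { json:api } response to that request would then look roughly like this (the ids and attributes are illustrative):
{
  "data": [{
    "id": "7",
    "type": "students",
    "attributes": { "name": "Jane" },
    "relationships": {
      "records": { "data": [{ "id": "42", "type": "records" }] }
    }
  }],
  "included": [{
    "id": "42",
    "type": "records",
    "attributes": { "score": 95 }
  }]
}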
and then to make this query in ember:
store.query('student', {
    include: 'records',
    q: {
        school_id_eq: 1
    }
});
hope this helps
I am developing a search engine with Angular 2.
Therefore I use APIs from multiple platforms.
It works if I call the search function of every API service manually.
But is it possible to do the same for each API service in a loop?
Every API service has the same function:
search (query: string): Observable<Array<SearchResult>> { ... }
In the UI I want to separate the results by tabs.
Therefore every API service has a title:
public title: string = "the title";
For storing the search results locally I have a class that is extended by every API service. This class has helper functions etc.
Depending on the behaviour you need, you can use merge, concat or forkJoin to combine multiple streams into one.
The code looks pretty much the same for each of them.
For example, using merge: if you have a list of API services you need to call for the search, your code would look like this.
// assumes: import { Observable } from 'rxjs/Rx';
let apis: ApiService[] = []; // ApiService stands in for your common base class with search()
let observables = apis.map(api => api.search(query)); // get an array of observables
let merged = Observable.merge(...observables); // merge all observables in the list into one
merged.subscribe(results => doSomething(results));
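If you instead want to wait until every service has answered and keep the results separated per service (e.g. to fill one tab per service title), forkJoin fits better. A minimal sketch, assuming the same apis list and a hypothetical showTabs rendering function:
let searches = apis.map(api => api.search(query));
Observable.forkJoin(searches).subscribe(resultsPerApi => {
    // forkJoin preserves order, so resultsPerApi[i] belongs to apis[i]
    let tabs = apis.map((api, i) => ({ title: api.title, results: resultsPerApi[i] }));
    showTabs(tabs); // hypothetical function rendering one tab per service
});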
This article might be helpful.
I'm working on a web app that collects traffic information for websites that use my service. Think Google Analytics, but far more visual. I'm using SQL Server 2012 for the backbone of my app, and am considering MongoDB for the data-gathering, analytics side of the site.
If I have 100 users with an average of 20,000 hits a month on their site, that's 2,000,000 records in a single collection that will be getting queried.
Should I use MongoDB to store this information (I'm new to it and new things are intimidating)?
Should I dynamically create new collections/tables for every new user?
Thanks!
With MongoDB the collection (the SQL-table analogue) can get quite big without much issue; that is largely what it is designed for - the "Mongo" is part of "huMONGOus" (pretty clever, eh?). This is a great use for MongoDB, which is great at storing point-in-time information.
Options :
1. New Collection for each Client
Very easy to do; I use a GetCollectionSafe method for this:
public class MongoStuff
{
    private static MongoDatabase GetDatabase()
    {
        var databaseName = "dbName";
        var connectionString = "connStr";
        var client = new MongoClient(connectionString);
        var server = client.GetServer();
        return server.GetDatabase(databaseName);
    }

    public static MongoCollection<T> GetCollection<T>(string collectionName)
    {
        return GetDatabase().GetCollection<T>(collectionName);
    }

    // creates the collection on first use, then returns it
    public static MongoCollection<T> GetCollectionSafe<T>(string collectionName)
    {
        var db = GetDatabase();
        if (!db.CollectionExists(collectionName))
        {
            db.CreateCollection(collectionName);
        }
        return db.GetCollection<T>(collectionName);
    }
}
Then you can call it with:
var collection = MongoStuff.GetCollectionSafe<Record>("ClientName");
Running this script:
static void Main(string[] args)
{
    var times = new List<long>();
    for (int i = 0; i < 1000; i++)
    {
        Stopwatch watch = new Stopwatch();
        watch.Start();
        MongoStuff.GetCollectionSafe<Person>(String.Format("Mark{0:000}", i));
        watch.Stop();
        Console.WriteLine(watch.ElapsedMilliseconds);
        times.Add(watch.ElapsedMilliseconds);
    }
    Console.WriteLine(String.Format("Max : {0} \nMin : {1} \nAvg : {2}",
        times.Max(f => f), times.Min(f => f), times.Average(f => f)));
    Console.ReadKey();
}
This gave me (on my laptop):
Max : 180
Min : 1
Avg : 6.635
Benefits :
Ease of splitting data if one client needs to go on their own
Might match your brain map of the problem
Cons :
Almost impossible to aggregate data over all collections
Hard to find collections in management studios (like Robomongo)
2. One Large Collection
Use one collection for it all and access it this way:
var coll = MongoStuff.GetCollection<Record>("Records");
Put an index on the collection (the index will make reads orders of magnitude quicker):
coll.EnsureIndex(new IndexKeysBuilder().Ascending("ClientId"));
This needs to be run only once (per collection, per index).
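For reference, the equivalent index in the mongo shell would look like this (assuming the collection is named records):
db.records.ensureIndex({ ClientId: 1 }); // same ascending index on ClientId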
Benefits :
One Simple place to find data
Aggregate over all clients possible
More traditional Mongodb setup
Cons :
All Clients Data is intermingled
May not mentally map as well
Just as a reference, the MongoDB limits for sizes are here:
http://docs.mongodb.org/manual/reference/limits/
3. Store only aggregated data
If you are never intending to break down to an individual record just save the aggregates themselves.
Page Loads:
#     Page            Total Time    Average Time
15    Default.html    1545          103
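A sketch of how such an aggregate could be kept up to date, using an upsert in the mongo shell (the collection and field names are assumptions):
// add one 103 ms page load to the running totals, creating the document if missing
db.pageStats.update(
    { client: "ClientName", page: "Default.html" },
    { $inc: { loads: 1, totalTime: 103 } },
    { upsert: true }
);
// Average Time is then totalTime / loads, computed at read time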
I will let someone else tackle the MongoDB side of your question as I don't feel I'm the best person to comment on it, but I would point out that MongoDB is a very different animal and you'll lose a lot of the referential integrity you enjoy in SQL.
In terms of SQL design, I would not use a different-schema-for-each-customer approach: your database schema and backups could grow uncontrollably, and maintaining a dynamically growing schema will be a nightmare.
I would suggest one of two approaches:
Either you can create a new database for each customer:
This is more secure, as users cannot access each other's data (just use different credentials), and users are easier to manage/migrate and separate.
However, many hosting providers charge per database, so it will cost more to run and maintain, and should you wish to compare data across users it gets much more challenging.
Your second approach is to simply host all users in a single DB. Your tables will grow large (although 2 million rows is not over the top for a well-maintained SQL DB), and you would simply use a UserID column to discriminate.
The emphasis will be on you to get the performance you need through proper indexing.
Users' data will exist in the same system, and there's no SQL defense against users accessing each other's data - your code will have to be good!
Question
How do I return different results for the same resource?
Details
I have been searching for some time now about the proper way to build a RESTful API. Tons of great information out there. Now I am actually trying to apply this to my website and have run into a few snags. I found a few suggestions that said to base the resources on your database as a starting point, considering your database should be structured decently. Here is my scenario:
My Site:
Here is a little information about my website and the purpose of the API
We are creating a site that allows people to play games. The API is supposed to allow other developers to build their own games and use our backend to collect user information and store it.
Scenario 1:
We have a players database that stores all player data. A developer needs to select this data based on either a user_id (person who owns the player data) or a game_id (the game that collected the data).
Resource
http://site.com/api/players
Issue:
If the developer calls my resource using GET they will receive a list of players. Since there are multiple developers using this system, they must specify some ID by which to select all the players. This is where I find a problem: I want the developer to be able to specify two kinds of IDs. They can select all players by user_id or by game_id.
How do you handle this?
Do I need two separate resources?
Let's say you have a controller named 'Players'; then you'll have 2 methods:
function user_get()
{
    // get the id from the request and do something
}

function game_get()
{
    // get the id from the request and do something
}
Now the URLs will look like http://site.com/api/players/user/333 and http://site.com/api/players/game/333.
players is the controller; user and game are the actions.
If you use Phil Sturgeon's framework you'll do the same, but the URLs will look like
http://site.com/api/players/user/id/333 and http://site.com/api/players/game/id/333,
and then you get the id using $this->get('id');
You can limit the results by specifying querystring parameters, e.g.:
http://site.com/api/players?id=123
http://site.com/api/players?name=Paolo
Use Phil's REST Server library: https://github.com/philsturgeon/codeigniter-restserver
I use this library in a production environment with OAuth and API key generation. You would create an API controller and define methods for each of the requests you want. In my case I created an entirely separate CodeIgniter instance and just wrote my models as I needed them.
You can also use this REST library to insert data; it's all in his documentation.
Here is a video Phil threw together on the basics back in 2011:
http://philsturgeon.co.uk/blog/2011/03/video-set-up-a-rest-api-with-codeigniter
It should be noted that RESTful URLs use plural/singular wording, e.g. player = singular, players = all or more than one, game|games, etc.
This will allow you to do things like this in your controller:
// the _get suffix is the HTTP request type; you could use _post or _put as well
public function players_get()
{
    // query the db for players, pass back the data
}
Your API Request URL would be something like:
http://api.example.com/players/format/[csv|json|xml|html|php]
This would return a JSON object of all the players, based on the query in your model.
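For instance, a client could consume that endpoint like this (a hedged sketch using the example URL from above):
fetch('http://api.example.com/players/format/json')
    .then(response => response.json())
    .then(players => console.log(players)); // the players returned by players_get()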
OR
public function player_get($id = false, $game = false)
{
    // if $game is set, search by game_id
    // query the db for a specific player, pass back the data
}
Your API Request URL would be something like:
http://api.example.com/player/game/1/format/[csv|json|xml|html|php]
OR
public function playerGames_get($id)
{
    // query the db for a specific player's games, based on $id
}
Your API Request URL would be something like:
http://api.example.com/playerGames/1/format/[csv|json|xml|html|php]