How to add Seoul dataset in BigQuery - google-bigquery

In BigQuery I want to add a dataset in Seoul (asia-northeast3), but I can't find it in the list of locations (I only see Mumbai, Singapore, Hong Kong, Taiwan, and Tokyo).
However, the BigQuery documentation states that there is such a location.
How can I add a dataset in Seoul?
Reference:
https://cloud.google.com/bigquery/docs/locations

That's very bizarre; I've just checked on my console and I can confirm that it exists, a bit towards the bottom between europe-west4 and us-west3.
Anyway, in case you can't get it from the list, you can still create it from the command line in the console with:
bq --location=asia-northeast3 mk --dataset <project_id>:my_dataset
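If you prefer doing it programmatically, here is a minimal sketch with the Python BigQuery client (assuming google-cloud-bigquery is installed and credentials are configured; the project and dataset names are placeholders):
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Build the dataset and pin it to the Seoul region
dataset = bigquery.Dataset("my-project.my_dataset")
dataset.location = "asia-northeast3"

# Creates the dataset; raises Conflict if it already exists
client.create_dataset(dataset)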

Related

List all scheduled queries with the user who created them

I use bq ls --transfer_config --transfer_location=asia-southeast-1 --format json, without max_results, to list all the existing scheduled queries.
It should return a ListTransferConfigsResponse with TransferConfig objects inside, but it doesn't. I need "OwnerInfo", which has the email of the user who created the scheduled query. Is the documentation missing something? I want to make a single API call to get all the info. Any help?
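For reference, a minimal sketch of the same listing through the Python client (google-cloud-bigquery-datatransfer, assumed installed); the project ID and location are placeholders, and whether owner_info actually comes back populated is exactly what is in question here:
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = "projects/my-project/locations/asia-southeast1"  # hypothetical project/location

# Pages through every transfer config (scheduled query) in that location
for config in client.list_transfer_configs(parent=parent):
    # owner_info may or may not be populated depending on API version and permissions
    print(config.display_name, getattr(config, "owner_info", None))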

Enforce MapBox Geocoding API to return only results with a zipcode

I'm working with the MapBox Geocoding API to provide address suggestions in the location search feature of my website.
The following is a sample call:
https://api.mapbox.com/geocoding/v5/mapbox.places/Zur.json?country=ch&limit=5&proximity=8.765.432&language=en-GB&access_token=***
My goal is to force the API to return only results that include a zipcode.
For example, if I input "Zur" (limiting the search to Switzerland) I get the following:
- Zürich, Zürich, Switzerland
- Zürich, Switzerland
- Zürich Airport, Flughafenstrasse, Kloten, Zürich 8302, Switzerland
- Zurich, Buchs, Zürich 8107, Switzerland
The expected results should exclude the first two lines, as they don't have a zipcode.
I tried removing the results without a zipcode myself on the client side (right after getting a response from the API), but it is a suboptimal solution (for example, it doesn't guarantee enough results).
I was not able to find such a feature in MapBox. Is there a better solution out there?
Just add &types=address to your request.
See all list of types:
https://docs.mapbox.com/api/search/geocoding/#data-types
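To illustrate, a minimal sketch of the same call with the types filter added, using Python's requests library (the access token and query are placeholders):
import requests

query = "Zur"
url = f"https://api.mapbox.com/geocoding/v5/mapbox.places/{query}.json"
params = {
    "country": "ch",
    "limit": 5,
    "language": "en-GB",
    "types": "address",                   # restrict results to address-level features
    "access_token": "YOUR_MAPBOX_TOKEN",  # placeholder
}

resp = requests.get(url, params=params)
for feature in resp.json().get("features", []):
    print(feature["place_name"])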

How can I query Wikidata API to get details of all the Korean films?

If possible, I want to return the results in JSON or XML format. Is there any way to do so? Earlier I did it using freebase.com, but it is now deprecated. Please help.
This query would look a lot like the one to get the list of all films on Wikidata, but with another filter added:
instead of http://wdq.wmflabs.org/api?q=CLAIM[31:11424] (which returns all the entities marked as instances of film), you would do
http://wdq.wmflabs.org/api?q=CLAIM[31:11424] AND CLAIM[495:884] (which returns all the entities marked as instances of film with South Korea (Q884) as country of origin (P495))
http://wdq.wmflabs.org/api?q=CLAIM[31:11424] AND CLAIM[495:423] (the same for North Korea (Q423))
Then, to parse the results and get the entities' data, it would be the same as for the list of all films.
Remarks:
- you will probably need to encode those URLs to get something that looks like: http://wdq.wmflabs.org/api?q=CLAIM%5B31%3A11424%5D%20AND%20CLAIM%5B495%3A884%5D
- here is the full API documentation. Notice that this is an experimental API, which might be replaced in the coming year
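As a rough sketch (and keeping in mind the experimental status noted above), the encoded query can be fetched like any other JSON endpoint, here with Python's requests library:
import requests

# URL-encoded version of: CLAIM[31:11424] AND CLAIM[495:884]
url = "http://wdq.wmflabs.org/api?q=CLAIM%5B31%3A11424%5D%20AND%20CLAIM%5B495%3A884%5D"

resp = requests.get(url)
print(resp.json())  # inspect the raw response to see the returned entity IDs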
The overview on Wikipedia may be more complete than Wikidata, as you've noticed yourself. However, I could only find overviews per year, such as https://en.wikipedia.org/wiki/List_of_South_Korean_films_of_2015.
To get a list of titles from that page, you would first retrieve the raw wikicode of the page: https://en.wikipedia.org/w/index.php?action=raw&title=List_of_South_Korean_films_of_2015, and then run a regular expression such as /\{lang\|[^\|]+\|([^\}]+)/g on the code.
This returns a list of 149 titles.
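A minimal sketch of that approach in Python (assuming the requests library), using the raw-wikicode URL and the regular expression given above:
import re
import requests

# Fetch the raw wikicode of the overview page
url = "https://en.wikipedia.org/w/index.php"
params = {"action": "raw", "title": "List_of_South_Korean_films_of_2015"}
wikicode = requests.get(url, params=params).text

# Same pattern as above: grab the title inside each {{lang|..|title}} template
titles = re.findall(r"\{lang\|[^|]+\|([^}]+)", wikicode)
print(len(titles), titles[:5])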

DataSift and GoogleBigQuery

I have been trying to export data from DataSift to a Google BigQuery dataset, but apart from 4 empty rows, no other relevant data has been pushed.
I followed the instructions from this link: http://dev.datasift.com/docs/push/connectors/bigquery. I'm not sure if the CSDL code that I used is the cause.
For example I configured a stream using:
wikipedia.title contains "Audi".
The live preview has no output. Also, the only data sources that I've set as active are Interaction and Wikipedia.
Please let me know what may be the reason for this. At the end of every stream recording I don't see any changes, except for the creation of the table mentioned in the destination, with 4 empty rows (some rows have null values, and interaction.type is ds_verify).
Thank you!

Twitter API: How to search only for geotagged tweets

How can I use Twitter Search API (or other) to get a list of tweets which have the "geo" param?
--EDIT--
For example: I want to get a list of geotagged tweets with the #apple tag, without a location filter, worldwide.
Looks like the latest API supports that; simply use a large enough geo region for your query:
-180,-90,180,90
See the API documentation for filter and locations for more details.
The Streaming API allows you to filter by a location and the Search API allows you to search by geocode. You can find more information on these services on our developer resources site.
Streaming API: http://dev.twitter.com/pages/streaming_api
Example: create a file called 'locations' that contains, excluding the quotation marks, the phrase:
"locations=-122.75,36.8,-121.75,37.8,-74,40,-73,41"
then execute:
curl -d @locations http://stream.twitter.com/1/statuses/filter.json -uAnyTwitterUser:Password
You will receive all geotagged tweets from the San Francisco and New York City areas.
Search API: http://dev.twitter.com/doc/get/search
Example: http://search.twitter.com/search.json?geocode=37.781157,-122.398720,1mi
From the Twitter API Documentation, this should be the format of your search query:
http://search.twitter.com/search.json?geocode=37.781157,-122.398720,1mi
Where 37.781157 is the latitude, -122.398720 is the longitude and 1mi is the radius to search within.
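As a sketch only (these are the old v1 search endpoints quoted in the answers above, so treat the URL as illustrative), the same geocode query could be issued from Python with requests; the #apple tag from the question is added as the search term:
import requests

# latitude, longitude, radius -- same format as the example above
params = {"geocode": "37.781157,-122.398720,1mi", "q": "#apple"}

resp = requests.get("http://search.twitter.com/search.json", params=params)
print(resp.json())  # inspect the returned results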
You can look for every tweet but save only the geotagged ones.
I know it doesn't make a lot of sense, but it works quite well.
When you iterate over your search results, you can do:
for result in results:
    if result.geo is not None:
        print(result.text)  # or do anything you want with the tweet
Use -180,-90,180,90 to get any geotagged tweet.