When I create a dataset using the expression below, it commits but doesn't set any location, and when I try to copy data from another dataset I get an error because they are in different locations: source: EU, destination: US.
bigquery.createDataset('my-dataset', function(err, dataset, apiResponse) {});
I didn't find anything in the docs https://googlecloudplatform.github.io/gcloud-node/#/docs/v0.24.1/bigquery?method=createDataset
Is it possible to set the dataset location?
This is an oops on our part when writing the createDataset code. Sorry about that; I'll try to get a fix out in the next release.
(Issue opened at https://github.com/GoogleCloudPlatform/gcloud-node/issues/941)
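For the record, here is the shape the call will likely take once the fix ships. The second argument carrying a location field is an assumption based on the issue above, not part of the released v0.24.1 API:

bigquery.createDataset('my-dataset', { location: 'EU' }, function(err, dataset, apiResponse) {
  // Hypothetical metadata argument: not available in v0.24.1.
  if (err) {
    console.error('createDataset failed:', err);
    return;
  }
  console.log('Dataset created in location:', apiResponse.location);
});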
Related
I'm working on importing CSV files from Google Drive, through Apps Script, into BigQuery.
But when the code gets to the part where it sends the job to BigQuery, it reports that the dataset is not found, even though the correct dataset ID is already in the code.
Thank you very much!
If you are using the Google example code, the error you describe suggests more than a copy-and-paste problem. Validate that you have the following:
const projectId = 'XXXXXXXX';
const datasetId = 'YYYYYYYY';
const csvFileId = '0BwzA1Orbvy5WMXFLaTR1Z1p2UDg';
var table = {
  tableReference: {
    projectId: projectId,
    datasetId: datasetId,
    tableId: 'ZZZZZZZZ' // placeholder: the table to create
  }
};
try {
  table = BigQuery.Tables.insert(table, projectId, datasetId);
  Logger.log('Table created: %s', table.id);
} catch (error) {
  Logger.log('Unable to create table: %s', error);
}
according to the documentation at this link:
https://developers.google.com/apps-script/advanced/bigquery
Also validate that the BigQuery advanced service is enabled under the Services tab.
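Once the table exists, sending the load job follows the same pattern as the example on that page; a minimal sketch reusing the IDs above (skipLeadingRows assumes the CSV has a header row):

var file = DriveApp.getFileById(csvFileId);
var data = file.getBlob().setContentType('application/octet-stream');
var job = {
  configuration: {
    load: {
      destinationTable: {
        projectId: projectId,
        datasetId: datasetId,
        tableId: 'ZZZZZZZZ' // same placeholder table as above
      },
      skipLeadingRows: 1
    }
  }
};
job = BigQuery.Jobs.insert(job, projectId, data);
Logger.log('Load job started: %s', job.jobReference.jobId);

If the job still reports that the dataset is not found, double-check that datasetId exactly matches the dataset's ID in the BigQuery UI and that the dataset belongs to the project you pass as projectId.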
I have a GCS bucket in US-WEST1:
That bucket has two files:
wiki_1b_000000000000.csv.gz
wiki_1b_000000000001.csv.gz
I've created an external table definition to read those files.
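The exact definition isn't reproduced here, but a representative CSV external table configuration for those files (the bucket name is a placeholder) looks something like:

{
  "sourceFormat": "CSV",
  "sourceUris": [
    "gs://my-us-west1-bucket/wiki_1b_*.csv.gz"
  ],
  "csvOptions": {
    "skipLeadingRows": 1
  }
}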
The dataset where this external table definition exists is also in the US.
When I query it with:
SELECT
*
FROM
`grey-sort-challenge.bigtable.federated`
LIMIT
100
...I get the following error:
Error: Cannot read in location: us-west1
I tested with asia-northeast1 and it works fine.
Why isn't this working for the US region?
I faced the same issue earlier. See Google's answer: you must use us-central1 for now: https://issuetracker.google.com/issues/76127552#comment11
For people from Europe:
If you get the error Cannot read in location: EU while trying to read from an external source (a regional GCS bucket), you have to place your data in the europe-west1 region, as per the same comment. Unfortunately, this is not reflected in the documentation yet.
I wanted to create a federated (external) table to continually load data from a new CSV file that was imported each day.
In attempting to do so I was getting "Error: Cannot read in location: xxxx".
I solved the problem by:
Recreating a NEW bucket, this time selecting US (multiple regions).
Then going back to BigQuery and creating a NEW dataset with its data location set to United States (US).
Presto! I am now able to query a (constantly updating) external table!
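For reference, the same fix can be done from the command line; the bucket, project, and dataset names below are placeholders:

gsutil mb -l US gs://my-new-multiregion-bucket
bq --location=US mk --dataset my-project:my_new_dataset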
I get the following error when I run this:
Keen.delete(:iron_worker_analytics, filters: [{:property_name => 'start_time', :operator => 'eq', :property_value => '0001-01-01T00:00:00Z'}])
Keen::ConfigurationError: Keen IO Exception: Project ID must be set
However, when I set the value, I get the following:
warning: already initialized constant KEEN_PROJECT_ID
iron.io/env.rb:36: warning: previous definition of KEEN_PROJECT_ID was here
Keen works fine when I run the app and load the values from an env.rb file, but from the console I cannot get past this.
I am using the ruby gem.
I figured it out. The documentation is confusing. Per the documentation:
https://github.com/keenlabs/keen-gem
The recommended way to set keys is via the environment. The keys you can set are KEEN_PROJECT_ID, KEEN_WRITE_KEY, KEEN_READ_KEY and KEEN_MASTER_KEY. You only need to specify the keys that correspond to the API calls you'll be performing. If you're using foreman, add this to your .env file:

KEEN_PROJECT_ID=aaaaaaaaaaaaaaa
KEEN_MASTER_KEY=xxxxxxxxxxxxxxx
KEEN_WRITE_KEY=yyyyyyyyyyyyyyy
KEEN_READ_KEY=zzzzzzzzzzzzzzz

If not, make a script to export the variables into your shell or put it before the command you use to start your server.
But I had to set it explicitly via Keen.project_id= (I found the setter by listing Keen.methods).
It's sort of confusing, since from the docs I assumed I just needed to set the environment variables. Maybe I'm misunderstanding the docs, but it was confusing to me at least.
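For anyone else stuck at a console, explicit configuration looks like this; a minimal sketch with placeholder keys (delete calls need the master key):

require 'keen'

Keen.project_id = 'aaaaaaaaaaaaaaa'
Keen.master_key = 'xxxxxxxxxxxxxxx'

Keen.delete(:iron_worker_analytics, filters: [
  { property_name: 'start_time', operator: 'eq', property_value: '0001-01-01T00:00:00Z' }
])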
I am having an issue trying to query the ScriptDb of a resource file in Google Apps Script. I create a script file (file1), add it as a resource to another script file (file2). I call file1 from file2 to return a handle to its ScriptDb. This works fine. I then try to query the ScriptDb but have a permissions error returned.
Both files are owned by the same user and live in the same Google environment.
See code below:
file 1:
function getMyDb() {
return ScriptDb.getMyDb;
}
file 2 (references file1):
function getDataFromFile1() {
var db = file1.getMyDb(); // This works
var result = db.query({..............}); // This results in a permissions error!
}
I am at a loss to understand why I can access file1 and get back a handle on the ScriptDb, but am then not able to query it due to a permissions issue.
I have tried to force file1 to require re-authorization, but have not yet been successful. I tried adding a new function and running it, so any suggestions there would be gratefully received.
Thanks in advance
Chris
There appears to be an error in file1/line2. It says "return ScriptDb.getMyDb;" but it should say "return ScriptDb.getMyDb();"
If you leave out the ()s, then when you call file1 as a library, file1.getMyDb() returns the function itself, which you store in var db. The line var result = db.query({..............}) then results in an error because a function has no "query" method.
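In other words, file1 should read:

function getMyDb() {
  return ScriptDb.getMyDb(); // call the method, so the Db instance (not the function) is returned
}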
Is that what's causing your error?
I have figured out what the problem was: a misunderstanding on my part regarding authorisation. I was thinking of it in terms of file permissions, when in fact the problem was that my code was not authorised to use the ScriptDb service. Because my code calls a different file and receives back a pointer to a ScriptDb database, it is not itself using the ScriptDb service; then, when it calls db.query(), it invokes the ScriptDb service, for which it is not authorised.
To resolve this I just had to create a dummy function and make a ScriptDb.getMyDb() call, which triggered authorisation for the service. The code then worked fine.
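The dummy function was nothing more than this (a minimal sketch; the function name is arbitrary):

function triggerScriptDbAuth() {
  // Touch the service directly from this project so Apps Script
  // prompts for ScriptDb authorisation.
  ScriptDb.getMyDb();
}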
Thanks for the input though.
Chris
We've got a really confusing problem. We're trying to test an SQL Bulk Load using a little app we've written that passes in the data file XML, the schema, and the SQL database connection string.
It's a very straightforward app; here's the main part of the code:
// Create the SQLXML 4.0 bulk load COM object.
SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class objBL = new SQLXMLBULKLOADLib.SQLXMLBulkLoad4Class();
// Connect through the SQL Server OLE DB provider.
objBL.ConnectionString = "provider=sqloledb;Data Source=SERVER\\SERVER; Database=Main;User Id=Username;Password=password;";
objBL.BulkLoad = true;             // actually load the data, not just validate
objBL.CheckConstraints = true;     // enforce table constraints during the load
objBL.ErrorLogFile = "error.xml";  // record row-level failures here
objBL.KeepIdentity = false;        // let SQL Server assign identity values
// Load data.xml into the tables described by the mapping schema.
objBL.Execute("schema.xml", "data.xml");
As you can see, it's very simple but we're getting the following error from the library we're passing this stuff to: Interop.SQLXMLBULKLOADLib.dll.
The message reads:
Failure: Attempted to read or write protected memory. This is often an indication that other memory has been corrupted
We have no idea what's causing it or what it even means.
Before this we first had an error because SQLXML4.0 wasn't installed, so that was easy to fix. Then there was an error because it couldn't connect to the database (wrong connection string) - fixed. Now there's this and we are just baffled.
Thanks for any help. We're really scratching our heads!
I am not familiar with this particular utility (Interop.SQLXMLBULKLOADLib.dll), but have you checked that your XML validates against its schema (.xsd) file? Perhaps the DLL has trouble loading the XML data file into memory structures if it is invalid.
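A quick way to check, sketched in C# (the file names are assumptions carried over from the question; the mapping schema is annotated XSD, so it can be used for validation):

using System;
using System.Xml;
using System.Xml.Schema;

class ValidateXml
{
    static void Main()
    {
        var settings = new XmlReaderSettings();
        settings.ValidationType = ValidationType.Schema;
        settings.Schemas.Add(null, "schema.xml"); // assumed schema file from the question
        settings.ValidationEventHandler += (sender, e) =>
            Console.WriteLine("{0}: {1}", e.Severity, e.Message);

        using (var reader = XmlReader.Create("data.xml", settings))
        {
            while (reader.Read()) { } // stream to the end; the handler reports any problems
        }
    }
}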
I'm trying to understand your problem, but I still have some doubts about it.
If you have time, try the link below; I think it will definitely be useful for you:
link text
I know I did something that raised this error message once, but (as often happens) the problem ended up having nothing to do with the error message. Not much help, alas.
Some troubleshooting ideas: try to determine the actual SQL command being generated and submitted by the application to SQL Server (SQL Profiler should help here), and run it as "close" to the database as possible--from within SSMS, using SQLCMD, direct BCP call, whatever is appropriate. Detailing all tests you make and the results you get may help.