How to upload a BigQuery script to GitHub?

I need some help: the BigQuery script does not save locally, so I am unable to upload it to GitHub.

You can use a third-party IDE for BigQuery that supports GitHub.
One such tool is Goliath, part of the Potens.io suite available on the Marketplace.
Note: another tool in this suite is Magnus, a workflow automator. It supports all of BigQuery, Cloud Storage, and most other Google APIs, as well as many simple utility-type tasks (BigQuery Task, Export to Storage Task, Loop Task, and many more), along with advanced scheduling, triggering, etc. It supports GitHub as source control as well.
Disclosure: I am a GDE for Google Cloud, the creator of those tools, and the lead of the Potens team.
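A lighter-weight workaround, if a full IDE is more than you need: keep the script as a plain .sql file in a git repository and submit it to BigQuery with the client library, so GitHub sees it like any other source file. A minimal sketch in Python, assuming the google-cloud-bigquery package and a hypothetical queries/daily_report.sql path:

    # A minimal sketch, assuming the google-cloud-bigquery package and a
    # hypothetical queries/daily_report.sql path inside a git checkout.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # Read the version-controlled script from the local checkout.
    with open("queries/daily_report.sql") as f:
        sql = f.read()

    job = client.query(sql)  # submit the script to BigQuery
    job.result()             # block until it finishes

The .sql file itself is then versioned with ordinary git add / git commit / git push.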

Related

How to load a CSV file automatically in Google Cloud Platform?

I am new to GCP. I want to load a CSV file automatically into any Google Cloud Platform component, such as BigQuery or Bigtable. I do not want the manual work of loading a file every day; I want GCP to handle it automatically. Please suggest a scenario so I can load the file automatically.
Thanks in advance
Building on Pentium's solid answer, you also have the option of the following (serverless) conga line:
GCS -> Cloud Functions -> Dataflow (template) -> BigQuery
We use this pattern in a lot of our projects, and it works beautifully. It's event-driven, scalable to petabytes, fully automated, and zero-ops.
You also have the option of watching for Object Change Notifications in GCS, so whenever you upload a file a webhook pings a URL. You can then set up either an App Engine application or a Cloud Function to do the import; all of this is serverless.
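To make the Cloud Function step concrete, here is a minimal sketch of a function that loads each newly uploaded CSV into BigQuery. It assumes a 1st-gen background function triggered by the bucket's finalize event, and my_dataset.my_table is a placeholder destination:

    # A minimal sketch of the Cloud Function step, assuming a 1st-gen
    # background function triggered by the bucket's "finalize" event;
    # "my_dataset.my_table" is a placeholder destination.
    from google.cloud import bigquery

    def load_csv_to_bq(event, context):
        """Runs on every new object uploaded to the watched GCS bucket."""
        uri = f"gs://{event['bucket']}/{event['name']}"
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,  # assumes a header row
            autodetect=True,      # let BigQuery infer the schema
        )
        load_job = client.load_table_from_uri(
            uri, "my_dataset.my_table", job_config=job_config
        )
        load_job.result()  # wait for the load job to complete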

Where to obtain Google Website Optimizer Tool for A/B Testing?

I have tried many times but have not found any proper source to download and install the Google Website Optimizer tool for A/B testing. Please tell me where I can get this tool. I have never used any testing tool before.
This project has been retired:
http://en.wikipedia.org/wiki/Google_Website_Optimizer
It has been integrated into Google Analytics, though.

Solution for storing custom files by clients in the cloud

We have multiple clients using our service.
Each client may create multiple projects.
Each client may upload multiple files to any of his projects.
Each file may have custom meta data associated.
Each client may "share" any of the projects to another client.
Each client may comment on any of his own or shared projects/files.
My question is about storing the files in the cloud. What would be the best solution? I thought about Amazon S3, but maybe there are better alternatives?
You can explore Box.com. It is an advanced cloud file-management solution and supports the fine-grained permission management you describe. Dropbox for Teams is another option; its permission model is not as extensive as Box's, but its sync client is very stable. In one of my recent projects I used Box.com, mainly for its fine-grained permission controls.
You can also build this on S3 (Dropbox, and I would guess Box too, is built on S3 behind the scenes). Achieving all the functionality you mention is quite a lot of programming work, though!
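If you do go the S3 route, the custom-metadata requirement at least is straightforward. A minimal boto3 sketch, where the bucket and key names are placeholders and a client/project key prefix stands in for "projects":

    # A minimal boto3 sketch, with placeholder bucket/key names; a
    # client/project key prefix stands in for "projects".
    import boto3

    s3 = boto3.client("s3")

    with open("report.pdf", "rb") as f:
        s3.put_object(
            Bucket="my-service-files",             # placeholder bucket
            Key="client-42/project-7/report.pdf",  # client/project prefix
            Body=f,
            Metadata={"uploaded-by": "client-42", "label": "final-draft"},
        )

Sharing and commenting would still have to live in your own application layer (or in IAM/bucket policies); S3 only gives you the object store.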

Can I create new S3 users and add IAM policies from the Linux command line?

Is there any good way of creating and managing S3 policies and users from the command line of my Raspberry Pi?
The AWS Universal Command Line Tools are newer and better supported. They rely on Python, so if you can get Python running on your Raspberry Pi, you should be set.
I have no experience using it myself, but I found a tool for interacting with Amazon IAM, the access-control service for AWS, that might work for you:
IAM Command Line Toolkit (note: last updated September 2010)
There may be more useful material under the IAM Resources section.
If you are unfamiliar with IAM, the documentation is one place to start, although, knowing the general style of AWS documentation, there may be better resources and tutorials to be found elsewhere.
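The same operations are also scriptable from Python via boto3, which is pure Python and should run fine on a Raspberry Pi. A minimal sketch, where the user name, policy name, and bucket are placeholders:

    # A minimal boto3 sketch (pure Python, so it should run on a
    # Raspberry Pi); user name, policy name, and bucket are placeholders.
    import json
    import boto3

    iam = boto3.client("iam")

    iam.create_user(UserName="s3-uploader")

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::my-bucket/*",  # placeholder bucket
        }],
    }
    iam.put_user_policy(
        UserName="s3-uploader",
        PolicyName="s3-read-write",
        PolicyDocument=json.dumps(policy),
    )

    # Issue API credentials for the new user.
    keys = iam.create_access_key(UserName="s3-uploader")
    print(keys["AccessKey"]["AccessKeyId"])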

What is a good AWS client?

The web-based AWS console seems very limited in what it can do. For example, to create a private streaming distribution, you have to create a CloudFront Origin Access Identity, create the private content distribution, and modify the ACL on the private objects, all through XML calls (WTF?). I would really expect something this common to be integrated into the console.
Is there a client smart enough to let me do simple tasks in simple ways?
Some options are:
ylastic - a web-based tool that automates many multi-step operations.
Cloudberry Explorer - a Windows-only client application.
Bucket Explorer - a cross-platform client application.
I'm not sure if these perform the tasks you need, but they are worth a look.
You can configure CloudFront private content with the CloudBerry freeware: http://blog.cloudberrylab.com/2010/03/how-to-configure-private-content-for.html
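If you would rather script it than click through a GUI, the Origin Access Identity step from the question can also be done in a few lines of boto3. A minimal sketch, where the caller reference is just an arbitrary string that must be unique per identity:

    # A minimal boto3 sketch; the CallerReference is an arbitrary string
    # that must be unique for each identity you create.
    import boto3

    cf = boto3.client("cloudfront")

    resp = cf.create_cloud_front_origin_access_identity(
        CloudFrontOriginAccessIdentityConfig={
            "CallerReference": "private-dist-001",  # placeholder
            "Comment": "OAI for private content",
        }
    )
    print(resp["CloudFrontOriginAccessIdentity"]["Id"])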
More generally, you should probably look at cloud management software. These tools are an additional layer on top of Amazon Web Services and leverage the AWS API to offer automation such as automated backups, auto-scaling, and failover by default.
Ylastic was mentioned by Geoff, but you can also try Scalr (disclaimer: I work there), RightScale, or enStratus.
I have felt this way too, but it seems the AWS command line tools are the only option that I know of.
Basic answer: no! It is hard even to find a GUI that manages more than one AWS product, and no one will put in the effort to develop one, as AWS keeps changing the API interfaces.
For me:
For a stable AWS service like S3, I use Cyberduck.
For the rest, I write my own programs: they are more customized to my needs, it is harder to make mistakes, and it helps me get familiar with the API.