Is there a way to export a query or table from BigQuery in .txt format? - sql

I have to deposit a report in .txt format once a day and upload it to an SFTP. I have generated the report in BigQuery but can't find a way to export it as .txt. Is it possible?

There are quite a number of ways to accomplish this, and almost all involve some amount of coding with a client of your choice or GCP tools like Dataflow; they all require a skilled engineer at hand (a minimal sketch of the client-library route closes this answer).
For sure, there will be a few answers covering those options.
Meantime, I want to offer a different option.
There are some third-party tools that help achieve the same without extra coding (beyond the BigQuery query itself).
Below is an example of how simple it is to do with Magnus, part of the Potens.io suite of tools for BigQuery, designed so that even a non-engineer can easily explore and automate workflows and become self-sufficient in data needs like the one in your question.
Disclosure: I am a Google Developer Expert in Cloud and the author of the BigQuery Mate and Potens.io (Magnus and Goliath) productivity tools.
In the screenshot below you can see a workflow with just two Tasks.
The first Task defines the payload of your report and the second Task uploads it to the client's SFTP server.
Below you can see the flip side of the second Task with more settings - zero coding!
In this particular example you do not even need to persist your report in a BQ table - the second Task will simply pick it up from the first Task (although in real life you will most likely want to preserve the report, which is still easy to set in the first Task using the Destination entry).
I recommend you give it a try.
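For completeness, here is a minimal sketch of the coding route mentioned at the top of this answer, assuming the google-cloud-bigquery and paramiko Python packages; the project, query, host, credentials and paths are all hypothetical placeholders.

    # Export a BigQuery query result as a tab-delimited .txt payload and
    # upload it to an SFTP server; every identifier below is a placeholder.
    import io
    import paramiko
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project
    rows = client.query("SELECT * FROM `my-project.reports.daily_report`").result()

    # Render the result as a tab-delimited text payload
    buf = io.StringIO()
    buf.write("\t".join(field.name for field in rows.schema) + "\n")
    for row in rows:
        buf.write("\t".join("" if v is None else str(v) for v in row.values()) + "\n")

    # Upload the payload to the SFTP server
    transport = paramiko.Transport(("sftp.example.com", 22))  # hypothetical host
    transport.connect(username="report_user", password="***")
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.putfo(io.BytesIO(buf.getvalue().encode("utf-8")), "/incoming/daily_report.txt")
    sftp.close()
    transport.close()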

Related

Bulk edit DataStage jobs?

We are repointing a large number (>1000) DataStage jobs from one database to another. As part of this, we will need to make the same changes to a single stage for many jobs.
So far, we have been able to export jobs to XML, edit and reimport. This seems to work, but will require a lot of parsing logic. We also have looked at dsjob, but that tool does not seem to have the ability to edit jobs.
What is the best method (UI or CLI/API) to bulk edit job stages?
Scenarios like this are the reason for using parameters for databases - I recommend using ParameterSets with DBName, User, Password and Schema parameters.
This allows a quick and easy change in one place of the project: the ParameterSet.
Hard-coding all these things will give you a hard time - the export method is one option you already know.
There is a connector migration wizard - I am not sure whether this tool could be helpful as well - you might want to search for documentation on it.
Perhaps you can try the RJUT (Rapid Job Update Tool) or the CMT (Connector Migration Tool).
RJUT: https://www.ibm.com/support/pages/rapid-job-update-tool-ibm-infosphere-information-server-datastage-jobs
CMT: https://www.ibm.com/docs/en/iis/11.7?topic=connectors-using-command-line-migrate-jobs
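Should you stay with the export/edit/reimport route, the editing step itself can be scripted; here is a minimal sketch assuming Python's standard library, exported jobs under exports/, and a hypothetical Property element name - inspect one exported job to find the real element and property names your stages use.

    import glob
    import xml.etree.ElementTree as ET

    OLD_DB, NEW_DB = "OLD_DBNAME", "NEW_DBNAME"

    for path in glob.glob("exports/*.xml"):
        tree = ET.parse(path)
        changed = False
        for prop in tree.iter("Property"):  # hypothetical element name
            if prop.text and OLD_DB in prop.text:
                prop.text = prop.text.replace(OLD_DB, NEW_DB)
                changed = True
        if changed:
            tree.write(path, encoding="utf-8", xml_declaration=True)
            print(f"updated {path}")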

Automating Sequence of Manual Steps

I have a sequence of steps that a user performs, e.g. logging on to a remote UNIX shell, creating files/directories, changing permissions, running remote shell scripts and commands, deleting and moving files,
running DB queries and, based on the query results, performing certain tasks such as exporting the results to a file, running further shell commands/scripts, or issuing DB insert statements, etc.
By performing these steps the user achieves various data-processing and validation outcomes.
What is the best way to automate the above scenario? Should we go for a workflow tool like Activiti, or is there a better framework/way to achieve these requirements?
My requirement is to work with open source, preferably Java based.
I am completely new to this, so any pointers would be appreciated.
The scenario you describe is certainly possible with a workflow tool like Activiti. Apache Camel or Spring Integration would be another possibility, since all the steps you mention are automatic system tasks (a sketch of scripting those steps directly follows the list below).
A workflow framework would be a good option if you need one of these:
you want to store the history data for 'audit purposes': who did what/when/how long did it take.
you want to visually model your steps, perhaps to discuss it with business people.
there is a need for human interaction between some of the steps
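For reference, the individual steps described in the question are straightforward to script regardless of which framework ends up wrapping them; a minimal sketch assuming Python's paramiko for the remote shell work and sqlite3 as a stand-in database - host, credentials, paths and queries are all hypothetical.

    import sqlite3
    import paramiko

    # 1. Log on to the remote UNIX shell and run commands / scripts
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("unix-host.example.com", username="batch_user", password="***")

    for cmd in ["mkdir -p /data/work", "chmod 750 /data/work", "/opt/scripts/prepare.sh"]:
        stdin, stdout, stderr = ssh.exec_command(cmd)
        if stdout.channel.recv_exit_status() != 0:
            raise RuntimeError(f"step failed: {cmd}: {stderr.read().decode()}")

    # 2. Run a DB query and branch on the result
    conn = sqlite3.connect("jobs.db")
    (count,) = conn.execute("SELECT COUNT(*) FROM pending_items").fetchone()

    if count > 0:
        # 3a. Export the results to a file on the remote host
        ssh.exec_command("/opt/scripts/export_pending.sh /data/work/pending.txt")
    else:
        # 3b. Or run further statements instead
        conn.execute("INSERT INTO run_log(status) VALUES ('nothing to do')")
        conn.commit()

    conn.close()
    ssh.close()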
Your description reminds me of a software/account provisioning process.
There are a large number of provisioning tools on the market, both open source and otherwise (Dell Crowbar is one option).
However, a couple of the comments you made in your response to Joram indicate a more general-purpose tool such as Activiti may be an option:
"Swivel Chair" tasks - User tasks that may one day be automated
Visual model of process state
Most provisioning tools don't allow for generic user tasks and don't provide a (good) visual model of the process state.
However, they generally include remote script execution, which would need to be cobbled together as a service task if you use a BPM tool.
I would certainly expand my research to include provisioning tools as they sound like a better fit; however, if you can't find anything that works for you, a BPM platform provides a generic framework to build what you need.

How to access results of Sonar metrics for use with applications like PowerPivot

I'm trying to run a number of applications with known failure rates through Sonar, with hopes of deciding which metrics are most valuable in determining whether a particular application will fail. Ultimately I'll be making some sort of algorithm that will look at the outputs of whatever metrics I'm using and generate a score from 1 to 100. I've got about 21 applications put through Sonar, and the results have been stored in a MySQL database. I originally planned to use PowerPivot to find relationships in the data, but it seems like the formatting of the tables doesn't lend itself well to that. Other questions on Stack Overflow have told me that Sonar's tables are unformatted, and that I should instead use the Web Service API to get the information. I'm unfamiliar with the API and was unsuccessful in trying to do what I wanted by looking at Sonar's API documentation.
From an answer to another question:
http://nemo.sonarsource.org/api/timemachine?resource=org.apache.cxf:cxf&format=csv&metrics=ncloc,violations_density,comment_lines_density,public_documented_api_density,duplicated_lines_density,blocker_violations,critical_violations,major_violations,minor_violations
This looks very similar to what I'd like to have, except I'm only looking at each application once (I'm analyzing a sample of all the live applications on a grid), which means Timemachine isn't really what I'm looking for. Would it be possible to generate a similar table, except instead of the stats for a particular application per date, it showed the statistics for an application and all of its classes, etc?
If you're not familiar with the WS API, you can also create your own Sonar plugin to achieve whatever you want: it is written in Java and it will execute on every analysis you run. This way, in the code of this custom plugin, you can do whatever you want: write the metrics you need to an output file, push them into a third-party system, etc.
Just take a look at how to write a plugin (most probably you will create a Decorator). There are also concrete examples to help you get started faster.
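If you prefer to stay on the WS API side rather than write a plugin, even the endpoint quoted in the question can be scripted; a minimal sketch assuming Python's requests and csv modules and that the URL above is reachable - looping it over your 21 application keys would give you one CSV per application to feed into PowerPivot.

    import csv
    import requests

    BASE = "http://nemo.sonarsource.org/api/timemachine"
    params = {
        "resource": "org.apache.cxf:cxf",  # replace with your application key
        "format": "csv",
        "metrics": "ncloc,violations_density,comment_lines_density,"
                   "public_documented_api_density,duplicated_lines_density,"
                   "blocker_violations,critical_violations,major_violations,minor_violations",
    }

    resp = requests.get(BASE, params=params, timeout=30)
    resp.raise_for_status()

    # Save the returned CSV so PowerPivot (or pandas) can pick it up
    rows = list(csv.reader(resp.text.splitlines()))
    with open("sonar_metrics.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)
    print(f"wrote {len(rows)} rows to sonar_metrics.csv")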

Website data retrieval

A recent article has prompted me to pick up a project I have been working on for a while. I want to create a web service front end for a number of sites to allow automated completion of forms and data retrieval from the results, and from other areas of the site. I have achieved a degree of success using Selenium and custom code; however, I am looking to extend this to a stage where adding additional sites is a trivial task (perhaps one which doesn't even require a developer).
The Kapow web data server looks to achieve a lot of this; however, I am told it is quite expensive (I am currently awaiting a quote). Has anyone had experience with it, or can anyone suggest alternatives (ideally open source)?
Disclaimer: I realise the potential legality issues around automating data retrieval from 3rd party websites - this tool is designed to be used in a price comparison system and all of the websites integrated with it will be done with the express permission of the owners. Where the sites provide an API this will clearly be the favoured approach.
Thanks
I realise it's been a while since I posted this, but should anyone come across it: I have had lots of success using the WSO2 framework (particularly the Mashup Server) for this. For data-mining tasks I have also used a Java library that this wraps - webharvest - which has achieved everything I needed.
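For comparison, the Selenium-and-custom-code approach mentioned in the question can stay quite compact; a minimal sketch assuming the Python selenium bindings, with a purely illustrative URL and field names.

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    try:
        driver.get("https://example.com/quote-form")  # hypothetical site
        driver.find_element(By.NAME, "postcode").send_keys("AB1 2CD")
        driver.find_element(By.NAME, "product").send_keys("widget")
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

        # Pull the values out of the results page
        prices = [el.text for el in driver.find_elements(By.CSS_SELECTOR, ".result .price")]
        print(prices)
    finally:
        driver.quit()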

Automating WebTrends analysis

Every week I access server logs processed by WebTrends (for about 7 profiles) and copy ad clickthrough and visitor information into Excel spreadsheets. A lot of it is just accessing certain sections and finding the right title and then copying the unique visitor information.
I tried using WebTrends' built-in query tool, but it is really poorly done (it only offers a drag-and-drop system instead of a text-based one), and it has a maximum number of parameters and a maximum query length. As far as I know, the tools in WebTrends are not suited to my purpose of automating the entire web metrics gathering process.
I've gotten access to the raw server logs, but it seems redundant to parse that given that they are already being processed by WebTrends.
To me it seems very scriptable, but how would I go about doing that? Is screen-scraping an option?
I use ODBC for querying metrics and numbers out of WebTrends. We even fill a scorecard with all key performance metrics.
It's in German, but maybe the idea helps you: http://www.web-scorecard.net/
Michael
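A minimal sketch of the ODBC route described above, assuming Python's pyodbc package, a DSN configured for the WebTrends ODBC driver, and a hypothetical table name - check the driver documentation for the actual schema your profiles expose.

    import csv
    import pyodbc

    conn = pyodbc.connect("DSN=WebTrends;UID=report_user;PWD=***")
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM UniqueVisitors")  # hypothetical table

    # Dump the result set to a CSV that Excel can open directly
    with open("weekly_visitors.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])
        writer.writerows(cursor.fetchall())

    conn.close()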
Which version of WebTrends are you using? Unless this is a very old install, there should be options to schedule these reports to be emailed to you, and also to bookmark queries. Let me know which version it is and I can make some recommendations.