Automate ALM QC's data extraction with an API & Python - testing

I want to automate the extraction of my data relating to the tests carried out in ALM Quality Center, and I was wondering whether this is possible via an API?
In particular, I want to retrieve, in Excel format, data such as the number of passed tests, the number of failed tests, and so on.
Thank you in advance for your response!
I'm new to the tool, so for now I'm just reading the documentation.
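For reference, ALM does expose a REST API that can be scripted from Python. Below is a rough sketch of the kind of extraction described; the server URL, domain, project, and field names are placeholders, and the endpoint paths and JSON layout should be verified against your ALM version's REST API reference.

```python
# Rough sketch, not production code: authenticate against ALM's REST API,
# pull test-instance statuses, and write a pass/fail summary to Excel.
import requests
import pandas as pd

BASE = "https://alm.example.com/qcbin"      # hypothetical server URL
DOMAIN, PROJECT = "DEFAULT", "MY_PROJECT"   # placeholders

session = requests.Session()
# Older ALM versions authenticate with Basic auth against this endpoint;
# newer ones use a sign-in resource instead - check your version's docs.
session.post(f"{BASE}/authentication-point/authenticate",
             auth=("user", "password"))
session.post(f"{BASE}/rest/site-session")   # needed on recent versions

resp = session.get(
    f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/test-instances",
    headers={"Accept": "application/json"},
    params={"page-size": 2000},
)
resp.raise_for_status()

# Flatten ALM's entity/field JSON layout into plain dicts (verify the
# exact shape returned by your server).
rows = []
for entity in resp.json().get("entities", []):
    fields = {f["Name"]: (f["values"][0].get("value") if f["values"] else None)
              for f in entity["Fields"]}
    rows.append(fields)

df = pd.DataFrame(rows)
df["status"].value_counts().to_excel("alm_test_summary.xlsx")  # Passed/Failed counts
```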

Related

Automating web page population

I have data in a csv file & want to do the following with it:
Log into web site
Populate field of the page with the csv data
Navigate to next page
Input the rest of data
Click submit
Repeat for next line
I can do this using UiPath, but it's an expensive option for a relatively simple use case.
Does anyone have any suggestions on how to do this using a different method?
Thanks,
EddieT
If you're looking for alternatives, then you probably want to investigate APIs or webhooks. But that all depends on the access rights you have for that particular website.
Try messaging the developers of the website you need, as they might already have this service available.
UiPath may appear expensive, but if you calculate the amount of time saved on this one process, you will see the money savings too.
If you can find a couple of other processes you want to automate, then I'd highly recommend it.
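If the site does expose an API, the whole log-in/fill/submit loop collapses to a few lines of Python. A minimal sketch, assuming a hypothetical REST endpoint and auth token (the URL, header, and field names depend entirely on the site's actual API):

```python
# Post each CSV row to a (hypothetical) API endpoint instead of driving
# the web UI. One POST replaces the log-in / fill / submit cycle per line.
import csv
import requests

API_URL = "https://example.com/api/records"       # hypothetical endpoint
HEADERS = {"Authorization": "Bearer YOUR_TOKEN"}  # hypothetical auth scheme

with open("input.csv", newline="") as f:
    for row in csv.DictReader(f):
        resp = requests.post(API_URL, json=row, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        print(f"Submitted row: {resp.status_code}")
```

If no API exists, a browser-automation library such as Selenium is the usual free alternative for driving the pages directly.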

Is there a way to export a query or table from BigQuery in .txt format?

I have to deliver a report in .txt format once a day by uploading it to an SFTP server. I have generated the report in BigQuery but can't find a way to export it as .txt. Is this possible?
There are quite a number of ways to accomplish this, and almost all involve some extent of coding with the clients of your choice or great GCP tools like Dataflow, etc. They all require skilled engineers at hand.
For sure, there will be a few answers covering those options; one such route is sketched below.
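For example, a minimal sketch of the coding route using the official google-cloud-bigquery client for the export and paramiko for the SFTP upload (the query, host, credentials, and paths are placeholders):

```python
# Export a BigQuery query result as a plain-text file, then upload it
# over SFTP. Schedule this script (e.g. with cron) to run once a day.
from google.cloud import bigquery
import paramiko

client = bigquery.Client()
rows = client.query(
    "SELECT name, total FROM `my_project.my_dataset.report`"  # placeholder query
).result()

with open("report.txt", "w") as out:
    for row in rows:
        out.write(f"{row['name']}\t{row['total']}\n")

# Upload the file to the SFTP server (host and credentials are placeholders).
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="password")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("report.txt", "/incoming/report.txt")
sftp.close()
transport.close()
```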
In the meantime, I want to offer a different option.
There are some third-party tools that help achieve the same without extra coding (other than the BigQuery query itself).
Below is an example of how simple it is to do with Magnus, which is part of the Potens.io suite of powerful and efficient tools for BigQuery, designed so that even a non-engineer can easily explore and automate workflows and become self-sufficient in data needs like the one in your question.
Disclosure: Google Developer Expert in Cloud here - author of BigQuery Mate and Potens.io (Magnus and Goliath) productivity tools
So, in the screenshot below you can see a workflow with just two tasks.
The first task defines the payload of your report, and the second task uploads it to the client's SFTP.
Below, you can see the flip side of the second task with more settings - zero coding!
In this particular example you do not even need to persist your report in a BQ table - the second task will just pick it up from the first task (though in real life you would most likely preserve the report, which is still easy to set up in the first task using the Destination entry).
I recommend you give it a try.

Connecting Google Analytics with Gooddata's Cloudconnect via an API

I have been given an ETL project as a task, which requires me to ingest some data gleaned from GA into GoodData via an API and perform some ETL operations. The creation of reports and dashboards is also an integral part of this assignment.
It is my first time using this platform. If there's any way, method, or procedure that you can recommend for doing this, that would be great.
Thanks
As you are new to GoodData, the basic tutorial may help you understand basic concepts - https://help.gooddata.com/display/doc/GoodData+Developer+Tutorial
Specifically for data loads, there are several ways to load data into the GoodData platform.
You can look at https://help.gooddata.com/display/doc/Data+Loading as a starting point.
As you specifically mention an API, I recommend this part of the documentation - https://help.gooddata.com/display/doc/Loading+Data+via+REST+API
In case you are interested in a component ready for direct communication with GA, you can use the "Google Analytics Reader" component https://help.gooddata.com/cloudconnect/manual/gareader.html within CloudConnect Designer (https://help.gooddata.com/display/doc/CloudConnect+Designer).
If you intend to transform data prior to loading it into the GoodData platform, CloudConnect can be utilised, or (depending on your GoodData environment) the Agile Datawarehousing Service (https://help.gooddata.com/pages/viewpage.action?pageId=34341138) tied to Automated Data Distribution (https://help.gooddata.com/display/doc/Automated+Data+Distribution) may be an option for you.
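Regarding the REST API route mentioned above, here is a rough illustration of the pattern: authenticate, then trigger a load. The endpoint paths and payload shapes are written from memory and may differ by platform version, so treat them as assumptions and verify everything against the linked documentation.

```python
# Hypothetical sketch of the GoodData REST pattern: log in, then trigger
# a data load from a previously uploaded staging directory.
import requests

HOST = "https://secure.gooddata.com"   # your GoodData endpoint
PROJECT_ID = "your_project_id"         # placeholder

session = requests.Session()

# 1. Log in; the platform returns authentication cookies on the session.
session.post(f"{HOST}/gdc/account/login", json={
    "postUserLogin": {"login": "user@example.com",
                      "password": "secret",
                      "remember": 0}
})

# 2. Trigger a load from a staging directory uploaded earlier via WebDAV
#    (the upload step is omitted here; see the documentation linked above).
resp = session.post(f"{HOST}/gdc/md/{PROJECT_ID}/etl/pull2",
                    json={"pullIntegration": "my-staging-dir"})
print(resp.status_code, resp.json())
```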
Your question is quite general, but regarding GA and using CloudConnect I recommend you to check the example in GoodData documentation:
https://help.gooddata.com/display/doc/CloudConnect+Example+Projects#CloudConnectExampleProjects-GoogleAnalyticsExampleProject

Automating Sequence of Manual Steps

I have a sequence of steps that a user performs, e.g. logging on to a remote UNIX shell, creating files/directories, changing permissions, running remote shell scripts and commands, deleting files, moving files,
and running DB queries and, based on the query results, performing certain tasks such as exporting the results to a file or running further shell commands/scripts or DB insert statements, etc.
By performing these steps, the user achieves various processing and data-validation outcomes.
What is the best way to automate the above scenario? Should we go for a workflow tool like Activiti, etc., or is there a better framework/way to achieve the requirements?
My requirement is to work with open source, and possibly Java-based, tools.
I am completely new to this, so any help/pointers would be appreciated.
The scenario you describe is certainly possible with a workflow tool like Activiti. Apache Camel or Spring Integration would be another possibility (as all the steps you mention are automatic system tasks).
A workflow framework would be a good option if you need one of these:
you want to store history data for audit purposes: who did what, when, and how long it took;
you want to visually model your steps, perhaps to discuss them with business people;
there is a need for human interaction between some of the steps.
Your description reminds me of a software/account provisioning process.
There are a large number of provisioning tools on the market, both open source and otherwise (Dell Crowbar is one option).
However, a couple of the comments you made in your response to Joram indicate that a more general-purpose tool such as Activiti may be an option:
"Swivel Chair" tasks - User tasks that may one day be automated
Visual model of process state
Most provisioning tools don't allow for generic user tasks and don't provide a (good) visual model of the process state.
However, they generally include remote script execution, which would need to be cobbled together as a service task if you were using a BPM tool.
I would certainly expand my research to include provisioning tools, as they sound like a better fit; however, if you can't find anything that works for you, a BPM platform provides a generic framework to build what you need.
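Whichever engine you choose, the individual steps in the question mostly reduce to "run a remote command, check the result, branch". A minimal sketch of that pattern (shown in Python for brevity; in Activiti or Camel the same logic would live inside a Java service task, and the host, credentials, paths, and query are all placeholders):

```python
# Orchestrate one remote-shell step and one DB-driven branch.
import sqlite3
import paramiko

# 1. Run a remote shell step over SSH (host/credentials are placeholders).
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("unix.example.com", username="batchuser", password="secret")
_, stdout, stderr = ssh.exec_command("mkdir -p /data/in && chmod 755 /data/in")
if stdout.channel.recv_exit_status() != 0:
    raise RuntimeError(stderr.read().decode())

# 2. Run a DB query (sqlite3 stands in for your real database driver).
conn = sqlite3.connect("jobs.db")
(pending,) = conn.execute(
    "SELECT COUNT(*) FROM jobs WHERE state = 'NEW'").fetchone()

# 3. Branch on the query result: run a further remote script, or export.
if pending:
    ssh.exec_command("/opt/scripts/process_new.sh")
else:
    with open("report.txt", "w") as f:
        f.write("no pending jobs\n")

ssh.close()
conn.close()
```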

import.io stuck at Test your connector

I have created a connector using the import.io Windows application.
I am able to successfully test my connector using example queries. I want to extract the data returned by this connector into a dataset, but I am stuck at the "Test your connector" step.
Here is the screenshot:
The import.io Connector tool requires multiple queries to ensure it captures the right template. This increases the accuracy of collecting the right dataset.
It has taken me up to 5 queries before seeing "I'm done creating tests."