I've connected BigQuery tables to Data Studio, created a report, and embedded it in my site through an iframe.
I would like the option to refresh the data in my report when I need it, rather than on an automatic interval or by going to my Data Studio account and doing it manually.
The use case is:
I perform some actions;
the data in the BigQuery tables changes;
I send a request to the Data Studio API (or take some other action) and the data updates in my report.
My questions:
Does such a possibility exist?
Could I add a control to the report that acts like the Refresh button in the Data Studio account?
I can see the data freshness info badge in the bottom right corner on the Data Studio site. Could I at least show it on my embedded report on my site?
(screenshot: data freshness badge)
Thanks in advance.
To refresh the data in the report, press CTRL+SHIFT+E.
I am quite sure there is no way for a change in a BigQuery table to trigger a refresh in Data Studio. Data Studio queries the data at the request of its user and even caches the results.
The API is for searching reports and permissions. Its main goal is to let admins see what people are doing and limit access to reports. https://developers.google.com/datastudio/api/reference
To some extent a refresh can be done by a custom viz. Here is an example in which a refresh interval of 30 seconds is set, but it can also be turned off in edit mode.
https://datastudio.google.com/reporting/da9b1d78-02ff-48d5-8cb5-3086a3853319/page/CbSWC
You could build your own viz that does the refresh and define the trigger yourself, e.g. the viz checks a URL every 30 seconds, and if the URL's content changes, the viz refreshes the report in Data Studio.
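A real implementation of that trigger would live in the community viz's JavaScript; purely as an illustration of the change-detection idea described above, here is a sketch in Python (the polled URL and interval are placeholders):

```python
import hashlib
import time
import urllib.request

def content_fingerprint(data: bytes) -> str:
    """Hash fetched content so changes can be detected cheaply."""
    return hashlib.sha256(data).hexdigest()

def poll_for_change(url: str, interval_sec: int = 30):
    """Poll `url` and yield whenever its content changes (sketch only).

    In the viz, the yield point is where you would trigger the
    report refresh in Data Studio.
    """
    last = None
    while True:
        with urllib.request.urlopen(url) as resp:
            current = content_fingerprint(resp.read())
        if last is not None and current != last:
            yield current
        last = current
        time.sleep(interval_sec)
```

The fingerprint comparison is the whole trick: as long as the URL's content is stable, nothing happens; the first differing response triggers the refresh.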
In an on-prem SQL Server I have the option to set up scheduled jobs with SQL Server Agent. This feature is not present in Azure SQL Database. Is there any way to do this easily in Azure, or will I have to rely on automation scripts / PowerShell scripting for this?
The task I want to accomplish is to export a bunch of SQL views to CSV and send them to a remote FTP server.
You can achieve this in Azure through Logic Apps. Please check the steps below.
1. Go to the Azure Portal (http://portal.azure.com/) and search for Logic Apps.
2. Click Add, fill in the details (Logic App name, subscription, resource group, location) and click Create.
3. After refreshing the page, click the created Logic App. On its home page, choose Blank Logic App.
4. In the Logic Apps designer, search for Schedule or Recurrence and click it.
5. Fill in the interval, frequency, time zone, start time (the format matters, e.g. 2018-10-16T21:00:00Z), "at these hours", "at these minutes", and check the preview.
6. Choose an action, search for SQL Server, and click Execute Stored Procedure in the list.
7. Click Add new connection, then Manually enter connection information, if you want to create a new connection; otherwise use one of the existing connections. Fill in the procedure name with the required stored procedure; below it are the input parameters of the SP you selected.
8. Choose an action, search for Create CSV Table, and click it. Fill in From (choose the dynamic result set from the right side; choose the first result set alone), Include Headers (Yes), Columns (Automatic).
9. Choose an action, search for Office 365 Outlook, then search for Send an Email and click it. Before proceeding, check the mail id at the bottom and change it to yours. Fill in To (zzzzzz@xxxxxx.com;zzzzz@xxxxx.com), Subject (Demo mail), Body (Please check the test attachment), From, CC & BCC (the email ids you want to send to), Importance (Normal), Is HTML (No), Attachment Name (choose Expression on the right-hand side and type concat('Test_mail',utcnow('dd-MM-yyyy'),'.csv')), Attachment Content (choose Output on the right-hand side).
10. Finally, click Save at the top left. Click Designer to edit; after editing, save again. Click Run for a demo run.
11. Click Run to initiate the trigger. Only then will the automated mails arrive at the mentioned intervals.
12. To delete, select the required Logic App and click Delete (it will ask for the Logic App name to confirm).
You need access to make the above changes, and you have to sign in to the Azure portal.
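The attachment-name expression concat('Test_mail',utcnow('dd-MM-yyyy'),'.csv') used above builds a dated file name. To show what that expression produces, here is a Python equivalent (the prefix and date format are taken directly from the expression):

```python
from datetime import datetime, timezone

def attachment_name(now=None):
    """Mirror the Logic Apps expression
    concat('Test_mail', utcnow('dd-MM-yyyy'), '.csv').
    dd-MM-yyyy corresponds to strftime's %d-%m-%Y."""
    now = now or datetime.now(timezone.utc)
    return "Test_mail" + now.strftime("%d-%m-%Y") + ".csv"
```

For the start-time example above (2018-10-16T21:00:00Z), this yields Test_mail16-10-2018.csv, so each day's run produces a distinctly named attachment.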
One option is to use Azure Data Factory and create a copy activity that uses Azure SQL Database as a source and SFTP as a sink. A copy activity can copy data from any supported data store to an SFTP server located on-premises or in the cloud, and execution can be scheduled in Azure Data Factory as shown here.
Another option is Azure Logic Apps with the Azure SQL Database connector and the FTP connector to access/manage the SQL database and FTP server. You can create, schedule, and run recurring tasks with Azure Logic Apps as shown here.
In Qlik Sense, I have a task to allow users to enter comments next to each row in a table. The source is currently an Excel file, but if I can figure this out it will be a database. The goal is to have the users' comments written back to the source.
I am currently stuck on how to let the user enter comments in the existing table in Qlik Sense.
Any pointers?
Qlik does not have this capability. Qlik (and other similar tools) don't write data back; they consume data.
That said, I had a similar requirement a few years ago: users should be able to write comments based on the data displayed (i.e. based on the current selections). My solution was to write a QV extension that sends the selections and the user's input to a web server (Node.js in my case); the web server stored the data in a text file, which was then consumed back by Qlik. With this approach the web server can do whatever you want with the data: write files, write to a database, trigger actions, etc.
But again, there is no way to write back to the database from the UI with the default tools.
Repo with the QV extension and the server - https://github.com/countnazgul/qv-add-comment
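The linked repo implements the server in Node.js; purely as a sketch of the storage side of that idea ("append a comment row to a text file that Qlik then reloads"), here is the equivalent in Python, with a hypothetical file layout:

```python
import csv
import json
from datetime import datetime, timezone
from pathlib import Path

COMMENT_FILE = Path("comments.csv")  # hypothetical path; the Qlik load script would read this

def append_comment(user, selections, comment, path=COMMENT_FILE):
    """Append one comment row to the store file.

    Selections are stored as JSON so the key they represent
    can be rebuilt when Qlik reloads the file.
    """
    is_new = not path.exists()
    with path.open("a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "user", "selections", "comment"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            user,
            json.dumps(selections, sort_keys=True),
            comment,
        ])
```

The web framework around this (Flask, Node.js, anything) only needs to expose one POST endpoint that calls this function; the interesting part is that Qlik's reload closes the loop by consuming the file back.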
I wanted to know whether it is possible to refresh the data of a report when it is opened, in order to always see updated data in the report. To explain: if I publish a report in a workspace (other users have access to this workspace), I want the report to be updated every time they open it.
Thanks for your help.
If you are importing data, I'm not sure whether this is possible.
However, if you set up DirectQuery, your data is pulled as you interact with the report.
OK, this is my question, and it's SSRS 2005 and SQL Server 2005/2008.
I was tasked with rebuilding a dozen or so reports that our users run on their data systems. We just build them, and since every DB instance is schematically the same for all our clients, we push the reports out to their report servers for use.
I modified a great many reports, but the changes have blown away the clients' subscriptions. So every user of these reports (and that can be a great many, since everyone can have their own set of parameters) has to run the reports manually or redo their subscriptions.
My company would very much like to avoid that, but I cannot figure out how to change a report and keep the subscription intact, even when the new report takes the same parameter set as the old one.
Even when I copy the report down to their report server and replace the old with the new, using the same name, the subscription is still there, but it gets modified.
I am looking either for a way to push down a subscription as part of the report, so that users need only minimal input to tailor it to their needs
--OR ideally--
a way to upload a new version of the report to their report server and have the subscription simply apply to the newest report I have put there.
It doesn't really matter which, but the second is best, since individual users run the reports with individual names as a parameter.
Many thanks in advance to anyone who can point me to a way to manage our subscriptions on our side, or enable my reports to assume the subscriptions of same-named reports on their server.
--edit--
I want to put a clearer picture out there.
I have a master copy of a report. The users use the report on their own systems.
I make some heavy modifications to the master copy of the report and upload it to their systems, using the same name and the same parameter set as the original report.
I want the subscriptions on their report server to find this report by the same name.
So XXX.report has a subscription. I change XXX.report locally and upload it to their servers, but the subscriptions are not syncing.
Thanks
I'm not sure how you're accessing SSRS, but you can use the following web service methods to download and upload report subscriptions:
ListSubscriptions
GetSubscriptionProperties
GetDataDrivenSubscriptionProperties
DeleteSubscription
CreateDataDrivenSubscription
CreateSubscription
Using those methods, you call ListSubscriptions to get a report's subscriptions. The Subscription.IsDataDriven property tells you whether a subscription is data driven. For data-driven subscriptions, use GetDataDrivenSubscriptionProperties to get the subscription properties; otherwise use GetSubscriptionProperties. All of those classes are XML-serializable, so you can save them to disk out of the box using the XmlSerializer.
To re-add the subscriptions, use DeleteSubscription to delete the subscriptions one by one, and then CreateSubscription or CreateDataDrivenSubscription to re-create them.
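The actual calls are SOAP web service methods (typically consumed from .NET); the dispatch logic just described, i.e. choosing the right fetch and re-create method per subscription, can be sketched like this in Python, with hypothetical dict records standing in for the service's return values:

```python
def properties_method(subscription):
    """Pick the fetch call: data-driven subscriptions use
    GetDataDrivenSubscriptionProperties, ordinary ones use
    GetSubscriptionProperties."""
    if subscription.get("IsDataDriven"):
        return "GetDataDrivenSubscriptionProperties"
    return "GetSubscriptionProperties"

def create_method(subscription):
    """Pick the matching re-create call for the restore pass."""
    if subscription.get("IsDataDriven"):
        return "CreateDataDrivenSubscription"
    return "CreateSubscription"

def backup_plan(subscriptions):
    """For each record from ListSubscriptions, note which methods the
    backup/restore pass would call (sketch only; no server involved)."""
    return [
        {
            "id": s["SubscriptionID"],
            "fetch_with": properties_method(s),
            "recreate_with": create_method(s),
        }
        for s in subscriptions
    ]
```

A real backup pass would serialize the fetched property objects to disk between the fetch and re-create steps, exactly as the answer describes with the XmlSerializer.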
This is a hack, though, because you should be able to modify reports without breaking the subscriptions. You should do the following to help diagnose the issue:
Set SSRS logging to verbose for all components.
Use the ClickOnce Report Builder to change the title of a report and see if the subscriptions break when you click save.
Have SQL Profiler running against the ReportServer database to see what SQL is generated when the subscriptions break.
There's a tool called Reporting Services Scripter from Jasper Smith. I think it should work for you.
What I wound up doing was going back and changing all the input parameters, including the SQL content for the drop-downs, back to what they originally were, then adjusting the SQL for the report to accept the new (old) parameters.
Thanks for your input. I accepted the web services answer, as that is a path I will have to explore for our next update.
Can MS report data be redirected?
I have an MS report control, and in some cases I want the data that would go to the report control's UI to go to a file instead of being viewed.
I don't see where that can be done, but I want to ask if anyone knows a way.
I tried looking at the report's data source, but I don't see where you can get the data back.
As the comment suggests, you can programmatically render the report directly to one of Reporting Services' export formats using the web service. There's an example here which renders out a PDF stream.
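If calling the web service is more than you need, SSRS also supports URL access, where rs:Format selects an export format instead of the interactive viewer. Here is a small Python sketch that builds such a request URL (the server and report names are placeholders, and the exact rs: parameters should be checked against your SSRS version's documentation):

```python
from urllib.parse import quote

def build_export_url(server, report_path, fmt="PDF", params=None):
    """Build an SSRS URL-access request that renders a report to a
    file format (e.g. rs:Format=PDF) instead of the viewer.
    Report parameters are appended as plain name=value pairs."""
    parts = ["rs:Command=Render", "rs:Format=" + fmt]
    if params:
        parts += [quote(str(k)) + "=" + quote(str(v)) for k, v in params.items()]
    return "http://" + server + "/ReportServer?" + quote(report_path) + "&" + "&".join(parts)
```

Fetching the resulting URL (with credentials that SSRS accepts) returns the rendered bytes, which you can then write straight to a file, which matches the "redirect the data to a file" goal in the question.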