I want to create an automated alert for our data issues, such as null values, unequal amounts, etc. It should send an email to us. The data is stored in BigQuery. How do I create this?
It should automatically send an email if the database has null values, empty values, or unequal amounts.
e.g.:
Look!
Table invoice has 0 rows detected. You can check the XX and XY tables.
Or, for a more advanced version:
Look!
Table invoice has 0 rows detected. You can check the XX and XY tables. This is caused by XYZ.
The easiest way to handle this request is to develop a custom solution on top of App Engine or Cloud Functions, so that you can run your own queries and trigger email notifications.
This approach will not alert you in real time, but on a schedule.
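For example, here is a minimal sketch of such a scheduled check in Python (the project, dataset, table, column, email addresses, and SMTP host are hypothetical placeholders; swap in your own names and your preferred email service):

# A minimal sketch of a scheduled data check (e.g. Cloud Scheduler
# triggering a Cloud Function). Project, dataset, table, and email
# details below are hypothetical.
from google.cloud import bigquery
import smtplib
from email.message import EmailMessage

def check_invoices(request):  # HTTP entry point for a Cloud Function
    client = bigquery.Client()
    row = next(iter(client.query(
        "SELECT COUNT(*) AS n, COUNTIF(amount IS NULL) AS nulls "
        "FROM `my_project.my_dataset.invoice`"
    ).result()))

    problems = []
    if row.n == 0:
        problems.append("Table invoice has 0 rows detected.")
    if row.nulls > 0:
        problems.append(f"{row.nulls} rows have a NULL amount.")

    if problems:
        msg = EmailMessage()
        msg["Subject"] = "Look! Data check failed"
        msg["From"] = "alerts@example.com"
        msg["To"] = "team@example.com"
        msg.set_content("\n".join(problems))
        with smtplib.SMTP("smtp.example.com") as smtp:  # hypothetical SMTP host
            smtp.send_message(msg)
    return "ok"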
The other option, if your use case is related to INSERT statements and real-time analysis, is to take advantage of logging alerts.
This will trigger an alert based on INSERT events, so you need to build your own query and set up the alert accordingly.
You can start with something like this:
resource.type="bigquery_resource"
protoPayload.methodName="jobservice.insert"
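Once the filter matches the events you care about, you can create a log-based alerting policy in Cloud Monitoring on top of it and attach an email notification channel, so matching INSERT events trigger the email automatically.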
If you don't want to create a custom solution, it sounds like the Great Expectations framework could be a good fit for you.
This is a system for defining properties you expect of your data. It can be connected to BigQuery, and it can be configured to send you emails if your tests ('expectations') fail.
You would still need to host the system yourself (they have a cloud solution coming in 2023).
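As a taste of the framework, here is a minimal sketch using its classic pandas-based API (the column and values are hypothetical; against BigQuery you would instead configure a datasource through its SQLAlchemy support):

# A minimal sketch of a Great Expectations null check using the classic
# pandas API; the data here is a hypothetical stand-in for a real
# BigQuery-backed datasource.
import great_expectations as ge
import pandas as pd

df = ge.from_pandas(pd.DataFrame({"amount": [10.0, None, 12.5]}))
result = df.expect_column_values_to_not_be_null("amount")
print(result.success)  # False here; wire failures up to your email alerting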
If you want to get started fast and don't have easy access to hosting, you could also consider dbt (data build tool). dbt can connect to BigQuery and allows you to run tests (though this is not its main purpose). These tests can include checks for null data, and you can write custom tests for more complex checks.
I mention it because they already have a paid-for cloud solution, dbt Cloud, and yes, you can send email notifications if your tests fail. Depending on circumstances, this could be the fastest way to get what you want.
I need to extract some data from my client's SAP ECC (SUIM -> Users by Complex Selection Criteria, program RSUSR002).
Normally I give them a table of values, and they have to fill in some fields to extract what I need.
They have to make 63 different extractions (with different object values, for example, but inside the same transaction, as you can see in the screenshot) from their SAP system, and then send me all the extracted files.
Do you know if there is an automated way to extract that, so they don't have to make 63 extractions?
My biggest problem is that they make mistakes every time. There are a lot of fields to fill in.
Can I create a variant and send it to them? Is it possible to export my variant so they can import it without having to fill in the data 63 times?
Thank you.
When a task takes considerable effort from multiple people every year, it might be worth automating.
First you need to find out where that transaction gets its data from. If you spend some time analyzing and debugging the program behind the transaction, you will surely find which SELECTs on which database table(s) provide that data. If you are lucky, there might even be a function module for it.
Then you just need to write your own ABAP program which performs the same selections.
Now about the interesting part: How to get that data to you. There are several approaches here. The best one depends on your requirements and your technical infrastructure. Some possibilities are:
Let users run the program in foreground, use the method cl_gui_frontend_services=>gui_download to save the data to a file on the user's PC and ask them to send it to you via email
Run the program in background and save the file on the application server. Then ask your sysadmins how to get that file from their application server to you. The simplest way would be to just map a network fileserver so they all write to the same place, but there might be some organizational hurdles in the way which prevent that. (Our security people would call me crazy if I proposed to allow access to SMB shares from outside of our network, but your mileage may vary)
Have the program send the data to you directly via email. You can send emails from an SAP system using the function module SO_NEW_DOCUMENT_ATT_SEND_API1. This of course requires that the system was configured to be able to send emails (which you can do with transaction code SCOT). Again, security considerations apply. When it's PII or other confidential data, then you should not send it in an unencrypted email.
Use an RFC call to send the data to your own SAP system which aggregates the data
Use a webservice call to send the data to your own non-SAP system which aggregates the data (see the sketch after this list)
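For the webservice option, the receiving side can be quite small. A minimal sketch in Python (the endpoint path, port, and file naming are hypothetical; the ABAP side would POST each extract as the request body):

# A minimal sketch of a non-SAP receiver for the webservice option.
# Endpoint, port, and storage layout are hypothetical placeholders.
from datetime import datetime
from flask import Flask, request

app = Flask(__name__)

@app.route("/sap-extract", methods=["POST"])
def receive_extract():
    # Store each posted extract under a timestamped file name.
    name = datetime.now().strftime("extract_%Y%m%d_%H%M%S.csv")
    with open(name, "wb") as f:
        f.write(request.get_data())
    return "stored", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)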
You can create a recording in transaction SM35.
There you enter a transaction code (SUIM), start recording, make some input in transaction SUIM, and then press 'Execute'. Then you can go back to the recording (F3 multiple times) and the system will generate a table of commands (the structure is BDCDATA). You can delete the unnecessary parts (e.g. a BACK button click) and save it to use as a 'macro'. Then you can replay this recording and it will do exactly what you did.
It's also possible to export/import the recording to a text file, so you can explore its structure, write a small script (VBA, for example; see the Python sketch below) to create such a recording from your parameters, and send it to your users. But keep in mind that blanks are meaningful.
It's a standard tool, so no coding in the system is required.
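If you go the export route, a small script can stamp out the 63 variations from a single exported recording. A minimal sketch in Python (the template file name, the {OBJECT_VALUE} placeholder, and the values are hypothetical; you would export one real recording first, mark the varying values, and preserve blanks exactly):

# A minimal sketch: generate the recording files from one exported SM35
# recording used as a template. Take the real layout from an actual
# export; blanks are meaningful, so preserve them exactly.
from pathlib import Path

template = Path("template.txt").read_text()

# One dict per extraction; 63 entries in the real case.
extractions = [
    {"OBJECT_VALUE": "S_TCODE"},
    {"OBJECT_VALUE": "S_USER_GRP"},
    # ... 61 more
]

for i, params in enumerate(extractions, start=1):
    Path(f"recording_{i:02d}.txt").write_text(template.format(**params))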
You can save the selection as a variant.
Fill in the selection criteria and press Save.
It can be reused.
You can also transport variants if they have a special name.
I have been assigned the following task regarding a Mule flow application currently in production:
Store the client IP which is using the web service
Implement a control that limits each IP to ten requests to the website per day
I have a background in core Java and SQL, but none at all with Mule. Everyone I can ask is in the same situation.
Once I got the app package (the one that is currently in production) up and running, I stopped it and added the following elements to the flow:
In a subflow with some initial tasks, I added a database element to store the IP of the computer which is using the web service (user_request is a table I just created in the DB which stores the IP and date of connection):
insert into user_request values
(#[MULE_REMOTE_CLIENT_ADDRESS], #[function:datestamp:dd-MM-yy])
To query the website, a database element performs a select query that provides a choice router with some inputs. Depending on the value of those inputs, the request is made to the website or not:
Database (Select) --> Choice --> Ask the website or not, depending on the select output
So I have added to the database element that performs the select an additional output: a count over the user_request table for the current IP and current day, so it can provide the choice with the original inputs as usual plus this extra one (I am copying only the subquery I added):
SELECT COUNT(*) as TRIES FROM USER_REQUEST
WHERE IP_ADDRESS=#[MULE_REMOTE_CLIENT_ADDRESS]
AND REQUEST_DATE=#[function:datestamp:dd-MM-yy]
In the choice, I have added this condition to the path that finally asks the website:
#[payload.get(0).TRIES < 10]
Having reached this point, the app runs and gives no errors, but I don't know how to test it. Where does the flow start? How can I test it as if I were the user?
Additionally, if you see anything wrong in the syntax I used above, I would appreciate it if you told me.
Thanks in advance!!!
MUnit will require you to learn the basics of this process first, but it is the primary testing tool for Mule. With it, you create a test suite that executes your flows and verifies that, given known inputs, the correct processing occurs in a repeatable manner. In a test you can mock critical calls, such as the write to your DB, so that the flow logic is exercised without actually modifying your DB table. Likewise, on reads from the DB you can either actually make a call to get known data, or return mocked test data to exercise all paths in the flow.
I need your ideas on how to develop an automated work task.
I want to create an automated work task by pulling data from SQL. I currently use a website (XXX) to submit work tasks, and I need to pull the data from SQL, so I take the data from SQL and manually enter it into the website to submit the work task. My idea is to make this one step: whenever I pull the data, it automatically sends the data to the website and submits the work task. Can anyone help me do that, or is it impossible? - Noobiest SQL
Use a console application. From there you can extract the data, format it in any way you want, and even automate the upload of that information using the .NET libraries.
Then with the Windows Task Scheduler you can tell it to run as often as you need.
For example, I have an application that reads a database, gets the info, and then executes a number of tasks using it. It's scheduled to run every 5 minutes.
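If .NET isn't a hard requirement, the same pattern is just as small in Python. A minimal sketch (the connection string, query, URL, and field names are hypothetical placeholders; check the website's actual submission API):

# A minimal sketch: pull rows from SQL Server and POST them to the
# website. All names below are hypothetical placeholders.
import pyodbc
import requests

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=mydb;UID=user;PWD=secret"
)
cursor = conn.cursor()
cursor.execute("SELECT task_id, description FROM pending_tasks")

for task_id, description in cursor.fetchall():
    # Assumes the site accepts a JSON POST; adjust to its real API.
    response = requests.post(
        "https://example.com/worktask/submit",
        json={"id": task_id, "description": description},
        timeout=30,
    )
    response.raise_for_status()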
I'm trying to run a number of applications with known failure rates through Sonar, in the hope of deciding which metrics are most valuable in determining whether a particular application will fail. Ultimately I'll be making some sort of algorithm that looks at the outputs of whatever metrics I'm using and generates a score from 1 to 100. I've put about 21 applications through Sonar, and the results have been stored in a MySQL database. I originally planned to use PowerPivot to find relationships in the data, but it seems like the formatting of the tables doesn't lend itself well to that. Other questions on Stack Overflow have told me that Sonar's tables are unformatted and that I should instead use the web service API to get the information. I'm unfamiliar with APIs and was unsuccessful in trying to do what I wanted by looking at Sonar's API documentation.
From an answer to another question:
http://nemo.sonarsource.org/api/timemachine?resource=org.apache.cxf:cxf&format=csv&metrics=ncloc,violations_density,comment_lines_density,public_documented_api_density,duplicated_lines_density,blocker_violations,critical_violations,major_violations,minor_violations
This looks very similar to what I'd like to have, except I'm only looking at each application once (I'm analyzing a sample of all the live applications on a grid), which means TimeMachine isn't really what I'm looking for. Would it be possible to generate a similar table, except instead of the stats for a particular application per date, it showed the statistics for an application and all of its classes, etc.?
If you're not familiar with the WS API, you can also create your own Sonar plugin to achieve whatever you want: it is written in Java and it will execute on every analysis you run. This way, in the code of this custom plugin, you can do whatever you want: write the metrics you need to an output file, push them into a third-party system, etc.
Just take a look at how to write a plugin (most probably you will create a Decorator). There are also concrete examples to get you started faster.
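If you do want to try the WS API route after all, here is a minimal sketch in Python (this assumes the legacy /api/resources endpoint from the same API generation as the timemachine URL above; availability and parameters depend on your Sonar version, and the server URL and project key are hypothetical):

# A minimal sketch: pull current metrics for a project and all of its
# child resources (packages, classes) in one call. Server URL, project
# key, and metric list are placeholders.
import requests

BASE = "http://localhost:9000"
params = {
    "resource": "org.apache.cxf:cxf",
    "metrics": "ncloc,violations_density,duplicated_lines_density",
    "depth": -1,  # -1 recurses into child resources
    "format": "json",
}
resources = requests.get(f"{BASE}/api/resources", params=params, timeout=30).json()
for r in resources:
    metrics = {m["key"]: m.get("val") for m in r.get("msr", [])}
    print(r["key"], metrics)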
I need a simple tool to visualize the status of a series of processes (ETL processes, but that shouldn't matter). This process monitor needs to be customizable with color coding for different status codes. The plan is to place the monitor on a big screen in the office, making any faults instantly visible to everyone.
Today I can check the status of these processes by running an SQL statement against the underlying tables in our Oracle database. The output of these queries is the above-mentioned status codes for each process. I'm imagining using these SQL statements, run periodically (say, every minute or so), as the input to this monitor.
I've considered writing a simple web interface for doing this, but I'm thinking something like this should exist out there already. Anyone have any suggestions?
If you're just displaying on one workstation, another option is SQL Developer custom reports. You would still have to fire up SQL Developer and start the report, but custom reports have a setting so they can be refreshed at a specified interval (5-120 seconds). Depending on the 'richness' of the output you want, you can either:
Create a simple table report (style = Table) and paste in one of the queries you already use as a starting point.
Create a PL/SQL block that outputs HTML via DBMS_OUTPUT.PUT_LINE statements (style = plsql-dbms_output) and get as creative as you like with formatting, colors, etc. using HTML tags in the output. I have used this to create bar graphs showing the progress of v$Long_Operations. A full description and screenshots are available here: Creating a User Defined HTML Report in SQL Developer.
If you just want to get some output up quickly, you can forego SQL Developer, schedule a process that uses your PL/SQL block to write HTML output to a file, and use a browser to display the generated output on your big screen. Alternatively, make the file available via a web server so others in your office can bring it up. Periodically regenerate the file, and make sure to add a refresh meta tag to the page so browsers will periodically reload it.
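If PL/SQL isn't your thing, the same file-generating idea is a short Python script (the connection details, table, columns, and status codes are hypothetical placeholders for your environment):

# A minimal sketch: run the status query, write a color-coded HTML page,
# and let the browser on the big screen reload it every minute.
import oracledb

COLORS = {"OK": "#2e7d32", "WARN": "#f9a825", "FAIL": "#c62828"}

conn = oracledb.connect(user="monitor", password="secret", dsn="dbhost/service")
cursor = conn.cursor()
cursor.execute("SELECT process_name, status FROM etl_process_status")

rows = "".join(
    f'<tr><td>{name}</td>'
    f'<td style="background:{COLORS.get(status, "#9e9e9e")}">{status}</td></tr>'
    for name, status in cursor
)

with open("status.html", "w") as f:
    f.write(f'<html><head><meta http-equiv="refresh" content="60"></head>'
            f'<body><table>{rows}</table></body></html>')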
Oracle Application Express is probably the best tool for this.
I would say roll your own dashboard. It depends on your skillset, but I'd do a basic web app in Java (Spring or some MVC framework; I'm not a web developer, but I know enough to create a basic functional dashboard). Since you already know the SQL needed, it shouldn't be difficult to put together, and you can modify it as needed in the future. Just keep it simple (you don't need middleware, single sign-on, or fancy views/charts).