I want to pause the database on Saturday and Sunday and have it resume automatically on Monday morning. Is there any script or option to do this, and how?
Thanks!
There is no specific feature for this task, but it is doable using a combination of techniques. To perform the Pause/Restart functionality, you can use the Azure REST API.
Recurrence Scheduling
I recommend Logic Apps, which has a robust recurrence trigger. You will most likely need to run it daily, but you can specify the hour(s). To only continue on specific days, you'll need to add some additional processing to parse the DayOfWeek from the run time:
dayOfWeek(convertFromUtc(utcNow(), 'Eastern Standard Time'))
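For clarity, here is the same day-of-week check sketched in Python (the Logic App itself would do this in a Condition action using the expression above; the assumption that dayOfWeek() numbers Sunday as 0 through Saturday as 6 should be verified against your own runs):

from datetime import datetime
from zoneinfo import ZoneInfo

# Sketch of the check the Logic App's Condition performs with the expression above.
# Assumption: Logic Apps' dayOfWeek() returns Sunday = 0 through Saturday = 6.
now_eastern = datetime.now(ZoneInfo("America/New_York"))
day_of_week = (now_eastern.weekday() + 1) % 7  # convert Python's Monday=0 to Sunday=0
if day_of_week in (6, 0):   # Saturday or Sunday -> pause branch
    print("weekend run: pause the Data Warehouse")
elif day_of_week == 1:      # Monday -> resume branch
    print("Monday run: resume the Data Warehouse")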
Get Bearer Token
In this example, I'm using a Service Principal to authenticate, and Azure Key Vault to store the relevant secrets:
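For reference, here is a rough Python sketch of the equivalent token request (the tenant, client id, and secret are placeholders you would pull from Key Vault; the client-credentials endpoint and the management-API resource URI are assumptions about the standard Azure AD flow):

import requests

# Hypothetical placeholders - in the Logic App these would come from Azure Key Vault.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<service-principal-app-id>"
CLIENT_SECRET = "<service-principal-secret>"

def get_bearer_token() -> str:
    """Request a token for the Azure management API using the client-credentials flow."""
    url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token"
    payload = {
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://management.azure.com/",
    }
    resp = requests.post(url, data=payload)
    resp.raise_for_status()
    return resp.json()["access_token"]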
Check the Status of the Resource
The next step is to check the status of the Data Warehouse: we only want to attempt to pause it if the status is "Online", not if it is already "Paused". To do this, we'll call the API again, this time passing the Bearer Token we acquired above:
In this example I'm using Variables instead of Key Vault to demonstrate different approaches.
We'll use the StatusCode property of the previous operation to make this determination:
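Sketched outside the Logic App, the status check is a simple GET on the Data Warehouse resource (the resource-path placeholders, the api-version, and the properties.status field are assumptions based on the Azure SQL REST API; adjust to what your subscription exposes):

import requests

# Hypothetical resource identifiers - substitute your own subscription, resource group,
# logical server and Data Warehouse names.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
SERVER = "<sql-server-name>"
DATABASE = "<data-warehouse-name>"
API_VERSION = "2017-10-01-preview"  # assumed; use whichever api-version your subscription supports

BASE_URL = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Sql"
    f"/servers/{SERVER}/databases/{DATABASE}"
)

def get_dw_status(token: str) -> str:
    """Return the Data Warehouse status, e.g. 'Online' or 'Paused'."""
    resp = requests.get(
        BASE_URL,
        params={"api-version": API_VERSION},
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()["properties"]["status"]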
Check if there are any running Jobs
If the Data Warehouse's Status is "Online", the next thing to check is whether or not there are any active processes. We accomplish this by running a query on the Data Warehouse itself:
We'll then capture the results in a variable and use that in another Condition activity:
body('Get_Ops_Count')?['resultsets']['Table1'][0]['OpsCount']
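The query itself can be anything that counts active work; here is a sketch of one possibility (an assumption on my part, counting requests in the sys.dm_pdw_exec_requests DMV; the Logic App would run the equivalent query through its SQL connector and read the result with the expression above):

import pyodbc

# Hypothetical connection details for the Data Warehouse itself.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<sql-server-name>.database.windows.net;"
    "DATABASE=<data-warehouse-name>;"
    "UID=<user>;PWD=<password>"
)

def get_active_ops_count() -> int:
    """Count requests that are still executing or queued on the Data Warehouse."""
    query = (
        "SELECT COUNT(*) AS OpsCount "
        "FROM sys.dm_pdw_exec_requests "
        "WHERE status IN ('Running', 'Suspended')"
    )
    with pyodbc.connect(CONN_STR) as conn:
        return conn.cursor().execute(query).fetchone()[0]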
Issue the Pause Command
If there are no active jobs, we are free to finally issue the Pause command. Once again, we'll leverage the REST API using the Bearer Token we acquired previously:
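Reusing BASE_URL, API_VERSION, and the token from the sketches above, the pause call is a plain POST (Azure normally answers 202 Accepted and completes the pause asynchronously; treat the exact api-version as an assumption):

import requests

def pause_data_warehouse(token: str) -> None:
    """Issue the pause operation; Azure typically answers 202 Accepted and pauses asynchronously."""
    resp = requests.post(
        f"{BASE_URL}/pause",
        params={"api-version": API_VERSION},
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()

# Resuming follows the same pattern against the "resume" endpoint:
# requests.post(f"{BASE_URL}/resume", params=..., headers=...)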
Restarting the Data Warehouse
The Restart process is very similar, only without the need to check for active processes, so you should be able to extrapolate that from this example. To restart the Data Warehouse, the REST endpoint is "resume" instead of "pause".
I'm working on a piece of the project where a report needs to be generated with all the flow details (memory used, number of records processed, processes that ran successfully or failed, etc.). Most of the details are present on the Summary tab, but the requirement is to have separate reports.
Can anyone help me with a solution/steps/examples/screens/videos?
Thanks much.
Every underlying behavior of the UX/UI that Apache NiFi provides is also accessible through an API (in fact, the UI calls the API to perform each of these tasks). So you can invoke the GET /system-diagnostics API to return that information in JSON form, and then parse this data and present it in whatever form you like.
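As a rough sketch (the host, port, and the exact shape of the aggregateSnapshot fields are assumptions; inspect the payload from your own instance), pulling and parsing that JSON could look like this:

import requests

NIFI_API = "http://localhost:8080/nifi-api"  # hypothetical, unsecured NiFi instance

def get_system_diagnostics() -> dict:
    """Fetch NiFi's system diagnostics (heap, thread, repository usage) as JSON."""
    resp = requests.get(f"{NIFI_API}/system-diagnostics")
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    snapshot = get_system_diagnostics()["systemDiagnostics"]["aggregateSnapshot"]
    # Map whichever fields your report needs; usedHeap/maxHeap are just examples.
    print("Heap used:", snapshot.get("usedHeap"), "of", snapshot.get("maxHeap"))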
So I made an account scripted by using a SetScript transaction to attach a script to it, but once the account is scripted, how does it check external transactions? How do those external transactions trigger it? Do I pass a reference to the script in those transactions?
After attaching a script to an account, which makes it a smart account, the script is responsible for validating every transaction sent by this smart account. So when this account sends a transaction, the validation is triggered.
In order to set up a smart account, the account needs to issue a SetScriptTransaction which contains the predicate. Upon success, every outgoing transaction will be validated not by the default mechanism of signature validation, but according to the predicate logic.
The account script can be changed or cleared if the currently installed script allows the new SetScriptTransaction to be processed.
The default account has no script, which is equivalent to this script:
SigVerify(tx.bodyBytes, tx.proofs[0], tx.senderPk)
SetScriptTransaction sets the script which verifies all outgoing transactions. The set script can be changed by another SetScriptTransaction call unless it’s prohibited by a previous set script.
I'm trying to wrap my head around how a BPMN/CMMN model can be used in my application.
There are several CMMN User Tasks with Forms as part of my application's BPMN process.
I use Embedded User Task Forms.
The data submitted by my forms gets stored in the task variables and passed out to the parent process using an all-to-all variable mapping.
To progress with the process, the user needs to [claim task], fill out the form and then complete it (via a REST call).
After the User Task with the form is completed, it disappears from the list of available tasks in the /task REST endpoint (as well as in the Admin UI).
But what if I'd like to show users, after they have completed a task, the variables they submitted to that task before completing it?
First, I thought of using Get Tasks (Historic) (POST).
And that works in the sense that I can see the metadata about the tasks the users completed before.
But how can I see the variables and actually the HTML form that had been used at the point of task completion? That is, the data available via
/task/{id}/variables
/task/{id}/form
before the task is completed?
The response from /history/task contains neither variables nor the form key.
Trying to access the completed task by its id, like {{camunda}}/task/46386/form or {{camunda}}/task/46386/variables
results in
{
"type": "RestException",
"message": "Cannot get form for task 46386"
}
or
{
"type": "NullValueException",
"message": "task 46386 doesn't exist: task is null"
}
respectively.
I think that I am missing something basic here.
Is that simply the principle of the BPMN engine: when tasks are completed, they are considered to be gone forever, with no option to access their data later (except for basic audit log details)?
Another side-question is whether the task access permissions that were set up in the Authorizations apply to the results returned by the /history/task endpoint?
Update:
Found a way to access the historical variables: Get Variable Instances, but not the historical task form keys.
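For example (a sketch; the engine-rest base URL is a placeholder for {{camunda}} above), the historical variables of a completed task can be fetched with the taskIdIn filter:

import requests

CAMUNDA = "http://localhost:8080/engine-rest"  # hypothetical base URL ({{camunda}} above)
TASK_ID = "46386"

# History service query for variables that belonged to the completed task.
resp = requests.get(f"{CAMUNDA}/history/variable-instance", params={"taskIdIn": TASK_ID})
resp.raise_for_status()
for var in resp.json():
    print(var["name"], "=", var["value"])
# Note: this returns the variable values, but not the form key that was used.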
Found a similar question.
I am uploading multiple files using javascript.
After I upload the files, I need to run several processing functions.
Because of the processing time that is required, I need a UI on the frontend telling the user the estimated time left for the entire process.
Basically I have 3 functions:
/upload - this is an endpoint for uploading the files
/generate/metadata - this is the next endpoint that should be triggered after /upload
/process - this is the last endpoint. Should be triggered after /generate/metadata
This is basically how I expect the screen to look.
Information such as percentage remaining and time left should be displayed.
However, I am unsure whether to have the server supply the information or to do a hackish estimate solely using javascript.
I would also need to update the screen to tell the user messages such as
"currently uploading"
if I am at function 1.
"Generating metadata" if I am at function 2.
"Processing ..." if I am at function 3.
Function 2 only occurs after the successful completion of 1.
Function 3 only occurs after the successful completion of 2.
I am already using q.js promises to handle some parts of this, but the code has gotten scarily messy.
I recently came across Backbone, and it allows structured ways to handle single-page app behavior, which is what I wanted.
I have no problems with the server-side returning back json responses for success or failure of the endpoints.
I was wondering what would be a good way to implement this function using Backbone.js
You can use a "progress" file or DB entry which stores the state of the backend process. Have your backend process periodically update this file. For example, write this to the file:
{"status": "Generating metadata", "time": "3 mins left"}
After the user submits the files, have the frontend start pinging a backend progress function using a simple ajax call and setTimeout. The progress function will simply open this file, grab the JSON-formatted status info, and return it so the frontend can update the progress bar.
You'll probably want the ajax call to be attached to your model(s). Have your frontend view watch for changes to the status and update accordingly (e.g. a progress bar).
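On the server side, the progress endpoint can be very small; here is a sketch (Flask is only a stand-in for whatever backend framework you're using, and progress.json is a hypothetical path your upload/metadata/processing steps would write to):

import json
from flask import Flask, jsonify

app = Flask(__name__)
PROGRESS_FILE = "progress.json"  # hypothetical path shared with the processing code

def set_progress(status: str, time_left: str) -> None:
    """Called by the upload / metadata / processing steps to record where they are."""
    with open(PROGRESS_FILE, "w") as fh:
        json.dump({"status": status, "time": time_left}, fh)

@app.route("/progress")
def progress():
    """Polled by the frontend (ajax + setTimeout) to drive the progress bar."""
    try:
        with open(PROGRESS_FILE) as fh:
            return jsonify(json.load(fh))
    except FileNotFoundError:
        return jsonify({"status": "idle", "time": ""})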
Long Polling request:
Polling request for updating Backbone Models/Views
Basically, when you upload a file you will assign a "FileModel" to every given file. The FileModel will start a long polling request every N seconds until it gets the status "complete".
I'm inserting several ApexTestQueueItem records into an Org via the Partner API to queue the corresponding Apex Classes for asynchronous testing. The only field I'm populating is the ApexClassId. (Steps as per Running Tests Using the API)
After the tests have run and I retrieve the corresponding ApexTestResult record(s) the ApexLogId field is always null.
For the ApexLogId field the help documents have the Description:
Points to the ApexLog for this test method execution if debug logging is enabled; otherwise, null
How do I enable debug logging for asynchronous test cases?
I've used the DebuggingHeader in the past with the runTests() method but it doesn't seem to be applicable in this case.
Update:
I've found that if I add the user who owns the Salesforce session under Administration Setup > Monitoring > Debug Logs as a Monitored User, the ApexLogId will be populated. I'm not sure how to do this via the Partner API or if it is the correct way to enable logging for asynchronous test cases.
You've got it right. That's the intended way to get a log.
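If you want to script it rather than use the Setup UI, one possible approach (an assumption on my part, using the Tooling REST API rather than the Partner API) is to create a TraceFlag on the user who queues the tests; the ids below are placeholders and the referenced DebugLevel must already exist in the org:

import requests
from datetime import datetime, timedelta, timezone

# Hypothetical values - the instance URL and session id come from your Partner API login.
INSTANCE_URL = "https://yourInstance.salesforce.com"
SESSION_ID = "<session-id>"
USER_ID = "<id-of-the-user-queueing-the-tests>"
DEBUG_LEVEL_ID = "<id-of-an-existing-DebugLevel-record>"

now = datetime.now(timezone.utc)
trace_flag = {
    "TracedEntityId": USER_ID,      # the monitored user
    "DebugLevelId": DEBUG_LEVEL_ID,
    "LogType": "USER_DEBUG",        # assumed log type for user-based tracing
    "StartDate": now.isoformat(),
    "ExpirationDate": (now + timedelta(hours=1)).isoformat(),
}

resp = requests.post(
    f"{INSTANCE_URL}/services/data/v52.0/tooling/sobjects/TraceFlag",
    json=trace_flag,
    headers={"Authorization": f"Bearer {SESSION_ID}"},
)
resp.raise_for_status()
print(resp.json())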