How does the validation process work in RIDE? What happens after attaching a script to an account?

So I made my account scripted by using a SetScript transaction to attach a script to it. But once the account is scripted, how does it check external transactions? How do those external transactions trigger it? Do I pass a reference to the script in those transactions?

After attaching a script to an account, which makes it a smart account, the script is responsible for validating every transaction sent by this account. So whenever this account sends a transaction, the validation is triggered.
To set up a smart account, the account needs to issue a SetScriptTransaction which contains the predicate. Upon success, every outgoing transaction will be validated not by the default signature-validation mechanism, but according to the predicate logic.
The account script can be changed or cleared only if the currently installed script allows the new SetScriptTransaction to be processed.
The default account has no script, which is equivalent to this script:
sigVerify(tx.bodyBytes, tx.proofs[0], tx.senderPublicKey)
SetScriptTransaction sets the script which verifies all outgoing transactions. The set script can be changed by another SetScriptTransaction call unless it’s prohibited by a previous set script.

Related

Auto pause and resume an Azure Synapse Analytics database

I want to pause the database on Saturday and Sunday, and resume it automatically on Monday morning, using a script or any other option. How can I do this?
Thanks!
There is no specific feature for this task, but it is doable using a combination of techniques. To perform the pause/restart functionality, you can use the Azure REST API.
Recurrence Scheduling
I recommend Logic Apps which has a robust recurrence trigger. You will most likely need to run it daily but you can specify the hour(s). To only continue on specific days, you'll need to add some additional processing to parse the DayOfWeek from the run time:
dayOfWeek(convertFromUtc(utcNow(), 'Eastern Standard Time'))
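That day-of-week gate can be sketched outside Logic Apps as well. Here is a minimal Python illustration of the same logic; the time zone and the set of pause days are assumptions taken from the question (pause Saturday/Sunday, resume Monday):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Assumption from the question: keep the warehouse paused on these days.
PAUSE_DAYS = {"Saturday", "Sunday"}

def local_day_name(utc_now: datetime, tz: str = "America/New_York") -> str:
    """Rough equivalent of dayOfWeek(convertFromUtc(utcNow(), 'Eastern Standard Time')),
    except it returns the day name rather than a number."""
    return utc_now.astimezone(ZoneInfo(tz)).strftime("%A")

def should_pause(utc_now: datetime) -> bool:
    """Decide whether today (in local time) is a pause day."""
    return local_day_name(utc_now) in PAUSE_DAYS
```

The conversion to local time matters: a run scheduled near midnight UTC can land on a different weekday in Eastern time.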
Get Bearer Token
In this example, I'm using a Service Principal to authenticate, and Azure Key Vault to store the relevant secrets:
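As a sketch of what that HTTP action does under the hood, here is the Azure AD client-credentials token request in Python. The tenant ID, client ID, and client secret are placeholders you would load from Key Vault:

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the client-credentials request used to obtain a Bearer token
    for the Azure management API. All arguments are placeholders."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",
    })
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return url, headers, body
```

You would POST the body to the URL; the token comes back in the response JSON as access_token and is then passed as `Authorization: Bearer <token>` on subsequent calls.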
Check the Status of the Resource
The next step is to check the status of the Data Warehouse: we only want to attempt to pause it if the status is "Online", not if it is already paused. To do this, we'll call the API again, this time passing the Bearer Token we acquired above:
In this example I'm using Variables instead of Key Vault to demonstrate different approaches.
We'll use the StatusCode property of the previous operation to make this determination:
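A rough Python equivalent of that status check; the resource identifiers and the api-version are placeholders/assumptions, and the pool's state ("Online"/"Paused") is reported in the properties.status field of the GET response:

```python
import json
import urllib.request

# Hypothetical identifiers; substitute your own.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
SERVER = "<server-name>"
DATABASE = "<dw-name>"

def status_url() -> str:
    """Azure REST endpoint for the dedicated SQL pool (formerly SQL DW)."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        "/providers/Microsoft.Sql"
        f"/servers/{SERVER}/databases/{DATABASE}"
        "?api-version=2021-11-01"
    )

def parse_status(response_body: str):
    """Extract 'Online'/'Paused' from the GET response JSON."""
    return json.loads(response_body).get("properties", {}).get("status")

def get_status(bearer_token: str):
    """Perform the GET with the Bearer token acquired earlier."""
    req = urllib.request.Request(
        status_url(), headers={"Authorization": f"Bearer {bearer_token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return parse_status(resp.read())
```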
Check if there are any running Jobs
If the Data Warehouse's Status is "Online", the next thing to check is whether or not there are any active processes. We accomplish this by running a query on the Data Warehouse itself:
We'll then capture the results in a variable and use that in another Condition activity:
body('Get_Ops_Count')?['resultsets']['Table1'][0]['OpsCount']
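The same navigation can be sketched in Python with null-safe handling; the result shape is assumed from the expression above:

```python
def ops_count(result):
    """Mirror of body('Get_Ops_Count')?['resultsets']['Table1'][0]['OpsCount'],
    returning None instead of raising when any level is missing."""
    try:
        return (result or {})["resultsets"]["Table1"][0]["OpsCount"]
    except (KeyError, IndexError, TypeError):
        return None

def safe_to_pause(result) -> bool:
    """Only pause when the query reported zero active operations."""
    return ops_count(result) == 0
```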
Issue the Pause Command
If there are no active jobs, we are free to finally issue the Pause command. Once again, we'll leverage the REST API using the Bearer Token we acquired previously:
Restarting the Data Warehouse
The Restart process is very similar, only without the need to check for active processes, so you should be able to extrapolate that from this example. To restart the Data Warehouse, the REST endpoint is "resume" instead of "pause".
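A hedged sketch of both control calls: the identifiers and api-version are placeholders, and both endpoints take an empty POST body with the Bearer token:

```python
import urllib.request

def control_url(action, subscription, resource_group, server, database) -> str:
    """Build the REST endpoint to pause or resume the dedicated SQL pool.
    All identifiers are placeholders; the api-version may need updating."""
    if action not in ("pause", "resume"):
        raise ValueError("action must be 'pause' or 'resume'")
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Sql"
        f"/servers/{server}/databases/{database}/{action}"
        "?api-version=2021-11-01"
    )

def send_control(action, bearer_token, **ids) -> int:
    """POST the pause/resume command with an empty body; returns HTTP status."""
    req = urllib.request.Request(
        control_url(action, **ids),
        method="POST",
        data=b"",
        headers={"Authorization": f"Bearer {bearer_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```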

Change RZ11 Profile parameters programmatically

I was asked by the IT Department to write an ABAP program that switches the profile parameter login/server_logon_restriction from 0 to 1 and back automatically triggered (time etc) on all of our SAP servers.
I am not very familiar with the SAP environment yet, and so far I have managed to read the desired parameter by using:
RSAN_SYSTEM_PARAMETERS_GET
and
RSAN_SYSTEM_PARAMETERS_READ
But I cannot find anything to change and save them dynamically. Is this even possible?
Cheers, Nils
The login/server_logon_restriction parameter is dynamic, so you can change it via the cl_spfl_profile_parameter=>change_value class method.
You can configure a background job in transaction SM36 to trigger it automatically.
The method doesn't save the given value to the profile file, so the parameter reverts to the profile value after a system restart.
Currently logged-in users can continue to use the system. You may want to inform them with the TH_POPUP function and then kick them from the system.

How to keep a properties file outside the Mule code in MuleSoft

I have defined a dev.properties file for the Mule flow, where I pass the username and password required to run the flow. This password gets updated every month, so every month I have to deploy the code to the server after changing the password. Is there a way to keep the properties file outside the code, on the Mule server path, and change it when required, in order to avoid redeployment?
One more idea is to completely discard any usage of a file to pickup the username and password.
Instead, try using a credentials-providing service, like an HTTP requester which collects the username and password from an independent API (child API / providing service).
Store it in a cache object-store of your parent API (the calling API). Keep using those values, unless the flow using them fails or if the client needs to expire them after a month. Later simply refresh them.
You can trigger your credentials providing service using a scheduler with a Cron expression having Monthly Triggers.
No, because even if the properties file is outside the application, properties are loaded on application deployment. So you would need to restart the application anyway to pick up the new values.
Instead you can create a custom module that reads the properties from somewhere (a file, some service, etc.), assigns the value to a variable, and then use the variable at execution time. Note that some configurations may only be set at deployment time, and will not evaluate variables at execution time.
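A language-agnostic sketch of that pattern, using Python for illustration (in Mule this would typically be a small custom module or Java class): credentials are re-read from a file outside the application after a TTL expires, so a password change in the file needs no redeployment. The file path and JSON shape are assumptions:

```python
import json
import time
from pathlib import Path

class CredentialCache:
    """Reads credentials from a file outside the app, re-reading after a TTL."""

    def __init__(self, path, ttl_seconds: float = 12 * 3600):
        self.path = Path(path)
        self.ttl = ttl_seconds
        self._value = None
        self._loaded_at = 0.0

    def get(self) -> dict:
        """Return cached credentials, reloading from disk once the TTL passes."""
        now = time.monotonic()
        if self._value is None or now - self._loaded_at > self.ttl:
            self._value = json.loads(self.path.read_text())
            self._loaded_at = now
        return self._value
```

The flow asks the cache for credentials on every execution; only the cache reload, not a redeployment, picks up the monthly password change.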
If the credentials are not exposing your application security or data, then you can move them to another config file (placed outside the Mule app path). Generate a RAML file which will read and reload the credentials after application deploy/start-up, and store them in a cache with a timeToLive of around 12 hours.
The next time you have to change the username/password, change it in the file directly, and the cache will refresh it automatically after the expiry time.
Actually no, because all the secure properties need to be there at runtime, and if they are not there, your application will fail.
There is one way, but it's not the best one: instead of editing code, you can directly edit the secure property (i.e., username and password in your case) in the CloudHub Runtime Manager properties tab.
After editing, just apply the changes; the API will then restart automatically and deploy successfully.

When testing end-to-end integration using TestCafe, how to manage waiting for email receipt

With the system under test, when adding a new user, their initial password is emailed to them.
I could split my test into multiple sections with manual intervention but this is less than ideal.
Appreciate any suggestions on how to proceed using TestCafe as I am sure others have encountered this as well.
If you run a full integration test with a real email server, then you can use libraries like "mail-receive" to connect to this server and verify the email.
You can also run your backend/server logic in mock mode, and then verify on the mock that the send event happened, by calling some test-specific REST endpoint from your TestCafe test.
Alternatively, you could use something like "smtp-receiver" to start your own email-server mock in a Node.js context and receive an event upon email arrival. However, you will need to configure your app server/backend to point to this mocked email server.
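If you take the mock-server route, the idea can be sketched with the Python standard library (your real harness would live in Node alongside TestCafe, e.g. with "smtp-receiver"): a tiny SMTP listener accepts one session, records the message body, and the test asserts the password email arrived. All names here are illustrative:

```python
import smtplib
import socket
import threading

def run_mock_smtp(server_socket, inbox):
    """Accept one SMTP session and append the received message body to inbox."""
    conn, _ = server_socket.accept()
    f = conn.makefile("rb")
    conn.sendall(b"220 mock ESMTP\r\n")
    data_lines, in_data = [], False
    for raw in f:
        line = raw.decode("utf-8", "replace").rstrip("\r\n")
        if in_data:
            if line == ".":                      # end-of-message marker
                in_data = False
                inbox.append("\n".join(data_lines))
                conn.sendall(b"250 OK\r\n")
            else:
                data_lines.append(line)
            continue
        verb = line.split(" ", 1)[0].upper()
        if verb in ("HELO", "EHLO"):
            conn.sendall(b"250 mock\r\n")
        elif verb == "DATA":
            in_data = True
            conn.sendall(b"354 End with .\r\n")
        elif verb == "QUIT":
            conn.sendall(b"221 Bye\r\n")
            break
        else:                                    # MAIL FROM, RCPT TO, ...
            conn.sendall(b"250 OK\r\n")
    conn.close()

def start_mock_smtp(inbox):
    """Start the mock on an ephemeral port; returns (port, thread)."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    t = threading.Thread(target=run_mock_smtp, args=(srv, inbox), daemon=True)
    t.start()
    return srv.getsockname()[1], t
```

The application under test would be configured to send mail to 127.0.0.1 on the returned port; the test then polls or joins until the expected message shows up in the inbox list.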

Enable debug logging when inserting ApexTestQueueItem records via the SOAP API

I'm inserting several ApexTestQueueItem records into an Org via the Partner API to queue the corresponding Apex Classes for asynchronous testing. The only field I'm populating is the ApexClassId. (Steps as per Running Tests Using the API)
After the tests have run and I retrieve the corresponding ApexTestResult record(s) the ApexLogId field is always null.
For the ApexLogId field the help documents have the Description:
Points to the ApexLog for this test method execution if debug logging is enabled; otherwise, null
How do I enable debug logging for asynchronous test cases?
I've used the DebuggingHeader in the past with the runTests() method but it doesn't seem to be applicable in this case.
Update:
I've found if I add the user who owns the Salesforce session under Administration Setup > Monitoring > Debug Logs as a Monitored User the ApexLogId will be populated. I'm not sure how to do this via the Partner API or if it is the correct way to enable logging for asynchronous test cases.
You've got it right. That's the intended way to get a log.
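For reference, the programmatic counterpart of adding a Monitored User is creating a TraceFlag record via the Tooling API (not the Partner API). Below is a hedged Python sketch of just the request body; the record IDs are placeholders from your org, and you would POST the JSON to the Tooling API's TraceFlag endpoint with your session ID as a Bearer token:

```python
import json
from datetime import datetime, timedelta, timezone

def build_trace_flag(traced_user_id: str, debug_level_id: str, hours: int = 4) -> str:
    """JSON body for a Tooling API TraceFlag, which enables debug logging for
    the given user. Both IDs are placeholders: a User ID (005...) and the ID of
    an existing DebugLevel record."""
    now = datetime.now(timezone.utc)
    return json.dumps({
        "TracedEntityId": traced_user_id,
        "DebugLevelId": debug_level_id,
        "LogType": "USER_DEBUG",
        "StartDate": now.isoformat(timespec="seconds"),
        "ExpirationDate": (now + timedelta(hours=hours)).isoformat(timespec="seconds"),
    })
```

With an active TraceFlag for the user who enqueues the ApexTestQueueItem records, the ApexLogId on the ApexTestResult records should be populated.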