Problem seeing data from reference input (SQL) in Azure Stream Analytics

I am trying to preview the data coming from a SQL DB that is set as a reference input for my Azure Stream Analytics job.
I am getting this error:
We cannot locate the resource for the selected input. Please make sure its subscription has been selected in the global subscription filter.
I have looked online but can't seem to find how to resolve it.
I also have data coming from an event hub, and that is working; the error occurs only when previewing the DB data.

We cannot locate the resource for the selected input. Please make sure
its subscription has been selected in the global subscription filter.
I can't reproduce your issue on my side, but based on the error above, I think you have multiple subscriptions. Make sure the subscription your SQL DB resides in has been selected in the global subscription filter.
To do this, navigate to the Subscriptions page in the portal and check whether the subscription is selected in the global subscription filter. If not, select it in the right-hand box.
Here is a reference: https://www.kunal-chowdhury.com/2018/06/azure-global-subscription-filter.html
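The fix itself is purely a portal setting, but if you want to double-check which subscriptions your signed-in identity can actually see, here is a quick sketch using the Azure SDK for JavaScript (assuming the @azure/identity and @azure/arm-subscriptions packages; adapt the auth to your setup):

    import { DefaultAzureCredential } from "@azure/identity";
    import { SubscriptionClient } from "@azure/arm-subscriptions";

    // Lists every subscription visible to the signed-in identity. The one
    // hosting the reference SQL DB must appear here (and be ticked in the
    // portal's global subscription filter) for the input preview to work.
    async function listVisibleSubscriptions(): Promise<void> {
      const client = new SubscriptionClient(new DefaultAzureCredential());
      for await (const sub of client.subscriptions.list()) {
        console.log(`${sub.subscriptionId}  ${sub.displayName}`);
      }
    }

    listVisibleSubscriptions().catch(console.error);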

Related

Send data from list in SharePoint in Microsoft 365 to external API

I have a list in "SharePoint in Microsoft 365" and am looking to extend its usefulness. I would like to create a button or something that can take a row of data and send it to an external site via API (another tool that we have created).
To be clear:
SharePoint initiates a GET or POST request
the request must have row or field data attached
the destination is outside of Microsoft/SharePoint
Can it be done?
(Apologies if this is a repeat question, I could only find answers that pertained to older versions of SharePoint and focused on hitting external APIs to retrieve data, not send it.)
OK, I think the best choice for sending data from a selected row is to use the ListView Command Set; from there you can do anything, e.g. with @pnp/nodejs (see the sketch below).
Otherwise...
Power Automate can also help send data.
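To make the Command Set option concrete, here is a minimal sketch of an SPFx ListView Command Set onExecute handler that POSTs the selected row to an external endpoint. The command ID, field names, and https://example.com URL are placeholders; error handling and auth are omitted:

    import {
      BaseListViewCommandSet,
      IListViewCommandSetExecuteEventParameters
    } from "@microsoft/sp-listview-extensibility";

    export default class SendRowCommandSet extends BaseListViewCommandSet<{}> {
      public onExecute(event: IListViewCommandSetExecuteEventParameters): void {
        // "SEND_ROW" is the command ID declared in the extension manifest (placeholder).
        if (event.itemId !== "SEND_ROW") { return; }

        // Grab the selected row's field values; internal names depend on your list.
        const row = event.selectedRows[0];
        const payload = {
          id: row.getValueByName("ID"),
          title: row.getValueByName("Title")
        };

        // POST the row to the external API (placeholder URL). The endpoint must
        // allow your SharePoint origin via CORS for this to work from the browser.
        fetch("https://example.com/api/rows", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(payload)
        }).catch(err => console.error(err));
      }
    }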

What is the process for changing the "lead owner" through the Salesforce API?

I am currently creating an automation using Zapier that should change the lead owner in Salesforce when a given event takes place. The request successfully reaches Salesforce, but it does not actually change the lead owner; the change is only reflected in the Lead History section. There is no clear Salesforce workflow or rule in place that should prevent this automation from occurring.
When the automation executes, the new lead owner shows up in the Lead History, but the actual owner of the lead does not change, so we are manually having to go back and change it.
Has anybody else faced similar issues when using the Salesforce API to change the lead owner, and if so, what was the solution?
Check Lead Assignment Rules. It's a separate area in Setup, different from workflows, flows, Process Builder, and triggers.
You probably have an active rule that runs on update, not only on insert. Your API call works OK and changes the OwnerId field, but then the assignment rule overwrites it. That's why you see two entries in the history.
You can also confirm what's going on by enabling debug logging on the integration user and checking whether it captures anything.
Optionally, you could also suppress the assignment rule during the update. This is... questionable. Talk with your SF admin first: if you suppress the rule, you've moved a bit of logic out of Salesforce, and two months later nobody will remember why something doesn't fire. It's cleaner to just modify the rule to skip these records.
If Zapier uses the SF REST API, there's an HTTP header it should send: Sforce-Auto-Assign: FALSE. If it uses the SOAP API, a similar flag has to be set in the SOAP message header; check the WSDL for the exact syntax.
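For illustration, here is what that REST update looks like outside Zapier, as a minimal TypeScript sketch (the instance URL, IDs, token, and API version are placeholders; adapt to your org):

    // Updates a Lead's owner via the Salesforce REST API while suppressing
    // assignment rules with the Sforce-Auto-Assign header.
    async function changeLeadOwner(instanceUrl: string, accessToken: string,
                                   leadId: string, newOwnerId: string): Promise<void> {
      const res = await fetch(`${instanceUrl}/services/data/v57.0/sobjects/Lead/${leadId}`, {
        method: "PATCH",
        headers: {
          "Authorization": `Bearer ${accessToken}`,
          "Content-Type": "application/json",
          "Sforce-Auto-Assign": "FALSE"  // stop assignment rules from re-running on this update
        },
        body: JSON.stringify({ OwnerId: newOwnerId })
      });
      if (!res.ok) {
        throw new Error(`Lead update failed: ${res.status} ${await res.text()}`);
      }
    }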
We solved this by breaking down the issue and doing the following (for anybody wondering, this was beyond the standard Salesforce scope of support because we use custom prefill URLs):
Create a new hidden text field called something such as 'tmp_owner'.
Assign the new lead owner ID to the 'tmp_owner' field, which is hidden on the lead from other Salesforce org users.
Add a Salesforce workflow rule so that when 'tmp_owner' is populated, it replaces the 'lead owner' field with that data.

How to configure PagerDuty alerts in Splunk Cloud?

I've run into a few different issues with the PagerDuty integration in Splunk Cloud.
The documentation on PagerDuty's site is either outdated or not applicable to Splunk Cloud, or else there's something wrong with the way my Splunk Cloud account is configured (it could be a permissions issue): https://www.pagerduty.com/docs/guides/splunk-integration-guide/. I don't see an Alert Actions page in Splunk Cloud; I do have a Searches, Reports and Alerts page, though.
I've configured PD alerts in Splunk using the alert_logevent app, but it's not clear if I should instead be using some other app. These alerts do fire when there are search hits, but I'm seeing another issue (below). The alert_webhook app type seems like it might be appropriate, but I was unable to get it to work correctly. I cannot create an alert type using the pagerduty_incident app, although I can set it as a Trigger Action (I guess this is how it's supposed to work; I don't find the UI too intuitive here).
When my alerts fire and create incidents in PagerDuty, I do not see a way to set the PagerDuty incident severity.
Also, the PD incidents include a link back to Splunk, which I believe should open the query with the search hits that generated the alert. However, the link brings me to a page with a Page Not Found! error. It contains a link to "more information about my request", which brings up a Splunk query with no hits. This query looks like "index=_internal, host=SOME_HOST_ON_SPLUNK_CLOUD, source=*web_service.log, log_level=ERROR, requestid=A_REQUEST_ID". It is not clear to me whether this is a config issue, a bug in Splunk Cloud, or possibly even a permissions issue with my account.
Any help is appreciated.
I'm also a Splunk Cloud + PagerDuty customer and ran into the same issue. The PagerDuty App for Splunk seems to create all incidents as Critical, but you can set different severities with event rules.
One way to do this dynamically is to rename your Splunk alerts with the desired severity level and then create a PagerDuty event rule for each level that looks for the keyword in the Summary. For example:
If the following condition is met:
Summary contains "TEST"
Then perform the following actions:
Set severity = Info
[Screenshot: the example rule in the event rule edit screen]
It's a bit of a pain to rename your existing alerts in Splunk, but it works.
If your severity levels in Splunk are programmatically set, like in Enterprise Security, then another method would be to modify the PagerDuty App for Splunk to send the $alert.severity$ custom alert action token as a custom detail in the webhook payload and use that as the event rule condition instead of Summary... but that seems harder.
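For reference on the severity field itself: if you end up sending events to PagerDuty yourself rather than through the app, the Events API v2 accepts severity directly in the payload. A minimal sketch (the routing key and details are placeholders):

    // Triggers a PagerDuty incident with an explicit severity via Events API v2.
    async function triggerIncident(routingKey: string, summary: string,
                                   severity: "info" | "warning" | "error" | "critical"): Promise<void> {
      const res = await fetch("https://events.pagerduty.com/v2/enqueue", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          routing_key: routingKey,   // from your PagerDuty service integration
          event_action: "trigger",
          payload: {
            summary,                 // what the responder sees
            source: "splunk-cloud",  // free-form origin label
            severity                 // the field the event rules above would otherwise set
          }
        })
      });
      if (!res.ok) {
        throw new Error(`PagerDuty rejected the event: ${res.status}`);
      }
    }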

Saving Sitefinity Forms Module Data to separate Database

I'm working with Sitefinity CMS and trying to figure out how to save the data from a "Forms Module" form to a separate database in the backend. Currently all the responses are saved into a table that is created when the form is built. What I want to do is re-route the save request to go through my code-behind instead and save the data to Azure Table storage. Is there a way to do this, or by using the Forms Module am I stuck with saving the data to the table that is auto-created when the form is built? I've tried creating my own FormsSubmitRouteHandler as explained here (http://docs.sitefinity.com/for-developers-submit-forms-using-ajax-call#register-a-form-submit-route-handler), but I must be doing something wrong because my code never gets hit.
Any help would be greatly appreciated. If I haven't explained myself well, please let me know.
You should create a new provider that implements the FormsDataProvider class.
Currently Sitefinity uses OpenAccessFormsProvider, so you can use JustDecompile to see how that was implemented and probably do something similar.
Then you need to register your custom provider under Administration > Settings > Advanced > Forms.
If you don't mind having the form responses in both the Sitefinity database and your custom storage, then you can subscribe to the IFormEntryCreatedEvent, and in your event handler you can write the logic for saving the form response somewhere else.
See this article for more details: http://docs.sitefinity.com/for-developers-forms-events#iformentrycreatedevent
Keep in mind that this will result in form responses being saved in both the SF database and your custom storage. Also, you won't be able to manage the entries stored in the custom storage through the Sitefinity backend. If that's your goal, then Vesselin's answer is the correct way to go, but more complicated.
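On the Azure Table Storage side of the original question: the Sitefinity hooks above are .NET, but the storage write itself is straightforward. Purely to illustrate the entity shape, here is a sketch using the @azure/data-tables SDK (connection string, table name, and fields are placeholders; inside a Sitefinity event handler you would use the equivalent .NET Azure.Data.Tables calls):

    import { TableClient } from "@azure/data-tables";

    // Persists one form response as a table entity.
    async function saveFormEntry(connectionString: string, formName: string,
                                 fields: Record<string, string>): Promise<void> {
      const client = TableClient.fromConnectionString(connectionString, "FormEntries");
      try {
        await client.createTable();        // first run only
      } catch { /* table may already exist */ }
      await client.createEntity({
        partitionKey: formName,            // group entries by form
        rowKey: new Date().toISOString(),  // unique-enough key per response
        ...fields                          // the submitted field values
      });
    }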

Worklight API console request for Push

I need to get a list of users for a specific push adapter/event source, and I'm trying to use the API console requests. The documented format is:
http://{hostname}:{port}/{context-root}/console/api/{api-context}/{action}/{parameters}
and I'm using:
http://192.168.1.106:10080/Module_07_04_nativeAPIForiOSPush/console/api/Push/get/PushAdapter/PushEventSource
to search the demo project, which has one subscriber. However, I get a 404 response from a browser request.
The first column of the docs is the "api-context", but it lists "Push" and "Event Sources", which obviously seems invalid.
What is the correct format to find users subscribed to a push for a specific adapter/event source?
The WL server does not provide an API for listing subscribed users. By design, you should maintain your own DB of subscribed users; this is why you have the onSubscribe and onUnsubscribe callbacks in the event source. As an alternative, you can look into WL's DB tables to find this info.
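For completeness, here is roughly what that bookkeeping looks like in a classic event-source push adapter (adapter code runs as server-side JavaScript). The event source name matches the question's PushEventSource; the security test name and the saveSubscriber/removeSubscriber helpers are hypothetical placeholders standing in for your own persistence, e.g. a procedure on an SQL adapter:

    // WL is provided by the Worklight adapter runtime.
    declare const WL: any;

    WL.Server.createEventSource({
        name: 'PushEventSource',
        onDeviceSubscribe: 'deviceSubscribeFunc',
        onDeviceUnsubscribe: 'deviceUnsubscribeFunc',
        securityTest: 'PushApplication-strong-mobile-securityTest'  // placeholder
    });

    // Called when a user subscribes: record them in your own store.
    function deviceSubscribeFunc(userSubscription: any, deviceSubscription: any): void {
        saveSubscriber(userSubscription);
    }

    // Called when a user unsubscribes: remove them from your store.
    function deviceUnsubscribeFunc(userSubscription: any, deviceSubscription: any): void {
        removeSubscriber(userSubscription);
    }

    // Hypothetical persistence helpers: replace with your own DB writes,
    // e.g. WL.Server.invokeProcedure against an SQL adapter.
    function saveSubscriber(subscription: any): void { /* insert into subscribers table */ }
    function removeSubscriber(subscription: any): void { /* delete from subscribers table */ }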