Azure PowerShell: calculating metrics for a web app

I am trying to calculate metrics for all the web apps inside a subscription using the Get-AzMetric cmdlet, and I am using the -AggregationType parameter. How can I apply more than one aggregation type in the -AggregationType parameter?

No, you cannot apply more than one aggregation type in this parameter; -AggregationType accepts only one value at a time (see the parameter description).
Your option is to run the command several times, like below.
Get-AzMetric -ResourceId "<ResourceId>" -MetricName "CpuTime" -AggregationType Average
Get-AzMetric -ResourceId "<ResourceId>" -MetricName "CpuTime" -AggregationType Count
Update:
$types = @("Average", "Count", "Minimum", "Maximum", "Total")
$types | ForEach { Get-AzMetric -ResourceId "<ResourceId>" -MetricName "CpuTime" -AggregationType $_ }
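If you want the different aggregations side by side, a minimal sketch (reusing the <ResourceId> placeholder above, and assuming you only care about the most recent data point of each series) could look like this:
$types = @("Average", "Count", "Minimum", "Maximum", "Total")
$summary = foreach ($t in $types) {
    # Get-AzMetric returns a metric object whose Data property holds the time series values
    $metric = Get-AzMetric -ResourceId "<ResourceId>" -MetricName "CpuTime" -AggregationType $t
    [pscustomobject]@{
        Aggregation = $t
        Value       = ($metric.Data | Select-Object -Last 1).$t
    }
}
$summary | Format-Table -AutoSize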

Related

Splunk search issue

I have a search query like below.
index = abc_dev sourcetype = data RequestorSystem = * Description="Request Receieved from Consumer Service"
OR Description="Total Time taken in sending response"
| dedup TId
| eval InBoundCount=if(Description="Request Receieved from Consumer Service",1,0)
| eval OutBoundCount=if(Description="Total Time taken in sending response",1,0)
| stats sum(InBoundCount) as "Inbound Count",sum(OutBoundCount) as "Outbound Count"
I am not sure why the inbound count is always showing as 0; the outbound count works perfectly.
There is a typo in your eval InBoundCount=...: "Received" is spelled wrong ("Receieved"), and if your events have it spelled correctly, it won't match!
If that's not the case:
try running the query for both counts separately and make sure you are getting events (see the example below). Also, posting some example input events will make our answer more precise.
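For example, a quick check of just the inbound leg (keeping the spelling from the original search) could look like:
index=abc_dev sourcetype=data RequestorSystem=* Description="Request Receieved from Consumer Service"
| dedup TId
| stats count
Run the same with the outbound Description string; if one of the two returns no events, the problem is in the base search rather than in the eval.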
Search terms in Splunk are joined by an implicit AND, which means that your OR either needs to be enclosed in parentheses or (if you are using Splunk 6.6 or newer) replaced with the IN keyword, like so:
index = abc_dev sourcetype = data RequestorSystem = *
Description IN ("Request Receieved from Consumer Service", "Total Time taken in sending response")
Using IN is more portable in case you want to add other strings later on. With some tweaks, you could even use a variation of stats count by Description with this, as sketched below.
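For example, a rough variation along those lines (again keeping the original Description strings) might be:
index=abc_dev sourcetype=data RequestorSystem=*
    Description IN ("Request Receieved from Consumer Service", "Total Time taken in sending response")
| dedup TId
| stats count by Description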

Select last 30 days rows from MARA using SSIS

I'm trying to select rows whose last change date is within the last 30 days.
I tried LAEDA = ( sy-datum - 30 ) in the WHERE clause, but it always generates an error. I connect to an SAP ABAP database.
The error message:
[EIS-Material 1] Error: ERPConnect.ERPException: Error while receiving function return values: SYSTEM_FAILURE An error has occurred while parsing a dynamic entry.
   at ERPConnect.RFCAPI.ReceiveFunctionResults(UInt32 connectionHandle, RFC_PARAMETER[] importing, RFC_PARAMETER[] changing, RFC_TABLE[] tables, Encoding apiEncoding)
   at ERPConnect.RFCFunction.ReceiveFunctionArguments(RFC_TABLE[]& apiTables)
   at ERPConnect.RFCFunction.CallClassicAPI()
   at ERPConnect.RFCFunction.ExecuteRFC(Byte[] tid)
   at XtractKernel.Extractors.TableExtractor.GetPackage(RFCFunction& func)
   at XtractKernel.Extractors.TableExtractor.Extract()
   at XtractKernel.Extractors.ExtractorBase`1.Extract(ProcessResultCallback processResult)
   at XtractIS.XtractSourceTable.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
So you are using a third-party tool to extract data from an SAP system. According to the error message, the tool makes a Remote Function Call (RFC) and hands the SQL over to the ABAP backend. Your WHERE condition must therefore be valid ABAP Open SQL syntax, regardless of the database behind it.
Your call (simplified) would look like this in ABAP (with the new @-syntax):
DATA(lf_dat) = sy-datum - 30.
SELECT matnr
  FROM mara
  WHERE laeda >= @lf_dat
  INTO TABLE @DATA(lt_matnr).
The problem is that you are not allowed to do this calculation within the statement, as far as I know, so you have to use a variable. But since your third-party tool only lets you write a WHERE condition, I see no way to handle this except with a static date in the condition:
laeda >= '20190106' "YYYYMMDD
You can add the ABAP tag to your question to attract more specialists on this ABAP-specific topic.
I see in the Xtract IS online help that there's a custom function module named Z_THEO_READ_TABLE installed on the ABAP side, which executes the SQL sent by Xtract IS. The module is provided in two flavors, one being for ABAP >= 740 SP 5, so I guess it's a version for ABAP SQL strict mode.
So I thought that maybe you could write this ABAP-like WHERE clause by using a "host expression", which is valid in ABAP SQL strict mode:
LAEDA = @( sy-datum - 30 )
Based on the error message you have, "An error has occurred while parsing a dynamic entry", I guess that this function module does something like SELECT (dyn-columns) FROM (dyn-table) WHERE (dyn-condition), i.e. all elements are dynamically defined at run time.
Unfortunately, the "ABAP documentation sql_cond - (cond_syntax) says that "Host expressions are not allowed in dynamic logical expressions."
So long, impossible to make a where clause as you wish.
There are probably many ways to get around this limit (like creating a SAPquery or BAPI in SAP and calling it from Xtract IS, etc.) but that's another question.
In MySQL / MariaDB, this works:
select ...
from ...
where date >= DATE_ADD(CURDATE(), INTERVAL -30 DAY)
but we need to know what database you are working with.
You may try this if you use SQL Server:
SELECT DATEADD(month, -1, GETDATE())
You cannot specify an ABAP formula like that through SAP Open SQL.
Not to directly resolve your challenge (since you have a product limitation), here is how a dynamic filter is achieved with the AecorSoft tool:
(DT_WSTR, 4)(DATEPART("yy" , GETDATE())) + RIGHT("0" + (DT_WSTR, 4)DATEPART("mm" , GETDATE()),2) + RIGHT("0" + (DT_WSTR, 4)DATEPART("dd" , GETDATE()),2)
For a complete use case, you can check the blog post SAP Table Delta Extract Made Easy through Dynamic Filters.

pushgateway or node exporter - how to add string data?

I have a cron job that runs an SQL query every day and gives me an important integer.
I have to expose that integer to the Prometheus server.
As far as I've seen, I have two options: use the Pushgateway or the node exporter.
But the metric (integer) that I get from the SQL query also needs some extra information attached (like the company name and the database that I got it from).
What would be a better way?
For instance, this is what I made for my metric:
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

count = ...  # some number returned by the SQL query
registry = CollectorRegistry()
g = Gauge('machine_number', 'machfoobarine_stat', registry=registry).set(count)
push_to_gateway('localhost:9091', job='batchA', registry=registry)
So how do I add key-value pairs to my metric above?
Do I have to change the job name ('batchA') for every single SQL count that I get and expose to the Pushgateway? Right now I can only see the last one.
Tnx,
Tom
The best way is to give your metric a general name, for example animal_count, and then specialize it with labels. Here is some pseudocode, in the style of the Java client:
g = Gauge.build("animal_count", "Number of animals in the zoo")
    .labelNames("sex", "classes")
    .create();
g.labels("male", "mammals")
    .set(count);
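In the Python prometheus_client used in the question, the equivalent sketch would look roughly like this (the company/database label names and values are made up for illustration):
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

count = ...  # the integer from the SQL query, as in the question
registry = CollectorRegistry()
# Declare the label names once, then set a value per label combination.
g = Gauge('machine_number', 'machfoobarine_stat',
          ['company', 'database'], registry=registry)
g.labels('some_company', 'some_database').set(count)
push_to_gateway('localhost:9091', job='batchA', registry=registry)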

LDAP query to show all groups that have nested groups

I can see a lot of questions relating to getting users from nested groups, but what I want to do is query AD for all groups that have a nested group. For example:
if group A has a nested group B, show me that group
if group C has no nested groups, don't show me it
if group D has a nested group E, show me it
Basically, I want to know how many AD groups have nested groups, and their names. Can this be done?
Thanks
Unfortunately, what you are looking to do cannot be done with LDAP queries alone. If you are running AD on Server 2003 SP2 or later, you can query for all members of a specific group, enumerating nested groups, by using a matching rule, but you would have to use an external process, like a PowerShell script, to actually get the results you want.
The matching rule I was thinking of would be used like this:
(memberOf:1.2.840.113556.1.4.1941:=cn=Test,ou=East,dc=Domain,dc=com)
If you can use PowerShell and can install Microsoft's ActiveDirectory module from the RSAT tools, you can do it in one line (although it could take forever), like this:
Import-Module ActiveDirectory; Get-ADGroup -Filter {Name -like "*"} | ? { $m = Get-ADGroupMember $_; $r = Get-ADGroupMember $_ -Recursive; $c = Compare-Object $m $r; $c.Count -gt 0 } | ft Name,DistinguishedName -AutoSize
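A lighter variation on the same idea (still assuming the RSAT ActiveDirectory module) is to check whether any direct member of a group is itself a group, which avoids the recursive enumeration, though it can still take a while on a large directory:
Import-Module ActiveDirectory
Get-ADGroup -Filter * |
    Where-Object { Get-ADGroupMember -Identity $_ | Where-Object { $_.objectClass -eq 'group' } } |
    Format-Table Name, DistinguishedName -AutoSize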

Need to query Splunk using a REST API call and pull mean and stdev

I am trying to query Splunk using the REST API with the following:
curl -u "<user>":"<pass>" -k https://splunkserver.com:8089/services/search/jobs/export -d'search=search index%3d"<index_name" sourcetype%3d"access_combined_wcookie" starttime%3d06/02/2013:0:0:0 endtime%3d06/10/2013:0:0:0 uri_path%3d"<uri1>" OR uri_path%3d"<uri2>" user!%3d"-" referer!%3d"-" | eval Time %3d request_time_length%2f1000000 | stats stdev%28Time%29 as stdev, mean%28Time%29 as mean, count%28uri_path%29 as count by uri_path'
However, I do not get the computed mean and stdev; I only see count. How can I add the mean and stdev?
The query looks about right. I tried a similar query on my end and it seemed to give me all three aggregates. The only thing I can think of is to make sure you have events that match the search criteria. It could be your time boundaries; try expanding them, or removing one or both, to see if you get any data for mean and stdev.
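For reference, once the URL encoding is stripped, the search being sent is roughly the following SPL (placeholders kept as in the question):
search index="<index_name" sourcetype="access_combined_wcookie" starttime=06/02/2013:0:0:0 endtime=06/10/2013:0:0:0 uri_path="<uri1>" OR uri_path="<uri2>" user!="-" referer!="-"
| eval Time = request_time_length/1000000
| stats stdev(Time) as stdev, mean(Time) as mean, count(uri_path) as count by uri_path
If mean and stdev come back empty while count is populated, it usually means request_time_length is missing from the matching events, so checking that field is another thing to try.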