How to add a condition in a Splunk data model constraint

I have an outbound flow that gets data written by the App, Mem, and Cards APIs. The Cards and Mem APIs write their logs into applog, but App writes its logs into syslog.
In my data model I have sourcetype=app_log as the constraint, so for every flow except App the Splunk dashboard report comes out right, but for the App flow I am not getting any data.
So I want to add a condition in the data model "constraint" section like:
when api is applications then sourcetype=app_log OR sourcetype=sys_log
else sourcetype=app_log
Can anyone assist me with how to do this in Splunk?

If you need a dual sourcetype, it's usually best to make that part of the root search/event to draw in all the relevant data you would like to use in your data model.
A data model is like shearing away wood on a sculpture, so it's usually better to start with all of the data and then slowly pick away at what you want to see.
You can add | where clauses as constraints; however, you can't add more data if you don't start with it in the root events.
My suggestion would be something like this in your root search:
(index=index1 sourcetype=sourcetype1) OR (index=index2 sourcetype=sourcetype2) field=blah field=blah ....
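Applied to your case, the root event constraint could look something like the following sketch. The index name is a placeholder, and the api field/value are taken from your pseudo-logic, so adjust them to your actual data:
index=your_index (sourcetype=app_log OR (sourcetype=sys_log api=applications))
This keeps every app_log event and only the sys_log events from the applications API, which matches your "when api is applications then app_log OR sys_log, else app_log" condition.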

Related

How to store and serve coupons with Google tools and javascript

I'll get a list of coupons by mail. That needs to be stored somewhere somehow (BigQuery?) where I can request a code and send it to the user. The user should only be able to get one unique code that was not used beforehand.
I need the ability to get a code and record that it was used, so the next request gets the next code...
I know it is a completely vague question, but I'm not sure how to implement this. Does anyone have any ideas?
Thanks in advance.
There can be multiple solutions for the same requirement; one of them is given below:
Step 1. Get the coupons into a file (CSV, JSON, etc.) as per your preference/requirement.
Step 2. Load the source file to GCS (storage).
Step 3. Write a Dataflow job which reads the data from the GCS file and loads it into a BigQuery table (tentative name: New_data). Sample code.
Step 4. Create a Dataflow job to read the data from the BigQuery table New_data, compare it with History_data to identify new coupons, and write them to a file on GCS or to a BigQuery table (see the sketch after these steps). Sample code.
Step 5. Schedule the entire process with an orchestrator/Cloud Scheduler/cron job.
Step 6. Once you have the data, you can send it to consumers through any communication channel.
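As a minimal sketch of the comparison in Step 4, done directly in BigQuery SQL rather than in Dataflow (the project, dataset, table, and code column names are assumptions):
-- codes that appear in New_data but have no match yet in History_data
SELECT n.code
FROM `your_project.your_dataset.New_data` AS n
LEFT JOIN `your_project.your_dataset.History_data` AS h
  ON n.code = h.code
WHERE h.code IS NULL;
The same anti-join logic can be expressed inside the Dataflow job if you prefer to keep everything in the pipeline.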

OroCommerce: How to add new product unit?

For some strange reason, in the otherwise very configurable OroCommerce there is no way to manage product units, and the few words the docs offer only say that it's possible to add units via the web API. I need to add a "days" unit, and ideally do it in code via a migration. Is it enough just to make a migration like
INSERT INTO `oro_product_unit` (`code`, `default_precision`) VALUES ('day', '0');
and add translation messages like
oro.product_unit.day.label.full: day
oro.product.product_unit.day.label.full: day
or do I need to do something else?
Product units may be loaded to the database using the data fixtures, like this one that loads default units:
https://github.com/oroinc/orocommerce/blob/4.2.1/src/Oro/Bundle/ProductBundle/Migrations/Data/ORM/LoadProductUnitData.php#L47-L52
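A minimal sketch of such a fixture, modeled on the linked LoadProductUnitData (the namespace, class name, and use statements here are assumptions and may differ between OroCommerce versions):
<?php

namespace Acme\Bundle\ProductBundle\Migrations\Data\ORM;

use Doctrine\Common\DataFixtures\AbstractFixture;
use Doctrine\Persistence\ObjectManager;
use Oro\Bundle\ProductBundle\Entity\ProductUnit;

class LoadDayProductUnitData extends AbstractFixture
{
    public function load(ObjectManager $manager)
    {
        // create the "day" unit with precision 0, the same way the default fixture does
        $unit = new ProductUnit();
        $unit->setCode('day');
        $unit->setDefaultPrecision(0);

        $manager->persist($unit);
        $manager->flush();
    }
}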
In addition, you have to provide translations for the new unit, but there are more messages than you specified in the question:
https://github.com/oroinc/orocommerce/blob/ad94fe9bd63db28eae7d4a73743a4cada4f49080/src/Oro/Bundle/ProductBundle/Resources/translations/jsmessages.en.yml#L26-L35

Binary Sankey Diagram in Tableau - Not All Activities Match The Corresponding Number of KPIs

How do I link my activities variable to only the corresponding KPIs variable?
Using guidance from a number of sources, but primarily the genius of Jeffery Shafer articulated through the SuperDataScience video, I built a Sankey Diagram for my work. For the most part it works, however, I have been trying to figure out how to adjust my Sankey Diagram model to line up each activity with ONLY the corresponding KPIs, but am having no luck.
The data structure looks like this:
You'll note I changed the binary value to "", 2 instead of 0, 1 as it makes visual calculations easier. For the "Viz" variable, I have "Activity" for the raw data set, then I copy/paste/replicate the data to mirror the data (required for the model) but with "KPI" for the mirrored data.
In the following image, you'll see my main issue is that the smallest represented activity still shows as corresponding to all KPIs when in fact it does not. I want activity to line up only with the corresponding KPIs as some activities don't correspond with all, or even any, KPIs.
Finally, here is the model very similar to what the above video link shows:
Can someone help provide insight into how I can adjust the model to fit activities linking only to corresponding KPIs? I appreciate any insight. Thanks!
I have a solution to the issue, thanks to a helpful Tableau support member named Anthony. It was in the data structure. The data was not structured to associate each "Activities" value only with its corresponding "KPI" values, as Tableau requires, but instead associated every "Activities" value with every "KPI" value. To achieve the desired result, the data needs to be restructured to contain a row only for every valid "Activities" and "KPI" combination. See the visual below where the non-matching rows are removed:
[visual: restructured table with rows for non-matching Activity/KPI combinations removed]
Once the table is restructured, the desired visual result should configure with the model. It works like a charm!
Good luck out there!

Keeping dValIds For auto-generated dimensions consistent

I am working with Endeca 6.4.1 and have many auto-generated dimensions in my pipeline (mapped using Dev Studio); the application's indexing is CAS-less, so only FCM is creating dimensions and assigning dValIds. I am using Endeca SEO, so the dVal Id is reflected directly in my URL, and if an auto-generated dimension value's Id changes, any link to that navigation state is lost.
I have a flat file as the dimension's source, for example
product.feature|neon finish
What I want is that, if the value someday changes to "Neon-finish" or "Neon color", the dValId that was assigned to "neon finish" is transferred to the new value. I can keep a custom mapping of the change to track that "neon finish" has been changed to a new value.
Is there any way to achieve this, maybe by using some manipulators?
Please share your thoughts.
There are two basic ways to do this:
1) Update the state files when you change a dimension value (APPDIR/data/state/autogen_dimensions.xml). This would most likely be a manual process.
2) A more robust but complex solution is to change the dimension values to be some ID number and use a synonym for the display name. Then the display name can change without a change to the id number. This may require some serious changes to your pipeline.
Good luck

Include count of linked records in OData result

I've got a table "Events" with a linked table "Registrations", and I want to create an OData service that returns records from the Events table, plus the number of registrations for each event. The data will be consumed client-side in JavaScript, so I want to keep the size of the returned data down and not include all linked registration records completely.
For example:
ID  Title      Date            Regs
1   Breakfast  01.01.01 12:00  4
2   Party      01.01.01 20:00  20
I'm building the service with ASP.NET MVC4. The tables are in an MSSQL database. I am really just getting started with OData and LINQ.
I tried using the WebAPI OData system first (using classes of EntitySetController) but was getting cryptic server errors as soon as I included the Registrations table in the entity set. ("The complex type 'Models.Registration' refers to the entity type 'Models.Event' through the property 'Event'.")
I had more success building a WCF OData system, and can request event information and information on related registrations.
However, I have no clue how to include the aggregate count information in the event result set. Do I need to create a custom entity set to be the source for the OData service? I probably included too little information here for finding a solution, but I don't really know where to look. Can somebody help me?
If you're willing to make an extra request per Event, you could query http://.../YourService.svc/Events(<key>)/Registrations/$count (or http://.../YourService.svc/Events(<key>)/$links/Registrations?$inlinecount=allpages if you're also using the links to the Registration entities).
Examples of both of these approaches on a public service:
http://services.odata.org/V3/OData/OData.svc/Suppliers(0)/Products/$count
http://services.odata.org/V3/OData/OData.svc/Suppliers(0)/$links/Products?$inlinecount=allpages&$format=json
I'm guessing that you'd prefer this information to come bundled together with the rest of the Events response though. It's not ideal, but you could issue a query along these lines:
http://services.odata.org/V3/OData/OData.svc/Suppliers?$format=json&$expand=Products&$select=Products/ID,*
I'm expanding Products (analogous to your Registrations) and selecting Products/ID in order to force the response to include an array that is the same size as the nested Products collection. I don't care about ID -- I just chose a piece of data that would be small. With this JSON response, your javascript client can get the length of the Products array and use that as the number of Products that are linked to the given Supplier.
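For illustration, a rough client-side sketch of that idea, assuming the $expand/$select query above and the OData JSON format where the feed is returned under a value property (the entity and property names follow the public sample service, not your model):
// response is the parsed JSON body of the $expand/$select request
var suppliers = response.value;                    // the feed of Supplier entries
var firstSupplier = suppliers[0];
var productCount = firstSupplier.Products.length;  // length of the expanded collection = number of linked Products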
(Note: to have your service support $select queries using WCF Data Services, you'll need to include this line when you initialize the service: config.DataServiceBehavior.AcceptProjectionRequests = true;)
Edit to add: The approach using $expand and $select won't be guaranteed to give you the correct count if your server does server-driven paging. In general, there isn't a simple single-response way to do what you're asking for in OData v3, but in OData v4, this will be possible with the new expand/select syntax.
I'm using OData v4 and I used this syntax:
var url = '.../odata/clients?$expand=Orders($count=true)';
// ...
A field called Orders#odata.count is added to the response entity, which contains the correct count.
To access a JSON property whose name contains such special characters, you have to do it like this:
var ordersCount = response.value['Orders#odata.count'];
Hope this helps.
Can you edit your Event model and add a RegistrationCount property? That'd be the simplest way, I think.
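A rough sketch of that suggestion using a LINQ projection; the DTO shape and the context/navigation property names (EventDto, db.Events, Registrations) are assumptions, not part of the original model:
using System;
using System.Linq;

// Hypothetical DTO that carries the count instead of the full Registrations collection
public class EventDto
{
    public int ID { get; set; }
    public string Title { get; set; }
    public DateTime Date { get; set; }
    public int RegistrationCount { get; set; }
}

// In the controller/query layer, project each Event into the DTO:
var events = db.Events.Select(e => new EventDto
{
    ID = e.ID,
    Title = e.Title,
    Date = e.Date,
    RegistrationCount = e.Registrations.Count()
});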
What I ended up doing was actually very simple; I created a View in SQL Server that returns the table including the registration counts. Never thought about using a view, since I've never used them before...
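For reference, such a view might look roughly like this; the table and column names (Events, Registrations, EventID) are guesses based on the question:
CREATE VIEW dbo.EventsWithRegistrationCount AS
SELECT e.ID, e.Title, e.[Date],
       COUNT(r.ID) AS Regs   -- number of linked registrations per event
FROM dbo.Events AS e
LEFT JOIN dbo.Registrations AS r ON r.EventID = e.ID
GROUP BY e.ID, e.Title, e.[Date];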
I used this to get the child count without returning the entities:
/Parents?$expand=Children($count=true;$top=0)