RabbitMQ queueing options

I have a scenario where I need to receive three different types of messages. They all contain the same information. For example, say I have three different car dealers: Ford, Honda, and Nissan.
They all send me messages about a car and its specs.
Would you create three queues:
ABCCarCompany.E.Direct.Honda
ABCCarCompany.E.Direct.Nissan
ABCCarCompany.E.Direct.Ford
or just one:
ABCCarCompany.E.Direct.Cars
and have them send the car manufacturer as a parameter?
What are the pros of creating three queues vs. one?

I recommend that you send all the data to a single exchange; call it CarEx. Whether you need one queue or three depends on how you want to use the data. If you need to do the exact same thing with each car that comes in (like put it in a database), then you only need one queue. If you need to do something different per manufacturer (like put it in a database for Ford but send an alert for Nissan), then you would want three different queues.
If you have three different queues, you can route the messages into them based on the routing key.


What is the use of Table Delivery Class in SAP Data Dictionary?

I want to see the difference between delivery class 'A' and 'C'. 'C' is for data entered only by the customer, but how can I see that difference in code?
I created two tables, one with delivery class 'A' and one with 'C', and I add data to each with ABAP code. I thought I wouldn't be able to add data to the table I created with 'C', but they both work the same.
For A Type:
DATA wa_ogr LIKE ZSGT_DELIVCLS2.
wa_ogr-ogrenci_no = 1.
wa_ogr-ogrenci_adi = 'Seher'.
INSERT ZSGT_DELIVCLS2 FROM wa_ogr.
For C Type:
DATA wa_ogr2 LIKE ZSGT_DELIVERYCLS.
wa_ogr2-ogrenci_no = 1.
wa_ogr2-ogrenci_adi = 'Seher'.
INSERT ZSGT_DELIVERYCLS FROM wa_ogr2.
Both inserts succeed without trouble when I check with the debugger.
Is there a live demo where I can see the working logic of C? Can you describe Delivery Class C better?
Tables with delivery class C are not "customer" tables, they are "customizing" tables. "Customizing" is SAP-speak for configuration settings. They are supposed to contain system-wide or client-wide settings which are set in the development system and then transported into the production system using a customizing transport. But whether that is actually the case depends on the settings you choose when generating a maintenance dialog with transaction SE54. It is possible to have customizing tables which are meant to be maintained directly in the production system without a transport request.
Tables with delivery class A are supposed to contain application data: data which is created and updated by applications as part of their everyday routine business processes. There should usually be no reason to transport that data (although you can do so by manually adding the table name and keys to a transport request). Those applications can be SAP standard applications, customer-developed applications, or both.
There is also the delivery class L, which is intended for short-lived temporary data, as well as the classes G, E, S and W, which should only be used by SAP on tables they created.
But from the perspective of an ABAP program, there is no difference between these settings. Any ABAP keywords which read or write database tables work the same way regardless of delivery class.
But there are some SAP standard tools which treat these tables differently. An important one is the client copy:
Data in delivery class C tables will always be copied.
Data in delivery class A tables is only copied when desired (it's a setting in the copy profile). You switch it off to create an empty client with all the settings of an existing client or to synchronize customizing settings between two existing clients without overwriting any of the application data. You switch it on if you want to create a copy of your application data, for example if you want a backup or want to perform a destructive test on real data.
Data in delivery class L tables doesn't get copied.
For more information on delivery classes, check the documentation.

How to send data to only one Azure SQL DB Table from Azure Streaming Analytics?

Background
I have set up an IoT project using an Azure Event Hub and Azure Stream Analytics (ASA) based on tutorials from here and here. JSON formatted messages are sent from a wifi enabled device to the event hub using webhooks, which are then fed through an ASA query and stored in one of three Azure SQL databases based on the input stream they came from.
The device (Particle Photon) transmits 3 different messages with different payloads, for which there are 3 SQL tables defined for long term storage/analysis. The next step includes real-time alerts, and visualization through Power BI.
(Diagram omitted: messages flow from the device to the Event Hub, through the ASA query, and into one of three SQL tables.)
The ASA Query
SELECT
    ParticleId,
    TimePublished,
    PH
    -- and other fields
INTO TpEnvStateOutputToSQL
FROM TpEnvStateInput

SELECT
    ParticleId,
    TimePublished,
    EventCode
    -- and other fields
INTO TpEventsOutputToSQL
FROM TpEventsInput

SELECT
    ParticleId,
    TimePublished,
    FreshWater
    -- and other fields
INTO TpConsLevelOutputToSQL
FROM TpConsLevelInput
Problem: For every message received, the data is pushed to all three tables in the database, and not only the output specified in the query. The table in which the data belongs gets populated with a new row as expected, while the two other tables get populated with NULLs for columns which no data existed for.
From the ASA Documentation it was my understanding that the INTO keyword would direct the output to the specified sink. But that does not seem to be the case, as the output from all three inputs get pushed to all sinks (all 3 SQL tables).
The test script I wrote for the Particle Photon will send one of each type of message with hardcoded fields, in the order: EnvState, Event, ConsLevels, each 15 seconds apart, repeating.
Here is an example of the output being sent to all tables, showing one column from each table (screenshot omitted), generated using this query (in Visual Studio):
SELECT
t1.TimePublished as t1_t2_t3_TimePublished,
t1.ParticleId as t1_t2_t3_ParticleID,
t1.PH as t1_PH,
t2.EventCode as t2_EventCode,
t3.FreshWater as t3_FreshWater
FROM dbo.EnvironmentState as t1, dbo.Event as t2, dbo.ConsumableLevel as t3
WHERE t1.TimePublished = t2.TimePublished AND t2.TimePublished = t3.TimePublished
For an input event of type TpEnvStateInput where the key 'PH' would exist (and not keys 'EventCode' or 'FreshWater', which belong to TpEventInput and TpConsLevelInput, respectively), an entry into only the EnvironmentState table is desired.
Question:
Is there a bug somewhere in the ASA query, or a misunderstanding on my part of how ASA should be used/set up?
I was hoping I would not have to define three separate Stream Analytics containers, as they tend to be rather pricey. After running through this tutorial, and leaving 4 ASA containers running for one day, I used up nearly $5 in Azure credits. At a projected $150/mo cost, there's just no way I could justify sticking with Azure.
ASA is intended for Complex Event Processing. In your queries you are essentially using ASA to pass data from the event hub to tables. It will be much cheaper if you instead host a simple "worker web app" to process the incoming events.
This blog post covers the best practices:
http://blogs.msdn.com/b/servicebus/archive/2015/01/16/event-processor-host-best-practices-part-1.aspx
ASA is great if you are doing transformations, filters, or light analytics on your input data in real time. It also works well if you have some Azure Machine Learning models that are exposed as functions (currently in preview).
In your example, all three "select into" statements are reading from the same input source and don't have any filter clauses, so all rows are selected.
If you only want to select specific rows for each output, you have to specify a filter condition. For example, assuming you only want records with a non-null value in column "PH" for the output "TpEnvStateOutputToSQL", the ASA query would look like this:
SELECT
    ParticleId,
    TimePublished,
    PH
    -- and other fields
INTO TpEnvStateOutputToSQL
FROM TpEnvStateInput
WHERE PH IS NOT NULL
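Applying the same pattern to the other two outputs would give each sink only its own events. A sketch, assuming EventCode and FreshWater are likewise present only on their respective message types:

-- Route each message type to its own sink by filtering on a field
-- that only that message type carries (assumption from the question).
SELECT ParticleId, TimePublished, EventCode -- and other fields
INTO TpEventsOutputToSQL
FROM TpEventsInput
WHERE EventCode IS NOT NULL

SELECT ParticleId, TimePublished, FreshWater -- and other fields
INTO TpConsLevelOutputToSQL
FROM TpConsLevelInput
WHERE FreshWater IS NOT NULL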

ER diagram relationship and Bridge Tables

I have to design a database for bus timetables.
Entities:
Bus (idBus*)
Stop (idStop*,stopDescription)
Line (idLine*,lineDescription)
Position (lat,lon)
Some constraints are the following:
Multiple Buses may operate on one Line (therefore BUS:LINE = N:1)
One Line has many Stops, and many Lines pass through one Stop (therefore STOP:LINE = N:N)
One Bus passes through many Stops and vice versa (therefore BUS:STOP = N:N)
A Stop has one Position (therefore STOP:POSITION = 1:1)
A Bus has multiple Positions (therefore BUS:POSITION = 1:N)
E-R DIAGRAM (image omitted)
An example of this modelling would be a bridging table for the STOP-POSITION relationship that would look like this:
STOP_POSITION(idStop*, lat, lon), where idStop is the foreign key.
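In SQL DDL, that bridging table might look like the sketch below (column types are assumptions, since the question does not specify them):

-- Sketch only: types are assumed; the key and columns come from the question.
CREATE TABLE STOP_POSITION (
    idStop INT PRIMARY KEY REFERENCES STOP (idStop),
    lat    DECIMAL(9,6) NOT NULL,
    lon    DECIMAL(9,6) NOT NULL
);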
In general:
If I have an idBus, I would like to be able to get the associated idLine.
If I have an idBus and an idStop, I would like to get info on the itinerary of the Bus (the next stop, time of arrival, direction).
If I have an idBus and an idLine, I would like to get the itinerary of the Bus (all the Stops the Bus will pass through, and their order) (see the query sketches after this list).
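Under one plausible modelling (a sketch, not the only option: BUS carries an idLine foreign key per the N:1 constraint above, and a hypothetical LINE_STOP bridge table orders stops with the progressiveStop attribute mentioned in the questions below), the first and third lookups could be:

-- Assumed tables: BUS(idBus, idLine), LINE_STOP(idLine, idStop, progressiveStop).
-- Line operated by a given bus:
SELECT b.idLine
FROM BUS AS b
WHERE b.idBus = @busId;

-- Itinerary of a given bus: the stops of its line, in order:
SELECT s.idStop, s.stopDescription
FROM BUS AS b
JOIN LINE_STOP AS ls ON ls.idLine = b.idLine
JOIN STOP AS s ON s.idStop = ls.idStop
WHERE b.idBus = @busId
ORDER BY ls.progressiveStop;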
Questions
The problem arises with the BUS-STOP relationship, because once I know the id of the Stop and the id of the Bus, I also know a number of attributes like Direction, id of the next Stop, and TimeOfArrival.
How should I model those attributes?
For example, every Bus passes through multiple Stops, and the progression is denoted by an attribute (e.g. progressiveStop). How should I model this attribute?
Does it really make sense to model the LINE-STOP association?
Does it really make sense to store dynamic data in the database? I am referring to the BUS-STOP relationship.

Can the target of a conversation receive messages from different initiators using the same conversation?

I like this article: http://technet.microsoft.com/en-us/library/dd576261(v=sql.100).aspx because of its RECEIVE TOP (10000) into a table variable. Processing a table variable with 10,000 messages would give me a giant boost in performance.
receive top (10000) message_type_name, message_body, conversation_handle
from MySSBLabTestQueue
into #receive
From what I've read, RECEIVE returns messages for a single conversation_handle. I have 200+ stores, all sending messages with the same message type and contract to the same server. Can I implement the server so that a single call to RECEIVE gets all the messages from these stores?
Thanks
A target can consolidate multiple conversations into a few conversation groups using MOVE CONVERSATION. RECEIVE restricts its result set to one single conversation group, so moving many individual conversations into a single group can produce the bigger result sets you desire.
For the record, initiators can also consolidate conversations using MOVE CONVERSATION; there is nothing role-specific here. But initiators can additionally use the RELATED_CONVERSATION_GROUP clause of BEGIN DIALOG to start the conversation directly in the desired group, achieving consolidation and thus bigger result sets without having to use MOVE. This is useful because you can simply reverse the roles in the app, i.e. instead of stores starting the dialogs with the central server, have the central server start the dialogs with each store (thus reversing the roles), and the central server can start the dialogs in as few conversation groups as it likes, even one. This removes the need to issue MOVE CONVERSATION.
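A minimal T-SQL sketch of both approaches (service and contract names are hypothetical; the queue name comes from the question):

-- Target side: fold an incoming conversation into an existing group.
DECLARE @group UNIQUEIDENTIFIER = NEWID();
DECLARE @dialog UNIQUEIDENTIFIER;
-- ... @dialog obtained from a prior RECEIVE ...
MOVE CONVERSATION @dialog TO @group;

-- Initiator side (roles reversed): start dialogs directly in one group.
BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE [CentralService]      -- hypothetical name
    TO SERVICE 'StoreService'          -- hypothetical name
    ON CONTRACT [StoreContract]        -- hypothetical name
    WITH RELATED_CONVERSATION_GROUP = @group;

-- Draining a whole conversation group in one call:
DECLARE @receive TABLE (
    message_type_name   NVARCHAR(256),
    message_body        VARBINARY(MAX),
    conversation_handle UNIQUEIDENTIFIER);

WAITFOR (
    RECEIVE TOP (10000)
        message_type_name, message_body, conversation_handle
    FROM MySSBLabTestQueue
    INTO @receive
), TIMEOUT 5000;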

Database Model - SQL - Best Approach

I'm looking for help with part of a database design.
I have to model the database for a group of Contacts and a group of Distribution Lists.
Each Contact can be in many Distribution Lists, and each Distribution List can have many Contacts. In a normal instance, I could use a junction table to achieve this.
But there's one more thing to add: Contacts have the option to receive notifications via two different methods, SMS or Email.
A Contact can request to be sent notifications via either or both methods.
The piece of the problem I am stuck on is that a Contact may wish to receive notifications differently depending on the specific Distribution List.
So we have a situation like this:
CONTACT A is in DL-A - receives notifications via SMS
CONTACT A is in DL-B - receives notifications via Email & SMS
I'm trying to avoid having more than one entry for a Contact in my Contacts table; each Contact should be unique.
Can anyone help?
You could use another junction table:
contactid, distributionlistid, messagepreference
messagepreference can be email or SMS; two rows if they want both. New messaging types can be added with no changes to the DB structure. To be safe, use constants in your code to represent the values you put into the column.
Or, add sendemail and sendsms columns to the original junction table, but this has the drawback that you have to change the DB structure if you introduce a new messaging type.
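A sketch of the first option in SQL DDL (column types, the table name, and the referenced Contact and DistributionList tables are assumptions; the column names come from the answer):

-- One row per (contact, list, channel) preference.
CREATE TABLE ContactListPreference (   -- hypothetical table name
    contactid          INT NOT NULL REFERENCES Contact (contactid),
    distributionlistid INT NOT NULL REFERENCES DistributionList (distributionlistid),
    messagepreference  VARCHAR(10) NOT NULL,  -- 'email' or 'SMS'
    PRIMARY KEY (contactid, distributionlistid, messagepreference)
);

CONTACT A in DL-B wanting both channels is then simply two rows, one with messagepreference = 'email' and one with 'SMS'.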
So you can add two fields to the junction table:
ContactsDistributions(ContactId, DistributionId, SMSFlag, EmailFlag)
in order to specify the type of notification chosen by the contact for each distribution.
You can add one more field to the junction table which represents how a given contact will receive notifications from a given distribution list.
In this case I would add two fields to the junction table, SMS and Email, both boolean, set to true if and only if the contact wishes to receive notifications in that manner. This allows notifications to be set differently for each combination of list and contact.
Also, depending on how you want to deal with removal from lists, you could add a constraint on the junction table so that at least one of the two fields is true, so that a notification is always sent (although, say, in Google Groups I do have access to some lists from which I have chosen not to get notifications).
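That constraint could be a simple CHECK on the flag columns of the ContactsDistributions table sketched above (the constraint name is made up):

-- Require at least one delivery channel per (contact, list) row.
ALTER TABLE ContactsDistributions
    ADD CONSTRAINT CK_ContactsDistributions_AtLeastOneChannel
    CHECK (SMSFlag = 1 OR EmailFlag = 1);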