What is a "CDR"? - definition

I'm reading on the web site of Vertica (a database vendor) that many of their customers use their database for "CDR" purposes.
Question: What is a "CDR"?
http://www.vertica.com/customers/success

It stands for Call Detail Record. A call between points A and B might be routed through several switches, and a CDR is generated for each leg of that call. The final CDR is a correlation of the CDRs from each leg. A CDR contains information pertaining to a call, such as:
Calling Number
Called Number
Point Codes or Switch codes of the source switch
Point Codes or Switch codes of the terminating switch
Trunk Access Code
Time stamps of different events during call setup and call tear down
There are lots of other fields in a CDR, and most of them vary depending on the standard being used. For example, most of the fields mentioned above pertain to an ISUP call; the CDR for a GSM/GPRS call would differ.
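Since Vertica is an analytics database, CDRs in that setting typically end up as rows in a wide fact table. A minimal SQL sketch, with a hypothetical table and column names based only on the fields listed above:
CREATE TABLE cdr (
    call_id            BIGINT,       -- used to correlate the CDRs of each leg
    calling_number     VARCHAR(20),
    called_number      VARCHAR(20),
    source_point_code  VARCHAR(12),  -- source switch
    dest_point_code    VARCHAR(12),  -- terminating switch
    trunk_access_code  VARCHAR(12),
    setup_time         TIMESTAMP,    -- call set-up event
    answer_time        TIMESTAMP,
    release_time       TIMESTAMP     -- call tear-down event
);
Real schemas have many more columns and differ between standards (ISUP vs. GSM/GPRS), as noted above.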
I hope this helps.
cheers

In that context, it is most likely a Call Detail Record, as the clients are Vonage, Verizon, etc.


ABAP program to notify users X days before their user account will be disabled

I'm currently learning ABAP and trying to make an enhancement, but I have broken down in confusion about how to build on top of existing code. I have a program that runs periodically via a background job and disables user accounts after X days of inactivity (in this case 90 days, based on USR02~TRDAT).
I want to add an enhancement that notifies users via their e-mail address (matching usr02~bname to usr21~bname, then using usr21~persnumber and usr21~addrnumber to read adr6~smtp_addr, giving the usr02~bname -> adr6~smtp_addr relationship) when their last logon date is 30, 15, 7, 5, 3, and 1 day away from the 90-day inactivity threshold, with a link to the SAP system to help them reactivate the account with ease.
I'm beginning to think that an enhancement might not be a good idea, and that I should instead create a new program and schedule it as a daily background job. Any guidance or information would be greatly appreciated...
Extract
CLASS cl_inactive_users_reader DEFINITION.
  PUBLIC SECTION.
    TYPES:
      BEGIN OF ts_inactive_user,
        user_name          TYPE syst_uname,
        days_of_inactivity TYPE int1,
      END OF ts_inactive_user.
    TYPES tt_inactive_users TYPE STANDARD TABLE OF ts_inactive_user WITH EMPTY KEY.

    CLASS-METHODS read_inactive_users
      IMPORTING
        min_days_of_inactivity TYPE int1
      RETURNING
        VALUE(result)          TYPE tt_inactive_users.
ENDCLASS.
Then refactor
REPORT block_inactive_users.
DATA(inactive_users) = cl_inactive_users_reader=>read_inactive_users( 90 ).
LOOP AT inactive_users INTO DATA(inactive_user).
  " block user
ENDLOOP.
And add
REPORT warn_inactive_users.
DATA(inactive_users) = cl_inactive_users_reader=>read_inactive_users( 60 ).
LOOP AT inactive_users INTO DATA(inactive_user).
  CASE inactive_user-days_of_inactivity.
    " choose urgency
  ENDCASE.
  " send e-mail
ENDLOOP.
and run both reports daily.
Don't create a big ball of mud by squeezing new features into existing code.
From SAP wiki:
The enhancement concept allows you to add your own functionality to SAP's standard business applications without having to modify the original applications. To modify the standard SAP behavior as per customer requirements, we can use enhancement framework.
As per your description, it doesn't sound like a use case for an enhancement. It isn't an intervention in an existing process. The original process and your new requirement are two different processes with a shared logical part: selecting users by days of inactivity. The two shouldn't rely on each other.
Structurally I think it is best to have a separate program for computing which e-mails need to be sent and when, and a separate program for actually sending them.
I would copy your original program to a new one and modify it slightly so that, instead of disabling a user, it records into a table, for each user: 1) the e-mail address, 2) the date on which to send, 3) how many days are left (30, 15, 7, etc.), 4) a status showing whether the e-mail has been sent. Initially you can even have multiple such jobs, one for each period (30, 15, 7, etc.), and pass the period as a parameter (used internally instead of 90).
You run this program daily as a job, and it populates the table with e-mail "tasks" describing what needs to be sent today. It only adds new lines, so the lines from yesterday stay in there.
The second program should just read that table, send the actual e-mails, and update the statuses. You run that program daily as well.
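A rough sketch of the kind of task table described above, written as plain SQL just to show the columns (in an SAP system this would be a transparent table defined in the dictionary, and all names here are hypothetical):
CREATE TABLE zmail_task (
    bname      VARCHAR(12),   -- user name from USR02
    smtp_addr  VARCHAR(241),  -- e-mail address resolved via USR21/ADR6
    send_date  DATE,          -- date on which the mail should go out
    days_left  INT,           -- 30, 15, 7, 5, 3 or 1
    status     CHAR(1)        -- e.g. ' ' = open, 'S' = sent, 'E' = error
);
The first job inserts one row per user and notification date; the mailer job selects today's open rows, sends the mails, and updates the status.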
This way you have:
overview: just check the table to see what's going on
control: if the e-mailer dies or hangs, you can restart it and it will continue where it left off; with statuses you avoid sending duplicate mails
you can make sure that you don't send outdated e-mails if your mailer program ignores all tasks older than, say, 2 days
I want to clarify your confusion about the use of enhancements:
You would want to use an enhancement when 'something' happens, or is about to happen, in the system and you want to change that standard behavior.
That something, let's call it an event or process, could for example be an order being placed, a certain user logging onto the system, or a material that has been or is about to be changed.
The change could be notifying another system of the order, or running additional checks on the logged-on user, for example checking his GUI version and warning him/her if it is not up to date.
Ask yourself: which process in the system does the execution of your program or code depend on? Does anything need to happen before the program is executed? No, only time elapsing.
Even if you had found an enhancement you wanted to use: if the process using that enhancement were not run within those 90 days, your mails would not be sent, because the enhancement would never be called.
edit: That being said, if by 'enhancement' you mean 'building on your existing program' instead of 'creating a new one', that is not the right terminology for an enhancement in the SAP universe.
I would extend the functionality of your existing program, since you already compute how many days are left and you would have only one job to maintain.

Find SAP modifications in function group includes

(An access key for SAP standard modifications is needed to reproduce the test data of this question.)
Introduction:
I want to find a certain type of modification to a certain SAP standard repository object (IDES test dataset).
The modification is located in an include of a function group (it is listed under System-defined Include-files in the function pool of the function group).
So far, I found the table SMODILOG as a central list of modifications (Log of Customer Modifications to Dev. Env. Objects).
Test Data:
My test data is function group V07A, which has e.g. the include LV07A014 (part of the LV07ANNN include). This include was modified by inserting code into its source (one needs an access key in order to be allowed to do this), such as:
*{ INSERT IDSEXAMPLE 1
* this is a comment, which was added
*} INSERT
Goal:
Subsequently I want to find a table where all modifications like this are listed. I want to find the place of modification, i.e. the object type and program id of the object that was modified.
2 Questions:
I realize that the SAP standard include of a function group has a different object type and program ID than the top include and the UXX include.
Whereas the latter are of type PROG with pgmid R3TR (found in the object catalog entry), LV07A014 has an object catalog entry identical to that of the function group it belongs to, namely R3TR FUGR. This is already peculiar to me; it seems as if the resolution down to the include level is missing.
In addition, the modification to LV07A014 is listed in table SMODILOG with object type (field SUB_TYPE) REPS (the pgmid is not included in SMODILOG). I would expect PROG, as for the other includes (LTOP, LUXX).
-> Why is there a difference in object types and program IDs between the L_TOP and L_UXX includes on the one hand and the L_NNN includes on the other? (Or am I mistaken?)
-> Where can I find information about all SAP standard modifications in my system, together with the true object type and pgmid of the modified objects (and not just the function group that the modified object belongs to; that resolution does not suffice)?
There is no table or anything similar where all modifications are listed. The table E071 is a good source to check pgmid and object type.
Why there is a difference in object types and program IDs between L_TOP and L_UXX on the one hand and the L_NNN includes on the other remains a mystery. SAP...
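As an illustration of that suggestion, a lookup of the include from the question could look like this (written as plain SQL; inside ABAP it would be an Open SQL SELECT):
SELECT pgmid, object, obj_name
  FROM e071
 WHERE obj_name = 'LV07A014';
If the modification was ever put into a transport, this will typically show a LIMU REPS entry for the include, which matches the REPS sub_type you saw in SMODILOG, while the R3TR-level entry stays at the function group (FUGR V07A), as you observed in the object catalog.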
Why do you need a table? For what? Is it a purely academic question, or is it connected with real-life tasks?
Have you ever tried transaction SE95? It lists all modifications that were made in the system, regardless of object type and name. Yes, function groups too. They are easily searchable by hierarchy.
If it is a FUGR include that was modified, it will be listed under the node Outside of modularization units.
Finally RTFM, bro...

Usage of ATG PaymentGroupMapContainer

I looked through the documentation and some of the source code too, but I am not able to understand the exact use of PaymentGroupMapContainer. I have sample code - getPaymentGroupMapContainer().addPaymentGroup(ITEM_DESC_GIFT_CARD, giftCard); - and other payment groups are added to the container in a similar way. What if we have multiple gift cards? How should this be handled? I just cannot see the reason why PaymentGroupMapContainer exists.
As you've rightly identified, a PaymentGroupMapContainer can only contain one payment group under the key ITEM_DESC_GIFT_CARD. However, an order can contain more than one PaymentGroup, so to add multiple gift cards to the same order you need to adjust your logic slightly.
First ensure that you create a new PaymentGroup
GiftCard giftCard = (GiftCard) getPaymentGroupManager().createPaymentGroup("giftCard");
Now when you need to apply the giftCard:
getPaymentGroupManager().addPaymentGroupToOrder(order, giftCard);
Obviously there is a lot more to do around configuring the GiftCard, but the above should resolve your original problem of adding multiple GiftCards to a single order.

WCF data services - Limiting related objects returned based on critera

I have an object graph consisting of a base employee object, and a set of related message objects.
I am able to return the employee objects based on search criteria on the employee properties (e.g. team). However, if I expand on the messages, I get the full collection of messages back. I would like to either take the top n messages (i.e. restrict to the 10 most recent) or, ideally, use a date range on the message objects to limit how many are brought back.
So far I have not been able to figure out a way of doing this:
I get an error if I attempt to filter on properties of the message (&$filter=employee/message/StartDate gives the error "No property 'StartDate' exists in type 'System.Data.Objects.DataClasses.EntityCollection`1'").
Attempting to use $top on the related message collection doesn't work either.
I have also tried using a WebGet extension that takes a string list of employee IDs. That works until the list gets too long and the request fails because the URL is too long (it might be possible to set up a paging mechanism with this approach)...
Unfortunately the UI control I am using requires the data to be in a fairly specific hierarchical shape, so I can't easily come at this from starting on the message side and working backwards.
Outside of making multiple calls does anyone know of a method to accomplish this with wcf data services?
Thanks!
M.
Looks like the only real way of doing this is in fact to reverse the direction of the query.
So instead of starting from the Employee, I go from the message side. You can filter back on the employee properties and restrict on the Messages collection. It's not ideal, as it means iterating over the collection on return to re-center it on the employee for what I am attempting to do, but it will work. The async nature of Silverlight and the rich client at least means that, while an extra iteration is required, it still appears reasonably fast.
Another interesting thing to note: the current version of OData/WCF Data Services does not support querying on properties of inherited classes, so I had to move the start/end date properties up to the base class in order to be able to restrict my search on them.
http://Site/Service.svc/Messages()?&$filter=Employee/OfficeName eq 'Toronto' and (year(StartDate) eq 2010 and month(StartDate) ge 9 )

TSQL Query for analyzing Text

I have a table that has order number, cancelled date and reason.
The reason field is a varchar(255) field, it was written by many different sales reps, and it is really hard to group by reason category. I need to generate a report categorizing cancellation reasons. What is the best way to analyse the reasons with TSQL?
Sample of reasons entered by sales reps:
cust already has this order going out
cust can not hold for item Called to cancel order
cust doesn't want to pay for shipping
wife ordered same item from different vendor, sent email
cst made a duplicate order, sent email
cst can't hold
Cust doesn't want to go through verification process so is cancelling order
doesn't ant to hold for Bo
doesn't want
Cust called to cancel the order He can no longer get the product he wants
cnt hld
will not comply with export req
cant' hold
Custs request
Cust will not hold for BO
per. cust. request.
BTW I have SQL Server 2005.
Part of your problem is that these aren't truly reason codes. It sounds like an issue with your schema to me. If there are no predefined reason codes to reference and you're allowing free-text entry for each reason, then there's really no way to do this directly, outside of pulling back distinct reasons, which is probably not going to be very useful.
Just an idea: can you add another column to the table, even if it's in a temp or test environment, and give the business users the ability to assign a code (e.g. 1 for mis-ships, 2 for duplicate orders, 3 for wrong item, etc.) to each order cancellation? Then perform the analysis on that.
I assume that's what they're expecting from you, but I don't see any better way. You could always perform the analysis yourself if you have the authority/knowledge, but this might be painful if you have a ton of cancellations.
edit- I see now that you've tagged this with regex... it would be possible to set up specified keywords to pull out the entries, but there would have to be some tolerance built in, and still some manual analysis afterwards for items that don't fall into any specified category due to misspellings etc. /edit
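For example, a keyword-based first pass could be a simple CASE with LIKE patterns (a rough sketch only; the table and column names are assumed from the question, and the patterns would need tuning against the real data):
SELECT OrderNumber,
       Reason,
       CASE
           WHEN Reason LIKE '%dup%' OR Reason LIKE '%already has%' THEN 'Duplicate order'
           WHEN Reason LIKE '%hold%' OR Reason LIKE '%hld%'        THEN 'Will not hold / backorder'
           WHEN Reason LIKE '%ship%'                               THEN 'Shipping cost'
           WHEN Reason LIKE '%request%'                            THEN 'Customer request'
           ELSE 'Uncategorized'
       END AS ReasonCategory
FROM   Cancellations;
Grouping by ReasonCategory then gives the report; anything that lands in 'Uncategorized' still needs a human look, as mentioned above.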
+1 to #jmatthews, you really need to have reason codes that are selected and then possibly allow free-form entry for the full reason.
If this isn't an option you can look into text clustering. Don't expect that to be fast or easy though, it's still an open research topic and is related to both AI and machine learning.
Look at Term Lookup in SSIS, here is an article to read.