I am a total newbie in Pentaho Kettle/Spoon.
I am reading a transaction table with amounts in different transaction currencies.
The requirement is to convert the amounts into standardized currencies (in this case, EUR and USD) using live exchange rates.
What should be my approach? Which steps does the transformation need?
Any input is appreciated.
As far as I understand, you have a table with amounts in many different currencies and want to convert those to USD and EUR, right?
I think you will need a web service or a table that delivers the current conversion rates for this.
Can you provide some additional info on the table itself and on which amounts need to be converted to USD or EUR? Also, what is the final result you are expecting?
Please provide more info.
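In general, though, once you have a rates table the conversion itself boils down to a lookup plus a multiplication - in Kettle that would typically be a Table input or Database lookup / Stream lookup step for the rates, followed by a Calculator step for the multiplication. As a rough sketch of the logic in SQL (all table and column names below are made up for illustration):
-- Hypothetical tables: transactions(txn_id, txn_currency, amount) and
-- exchange_rates(from_currency, to_currency, rate) refreshed from your rate source.
SELECT t.txn_id,
       t.txn_currency,
       t.amount,
       t.amount * r_usd.rate AS amount_usd,
       t.amount * r_eur.rate AS amount_eur
FROM transactions t
JOIN exchange_rates r_usd
  ON r_usd.from_currency = t.txn_currency AND r_usd.to_currency = 'USD'
JOIN exchange_rates r_eur
  ON r_eur.from_currency = t.txn_currency AND r_eur.to_currency = 'EUR';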
The function module FAGL_ACCOUNT_ITEMS_GL_API in SAP outputs the data with 2 decimal places.
In Muscat (Oman), the client uses 3 decimal places to view amounts.
How to achieve this here?
Thanks and Regards
CA. Nilesh Zambhuria
SAP always stores posting amounts with two decimals. If the posting currency has three decimal places (you can check table TCURX, field CURRDEC, or transaction OY04), it works like this:
posting amount: 1.234
but in the database it will be stored as: 12.34
Similarly, if a currency has no decimal places (like HUF, for example), then:
posting amount: 100
in the database: 1.00
SAP handles this correctly inside SAP (standard transactions, ALV lists, etc.), as long as the number of decimal places was set up correctly - it is not possible to change it in a live SAP system!
In your case the function module will give back values with two decimal places, like 12.34. But if the currency has three decimal places (like OMR), you have to read it as 1.234.
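If you need to do the rescaling yourself in a downstream report or extract, a rough sketch of the arithmetic (table and column names for the extracted data are assumed; TCURX/CURRKEY/CURRDEC are the standard SAP objects mentioned above):
-- Rescale the two-implied-decimal amounts returned by the function module to the
-- currency's real number of decimals. TCURX only lists currencies whose number of
-- decimals differs from 2, hence the default of 2 for everything else.
SELECT t.currency,
       t.amount * POWER(10.0, 2 - COALESCE(x.currdec, 2)) AS display_amount
FROM extracted_items t                        -- assumed staging of the FM output
LEFT JOIN tcurx x ON x.currkey = t.currency;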
I am working on an application that handles different types of currencies. The application takes the currencies as input, converts them into USD, and stores that in the database. Some data processing needs to happen, which creates issues with inconsistent rounding. I would like to find out the best way to store currencies. Is it better to store them in the native currency, or to store them all in one currency (as the application does right now) and display them in different currencies based on the culture?
I would have a table
currencies(id, name, sign)
and I would make sure that there are proper conversions:
conversions(id, from_currency, to_currency, value)
You might also want to have a table with conversion histories.
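For illustration, a minimal sketch of that schema (data types, constraints and the history table are assumptions):
CREATE TABLE currencies (
    id   INT PRIMARY KEY,
    name VARCHAR(50) NOT NULL,
    sign VARCHAR(5)  NOT NULL
);

CREATE TABLE conversions (
    id            INT PRIMARY KEY,
    from_currency INT NOT NULL REFERENCES currencies(id),
    to_currency   INT NOT NULL REFERENCES currencies(id),
    value         DECIMAL(19, 8) NOT NULL   -- rate: 1 unit of from_currency in to_currency
);

-- Optional history table: the same rate columns plus an effective date.
CREATE TABLE conversion_history (
    from_currency INT NOT NULL REFERENCES currencies(id),
    to_currency   INT NOT NULL REFERENCES currencies(id),
    value         DECIMAL(19, 8) NOT NULL,
    valid_from    DATE NOT NULL,
    PRIMARY KEY (from_currency, to_currency, valid_from)
);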
I have a table which holds a list of transactions.
Task: To estimate the next transaction amount.
Problem:
The actual payment period for each row is variable: it can be weekly, monthly, or anything chosen by the end user.
Can anyone suggest a good method for estimating the next payment based on the previous data?
At the moment I basically take the figure back to a daily amount and then multiply by the period, i.e. week/month/quarter/year. Then, given the history, I choose the result that has the highest incidence (count).
This does not generate accurate estimates because of payments within payments that I don't need to care about, e.g. a £100 real payment plus £20 of additional charges that are irrelevant.
Another way is to calculate the average, standard deviation and variance between payments and then choose the highest probability.
The problem is, I've been unable to code this in SQL.
SELECT [Identifier]
,[DateTranEntered]
,[Type]
,[TranDateFrom]
,[TranDateTo]
,[Amount]
,[ReferenceForTran]
,[CreatedDate]
FROM [TranTable]
Perhaps something that recurses through the table, calculates every transaction's daily amount, and then, using the variance and incidence, chooses from the last 'x' transactions what the estimated guess is?
The problem is I have gotten stuck with the recursive query for this.
Any thoughts about this?
SQL Server Analysis Services has a suite of data mining tools that provides algorithms such as Linear Regression, Decision Trees and Neural Networks. You can learn more about them here: http://msdn.microsoft.com/en-us/library/ms175595.aspx. It sounds like Linear Regression might be the best place to start for this problem.
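If you want to stay in plain SQL, a minimal sketch of the average / standard deviation idea you describe (using the columns from your SELECT, and assuming the daily amount is Amount spread over the TranDateFrom-TranDateTo span) could look like this:
WITH daily AS (
    SELECT [Identifier],
           -- inclusive day count; NULLIF guards against a zero-length span
           [Amount] * 1.0
             / NULLIF(DATEDIFF(day, [TranDateFrom], [TranDateTo]) + 1, 0) AS DailyAmount
    FROM [TranTable]
)
SELECT [Identifier],
       AVG(DailyAmount)   AS AvgDaily,
       STDEV(DailyAmount) AS StdDevDaily,
       COUNT(*)           AS NumTransactions
FROM daily
GROUP BY [Identifier];
Multiplying AvgDaily by the period length (7, 30, 91, 365, ...) gives a simple next-payment estimate, and rows that sit more than, say, two standard deviations from the mean could be excluded as the "payments within payments" noise you mention.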
For a number of financial instruments, Bloomberg scales the prices that are shown in the Terminal - for example:
FX Futures at CME: e.g. ADZ3 Curncy (Dec-2013 AUD Futures at CME) shows as 93.88 (close on 04-Oct-2013), whereas the actual (CME) market/settlement price was 0.9388
FX Rates: sometimes FX rates are scaled - this may vary by which way round the FX rate is asked for, so EURJPY Curncy (i.e. JPY per EUR) has a BGN close of 132.14 on 04-Oct-2013. The inverse (EUR per JPY) would be 0.007567. However, for JPYEUR Curncy (i.e. EUR per JPY), BGN has a close of 0.75672 for 04-Oct-2013.
FX Forwards: Depending on whether you are asking for rates or forward points (which can be set by overrides)... if you ask for rates, you might get these in terms of the original rate, so for EURJPY1M Curncy, BGN has a close of 132.1174 on 04-Oct-2013. But if you ask for forward points, you would get these scaled by some factor - i.e. -1.28 for EURJPY1M Curncy.
Now, I am not trying to criticise Bloomberg for the way that they represent this data in the Terminal. Goodness only knows when they first wrote these systems, and they have to maintain the functionality that market practitioners have come to know and perhaps love... In that context, scaling to the significant figures might make sense.
However, when I am using the API, I want to get real-world, actual prices. Like... the actual price at the exchange or the actual price that you can trade EUR for JPY.
So... how can I do that?
Well... the approach that I have come to use is to find the FLDS that communicate this scaling information, and then I fetch that value to reverse the scale that they have applied to the values. For futures, that's PX_SCALING_FACTOR. For FX, I've found PX_POS_MULT_FACTOR most reliable. For FX forward points, it's FWD_SCALE.
(It's also worth mentioning that how these are applied varies - PX_SCALING_FACTOR is what futures prices should be divided by, PX_POS_MULT_FACTOR is what FX rates should be multiplied by, and FWD_SCALE is how many decimal places to divide the forward points by to get a value that can be added to the actual FX rate.)
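For reference, the arithmetic itself is trivial once both the raw value and the factor field have been fetched; a sketch with made-up table and column names (each expression applies only to its own instrument type):
-- Hypothetical table bbg_snapshot(security, px_last, px_scaling_factor,
--                                 px_pos_mult_factor, fwd_points, fwd_scale)
SELECT security,
       px_last / px_scaling_factor         AS actual_future_price,  -- futures
       px_last * px_pos_mult_factor        AS actual_fx_rate,       -- FX rates
       fwd_points / POWER(10.0, fwd_scale) AS fwd_points_in_rate    -- add to the spot rate
FROM bbg_snapshot;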
The problem with that is that it doubles the number of fetches I have to make, which adds significant overhead to my use of the API (reference data fetches also seem to take longer than historical data fetches). (FWIW, I'm using the API in Java, but the question should be equally applicable to using the API in Excel or any of the other supported languages.)
I've thought about finding out this information and storing it somewhere... but I'd really like not to have to hard-code that. Also, that would require me to spend a very long time finding out the right scaling factors for all the different instruments I'm interested in. Even then, I would have no guarantee that they wouldn't change their scale on me at some point!
What I would really like to be able to do is apply an override in my fetch that would allow me to specify what scale should be used. (And no, the fields above do not seem to be override-able.) I've asked the "helpdesk" about this on lots of occasions - I've been badgering them about it for about 12 months, but, as ever with Bloomberg, nothing seems to have happened.
So...
has anyone else in the SO community faced this problem?
has anyone else found a way of setting this as an override?
has anyone else worked out a better solution?
Short answer: you seem to have all the available information at hand and there is not much more you can do. But these conventions are stable over time, so it is fine to store the scales/factors instead of fetching the data every time (the scale of EURGBP points will always be 4).
For FX, I have a file with:
number of decimals (for spot, points and the all-in forward rate)
points scale
spot date
To answer your specific questions:
FX Futures at CME: on ADZ3 Curncy > DES > 3:
For this specific contract, the price is quoted in cents/AUD instead of exchange convention USD/AUD in order to show greater precision for both the futures and options. Calendar spreads are also adjusted accordingly. Please note that the tick size has been adjusted by 0.01 to ensure the tick value and contract value are consistent with the exchange.
Not sure there is much you can do about this, apart from manually checking the factor...
FX Rates: PX_POS_MULT_FACTOR is your best bet indeed - note that the value of that field for a given pair is extremely unlikely to change. Alternatively, you could follow market conventions for pairs; AFAIK those rates will always be the actual rate, so use EURJPY instead of JPYEUR (see the sketch after this list). The major currencies, in order, are: EUR, GBP, AUD, NZD, USD, CAD, CHF, JPY. For pairs that don't involve any of those, you will have to fetch the info.
FX Forwards: the points follow the market conventions, but the scale can vary (it is 4 most of the time, but it is 3 for GBPCZK for example). However it should not change over time for a given pair.
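If it helps, that base-currency convention can be encoded once and reused; a sketch assuming a pairs(ccy1, ccy2) table and T-SQL syntax:
-- Rank the two legs by market convention to decide which quote to request
-- (e.g. EURJPY rather than JPYEUR); currencies not in the list rank last.
WITH priority (ccy, rnk) AS (
    SELECT 'EUR', 1 UNION ALL SELECT 'GBP', 2 UNION ALL SELECT 'AUD', 3
    UNION ALL SELECT 'NZD', 4 UNION ALL SELECT 'USD', 5 UNION ALL SELECT 'CAD', 6
    UNION ALL SELECT 'CHF', 7 UNION ALL SELECT 'JPY', 8
)
SELECT p.ccy1, p.ccy2,
       CASE WHEN COALESCE(r1.rnk, 99) <= COALESCE(r2.rnk, 99)
            THEN p.ccy1 + p.ccy2 ELSE p.ccy2 + p.ccy1 END AS conventional_pair
FROM pairs p
LEFT JOIN priority r1 ON r1.ccy = p.ccy1
LEFT JOIN priority r2 ON r2.ccy = p.ccy2;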
Many of the variables in the data I use on a daily basis have blank fields, some of which have meaning (e.g. for a variable holding the ratio of satisfactory accounts to total accounts, a blank response means the individual does not have any accounts at all, whereas a response of 0 means the individual has no satisfactory accounts).
Currently, these records do not get included in logistic regression analyses because they have missing values for one or more fields. Is there a way to include these records in a logistic regression model?
I am aware that I can assign these blank fields a value that is not in the range of the data (e.g. going back to the ratio variable above, we could use 9999 or -1, as these values are outside the range of a ratio variable (0 to 1)). I am just curious to know if there is a more appropriate way of going about this. Any help is greatly appreciated! Thanks!
You can impute values for the missing fields, subject to logical restrictions from your experimental design and to the fact that imputation will somewhat weaken the power of your experiment relative to having the same experiment with no missing values.
SAS offers a few ways to do this. The simplest is to use PROC MI and PROC MIANALYZE, but even those are certainly not a simple matter of plugging a few numbers in. See this page for more information. Ultimately this is probably a better question for Cross-Validated at least until you have figured out the experimental design issues.