Is there a component I can use to create a stock trading/backtesting application in .NET? I am looking for something similar to TradeScript from ModulusFE:
http://www.modulusfe.com/tradescript/
It's not as easy as the guy above tries to make it sound. You're best off using TradeScript, MQL4, EasyLanguage or even MATLAB - there are some back-testing add-ons for MATLAB. You do NOT want to roll your own. There's a lot more to it than meets the eye.
I don't know if I understand you correctly... but you should be able to write a simple back-testing app fairly quickly. You can even do it in Excel...
In any case, there are a couple of things that you need:
Data source (data set).
Trading signals (most likely derived from the data set).
You buy or sell given each trading signal and at the end you just tally up the profit. It's just a handful of functions: iterate through the data set, make predictions, for each prediction buy or sell, keep track of P&L.
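To make that loop concrete, here is a minimal sketch in Python; everything in it (the starting capital, the long-only rule, the toy signal) is illustrative, and a real backtest would also need fees, slippage and position sizing:

```python
# Minimal long-only backtest sketch: walk the price series, act on a
# signal function, and tally profit and loss. All names are illustrative.

def backtest(prices, signal):
    """prices: list of floats; signal: f(history) -> 'buy', 'sell', or 'hold'."""
    cash, position = 10_000.0, 0  # starting capital, shares held
    for i in range(1, len(prices)):
        action = signal(prices[:i])         # decide using data seen so far
        price = prices[i]
        if action == "buy" and position == 0:
            position = int(cash // price)   # go all-in, long only
            cash -= position * price
        elif action == "sell" and position > 0:
            cash += position * price        # close the position
            position = 0
    return cash + position * prices[-1]     # final equity

# Toy signal: buy when the last price dipped, sell when it rose.
final = backtest([100, 98, 99, 103, 101, 105],
                 lambda h: "buy" if len(h) > 1 and h[-1] < h[-2] else "sell")
print(final - 10_000)  # P&L over the run
```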
I'm playing the game Factorio, where you build a factory.
For the time being, I made a kind of flowchart in LibreOffice Calc to calculate how many machines I need to produce a certain material.
Example image from the spreadsheet
Each block has a recipe saved (blue). The recipe includes what and how much it produces and consumes, and how long a craft takes.
Each block takes the demand from the previous block (yellow) and, using the recipe, calculates how many machines (green) it needs to fulfill that demand.
Based on the number of machines, it calculates its own demands (orange).
The following blocks then do the same, until the last block is reached.
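As a sketch of that per-block arithmetic (assuming the usual Factorio convention that a machine's crafting speed divides the recipe's craft time; the recipe numbers below are made up):

```python
import math

def machines_needed(demand_per_sec, recipe_output, craft_time, machine_speed):
    # One machine finishes a craft every craft_time / machine_speed seconds,
    # yielding recipe_output items, so its rate is output * speed / time.
    rate_per_machine = recipe_output * machine_speed / craft_time
    return math.ceil(demand_per_sec / rate_per_machine)

def input_demand(machine_count, recipe_input, craft_time, machine_speed):
    # Each machine consumes recipe_input items per craft at the same cadence.
    return machine_count * recipe_input * machine_speed / craft_time

# Made-up numbers: 2 gears per 0.5 s craft, assembler speed 0.75,
# downstream demand of 10 gears/s.
n = machines_needed(10, recipe_output=2, craft_time=0.5, machine_speed=0.75)
print(n, input_demand(n, recipe_input=4, craft_time=0.5, machine_speed=0.75))
```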
Doing this in a spreadsheet does work, but it is quite a tedious task.
I showed this to my dad, as I'm quite proud of what I made, and he said that maybe a database would be more suitable.
I definitely see its advantages. For example, I could easily summarize the final demands of raw resources, or the total power consumption, etc.
So I got myself Microsoft Access, and I'm pretty lost now. I know the basics of databases and some SQL, but I'm not quite sure how I would build this.
My first attempt was:
one table for machines. It includes each machine's production speed and other relevant stats.
one table for recipes. Each recipe states what it produces, what it needs, the amount of each, and whether or not it is basic. Basic means it is a raw resource, i.e. the production chain ends there.
one table for units. Each unit has a machine, a recipe and an amount. For example, I would have one unit using basic assemblers to produce iron gears. The unit also says how many machines there are; more machines means it needs more and produces more.
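As a sketch, that three-table design could look like this in SQLite (the column names are guesses from the description above; the "what and how much" part of a recipe is split into its own line-item table, which is one way to keep it normalised):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE machine (
    id    INTEGER PRIMARY KEY,
    name  TEXT NOT NULL,
    speed REAL NOT NULL,          -- crafting speed multiplier
    power REAL NOT NULL           -- energy consumption
);
CREATE TABLE recipe (
    id         INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    craft_time REAL NOT NULL,
    is_basic   INTEGER NOT NULL   -- 1 = raw resource, chain ends here
);
-- One recipe has many ingredients/products, so they get their own table.
CREATE TABLE recipe_item (
    recipe_id INTEGER REFERENCES recipe(id),
    item      TEXT NOT NULL,
    amount    REAL NOT NULL,
    role      TEXT CHECK (role IN ('input', 'output'))
);
CREATE TABLE unit (
    id            INTEGER PRIMARY KEY,
    machine_id    INTEGER REFERENCES machine(id),
    recipe_id     INTEGER REFERENCES recipe(id),
    machine_count INTEGER NOT NULL
);
""")
```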
I did manage to make a query that calculates the total inputs and outputs of all units based on their machine and recipe, as well as the total energy consumption.
However, that is nowhere near the spreadsheet I made.
For now we can probably set the graphical overlay aside; that would be quite a bit of overkill. However, what I do want to be able to do:
enter how much I want of a certain resource
based on that entry, the database would create a new table. The first entry would be the unit that produces the requested resource. The second would fulfill the first's demand, the third fulfills the second's demand, and so on.
So in the end I would have a list of units that will produce my requested resource.
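That chain expansion is exactly what a recursive query does. Access itself has no recursive CTEs, so here is the idea sketched in SQLite instead (the table, items and ratios are all invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE recipe_dep (item TEXT, ingredient TEXT, ratio REAL);
-- ratio = ingredients consumed per one unit of item (made-up numbers)
INSERT INTO recipe_dep VALUES
    ('iron gear',  'iron plate', 2.0),
    ('iron plate', 'iron ore',   1.0);
""")
rows = con.execute("""
WITH RECURSIVE demand(item, amount) AS (
    SELECT ?, ?                               -- the requested resource
    UNION ALL
    SELECT d.ingredient, dm.amount * d.ratio
    FROM demand AS dm
    JOIN recipe_dep AS d ON d.item = dm.item
)
SELECT item, SUM(amount) FROM demand GROUP BY item
""", ("iron gear", 10.0)).fetchall()
print(rows)  # totals per item: gear 10, plate 20, ore 20
```

The recursion stops on its own wherever an item has no ingredient rows, which matches the "basic" flag idea.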
I hope someone can help me. There are programs out there that already do this kind of stuff, but I want to do this myself. If this is a problem that a database isn't suited for, then please tell me so.
Thanks for any help!
I'm working on the technical architecture for a content solution integration. The data from the solution provider runs to millions of rows and is normalised to 3NF. It is updated on a regular schedule (daily, most likely) and is split down to a very granular level of atomicity.
I need to search and query this data, and my current inclination is to leave the normalised data alone and create a denormalised database from it (OLTP to OLAP). The 'transfer' can be a custom-built program that contains the necessary business logic in addition to the raw copying, and can be run on a set schedule as required. The denormalised database would then reduce the atomicity and allow keyword searches and queries to run efficiently. I was looking at using Lucene.NET for the keyword work on the denormalised database.
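As a sketch of what the core of that transfer program could look like (every table, column and value below is invented; the real ETL would carry the business logic in the same place):

```python
import sqlite3

# Stand-in for the provider's 3NF feed (all names invented).
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE article (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
CREATE TABLE person  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE body    (article_id INTEGER, text TEXT);
INSERT INTO person  VALUES (1, 'A. Writer');
INSERT INTO article VALUES (10, 'Hello', 1);
INSERT INTO body    VALUES (10, 'Full article text...');
""")

# The denormalised, search-oriented copy: one wide row per article.
dst = sqlite3.connect(":memory:")
dst.execute("""CREATE TABLE article_flat
               (id INTEGER PRIMARY KEY, title TEXT, author TEXT, body TEXT)""")

# The 'transfer' step: join the granular rows flat; article_flat is what
# an indexer (Lucene.NET in the .NET world) would then ingest.
rows = src.execute("""
    SELECT a.id, a.title, p.name, b.text
    FROM article a
    JOIN person p ON p.id = a.author_id
    JOIN body   b ON b.article_id = a.id
""")
dst.executemany("INSERT INTO article_flat VALUES (?, ?, ?, ?)", rows)
dst.commit()
print(dst.execute("SELECT * FROM article_flat").fetchall())
```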
So before I sing loudly from the hills that this is the way forward, I wanted some expert opinion on it and on the perceived "best practice". Is the method I have suggested the best way forward, considering the data I will be provided? It was suggested that perhaps I could use a 'search engine' to search the normalised data. This scared the hell out of me, but raised the question: what search engine, and how?
Opinions, flames, bad language and help appreciated :)
I have built reporting databases and data warehouses based on data stored in normalized form. There is quite a bit of work involved in the transfer program (ETL). Given your description of the data feed, maybe some of that work has been done for you by the feeder.
Millions of rows isn't a lot these days. You may be able to get away with report-oriented views into the existing database. Try it and see.
The biggest benefit to building an OLAP oriented database is not speed. It's flexibility. "We love this report, but now we want to see it weekly and quarterly instead of monthly. Bam! Done!" "Can you break it down by marketing category instead of manufacturing category? Bam! Done!" And so on.
A reasonably normalized model (3NF/BCNF) provides the best average performance and the fewest modification anomalies for the largest number of scenarios. That's big, so I would start from there. As your requirements are fuzzy, it seems like the most sensible option.
Actually, the most sensible thing would be to go over the requirements until they are a bit more "crisp" ;)
Also, if you could get your hands on a few early extracts from your data provider, you could experiment with them and get a feel for the data distributions (not all people live in one country, and some countries hold more people than others; not all people have children, and the number of children per person varies widely by country). This is a major point: those distributions are what the optimizer relies on to make good decisions.
Other than that, I agree with everything Walter said and also gave him my vote.
Sorry, I know the question isn't as specific as it could be. I am currently working on a replenishment forecasting system for a clothing company (don't ask why it's in VBA). The module I am currently working on is distribution forecasts down to a size level. The idea is that the planners can forecast the number to sell, then specify a ratio between the sizes.
In order to make the interface a bit nicer I was going to give them four options: assess trend, manual entry, Poisson and Normal. The last two are where I am having an issue. Given a mean and SD, I'd like to drop in a ratio (preferably as %s) between the different sizes. The number of sizes can vary from 1 to ~30, so it's going to need to be a calculation.
If anyone could point me towards a method I'd be eternally grateful - likewise if you have suggestions for a better method.
Cheers
For the sake of anyone searching this: whilst only a temporary solution, I used probability mass functions to get the ratios. This allowed the user to modify the mean and SD and thus skew the curve as they wished, and I could then use the ratios in my calculations. Poisson also worked with this method, but turned out to be a slightly stupid choice.
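For anyone landing here, this is roughly what that looks like in Python rather than VBA (the size positions and parameters are made up): evaluate a normal curve at each size index, then normalise the weights to percentages.

```python
import math

def size_ratios(n_sizes, mean, sd):
    """Weight each of n_sizes size positions by a normal curve, as %s."""
    weights = [math.exp(-((i - mean) ** 2) / (2 * sd ** 2))
               for i in range(1, n_sizes + 1)]
    total = sum(weights)
    return [100 * w / total for w in weights]

# 6 sizes (say XS..XXL), curve centred on size 3.5 with SD 1.2
print([round(r, 1) for r in size_ratios(6, mean=3.5, sd=1.2)])
```

For the Poisson option, the same normalisation works with `math.exp(-mean) * mean**k / math.factorial(k)` as the weight.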
Tried doing a bit of research on the following with no luck. Thought I'd ask here in case someone has come across it before.
I help a volunteer-run radio station with their technology needs. One of the main things that has come up is that they would like to schedule their advertising programmatically.
There are a lot of neat and complex rule engines out there for advertising, but all we need is something pretty simple (along with any experience that's worth thinking about).
I would like to write something in SQL if possible to deal with these entities. Ideally if someone has written something like this for other advertising mediums (web, etc.,) it would be really helpful.
Entities:
Ads (consisting of a category, # of plays per day, start date, end date or permanent play)
Ad Category (Restaurant, Health, Food store, etc.)
To over-simplify the problem, this will be an elegant SQL statement. Getting there... :)
I would like to be able to generate a playlist per day using the above two entities where:
No two ads in the same category are played within x number of ads of each other.
(nice to have) high promotion ads can be pushed
At this time, there are no "ad slots" to fill and no "time of day" considerations.
We queue up the ads for the day and go through them between songs/shows, etc. We know how many per hour we have to fill, etc.
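To pin the main constraint down, here is a greedy sketch in Python rather than SQL (the ad names and counts are invented): it walks the day's required plays and always picks an ad whose category hasn't appeared in the last x slots, relaxing the rule only if it gets stuck:

```python
import random

def build_playlist(ads, x):
    """ads: list of (name, category, plays_today); x: category spacing."""
    pool = [(name, cat) for name, cat, plays in ads for _ in range(plays)]
    random.shuffle(pool)
    playlist = []
    while pool:
        recent = {cat for _, cat in playlist[-x:]}          # last x categories
        pick = next((ad for ad in pool if ad[1] not in recent), None)
        if pick is None:        # stuck: only same-category ads remain
            pick = pool[0]      # relax the rule rather than fail outright
        playlist.append(pick)
        pool.remove(pick)
    return playlist

ads = [("Joe's Diner", "Restaurant", 3), ("VitaMart", "Health", 3),
       ("GreenGrocer", "Food store", 2)]
print(build_playlist(ads, x=1))
```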
Any thoughts/ideas/links/examples? I'm going to keep on looking and hopefully come across something instead of learning it the long way.
Very interesting question, SMO. Right now it looks like a constraint programming problem, because you aren't looking for an optimal solution, just one that satisfies all the constraints you have specified. In response to those who wanted to close the question, I'd say they need to check out constraint programming a bit. It's far closer to Stack Overflow than any operations research site.
Look into constraint programming and scheduling - I'll bet you'll find an analogous problem toot sweet!
Keep us posted on your progress, please.
Ignoring the T-SQL request for the moment since that's unlikely to be the best language to write this in ...
One of my favorite approaches to tough 'layout' problems like this is simulated annealing. It's a good approach because you don't need to think about HOW to solve the actual problem: all you define is a measure of how good the current layout is (a score, if you will), and then you allow random changes that either increase or decrease that score. Over many iterations you gradually reduce the probability of moving to a worse score. This 'simulated annealing' approach reduces the probability of getting stuck in a local minimum.
So in your case the scoring function for a given layout might be based on the distance to the next advert in the same category and the distance to another advert of the same series. If you later have time of day considerations you can easily add them to the score function.
Initially you allocate the adverts sequentially, evenly or randomly within their time window (it doesn't really matter which). Now you pick two slots and consider what happens to the score when you switch the contents of those two slots. If either advert moves out of its allowed range, you can reject the change immediately. If both are still in range, does it move you to a better overall score? Initially you accept changes at random even if they make the score worse, but over time you reduce the probability of that happening, so that by the end you are moving monotonically towards a better score.
Easy to implement, easy to add new 'rules' that affect score, can easily adjust run-time to accept a 'good enough' answer, ...
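A compact sketch of that loop in Python, scoring only the same-category spacing; the cooling schedule and all the numbers are arbitrary choices:

```python
import math
import random

def score(playlist, x):
    """Penalise same-category ads that fall within x slots of each other."""
    bad = 0
    for i, (_, cat) in enumerate(playlist):
        for j in range(i + 1, min(i + x + 1, len(playlist))):
            if playlist[j][1] == cat:
                bad += 1
    return -bad                      # higher is better, 0 means no violations

def anneal(playlist, x, steps=20_000, t0=2.0):
    current = score(playlist, x)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9          # cooling schedule
        i, j = random.sample(range(len(playlist)), 2)
        playlist[i], playlist[j] = playlist[j], playlist[i]
        new = score(playlist, x)
        # Accept improvements always; accept regressions with shrinking odds.
        if new < current and random.random() > math.exp((new - current) / t):
            playlist[i], playlist[j] = playlist[j], playlist[i]  # undo swap
        else:
            current = new
    return playlist

slots = [("ad%d" % i, random.choice("ABC")) for i in range(20)]
print(score(anneal(slots, x=2), x=2))   # ideally reaches 0 violations
```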
Another approach would be to use a genetic algorithm; see this similar question: Best Fit Scheduling Algorithm. A genetic algorithm is likely harder to program but will probably converge more quickly on a good answer.
I want to figure out how much money I'd save if I optimise some part of my web app. If I save 100 cpu milliseconds over 50K calls to the app, how much electricity is that not using in a day? How about over a year?
I've tried to find some figures through Google, but my googling mojo is failing me at present.
You can't calculate something that specific. You can only conduct an experiment and see what happens.
But honestly I would rather spend time refactoring code for better maintainability and adding new features the customers will like and pay for, so that I won't have to think about electricity.
When "optimizing" it is always important to focus on what you want to "optimize" - in this case, your electricity bill. I would not even bother looking at changing code in an attempt to affect your electricity bill. I would look at the computer's power supply, cooling fans, heat sink, etc. and optimize those things for energy efficiency (buy new, more efficient components). More than likely it will cost less than several hours of a software engineer "optimizing" code for energy efficiency.