How to get Moving Average and Stochastic data from the indicators attached to a chart in mq4?

How can I get data with an mq4 script from the following chart?
[Chart: Moving Average and Stochastic in MetaTrader]
As you can see, I have attached two indicators: Moving Average and Stochastic.
I started to write a script, but I have no idea how to get data from the indicators attached to the chart so that I can start processing it.
Is there a handle somewhere that returns an array? Global arrays?
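For what it's worth, in MQL4 there is no handle that returns an attached indicator's buffer (that mechanism exists in MQL5); the usual approach is for the script to call the built-in indicator functions with the same parameters as the indicators on the chart and build its own arrays. A minimal sketch, where the periods (14 for the MA, 5/3/3 for the Stochastic) are assumptions to be matched to your chart settings:

// Minimal MQL4 script: read MA and Stochastic values by calling the
// built-in indicator functions directly.
void OnStart()
{
   // 14-period simple MA of close prices on the current chart, bar 0
   double ma = iMA(Symbol(), Period(), 14, 0, MODE_SMA, PRICE_CLOSE, 0);

   // Stochastic (5,3,3): main line (%K) and signal line (%D), bar 0
   double stochK = iStochastic(Symbol(), Period(), 5, 3, 3, MODE_SMA, 0, MODE_MAIN, 0);
   double stochD = iStochastic(Symbol(), Period(), 5, 3, 3, MODE_SMA, 0, MODE_SIGNAL, 0);

   Print("MA=", ma, "  %K=", stochK, "  %D=", stochD);
}

To get a series rather than a single value, loop over the last argument (the bar shift) and fill your own array.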

Related

Stata: Create panel dataset with two dataframes, no common variable

I am creating a city-by-day panel from scratch, but I'm having trouble balancing and filling in the data. Every city needs to have an observation for every day between 01jan2000 and 31dec2019; my variable of interest is a dummy variable recording whether or not an event took place on that day in that city.
My original dataset only recorded observations where event == 1, and I managed to fill in time gaps using tsfill, but I can't figure out how to balance the data or extend it to span 01jan2000 through 31dec2019. I need every date and city because the panel will eventually be merged with data that uses that sample period.
My current approach is to create a balanced and filled-in panel and then merge in the event data using the date it took place. I have a Stata dataset containing the 7,305 dates, and another containing the 273 cityids I'm observing. Is it possible to generate a new dataset that combines these two so that all 273 cities are observed every day? Essentially there will be 273 x 7,305 observations with no variables of interest.
Any help figuring out how to solve the unbalanced-panel issue using either of these approaches is hugely appreciated.
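If it helps, the cross-combination step can be done with cross. A minimal Stata sketch, where the file and variable names (dates.dta with a date variable, cities.dta with cityid, events.dta with the event dummy) are assumptions:

* Build the balanced skeleton: every cityid paired with every date
use dates.dta, clear          // 7,305 daily observations
cross using cities.dta        // all pairwise combinations: 273 x 7,305 rows

* Bring in the event data; unmatched city-days had no event
merge 1:1 cityid date using events.dta, keep(master match) nogenerate
replace event = 0 if missing(event)

Note that fillin cityid date only creates combinations of values that already appear somewhere in the data, so a date with no events in any city would still be missing; crossing the two single-variable datasets avoids that.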

How to set a KOFAX KTM Server global variable that is initialized in Batch_Open, updated in SeparateCurrentPage, and used in Batch_Close?

I am trying to count a specific barcode value in Project.Document_SeparateCurrentPage and use it in Batch_Close to check whether the count is greater than 1; if it is, the batch should be sent to a specific queue with a specific priority. I used a global variable in the KTM Project Script to hold the count, initialized to 0 in Batch_Open. It worked fine through unit testing, but our automation team found that out of 20 similar batches, a few were sent to the queue that a batch should only reach when the count is greater than one, even though they used only one barcode.
I googled and found that KTM Server script events do not allow sharing information across different processes (https://docshield.kofax.com/KTM/en_US/6.4.0-uuxag78yhr/help/SCRIPT/ScriptDocumentation/c_ServerScriptEvents.html). I then tried to use a batch field to hold the barcode count, but I was unable to update its value from the Project.Document_SeparateCurrentPage function using pXRootFolder.Fields.ItemByName("BatchFieldName").Text = "GreaterThanOne". The logs show that the batch reads the first page three times and then errors out.
Any links would help. Thanks in advance.
As you mentioned, the different phases of batch/document processing can execute in different processes, so global variables initialized in one event won’t necessarily be available in others. Ideally you should only use global variables if their content can be set from Application_InitializeScript or Application_InitializeBatch, because these events occur in each separate process. As you’ve found out, you shouldn’t use a global variable for your use case, because Document_SeparateCurrentPage and Batch_Close for one batch may occur in different processes, just as the same process will likely execute those events for multiple batches.
Also, you cannot set batch fields from document level events for a related reason: any number of separate processes could be processing documents of a batch in parallel, so batch level data is read-only to document events. It is a bit unintuitive, but separation is a document level event even though it seems like it is acting on the whole batch. (The three times you saw is just an error retry mechanism.)
If it meets your needs, the simplest answer might be to use a barcode locator as part of normal extraction (not just separation), and assign to a field if needed. While you cannot set batch fields from document events, you can read document data from batch events. So instead of trying to track something like a count over the course of document events, just make sure whatever data you need is saved at a document level. Then in a Batch_Close you can iterate the documents and count/calculate whatever you need. (In your case maybe the number of locator alternatives for the barcode locator, across each document.)
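If a sketch helps, the Batch_Close counting pass might look like this in KTM Project Script (WinWrap Basic). The document field name "BarcodeValue" and the batch field name "BatchFieldName" are assumptions, and the exact collection accessors can vary slightly between KTM versions:

' Batch_Close is a batch-level event, so reading document data and
' writing batch fields are both allowed here.
Private Sub Batch_Close(ByVal pXRootFolder As CASCADELib.CscXFolder, ByVal CloseMode As CASCADELib.CscBatchCloseMode)
   Dim i As Long
   Dim barcodeCount As Long
   For i = 0 To pXRootFolder.DocInfos.Count - 1
      ' Count documents whose barcode field was filled during extraction
      Dim xdoc As CASCADELib.CscXDocument
      Set xdoc = pXRootFolder.DocInfos.ItemByIndex(i).XDocument
      If xdoc.Fields.ItemByName("BarcodeValue").Text <> "" Then
         barcodeCount = barcodeCount + 1
      End If
   Next i
   If barcodeCount > 1 Then
      pXRootFolder.Fields.ItemByName("BatchFieldName").Text = "GreaterThanOne"
   End If
End Sub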

Is this diagram daily_seasonality and yearly_seasonality?

I am trying to create a time series model; I just need help identifying the type of the blue line in the graph shown below.
I am using fbprophet, the code is:
model = Prophet(interval_width=0.97, daily_seasonality=True, yearly_seasonality=True)
Is this daily_seasonality and yearly_seasonality?
The graph is:
Your plot is quite unclear and we might need more info about the data; from what I see, I'd guess it is covid related (pure guess). Seasonality components are additive, so daily_seasonality=True, yearly_seasonality=True leads to the sum of the yearly and daily seasonalities. However, yearly seasonality will not play a role since you have data from just one year, so in this case only the daily information has value. You might be interested in this article, where the authors also take data from holiday events into consideration.
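A direct way to check is to plot the fitted components separately; each seasonality gets its own panel, so you can see which component the blue line belongs to. A minimal sketch, assuming a dataframe df with the ds/y columns Prophet expects:

from fbprophet import Prophet   # newer releases: from prophet import Prophet
import pandas as pd

df = pd.read_csv("data.csv")    # assumed input: columns ds (date) and y (value)

model = Prophet(interval_width=0.97, daily_seasonality=True, yearly_seasonality=True)
model.fit(df)

future = model.make_future_dataframe(periods=30)
forecast = model.predict(future)

# One panel per component: trend, yearly seasonality, daily seasonality
fig = model.plot_components(forecast)
fig.savefig("components.png")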

How to add condition in splunk data model constraint

I have an outbound flow that gets data written by the App, mem, and cards APIs. The cards and mem APIs write logs into applog, but App writes its logs to syslog.
In my data model I have sourcetype=app_log as the source type, so for all flows except App I get the right Splunk dashboard report, but for App I am not getting any data.
So I want to add a condition in the data model "constraint" section like:
when api is applications then sourcetype=app_log OR sourcetype=sys_log
else sourcetype=app_log
Can anyone advise how to do this in Splunk?
If you need a dual sourcetype, it's usually best to make that part of the root search/event so you draw in all the relevant data you would like to use in your data model.
A data model is like shearing away wood on a sculpture: it's usually better to start with all of the data and then slowly pick away at what you want to see.
You can add | where clauses as constraints; however, you can't add more data if you don't start with it in the root events.
My suggestion would be something like this in your root search:
(index=index1 sourcetype=sourcetype1) OR (index=index2 sourcetype=sourcetype2) field=blah field=blah ....
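Applied to the sourcetypes in the question, the when/else condition collapses into plain boolean logic in the root search. A sketch, assuming an api field is available at search time and the index name is a placeholder:

(index=your_index sourcetype=app_log) OR (index=your_index sourcetype=sys_log api=applications)

Events from sys_log are pulled in only when api=applications, while app_log events are always included, which matches the condition above.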

Advice on a little script to flag nearby shipping addresses in the next 3 weeks

Hi guys, I'm looking for advice on how to create a script that calculates the distance between all shipping addresses in the next 3 weeks and, if the distance between any of these locations is under 50 km, sends an email to notify the operator. I'm starting from SQL Server 2005 to pull the locations and build all the combinations. I was also thinking there might be some online service where I can put two location names into a URL and get back the distance between those two points, then store it in a variable to use for comparisons. I'll wait for some advice, thank you guys.
Your script will probably take in two locations as parameters, and you can determine the distance using the Google Maps Distance Matrix API to get the distance from one place to another. Using that info, you should be able to send an email from the script.
I'm not really sure what the database has to do with this, so I recommend being more specific about your SQL implementation.
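A minimal Python sketch of the distance-lookup half, assuming the Google Maps Distance Matrix API; the API key, addresses, and email hook are placeholders:

import requests

API_KEY = "YOUR_API_KEY"  # placeholder

def distance_km(origin, destination):
    # Ask the Google Maps Distance Matrix API for the driving distance
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/distancematrix/json",
        params={"origins": origin, "destinations": destination, "key": API_KEY},
    )
    element = resp.json()["rows"][0]["elements"][0]
    return element["distance"]["value"] / 1000.0  # metres -> kilometres

# Compare each pair of addresses pulled from the database
if distance_km("Address A", "Address B") < 50:
    print("Notify the operator by email")  # hook your email sending in here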