Bitcoin Exchange API - more frequent high/low values

Is there any way to get a high/low value more frequently than every 24 hours from, say, the Bitstamp API ticker?
This link only tells you how to get the value for every 24 hours
https://www.bitstamp.net/api/
(this also seems to be a problem with every other exchange I've tried)

The 24-hour figure is computed as time_now - 24 hours (a rolling window), so it should give you updated values every second, or maybe every minute, depending on how the API is configured.
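If the exchange only exposes a 24-hour high/low, one workaround is to poll the ticker yourself and keep your own rolling high/low over whatever window you need. A minimal Python sketch along those lines, assuming the public v2 ticker endpoint and a "last" field in the JSON response (verify both against the current Bitstamp docs):

import time
from collections import deque

import requests

TICKER_URL = "https://www.bitstamp.net/api/v2/ticker/btcusd/"  # verify against the current Bitstamp docs
WINDOW_SECONDS = 3600  # rolling one-hour high/low instead of the fixed 24 hours
POLL_INTERVAL = 10     # seconds between polls; respect the exchange's rate limits

samples = deque()  # (timestamp, last_price) pairs inside the window

while True:
    data = requests.get(TICKER_URL, timeout=10).json()
    now = time.time()
    samples.append((now, float(data["last"])))  # "last" field assumed from the ticker response

    # Drop samples that have fallen out of the rolling window.
    while samples and samples[0][0] < now - WINDOW_SECONDS:
        samples.popleft()

    prices = [price for _, price in samples]
    print("high=%.2f low=%.2f over the last %d s" % (max(prices), min(prices), WINDOW_SECONDS))
    time.sleep(POLL_INTERVAL)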

Related

MediaTailor metrics in CloudWatch

I'm looking at MediaTailor metrics in CloudWatch and found that the "Avail" group contains: Duration, ObservedDuration, FilledDuration, ObservedFilledDuration, FillRate, ObservedFillRate.
For Duration, for example, the documentation says that Duration is a "planned" value and ObservedDuration is an "observed" value, but that is not clear to me. I guess that "planned" is according to the ad marker in the manifest and "observed" comes from the ad insertion step itself (is that correct?). I also guess that the "observed" values are more accurate.
Anyway, I would expect the "planned" and "observed" values to be similar, but usually that is not the case. Here are a couple of examples of the values:
The FilledDuration values are similar, but Duration and FillRate are really different, so I don't understand which one I should use.
I guess that "planned" is according to the ad marker in the manifest and "observed" comes from the ad insertion step itself (is that correct?). I also guess that the "observed" values are more accurate.
Yes, the observed values are what MediaTailor takes action on based on the VAST response from the Ad Decision Server (ADS). Planned is the value received from the Origin via SCTE-35 messaging.
Per the following documentation: https://docs.aws.amazon.com/mediatailor/latest/ug/monitoring-cloudwatch-metrics.html
Duration - The planned total number of milliseconds of ad avails within the CloudWatch time period. The planned total is based on the ad avail durations in the origin manifest.
ObservedDuration - The observed total number of milliseconds of ad avails that occurred within the CloudWatch time period. Avail.ObservedDuration is emitted at the end of the ad avail, and is based on the duration of the segments reported in the manifest during the ad avail.
To continue your example regarding the Duration metric, let's say that for a set period of time the Origin server sends a manifest with a single 90-second ad break. MediaTailor will make a request to the ADS asking for 90 seconds of content. The ADS returns a VAST response that includes a 45-second ad and a 30-second ad, for a total of 75 seconds. MediaTailor will report to CloudWatch that, for that period of time, the Duration of all avails planned by the Origin server was 90 seconds, but the ObservedDuration provided by the ADS was 75 seconds.
The documentation goes into further detail regarding each metric and even provides some examples.
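If you want to pull both numbers side by side, a small boto3 sketch like the following should work; the AWS/MediaTailor namespace, the Avail.Duration / Avail.ObservedDuration metric names, and the ConfigurationName dimension are taken from that page, so treat them as values to verify against your own account:

from datetime import datetime, timedelta

import boto3

cloudwatch = boto3.client("cloudwatch")

def avail_metric_sum(metric_name, configuration_name, hours=1):
    # Sum one MediaTailor "Avail" metric (in milliseconds) over the last `hours` hours.
    end = datetime.utcnow()
    response = cloudwatch.get_metric_statistics(
        Namespace="AWS/MediaTailor",          # namespace per the linked docs
        MetricName=metric_name,               # e.g. "Avail.Duration" or "Avail.ObservedDuration"
        Dimensions=[{"Name": "ConfigurationName", "Value": configuration_name}],
        StartTime=end - timedelta(hours=hours),
        EndTime=end,
        Period=3600,
        Statistics=["Sum"],
    )
    return sum(point["Sum"] for point in response["Datapoints"])

planned = avail_metric_sum("Avail.Duration", "my-mediatailor-config")
observed = avail_metric_sum("Avail.ObservedDuration", "my-mediatailor-config")
print("planned=%s ms, observed=%s ms" % (planned, observed))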

How is a cache entry's validity period calculated in Mule 4?

If I cache a payload, how long will it be valid?
There are two settings in the caching-strategy:
Entry TTL and
Expiration Interval.
If I want to invalidate my cached value after 8 hours, how should I set the above parameters?
What is the 'invalidate cache' processor used for?
Entry TTL is how long an entry should live in the cache. Expiration interval is how frequently the object store will check the entries to see if one should be deleted. In your case entryTTL should be 8 hours. Be mindful of the units used for each attribute. Expiration interval is a bit trickier: you may want to check entries much more frequently so they don't live much longer than 8 hours before expiring. It could be 10 minutes, 30 minutes, 1 hour, or whatever works for you.
I explained it in more detail in my blog: https://medium.com/@adobni/configuring-an-object-store-in-mule-4-5da609e3456a

Calculate number of events per last 24 hours with Redis

This seems like a common task, but I haven't found a solution.
I need to calculate the number of a user's events (for example, how many comments they left) over the last 24 hours. Older data doesn't interest me, so information about comments added a month ago should be removed from Redis.
Right now I see only one solution: make keys that include the user's ID and the hour of day, and increment their values. Then we read 24 values and calculate their sum. Each key has a 24-hour expiration.
For example,
Event at Jun 22, 13:27 -> creating key _22_13 = 1
Event at Jun 22, 13:40 -> incrementing key _22_13 = 2
Event at Jun 22, 18:45 -> creating key _22_18 = 1
Event at Jun 23, 16:00 -> creating key _23_16 = 1
Getting the sum of events at Jun 23, 16:02 -> sum of keys _22_17 - _22_23 and _23_00 - _23_16: of our keys only _22_18 and _23_16 exist, so the result is 2. Key _22_13 has expired.
This method is not exact (it counts events over the last 24 to 25 hours) and not very universal (what keys would I choose if I needed the number of events over the last 768 minutes, or over 2.5 months?).
Do you have better solution with Redis data types?
Your model seems fine. Of course, it's not universal, but it's what you have to sacrifice in order to be performant.
I suggest you keep doing this. If you need another timespan for reporting (say 768 minutes), you can always get it from MySQL, where your comments are stored (you do store them there, right?). That will be slower, but at least the query will be served.
If you need a faster response or higher precision, you can store counters in Redis with minute resolution (one counter per minute).
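For reference, here is a rough redis-py sketch of the hour-bucket counter model described above; the key naming scheme and the 25-hour TTL are just illustrative choices:

import time

import redis

r = redis.Redis()
WINDOW_HOURS = 24

def hour_key(user_id, ts):
    # One counter per user per hour, e.g. "events:42:2024062213".
    return "events:%d:%s" % (user_id, time.strftime("%Y%m%d%H", time.gmtime(ts)))

def record_event(user_id):
    key = hour_key(user_id, time.time())
    pipe = r.pipeline()
    pipe.incr(key)
    # Expire after 25 hours so a bucket is still readable for the full 24-hour lookback.
    pipe.expire(key, (WINDOW_HOURS + 1) * 3600)
    pipe.execute()

def count_last_24h(user_id):
    now = time.time()
    keys = [hour_key(user_id, now - h * 3600) for h in range(WINDOW_HOURS)]
    return sum(int(v) for v in r.mget(keys) if v is not None)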
You can use the Redis EXPIRE command after each key creation.
SET comment "Hello, world"
EXPIRE comment 1000 // in seconds
PEXPIRE comment 1000 // in milliseconds
Details are in the Redis documentation for the EXPIRE command.

GAE Java Channel API

At http://code.google.com/intl/es-ES/appengine/docs/quotas.html#Channel you can read that, with billing enabled, the maximum channel creation rate is 60 creations/minute. Does that mean we can create only 86,400 channels/day? That's a very low rate, isn't it? And if I estimate that I could have peaks of, for example, 4,000 creations/minute, what can I do? 60 creations/minute is very few if the channels are one-to-one. Is this correct?
My interpretation of that section is that you will NOT be able to create 4k connections per minute. Here is how I would think about it: over ANY 1-minute period, no more than 60 channels can be created. For example, you can create 60 channels at time T. Then, for the next 60 seconds you won't be able to create any. Or, you can create 30 at time T. Then, every 2 seconds, create a channel.
I believe another way to think about this is in terms of the token bucket algorithm.
Anyway, I believe you can fill out this form to request a higher limit. There is a link to that form from the docs that you linked to in your question.
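To make the token-bucket view concrete, here is a small illustrative Python sketch using the 60 creations/minute figure from the question; the class is not part of any App Engine API, just a way to pace your own channel creations:

import time

class TokenBucket:
    # Illustrative limiter: capacity 60 tokens, refilled at 1 token/second,
    # matching a quota of 60 channel creations per minute.
    def __init__(self, capacity=60.0, refill_per_second=1.0):
        self.capacity = capacity
        self.refill_per_second = refill_per_second
        self.tokens = capacity
        self.last = time.monotonic()

    def try_acquire(self):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.refill_per_second)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket()
if bucket.try_acquire():
    pass  # safe to create a channel now; otherwise queue the creation for later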

Webtrends REST API Limit

Looking at the documentation, I found this information:
Data Extraction API requests are rate limited by number of requests and data download volume per unit of time (hour, day, week, or month). If you exceed the limit, a 403 error occurs.
Could someone tell me more about this limit? How many calls per day/month/year?
I just spoke to Webtrends support. The limit is 600 requests per user per hour, or 1 GB of data per user per hour.
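In practice that means a client should treat a 403 as "wait for the hourly window to reset". A rough Python sketch of that, where the report URL is only a placeholder to replace with your own Data Extraction API request:

import time

import requests

# Placeholder report URL; substitute your own Data Extraction API request.
REPORT_URL = "https://ws.webtrends.com/v3/Reporting/profiles/PROFILE_ID/reports/REPORT_ID/"

def fetch_with_backoff(params, auth, max_retries=5):
    # Fetch a report, backing off whenever the hourly rate limit (HTTP 403) is hit.
    delay = 60  # start with a one-minute wait; the limit window is an hour
    for _ in range(max_retries):
        response = requests.get(REPORT_URL, params=params, auth=auth, timeout=30)
        if response.status_code != 403:
            response.raise_for_status()
            return response.json()
        time.sleep(delay)
        delay = min(delay * 2, 3600)  # never wait longer than the one-hour window
    raise RuntimeError("rate limit still in effect after retries")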