Alpha Vantage: how to get NSE Nifty 50 and Nifty 100 stock symbols

I am trying to get historical data for NSE stocks at the following intervals:
5 minutes, 1 hour, and daily.
How can I get the Nifty 50 and Nifty 100 stock symbols so I can access their data through the Alpha Vantage API?

According to the Alpha Vantage docs you can request plenty of time series, and your desired intervals are possible (both the intraday and the daily requests). The docs explain how to set the API parameters for each time series, and examples in different languages are provided below each one.
As for the Nifty stocks I don't know; maybe just try the ticker symbols in the request.
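For illustration, here is a minimal sketch of an intraday request in Python using the requests library. The exact NSE symbol spelling is an assumption you would need to verify; Alpha Vantage's documented SYMBOL_SEARCH endpoint can be used to confirm how a given ticker is spelled before requesting its data:

    import requests

    API_KEY = "YOUR_API_KEY"  # replace with your Alpha Vantage key
    BASE_URL = "https://www.alphavantage.co/query"

    # SYMBOL_SEARCH is a documented endpoint; use it to confirm how
    # Alpha Vantage spells a given ticker before requesting data.
    search = requests.get(BASE_URL, params={
        "function": "SYMBOL_SEARCH",
        "keywords": "Reliance",   # example Nifty 50 constituent
        "apikey": API_KEY,
    }).json()
    print(search.get("bestMatches", []))

    # Intraday time series at a 5-minute interval for one symbol.
    # "RELIANCE.BSE" is an assumed symbol spelling; substitute whatever
    # the symbol search above returns for the exchange you want.
    data = requests.get(BASE_URL, params={
        "function": "TIME_SERIES_INTRADAY",
        "symbol": "RELIANCE.BSE",
        "interval": "5min",
        "apikey": API_KEY,
    }).json()
    print(list(data.keys()))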

Related

How to increase the product uploading speed in Shopify?

I have more than 600k products in my Shopify store. The store is taking too much time to upload products in the admin back-end (11k products took almost 8 hours to complete the upload process).
I have even used the Shopify Product API to add my products to the store.
Even the API is taking too much time to insert a product into the store.
Now I am quite confused about which I should prefer for uploading the products:
through the admin back-end, or a Shopify API call?
Please suggest the best way.
Thank you.
If you have that many products you should either be looking at Shopify Plus or another platform entirely.
Each product takes one API call to upload and over time your API call limit averages out to 2 per second so 600k products with one variant per product would take 83 hours to upload. Your 11k products should only take 1.5 hours to upload though so unless you have a number of apps running there is something wrong with your API setup.
If you maximize the partitioning of your products into variants you can upload a product and its variants in a single call. Each product may have up to 100 variants, so if you can group your products into variants the theoretical saving could bring this down to 6k API calls and just under an hour of processing. (If you have variant images I think you'd need 3 calls per product/variant/image group: one to upload the products/variants/images, one to read the variant and image ids, and one to assign the images to the variants.)
Shopify Plus has 5 times the API limit (though I can't find an official confirmation of this) so your 600k products could be uploaded in 16 hours.
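To make the variant-batching idea concrete, here is a minimal sketch of creating one product with several variants in a single REST call, assuming Python with the requests library; the shop domain, access token, and API version in the URL are placeholders:

    import requests

    SHOP = "your-store.myshopify.com"          # placeholder shop domain
    TOKEN = "YOUR_ADMIN_API_ACCESS_TOKEN"      # placeholder access token
    URL = f"https://{SHOP}/admin/api/2023-10/products.json"  # assumed API version

    # One POST creates the product and all of its variants at once,
    # instead of one API call per standalone product.
    payload = {
        "product": {
            "title": "T-Shirt",
            "options": [{"name": "Size"}],
            "variants": [
                {"option1": size, "price": "19.99", "sku": f"TSHIRT-{size}"}
                for size in ["XS", "S", "M", "L", "XL"]
            ],
        }
    }

    resp = requests.post(
        URL,
        json=payload,
        headers={"X-Shopify-Access-Token": TOKEN},
    )
    resp.raise_for_status()
    print(resp.json()["product"]["id"])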
As bknights said, combining products as variants and uploading all the variants together is the fastest way.
I'd also like to add this: split your portfolio into lots, and using the API you can have parallel API calls running.
I have to update 60k variants on my store once a week. It used to sometimes take an entire weekend to finish. I must add that I use PowerShell for this task. Later I realized that while one call is running my program sits idle, and by trial and error I concluded that I can have 4 calls made at a 250 millisecond gap each. So I update all the variants of a product (each having around 45 variants) in a single call.
This way, the time was cut down to less than 1/12th of the total. You can also use the API call limit returned by Shopify to tune the time gaps further. For a non-Plus Shopify account this is the fastest way possible.
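As a rough illustration of the same idea (the answer above used PowerShell, but here is a sketch in Python), this keeps four requests in flight with a 250 ms stagger between launches; the endpoint and payload structure are placeholders:

    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    def update_product(payload):
        # Placeholder endpoint: one PUT per product updates all of
        # that product's variants in a single API call.
        resp = requests.put(payload["url"], json=payload["body"],
                            headers=payload["headers"])
        # Shopify reports bucket usage in this header, e.g. "32/40";
        # it can be used to tune the gap between launches dynamically.
        print(resp.headers.get("X-Shopify-Shop-Api-Call-Limit"))
        return resp.status_code

    def run_batch(payloads, workers=4, gap=0.25):
        # Keep `workers` calls in flight; stagger launches by `gap`
        # seconds so the rate-limit bucket is never drained at once.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = []
            for p in payloads:
                futures.append(pool.submit(update_product, p))
                time.sleep(gap)
            return [f.result() for f in futures]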

What is the difference between parsing a betting website for live scores vs the official website API?

I want to monitor live scores for some soccer matches. I have 2 ways to do this:
the official API from the website (free)
parsing the website's source code myself and getting the data from it (I'd need to do it every second)
What is the difference? Is calling the API faster?
This can depend on quite a lot external to this specific scenario, but given the context, yes, the API would be much faster. The difference is in what data is being sent, received, and parsed.
In either scenario you'd need some timer to tick and parse the results (website or API), so there's no performance difference in the "wait code", but the big difference will be in the data itself that is parsed. When you call the API, chances are you will send a specific parameter or call a specific function that indicates what you're looking for, pseudo-code example:
SoccerSiteApi.GetValue(SCORE, team1, team2);
Or
SoccerSiteApi.GetCurrentScores(team1, team2);
By calling the API, you are only sending and receiving a few hundred bytes (or more depending on data) and getting back exactly what you want, that is, you don't need to parse the scores out of the values sent back since they are the scores, so no processing time is spent doing anything additional with the data itself.
If, however, you were to parse the entire web site, you would need to make an HTTP GET request (and all that entails) to get the entire page (which could be a couple hundred KB or MB depending on content) and then spend processing time extracting the exact data you were looking for, and then doing this every second.
So the biggest difference is amount of data and time spent processing it.
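To make the contrast concrete, here is a minimal sketch of both approaches in Python, assuming the requests and BeautifulSoup libraries; the URLs, response fields, and CSS class are all hypothetical:

    import requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    # API approach: small JSON payload, already structured.
    # The URL and response fields are hypothetical.
    def score_via_api(team1, team2):
        resp = requests.get("https://api.example.com/scores",
                            params={"team1": team1, "team2": team2})
        data = resp.json()          # a few hundred bytes
        return data["score"]        # no extra parsing needed

    # Scraping approach: download the whole page, then dig the
    # score out of the markup. Hypothetical URL and CSS class.
    def score_via_scraping(match_url):
        html = requests.get(match_url).text   # possibly hundreds of KB
        soup = BeautifulSoup(html, "html.parser")
        node = soup.find("span", class_="live-score")
        return node.text if node else None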
Hope that helps.

How can I count the results in Gnip's Powertrack API?

I am looking for a URL to count the results retrieved using the PowerTrack API. Something similar to what I find using the Search API:
https://search.gnip.com/accounts/ACCOUNT_NAME/search/LABEL/counts.json
I've been looking at Gnip's docs but I have found nothing that allows me to count the results.
I tried using other URLs (stream.gnip.com, and using search.gnip.com with 'powertrack' instead of 'search'). I can't paste more than 1 link so I can't show the complete URLs here, sorry.
I also looked at Historical PowerTrack API reference, and I can't find anything there related to this.
Thank you.
The only products that support a counts endpoint are the 30 Day and Full Archive Search APIs.
Because PowerTrack is a streaming API and supports tens of thousands of concurrent rules, your best bet would be to store the data into a database or document storage system (NoSQL) that would allow for filtered queries to extract the counts you need.
Historical PowerTrack could technically allow you to determine a count for a specific query, just based on the total number of activities returned, but to execute an HPT job for the sole purpose of getting a count would not be cost-effective.
As Steven suggested, you'd be better off storing the data in a (NoSQL) database and performing your own aggregations.
Gnip does provide a Usage API which will give you the total volume per period per source.
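As a rough sketch of that store-and-count approach, assuming the stream is being written into MongoDB via the Python pymongo driver; the database, collection, and field names are assumptions about your storage schema (with postedTime stored as a parsed datetime):

    from datetime import datetime
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    activities = client["gnip"]["activities"]  # placeholder db/collection

    # Count activities matching a rule tag within a time window,
    # assuming each stored activity keeps its matching_rules and a
    # parsed postedTime -- both assumptions about your schema.
    def count_for_rule(tag, start, end):
        return activities.count_documents({
            "matching_rules.tag": tag,
            "postedTime": {"$gte": start, "$lt": end},
        })

    print(count_for_rule("my_rule",
                         datetime(2016, 1, 1), datetime(2016, 2, 1)))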

Bloomberg API - Is it possible to get the top 200 securities?

I'm currently building a feeder using the Bloomberg API, but I only need to get the top 200 securities for a specific period. Is that possible?
Thanks.
As far as I know, there is no way to do a securities search through the API. If you have a well-defined universe (say 1000 stocks and you want the top 200 based on performance for the day), you can always query the performance-for-the-day field for the 1000 stocks and sort/filter in your application.
PS: I assume you are talking about the open BLP API.
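For illustration, a minimal sketch of that query-then-sort approach with the Python blpapi library; the field mnemonic, host/port, and universe are assumptions, and error handling is kept to a minimum:

    import blpapi

    def top_n_by_field(universe, field="CHG_PCT_1D", n=200):
        # Connect to a local Bloomberg endpoint (assumed defaults).
        options = blpapi.SessionOptions()
        options.setServerHost("localhost")
        options.setServerPort(8194)
        session = blpapi.Session(options)
        if not session.start() or not session.openService("//blp/refdata"):
            raise RuntimeError("could not start blpapi session")

        service = session.getService("//blp/refdata")
        request = service.createRequest("ReferenceDataRequest")
        for ticker in universe:
            request.getElement("securities").appendValue(ticker)
        request.getElement("fields").appendValue(field)
        session.sendRequest(request)

        # Drain response events, collecting the field per security.
        results = {}
        done = False
        while not done:
            event = session.nextEvent(500)
            for msg in event:
                if msg.hasElement("securityData"):
                    for sec in msg.getElement("securityData").values():
                        name = sec.getElementAsString("security")
                        fld = sec.getElement("fieldData")
                        if fld.hasElement(field):
                            results[name] = fld.getElementAsFloat(field)
            if event.eventType() == blpapi.Event.RESPONSE:
                done = True
        session.stop()

        # Sort the whole universe by the field and keep the top n.
        return sorted(results, key=results.get, reverse=True)[:n]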
I'm pretty sure it's possible, since I know it can be done via the Excel API. Use EQS in the Bloomberg Terminal to create a screen; there, choose the equity universe and the sort criteria. You can then download the equities the screen displays to Excel. I recommend you ask the Help Desk (F1 F1 on the Terminal) how to download an EQS screen via API v3.

Stock purchases and stock quote data

I apologize for being rather vague here, but I'm working on a project involving stock data and stock purchases. I'm sure I'm going to end up having to get a broker involved, but I was wondering if anyone knows of any documentation on the underlying technology involved in existing trading sites, as well as the channels through which systems like Google Finance get their information.
Note that I already know of the APIs from Yahoo and TD Ameritrade that send out the data; I'm interested in the channels through which that data travels to them in the first place.
They're most likely getting the data feeds from one or more of the usual suspects (i.e. Reuters, Bloomberg and the like). You've probably noticed that the feeds on the publicly accessible websites are delayed by 15-20 minutes compared to the real-time feeds. Keep that in mind in your application; if you need properly up-to-date/real-time market data it'll cost you a pretty penny.
Those firms trading directly on the exchanges obviously have access to the data from the actual exchange - that's what you (have to) use in real time and algorithmic trading. However, the above mentioned companies (and I'm sure there are a few more, these are just the ones that most people are familiar with) are usually the data providers for those trading via intermediaries.
For reference data on stocks (as opposed to actual stock quote data), Mergent ( http://www.mergent.com ) is one of the data suppliers and has been collecting the data for decades. It has a set of APIs at http://www.mergent.com/servius