How can I use my own data for calculations in Pine Script - Tradingview? - service-broker

Hi there, I am new to Pine Script and the REST API (the REST API for brokers). Have you implemented it? Here is what I need to do:
I am getting OHLC data from a different source. I want to send this custom data to a Pine script on TradingView and have it use my OHLC data to calculate all the indicators.
Then I want to send those indicator values back to the brokerage REST API, so I can trade or generate custom signals from there.
Who can help me with this?

Related

YouTube API - get monthly views from foreign channel

I'm trying to find a way to get the view count of the last 6 months for any given YouTube channel. The YouTube Analytics API is not helpful, because it only allows channels I own, and the YouTube Data API only returns the channel's total lifetime view count.
Is there a way I can get the view count a channel has made on a monthly basis via the API? Scraping Social Blade is my second option, but I'd rather use the Google API.
Thanks for your help!
You're going to have to do it the way Social Blade probably does:
Just scan each channel you want to check every month.
The YouTube Analytics API only stores data for three months, I think, and you have to be authorized, as you mentioned.
The YouTube Data API doesn't store data by date; it's not intended for analytics.
I set up a system for a client a while back that just polls a few channels every day to get stats for them. It's not optimal, but it works.
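The polling approach above can be sketched as follows. This is a minimal illustration, assuming you store one lifetime-total snapshot per month (as returned by the Data API's channels.list with part="statistics") and derive each month's views as the difference between consecutive snapshots:

```python
from datetime import date

# Hypothetical snapshots: total lifetime view count polled once a month.
snapshots = {
    date(2023, 1, 1): 1_000_000,
    date(2023, 2, 1): 1_050_000,
    date(2023, 3, 1): 1_125_000,
}

def monthly_views(snapshots):
    """Derive per-month view counts from consecutive lifetime totals."""
    ordered = sorted(snapshots.items())
    return {
        later_date: later_total - earlier_total
        for (earlier_date, earlier_total), (later_date, later_total)
        in zip(ordered, ordered[1:])
    }

print(monthly_views(snapshots))
# {datetime.date(2023, 2, 1): 50000, datetime.date(2023, 3, 1): 75000}
```

The trade-off is that you only get history from the day you start polling, which matches the answer's point: there is no API that backfills monthly counts for channels you don't own.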

Send Google Analytics Data to eCommerce Server

We want to save in our database (a custom-developed shop built with C# and ASP.NET) where our customers came from, to improve our marketing strategies, so:
Is it possible to send Google Analytics data to the eCommerce server while performing a checkout?
You need to enable the ecommerce option on your google analytics dashboard.
Then, enter the tracking code on the purchase confirmation page.
You can consult the link below:
https://support.google.com/analytics/answer/1009612?hl=en
It is also possible to identify the user's origin to improve your marketing campaigns. Just follow this:
https://support.google.com/analytics/answer/1033173?hl=en
The Real Time Reporting API enables you to request real-time data, for example real-time activity on your property.
However, you can only extract information such as pages and events, not ecommerce data. Therefore you should make sure to track the checkout funnel so that the information is captured in an event (with category, action, and label). At that point you can use the API mentioned above.
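A complementary, purely server-side option for recording where a customer came from is to read the standard Google Analytics campaign tags (utm_source, utm_medium, utm_campaign) off the landing URL and store them with the order. A minimal sketch (the shop itself is C#/ASP.NET; the same parsing applies there):

```python
from urllib.parse import urlparse, parse_qs

def extract_campaign(url):
    """Pull standard GA campaign tags from a landing-page URL so the
    shop can store the traffic source alongside the order."""
    params = parse_qs(urlparse(url).query)
    return {
        key: params.get(f"utm_{key}", [None])[0]
        for key in ("source", "medium", "campaign")
    }

landing = ("https://shop.example.com/?utm_source=newsletter"
           "&utm_medium=email&utm_campaign=spring_sale")
print(extract_campaign(landing))
# {'source': 'newsletter', 'medium': 'email', 'campaign': 'spring_sale'}
```

This only works for tagged traffic (the second support link above covers tagging campaigns); organic and direct visits carry no utm parameters.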

How do I trigger data to be sent from our third-party application to NetSuite

I am trying to do an API integration of our third-party application with NetSuite. I want to be able to import sales invoice details generated from our third-party system (which exposes a REST API) into the NetSuite invoice form.
The frequency of import is not too crucial: an immediate import would be ideal, but sending data once a day is fine as well.
Furthermore, the invoice on NetSuite needs to be filled with details of which some are already available on NetSuite (ex: tax code, billing address, etc.) and some which need to be imported from the third party application (ex: item name, price, etc.)
How would I go about doing so? Would I need to use our API to get the item and price details, use an API to get the details from NetSuite, merge them together somehow and push that into NetSuite? Is there a less complex way of doing so?
I have heard of user events that trigger when a change occurs on NetSuite, and scheduled scripts that trigger every x period of time, but I have only seen these being used from the POV of sending data FROM Netsuite into a third party system. How would I trigger data to be sent to NetSuite when an invoice is created on our external application?
To summarize, I would like to know how to go about integrating when data is needed from 2 sources (NetSuite itself, and the external application) and how to trigger the data to be imported into NetSuite.
I am a complete beginner to APIs and NetSuite. Tips on what I should look into researching, and any/all help is much appreciated!
In my opinion, the easiest way to implement what you need here, is to use a NetSuite scheduled script to GET the data from the third party API, and then create the invoices required from that data. Instead of thinking about it as the third party application sending data to NetSuite, think about it as NetSuite fetching the data from the third party application.
The scheduled script would need to connect to the third party, authenticate and run a query to get any new invoices using whatever filters make sense (for example, 'date created'). It would then need to iterate over the records returned to create corresponding records in NetSuite.
Scheduled scripts can be set up to run on schedule at any time interval down to 15 minutes, or can be called from another script using the N/task module. They can also be triggered manually for testing.
You can use a Scheduled Script if you are fine with running batch processes at set intervals to pull in updates from the third party, and the third-party application has the necessary REST API (or lets you create new endpoints). You will need a mechanism to ask only for records created or updated since a cutoff time, and to maintain that cutoff in NetSuite, to avoid pulling in the same data each time.
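The pull-with-cutoff pattern described above can be sketched language-agnostically. Note that a real NetSuite scheduled script is written in SuiteScript (JavaScript); this Python sketch only shows the control flow, and all four callables are hypothetical stubs you would implement against the two systems:

```python
def sync_invoices(fetch_invoices_since, create_netsuite_invoice,
                  load_cutoff, save_cutoff):
    """Pull invoices created after the stored cutoff, create them in
    NetSuite, then advance the cutoff so nothing is pulled twice."""
    cutoff = load_cutoff()
    new_invoices = fetch_invoices_since(cutoff)
    for inv in new_invoices:
        create_netsuite_invoice(inv)
    if new_invoices:
        # ISO-8601 timestamps compare correctly as strings.
        save_cutoff(max(inv["created"] for inv in new_invoices))
    return len(new_invoices)
```

Persisting the cutoff in NetSuite itself (for example in a custom record or script parameter) keeps the whole state of the integration in one place.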
If you want the third-party application to push the data in real time to NetSuite, you would need to write some code on the third-party app that either triggers on specific events (or is timed like the first approach). This can then call a custom RESTlet you would create on NetSuite, passing along all the information NetSuite needs (item name, price, etc.) in JSON format. The RESTlet code would take the JSON data, add the missing pieces it needs to source from NetSuite objects (tax code, billing address, etc.), and create the invoice. Depending on your design, the JSON could also be either a single invoice or a list of invoices that you batch together for performance.
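The merge step inside such a RESTlet can be sketched as follows. A real RESTlet is SuiteScript (JavaScript), and all field names here are illustrative, not actual NetSuite record fields:

```python
def build_invoice(payload, lookup_customer_defaults):
    """Merge fields pushed by the third-party app (item name, price)
    with fields sourced from NetSuite (tax code, billing address)
    into one invoice record."""
    defaults = lookup_customer_defaults(payload["customer_id"])
    return {
        "customer_id": payload["customer_id"],
        "lines": [
            {"item": line["item_name"], "rate": line["price"]}
            for line in payload["lines"]
        ],
        "tax_code": defaults["tax_code"],
        "billing_address": defaults["billing_address"],
    }
```

The key design point is that the caller only sends what NetSuite cannot already know; everything else is looked up on the NetSuite side at creation time, so the two systems never have to stay in sync on reference data.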
Finally, you may also want to look at iPaaS platforms like Dell Boomi, Celigo, Jitterbit, etc. All of these support NetSuite as one of their endpoints. If they also natively support the third-party application you are using, you could achieve your objective with far less coding. All of these platforms also support connecting to a generic REST API.

Static data query - self service Amadeus API in production environment

I am currently developing a web app using the self-service Amadeus APIs in a production environment, and I have some questions related to static data. Kindly reply.
Questions:
1. Is there any static data available (flight schedules or other details) that we can store on our end and sync on a schedule, instead of fetching all data via the APIs every time?
2. If static data is available, what would be the ideal interval for refreshing it?
3. Are we allowed to store real-time data on our end temporarily? If yes, for what duration can we keep a copy?
4. Is there an API where we can send a list of flight/segment IDs and get details only for those selected records? That is, if we want details of 10 specific flights/segments, can we pass their IDs to the API and get just that information?
5. What is the response time of the search API, and of the API that returns flight details?
6. What filters are available in the search API to filter the data?
As of today, we do not provide static data for flight schedules, so the question of a refresh interval does not apply.
You are allowed to store the data coming from the flight search API as long as you do not resell it in any way. Keep in mind that this data changes frequently (price/availability).
You can use the Flight Offers Price API for this. It takes a list of flight offers (which you get from the Flight Offers Search API) and revalidates the price and availability of each one.
It depends on where you are based, which API you use, and how you use it (filters). You can try our APIs for free in our test environment; keep in mind that the test environment is limited in terms of the number of API calls and the data available, and has a slower response time than the production environment.
Our catalog is fully open (no need to register); you can find the Flight Offers Search reference documentation here, listing all the available parameters. The Flight Offers Search API has 2 endpoints:
GET: a simpler version of the search, easy and fast to implement, but offering less filtering
POST: full access to all the functionality of the flight search
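A minimal GET query to the search endpoint could be assembled as below. The parameter names follow the Amadeus Flight Offers Search reference; the host shown is the test environment, and the full parameter list (travel class, currency, etc.) is in the documentation linked above:

```python
from urllib.parse import urlencode

params = {
    "originLocationCode": "MAD",       # IATA code of the origin
    "destinationLocationCode": "LHR",  # IATA code of the destination
    "departureDate": "2024-06-01",     # ISO date
    "adults": 1,
    "nonStop": "true",                 # one of the available filters
    "max": 10,                         # cap the number of offers returned
}
url = ("https://test.api.amadeus.com/v2/shopping/flight-offers?"
       + urlencode(params))
print(url)
```

The POST endpoint accepts a JSON body instead, which is where the richer filtering (carrier include/exclude lists, per-traveler criteria, and so on) becomes available.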

Using a dataset in Watson

I'm working on a university project with IBM Cloud services. My team and I have created a virtual assistant using the Watson Assistant service, and now we want to use a fairly large dataset with the assistant. We don't know how to integrate this dataset, or how to use the information the user gives us to run a SELECT against it. I hope for your replies!
Here's one general way to accomplish what you're trying to do:
You'll need to collect context variables to determine what song information to send back to the user. One effective way to do this is with slots; here's a guide on that.
An example of context variables collected could look like this:
{
  "genre": "hiphop",
  "mood": "upbeat",
  "instrumental": false
}
So the bot knows from this info to return hip-hop songs with an upbeat mood that are not instrumental.
I think you might already have gotten this far, but the next step is going back to your dataset to query it and return that list of songs.
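Wherever that query ends up living (see the options below), the matching logic itself is simple. A toy illustration with hypothetical field names, treating the collected context variables as an exact-match filter over the dataset:

```python
songs = [
    {"title": "Track A", "genre": "hiphop", "mood": "upbeat", "instrumental": False},
    {"title": "Track B", "genre": "hiphop", "mood": "mellow", "instrumental": False},
    {"title": "Track C", "genre": "jazz", "mood": "upbeat", "instrumental": True},
]

def match_songs(dataset, context):
    """Return songs whose fields match every collected context variable."""
    return [
        song for song in dataset
        if all(song.get(key) == value for key, value in context.items())
    ]

context = {"genre": "hiphop", "mood": "upbeat", "instrumental": False}
print([s["title"] for s in match_songs(songs, context)])
# ['Track A']
```

In a real deployment this filter would be a WHERE clause against whatever database holds the dataset; the in-memory version just shows how the slots map onto query conditions.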
There are a few different ways to accomplish that:
You could house the dataset within Watson Assistant as preset context variables; this probably wouldn't make sense here because your dataset is large. It would only really make sense for a handful of options.
You could query the dataset in an orchestration layer. A message would be sent back from Watson Assistant with an action to query the dataset; before the response is returned to the end user, the orchestration layer makes that query and fills in the returned information. This is a little more complex because you need to build and manage that orchestration layer, though there are some services out there that can help with this.
You could make a query to the dataset from within Watson Assistant using IBM cloud functions. Once you have collected the information in a node, you instruct Watson Assistant to call a cloud function that queries your dataset. What's nice about this method is that everything is housed within WA and cloud functions (no need for an orchestration layer), though there are some limitations like timeouts because Watson Assistant as an API needs to respond "immediately." Here is some more information on making programmatic calls from a dialog node.
Hope this is helpful.