This task relates to the share market.
Currently, the Omnesys NEST trading terminal provides streaming data for exchanges such as NSE and MCX. It also provides an option to link the live streaming data to an Excel sheet, so the market changes are updated in Excel every second.
This is the function used in Excel to read the data from the NEST terminal:
=RTD("nest.scriprtd",,"mcx_fo|GOLDM15SEPFUT","LTP")
Can anybody help me extract the live streaming data?
Can you be more specific? Do you want to develop a live trading strategy using the live data, or do you want to store this data in some database, or something else?
As an example, you can use a pandas DataFrame to pull this data into your Python program and process it.
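A minimal sketch of that idea, assuming the workbook already contains the =RTD(...) formulas and using the third-party xlwings library to poll a running Excel instance on Windows (the file name, sheet name, and cell address below are placeholders):

import time
import pandas as pd
import xlwings as xw  # third-party; attaches to a running Excel instance

# Hypothetical workbook that already contains the =RTD(...) formulas.
wb = xw.Book("nest_live.xlsx")
sheet = wb.sheets["Sheet1"]

rows = []
for _ in range(10):  # poll ten times, once per second
    ltp = sheet.range("A1").value  # cell holding the RTD LTP formula
    rows.append({"timestamp": pd.Timestamp.now(), "ltp": ltp})
    time.sleep(1)

df = pd.DataFrame(rows)
print(df)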
UPD:
The NSE NOW terminal is based on Omnesys NEST. I have automated this using AutoIt: the live streaming data from the market watch of Omnesys NEST can be extracted with it.
The GitHub repo for the program is here.
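If you prefer to stay in Python rather than AutoIt, pywinauto can drive the same kind of Win32 UI scraping. This is only a rough sketch: the window title pattern and the ListView control class below are assumptions and need to be verified against the actual terminal with a UI inspection tool.

from pywinauto import Application

# Attach to the running NEST terminal; the title regex is a guess.
app = Application(backend="win32").connect(title_re=".*NEST.*")
win = app.top_window()

# Market-watch grids are often standard ListView controls; the class
# name is an assumption, inspect the window before relying on it.
grid = win.child_window(class_name="SysListView32")
for text in grid.texts():
    print(text)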
Hey, I am trying to create some batch jobs that read from a couple of Salesforce objects and push them to BigQuery (BQ). Every time the batch process runs, it truncates the table in BQ and pushes all the data from the Salesforce object back into BQ. Is it possible for Google Data Fusion to automatically detect changes to an object in Salesforce (like adding a new column or changing the data type of a column) so that they are registered and pushed to BQ via Google Data Fusion?
For the Salesforce side of the puzzle, you could look into https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_describeGlobal.htm together with the If-Modified-Since header, which tells you whether the definition of the table(s) changed. That URL covers all tables in the org; alternatively, you can run table-specific metadata describe calls with https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_sobject_describe.htm
But I can't tell you how to use it in your job.
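To illustrate the idea, a rough Python sketch of the describeGlobal call (the instance URL, API version, token, and date are placeholders; a 304 response means no sObject definitions changed since the given date):

import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
TOKEN = "ACCESS_TOKEN"                               # placeholder OAuth token

resp = requests.get(
    f"{INSTANCE}/services/data/v56.0/sobjects/",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        # RFC 1123 date; Salesforce returns 304 if nothing changed since then
        "If-Modified-Since": "Mon, 01 Jan 2024 00:00:00 GMT",
    },
)

if resp.status_code == 304:
    print("No metadata changes since the given date")
else:
    for sobject in resp.json()["sobjects"]:
        print(sobject["name"], sobject["custom"])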
You can use @eyescream's answer as the condition or trigger for the update to BigQuery. You may push changes to BigQuery using the pre-built Streaming Source plugin from Data Fusion, which, as mentioned in this documentation,
tracks updates in Salesforce sObjects. Examples of sObjects are
opportunities, contacts, accounts, leads, any custom object, etc.
You may use this approach to automatically track changes and push them to BigQuery. You can also find the full Salesforce Streaming Source configuration reference in that documentation, which Google's official documentation also points to.
If you want a more dynamic approach for your overall use case, you could instead integrate BigQuery with Salesforce directly. In that case you will need to write your own code, in which you can use @eyescream's answer as the primary condition/trigger and then automatically push the update to your BigQuery schema.
I have data in a CSV file and want to do the following with it:
Log into a web site
Populate the fields of the page with the CSV data
Navigate to the next page
Input the rest of the data
Click submit
Repeat for the next line
I can do this using UiPath, but it's an expensive option for a relatively simple use case.
Does anyone have suggestions on how to do this using a different method?
Thanks,
EddieT
If you're looking for alternatives, then you probably want to investigate APIs or webhooks, but that all depends on the access rights you have for that particular website.
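If the site does expose an API, the idea would look roughly like this in Python (the endpoint, authentication scheme, and field mapping are entirely hypothetical; they depend on the website in question):

import csv
import requests

API_URL = "https://example.com/api/records"  # hypothetical endpoint
API_KEY = "your-api-key"                     # hypothetical credential

with open("data.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One CSV line becomes one API record.
        resp = requests.post(
            API_URL,
            json=row,
            headers={"Authorization": f"Bearer {API_KEY}"},
        )
        resp.raise_for_status()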
Try messaging the developers of the website you need, as they might already offer such a service.
UiPath may appear expensive, but if you calculate the amount of time saved for this one process, you will see the money savings too.
If you can find a couple of other processes you want to automate, then I'd highly recommend it.
I have GA360 and have exported raw Google Analytics data to BigQuery through the integration. I'm just wondering whether the DCM dimensions and metrics are part of the export.
Ideally, these linked DCM dimensions and metrics.
I can't find them as part of the export. What would be the best way to access these dimensions for all my raw data? The Core Reporting API?
This Google Analytics documentation page shows the BigQuery export schema for the Google Analytics data that is exported to BigQuery. As you will see, the DoubleClick Campaign Manager (DCM) data is not part of the BigQuery export, unlike the DoubleClick for Publishers (DFP) data, which does appear (these are all the hits.publisher.<DFP_METRIC> metrics).
Therefore, as explained by @BenP in the comments, you may be interested in the BigQuery Data Transfer Service, which does have a feature for DoubleClick Campaign Manager transfers. Unfortunately, I am not an expert in Google Analytics or DCM and therefore cannot add much relevant information on linking both sets of data, but maybe you can try combining them yourself and then post a new question for that matter if you do not succeed.
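For reference, once the export tables are in place you can query the DFP fields that do exist with the BigQuery client. A small sketch, assuming the standard ga_sessions_* export tables; the project, dataset, date suffix, and the hits.publisher field name are assumptions to check against the schema page:

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Project, dataset, table suffix, and field names are placeholders.
sql = """
SELECT
  fullVisitorId,
  h.publisher.dfpClicks AS dfp_clicks  -- check against the export schema
FROM `my_project.my_ga_dataset.ga_sessions_20240101`,
  UNNEST(hits) AS h
LIMIT 10
"""

for row in client.query(sql).result():
    print(row.fullVisitorId, row.dfp_clicks)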
What is the best way to import data into a Google Cloud SQL database from a spreadsheet file?
I have to import two files with 4k rows each into a DB.
I've tried to load 4k rows (one file) using Apps Script, and the result was:
Execution succeeded [294.336 seconds total runtime]
Ideas?
Code here
https://pastebin.com/3RiM1CNb
Depends a bit on how often you need this done. From your comment ("No, this files will be uploaded two times for month in gdrive."), I think you mean twice per month.
If you need this done programmatically, I suggest using a cron job and having either App Engine or a local machine run it.
You can access the spreadsheet with a service account (add it to the users of that spreadsheet like any other user) using the Google client libraries (check the quickstarts for your language of preference) and process the data with that. How you actually process the data depends on your language of choice, but in the end it is simply inserting rows into MySQL; see the sketch below.
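A minimal sketch of that flow in Python, using the official google-api-python-client and PyMySQL; the key file, spreadsheet ID, range, connection details, and table layout are all placeholders:

from google.oauth2 import service_account
from googleapiclient.discovery import build
import pymysql

# Service account that has been shared on the spreadsheet like a normal user.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/spreadsheets.readonly"],
)
sheets = build("sheets", "v4", credentials=creds)

# Placeholder spreadsheet ID and range; assumes four columns per row.
result = sheets.spreadsheets().values().get(
    spreadsheetId="YOUR_SPREADSHEET_ID", range="Sheet1!A2:D"
).execute()
rows = result.get("values", [])

# Placeholder Cloud SQL (MySQL) connection and table.
conn = pymysql.connect(host="127.0.0.1", user="user",
                       password="password", database="mydb")
cur = conn.cursor()
cur.executemany(
    "INSERT INTO mytable (col_a, col_b, col_c, col_d) VALUES (%s, %s, %s, %s)",
    rows,
)
conn.commit()
conn.close()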
The simplest option would be to export to CSV and import that into Cloud SQL. Note that you may need to reformat the file into something Cloud SQL understands, but that depends on the source data in Google Sheets.
As for the runtime you're seeing: at 294 seconds you are close to the maximum allowed runtime for Apps Script, which is 6 minutes, so this approach will not scale much further.
I am in the process of creating an app that will log certain kinds of data for teachers. While the data will be kept within the app, is there a way to integrate Google Forms or Sheets with that data?
For example, if I use my app to record data (say, that a student was tardy to class), is there a way to have that data go to a Google Sheet/Form so I can always see live data as the app is being used?
Thanks!
Mike
You can build a form in your app that posts data to a standalone Google Apps Script web app. The script can then append the data to any Google Sheet you specify. I'd also recommend Google Data Studio for displaying real-time data, as it saves a lot of time by not having to code the data-display UI yourself.
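On the app side, the posting step could look roughly like this in Python (the web-app URL is a placeholder for your deployed script's /exec URL, and the payload fields are examples; the Apps Script side would read them in doPost and append a row to the sheet):

import requests

# Placeholder: URL of your deployed Apps Script web app (ends in /exec).
SCRIPT_URL = "https://script.google.com/macros/s/YOUR_DEPLOYMENT_ID/exec"

payload = {
    "student": "Jane Doe",              # example fields; match whatever
    "event": "tardy",                   # your doPost handler expects
    "timestamp": "2024-01-15T08:05:00",
}

resp = requests.post(SCRIPT_URL, json=payload)
resp.raise_for_status()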