Importing a CSV file into SQL using Logic Apps (2020)

I have seen several older posts talking about how to import a CSV file into SQL using Logic Apps, but I have not seen anything that's easy to implement and doesn't require 85+ steps.
Has anyone come up with an easy way? I've done this a million times using SSIS or other automation tools, but I haven't found anything comparable in Logic Apps.
Please let me know if you have a simple solution.

As of now, there is no out-of-the-box connector or function in Logic Apps that parses a CSV file.
You can always vote for the feature request on the feedback forum: Read a csv file and bulk load into Azure SQL table
The Logic Apps team suggests looking at Azure Data Factory to assist with this.
For example: Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool, and the blog https://marczak.io/posts/azure-loading-csv-to-sql/ that Hury Shen provided in a comment.
We can't import the CSV file into SQL directly, which is why these posts look complex and involve many steps. Within Logic Apps alone, there isn't an easy way to do this for now.
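If Logic Apps itself isn't a hard requirement, a small script outside Logic Apps can do the load in a handful of lines. Below is a minimal, hypothetical Python sketch that reads a CSV and bulk-inserts it into an Azure SQL table with pyodbc; the connection string, table name, and column layout are placeholders you would replace with your own.

    import csv
    import pyodbc

    # Placeholder connection details and target table -- replace with your own.
    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=yourserver.database.windows.net;"
        "DATABASE=yourdb;UID=youruser;PWD=yourpassword"
    )

    def load_csv(path):
        with pyodbc.connect(CONN_STR) as conn:
            cursor = conn.cursor()
            cursor.fast_executemany = True  # speeds up bulk inserts
            with open(path, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                next(reader)  # skip the header row
                rows = [tuple(r) for r in reader]
            # Hypothetical target table dbo.MyTable(Col1, Col2, Col3)
            cursor.executemany(
                "INSERT INTO dbo.MyTable (Col1, Col2, Col3) VALUES (?, ?, ?)",
                rows,
            )
            conn.commit()

    if __name__ == "__main__":
        load_csv("data.csv")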

Related

Telling Azure SQL Database to pull data from Excel Online Document?

Good morning! I've found TONS of articles, questions, and guides on how to import data from local Excel documents to Azure SQL databases, or how to pull from an Azure database to Excel, but nothing about how I could use SQL to query an Excel Online document (which would be hosted on SharePoint).
I'm fairly new in my learning - I'd be setting this up via a query in SQL written/executed via Azure Data Studio. The Excel file is one that I'd be creating and hosting via our company's SharePoint system. The Azure SQL database will also be one that I'm constructing myself, which is in progress. I've tried to find walkthroughs, scripts, explanations, anything, but I've come up empty. Granted, that could be an indicator that it can't be done, but I figured I'd ask here. Overall, I'm just trying to figure out what is possible, so I can come up with a decent range of simple, easy-to-use means of data input for my team, or, in this case, to capture some of the ways they're tracking their work.
Not sure if this is sufficient detail, please feel free to ask any follow-up questions.
Azure Data Studio is a tool for working with SQL databases, most notably MS SQL, though you can connect to some other types as well.
Therefore, in order to use Data Studio to query your data, it needs to be in a SQL database. To accomplish that, you need to set up a process that loads the data from your Excel document into a table in your database and runs on a regular basis to keep the table up to date. You could look into Azure Data Factory for that, though I don't see why you should bother just to use Azure Data Studio, when you can browse the data in Excel, or use Power BI, Qlik, or any other tool that can connect directly to Excel.
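For completeness, here is a minimal sketch of what that loading step could look like in Python with pandas and SQLAlchemy. It assumes the Excel file has already been downloaded locally from SharePoint (SharePoint authentication is out of scope here), and the connection string, file name, sheet name, and table name are all placeholders.

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder Azure SQL connection details -- replace with your own.
    engine = create_engine(
        "mssql+pyodbc://youruser:yourpassword@yourserver.database.windows.net/yourdb"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Read the worksheet and replace the target table with its contents.
    df = pd.read_excel("team_tracking.xlsx", sheet_name="Sheet1")
    df.to_sql("TeamTracking", engine, if_exists="replace", index=False)

Scheduled regularly (for example with a cron job or an Azure Function timer), this keeps the table refreshed so Azure Data Studio always queries reasonably current data.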

Convert an online JSON set of files to a relational DB (SQL Server, MySQL, SQLite)

I'm using a tool called Teamwork to manage my team's projects.
They have an online API that serves JSON and is accessible with authorisation:
https://developer.teamwork.com/projects/introduction/welcome-to-the-teamwork-projects-api
I would like to be able to convert this online data to a SQL DB so I can create custom reports for my management.
I can't seem to find anything ready to do that.
I need a strategy to do this.
If you know how to program, this should be pretty straightforward.
In Python, for example, you could:
1. Come up with a SQL schema that maps to the JSON data objects you want to store. Create it in a database of your choice.
2. Use the Requests library to download the JSON resources, if you don't already have them on your system.
3. Convert each JSON resource to a Python data structure using json.loads.
4. Connect to your database server using the appropriate Python library for your database, e.g. PyMySQL.
5. Iterate over the Python data, inserting rows into the database as appropriate. This is essentially the JSON-to-tables mapping from step 1 made procedural; a rough sketch follows below.
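Here is a minimal sketch of steps 2-5 under stated assumptions: the endpoint URL, API key, response shape, database credentials, and the tasks(id, name, status) table are all placeholders you would replace with the real Teamwork API details and your own schema from step 1.

    import requests
    import pymysql

    # Hypothetical endpoint and credentials -- replace with the real Teamwork
    # API URL, your API key, and your own database settings.
    resp = requests.get(
        "https://yourcompany.teamwork.com/tasks.json",
        auth=("YOUR_API_KEY", ""),
    )
    resp.raise_for_status()
    tasks = resp.json().get("todo-items", [])  # assumed shape of the response

    conn = pymysql.connect(host="localhost", user="report",
                           password="secret", database="teamwork")
    try:
        with conn.cursor() as cur:
            for task in tasks:
                # Map JSON fields to the columns of an assumed tasks(id, name, status) table.
                cur.execute(
                    "INSERT INTO tasks (id, name, status) VALUES (%s, %s, %s)",
                    (task.get("id"), task.get("content"), task.get("status")),
                )
        conn.commit()
    finally:
        conn.close()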
If you are not looking to do this in code, you should be able to use an open-source ETL tool to do this transformation. At LinkedIn a coworker of mine used to use Talend Data Integration for solid ETL work of a very similar nature (JSON to SQL). He was very fond of it and I respected his opinion, so I figured I should mention it, although I have zero experience of it myself.

Which dashboard analytics will support Parse.com data source?

I've developed an app that uses Parse.com as the back end. I now need a dashboard analytics software package (such as iDashboards) that will enable me to pull data from my Parse.com database classes and present some of that data in a pretty dashboard fashion.
iDashboards looks to be the kind of tool I'm after, but it only supports certain data source inputs such as JDBC, ODBC, SQL, MySQL, etc. Not being a database guru by any means, I'm not sure if Parse.com can be classed as any of the above, but from what I've read it doesn't come under any of these categories.
Can anybody recommend a way of either connecting Parse.com to iDashboards, or suggest another dashboard tool that will support Parse.com as a data source?
The main issue you are facing is that data coming out of Parse.com is going to be in JSON format. Most dashboards are going to prefer CSV files.
The best dashboard I am aware of is Tableau and there is a discussion about getting json into Tableau here: http://community.tableau.com/ideas/1276
If your preference is using iDashboards, then you need to convert the JSON coming out of Parse into a CSV format that iDashboards can consume. You can do that using RJSON as mentioned in the post above, but you'll probably have an easier time with a simple PHP or Python script that periodically connects to Parse, pulls out data updates for you, and then pushes them to your dashboard of choice.
Converting JSON to CSV in PHP is addressed here: Converting JSON to CSV format using PHP
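If you'd rather do it in Python, here is a minimal sketch of that periodic pull-and-convert step against the Parse REST API; the application ID, REST key, class name, and field list are placeholders you would replace with your own.

    import csv
    import requests

    # Placeholder Parse credentials and class name -- use your own app's values.
    HEADERS = {
        "X-Parse-Application-Id": "YOUR_APP_ID",
        "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    }

    resp = requests.get("https://api.parse.com/1/classes/GameScore", headers=HEADERS)
    resp.raise_for_status()
    rows = resp.json()["results"]  # Parse returns objects under "results"

    # Write just the fields the dashboard needs into a flat CSV file.
    fields = ["objectId", "playerName", "score", "createdAt"]  # assumed fields
    with open("parse_export.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(rows)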
The difference is much more fundamental than "unsupported file format". In fact, JSON data coming out of Parse is stored in a so-called denormalized form, which means that a single JSON data file may contain the equivalent of arbitrarily many tables in a relational database. Stated differently, one JSON file may translate into potentially many CSV files, and there's no unique choice of how to perform that translation.
This is a so-called ETL problem, where ETL stands for Extract-Transform-Load. As such, you may be interested in open source ETL tools such as Kettle. Kettle is supported by Pentaho and includes functionality that can help you develop a workflow to turn JSON data into multiple CSV files that can then be imported into iDashboards (or similar). Aside from Kettle, Talend is also widely used for this purpose and has the same ability.
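To make the denormalization point concrete, here is a small illustrative Python sketch (not tied to any particular ETL tool) that splits one nested JSON document into two CSV files, a parent table and a child table; the document and field names are invented for the example.

    import csv
    import json

    # Invented example of a denormalized Parse-style object: one document
    # holding what a relational model would store as two tables.
    doc = json.loads("""
    {
      "objectId": "p1",
      "playerName": "Alice",
      "scores": [
        {"level": 1, "points": 120},
        {"level": 2, "points": 90}
      ]
    }
    """)

    # Parent table: one row per player.
    with open("players.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["objectId", "playerName"])
        w.writerow([doc["objectId"], doc["playerName"]])

    # Child table: one row per score, keyed back to the player.
    with open("scores.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["playerObjectId", "level", "points"])
        for s in doc["scores"]:
            w.writerow([doc["objectId"], s["level"], s["points"]])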
Finally, note that Parse is powered by MongoDB, and exports JSON data that is easily stored and manipulated in MongoDB. As such, a natural fit for reporting on Parse data is any reporting tool built for MongoDB.
As of the time of this writing, there are two such options:
JSON Studio, which is a commercial solution that is built explicitly for MongoDB and has your stated capability to produce dashboards.
SlamData, which is an open source solution, also built for MongoDB, which allows native SQL on the database. The current version does not have reporting capabilities (just CSV export), but the 2.09 version due out in June has reporting dashboards baked in.
An advantage of using a MongoDB reporting tool is that you will not have to wrangle your data into relational form. If it's heavily nested, uses arrays, and so forth, it can be quite painful to develop an ETL workflow and keep it in sync with how the data is changing. Instead, all you have to do is build a script to pipe the raw data from Parse into a MongoDB instance (perhaps hosted by MongoLab or equivalent, if you don't want to host it yourself), and connect the MongoDB reporting tool on top.
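A minimal sketch of that piping script in Python might look like the following; it assumes the same placeholder Parse credentials as above, plus a MongoDB connection string, database, and collection name of your own choosing.

    import requests
    from pymongo import MongoClient

    HEADERS = {
        "X-Parse-Application-Id": "YOUR_APP_ID",
        "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    }

    # Pull the raw objects from Parse (placeholder class name).
    resp = requests.get("https://api.parse.com/1/classes/GameScore", headers=HEADERS)
    resp.raise_for_status()
    objects = resp.json()["results"]

    # Push them into a MongoDB collection; the connection string is a placeholder
    # for your own hosted instance (e.g. on MongoLab).
    client = MongoClient("mongodb://user:password@your-host.mongolab.com:27017/parse_mirror")
    collection = client["parse_mirror"]["game_scores"]

    for obj in objects:
        # Upsert on the Parse objectId so repeated runs just refresh the mirror.
        collection.replace_one({"objectId": obj["objectId"]}, obj, upsert=True)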
You might also contact Parse and see if they have a recommended solution for this. It occurs to me they should probably bake some sort of analytical / reporting functionality into their APIs as this is such a common use case.
You can use Axibase Time Series Database to ingest your data from Parse.com; it has built-in dashboards and widgets for visualization, or you can just export data from ATSD to CSV and use iDashboards.

GAE Datastore Large Amounts of Data

Background:
I'm working on a project that's starting out with a large SQL dump that I have to import into a new database. This dump is about 1.5 GB of plain text, so quite a lot of information. My client currently wants me to use Google App Engine and its datastore, which (a) I'm not so fond of and (b) doesn't really play well with SQL dumps. Before I go through the trouble of making that happen...
Question:
What is a cloud-hosted database solution that can efficiently handle large quantities of data (and ideally is lower-cost)? In particular, which would be a database solution to which I could just import my SQL dump as-is?
Does your client have any particular reason to use the Datastore? If you already have the SQL dump, I think it would be easier to use Google Cloud SQL from GAE.

How to create a SQL database from a strongly typed dataset

I'm looking for an easy way to transfer a database schema I have developed inside Visual Studio as a strongly typed dataset (.xsd file) into a corresponding SQL Server database. Silly me, I assumed the process would be straightforward, but I can't find out how to do it. I assume I could duplicate the tables column by column, but that seems so error prone. Does anyone know of a way to perform the schema transfer like this? Maybe a tool to translate the .xsd file into a corresponding SQL Server DDL file?
Final thought: once I have the schema transferred, moving data around between the two data stores will be straightforward; it's just getting the schemas synced that has me stumped...
Thanks,
Keith
Why didn't you implement your data model directly in SQL Server? That is the more common, engineered approach, and I think this is why Microsoft has not provided any wizard or tool for this case. You can also keep your data model as scripts or .sql files, manage them via SVN, and use them whenever you need to implement the model.
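That said, if you do want to generate DDL from the .xsd, a small script can get you most of the way. Below is a rough, hypothetical Python sketch that walks a typed-dataset XSD and emits CREATE TABLE statements; it only handles a few common XML Schema types and ignores keys and relations, so treat it as a starting point rather than a full converter.

    import xml.etree.ElementTree as ET

    XS = "{http://www.w3.org/2001/XMLSchema}"

    # Rough mapping of a few common XML Schema types to SQL Server types;
    # extend as needed for your dataset.
    TYPE_MAP = {
        "xs:string": "NVARCHAR(MAX)",
        "xs:int": "INT",
        "xs:decimal": "DECIMAL(18, 2)",
        "xs:dateTime": "DATETIME2",
        "xs:boolean": "BIT",
    }

    def xsd_to_ddl(path):
        tree = ET.parse(path)
        statements = []
        # In a typed dataset, each table is an xs:element whose complex type
        # contains a sequence of column elements.
        for table in tree.iter(XS + "element"):
            sequence = table.find(f"{XS}complexType/{XS}sequence")
            if sequence is None:
                continue
            columns = []
            for col in sequence.findall(XS + "element"):
                sql_type = TYPE_MAP.get(col.get("type"), "NVARCHAR(MAX)")
                nullable = "NULL" if col.get("minOccurs") == "0" else "NOT NULL"
                columns.append(f"    [{col.get('name')}] {sql_type} {nullable}")
            if columns:
                statements.append(
                    f"CREATE TABLE [{table.get('name')}] (\n" + ",\n".join(columns) + "\n);"
                )
        return "\n\n".join(statements)

    if __name__ == "__main__":
        print(xsd_to_ddl("MyDataSet.xsd"))

Running it against your dataset's .xsd should print a DDL script you can review and then execute against SQL Server, which avoids recreating the tables column by column by hand.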