Reading CSV File using Entity Framework 4.0 - .net-4.0

Is it possible to read a CSV file using Entity Framework 4 so that it gives me an entity I can use normally within my application?
Thanks

I don't think there is a CSV adapter for EF4, but you could always use a LINQ-to-CSV approach for small CSV files. The results of your queries could be mapped into your EF objects and written to tables, or just used in your data access layer as an additional data source.
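For small files, a hand-rolled LINQ query over the file's lines is often all you need. A minimal sketch, assuming a hypothetical `Customer` entity and a three-column layout (note that `Split` won't handle quoted fields; use a proper CSV parser for anything messy):

```csharp
// Minimal LINQ-to-CSV sketch: project each line of a small CSV file
// into an entity type. "Customer" and the column order are assumptions
// for illustration; substitute your own EF4 entity.
using System.IO;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

public static class CsvLoader
{
    public static Customer[] Load(string path)
    {
        return File.ReadLines(path)
            .Skip(1)                               // skip the header row
            .Select(line => line.Split(','))       // naive split: no quoted fields
            .Select(f => new Customer
            {
                Id = int.Parse(f[0]),
                Name = f[1],
                Email = f[2]
            })
            .ToArray();                            // materialize; fine for small files
    }
}
```

The resulting objects can then be attached to your EF context or used directly as a read-only data source.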

Related

How can I export data from azure storage table to .csv file in .Net core C#

Is there an Azure API to import/export an existing collection from Azure Table Storage as .csv?
The Table Storage REST API does not return responses as CSV directly, so it is always necessary to transform the data accordingly, as the Azure Storage Explorer does, for example, using an older version of AzCopy (v7.3).
I've built a small C# library that does essentially the same thing. Note that it currently caches all rows in memory in order to build the CSV headers, so that's something to be aware of.
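As an illustration of that cache-then-write approach, here is a minimal sketch using the newer Azure.Data.Tables SDK; the connection string, table name, and quoting helper are placeholders, not the linked library's actual API:

```csharp
// Sketch: export an Azure Table Storage table to CSV. Because Table
// Storage is schemaless, all rows are cached first so the full set of
// column headers is known before anything is written.
using System.IO;
using System.Linq;
using Azure.Data.Tables;

public static class TableToCsv
{
    public static void Export(string connectionString, string tableName, string csvPath)
    {
        var client = new TableClient(connectionString, tableName);

        var rows = client.Query<TableEntity>().ToList();   // cache all rows in memory
        var headers = rows.SelectMany(r => r.Keys).Distinct().ToList();

        using var writer = new StreamWriter(csvPath);
        writer.WriteLine(string.Join(",", headers));
        foreach (var row in rows)
        {
            writer.WriteLine(string.Join(",", headers.Select(h =>
                row.TryGetValue(h, out var v) ? Quote(v?.ToString() ?? "") : "")));
        }
    }

    // Minimal CSV quoting; a real export needs fuller escaping rules.
    private static string Quote(string s) =>
        s.Contains(",") || s.Contains("\"")
            ? "\"" + s.Replace("\"", "\"\"") + "\""
            : s;
}
```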

importing csv file into SQL using Logic apps 2020

I have seen several older posts about importing a CSV file into SQL using Logic Apps, but I have not seen anything that's easy to implement and doesn't require 85+ steps.
Has anyone come up with an easy way? I've done this a million times using SQL SSIS or other tools to automate, but found nothing in Logic Apps.
Please let me know if you have a simple solution.
As of now there is no out-of-the-box connector or function in Logic Apps that parses a CSV file.
You can always vote on the feature request: Read a csv file and bulk load into Azure SQL table
The Logic Apps team suggests looking at Azure Data Factory to assist with this.
For example: Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool, and the blog https://marczak.io/posts/azure-loading-csv-to-sql/ that Hury Shen provided in a comment.
We can't import the CSV file into SQL directly; that's why these posts look complex and involve many steps. Within Logic Apps itself, there just isn't an easy way for now.
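If a small amount of custom code is acceptable (for example in an Azure Function called from the Logic App), the CSV-to-Azure-SQL step itself is short with plain ADO.NET. A rough sketch using `SqlBulkCopy`; the target table name and the naive comma split are assumptions for illustration:

```csharp
// Sketch: read a headered CSV into a DataTable, then bulk load it into
// an Azure SQL table. No quoted-field handling; use a CSV library for
// real data.
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class CsvBulkLoader
{
    public static void Load(string csvPath, string connectionString)
    {
        var table = new DataTable();
        using (var reader = new StreamReader(csvPath))
        {
            // First line is assumed to be the header row.
            foreach (var col in reader.ReadLine().Split(','))
                table.Columns.Add(col.Trim());

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split(','));   // naive split: no quoted fields
        }

        using var conn = new SqlConnection(connectionString);
        conn.Open();
        using var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.MyTable" };
        bulk.WriteToServer(table);
    }
}
```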

Importing JSON data into RavenDB

I'm attempting to migrate some data into RavenDB. We have a text file with all the data in JSON format, but I don't see a straightforward way to import it into RavenDB. The data was not created by any RavenDB utility such as Smuggler or a backup; it was generated by parsing a data dump from another application. The data is raw JSON, exactly as our RavenDB application expects it, with the exception that there is no metadata.
What is the easiest way to import this data?
Thanks in advance.
Write a PowerShell script to read the JSON file one line at a time, then just POST each document to `http://localhost:8080/databases/your-db/docs/your-doc-id`.
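For reference, the same idea in C# with `HttpClient`. Whether the endpoint expects PUT or POST and the exact URL shape depend on your RavenDB version; this sketch uses PUT to a specific document ID, and the sequential `imported/N` IDs are just an assumption:

```csharp
// Sketch: push a line-delimited JSON dump into RavenDB over HTTP, one
// document per line. IDs and the URL shape are illustrative only.
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class RavenImporter
{
    public static async Task ImportAsync(string path)
    {
        using var http = new HttpClient();
        int id = 0;
        foreach (var line in File.ReadLines(path))
        {
            var url = $"http://localhost:8080/databases/your-db/docs/imported/{++id}";
            var content = new StringContent(line, Encoding.UTF8, "application/json");
            var response = await http.PutAsync(url, content);  // PUT creates/overwrites at that ID
            response.EnsureSuccessStatusCode();
        }
    }
}
```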

Which dashboard analytics will support Parse.com data source?

I've developed an app that uses Parse.com as the back end. I now need a dashboard analytics software package (such as iDashboards) that will enable me to pull data from my Parse.com database classes and present some of that data in a pretty dashboard fashion.
iDashboards looks to be the kind of tool I'm after, but it only supports certain data source inputs such as JDBC, ODBC, SQL, MySQL, etc. Not being a database guru by any means, I'm not sure whether Parse.com can be classed as any of the above, but from what I've read it doesn't fall under any of these categories.
Can anybody recommend a way of either connecting Parse.com to iDashboard, or suggest another dashboard tool that will support Parse.com as a data source?
The main issue you are facing is that data coming out of Parse.com is in JSON format, while most dashboards prefer CSV files.
The best dashboard I am aware of is Tableau and there is a discussion about getting json into Tableau here: http://community.tableau.com/ideas/1276
If your preference is iDashboards, then you need to convert the JSON coming out of Parse into a CSV format that iDashboards can consume. You can do that using RJSON as mentioned in the post above, but you'll probably have an easier time with a simple PHP or Python script that periodically connects to Parse, pulls out data updates, and pushes them to your dashboard of choice.
Converting json to csv in php is addressed here: Converting JSON to CSV format using PHP
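The linked post covers PHP; as a rough sketch of the same idea in C#, here is the flat case only: a JSON array of identically shaped objects written out as CSV. No quoting or escaping is done, and nested or denormalized JSON (see the next answer) needs a real ETL step instead:

```csharp
// Sketch: flatten a JSON array of flat, identically shaped objects
// into CSV. Assumes every object has the same properties; values are
// written without CSV escaping.
using System.IO;
using System.Linq;
using System.Text.Json;

public static class JsonToCsv
{
    public static void Convert(string jsonPath, string csvPath)
    {
        using var doc = JsonDocument.Parse(File.ReadAllText(jsonPath));
        var rows = doc.RootElement.EnumerateArray().ToList();
        var headers = rows[0].EnumerateObject().Select(p => p.Name).ToList();

        using var writer = new StreamWriter(csvPath);
        writer.WriteLine(string.Join(",", headers));
        foreach (var row in rows)
            writer.WriteLine(string.Join(",",
                headers.Select(h => row.GetProperty(h).ToString())));
    }
}
```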
The difference is much more fundamental than an unsupported file format. JSON data coming out of Parse is stored in a denormalized form, which means that a single JSON data file may contain the equivalent of arbitrarily many tables in a relational database. Stated differently, one JSON file may translate into potentially many CSV files, and there is no unique choice of how to perform that translation.
This is a so-called ETL problem, where ETL stands for Extract-Transform-Load. As such, you may be interested in open source ETL tools such as Kettle. Kettle is supported by Pentaho and includes functionality that can help you develop a workflow to turn JSON data into multiple CSV files that can then be imported into iDashboards (or similar). Aside from Kettle, Talend is also widely used for this purpose and has the same ability.
Finally, note that Parse is powered by MongoDB, and exports JSON data that is easily stored and manipulated in MongoDB. As such, a natural fit for reporting on Parse data is any reporting tool built for MongoDB.
As of the time of this writing, there are two such options:
JSON Studio, which is a commercial solution that is built explicitly for MongoDB and has your stated capability to produce dashboards.
SlamData, which is an open source solution, also built for MongoDB, which allows native SQL on the database. The current version does not have reporting capabilities (just CSV export), but the 2.09 version due out in June has reporting dashboards baked in.
An advantage of using a MongoDB reporting tool is that you will not have to wrangle your data into relational form. If it's heavily nested, uses arrays, and so forth, it can be quite painful to develop an ETL workflow and keep it in sync with how the data is changing. Instead, all you have to do is build a script to pipe the raw data from Parse into a MongoDB instance (perhaps hosted by MongoLab or equivalent, if you don't want to host it yourself), and connect the MongoDB reporting tool on top.
You might also contact Parse and see if they have a recommended solution for this. It occurs to me they should probably bake some sort of analytical / reporting functionality into their APIs as this is such a common use case.
You can use Axibase Time-Series Database to ingest your data from Parse.com; it has built-in dashboards and widgets for visualization, or you can just export data from ATSD to CSV and use iDashboards.

entity framework, self-tracking entity and sqlserver file stream

I'm just starting a project where I need a WCF service that reads and writes files.
The architecture is based on DDD using Entity Framework Self-Tracking Entities.
The simple GUI should show a grid with a list of files; clicking a row downloads the file.
Can I use the SQL Server 2008 FILESTREAM feature with this architecture? Which strategy is the best one for managing this kind of entity?
Thanks.
FILESTREAM will not help you when using EF. EF doesn't use the streaming feature; it loads the column as varbinary(max). If you want to take advantage of FILESTREAM, you must load the data from the database with ADO.NET directly, and you need a streaming service to pass it back to the client in an efficient way.
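For reference, a sketch of that direct ADO.NET route. The `dbo.Files` table and its columns are assumptions, and FILESTREAM access has to happen inside a transaction so that `GET_FILESTREAM_TRANSACTION_CONTEXT()` returns a usable context:

```csharp
// Sketch: stream a FILESTREAM column off the NTFS store with
// SqlFileStream instead of loading it as varbinary(max) through EF.
using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class FileStreamReader
{
    public static byte[] ReadFile(string connectionString, Guid fileId)
    {
        using var conn = new SqlConnection(connectionString);
        conn.Open();
        using var tx = conn.BeginTransaction();

        using var cmd = new SqlCommand(
            @"SELECT FileData.PathName(),
                     GET_FILESTREAM_TRANSACTION_CONTEXT()
              FROM dbo.Files WHERE Id = @id", conn, tx);
        cmd.Parameters.AddWithValue("@id", fileId);

        string path;
        byte[] txContext;
        using (var reader = cmd.ExecuteReader())
        {
            reader.Read();
            path = reader.GetString(0);
            txContext = (byte[])reader[1];
        }

        // SqlFileStream opens the blob as a Win32 file handle, so the
        // bytes never round-trip through TDS as varbinary(max).
        using var stream = new SqlFileStream(path, txContext, FileAccess.Read);
        using var ms = new MemoryStream();
        stream.CopyTo(ms);
        tx.Commit();
        return ms.ToArray();
    }
}
```

In a WCF service you would return the stream itself (with streamed transfer mode) rather than buffering it into a byte array as this sketch does.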