Convert an online JSON set of files to a relational DB (SQL Server, MySQL, SQLite) - sql

I'm using a tool called Teamwork to manage my team's projects.
They have an online API that exposes JSON files, accessible with authorisation:
https://developer.teamwork.com/projects/introduction/welcome-to-the-teamwork-projects-api
I would like to convert this online data to a SQL database so I can create custom reports for my management.
I can't seem to find anything ready-made to do that.
I need a strategy to do this.

If you know how to program, this should be pretty straightforward.
In Python, for example, you could:
1. Come up with a SQL schema that maps to the JSON data objects you want to store, and create it in a database of your choice.
2. Use the Requests library to download the JSON resources, if you don't already have them on your system.
3. Convert each JSON resource to a Python data structure using json.loads.
4. Connect to your database server using the appropriate Python library for your database, e.g. PyMySQL.
5. Iterate over the Python data, inserting rows into the database as appropriate. This is essentially the JSON-to-tables mapping from step 1 made procedural. (A rough sketch of these steps follows below.)
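A minimal sketch of those steps, using SQLite instead of MySQL so it stays self-contained; the Teamwork site URL, the API-key placeholder, the /projects.json resource, and the column choices are assumptions you would adapt to the actual API responses and to your own schema from step 1:

```python
# Rough sketch only: site URL, credentials and columns are placeholders to adapt.
import json
import sqlite3
import requests

API_KEY = "your_teamwork_api_key"               # placeholder credential
BASE_URL = "https://yourcompany.teamwork.com"   # placeholder site URL

# Step 1: a toy schema for one resource type (projects).
conn = sqlite3.connect("teamwork.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS projects (
        id TEXT PRIMARY KEY,
        name TEXT,
        status TEXT,
        created_on TEXT
    )
""")

# Steps 2 and 3: download the JSON resource and parse it.
resp = requests.get(f"{BASE_URL}/projects.json", auth=(API_KEY, "x"))
resp.raise_for_status()
data = json.loads(resp.text)    # equivalently resp.json()

# Steps 4 and 5: map JSON objects to rows and insert them.
rows = [
    (p.get("id"), p.get("name"), p.get("status"), p.get("created-on"))
    for p in data.get("projects", [])
]
conn.executemany(
    "INSERT OR REPLACE INTO projects (id, name, status, created_on) VALUES (?, ?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```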
If you are not looking to do this in code, you should be able to use an open-source ETL tool to do this transformation. At LinkedIn a coworker of mine used to use Talend Data Integration for solid ETL work of a very similar nature (JSON to SQL). He was very fond of it and I respected his opinion, so I figured I should mention it, although I have zero experience of it myself.

Related

How to create a SQL database within VB.net?

I am currently creating a VB.net program in which users upload a song file to the program, and it is then saved within the program's files. I have set up the actual saving of the files, but I would also like to store some metadata for each song in a SQL database within my program.
I have looked online, and although I now understand the basics of SQL, I'm still a little fuzzy on how you actually implement this within VB.net. I have already added the import (Imports System.Data.SqlClient) but failed to work out how to begin coding in SQL.
The basics of what I'm trying to achieve is an if statement that will determine whether or not a SQL database has been created in a specific location, and if it hasn't, it should create it.
All constructive answers appreciated, thanks.
There are a number of different database engines available. The namespace that you have chosen contains the ADO.NET client classes for Microsoft SQL Server. You would use a connection string to specify how to connect to the database. This would often contain connection information, such as server name, user name, password etc, but it sounds like you want to store data locally.
There is a local version of SQL Server called LocalDB, but I think you would still need quite a lot of the SQL Server components installed for that to work. Although you can package these with your application, they may be too large for you, so you may want to look at SQL Server Compact Edition, which is much smaller, allows you to package the whole engine as part of your application, and is useful for storing data locally. Compact Edition doesn't have quite all of the features that LocalDB does, so you may want to compare the features available for each.
Although you can use the ADO.NET objects to connect to a database, I think most people these days would use a layer on top that transfers data back and forth between objects in memory and the database. This also allows you to use LINQ to query the database in most cases. I personally use Entity Framework, so you might want to look into that. There are different ways of configuring EF, so you may want to look at a tutorial. Once you have it set up, you will probably find it much easier and safer to work with than writing SQL manually.
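The answer above is specifically about VB.NET and ADO.NET, so real code for this question belongs there; purely to illustrate the connection-string and "check, then create" idea in the same language as the other sketches in this compilation, here is a minimal Python/pyodbc sketch against LocalDB. The ODBC driver name, the SongLibrary database name, and the presence of LocalDB on the machine are all assumptions:

```python
# Illustration only: assumes SQL Server LocalDB and ODBC Driver 17 are installed;
# "SongLibrary" is a placeholder database name.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    r"SERVER=(localdb)\MSSQLLocalDB;"
    "Trusted_Connection=yes;"
)

# autocommit=True because CREATE DATABASE cannot run inside a transaction.
conn = pyodbc.connect(conn_str, autocommit=True)
cur = conn.cursor()
cur.execute("SELECT db_id(?)", "SongLibrary")
if cur.fetchone()[0] is None:        # the "has it been created yet?" check
    cur.execute("CREATE DATABASE SongLibrary")
conn.close()
```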

Which dashboard analytics will support Parse.com data source?

I've developed an app that uses Parse.com as the back end. I now need a dashboard analytics software package (such as iDashboards) that will enable me to pull data from my Parse.com database classes and present some of that data in a pretty dashboard fashion.
iDashboards looks to be the kind of tool I'm after, but it only supports certain data source inputs such as JDBC, ODBC, SQL, MySQL, etc. Not being a database guru by any means, I'm not sure if Parse.com can be classed as any of the above, but from what I've read it doesn't come under any of these categories.
Can anybody recommend a way of either connecting Parse.com to iDashboards, or suggest another dashboard tool that will support Parse.com as a data source?
The main issue you are facing is that data coming out of Parse.com is going to be in JSON format. Most dashboards are going to prefer CSV files.
The best dashboard I am aware of is Tableau, and there is a discussion about getting JSON into Tableau here: http://community.tableau.com/ideas/1276
If your preference is using iDashboards, then you need to convert the JSON coming out of Parse into a CSV format that iDashboards can consume. You can do that using RJSON as mentioned in the post above, but you'll probably have an easier time of it with a simple PHP or Python script that periodically connects to Parse, pulls out data updates for you, and then pushes them to your dashboard of choice.
Converting json to csv in php is addressed here: Converting JSON to CSV format using PHP
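For the "simple python script" route, here is a minimal sketch of the pull-and-convert step. The endpoint and header names follow Parse's documented REST API, but the application ID, REST key, class name, and output file are placeholders:

```python
# Sketch: pull one Parse class over the REST API and flatten it to CSV.
# App ID, REST key, class name and the chosen output file are placeholders.
import csv
import requests

resp = requests.get(
    "https://api.parse.com/1/classes/GameScore",    # placeholder class name
    headers={
        "X-Parse-Application-Id": "YOUR_APP_ID",
        "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    },
)
resp.raise_for_status()
rows = resp.json()["results"]

# Use the union of all keys as the CSV header; missing values become blanks.
fieldnames = sorted({key for row in rows for key in row})
with open("gamescore.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)   # nested objects come out as raw text; see the next answer
```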
The difference is much more fundamental than "unsupported file format". In fact, JSON data coming out of Parse is stored in a so-called denormalized form, which means that a single JSON data file may contain the equivalent of arbitrarily many tables in a relational database. Stated differently, one JSON file may be translated into potentially many CSV files, and there's no unique choice of how to perform that translation.
This is a so-called ETL problem, where ETL stands for Extract-Transform-Load. As such, you may be interested in open source ETL tools such as Kettle. Kettle is supported by Pentaho and includes functionality that can help you develop a workflow to turn JSON data into multiple CSV files that can then be imported into iDashboards (or similar). Aside from Kettle, Talend is also widely used for this purpose and has the same ability.
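To make the "one JSON file, many tables" point concrete, here is a small hand-rolled illustration (not Kettle or Talend) that splits one nested document into a parent CSV and a child CSV; all field names are invented for the example:

```python
# Denormalization illustrated: one nested JSON document becomes two flat tables.
# Every field name here is made up for the example.
import csv
import json

doc = json.loads("""
{
  "objectId": "p1",
  "name": "Alice",
  "scores": [
    {"game": "chess", "points": 12},
    {"game": "go", "points": 7}
  ]
}
""")

# Parent table: one row per player.
with open("players.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["objectId", "name"])
    writer.writerow([doc["objectId"], doc["name"]])

# Child table: one row per nested score, keyed back to its parent row.
with open("scores.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["playerId", "game", "points"])
    for score in doc["scores"]:
        writer.writerow([doc["objectId"], score["game"], score["points"]])
```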
Finally, note that Parse is powered by MongoDB, and exports JSON data that is easily stored and manipulated in MongoDB. As such, a natural fit for reporting on Parse data is any reporting tool built for MongoDB.
As of the time of this writing, there are two such options:
JSON Studio, which is a commercial solution that is built explicitly for MongoDB and has your stated capability to produce dashboards.
SlamData, which is an open source solution, also built for MongoDB, which allows native SQL on the database. The current version does not have reporting capabilities (just CSV export), but the 2.09 version due out in June has reporting dashboards baked in.
An advantage of using a MongoDB reporting tool is that you will not have to wrangle your data into relational form. If it's heavily nested, uses arrays, and so forth, it can be quite painful to develop an ETL workflow and keep it in sync with how the data is changing. Instead, all you have to do is build a script to pipe the raw data from Parse into a MongoDB instance (perhaps hosted by MongoLab or equivalent, if you don't want to host it yourself), and connect the MongoDB reporting tool on top.
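A hedged sketch of that piping script, using pymongo and the same placeholder Parse credentials as in the earlier CSV example; the MongoDB connection URI and the database and collection names are also placeholders:

```python
# Sketch: copy one Parse class into a MongoDB collection for reporting.
# Parse credentials, class name, connection URI and collection names are placeholders.
import requests
from pymongo import MongoClient

resp = requests.get(
    "https://api.parse.com/1/classes/GameScore",
    headers={
        "X-Parse-Application-Id": "YOUR_APP_ID",
        "X-Parse-REST-API-Key": "YOUR_REST_KEY",
    },
)
resp.raise_for_status()
docs = resp.json()["results"]

client = MongoClient("mongodb://user:password@example.mongolab.com:27017/reporting")
collection = client["reporting"]["gamescores"]
if docs:
    collection.delete_many({})     # simple full refresh; incremental sync is left out
    collection.insert_many(docs)   # documents keep their nested structure as-is
```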
You might also contact Parse and see if they have a recommended solution for this. It occurs to me they should probably bake some sort of analytical / reporting functionality into their APIs as this is such a common use case.
You can use Axibase Time-Series Database to ingest your data from parse.com; it has built-in dashboards and widgets for visualization, or you can just export data from ATSD to CSV and use iDashboards.

Xcode iOS phone directory app. Core Data, SQLite, or

As part of an application I am trying to create, I am looking into data storage solutions. However, I have found many solutions that I cannot quite directly apply to the position I am in.
Basically, I want to display in my app a directory of the staff of my organization. About 100 or so individuals. I want to have generic attributes such as name, email, office#, etc.
However, my goal is to not end up with a static representation of the staff here! (People come and go, switch offices, etc.)
I am looking for the best way (if possible) to maintain a small database that I can administer, and if perhaps, something were to change to someone here, I can make the change and the change will be reflected accordingly.
Please help! I tried submitting my first app but got rejected because I relied on a webview to accomplish this task. This is an internship opportunity and my first real chance at development. Any help will be GREATLY appreciated.
Thanks!!!!!
The iPhone app's documents directory can be used to store data in any format you want (XML, JSON, or a proprietary format), because all you do is save a file. But if you choose to use the app directory to store data, you have to write code to read the file (very simple to do) and parse the information (not so simple, because the difficulty scales with the complexity of the information).
SQLite is a tool to store structured data, providing you a set of tools to access and use the information. You don't need to parse the information yourself, because SQLite does it for you through SQL queries.
For now, because you have a list of individuals, and these people are related to offices, I think you should use SQLite.
Core Data is an object graph management framework; it's a tool that gives you more options for data manipulation, and it can make your life very easy if you have a lot of data and very complex data models. I don't think you need it for your particular problem, but I think you should learn it at some point.
UPDATE 1
Your application will have something like:
A core database (SQL Server, Oracle, MySQL, etc.) will hold your individuals' information (your cloud database).
A web page (PHP, ASP.NET, etc.) will expose the core database information in JSON or XML format (your API); a sketch of this layer follows below.
An iPhone app will download the information from the web page and store it in a local SQLite database. (You have to decide when you will update the local SQLite copy: when the app is opened, once a week, once a month, twice a day, etc.) (your local storage method).
The app displays the individuals' information from the local SQLite database.
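As a minimal sketch of the middle layer (the "web page" that serves JSON), Flask and SQLite stand in here for whatever server stack and core database you actually pick, and the staff table layout is invented for the example:

```python
# Sketch of the API layer: expose the staff directory from the core database as JSON.
# Flask and SQLite are illustrative stand-ins; table and column names are made up.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/staff.json")
def staff_listing():
    conn = sqlite3.connect("directory.db")
    conn.row_factory = sqlite3.Row
    rows = conn.execute("SELECT id, name, email, office FROM staff").fetchall()
    conn.close()
    # The iPhone app downloads this payload and caches it in its local SQLite store.
    return jsonify(staff=[dict(row) for row in rows])

if __name__ == "__main__":
    app.run()
```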

How to create a SQL database from a strongly typed dataset

I'm looking for an easy way to transfer a database schema I have developed inside Visual Studio as a strongly typed dataset (xsd file) into a corresponding SQL Server database. Silly me, I assumed the process would be straightforward, but I can't find out how to do it. I assume I could duplicate the tables column by column, but that seems so error prone. Does anyone know of a way to perform the schema transfer like this? Maybe a tool to translate the xsd file into a corresponding SQL Server DDL file?
Final thought: once I have the schema transferred, moving data around between the two data stores will be straightforward; it's just getting the schemas synced that has me stumped...
Thanks,
Keith
Why didn't you implement your data model directly in SQL Server? That is the more common, better-engineered approach, and I think this is why Microsoft has not provided any wizard or tool for this case. You can also keep your data model as scripts or .sql files, manage them via SVN, and use them whenever you need to implement the model.
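If, despite the advice above, a rough first pass generated from the existing .xsd would still help, a short script can get you part of the way (hand-checking keys, lengths, and constraints is still needed). This sketch assumes the usual typed-dataset layout of one xs:element wrapping an xs:complexType/xs:sequence per table; the file name and the type mapping are assumptions:

```python
# Hedged sketch: derive starter CREATE TABLE statements from a typed-dataset .xsd.
# Assumes each table is an xs:element wrapping an xs:complexType/xs:sequence whose
# child xs:elements are columns; keys, constraints and exact types still need hand work.
import xml.etree.ElementTree as ET

XS = "{http://www.w3.org/2001/XMLSchema}"
TYPE_MAP = {                       # rough xs -> SQL Server type mapping (an assumption)
    "xs:string": "NVARCHAR(255)",
    "xs:int": "INT",
    "xs:long": "BIGINT",
    "xs:boolean": "BIT",
    "xs:dateTime": "DATETIME",
    "xs:decimal": "DECIMAL(18, 2)",
}

tree = ET.parse("MyDataSet.xsd")   # placeholder file name
for element in tree.iter(XS + "element"):
    sequence = element.find(f"{XS}complexType/{XS}sequence")
    if sequence is None:
        continue                   # not a table definition
    columns = [
        f"    [{col.get('name')}] {TYPE_MAP.get(col.get('type', ''), 'NVARCHAR(MAX)')}"
        for col in sequence.findall(XS + "element")
    ]
    if columns:
        print(f"CREATE TABLE [{element.get('name')}] (\n" + ",\n".join(columns) + "\n);")
```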

best way of migrating customised metadata associated with source component into Tridion environment

If we are migrating content from a source Content Management System to Tridion, what is the best way of migrating customised metadata associated with the components (content) of the source Content Management System into Tridion? Should we migrate it directly into the SQL Server database, or is there an option to migrate it in the form of some XML file, etc.?
Migrating directly into SQL Server is unsupported, and the entire system would be unsupported at that point, due to possible data consistency issues.
The most straightforward way is to read the data from the source system, and use the Tridion API to recreate the item.
If migrating metadata, some of the data would likely fit best into a taxonomy, which would mean you'd want to migrate the keywords / structure first, then tag the content as it came into Tridion.
You have a few options when migrating content into Tridion.
I can't tell from the above whether you are talking about migrating to SQL Server as an intermediate format, or directly into the Tridion database. Importing directly into the Tridion database is definitely not a supported solution, and could lead to unpredictable results.
You need to use the API: either the Core Service or the TOM.NET API (if you have Tridion 2011), or the old TOM API if not.
A popular approach is to export all content into an XML format that you can then process with a .NET application.
There are some good articles on migrating content into Tridion by Ryan Durkin here, and by Nuno Linhares here.
As mentioned before, migrating directly into the database is not an option if you are planning to use SDL Tridion as the final CMS.
Apart from choosing a supported mechanism for the migration, pay attention to how you are going to structure the metadata in the new CMS; depending on the volume, structure, hierarchy, and relations across metadata items, the process can become complex.
Also pay special attention to the BluePrint concept, as you can probably merge duplicated values from the old system into a single inherited value.
Don't think only about how to get the metadata into the system, but also about how that metadata will be used and maintained in the new CMS, in this case SDL Tridion.
You can also check a recent post about migration, and planning a migration in general, in case it adds some more information:
Can we automate migrating to SDL Tridion?