Usage of a data dump of a table for a website - SQL

I asked one of our company partners to give us read/write ODBC access so that we can pull raw data and create views from their case management system. They mentioned that they can instead provide us with a data dump of the tables within their website, from which we would be able to pull data.
I looked into what I can do with a data dump of tables and found that it is a detailed record of the tables in a database. It is typically used to take a backup of a database, or of multiple databases on a server, so that their contents can be restored in the event of any data loss.
I am looking into how I can use this to write my own SQL queries, get what I need, and create views. Where can I read more about how else I can use a data dump of a table?
Thanks
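Edit: to make the question concrete, here is roughly the workflow I picture once we have the dump, assuming it is a standard SQL dump file (all table and column names below are made up):

```sql
-- After restoring the dump into a database we control
-- (e.g. mysql our_copy < case_dump.sql, or running the script with sqlcmd),
-- build views over the restored tables for reporting.
CREATE VIEW open_cases AS
SELECT c.case_id,
       c.opened_date,
       s.status_name
FROM   cases AS c          -- hypothetical table from the dump
JOIN   case_status AS s    -- hypothetical lookup table
       ON s.status_id = c.status_id
WHERE  s.status_name <> 'Closed';
```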

Related

Understanding Azure SQL Server External Tables

We are trying to create a cross-database query using Azure's preview Elastic Query. So we will be creating an External Table to make these queries happen.
Unfortunately, I have some apprehension about how the queries will be executed. I don't want a query or stored procedure to fail at run-time because the database connection fails. I just don't understand how the External Tables work.
Azure's External Table docs have good information on how to query and create the table. I just can't find information that specifically spells out how the data exists.
Oracle's version of external tables is just flat files that are referenced. SQL*Loader loads data from external files into tables of an Oracle database. I couldn't find any documentation about Azure doing the same. (Is it implied that they are the same? Is that a stupid question?)
If it is this way (external flat files), when the external table gets updated, does SQL Server update the flat files so our external table stays up to date? Or will I have to delete/create the link again every time I want to run the query for up to date information?
Per Microsoft Support:
Elastic queries basically work as remote queries, which means the data is not stored locally but is pulled from the source database every time you run a query. When you execute a query on an external table, it makes a connection to the source database and gets the data.
With that being said, you do not have to delete/create the links. Once you have performed these steps, you can access the horizontally partitioned table “mytable” as though it were a local table. Azure SQL Database automatically opens multiple parallel connections to the remote databases where the tables are physically stored, processes the requests on the remote databases, and returns the results.
There is no specific risk associated with using this feature but it is simply like opening connections to the source database so it can pull data. Besides this you can expect some slowness when executing a remote query but nothing that will cause any other issues with the database.
If any of the databases becomes unavailable, queries that use the affected DB as source or target will experience query cancellations or timeouts.
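For reference, a minimal sketch of the setup the answer describes, for the simple case of a single remote database (all names and the server address are placeholders):

```sql
-- One-time setup in the database that will hold the external table
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL RemoteCred
WITH IDENTITY = 'remote_user', SECRET = '<password>';

-- External data source pointing at the remote Azure SQL database
CREATE EXTERNAL DATA SOURCE RemoteDb
WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',  -- placeholder server
    DATABASE_NAME = 'SourceDb',                  -- placeholder database
    CREDENTIAL = RemoteCred
);

-- External table: the column list must match the remote table's schema
CREATE EXTERNAL TABLE dbo.mytable (
    id   INT,
    name NVARCHAR(100)
)
WITH (DATA_SOURCE = RemoteDb);

-- Each query opens a connection and pulls the data at run time
SELECT TOP 10 * FROM dbo.mytable;
```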

Access Azure Data Lake Analytics Tables from SQL Server Polybase

I need to export a multi-terabyte dataset processed via Azure Data Lake Analytics (ADLA) onto a SQL Server database.
Based on my research so far, I know that I can write the ADLA output to a Data Lake Store or WASB using built-in outputters, and then read the output data from SQL Server using Polybase.
However, creating the result of ADLA processing as an ADLA table seems pretty enticing to us. It is a clean solution: no files to manage, multiple readers, built-in partitioning, distribution keys, and the potential for allowing other processes to access the tables.
If we use ADLA tables, can I access ADLA tables via SQL Polybase? If not, is there any way to access the files underlying the ADLA tables directly from Polybase?
I know that I can probably do this using ADF, but at this point I want to avoid ADF to the extent possible - to minimize costs, and to keep the process simple.
Unfortunately, Polybase support for ADLA tables is still on the roadmap and not yet available. Please file a feature request through the SQL Data Warehouse UserVoice page.
The suggested work-around is to produce the output as CSV in ADLA, create the partitioned and distributed table in SQL DW, and then use Polybase to read the data and fill the SQL DW managed table.
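A sketch of that work-around on the SQL DW side, assuming the ADLA job has already written CSV files to a Data Lake Store folder (all names, paths, and the credential are placeholders):

```sql
-- File format describing the CSV files ADLA wrote out
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"')
);

-- Data source pointing at the Data Lake Store account
CREATE EXTERNAL DATA SOURCE AdlsOutput
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://myadls.azuredatalakestore.net',  -- placeholder account
    CREDENTIAL = AdlsCredential                        -- assumed to exist already
);

-- External table over the CSV output folder
CREATE EXTERNAL TABLE ext.ProcessedData (
    id      BIGINT,
    payload NVARCHAR(4000)
)
WITH (
    LOCATION = '/output/processed/',
    DATA_SOURCE = AdlsOutput,
    FILE_FORMAT = CsvFormat
);

-- Load into a distributed, managed SQL DW table via CTAS
CREATE TABLE dbo.ProcessedData
WITH (DISTRIBUTION = HASH(id), CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM ext.ProcessedData;
```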

What is an efficient way to manage a VB form which handles huge data (much larger than 2 GB) with Access 2013 as the database?

I am currently designing a Windows form using VB.NET. The internet states that 2 GB is the limit for a .accdb file. However, I am required to handle data a lot larger than 2 GB. What is the best way to implement this? Is there any way I could regularly store data to some other Access database and empty my main database? (But would this create trouble in migrating data from the .accdb to the Windows form when demanded by the user?)
Edit: I read somewhere that splitting could help, but I don't see how; it only creates a copy of the database on your local machine on the network.
You can use linked tables to Microsoft SQL Server 2012 Express Edition, where the maximum relational database size is 10 GB.
Alternatively, you can use MySQL linked tables, which have a 2 TB limitation.
It's not easy to give a generic answer without further details.
My first recommendation would be to change the DBMS and use SQLite, which supports databases up to roughly 140 TB.
If you must use Access then you will need a master database containing pointers to the real location of the data.
E.g.:
MasterDB -> LocationTable -> (id, database_location)
So if you need a resource, you query the master with the id to get its actual location, then connect to the secondary database and fetch the data.
Or you could have a mapping model where a certain range of IDs lives in a certain database; then you can keep that logic in code and access the database only once.
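A minimal sketch of the pointer approach, using made-up names (step 2 would live in the VB.NET application code):

```sql
-- Hypothetical lookup table in the master .accdb
CREATE TABLE LocationTable (
    id                LONG CONSTRAINT pk_location PRIMARY KEY,
    database_location TEXT(255)   -- path to the secondary .accdb holding the row
);

-- Step 1: ask the master where resource 42 lives
SELECT database_location
FROM   LocationTable
WHERE  id = 42;

-- Step 2 (application code): open a connection to the returned file
-- and fetch the actual row from that secondary database.
```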
Use SQL Server Express. It's free.
https://www.microsoft.com/en-us/cloud-platform/sql-server-editions-express
Or, if you don't want to use that, you'll need to split your data into different Access databases, and link to what you need. Do a Google search on this and you'll have everything you need to get going.
I agree with the other posts about switching to a more robust database system, but if you really do have to stay in Access, then yes, it can be done with linked tables.
You can have a master database with queries that use linked tables in multiple databases, each of which can be up to 2 GB. If a particular table needs to have more than the Access limit, then put part of the data in one database and part in another. A UNION query will allow you to return both tables as a single dataset.
Reads and updates are one thing, but there is the not-so-trivial task of managing growth if you need to do inserts. You'll need to know when a database file is about to grow beyond 2 GB and create a new one whose tables must then be linked to your master database.
It won't be pretty.
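To illustrate the UNION approach, a minimal sketch with made-up names, where Orders_A and Orders_B are linked tables living in two different .accdb files:

```sql
-- Present the two physical halves as one logical dataset.
-- UNION ALL keeps duplicates and skips the de-duplication cost;
-- use UNION instead if the halves can overlap.
SELECT OrderID, OrderDate, Amount FROM Orders_A
UNION ALL
SELECT OrderID, OrderDate, Amount FROM Orders_B;
```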

How to insert data from one Azure SQL Database into a different Azure SQL Database?

I realize that Azure SQL Database does not support doing an insert/select from one db into another, even if they're on the same server. We receive data files from clients and we process and load them into a "load database". Once the load is complete, based upon various rules, we then move the data into a production database of which there are about 20, all clones of each other (the data only goes into one of the databases).
I am looking for a solution that will allow us to move the data. There can be 500,000 records in a load file, so moving them one by one is not really feasible.
Have you tried Elastic Query? Here is the Getting Started guide for it. Currently you cannot perform remote writes, but you can always read data from remote tables.
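Since remote writes are not supported, one way to apply this is to run the move from the destination database and pull from the load database through an external table. A sketch, assuming the elastic query objects are already set up and using made-up names:

```sql
-- Run this in the production (destination) database.
-- ext.LoadRecords is an external table pointing at the load database.
DECLARE @batch_id INT = 1234;  -- assumed batch identifier column

INSERT INTO dbo.Records (record_id, client_id, payload)
SELECT record_id, client_id, payload
FROM   ext.LoadRecords
WHERE  batch_id = @batch_id;
```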
Hope this helps!
Silvia Doomra

Pulling data across multiple servers

The company I am working for is implementing SharePoint with reporting servers that run on a SQL back end. The information that we need lives on two different servers. The first is the manufacturing server, which collects data from PLCs and inputs that information into a SQL database; the other is our ERP server, which has data for payroll and hours worked on specific projects. The idea I have is to create a view on a separate database and from there pull the information from both servers. I am having a little bit of trouble with the syntax for connecting the two servers to run the view. We are running MS SQL. If you need any more information or clarification, please let me know.
Please read this about Linked Servers.
Alternatively you can build a data warehouse, which would be a reporting database. You can feed it either by writing procs that use linked servers, or with SSIS packages if the servers aren't linked.
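A sketch of the linked-server route, with placeholder server, database, and table names:

```sql
-- Register the ERP server as a linked server (run once, on the reporting server)
EXEC sp_addlinkedserver
     @server     = N'ERPSRV',                    -- local alias
     @srvproduct = N'',
     @provider   = N'SQLNCLI',
     @datasrc    = N'erp-server.company.local';  -- placeholder host name

-- A view that joins data from both servers using four-part names
-- (assumes the manufacturing server is linked as MFGSRV the same way)
CREATE VIEW dbo.ProjectHours AS
SELECT m.ProjectID,
       m.UnitsProduced,
       e.HoursWorked
FROM   MFGSRV.Manufacturing.dbo.Production AS m
JOIN   ERPSRV.ERP.dbo.Payroll AS e
       ON e.ProjectID = m.ProjectID;
```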
It all depends on the project's size and complexity, but in many cases it is difficult to aggregate data from multiple sources with views. The reason is that the source data structure is modeled for the source application, not optimized for reporting.
In that case, I would suggest going with an ETL process, where you would create a set of Extract, Transform and Load jobs to get data from multiple sources (databases) into a target database where data will be stored in the format optimized for reporting.
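As a hedged sketch of what one such load job might look like in the reporting database (all object names are made up, and the source is assumed to be reachable as a linked server):

```sql
-- Nightly incremental load of hours worked into a reporting fact table
CREATE PROCEDURE dbo.LoadFactHours AS
BEGIN
    INSERT INTO dbo.FactHours (EmployeeID, ProjectID, WorkDate, Hours)
    SELECT e.EmployeeID, e.ProjectID, e.WorkDate, e.Hours
    FROM   ERPSRV.ERP.dbo.TimeEntries AS e   -- extract from the source
    -- Only pull rows newer than what the fact table already holds
    WHERE  e.WorkDate > ISNULL((SELECT MAX(WorkDate) FROM dbo.FactHours),
                               '19000101');
END;
```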
Ralph Kimball has many great books on the subject, for example:
1) The Data Warehouse ETL Toolkit
2) The Data Warehouse Toolkit
They are truly worth the read if you are dealing with data.