Transferring data from Excel to multiple indexed SQL tables

I have about 20 tables and one general table in SQL. That main table has indexes on its columns, and using those indexes I create a view that pulls data from the other 20 tables.
My question is: what would be the most efficient way to create a process that updates all of those tables from an Excel source? It should be future proof (new Excel data is loaded once a month, for example).
If the answer is an SSIS package, what would it look like? Do you have any examples of something similar?
Thank you for the help.

I for one do not like SSIS. I find it a pain to troubleshoot, but for some tasks it's fine. If I were you I would:
Use the data import wizard from within SQL Server Management Studio to import the Excel file.
Simply get the data into a staging table in SQL.
You'll have the option to save this as an SSIS package, which is good for automation.
Now, write a pile of SQL to sort and update the data as you wish. Perhaps make a series of stored procedures.
Create a SQL Server Agent job that runs your package and then runs each stored procedure.
Writing a solution in this fashion will let you troubleshoot each step and makes reporting easy. You can do the whole thing in SSIS, but like I said, I'm not a fan of that tool. I like my code on the command line as much as possible for troubleshooting :)
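A minimal sketch of what that flow could look like on the SQL side (the table, column, and procedure names here are invented for illustration; your staging table should mirror the columns of the Excel sheet):

-- Staging table that the import wizard / saved SSIS package loads the raw Excel rows into
CREATE TABLE dbo.Staging_MonthlyExcel (
    SourceKey  INT           NOT NULL,
    Attribute1 NVARCHAR(100) NULL,
    Attribute2 NVARCHAR(100) NULL,
    LoadedAt   DATETIME2     NOT NULL DEFAULT SYSDATETIME()
);
GO

-- One stored procedure per target table: update existing keys, insert new ones
CREATE PROCEDURE dbo.usp_Load_TargetTable1
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dbo.TargetTable1 AS t
    USING dbo.Staging_MonthlyExcel AS s
        ON t.SourceKey = s.SourceKey
    WHEN MATCHED THEN
        UPDATE SET t.Attribute1 = s.Attribute1,
                   t.Attribute2 = s.Attribute2
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (SourceKey, Attribute1, Attribute2)
        VALUES (s.SourceKey, s.Attribute1, s.Attribute2);
END;
GO

The Agent job then gets one step that runs the saved package and one step per stored procedure, so when something breaks you know exactly which step to look at.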

I used this app from the Windows Store to convert Excel into a SQL script, then sent the script to our DBA.

Related

Direction to create a frontend/GUI for simple SQL query?

I work on a team that creates ad hoc SQL queries against a large database for users who are performing research. These searches are done from Excel via ODC files with an ODBC connection.
I'm looking for a program / best-language suggestion for building an app that lets users run the simpler queries they request themselves. Basically a window with a few text boxes for the variables, plus some date-range boxes, that runs the query and exports the results to a spreadsheet, etc.
So far I've only found tools that run the raw SQL query code itself.
Thanks in advance. -Steve

Crystal Reports vs. SQL Queries

I'm a programmer (mostly C++) who has moved into a non-software workplace. However, I don't have much experience with database stuff at all.
TL;DR: If we compare Crystal Reports to just writing scripts that execute SQL queries and parse the results, is there anything that CR can do that isn't possible via SQL queries & scripts? I'm talking purely in terms of extracting data - not making pretty documents.
Detail:
At my workplace they have a process where you run a bunch of Crystal Reports, modify the date range to the current month, manually export each one to Excel, delete the rows and columns that aren't needed, and then cut and paste into a summary Excel document that is used by management.
To me, this is pretty crazy and stupid. I'd like to automate/script most of it.
So I have two options:
Learn Crystal Reports and try to modify the existing reports to be more automated.
Dump CR and just learn SQL and do the whole thing programmatically with scripts working with CSV files or something.
I'd much rather learn SQL since it's more general and useful. But I need to be assured that I can get the data output that I need (without writing a million lines of code to reproduce CR myself).
So yeah, I'm looking for an answer like, "The two are equivalent. Anything you can do in CR you can do easily via scripts and SQL," or "If you need to group records into categories based on a parameter and then sum one of their fields, then CR will do it much more easily than raw code," to push me in one direction or another.
Edit:
Some additional detail. At the moment my Crystal reports run a database query, and then Crystal does things like, "don't display the records that are returned; instead group the records by Field A and then display the count of how many records are in each group."
Is functionality like this difficult to reproduce via SQL coding? I wouldn't want to have to write a Python (or whatever) script to parse and manipulate the data from plain-text CSV, for example.
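For reference, grouping and counting like that is a one-liner in plain SQL; assuming a hypothetical table and column, it would look something like this:

SELECT FieldA, COUNT(*) AS RecordCount
FROM dbo.SomeTable
GROUP BY FieldA
ORDER BY FieldA;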
You can't directly compare SQL and CR - they serve different purposes. SQL (in this context) is the data source; CR is the pretty output formatter. For Excel you need data, not formatted output. Excel combined with SQL can give you all the CR options (dynamic crosstab reports, charts, etc.) that you can't get directly from raw SQL output.
BTW, creating SQL views or procedures is often needed to overcome CR limitations; from this standpoint SQL has a lot more options than CR.
I personally would go the SQL+Excel route. In our company we simply use SQL+CR without postprocessing, and sometimes SQL+Excel. Our customers use different approaches.
But as others have said, the choice of tools depends on more than that. Who has to redesign the reports? Who will maintain them? How often do requirements change? Are there other uses for the CR reports besides sourcing Excel tables? Who will be woken up at night if the reports don't work?
Management perspective:
In many - I would say most - cases, management does not know SQL. So if a manager, for example in HR, wants to know the status of something, how will he get that status? This is where Crystal Reports comes into the picture: using Crystal Reports they do not have to worry about SQL; they just enter the required fields and get their data.
Programmer perspective:
Simple data outputs can be achieved through SQL, but consider a scenario where you need to pull details as well as a summary. I agree it can be done via SQL, but consider the overhead in time and proficiency required to develop such output using SQL. I bet it won't be as easy to build as it would be in Crystal. So I would say learn both SQL and Crystal; then you get to choose the right tool for each requirement.
You can write SQL and drop it into the Crystal Report. Best of both worlds, and possibly faster performance than the drag-and-drop Crystal functionality.
You will see some response time lag when the report runs.
There are actually a few things that Crystal Reports can do that are very tricky with plain SQL queries, because Crystal Reports can access the entire dataset in a single formula and can do things at runtime.
However, unless you have some really crazy, complex Crystal Reports, I would recommend building a tool in Excel that can pull the info straight into a new sheet with one click.
I did this and it got me a promotion, not kidding :P
I have a custom Excel Add-in I can give you the code for; it basically does this:
On open, connects to the database and downloads a list of menu options connected to views and procedures
Adds these menu options into a new Ribbon tab within Excel
When one is clicked, runs the view and dumps the entire dataset (properly formatted) into a new sheet
The advantage of this is that you can update the main menu list, and each view it references, without making any changes to the file or re-issuing anything to everyone.
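On the database side, the menu the add-in downloads can be driven by a single table like the sketch below (this is not the actual add-in code; the table and object names are made up):

CREATE TABLE dbo.ReportMenu (
    MenuCaption NVARCHAR(100) NOT NULL,
    ObjectName  SYSNAME       NOT NULL,  -- view or stored procedure the add-in runs
    SortOrder   INT           NOT NULL
);

INSERT INTO dbo.ReportMenu (MenuCaption, ObjectName, SortOrder)
VALUES ('Monthly Sales', 'dbo.vw_MonthlySales', 1),
       ('Open Orders',   'dbo.vw_OpenOrders',   2);

Adding a row here (and creating the matching view) is all it takes to publish a new menu option to every user.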
Crystal could be helpful if you want to create a document with a specific layout, logos, etc. and show some data on it. Exporting to Excel from a Crystal report is not easy - usually there are a lot of empty columns and rows, and each report has to be tweaked to avoid that.
If you need to export some data from a SQL Server database to Excel, your best option will be SSIS (I assume you have a SQL Server license). If you don't have a license for SSIS, or you are using, for example, an Access database, there are also some inexpensive tools which can retrieve data from any database (not just SQL Server) and export it to Excel. I would suggest checking this one: http://www.r-tag.com. It can run Crystal reports and SQL reports, so you can start using your Crystal reports immediately and convert them to SQL reports whenever you have time. Both kinds of report can be exported to Excel.
I fixed this by editing the Excel SQL to read Left(Column_maxLength, 250).
This resolved my issue; in my case the leftmost 250 characters were enough.
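In other words, the source query against the Excel sheet was presumably changed to something along these lines (the sheet and column names here are guesses):

SELECT Left([Column_maxLength], 250) AS Column_maxLength
FROM [Sheet1$]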

Programmatic Export with Indeterminate Table Structure from SQL Server 2008 using T-SQL

I have searched for an answer to this, and one seems not to exist.
Problem:
A website is querying a database and unable to return results (as an export to Excel) in a timely fashion. This is primarily due to result set size. I'd like to set up a background process to 'ping' for waiting queries and execute them one by one, dumping the data into a location to be downloaded from. The 'pinging' task can be handled in a whole host of ways. My original ideal solution was a trigger (alternatively, a SQL Server Agent task) that exported the data to the filesystem. But I have run into an issue: I don't know how to set up an amorphous output to the filesystem with a simple T-SQL statement.
SSIS is apparently the standard solution to this. I don't know enough about SSIS to know whether it will handle what I want it to do, but I have been told the queries are too great in number / various in output for that to be a feasible solution.
xp_cmdshell can be run to do a BCP export. This works fine, but apparently opens a security hole.
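For reference, that xp_cmdshell/BCP export is a one-liner along these lines (the server name, database, query, and output path are placeholders):

EXEC master..xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.PendingExport" queryout "C:\Exports\results.csv" -c -t, -T -S MYSERVER';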
Previous solutions:
A solution I used years ago, DTS passing data straight to the operating system, seems to have been disabled in SQL Server 2008/2012. I also used to be able to use sp_makewebtask to export data directly to the filesystem, but I can no longer do that either.
Current solution
I am writing a PowerShell script tied to some SQL tables and stored procedures to manage execution. This seems like a non-ideal solution; I'm curious as to whether I have missed something. Is there an easy way to set up SSIS to export data without a structure? A way to create an Excel file on the fly and fill it with data?
The answer seems to be no.
You can export to CSV instead of Excel (Excel opens CSV files easily), but you don't get any formatting. You can set up SSIS (or BCP in a scheduled task) to export a single column into the CSV file that already contains the commas and the text delimiters, so Excel will present the data in separate columns.
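That "single pre-delimited column" trick just means building the delimiters into the query itself, something along these lines (the table and column names are invented for illustration):

SELECT '"' + CustomerName + '","'
     + CONVERT(varchar(10), OrderDate, 120) + '","'   -- ISO date portion
     + CAST(Amount AS varchar(20)) + '"' AS CsvLine   -- NULLable columns would need ISNULL()
FROM dbo.Orders;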

MS Access: import a table from a file in a query

Is there a way to have an MS Access DB query import a table from a file?
Yes, as long as the data is organized. You can use VBA or a macro with TransferText or you can use Get External Data from the menu or ribbon, which will guide you through the steps.
EDIT
You can import into a new or existing table from say, CSV, like so:
SELECT * INTO NewTable
FROM [Text;HDR=Yes;FMT=Delimited;Database=C:\Docs].[Test.csv]
The solution will vary depending on the format of the file. If it's simple enough, checking out the options on the External Data tab (MS Access 2007) in the Import section, may do the trick.
For complex integrations, I'll often use SQL Server Integration Services (SSIS) to migrate the data into Access, where I can then process it with SQL queries. Of course, SSIS is a much "heavier" solution with a bit of a learning curve, but it's been handy when the wizards aren't flexible enough.

Anyone know a program to automatically dump a bunch of test (dummy) data into a sql database? [duplicate]

Possible Duplicate:
Tools for Generating Mock Data?
I am running SQL Server 2005 and I want to dump some dummy data into a large table with about 50 columns (I did not design it :P) - anyone know a tool to do this automatically? I want the rows to have all sorts of different data, and I would rather not write a script if there is already a tool out there - thanks.
Check out SQL Data Generator by Red Gate.
Do you want the tool to generate the data, or just get it into the database? If you just want to be able to easily dump in some canned data, use Excel.
Edit: These options only apply if you already have data from another system that you want to import in. After re-reading your question I don't think my answer is relevant...
Inside SQL Server 2005 (and 2008), you have a few built-in (aka free) options to import data:
1) Management Studio
This is by far the simplest option for importing data.
1) Connect to your database inside Management Studio.
2) In Object Explorer, right click on your database then select Tasks -> Import Data
3) From here you can specify your data source, do some minor transformations on the columns, and specify what table(s) you want to dump the data to.
4) You can save this as a job in case you want to repeat this (even automate this) as a future task.
2) SQL Server Integration Services
You can use Business Intelligence Development Studio (BIDS) to create an SSIS project and have much more granularity over your data source, the transformations on your data, and how you want to import it into your destination database(s).
This tool takes some time to master, but is very useful once you get the hang of it.
And, like the previous option, SSIS packages can be put into a job and set as an automated task.
3) Command line BCP Utility
If you're interested, I'd recommend Googling this option. It is time consuming, and there are easier ways to get data into a system. :)
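To give a flavour of it, a BCP import of a comma-delimited file looks roughly like this (the database, table, file path, and server name are placeholders):

bcp MyDb.dbo.BigTable in "C:\Data\dummy_rows.csv" -c -t, -T -S MYSERVER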
I hope that helps.