Hi guys, I have a simple DB with two fields (a time and a number from 1-3). The data needs to be exported and shown in simple charts (horizontal bars from 0 to the max time in my DB). What is the best way to do that?
The easiest way is to establish direct data access from Excel to SQL Server and use Excel's abilities for the graphics.
If you need the data "exported", it is quite easy to get a table as a CSV list. Again, this can be opened directly with Excel to do the graphical work there.
Depending on your environment you might also think about a reporting tool; the obvious first choice would be SSRS or PowerBI, which comes in the box.
You might even use a SELECT * FROM YourTable and just copy-and-paste the full result into Excel.
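If you go the plain-query route, a minimal sketch that already shapes the data for a horizontal bar chart would be something like the following (the table name and the columns EventTime and Number are only assumptions about your schema):
-- assumed schema: dbo.YourTable with a time column (EventTime) and a number 1-3 (Number);
-- returns one row per Number with its maximum time, ready to chart as horizontal bars
SELECT Number,
       MAX(EventTime) AS MaxTime
FROM dbo.YourTable
GROUP BY Number
ORDER BY Number;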
The main things to think about:
One-time action or a recurring process
Degree of automation
Size of data / Count of rows
Location / Access-rights / Linkability of your systems
Existing tools
I have my data models in Excel sheets and my actual database is PostgreSQL 9.5. I would like to make an automated process that compares the tables in the database with the data models in Excel and either makes the changes in the DB automatically or at least lists out the differences between them. How can I do this? Can it be done using VBA macros, or is there any other alternative? Please give your suggestions on this.
Comparison is one of the bigger weaknesses in Excel. My approach would be something like this:
Make use of PostgreSQL's built-in functionality to describe its data model, and copy that to Excel (or pull it via ODBC if you want to over-engineer it); see the query sketch below
Reshape the output of step 1 to something that has the same format as your Excel based data model
Do the comparison (either in Excel or in an external diff tool)
Steps 1 and 2 can be done in VBA with a lot of string manipulation, but they can also be a copy/paste operation, depending on what tools you have available.
The transformation in step 2 can also be handled with Get & Transform (in newer Excel) or PowerQuery (in older Excel).
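For step 1, a minimal sketch of such a query (assuming your tables live in the public schema) could be:
-- let PostgreSQL 9.5 describe its own data model via information_schema;
-- paste the result into Excel as the "database side" of the comparison
SELECT table_name,
       column_name,
       data_type,
       is_nullable,
       character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY table_name, ordinal_position;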
I will briefly explain what I have and need here, and later if I can, I will edit this post and add a reproducible example.
My project:
Query data from Oracle databases into one worksheet in Excel, then use a LOOKUP procedure to copy data into an editable table in a second worksheet. The second worksheet needs to be in a table format for filtering, and have a drop down option to filter the data by date ranges. The data needs to be refreshed 1-2 times a week only by 1-2 approved staff members.
Concerns:
Per suggestion I installed Power Query for Excel 2010, which required dependencies before it could work. The convenience factor is great and it makes it so that SQL queries can be edited without messing around in VBA code. However, the dependencies setup (Oracle client for data connections) limits casually deploying this as a solution.
The data connections, the queries, and the data lookup could all be done in VBA and assigned to macros.
Questions:
Should I use Power Query to query the data and then VBA for the second sheet's LOOKUP and date-range filtering, or should this all be written in Excel VBA macros?
Which is more future-proof? Are there any advantages to using Power Query that would make this task more edit-friendly for non-coders?
Thanks!
This can probably be solved with Power Query alone, without VBA. I wouldn't recommend storing the queries in an Excel table; it is best to move them to the server. A view or a function would be suitable. Querying the database and editing this view/function will then work only for approved users.
This is more secure and will require only one Excel workbook. In Power Query you can refer to the old copy of the table at the moment you refresh it, so you can keep the data already entered and still get the new rows.
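A minimal sketch of such a server-side view (the table, column, and role names are only placeholders, not your actual schema):
-- hypothetical view: the workbook queries this instead of embedding the SQL,
-- so changing the report logic only means editing the view on the server
CREATE OR REPLACE VIEW report_source AS
SELECT order_id,
       order_date,
       amount
FROM   orders
WHERE  order_date >= ADD_MONTHS(TRUNC(SYSDATE), -12);

-- limit who may read (and therefore refresh) the data
GRANT SELECT ON report_source TO reporting_role;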
Your project seems to me like an ad-hoc solution.
We work with a lot of data at my job, and I want to try and find a way to limit the amount of copying from SSMS to an Excel sheet that goes to the client.
What I want to be able to do, using SSIS if possible or any other way (maybe Power Query?), is copy the data pulled via a SQL query to a specific Excel workbook sheet.
For example, if I want a count of members by state, I'd have the query run and the results copied to the sheet called "State" in the Excel workbook.
Example code:
SELECT C.State, COUNT(*) AS [Count]
FROM [dbo].[Input] I
JOIN Cassresults C ON C.ID = I.ID
GROUP BY C.State
ORDER BY [Count] DESC;
The Excel workbook will never change for the client. The only thing that may change are the queries, but those are easily updated.
Is there a way to actually do this or am I nuts for thinking so? I hope I explained it well enough.
SSAS, SSIS, PowerQuery, PowerBI, Excel PowerPivot, SSRS, and Excel data queries are all geared for this type of use. I would definitely NOT recommend VBA, as your users will constantly get a security warning and it is more complex than needed.
For Excel, a good starting location is probably the Data tab: click "From Other Sources" and check out the different source types. "From Microsoft Query" gives you the ability to write a query or copy one from SSMS.
The only question is: will the data sources change? If so, every workbook you create and distribute will become obsolete and need to be changed. SSRS is a good choice for letting users grab the report they need (and export it to Excel).
SSAS is great as well, but start with PowerPivot in Excel; again, data connections can move, and a SharePoint data connection library is one way to combat that.
This is like a BI and reporting design question and you will get a plethora of answers.
I made a GUI for generating SQL, something much like the MS Access visual query designer; the purpose was to let our Customer Service team make their own reports. But even after designing the whole thing, I can see that they are unsure how to proceed when generating a new report.
SQL is quite intuitive for me after long experience, but things like grouping, aggregate functions, and the various date/string functions are not easy for a non-programmer.
How can I make it easier for a non-programmer to build the SQL using a GUI?
Maybe you can apply a simplifying transcription of the SQL. Think something like https://ifttt.com/wtf. This in combination with the visual part might make it easier to understand what's happening.
Please follow these steps:
1) a combo box for selecting the database
2) a combo box for selecting a table in the selected database
3) a grid that displays the columns of the selected table
4) the user selects/updates each column's display name, display order, aggregate option, and group-on option
5) finally, display the result in a grid according to the chosen column settings
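For example, if the user picks a Customers table, groups on State, and chooses COUNT as the aggregate, the SQL the GUI generates and shows back to them (table and column names are made up for illustration) could look like:
-- hypothetical output of the query builder for "count of customers per state"
SELECT State,
       COUNT(*) AS CustomerCount
FROM Customers
GROUP BY State
ORDER BY CustomerCount DESC;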
problem
how to best parse/access/extract "excel file" data stored as binary data in an SQL 2005 field?
(so all the data can ultimately be stored in other fields of other tables.)
background
basically, our customer is requiring a large volume of verbose data from their users. unfortunately, our customer cannot require any kind of db export from their users. so our customer must supply some sort of UI for their users to enter the data. the UI our customer decided would be acceptable to all of their users was excel, as it has a reasonably robust UI. so, given all that, our customer needs this data parsed and stored in their db automatically.
we've tried to convince our customer that the users will do this exactly once and then insist on db export! but the customer can not require db export of their users.
our customer is requiring us to parse an excel file
the customer's users are using excel as the "best" user interface to enter all the required data
the users are given blank excel templates that they must fill out
these templates have a fixed number of uniquely named tabs
these templates have a number of fixed areas (cells) that must be completed
these templates also have areas where the user will insert up to thousands of identically formatted rows
when complete, the excel file is submitted from the user by standard html file upload
our customer stores this file raw into their SQL database
given
a standard excel (".xls") file (native format, not comma or tab separated)
file is stored raw in a varbinary(max) SQL 2005 field
excel file data may not necessarily be "uniform" between rows -- i.e., we can't just assume one column is all the same data type (e.g., there may be row headers, column headers, empty cells, different "formats", ...)
requirements
code completely within SQL 2005 (stored procedures, SSIS?)
be able to access values on any worksheet (tab)
be able to access values in any cell (no formula data or dereferencing needed)
cell values must not be assumed to be "uniform" between rows -- i.e., we can't just assume one column is all the same data type (e.g., there may be row headers, column headers, empty cells, formulas, different "formats", ...)
preferences
no filesystem access (no writing temporary .xls files)
retrieve values in defined format (e.g., actual date value instead of a raw number like 39876)
My thought is that anything can be done, but there is a price to pay. In this particular case, the price seems to be too high.
I don't have a tested solution for you, but I can share how I would give my first try on a problem like that.
My first approach would be to install Excel on the SQL Server machine, code some assemblies that consume the file from your rows using the Excel API, and then load them onto SQL Server as assembly (CLR) procedures.
As I said, this is just an idea; I don't have the details, but I'm sure others here can complement or criticize it.
But my real advice is to rethink the whole project. It makes no sense to read tabular data from binary files stored in a cell of a row of a table in a database.
This looks like an "I wouldn't start from here" kind of a question.
The "install Excel on the server and start coding" answer looks like the only route, but it simply has to be worth exploring alternatives first: it's going to be painful, expensive and time-consuming.
I strongly feel that we're looking at a "requirement" that is the answer to the wrong problem.
What business problem is creating this need? What's driving that? Try the Five Whys as a possible way to explore the history.
It sounds like you're trying to store an entire database table inside a spreadsheet and then inside a single table's field. Wouldn't it be simpler to store the data in a database table to begin with and then export it as an XLS when required?
Without opening up an instance of Excel and having Excel resolve worksheet references, I'm not sure it's doable at all.
Could you write the varbinary to a Raw File Destination? And then use an Excel Source as your input to whatever step is next in your precedence constraints.
I haven't tried it, but that's what I would try.
Well, the whole setup seems a bit twisted :-) as others have already pointed out.
If you really cannot change the requirements and the whole setup, why don't you explore components such as Aspose.Cells or Syncfusion XlsIO, native .NET components that allow you to read and interpret native Excel (XLS) files? I'm pretty sure that with either of the two you should be able to read your binary Excel data into a MemoryStream, feed that into one of those Excel-reading components, and off you go.
So with a bit of .NET development and SQL CLR, I guess this should be doable - not sure if it's the best way to do it, but it should work.
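If you do go the SQL CLR route, the T-SQL side might look roughly like this; the assembly, class, and procedure names are made up for illustration, only the CREATE ASSEMBLY / EXTERNAL NAME mechanics are standard SQL 2005 syntax:
-- register the hypothetical .NET assembly that reads the workbook bytes
-- (external components such as Aspose.Cells typically need UNSAFE)
CREATE ASSEMBLY ExcelBlobParser
FROM 'C:\deploy\ExcelBlobParser.dll'
WITH PERMISSION_SET = UNSAFE;
GO

-- expose its entry point as a CLR stored procedure
CREATE PROCEDURE dbo.usp_ParseExcelBlob
    @UploadId INT
AS EXTERNAL NAME ExcelBlobParser.[StoredProcedures].ParseExcelBlob;
GO

-- parse one uploaded workbook into relational tables
EXEC dbo.usp_ParseExcelBlob @UploadId = 42;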