Is there a way to fill a temp table/other SQL variable after a SPROC has been executed?
I have the SQL results sitting in the SSMS results window, but I don't want to re-run the SPROC to fill a temp table because it takes over an hour.
I can export to CSV and re-import (using OPENROWSET, which is always difficult), but I was curious if there are any more elegant solutions?
I've run into this several times and have not found anything simple.
There's no way to get the results back out of the SSMS grid after the fact.
Also, watch out when exporting from SSMS to CSV, because:
a) it formats things (especially dates and numbers), and
b) it truncates columns (43,679 characters max, which might sound like a lot but gets used up quickly with XML).
So your export might not be a true representation of the results.
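For the next run, one way to avoid this problem entirely is to capture the procedure's output as it executes (a sketch; dbo.MyLongRunningProc and the column list are placeholders and must match your procedure's actual result set):

-- The temp table's columns must match the procedure's result set exactly
CREATE TABLE #results (
    SomeID   int,
    SomeText varchar(100)
)

INSERT INTO #results
EXEC dbo.MyLongRunningProc

-- The data now survives in the temp table for further querying
SELECT * FROM #results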
I've discovered the SQL view in MS Access to execute some queries, but I need to execute about 20,000 UPDATE queries that I have in a .sql file.
When I paste into the SQL view, it says "Text is too long to modify".
How can I run those UPDATEs?
The limit on the number of characters in an Access SQL query is "about 64,000" - see https://support.office.com/en-us/article/Access-2010-specifications-HA010341462.aspx. And unfortunately you cannot execute multiple statements in one query. I think this will mean quite a bit of work for you in VBA. Here is an example approach:
Dim f As Integer
Dim sql As String
f = FreeFile
' Adjust the path to wherever your .sql file lives
Open "C:\path\to\updates.sql" For Input As #f
Do While Not EOF(f)
    Line Input #f, sql                  ' one statement per line
    If Len(Trim$(sql)) > 0 Then CurrentDb.Execute sql, dbFailOnError
Loop
Close #f
Probably a nasty surprise for you if you are used to executing huge batches of statements using other RDBMS!
An alternative suggestion: we don't know exactly what your file looks like or where it comes from, but if it is generated from another RDBMS that you have access to, then I would very strongly recommend that you set up an ODBC connection to it and query out the data you need (either by linking the tables or writing a pass-through query), then insert that into your local Access tables. This will be many orders of magnitude faster than executing thousands of individual statements.
If your only source of the data is the SQL statements, then you may still be better off if you can parse the SQL text into relevant columns (for example the PK and the value to be updated, or, if inserting, all column values), then save as a CSV file, import into Access, add keys as necessary, and then run a single UPDATE statement as an updateable query against the imported data and the existing tables. Dumping the file into Excel and using the various string functions may enable you to parse the data quite quickly.
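For that last approach, the single update could look something like this (a sketch in Access SQL; ExistingTable, ImportedData and the column names are hypothetical placeholders for your own schema):

UPDATE ExistingTable INNER JOIN ImportedData
    ON ExistingTable.PK = ImportedData.PK
SET ExistingTable.SomeColumn = ImportedData.NewValue;

Because the join is on the key, Access treats this as an updateable query and applies all the changes in a single pass.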
There may be an easier way, but you could write VBA code that reads the text file line by line and then uses DoCmd.RunSQL to run each query.
I am working on developing an application for my company. From the beginning we were planning on having a split DB with an Access front end and storing the back-end data on our shared server. However, after doing some research we realized that storing the data in a back-end Access DB on a shared drive isn't the best idea for many reasons (VPN access to the shared drive is very slow from remote offices, Access might not be the best with millions of records, etc.). Anyway, we decided to still use the Access front end, but host the data on our SQL Server.
I have a couple questions about storing data on our SQL server. Right now when I insert a record I do it with something like this:
Private Sub addButton_Click()
    Dim rsToRun As DAO.Recordset
    Set rsToRun = CurrentDb.OpenRecordset("SELECT * FROM ToRun")
    rsToRun.AddNew
    rsToRun("MemNum").Value = memNumTextEntry.Value
    rsToRun.Update
    memNumTextEntry.Value = Null
End Sub
It seems like it is inefficient to have to use a SQL statement like SELECT * FROM ToRun and then make a recordset, add to the recordset, and update it. If there are millions of records in ToRun, will this take forever to run? Would it be more efficient just to use an INSERT statement? If so, how do you do it? Our program is still young in development, so we can easily make pretty substantial changes. Nobody on my team is an Access or SQL expert, so any help is really appreciated.
If you're working with SQL Server, use ADO. It handles server access much better than DAO.
If you are inserting data into a SQL Server table, an INSERT statement can have (in SQL 2008) up to 1000 comma-separated VALUES groups. You therefore need only one INSERT for each 1000 records. You can just append additional inserts after the first, and do your entire data transfer through one string:
INSERT INTO ToRun (MemNum) VALUES ('abc'),('def'),...,('xyz');
INSERT INTO ToRun (MemNum) VALUES ('abcd'),('efgh'),...,('wxyz');
...
You can assemble this in a string, then use an ADO Connection.Execute to do the work. It is frequently faster than multiple DAO or ADO .AddNew/.Update pairs. You just need to remember to requery your recordset afterwards if you need it to be populated with your newly-inserted data.
There are actually two questions in your post:
Will OpenRecordset("SELECT * FROM ToRun") immediately load all records?
No. By default, DAO's OpenRecordset opens a server-side cursor, so the data is not retrieved until you actually start to move around the recordset. Still, it's bad practice to select lots of rows if you don't need to. This leads to the next question:
How should I add records in an attached SQL Server database?
There are a few ways to do that (in order of preference):
Use an INSERT statement. That's the most elegant and direct solution: you want to insert something, so you execute INSERT, not SELECT and AddNew. As Monty Wild explained in his answer, ADO is preferred. In particular, ADO allows you to use parameterized commands, which means that you don't have to put-into-quotes-and-escape your strings and correctly format your dates, which is not so easy to do right. (There is a parameterized sketch after this list.)
(DAO also allows you to execute INSERT statements (via CurrentDb.Execute), but it does not allow you to use parameters.)
That said, ADO also supports the AddNew syntax familiar to you. This is a bit less elegant but requires fewer changes to your existing code.
And, finally, your old DAO code will still work. As always: If you think you have a performance problem, measure if you really have one. Clean code is great, but refactoring has a cost and it makes sense to optimize those places first where it really matters. Test, measure... then optimize.
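To illustrate what parameterization buys you, here is that kind of INSERT expressed with a parameter on the T-SQL side (a sketch only, assuming MemNum is varchar(50); with ADO you would attach the parameter to an ADODB.Command instead, but the principle is identical - the value travels separately from the SQL text):

-- The value is passed separately from the statement text, so embedded
-- quotes and date formats cannot break or inject into the SQL
EXEC sp_executesql
    N'INSERT INTO ToRun (MemNum) VALUES (@MemNum)',
    N'@MemNum varchar(50)',
    @MemNum = 'abc123';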
It seems like it is inefficient to have to use a SQL statement like SELECT * FROM ToRun and then make a recordset, add to the recordset, and update it. If there are millions of records in ToRun, will this take forever to run?
Yes, you do need to load something from the table in order to get your Recordset, but you don't have to load any actual data.
Just add a WHERE clause to the query that doesn't return anything, like this:
Set rsToRun = CurrentDb.OpenRecordset("SELECT * FROM ToRun WHERE 1=0")
Both INSERT statements and Recordsets have their pros and cons.
With INSERTs, you can insert many records with relatively little code, as shown in Monty Wild's answer.
On the other hand, INSERTs in the basic form shown there are prone to SQL Injection and you need to take care of "illegal" characters like ' inside your values, ideally by using parameters.
With a Recordset, you obviously need to type more code to insert a record, as shown in your question.
But in exchange, a Recordset does some of the work for you:
For example, in the line rsToRun("MemNum").Value = memNumTextEntry.Value you don't have to care about:
characters like ' in the input, which would break an INSERT query unless you use parameters
SQL Injection
getting the date format right when inserting date/time values
Within a stored procedure, I need to take a whole CSV file as a string, then pick out all the values in one "column" to do a further query on the database.
I cannot use a saved file, so I think that rules out OPENROWSET, and the whole thing has to be done within a stored procedure.
I have spent hours googling and trying, but can't find a good answer. One possibility was http://www.tainyan.com/articles/entry-32/converting-csv-to-sql-data-table-with-stored-procedure.html, but it doesn't work and I can't find the error.
How should this be done please?
I don't really like this, but it will work, provided your CSV column remains at the same column index. I'd be wary of the performance, though.
See Fiddle here: http://sqlfiddle.com/#!3/336b7/1
Basically, convert your CSV string to XML, cast it to the xml type, then run your queries against the XML.
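A minimal sketch of that idea (assuming SQL Server 2005 or later, CRLF line endings, and values containing no embedded commas, quotes or XML-special characters such as & and <; the sample data and column position are placeholders):

DECLARE @csv nvarchar(max), @xml xml

-- Sample data standing in for the CSV string your procedure receives
SET @csv = N'1,alpha,100' + CHAR(13) + CHAR(10) + N'2,beta,200'

-- Wrap each row in <r> and each value in <c>
SET @xml = CAST(N'<r><c>' +
           REPLACE(REPLACE(@csv, N',', N'</c><c>'),
                   CHAR(13) + CHAR(10), N'</c></r><r><c>') +
           N'</c></r>' AS xml)

-- Pick out the second "column" of every row
SELECT r.value('(c)[2]', 'nvarchar(100)') AS Col2
FROM @xml.nodes('/r') AS t(r)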
Using SQL Server 2000 and Microsoft SQL Server Management Studio, is there a way to create a delimited string based upon an unknown number of columns per row?
I'm pulling one row at a time from different tables and am going to store them in a column in another table.
A simple SQL query can't do anything like that. You need to specify the fields you are concatenating.
The only method that I'm aware of is to dynamically build a query for each table.
I don't recall the structure of MSSQL2000, so I won't try to give an exact example, maybe someone else can. But there -are- system tables that contain table definitions. By parsing the contents of those system tables you can dynamically build the necessary query for each source data table.
T-SQL that writes T-SQL, however, can be a bit tricky to debug and maintain :) So be careful how you structure everything...
Dems.
EDIT:
Or just do it in your client application.
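For the dynamic route, a sketch for SQL 2000 using the syscolumns/sysobjects system tables ('YourTable' and the '|' delimiter are placeholders; note this relies on the classic variable-concatenation trick, and any NULL column value will null out the whole string unless you add ISNULL around each CONVERT):

DECLARE @cols nvarchar(4000), @sql nvarchar(4000)

-- Build "CONVERT(varchar(100), [col1]) + '|' + CONVERT(...)" from the catalog
SELECT @cols = ISNULL(@cols + N' + ''|'' + ', N'') +
       N'CONVERT(varchar(100), ' + QUOTENAME(c.name) + N')'
FROM syscolumns c
INNER JOIN sysobjects o ON o.id = c.id
WHERE o.name = 'YourTable'
ORDER BY c.colid

-- Assemble and run the generated query
SET @sql = N'SELECT ' + @cols + N' AS DelimitedRow FROM ' + QUOTENAME('YourTable')
EXEC sp_executesql @sql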
We need to have a semi-complex report in CRM that displays some accumulated lead values. The only way I see this report working is writing a stored procedure that creates a couple of temporary tables and calculates/accumulates data using cursors. Then there is the issue of making the data from the stored procedure accessible from the Reporting Services report. Does anyone know if that's possible? If I could have the option of writing a custom SQL statement to generate report data, that would be just excellent.
Any pointers?
Edit:
To clarify my use of cursors, I can explain exactly what I'm doing with them.
The basis for my report (which should be a chart btw) is a table (table1) that has 3 relevant columns:
Start date
Number of months
Value
I create a temp table (temp1) that contains the following columns:
Year
Month number
Month name
Value
First I loop through the rows in the first table and insert a row into the temp table for each month, incrementing the month, while setting the value to the total value divided by the number of months. I.e.:
2009-03-01,4,1000 in table1 yields
2009,03,March,250
2009,04,April,250
2009,05,May,250
2009,06,June,250
in the temp1 table.
A new cursor is then used to sum and create a running total from the values in temp1 and feed that into temp2 which is returned to the caller as data to chart.
example temp1 data:
2009,03,March,250
2009,04,April,200
2009,04,April,250
2009,05,May,250
2009,05,May,100
2009,06,June,250
yields temp2 data:
2009,03,March,250,250
2009,04,April,450,700
2009,05,May,350,1050
2009,06,June,250,1300
The last column holds the running totals, which start at zero for each new year.
Have you considered using views? Use a hierarchy of views if it is very complicated. Each view would represent one of your temporary tables.
EDIT Based on comments
I was thinking of SQL views, basically the same SQL as you would have written in your stored procedures.
I haven't done this - just thinking about how I would start. I would make sure that when the stored procedures populate the temporary tables, they use the Filtered views for pulling data. I would then set the access to execute the SP to have the same security roles as the Filtered views (which should pretty much be to allow members of the PrivReportingGroup).
I would think that would cover allowing you to execute the SP in your report. I imagine that if you set up the SP beforehand, the SSRS designer has some means of showing you what data is available and letting you select an SP at design time. But I don't know that for sure.
First, since most cursors are unneeded, what exactly are you doing in them? Perhaps there is a set-based solution, and then you can use a view.
Another possible line of thought, if you are doing something like running totals in the cursor: can you create a view as the source without the running total and have the report itself do that kind of calculation? (See the sketch below for a set-based alternative.)
Additionally, SSRS reports can use stored procs as a data source; read about how in Books Online.
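For the running-total part, a set-based sketch along those lines (column names are adapted from the temp1 layout described in the question, and temp1 stands for whatever table or view holds the per-month rows; the correlated subquery works on older SQL Server versions too):

-- Monthly totals plus a running total that restarts each year
SELECT t.Yr,
       t.MonthNo,
       t.MonthName,
       SUM(t.Value) AS MonthTotal,
       (SELECT SUM(t2.Value)
        FROM temp1 t2
        WHERE t2.Yr = t.Yr
          AND t2.MonthNo <= t.MonthNo) AS RunningTotal
FROM temp1 t
GROUP BY t.Yr, t.MonthNo, t.MonthName
ORDER BY t.Yr, t.MonthNo

Wrapped in a view, something like this could serve directly as the report's data source.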
I found the solution. I downloaded Report Builder 2.0 from Microsoft. It allows me to write queries and call stored procedures for the report data.
Microsoft SQL Server Report Builder link