I'm new to VBA and very new to SQL, so I don't quite understand why I am having this problem. I'm trying to import Excel files into an Access database. For a few reasons I have to store my Excel data in an array, upload that array into a temporary table, and then append the pertinent parts of that table. To my current understanding, the best way to get my data from the array into a new temporary table is to create that table and then populate it row by row using SQL. My problem is that when I try to create the table, I get an error. I've simplified my code; I believe this contains what you need to know.
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` VARCHAR(20))"
CurrentDb.Execute strSQL
This works fine
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` DECIMAL(4,2))"
CurrentDb.Execute strSQL
This results in run-time error 3292 "Syntax error in field definition"
Dim strSQL As String
strSQL = "CREATE TABLE TestTable (`Value` DECIMAL(4,2))"
CurrentProject.Connection.Execute strSQL
This works fine
Is there a reason why I should ever use one over the other? Simply changing to CurrentProject seems to fix any problems I have, but I just want to understand what is happening here. I have done my own research, but any answer I find goes over my head (again, I am relatively new to VBA). I apologize in advance; there may be no answer that I am currently able to comprehend.
IIRC, CurrentDb.Execute supports a different ANSI SQL standard, I think ANSI 89, although I could be wrong. CurrentProject.Connection.Execute supports ANSI 92, which allows that statement to execute correctly. It's a different spec of the SQL language. If you've ever used or will use SQL Server this becomes obvious pretty quickly.
Note that your SQL statement won't work when run as a plain saved query in Access either. To run it successfully as a query, one option is to enable ANSI 92, which isn't enabled by default in Access: the relevant setting is SQL Server Compatible Syntax (ANSI 92) under Access Options > Object Designers.
That being said, changing the SQL standard may be less than ideal if you are familiar with Access' SQL syntax; however, it might be very helpful if you already have experience with SQL Server. Also worth noting: changing the ANSI option in Access won't change how CurrentDb.Execute runs SQL statements in VBA, so you'll still need your current approach, or a stored query.
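For example, here's a rough sketch of the pattern your question describes, run entirely through the ADO connection so the ANSI 92 DECIMAL type is accepted. TestTable and Value come from your post; arrData is just a stand-in for the array holding your Excel data:
Dim i As Long
Dim arrData As Variant
arrData = Array(1.25, 3.5, 9.99)   ' stand-in for the values pulled from Excel
' ANSI 92 DDL goes through the ADO connection:
CurrentProject.Connection.Execute "CREATE TABLE TestTable ([Value] DECIMAL(4,2))"
' row-by-row load; Str() forces a period as the decimal separator
For i = LBound(arrData) To UBound(arrData)
    CurrentProject.Connection.Execute _
        "INSERT INTO TestTable ([Value]) VALUES (" & Str(arrData(i)) & ")"
Next i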
Related
I am working on developing an application for my company. From the beginning we were planning on having a split DB with an Access front end and storing the back-end data on our shared server. However, after doing some research we realized that storing the data in a back-end Access DB on a shared drive isn't the best idea for many reasons (the VPN is very slow to the shared drive from remote offices, Access might not be the best with millions of records, etc.). Anyway, we decided to still use the Access front end, but host the data on our SQL server.
I have a couple questions about storing data on our SQL server. Right now when I insert a record I do it with something like this:
Private Sub addButton_Click()
Dim rsToRun As DAO.Recordset
Set rsToRun = CurrentDb.OpenRecordset("SELECT * FROM ToRun")
rsToRun.AddNew
rsToRun("MemNum").Value = memNumTextEntry.Value
rsToRun.Update
memNumTextEntry.Value = Null
End Sub
It seems inefficient to have to use a SQL statement like SELECT * FROM ToRun, then make a recordset, add to the recordset, and update it. If there are millions of records in ToRun, will this take forever to run? Would it be more efficient just to use an INSERT statement? If so, how do you do it? Our program is still young in development, so we can easily make pretty substantial changes. Nobody on my team is an Access or SQL expert, so any help is really appreciated.
If you're working with SQL Server, use ADO. It handles server access much better than DAO.
If you are inserting data into a SQL Server table, an INSERT statement can have (in SQL 2008) up to 1000 comma-separated VALUES groups. You therefore need only one INSERT for each 1000 records. You can just append additional inserts after the first, and do your entire data transfer through one string:
INSERT INTO ToRun (MemNum) VALUES ('abc'),('def'),...,('xyz');
INSERT INTO ToRun (MemNum) VALUES ('abcd'),('efgh'),...,('wxyz');
...
You can assemble this in a string, then use an ADO Connection.Execute to do the work. It is frequently faster than multiple DAO or ADO .AddNew/.Update pairs. You just need to remember to requery your recordset afterwards if you need it to be populated with your newly-inserted data.
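Here is a rough VBA sketch of that assembly step. The connection string (MyServer/MyDb), the Values array and the doubled-single-quote escaping are all placeholders/simplifications; a parameterized command is safer for untrusted input:
Dim cn As ADODB.Connection   ' requires a reference to Microsoft ActiveX Data Objects
Dim Values As Variant
Dim sql As String
Dim i As Long
Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
Values = Array("abc", "def", "xyz")   ' stand-in for your MemNum values
sql = "INSERT INTO ToRun (MemNum) VALUES "
For i = LBound(Values) To UBound(Values)
    sql = sql & "('" & Replace(Values(i), "'", "''") & "')"
    If i < UBound(Values) Then sql = sql & ","
Next i
cn.Execute sql, , adExecuteNoRecords   ' multi-row VALUES is T-SQL, so send it straight to SQL Server
cn.Close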
There are actually two questions in your post:
Will OpenRecordset("SELECT * FROM ToRun") immediately load all records?
No. By default, DAO's OpenRecordset opens a server-side cursor, so the data is not retrieved until you actually start to move around the recordset. Still, it's bad practice to select lots of rows if you don't need to. This leads to the next question:
How should I add records in an attached SQL Server database?
There are a few ways to do that (in order of preference):
Use an INSERT statement. That's the most elegant and direct solution: you want to insert something, so you execute INSERT, not SELECT and AddNew. As Monty Wild explained in his answer, ADO is preferred. In particular, ADO allows you to use parameterized commands, which means that you don't have to put-into-quotes-and-escape your strings and correctly format your dates, which is not so easy to do right (see the sketch after this list).
(DAO also allows you to execute INSERT statements (via CurrentDb.Execute), but it does not allow you to use parameters.)
That said, ADO also supports the AddNew syntax familiar to you. This is a bit less elegant but requires fewer changes to your existing code.
And, finally, your old DAO code will still work. As always: if you think you have a performance problem, measure whether you really have one. Clean code is great, but refactoring has a cost, and it makes sense to optimize those places first where it really matters. Test, measure... then optimize.
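Here is a rough sketch of the first option, a parameterized ADO INSERT; it assumes a reference to Microsoft ActiveX Data Objects, and the connection string and the parameter size of 50 are placeholders for your actual SQL Server settings:
Dim cn As ADODB.Connection
Dim cmd As ADODB.Command
Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
Set cmd = New ADODB.Command
Set cmd.ActiveConnection = cn
cmd.CommandText = "INSERT INTO ToRun (MemNum) VALUES (?)"
cmd.Parameters.Append cmd.CreateParameter("MemNum", adVarChar, adParamInput, 50, memNumTextEntry.Value)
cmd.Execute , , adExecuteNoRecords
cn.Close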
It seems like it is inefficient to have to use a sql statement like SELECT * FROM ToRun and then make a recordset, add to the recordset, and update it. If there are millions of records in ToRun will this take forever to run?
Yes, you do need to load something from the table in order to get your Recordset, but you don't have to load any actual data.
Just add a WHERE clause to the query that doesn't return anything, like this:
Set rsToRun = CurrentDb.OpenRecordset("SELECT * FROM ToRun WHERE 1=0")
Both INSERT statements and Recordsets have their pros and cons.
With INSERTs, you can insert many records with relatively little code, as shown in Monty Wild's answer.
On the other hand, INSERTs in the basic form shown there are prone to SQL Injection and you need to take care of "illegal" characters like ' inside your values, ideally by using parameters.
With a Recordset, you obviously need to type more code to insert a record, as shown in your question.
But in exchange, a Recordset does some of the work for you (see the short sketch after this list):
For example, in the line rsToRun("MemNum").Value = memNumTextEntry.Value you don't have to care about:
characters like ' in the input, which would break an INSERT query unless you use parameters
SQL Injection
getting the date format right when inserting date/time values
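Putting the empty-Recordset trick and that list together, a short sketch using the names from the question:
Dim rsToRun As DAO.Recordset
' WHERE 1=0 returns no rows, so nothing is loaded up front
Set rsToRun = CurrentDb.OpenRecordset("SELECT * FROM ToRun WHERE 1=0")
rsToRun.AddNew
rsToRun("MemNum").Value = memNumTextEntry.Value   ' quoting and date formatting handled for you
rsToRun.Update
rsToRun.Close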
I am converting old VB6 code to VB.NET with ADO.NET (OleDb). This is my query, which creates a blank table when run from VB.NET, but works when run directly in Access. This code presumably also works in VB6, as I am using the same SQL:
SELECT qryAsOf.name, qryAsOf.type, 0 as opt, 0 as swap
INTO qryCon
FROM qryAsOf
LEFT JOIN qryLinked on qryAsOf.c = qryLinked.lc
I feel like this has something to do with the LEFT JOIN and SELECT INTO being together, but like I said, it is only VB that has an issue with it; Access handles it perfectly.
Thanks :)
Edit: more details --
Without the INTO line, this query returns all 600+ rows.
This DOES NOT WORK:
cm.CommandText = "CREATE TABLE qryCon (etc...)"
cm.ExecuteNonQuery()
cm.CommandText = "INSERT INTO qryCon SELECT ..." '(rest of query above without INTO line)
cm.ExecuteNonQuery
This DOES WORK:
cm.CommandText = "CREATE PROC qryCon AS SELECT ..." '(same select as above without INTO, again)
cm.ExecuteNonQuery
The CREATE PROC that does work is fine, except I need to insert data into it later, so I get errors about how I need an updatable table. I really want the end qryCon to be a table, but I can't seem to get that to work :(
However, when I do something like this (using the stored proc, renamed, from above, which, if viewed in Access, is full of data):
cm.CommandText = "SELECT * FROM storedProc"
dr = cm.ExecuteReader
while dr.Read
cm.CommandText = "INSERT INTO qryCon VALUES (dr.GetValue(0), dr.GetValue(1), dr.GetValue(2), dr.GetValue(3))
cm.ExecuteNonQuery
end while
This DOES NOT WORK! By the way, I removed the concatenation in the query for readability. It is correct in the actual project.
Your VB6/Access syntax works in an environment where you would iterate through the query object itself when parsing the results.
For .Net, you want to leverage the newer ADO.Net library and use a DataReader (or DataAdapter if you need to work with the results more than once in your code without having to make a second trip to the database) object in order to process your query. You can find a nice overview of how to work with them on the MSDN site.
EDIT:
After your question was updated, you want your SQL query to be this:
SELECT qryAsOf.name, qryAsOf.type, 0 as opt, 0 as swap
FROM qryAsOf
LEFT JOIN qryLinked on qryAsOf.c = qryLinked.lc
and then you'll iterate through the results and parse the values on each pass through the loop (as shown in the MSDN link), with something like:
If reader.HasRows Then
Do While reader.Read()
Console.WriteLine(reader.GetInt32(0) & vbTab & reader.GetString(1))
Loop
Else
Console.WriteLine("No rows found.")
End If
SELECT INTO syntax may not be supported in the middleware, which tends to be somewhat generic. The statement might have to be executed in "passthrough", i.e. native-syntax, mode.
Since my earlier comment about pass-through mode appears to have been downvoted, perhaps by @onedaywhen, I include this link:
http://msdn.microsoft.com/en-us/library/ms681754(v=vs.85).aspx
Jet OLEDB:ODBC Pass-Through Statement (DBPROP_JETOLEDB_ODBCPASSTHROUGH): Indicates that Jet should pass the SQL text in a Command object to the back end unaltered.
The documentation seems to suggest that the SQL text can be altered by the middleware unless this feature is invoked. So I repeat, you might want to make sure that oleDB supports SELECT INTO syntax. I'm not saying that SELECT INTO is not supported -- just that it is an issue worthy of the OP's attention.
P.S. When talking to a database via a middle layer that is designed to support a variety of back-ends, it is a good idea, I think, to use pass-through mode when the statement contains a function like IIF() or a UDF, or any syntax, such as SELECT INTO, which is not universally supported.
P.P.S. Here's some additional info that might be relevant:
http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlconnectionstringbuilder.aspx
MS Access has limited capabilities to manage raw SQL queries: the editor is quite bad, no syntax highlighting, it reformats your raw SQL into a long string and you can't insert comments.
Debugging complex SQL queries is a pain as well: either you have to split it into many smaller queries that become difficult to manage when your schema changes or you end-up with a giant query that is a nightmare to debug and update.
How do you manage your complex SQL queries in MS Access and how do you debug them?
Edit
At the moment, I'm mostly just using Notepad++ for some syntax colouring and SQL Pretty Printer for reformatting sensibly the raw SQL from Access.
Using an external repository is useful, but there's always the risk of the two versions getting out of sync, and you still have to remove comments before trying the query in Access...
For debugging, I edit them in a separate text editor that lets me format them sensibly. When I find I need to make changes, I edit the version in the text editor, and paste it back to Access, never editing the version in Access.
Still a major PITA.
I have a few tips that are specific to SQL in VBA.
Put your SQL code in a string variable. I used to do this:
Set RS = DB.OpenRecordset("SELECT ...")
That is hard to manage. Do this instead:
strSQL = "SELECT ..."
Set RS = DB.OpenRecordset(strSQL)
Often you can't fix a query unless you see just what's being run. To do that, dump your SQL to the Immediate Window just before execution:
strSQL = "SELECT ..."
Debug.Print strSQL
Stop
Set RS = DB.OpenRecordset(strSQL)
Paste the result into Access' standard query builder (you must use SQL view). Now you can test the final version, including code-handled variables.
When you are preparing a long query as a string, break up your code:
strSQL = "SELECT wazzle FROM bamsploot" _
& vbCrLf & "WHERE plumsnooker = 0"
I first learned to use vbCrLf when I wanted to prettify long messages to the user. Later I found it makes SQL more readable while coding, and it improves the output from Debug.Print. (Tiny other benefit: no space needed at end of each line. The new line syntax builds that in.)
(NOTE: You might think this will let you add comments to the right of the SQL lines. Prepare for disappointment.)
As said elsewhere here, trips to a text editor are a time-saver. Some text editors provide better syntax highlighting than the official VBA editor. (Heck, StackOverflow does better.) It's also efficient for deleting Access cruft like superfluous table references and piles of parentheses in the WHERE clause.
Work flow for serious trouble shooting:
VBA Debug.Print > (capture query during code operation)
query builder > (testing lab to find issues)
Notepad++ > (text editor for clean-up and review)
query builder > (checking, troubleshooting)
VBA
Of course, trouble shooting is usually a matter of reducing the complexity of a query until you're able to isolate the problem (or at least make it disappear!). Then you can build it back up to the masterpiece you wanted. Because it can take several cycles to solve a sticky problem, you are likely to use this work flow repeatedly.
I wrote Access SQL Editor-- an Add-In for Microsoft Access-- because I write quite a lot of pass-through queries, and more complex SQL within Access. This add-in has the advantage of being able to store formatted SQL (with comments!) within your Access application itself. When queries are copied to a new Access application, formatting is retained. When the built-in editor clobbers your formatting, the tool will show your original query and notify you of the difference.
It currently does not debug; if there was enough interest, I would pursue this-- but for the time being the feature set is intentionally kept small.
It is not free for the time being, but purchasing a license is very cheap. If you can't afford it, you can contact me. There is a free 14-day trial here.
Once it's installed, you can access it through your Add-Ins menu (In Access 2010 it's Database Tools->Add Ins).
Debugging is more of a challenge. If a single column is off, that's usually pretty easy to fix. But I'm assuming you have more complex debugging tasks that you need to perform.
When flummoxed, I typically start debugging with the FROM clause. I trace back to all the tables and sub-queries that comprise the larger query, and make sure that the joins are properly defined.
Then I check my WHERE clause. I run lots of simple queries on the tables, and on the sub-queries that I've already checked or that I already trust, and make sure that when I run the larger query, I'm getting what I expect with the WHERE conditions in place. I double-check the JOIN conditions at the same time.
I double-check my column definitions to make sure I'm retrieving what I really want to see, especially if the formulas involved are complicated. If you have something complicated like a correlated subquery in a column definition, test that piece on its own before folding it back into the larger query.
Then I check to see if I'm grouping data properly, making sure that DISTINCT and UNION without UNION ALL don't remove necessary duplicates.
I don't think I've ever encountered a SQL query that couldn't be broken down this way. I'm not always as methodical as this, but it's a good way to start breaking down a real stumper.
One thing I could recommend when you write your queries is this: Never use SELECT * in production code. Selecting all columns this way is a maintenance nightmare, and it leads to big problems when your underlying schemas change. You should always write out each and every column if you're writing SQL code that you'll be maintaining in the future. I saved myself a lot of time and worry just by getting rid of "SELECT *"'s in my projects.
The downside to this is that those extra columns won't appear automatically in queries that refer to "SELECT *" queries. But you should be aware of how your queries are related to each other, anyway, and if you need the extra columns, you can go back and add them.
There is some hassle involved in maintaining a code repository, but if you have versioning software, the hassle is more than worth it. I've heard of ways of versioning SQL code written in Access databases, but unfortunately, I've never used them.
If you're doing really complex queries in MS Access, I would consider keeping a repository of those queries somewhere outside of the Access database itself... for instance, in a .sql file that you can then edit in an editor like Intype that will provide syntax highlighting. It'll require you to update queries in both places, but you may end up finding it handy to have an "official" spot for it that is formatted and highlighted correctly.
Or, if at all possible, switch to SQL Server 2005 Express Edition, which is also free and will provide you the features you desire through the SQL Management Studio (also free).
Expanding on this suggestion from Smandoli:
NO: DoCmd.RunSQL ("SELECT ...")
YES: strSQL = "SELECT ..."
DoCmd.RunSQL (strSQL)
If you want to keep the SQL code in an external file, for editing with your favorite text editor (with syntax coloring and all that), you could do something like this rough VBA sketch:
' On initialization:
Dim strSQL As String
Dim intFile As Integer
intFile = FreeFile
Open "strSQL.sql" For Input As #intFile
strSQL = Input(LOF(intFile), #intFile)
Close #intFile
' To do the select:
DoCmd.RunSQL strSQL
This may be a bit clunky -- maybe a lot clunky -- but it avoids the consistency issues of edit-copy-paste.
Obviously this doesn't directly address debugging SQL, but managing code in a readable way is a part of the problem.
Similar to recursive, I use an external editor to write my queries. I use Notepad++ with the Light Explorer extension for maintaining several scripts at a time, and Notepad2 for one-off scripts. (I'm kind of partial to Scintilla-based editors.)
Another option is to use the free SQL Server Management Studio Express, which comes with SQL Server Express. (EDIT: Sorry, EdgarVerona, I didn't notice you mentioned this already!) I normally use it to write SQL queries instead of using Access, because I typically use ODBC to link to a SQL Server back end anyway. Beware that the differences in the syntax of T-SQL, used by SQL Server, and Jet SQL, used by Access MDB's, are sometimes substantial.
Are you talking here about what MS-Access calls 'queries' and SQL call 'views' or about the 'MS-Access pass-through' queries which are SQL queries? Someone could get easily lost! My solution is the following
free SQL Server Management
Studio Express, where I will
elaborate and test my queries
a query table on the client
side, with one field for the query
name (id_Query) and another one
(queryText, memo type) for the
query itself.
I then have a small function getSQLQuery in my VBA code to be used when I need to execute a query (either returning a recordset or not):
Dim myQuery As String
Dim rsADO As ADODB.Recordset
myQuery = getSQLQuery(myId_Query)
' if my query returns a recordset:
Set rsADO = myADOConnection.Execute(myQuery)
' or, if no recordset is to be returned:
myADOConnection.Execute myQuery
For views, it is even possible to keep them on the server side and refer to them from the client side:
Set rsADO = myADOConnection.Execute("dbo.myViewName")
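For what it's worth, a minimal sketch of getSQLQuery, assuming the query table is named tblQueries (use your own name) with the id_Query and queryText fields described above:
Function getSQLQuery(ByVal myId_Query As String) As String
    ' tblQueries is a hypothetical table name; adjust to your own query table
    getSQLQuery = Nz(DLookup("queryText", "tblQueries", _
                             "id_Query = '" & Replace(myId_Query, "'", "''") & "'"), "")
End Function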
Well to my knowledge there are 2 options:
Notepad++ with the Poor Man's T-SQL Formatter plugin. I know SQL Pretty Printer has already been mentioned, but I haven't used it. My workflow: I create the query in Access, copy-paste it into Notepad++, format it, work on it, and go back to Access. The only issue is that in some cases it pads spaces in expressions like [Forms]![AForm].[Ctrl], which become [Forms] ! [AForm].[Ctrl], but I'm used to it and it doesn't bother me.
SoftTree SQL Assistant (http://www.softtreetech.com/sqlassist/index.htm) brings just about everything you could want in a SQL editor. I have worked with it a bit in the past (trial), but its price tag is a bit stiff.
I'm trying to use ADO to create several tables at once, into MS Access. Is it possible to do multiple statements in the one operation? For instance:
...
// I have omitted the field details
CString sQuery = "CREATE TABLE [Table1] (..., PRIMARY KEY ([ID])); \nCREATE TABLE [Table2] (..., PRIMARY KEY ([ID]));";
oRecordset.Open(oDatabase.m_pConnection, sQuery)
This fails due to a "Syntax Error in CREATE TABLE statement", although each of the create statements work on their own perfectly. Is there a way of doing this sort of thing? There will also be statements to add constraints, add indexing, etc., and I'd really like to be able to do it so that I don't have to split up the string into separate parts.
ADO isn't the issue: the ACE/Jet engine simply does not support multiple SQL statements within a single operation. In other words, ACE/Jet SQL lacks the procedural syntax found in most 'industrial-strength' SQL products. See @David-W-Fenton's answer for more detail.
Bottom line: You will need to issue a Connection.Execute for each CREATE TABLE statement i.e. client side procedural code. But they can (perhaps should) all be run in the same transaction, of course.
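A minimal VBA sketch of that pattern, shown against CurrentProject.Connection for brevity (the same idea applies to an ADO connection opened from C++); the table definitions are trimmed-down placeholders, as in the question:
Public Sub CreateTablesInOneTransaction()
    Dim cn As ADODB.Connection
    Set cn = CurrentProject.Connection
    cn.BeginTrans
    On Error GoTo RollbackAll
    cn.Execute "CREATE TABLE Table1 (ID COUNTER CONSTRAINT PK_Table1 PRIMARY KEY)", , adExecuteNoRecords
    cn.Execute "CREATE TABLE Table2 (ID COUNTER CONSTRAINT PK_Table2 PRIMARY KEY)", , adExecuteNoRecords
    cn.CommitTrans
    Exit Sub
RollbackAll:
    cn.RollbackTrans
    MsgBox "Batch failed, nothing was created: " & Err.Description
End Sub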
ADO to MS Access does not support batch SQL statements. You need to run each statement as a separate execution.
People who think you can send multiple SQL statements to Jet in a batch just aren't thinking.
Jet is a file-server database engine -- there is no centralized server process controlling interaction between clients and the actual data store. Instead, clients are all running individual instances of Jet and cooperatively editing a file in a way that is controlled by the Jet locking file (LDB). Without a centralized process to serialize and prioritize the SQL statements, you wouldn't want Jet to be able to process multiple statements in a batch.
Those who are offering the suggestion of using ADO and separating the statements with a CrLf should code it up and give it a try and then get back to us about how useful their speculative advice actually is.
If your sample set of commands is typical, just do something like this in VBA or the language of your choice:
Public Sub ExecuteBatch(ByVal BatchString As String)
    Dim s As Variant
    Dim abatch As Variant
    ' drop line breaks, then split the batch on the statement separator
    BatchString = Replace(BatchString, vbCrLf, " ")
    abatch = Split(BatchString, ";")
    For Each s In abatch
        If Trim$(s) <> "" Then
            CurrentProject.Connection.Execute s   ' one ADO Execute per statement
        End If
    Next s
End Sub
That's off the top of my head, but you should be able to take it from there I hope.
Crude but it works - create the necessary number of queries with one SQL statement each, then use a Macro to run the queries successively. That's about as good as can be done with ADO/Jet.
I don't know whether ADO is built on top of the Jet OLE DB engine, which I suppose it is; if so, the Jet engine doesn't support execution of multiple statements in a single batch. We tried separating them with ; and with the GO reserved word, but it does not work.
I think you can run multiple commands in one ADO Command.
You just need proper line feeds between them, i.e. \n doesn't work.
Try something like this:
(using VB-ish syntax)
MyQuery = "Select * from Whatever " & vbLf
MyQuery = MyQuery & "Select * from SomethingElse " & vbLf
oRecordset.Open(oDatabase.m_pConnection, MyQuery)
I have a table (readings) already connected by ODBC in Access that opens very quickly when I click on it.
However, when I try to run this in VBA, it locks up and never displays anything:
Dim strSql As String
strSql = "SELECT readings.ids " & _
"INTO ids_temp " & _
"FROM readings " & _
"WHERE readings.ids > 1234;"   ' this is an id in the middle of the list
DoCmd.SetWarnings False
DoCmd.RunSQL strSql
DoCmd.SetWarnings True
For some reason this crashes the whole system. Any ideas?
Rather than using DoCmd, this is usually handled through your existing connection by creating a Command object, which accepts SQL statements to use with the Command.Execute method.
Reading the documentation for DoCmd, it appears to be primarily intended for executing macros from the Access UI menus.
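A rough sketch of that approach, reusing strSql from the question and assuming a reference to Microsoft ActiveX Data Objects:
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
Set cmd.ActiveConnection = CurrentProject.Connection   ' the connection Access already has open
cmd.CommandText = strSql
cmd.CommandType = adCmdText
cmd.Execute , , adExecuteNoRecords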
Does your database have an ids_temp table locally? If ids_temp is a linked table, it will delete the table, because SELECT INTO creates a new table. If you want to add to an existing table, try the INSERT INTO command instead. You can clear the table before inserting the data.
So the error was actually my fault: the id I was using was causing the query to return about 6 million results. But this method actually works great. I just create the table and link a list box on a different form to the table, then I just show the form. I do some closes and updates in between, but overall it seems to work well. Thanks for the help.
Let me say that DoCmd.RunSQL is never advisable, particularly with SetWarnings turned OFF, as you have no idea whether the result is what you expect or not. You've told VBA not to report errors, so you never know if all the records were inserted or not.
It's very easy to replace DoCmd.RunSQL with my SQLRun() function, posted here:
How can I get a value from the update query prompt in Access VBA?
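As a minimal alternative sketch (not the SQLRun() function itself): DAO's Execute with dbFailOnError raises a trappable run-time error instead of silently swallowing failures, so SetWarnings never needs to be touched.
' dbFailOnError makes a failed statement raise an error you can trap
CurrentDb.Execute strSql, dbFailOnError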