Retrieve more than 255 characters from a query field - VBA

I have a query that displays more than 255 characters in a field, and I want to put that data into a variable so I can process it. Unfortunately, MS Access truncates the returned field value to 255 characters.
(At least when using this method:)
MyVar = Nz(rst.Fields("myfield").Value)
Most of the workarounds I've found online suggest creating a table, changing the target field's type to Long Text, and then migrating the data from the query into the table, but I get the same result: the Long Text field is still truncated when the insert executes.
(At least when I do it this way:)
CurrentDb.Execute "INSERT INTO target_table SELECT myquery.* FROM myquery"
Other suggestions mention changing the field's Total row to "First", but the field reverts to "Expression" when the query runs, because its definition includes a function that manipulates the results.
The query also isn't mine, and it's rather complicated, with field expressions that call other functions across tables. I would like to avoid reverse-engineering the entire thing just to update a table, if at all possible. The data is already on my screen, looking at me - I just want to be able to use it.
It's a very Microsoft-ish solution for an MS product to display some data and then tell you it can't find the stuff it just gave you (I'm looking at you, File Explorer), but I'm hoping someone here might have a viable suggestion. Perhaps some other query-to-table method that doesn't truncate? Some other query setting or field-retrieval method?

Try this:
Dim MyVar As String
Dim Value As Variant
Value = rst.Fields("myfield").Value
MyVar = Nz(Value)
In any case, this works:
MyVar = Nz(DLookup("myfield", "myquery", "Id = " & someId))
However, most likely your myquery is the limiting factor: it must be a straight SELECT query, with no aggregation (GROUP BY, DISTINCT, UNION) or column formatting, for memo fields not to be truncated.
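For example, here is a minimal sketch of the straight-select idea, using the names from the question (untested against your schema). Note that if the truncation happens inside myquery itself - e.g. through GROUP BY, DISTINCT, UNION, or a format applied to the column - the plain SELECT would have to pull the memo from the underlying tables instead:
Dim rs As DAO.Recordset
Dim MyVar As String
' A plain SELECT with no aggregation on top, so the memo arrives intact.
Set rs = CurrentDb.OpenRecordset("SELECT myfield FROM myquery WHERE Id = " & someId)
If Not rs.EOF Then MyVar = Nz(rs.Fields("myfield").Value)
rs.Close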

Related

Finding multiple exact substrings in an SQL query (Crystal Report)

I'm writing a Command for a Crystal Report that queries an SQL Database. The Command will use parameters/inputs that are generated from a different program. I've put parameters directly in Commands before, but this one has to be handled differently.
Said input will be a string of numbers separated by ampersands, such as "6&12&15"; order is irrelevant in this case. For understanding purposes, say the numbers are product IDs and are unique. When a user wants to search for multiple products in this database, the string above is what the input looks like.
I have used the following code in the past for non-number based strings and it works well because of how other fields are set up:
CASE WHEN '{?WearhouseState}' = '' THEN 1
WHEN CHARINDEX(Products.WearhouseState,'{?WearhouseState}',0)>0 THEN 1
ELSE 0
END = 1
That code searches for the field's value as a substring essentially anywhere in the given input parameter, which works for things like a state because "Texas" is never going to be a substring of any other state. However, this doesn't work so well with numbers. For example, if a product has an ID of 3, then the search will return that record when the parameter is '31', which I clearly do not want (it would return product 1 as well).
For the time being, I have been splitting the string up with a delimiter in Crystal Reports, which works fine but slows down the overall time to create the document. Most of the parameters I use I tend to put right in the query, and that drastically improves the speed. The Crystal code is as follows:
{?ProductID}="" or {Command.ProductID} in split({?ProductID},"&")
This works exactly as intended, but again, time is of the essence. Any additional information can be provided. It is technically InterSystems SQL, so keep that in mind, because I know the commands/clauses can vary between SQL dialects.
I'd do the split-string operation in SQL Server instead of CR. See e.g. T-SQL split string for a working code sample. Note that this logic does not need to live in a function; you could also include it directly in your CR command.
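As a rough sketch of that, assuming a SQL Server 2016+ backend (which has STRING_SPLIT built in; for older versions, substitute a hand-rolled splitter like the one linked above), the WHERE clause in the Command could become:
WHERE '{?ProductID}' = ''
   OR Products.ProductID IN (SELECT value FROM STRING_SPLIT('{?ProductID}', '&'))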

SQL Teradata Variable Input

I'm currently writing a statement in which a few variables are often passed in as parameters, so rather than hard-coding them I want a pop-up to appear when I execute the statement, like you can see in the picture.
(Screenshot: pop-up window for input)
Now this works if I code:
WHERE x.test = '?InputVar1'
and put one variable in.
I want to be able to put in more than one.
I tried replacing = with IN and typing several values into the pop-up separated by commas; it didn't display any data, even though I think rows were processed.
I hope I made my problem clear, and I'm hoping for some advice. I'm really new to SQL and Teradata. Thanks!
This variable is simply used for a search and replace. When test is actually a VARCHAR, you have to key in foo','bar (without the outermost quotes, which are already there around '?InputVar1').
Might be easier to define it as WHERE x.test IN (?InputVar1) and then specify 'foo','bar'.
When it's numeric, 1,2,3,4 will work.
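To make the substitution concrete, here is what both variants expand to, with illustrative values:
-- With  WHERE x.test IN ('?InputVar1')  key in:  foo','bar
-- With  WHERE x.test IN (?InputVar1)    key in:  'foo','bar'
-- Either way, after the search and replace the statement runs as:
WHERE x.test IN ('foo','bar')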

Access VBA simple SQL SELECT statement

Coming from years of web development where PHP and SQL statements were so simple, this recent task I've been required to undertake with MS Access and VBA is absolutely doing my head in with how much it complicates SQL statements. Mind you, I have no prior knowledge of VBA, so it could be extremely simple and I'm just not getting it, but all I want to do is:
"SELECT type FROM tblMatter WHERE id='$id'"
When I wear my PHP cap, I want to think okay, we are going to have one row of data, that's going to be an array, and I want one object out of that array. Simple.
VBA, however, complicates the $#!t out of it. So far my code looks something like this:
Dim matterSQL As String
Dim matterRS As Recordset
matterSQL = "SELECT type FROM tblMatter WHERE id'" & id & "'"
Set matterRS = CurrentDb.OpenRecordset(matterSQL)
MsgBox matterRS![type]
CurrentDb is defined much, much earlier in the code to open the connection to the database, and the error is on the line containing OpenRecordset, with the error: Data type mismatch in criteria expression.
As I said, I'm new to VBA so I don't know what the heck I'm doing, and all the documentation on the internet is nowhere near helpful. But all I want to do is to get one piece of data from the table.
Thanks in advance!
Edit: I needed to build on this with another query that uses the info returned by the last one. Same kind of ordeal:
Dim costSQL As String
Dim costRS as Recordset
costSQL = "SELECT email FROM tblScaleOfDisb WHERE category=" & category
Set costRS = CurrentDb.OpenRecordset(costSQL)
MsgBox costRS![email]
This time I'm getting an error on the line containing OpenRecordset with the error: Too few parameters. Expected 1.
I don't understand this, because the code is practically the same as in the first half of the question. What have I done wrong?
You are missing the = in the condition. Try the below:
matterSQL = "SELECT type FROM tblMatter WHERE id='" & id & "'"
Also, if id is numeric, you don't need the quotes:
matterSQL = "SELECT type FROM tblMatter WHERE id=" & id
Too few parameters. Expected 1.
This happens when a field name in your SQL query does not match the table's field names.
If the field names are correct, then I believe the data type of category is not numeric, so you have to use quotes:
costSQL = "SELECT email FROM tblScaleOfDisb WHERE category='" & category &"'"
Always try to use a parameterised query to avoid SQL injection.
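For instance, a minimal sketch using a temporary DAO QueryDef; it assumes id is a text field, and the table and field names are taken from the question:
' Parameterised query via a temporary QueryDef: no quoting mistakes,
' no injection risk. Assumes id is Text; declare it Long if numeric.
Dim qdf As DAO.QueryDef
Dim matterRS As DAO.Recordset
Set qdf = CurrentDb.CreateQueryDef("", _
    "PARAMETERS pId Text(255); SELECT [type] FROM tblMatter WHERE id = pId")
qdf.Parameters("pId").Value = id
Set matterRS = qdf.OpenRecordset()
If Not matterRS.EOF Then MsgBox matterRS![type]
matterRS.Close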
You must understand or prepare a few things before you start coding on a new platform, such as:
Using keywords/reserved keywords
Capturing errors
Basic arithmetic and string operations
Available functions/methods
Ways of cleaning up your variables after use
In your case you also need to learn about MS Access SQL, which is pretty similar to standard SQL but limited by, and strongly influenced by, MS Access internal functions.
A SQL statement returns n rows as its result, each with some number of columns; you need to understand how to loop through result sets (a short sketch follows below).
Please do have some error-capturing method; it will help you find the right direction before spending hours on Google.
In your first SQL you have the reserved keyword Type; use square brackets to escape reserved keywords. In the WHERE condition, numeric fields must not have string quotes, and strings must have them.
Tip: You can use the MS Access visual query builder to build your query and copy the SQL to VBA.
list of reserved keywords in MS ACCESS: https://support.microsoft.com/en-us/kb/286335
list of functions: http://www.techonthenet.com/access/functions/
Error handling : http://allenbrowne.com/ser-23a.html
Clean/close your objects after usage by explicitly setting as nothing: Is there a need to set Objects to Nothing inside VBA Functions

Parse a comma delimited field into separate fields (MS ACCESS VBA 2003)

I inherited a database where user input fields are stored as a comma-delimited string. I know. Lame. I want a way to parse these fields in a SELECT query, where there are three segments of varying numbers of characters. Counter to all the recommendations that I insert the fields into a new table or create a stored procedure to do this, this is what I came up with. I'm wondering if anyone sees any flaw in doing this in a select query (where I can easily convert from string to parsed and back again as need be).
Field_A
5,25,89
So to get the left segment, which is the most straightforward:
Field_1: Left$([Field_A],InStr([Field_A],",")-1)
To get the right-most segment:
Field_3: Right$([Field_A],Len([Field_A])-InStrRev([Field_A],","))
Middle segment was the trickiest:
Field_2: Mid([Field_A],InStr([Field_A],",")+1,InStrRev([Field_A],",")-InStr([Field_A],",")-1)
So the result is:
Field_1 Field_2 Field_3
5 25 89
Any consenting opinions?
Well, if you insist on going down this road...
This might be easier and more adaptable. Create a function in a module:
Public Function GetValueFromDelimString(sPackedValue As String, nPos As Long, _
        Optional sDelim As String = ",") As String
    Dim sElements() As String
    sElements = Split(sPackedValue, sDelim)
    If UBound(sElements) < nPos Then
        GetValueFromDelimString = ""
    Else
        GetValueFromDelimString = sElements(nPos)
    End If
End Function
Now in your query you can get any field in the string like this:
GetValueFromDelimString([MultiValueField],0) AS FirstElement, GetValueFromDelimString([MultiValueField],1) AS SecondElement, etc.
I feel like I am buying beer for a minor, encouraging this type of behavior :)
It sounds like you're not asking for information on how to parse a comma-delimited field into different fields, but rather looking for people to support you in your decision to do so, yes?
The fact, as you've already discovered, is that you can indeed do this with skillful application of functions in your SQL field definitions. But that doesn't mean that you should.
In the short run, it's an easy way to achieve your goals as data manager, I'll grant you that. But as a long-term solution it's just adding another layer of complexity to what seems like a poorly-designed database (I know that the latter is not your fault -- I too have inherited my share of "lame" databases).
So I applaud you on "getting the job done" for now, but would advise you to listen to "all the recommendations that you insert fields into a new table" -- they're good recommendations. It'll take more planning and effort, but in the long run you'll have a better database. And that will make everything you do with it easier, faster, and more reliable.
This is an old thread, but someone might search it. You can also apply the same strategy in an UPDATE query. That way you can keep the original CSV field and have three destination fields that can be calculated and recalculated as your application requires; a sketch follows below.
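A sketch of that update-query variant, reusing the function above; it assumes the three destination fields already exist, and tblImported is an illustrative table name:
UPDATE tblImported
SET Field_1 = GetValueFromDelimString([Field_A], 0),
    Field_2 = GetValueFromDelimString([Field_A], 1),
    Field_3 = GetValueFromDelimString([Field_A], 2);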

Large number of UPDATE queries slowing down page

I am reading and validating large fixed-width text files (ranging from 10-50K lines) that are submitted via our ASP.net website (coded in VB.Net). I do an initial scan of the file to check for basic issues (line length, etc.). Then I import each row into an MS SQL table. Each DB row basically consists of a record_ID (primary, auto-incrementing) and about 50 varchar fields.
After the insert is done, I run a validation function on the file that checks each field in each row against a bunch of criteria (trimmed length, IsNumeric, range checks, etc.). If it finds an error in any field, it inserts a record into the Errors table, which has an error_ID, the record_ID, and an error message. In addition, if the field fails in a particular way, I have to do a "reset" on that field. A reset might consist of blanking the entire field, or simply replacing the value with another value (e.g. replacing the string with a new one that has all illegal chars taken out).
I have a 5,000-line test file. The upload, initial check, and import take about 5-6 seconds. The detailed error check and insert into the Errors table take about 5-8 seconds (this file has about 1,200 errors in it). However, the "resets" part takes about 40-45 seconds for the 750 fields that need to be reset. When I comment out the resets function (returning immediately without actually calling the UPDATE stored proc), the process is very fast. With the resets turned on, the page takes 50 seconds to return.
My UPDATE stored proc is using some recommended code from http://sommarskog.se/dynamic_sql.html, whereby it uses CASE instead of dynamic SQL:
UPDATE dbo.Records
SET dbo.Records.file_ID = CASE @field_name WHEN 'file_ID' THEN @field_value ELSE file_ID END,
.
. (all 50 varchar field CASE statements here)
.
WHERE dbo.Records.record_ID = @record_ID
Is there any way I can improve performance here? Can I somehow group all of these UPDATE calls into a single transaction? Should I be reworking the UPDATE query somehow? Or is it just the sheer quantity of 750+ UPDATEs, and things are simply slow (it's a quad-proc server with 8 GB RAM)?
Any suggestions appreciated.
Don't do this in SQL; fix the data up in code, then do your updates.
If you have SQL 2008, then look into table-valued parameters. They enable you to pass an entire table as a parameter to a sproc. From there you just have the one INSERT/UPDATE or MERGE statement; a sketch follows below.
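A minimal sketch of that idea (SQL Server 2008+); the type, procedure, and column names here are illustrative, not from the question:
-- One set-based UPDATE replaces hundreds of row-by-row calls.
CREATE TYPE dbo.RecordResets AS TABLE
(
    record_ID   int          NOT NULL,
    field_value varchar(255) NULL
);
GO
CREATE PROCEDURE dbo.ResetFileIDs
    @resets dbo.RecordResets READONLY
AS
BEGIN
    UPDATE r
    SET    r.file_ID = t.field_value
    FROM   dbo.Records r
    JOIN   @resets t ON t.record_ID = r.record_ID;
END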
If you're looping through the lines and doing individual updates/inserts, this can be really expensive... Consider using SqlBulkCopy, which can speed up all your inserts. Similarly, you can create a DataSet, make your updates on the DataSet, and then submit them all in one shot through a SqlDataAdapter.
I believe you are doing 50 CASE statements on every update. It sounds like that would be slow.
It is possible to solve this problem with injection-proof code via parameterized queries and a constant table of query strings.
Quick and dirty example code.
string[] queryList = {
    "UPDATE records SET col1 = @val WHERE ID = @key",
    "UPDATE records SET col2 = @val WHERE ID = @key",
    "UPDATE records SET col3 = @val WHERE ID = @key",
    // ... one entry per column ...
    "UPDATE records SET col50 = @val WHERE ID = @key"
};
Then in your call to SQL you just pick the item in the array corresponding to the column you want to update, and set @val and @key on the parameterized command.
I'm guessing you will see a significant improvement... let me know how it goes.
Um. Why are you inserting numeric data into VARCHAR fields then trying to run numeric checks on it? This is yucky.
Apply correct data typing and constraints to your table, do the INSERT, and see if it failed. SQL Server will happily report errors back to you.
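A sketch of what that might look like; the column names and the range here are made up for illustration:
-- Let the table enforce type and range instead of post-hoc VARCHAR checks.
CREATE TABLE dbo.ImportRecords
(
    record_ID int IDENTITY(1,1) PRIMARY KEY,
    quantity  int NOT NULL,      -- instead of varchar plus IsNumeric checks
    state_cd  char(2) NULL,
    CONSTRAINT CK_ImportRecords_quantity CHECK (quantity BETWEEN 0 AND 9999)
);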
I would try changing the recovery model to Simple and looking at my indexes. Kimberly Tripp did a session showing a scenario with improved performance using a heap.