Creating stored queries programmatically with VB.NET, ODBC and Access

I have developed a small MS Access based application with VB.NET.
I've added auto-update capabilities to the software (mostly by using ClickOnce) to simplify the release of new features.
Every version of the software executes an update routine which may also update the existing database.
Lately I've made a few changes to the database structure, adding a few stored queries, so I want the auto-update code to programmatically add these new queries to the existing database and bring it fully up to date.
I haven't yet found a solution for adding stored queries to an MS Access database using ODBC...
I also tried the "CREATE PROC" SQL statement, but it does not seem to work with Access databases, even though I can create the query from the Microsoft Office Access front-end.
I've found some examples that use ADODB, but I'm using ODBC to remain both x86 and x64 compliant.
PS: sorry for my bad English... I hope I've been clear enough.

Stored queries in Jet/ACE are of two types: SELECT queries and what Access calls "action" queries. SELECT queries correspond to VIEWs and action queries to stored procedures. So, if it's a DML statement, you'd create it as a PROCEDURE, while if it's a SELECT, you'd create it as a VIEW.
The one thing I'm not sure of is how parameters interact with this. I don't use Jet/ACE except from Access, so this isn't something I have much experience with, and I don't really have an answer on that point.
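For what it's worth, here's a rough VB.NET sketch of that DDL going over ODBC. Treat it as a starting point to test rather than a known-good recipe: the table, column and query names are invented, and I can't promise every version of the Access ODBC driver accepts Jet's ANSI-92 DDL (including the CREATE PROCEDURE parameter syntax).

    Imports System.Data.Odbc

    Module CreateStoredQueries
        Sub Main()
            ' Connection string is an assumption; adjust the DBQ path to your file.
            Dim connStr As String = _
                "Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\app.accdb;"
            Using conn As New OdbcConnection(connStr)
                conn.Open()
                ' A SELECT query is stored as a VIEW...
                Using cmd As New OdbcCommand( _
                        "CREATE VIEW qryActiveCustomers AS " & _
                        "SELECT CustomerID, CustomerName FROM Customers WHERE Active = TRUE", conn)
                    cmd.ExecuteNonQuery()
                End Using
                ' ...while a DML statement is stored as a PROCEDURE. Jet's
                ' ANSI-92 syntax allows a parameter list before the AS clause.
                Using cmd As New OdbcCommand( _
                        "CREATE PROCEDURE qryDeactivateCustomer (pID LONG) AS " & _
                        "UPDATE Customers SET Active = FALSE WHERE CustomerID = pID", conn)
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module

If the ODBC driver rejects this DDL, the DAO route suggested below is the fallback.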

If you can set a reference to DAO (which is actually much "closer to the metal" of Jet/ACE than ADO is), check the CreateQueryDef method and the QueryDefs collection.
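For example, a minimal VB.NET sketch, assuming a COM reference to the DAO library (the query name, SQL and path are placeholders):

    ' DBEngine is the DAO entry point; OpenDatabase takes the .mdb/.accdb path.
    Dim engine As New DAO.DBEngine()
    Dim db As DAO.Database = engine.OpenDatabase("C:\data\app.accdb")
    ' CreateQueryDef creates and saves the stored query in one call. For a
    ' parameterized query, start the SQL text with a PARAMETERS clause, e.g.
    ' "PARAMETERS pID Long; UPDATE Customers SET Active = False WHERE CustomerID = pID".
    db.CreateQueryDef("qryActiveCustomers", _
        "SELECT CustomerID, CustomerName FROM Customers WHERE Active = TRUE")
    db.Close()

Note that DAO is a COM library, so this route ties your build to the bitness of the installed Access engine - worth weighing against the ODBC-only goal in the question.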


How to minimize bloat in an Access database that has queries and macros, but no VBA code?

I am new to Access; I am a C programmer who has also worked with Oracle. Now I am creating an Access database for a small business, with an Access front-end and a SQL Server back-end. My database has about 30 small tables (a few hundred records each) and a rather complicated algorithm.
I don't use VBA code because I don't have time to learn VBA, but I can write complicated SQL statements, so I use a lot of queries and macros.
I am trying to minimize the daily growth of my database. I've thought about splitting the Access DB, but it doesn't make sense because my DB is rather small: after compacting, its size is about 5 MB. The regular compact procedure is not convenient because my client's employees work from home any time they wish. So I need to create a DB that bloats as slowly as possible.
I did some research and found some useful info: "the most common causes of db bloat are over-use of temporary tables and over-use of non-querydef SQL" (http://www.access-programmers.co.uk/forums/showthread.php?t=48759). Could somebody please clarify that for me? I have 3 questions about it:
1) I cannot avoid using temporary tables. I have tried re-using the same table names in 2 ways:
a) first clear all records and then run an append query, or
b) first run the macro command "DeleteObject" (to free the space in full) and then re-create the temporary table.
Can somebody please advise which way is better for reducing DB growth?
2) After running a stored query, I cannot free the space the way I would in C; VBA has "query.close" for that, but I don't use VBA. I can, however, run the macro command "Close Query" after each "OpenQuery". Will that help, or just double the length of my macros?
3) Is it correct that I shouldn't use the macro command RunSQL for simple SQL statements, and should create stored queries instead, even though that will create additional stored queries?
Any help would be appreciated!
Ah, the joys of going back to Lego after being a brickie! :)
1) Access is essentially a text-based file system. When you delete a record or a table, it persists in the file but with a flag which marks it to be ignored. When you compact an Access DB, the executable creates a new file, moves everything unmarked into it, and then deletes the old file. You can watch this actually happening if you use Windows Explorer to monitor the folder during the process.
You mention you are using SQL Server; is there a reason you are not building the temp tables on the server? That would be a faster, cleaner and all-round more efficient solution - unless we've missed something. Failing that, you will have to make the move from macros, but truthfully, if you can figure out C, then VBA will be like writing a memo!
http://www.access-programmers.co.uk/forums/showthread.php?t=263390
2) Issuing close commands for saved queries in Access has no impact on the file-bloat issue; they just look untidy.
3) Yes, always use saved queries, since this allows Access to compile the SQL in advance and optimise execution.
PS: did you know you can call SQL Server stored procs from within an Access saved query?
https://accessexperts.com/blog/2011/07/29/sql-server-stored-procedure-guide-for-microsoft-access-part-1/
If at all possible, you should look for ways to dispense with the Access back-end, since you already have SQL Server as the back-end - though I suspect you have your reasons for this.

Using strongly typed DataSets in a VB.NET project

Is using strongly typed DataSets good practice?
Currently I am working on a project developed using VB.NET in Visual Studio 2010.
Previously it used SQL queries directly in SqlCommand objects from System.Data.SqlClient, but I have since shifted everything to strongly typed DataSets and started using TableAdapters everywhere.
Now I just want to ask: is this a good approach for a project?
Or should I shift back to the old way, using just SqlCommands?
Or is there a better way to structure SQL data access? It's an ERP, and most of the code is for data access.
We use strongly typed DataSets all the time now.
After shifting to this approach, it felt really bad to have SQL queries in code instead of having them handled by the TableAdapter. But there is a bit of overhead with DataSets, so I guess both ways are good for different solutions.
It's really nice to have IntelliSense on all field names, and if you change a TableAdapter so it returns something different, you get design-time errors everywhere the code needs to change to reflect it, instead of finding out at runtime when the customer is running the program.
There are so many win-win things with strongly typed DataSets that I'll never go back.
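To make the design-time benefit concrete, here's a small sketch; MyDataSet, Customers and the Name column are invented names standing in for whatever your typed DataSet designer generates:

    ' The designer generates one typed DataTable and one TableAdapter per table.
    Dim adapter As New MyDataSetTableAdapters.CustomersTableAdapter()
    Dim table As MyDataSet.CustomersDataTable = adapter.GetData()
    For Each row As MyDataSet.CustomersRow In table
        ' Column access is strongly typed: misspell Name and the project fails
        ' to compile, instead of failing at runtime at the customer's site.
        Console.WriteLine(row.Name)
    Next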
TableAdapters make a lot of mess with bigger databases, and updating the table structure also causes confusion.
I would recommend using an auto code generator for the CRUD operations.
To me, your old pattern looks better than switching altogether to TableAdapters and strongly typed DataSets.
If you ever want to move your data across the wire to other platforms (Silverlight, web services, WCF services, etc.), then using any kind of DataSet will box you into a corner.
The way we have resolved this is to have classes whose list of properties matches the database exactly. To move the data in and out of the database, we use reflection to either match stored procedure parameters or generate dynamic SQL statements, depending on the circumstance and platform.
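As a rough illustration of that reflection idea (the class, stored procedure and parameter names here are all invented, and a real implementation would cache the PropertyInfo lookups):

    Imports System.Data
    Imports System.Data.SqlClient
    Imports System.Reflection

    Public Class Customer
        Public Property CustomerID As Integer
        Public Property Name As String
    End Class

    Module ReflectionMapper
        ' Map each public property onto a stored-proc parameter of the same name;
        ' the proc (dbo.Customer_Update here) is assumed to use matching names.
        Sub Save(ByVal cust As Customer, ByVal conn As SqlConnection)
            Using cmd As New SqlCommand("dbo.Customer_Update", conn)
                cmd.CommandType = CommandType.StoredProcedure
                For Each prop As PropertyInfo In GetType(Customer).GetProperties()
                    cmd.Parameters.AddWithValue("@" & prop.Name, _
                        If(prop.GetValue(cust, Nothing), CObj(DBNull.Value)))
                Next
                cmd.ExecuteNonQuery()
            End Using
        End Sub
    End Module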
When a database table is changed, the developer making the change is also responsible for updating the class structure, and vice versa.
To reduce the amount of hand-coding required, we use the code generation capabilities of CodeSmith to generate classes from the database and to create the basic implementations of our standard add/update stored procedures that require field enumeration.
As an added benefit, this approach removes the tight coupling between the database and the business object structure. We are able to use the same data access code and business object classes against SQL Server, Oracle, SQLite, and SQL Server CE databases. This code is used in Windows, PocketPC, web, iPad, and Android apps; all of the mobile apps use local databases specific to their platform, but share the common data access code.
It is a bit more work to set up initially, but it will pay significant dividends in the long run.

Is there an existing piece of software that allows you to (easily) build queries through a webpage?

I would like to support arbitrary queries against a database by allowing the user to build them "on the fly": for every object/table, the user can select its attributes, then "build" the query (which would translate into a SQL statement), and finally run it, all through a web interface.
The ticketing system RT does that, for example, and the http://gatherer.wizards.com/Pages/Advanced.aspx page is another example.
I'm currently programming in Rails, but any existing solution that implements this (or something similar) would be welcome.
Just be careful when creating dynamically generated queries like this that will be executed via something like sp_executesql (on MS SQL Server, for example): make sure you cover all of your bases so that your application isn't vulnerable to SQL injection attacks, as this kind of development can get you into a lot of trouble if it's done incorrectly. I would recommend storing all queries in a table and only reading queries from that table, to isolate the statements your application is allowed to run. Just identify each one with a label, and let the end user choose the label from a dropdown list control on the front-end.
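A hedged VB.NET sketch of that queries-in-a-table idea (the SavedQueries table, its columns, and the @search placeholder are all invented): the end user only ever supplies a label and a value, never SQL text.

    Imports System.Data.SqlClient

    Module SavedQueryRunner
        ' Look up the stored SQL by its label; user input never becomes SQL text.
        Function RunByLabel(ByVal conn As SqlConnection, ByVal label As String, _
                            ByVal searchValue As String) As SqlDataReader
            Dim sql As String
            Using lookup As New SqlCommand( _
                    "SELECT QueryText FROM SavedQueries WHERE Label = @label", conn)
                lookup.Parameters.AddWithValue("@label", label)
                sql = CStr(lookup.ExecuteScalar())
            End Using
            ' Each stored QueryText is written to reference @search, so the
            ' user-supplied value travels as a parameter, not as concatenation.
            Dim cmd As New SqlCommand(sql, conn)
            cmd.Parameters.AddWithValue("@search", searchValue)
            Return cmd.ExecuteReader()
        End Function
    End Module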
Good luck; I'm not aware of any existing software that will help with this.
Not quite sure what your use case is here, but I would say check out the Doctrine ORM (Object Relational Mapper).
Edit: After reading more and looking at the example, I would only suggest Doctrine for a large website. In that case, use Doctrine's DQL syntax with some JavaScript/jQuery magic for the forms.
Note that the queries you're referencing aren't arbitrary: they're on a very specific problem domain, over a specific set of SQL tables.
That said, if I were you I'd look into how people are building SQL queries with JavaScript. Something like these:
http://code.google.com/p/django-querybuilder/
http://css.dzone.com/articles/sqlike-sql-querying-engine?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+zones%2Fria+(RIA+Zone)
http://thechangelog.com/post/4914956307/rel-arel-ported-to-node-js-with-some-changes
That'll at least give you a good idea of the underlying data structures.

What are the steps for SQL optimization and changes without affecting the live system?

We have a big portal built using SharePoint 2007, ASP.NET 3.5 and SQL Server 2005. Many developers have worked on it since 01/2008, and we are now doing a huge analysis of the current SQL databases [not the SharePoint DBs] to optimize and enhance them.
The main DB has about 330 tables and 1720 stored procedures (SPs) created from 01/2008 till now.
Many table and column names are very long and we want to shorten them.
We found SP names written in 25 different formats :( and some of them are very complex, so we also want to rename those.
Many SP parameters need to be renamed.
One of the biggest tables is the registered-user table, which will be split into more than one table for optimization; many column names will be changed.
I searched for a way to rename tables and columns and found the SQL Refactor tool, but I am still trying it out.
My questions:
Is SQL Refactor the best tool for renaming, or is there another one?
If I want to do it manually, are there any references or best practices for that?
How can I make such changes in a fast and stable way? I am looking for recommendations and case studies, if any exist.
This is why people have written coding standards (with defined naming conventions) and have code reviews! Make sure you implement those procedures right now, to prevent this from getting any worse in the future.
Also, for around $300, SQL Refactor™ is an excellent tool. If you were to use search and replace, you'd have countless errors and spend hours and hours editing code. I wouldn't even consider using anything other than SQL Refactor, and would never even try a manual search-and-replace method on something as large as you describe.
You can use Visual Studio 2005 Database Edition, 2008 Database Edition or 2010 Ultimate to load up your DB schema. These provide refactoring capabilities, as well as database "builds" that check references in stored procedures, views and functions to ensure all tables and columns referenced actually exist.

Merging two SQL Server databases

I have a database "Product" in SQL Server 2008. I have another database "ORDER" in SQL Server 2008.
Both exist on different servers.
Now the requirement is to merge both databases, and test by pointing the applications to this new DB.
Can anyone suggest the best way to accomplish this without losing any information?
I have 2 options:
1) Script the DB objects (script both DBs and run the scripts in the new DB).
2) Export the DBs.
Which of these is best, or should I use other methods to avoid errors?
I am new to SQL, so please guide me with the correct options.
Thanks
SNA
In my opinion the best way to achieve what you want is by just exporting the database.
I think this is the best option because it's a lot safer than scripting the DBs into a new one (a good way to get a lot of frustration and errors).
Just try exporting your database first before trying to do anything with scripting (which obviously also takes a lot more time). So try your fast solution first, and see if it works.
(I see you are using SQL Server 2008.) Are you also using Management Studio? If so, you can open the tables in edit mode and try to copy/paste rows into the new tables. I don't know how big your tables/DBs are, but this could also be an option.
Greetings,
Younes
As you say, the two options are scripting or using the SQL Server Export/Import Wizard.
I've used both (for the same database, as it happens).
A third option is to use Visual Studio Team System 2008 Database Edition GDR.
In terms of a one-time export and import, I'd recommend going with the wizard. This is very safe and also very straightforward. Particularly as you are new to SQL Server, you want to take the approach that minimizes risk.
The only downside to doing it this way is that it is perhaps a little less transparent than the other methods.
On the project where I merged databases, I ended up using the scripting method, but that was mainly because I had a project that was already using GDR to merge incremental database updates, so adding a data merge script to that was a simple task. All changes needed to go through DBAs who unfortunately weren't very SQL-literate (I know!), so keeping all the processes similar was a must.
I also took some of my learnings from scripting the data and applied them to setting up my reference data scripts, so the effort of scripting was not a one-time cost.
Either way, the most important tip I can give is to back up the databases before doing any work on them.