I have a lengthy SQL INSERT command whose data I would essentially like to move into a local Xamarin database using SQLite. Is there an easy way to do this, or do I manually have to create an object from each value and enter it into the local SQLite database?
If you want to pre-seed your database and ship it with the app, then as Jason suggested, you can use SQLite Manager.
If you want the mobile app to seed the database on first load, you will need to create the objects (tables) first and run the below for each table you have.
var conn = new SQLiteConnection(dbPath); // dbPath: full path to your .db3 file
conn.CreateTable<TableClass>(); // creates the table if it doesn't already exist
Then, if you just want to run the INSERT queries you already have, you can use the Execute method:
public int Execute(string query, params object[] args)
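For example (a minimal sketch; the Person table and its columns are invented here), sqlite-net binds ? placeholders positionally:
conn.Execute("INSERT INTO Person (Name, Age) VALUES (?, ?)", "Ada", 36);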
However, SQLite isn't as powerful as MS SQL Server, and if your INSERT queries are complex you may have to modify them, and possibly your DB schema, depending on the column types.
I am modifying an Access 2010 application so that it uses SQL Server to run its queries. The data was transferred to the server some time ago and has been used via linked tables, but that proved a bit slow and non-optimal. So I'm trying to move the queries onto the server.
I have no problem with simple queries and views, and I'm using stored functions where simple parameters (dates, IDs, ...) are needed.
But now I have a process in that application that selects a bunch of IDs in the database, stores them in a local table, performs a bunch of actions on them (a report with a subreport, print preview, print, updating the original records with the print date when the user confirms that everything printed OK), and empties the local table if all the actions succeed.
I can't simply use a SQL Server table to store the IDs, since many people use the application at the same time and the same table is used by several processes; I can't use temporary tables, since they disappear as soon as Access moves on to the next action; and I can't find a way to pass a local table as a parameter to server stored procedures. Basically, I'm stuck.
Can anyone help? Is there a way to do this (pass a bunch of values as a table to a server stored function)? Or is there another approach that would achieve the same result (a table on the server specific to the current user, or a general table with some way to identify the rows belonging to the current user, or anything else)?
There are two methods that I use. Both work very well for multi-user apps. Here are the basics; you'll need to work out the details.
Create a table (tblSessions) in SQL Server with an identity column SessID (INT NOT NULL).
Create a function in Access to generate and retrieve a new SessID.
Create a second SQL Server table, tblWork, with two columns: SessID and YourID. Add appropriate indexes and keys, and link this table to your Access app. Then, instead of inserting the IDs for your query into an Access temp table, insert them into tblWork along with a new SessID. You can now join tblWork to other SQL Server tables/views to use as the data source for your reports, as sketched below. Be sure to add the SessID that you generated to your WHERE clause.
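A minimal sketch of that join (dbo.Orders and OrderID are invented names; the real tables are whatever your report needs):
SELECT o.*
FROM tblWork AS w
JOIN dbo.Orders AS o ON o.OrderID = w.YourID
WHERE w.SessID = @SessID;  -- the session ID generated for this user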
Create a stored procedure to generate the data for your reports. Include a parameter @YourIDList VARCHAR(MAX). Call the proc via a pass-through query and pass the list of your IDs as a comma- (or whatever you prefer) separated string in @YourIDList. In the proc, split @YourIDList into a temp table; SQL Server 2016+ has a STRING_SPLIT function, and for older versions you can roll your own (there are plenty of examples available). Then join the temp table to the other tables you need to generate your output. Use the pass-through query as your report data source, or dump it into an Access temp table and use that as your report data source.
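A hedged sketch of such a proc (usp_PrintBatch, dbo.Orders and OrderID are invented names; assumes SQL Server 2016+ for STRING_SPLIT):
CREATE PROCEDURE dbo.usp_PrintBatch @YourIDList VARCHAR(MAX) AS
BEGIN
    -- split the CSV of IDs, then join to the real tables the report needs
    SELECT o.*
    FROM STRING_SPLIT(@YourIDList, ',') AS s
    JOIN dbo.Orders AS o ON o.OrderID = CAST(s.value AS INT);
END
From Access, a pass-through query containing EXEC dbo.usp_PrintBatch '101,205,333' would then serve as the report's record source.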
Let's say I am connected to two different databases (one SQLite and one Oracle) over ODBC in one program. Is it possible to execute a query on one database and insert the result set as a new table in the second database directly, i.e. by just passing something like a data cursor, without the hurdle of creating INSERT statements with explicit values from the result set and executing those against the destination database?
As far as I am aware, you cannot do that. You could use something like a join engine (e.g., http://www.easysoft.com/products/data_access/odbc_odbc_join_engine/index.html), but that might be overkill if you are only doing it once.
If you use Oracle then you can use Oracle Heterogeneous Services which can work with ODBC sources. Have a look at: http://www.dba-oracle.com/t_heterogeneous_database_connections_sql_server.htm
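Once an HS/ODBC database link is configured, the copy becomes a single statement on the Oracle side (a sketch; the link name odbc_link and the table names are invented):
-- create-table-as-select pulls the remote result set across the link
CREATE TABLE local_copy AS
SELECT * FROM remote_table@odbc_link;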
I'm trying to implement simple queries like SELECT * FROM TABLE_X WHERE XID = @id, but the problem is that these queries have to run against different databases (SQL Server and Oracle) for different application instances.
How can I do this without having to write a new set of queries for each database?
Dapper stays really close to the database and lets you leverage pure SQL tricks specific to a particular database. In my opinion you should use a query object pattern, so that you have an interface in front of each extraction/commit that could differ between SQL Server and Oracle.
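A minimal sketch of that pattern (the User type and the interface/class names are invented; only the parameter prefix differs between the two implementations):
using System.Data;
using Dapper;

public class User { public int XID { get; set; } }

public interface IUserQueries
{
    User GetById(int id);
}

// SQL Server flavour: '@' parameter prefix
public class SqlServerUserQueries : IUserQueries
{
    private readonly IDbConnection _cnn;
    public SqlServerUserQueries(IDbConnection cnn) => _cnn = cnn;
    public User GetById(int id) =>
        _cnn.QuerySingle<User>("SELECT * FROM TABLE_X WHERE XID = @id", new { id });
}

// Oracle flavour: same interface, ':' parameter prefix
public class OracleUserQueries : IUserQueries
{
    private readonly IDbConnection _cnn;
    public OracleUserQueries(IDbConnection cnn) => _cnn = cnn;
    public User GetById(int id) =>
        _cnn.QuerySingle<User>("SELECT * FROM TABLE_X WHERE XID = :id", new { id });
}
The application code depends only on IUserQueries, so swapping databases means swapping the registered implementation, not rewriting call sites.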
I downloaded the code of SqlMapper.cs and hacked SetupCommand to check whether the connection is Oracle or SQL Server.
That was what I did:
// crude provider check: SQL Server parameters use '@', Oracle uses ':'
if (cnn.GetType().Name.ToLowerInvariant().Contains("oracle"))
{
    sql = sql.Replace('@', ':');
}
I am trying to copy a table in a database into another database on another connection in VB.NET, using OleDb. If they were on the same connection I would just use SELECT INTO, but they are not. I have two different OleDbConnection objects and cannot see an easy way to do this.
Right now I am attempting to just copy the source table into a DataTable using an OleDbDataAdapter, and then loop through the DataTable and insert each record into the target database one at a time. This obviously takes a ton of time for the large DBs I could potentially be dealing with, and I have to deal with escaping strings, null values, etc.
Is there an easier way to do this?
edit - just to make this clearer: I have two OleDbConnection objects; one is linked directly to a local .mdb file on my computer (JET), the other to a database on our servers (SQLOLEDB). I want to do this:
"SELECT * FROM fromDB INTO toDB"
But I can't because fromDB and toDB are on different connections, and the OleDbCommand object is only attached to one. The only way I can see how to do this is to connect to fromDB, copy it into a DataTable, connect to toDB, and copy all of the data in the DataTable row by row into toDB. I was wondering if there is an easier way to do this.
If you are constrained to this architecture, one idea is to write a stored procedure on the server that accepts a large chunk of row data in one call. It could then write the row data out to a file for a future bulk insert, or it could attempt to insert the rows directly.
This also has the benefit of speeding things up over high latency connections to the server.
Also, if you use parameterized statements, you can avoid having to escape strings and handle nulls by hand; see the sketch below.
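A hedged sketch of that row-by-row copy with parameters (the connection strings and the People(Id, Name) table are invented for the example):
using System.Data;
using System.Data.OleDb;

// source: local JET .mdb; destination: SQL Server via SQLOLEDB
string jetConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=local.mdb";
string sqlConnString = "Provider=SQLOLEDB;Data Source=server;Initial Catalog=Db;Integrated Security=SSPI";

using var src = new OleDbConnection(jetConnString);
using var dst = new OleDbConnection(sqlConnString);
src.Open();
dst.Open();

var table = new DataTable();
using (var adapter = new OleDbDataAdapter("SELECT Id, Name FROM People", src))
    adapter.Fill(table);

// OleDb uses positional '?' markers; values are bound, never concatenated
using var insert = new OleDbCommand("INSERT INTO People (Id, Name) VALUES (?, ?)", dst);
insert.Parameters.Add("p1", OleDbType.Integer);
insert.Parameters.Add("p2", OleDbType.VarWChar, 255);

foreach (DataRow row in table.Rows)
{
    insert.Parameters[0].Value = row["Id"];   // DBNull flows through unchanged
    insert.Parameters[1].Value = row["Name"];
    insert.ExecuteNonQuery();
}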
If you are just copying from one to the other, why don't you do it in SQL?
You can create a synonym within one database pointing at a table, view, or stored proc in another database (on another server). You can then insert into this synonym just like you would into a table in the same db.
http://www.developer.com/db/article.php/3613301/Using-Synonyms-in-SQL-Server-2005.htm
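A hedged sketch (the linked server OTHERSRV and the table names are invented; the linked server must already be configured):
-- in the source database: a synonym for the remote table
CREATE SYNONYM dbo.RemotePeople FOR OTHERSRV.OtherDb.dbo.People;

-- now a plain INSERT ... SELECT copies the rows across
INSERT INTO dbo.RemotePeople (Id, Name)
SELECT Id, Name FROM dbo.People;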
Is it possible to search and replace all occurrences of a string in all columns in all tables of a database? I use Microsoft SQL Server.
Not easily, though I can think of two ways to do it:
Write a series of stored procedures that identify all varchar and text columns of all tables, and generate individual update statements for each column of each table, of the form UPDATE foo SET bar = REPLACE(bar, 'foobar', 'quux'). This will probably involve a lot of queries against the system tables, with a lot of experimentation -- Microsoft doesn't go out of its way to document this stuff. A sketch of the generator appears after this list.
Export the entire database to a single text file, do a search/replace on that, and then re-import the entire database. Given that you're using MS SQL Server, this is actually the easier approach. Microsoft created the Microsoft SQL Server Database Publishing Wizard for other reasons, but it makes a fine tool for exporting all of the tables of a SQL Server database as a text file containing pure SQL DDL and DML. Run the tool to export all of the tables for a database, edit the resulting file as you need, and then feed the file back to sqlcmd to recreate the database.
Given a choice, I'd use the second method, as long as the DPW works with your version of SQL Server. The last time I used the tool, it met my needs (MS SQL Server 2000 / 2005) but it had some quirks when working with database Roles.
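For the first approach, a hedged sketch of the statement generator (run the emitted statements as a second batch; text/ntext columns would additionally need a CAST, since REPLACE doesn't accept them directly):
SELECT 'UPDATE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
     + ' SET ' + QUOTENAME(COLUMN_NAME)
     + ' = REPLACE(' + QUOTENAME(COLUMN_NAME) + ', ''foobar'', ''quux'');'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar');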
In MySQL, you can do it very easily like this:
update [table_name] set [field_name] = replace([field_name],'[string_to_find]','[string_to_replace]');
I have personally tested this successfully on a production server.
Example:
update users set vct_filesneeded = replace(vct_filesneeded,'.avi','.ai');
Ref: http://www.mediacollege.com/computer/database/mysql/find-replace.html
A good starting point for writing such a query is the "Search all columns in all the tables in a database for a specific value" stored procedure. The full code is at the link (not trivial, but copy/paste it and use it, it just works).
From there on it's relatively trivial to amend the code to do a replace of the found values.