Exporting from SQL Server 2008 - sql

I have a SQL Server database from a POS system. I need to export data to a new POS system. All I need are Products, Prices and Barcodes.
My problem is that the barcodes are stored in a different table. I need to export multiple tables and merge them together, if this is possible. I have no problem exporting each table and then importing it, but I am missing the barcodes because they live in a different table.
Can this be done with query builder or scripting?

You can write cross-database queries by specifying three-part names.
If you have a table in DB1 (dbo.Barcodes) as the source and a table with the same structure in DB2 (dbo.NewBarcodes), the following query skeleton could be used:
INSERT INTO DB2.dbo.NewBarcodes (
col1, col2, col3
)
SELECT col1, col2, col3 FROM DB1.dbo.Barcodes
If the structures of the two tables differ, construct your SELECT query so it transforms the columns from the source table to match the columns in the destination table.
Please note that the order and count of the columns are important.
EDIT
If the source and destination databases are on different servers, you can either build the database on the source server, then create a backup and restore it on the destination server, or you can use cross-server queries (see OPENROWSET, OPENQUERY and Linked Servers).
If there is already data in the destination table and there could be conflicts with the data from the source table, please check the MERGE statement.
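Applied to the original question, a minimal sketch of merging products with their barcodes during the export might look like the following; the database names (OldPOS, NewPOS) and all table/column names (dbo.Products, dbo.Barcodes, ProductID, Name, Price, Barcode) are placeholders that have to be mapped to the actual POS schemas:
INSERT INTO NewPOS.dbo.Products (ProductID, Name, Price, Barcode)
SELECT p.ProductID, p.Name, p.Price, b.Barcode
FROM OldPOS.dbo.Products AS p
LEFT JOIN OldPOS.dbo.Barcodes AS b
    ON b.ProductID = p.ProductID   -- LEFT JOIN keeps products that have no barcode yet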

Related

COPY INTO versus INSERT INTO on snowflake transformations, with existing tables

So as far as I can tell, it's generally considered more efficient to use COPY INTO versus INSERT INTO in Snowflake. Is this true for existing tables being transformed? And is it even possible for tables already existing in Snowflake? For example,
INSERT INTO TEST_TABLE
SELECT *
FROM SOURCE_TABLE_1
UNION ALL
SELECT *
FROM SOURCE_TABLE_2
Doing something like the above query in an insert is very straightforward, but would it technically be more efficient to use a COPY INTO here? And how would the syntax for that work?
COPY INTO TEST_TABLE
FROM (SELECT *
FROM SOURCE_TABLE_1
UNION ALL
SELECT *
FROM SOURCE_TABLE_2)
Doesn't appear to work; is there a way to get it to do so?
Thanks, just trying to learn :)
COPY INTO has two flavours:
data ingestion: COPY INTO <table>
data unloading: COPY INTO <location>
Both use a named internal/external stage or a storage location as one side of the operation.
Thus COPY INTO is not intended to perform data movement between tables that already exist in Snowflake.
COPY INTO TEST_TABLE
FROM (SELECT *
FROM SOURCE_TABLE_1
UNION ALL
SELECT *
FROM SOURCE_TABLE_2)
Even assuming that SOURCE_TABLE_1 and SOURCE_TABLE_2 were stages and not permanent tables, it would not work either, because Snowflake supports only a subset of operations during data load: Transforming Data During a Load
The COPY command supports:
Column reordering, column omission, and casts using a SELECT statement. There is no requirement for your data files to have the same number and ordering of columns as your target table.
The ENFORCE_LENGTH | TRUNCATECOLUMNS option, which can truncate text strings that exceed the target column length.
There is a fundamental difference between COPY and INSERT.
COPY is used to load data from a staged file to a Snowflake table. This means you are loading the file from either an internal or an external stage into your Snowflake target table: https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html
INSERT is used to load a table in Snowflake with - like in your example - data from another table. But note: The source data is in a Snowflake table already and not a file in one of your stages: https://docs.snowflake.com/en/sql-reference/sql/insert.html
This means: your first SQL query would work; your second one - in case SOURCE_TABLE_1 and SOURCE_TABLE_2 are permanent tables - would fail.
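For reference, a minimal sketch of how COPY INTO <table> is actually used, assuming a hypothetical named stage @my_stage containing CSV files (the stage name, file name and column positions are placeholders); the SELECT is restricted to the supported load-time transformations such as column selection, reordering and casts:
COPY INTO TEST_TABLE
FROM (SELECT $1, $2, TO_NUMBER($3)      -- $n refers to the n-th column of the staged file
      FROM @my_stage/data.csv)
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);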

How to duplicate a MySQL table, including indexes and data using workbench?

I have a large amount of data in my table and I need that same table with different name but same data and structure.
I'm using workbench for this because it is faster than phpmyadmin.
How can I copy table data and structure using workbench?
You can execute:
CREATE TABLE newtablename SELECT * FROM sourcetablename
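Note that CREATE TABLE ... SELECT copies the column definitions and data, but not the indexes or the primary key. To duplicate the table including its indexes, a two-statement variant does the job:
CREATE TABLE newtablename LIKE sourcetablename;          -- copies structure, indexes and keys
INSERT INTO newtablename SELECT * FROM sourcetablename;  -- copies the data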

How to compare table columns and match them in SQL Sever 2012

This is a bit of a HW assignment at work.
What I'm doing at work right now is database archiving. I take a source DB and move all (or specific portion) of data into a new archive DB using a stored procedure.
Problem is, not all of the columns in the source DB match the destination DB.
How can I compare the tables in the DBs for missing columns and then match them? So if source DB has Table 1 that has 4 columns and destination DB has Table 1 that only has 2 columns, how can I compare both Table 1's and then have it add/delete columns to match each other?
SQL Server 2012/SQL Management Studio
You can compare the table columns by querying INFORMATION_SCHEMA.COLUMNS or sys.columns.
The query below lists the columns that exist in the archive table but are missing from the source table (swap the two databases in the query to check the other direction):
SELECT c2.TABLE_NAME, c2.COLUMN_NAME
FROM archdb.[INFORMATION_SCHEMA].[COLUMNS] c2
WHERE c2.TABLE_NAME = 'archtable'
  AND c2.COLUMN_NAME NOT IN (SELECT COLUMN_NAME
                             FROM orgdb.[INFORMATION_SCHEMA].[COLUMNS]
                             WHERE TABLE_NAME = 'orgtable')
This link provides information about a number of ways and tools available to compare SQL Server tables.
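To go a step further and actually add the missing columns, a sketch like the following generates ALTER TABLE ... ADD statements from the same comparison, this time for columns that exist in the source table but not in the archive table (the opposite direction of the query above). The database, schema and table names (orgdb, archdb, dbo, orgtable, archtable) are the placeholders from the example, the new columns are created as NULLable so they can be added to a table that already holds rows, and the generated statements should be reviewed before being executed:
SELECT 'ALTER TABLE archdb.dbo.archtable ADD '
       + QUOTENAME(c.COLUMN_NAME) + ' ' + c.DATA_TYPE
       + CASE WHEN c.CHARACTER_MAXIMUM_LENGTH IS NULL THEN ''
              WHEN c.CHARACTER_MAXIMUM_LENGTH = -1 THEN '(MAX)'
              ELSE '(' + CAST(c.CHARACTER_MAXIMUM_LENGTH AS varchar(10)) + ')'
         END
       + ' NULL' AS alter_statement          -- one ALTER TABLE per missing column
FROM orgdb.INFORMATION_SCHEMA.COLUMNS c
WHERE c.TABLE_NAME = 'orgtable'
  AND c.COLUMN_NAME NOT IN (SELECT COLUMN_NAME
                            FROM archdb.INFORMATION_SCHEMA.COLUMNS
                            WHERE TABLE_NAME = 'archtable')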

SQL 2012 Data importing and merging data

I am doing a project on database programming using SQL Server 2012 and Visual Studio. I have created some tables, and I have an Excel file with lots of data. The specification at this stage is to merge two Excel sheets of data with 11 columns each (the same columns in both files, with different data) into a separate table that is then used later in the project for paging etc.
My original vision was to create two tables: one table holding one Excel tab of data, and another table exactly the same except for the name to house the second data set, and then use a UNION to merge the two tables into one. However, importing straight into those tables proved impossible (possibly due to the composite key in column 1 of the data), so I instead created two entirely new tables that now do contain the data from the Excel sheets. This doesn't meet the specification of merging the files into one table (the data is still in two tables, not one, and it also has to end up in a certain table that was created by DDL earlier), and there doesn't seem to be a way to query those tables into an existing table (or is there?).
Anyway, thanks for reading; hopefully I have included enough information, and if it seems I've missed something, feel free to ask. I think the ideal solution would involve a join of some description, such as a UNION, but there doesn't seem to be any way to relate that join back to the existing table.
Basically, you will want to do something similar to this:
SET IDENTITY_INSERT MainTable ON
INSERT INTO MainTable (col1, col2, col3 ... col11)
SELECT col1, col2, col3 ... col11
FROM Table1
UNION
SELECT col1, col2, col3 ... col11
FROM Table2
SET IDENTITY_INSERT MainTable OFF
You haven't specifically mentioned that you have an IDENTITY field for your primary key, but I'm assuming you do, hence the SET IDENTITY_INSERT commands.
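If getting the Excel data into SQL Server is still the sticking point, one option is to read a sheet directly with OPENROWSET and insert it straight into a staging or target table. This is only a sketch: it assumes the Microsoft ACE OLE DB provider is installed and the 'Ad Hoc Distributed Queries' server option is enabled, and the file path, sheet name and column names are placeholders that must match your workbook.
INSERT INTO Table1 (col1, col2, col3 /* ... col11 */)
SELECT col1, col2, col3 /* ... col11 */     -- names must match the sheet's header row
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\Workbook.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');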

Comparing the data of two tables in the same database in sqlserver

I need to compare the data of two tables within one database, matching the data on some columns from the tables.
The extra rows should be stored in another table called "relationaldata".
While searching I found a solution, but it's not working for me:
http://weblogs.sqlteam.com/jeffs/archive/2004/11/10/2737.aspx
Can anyone help me with how to do this?
How can I compare the data of two tables within one database using the Red Gate tool?
Red Gate SQL Data Compare lets you map together two tables in the same database, provided the columns are compatible datatypes. You just put the same database in the source and target, then go to the Object Mapping tab, unmap the two tables, and map them together.
Data Compare used to use UNION ALL, but it was filling up tempdb, which is what will happen if the table has a high row count. It does all the "joins" on local hard disk now using a data cache.
I think you can use the EXCEPT clause in SQL Server:
INSERT INTO tableC (Col1, Col2, Col3)
SELECT Col1, Col2, Col3 FROM tableA
EXCEPT
SELECT Col1, Col2, Col3 FROM tableB
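Since the extra rows can exist on either side, a minimal sketch that captures both directions into the "relationaldata" table mentioned in the question (assuming it has the same three compared columns) could look like this:
INSERT INTO relationaldata (Col1, Col2, Col3)
SELECT Col1, Col2, Col3 FROM tableA         -- rows in tableA that are not in tableB
EXCEPT
SELECT Col1, Col2, Col3 FROM tableB;

INSERT INTO relationaldata (Col1, Col2, Col3)
SELECT Col1, Col2, Col3 FROM tableB         -- rows in tableB that are not in tableA
EXCEPT
SELECT Col1, Col2, Col3 FROM tableA;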
For more information, please refer to:
http://blog.sqlauthority.com/2008/08/07/sql-server-except-clause-in-sql-server-is-similar-to-minus-clause-in-oracle/
Hope this helps