I have read access to a complete database but I cannot write.
This causes a problem, since I want to compare the database data with external data (e.g. spreadsheets).
The most efficient solution would be if I could create a new table in that database with the spreadsheet's data.
Is it possible to create a table which I can write to while keeping writing disabled on the rest of the database?
Based on Mitch's answer I found this explanation: http://databases.aspfaq.com/database/should-i-use-a-temp-table-or-a-table-variable.html This seems to work and solves my problem.
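A minimal sketch of that temp-table approach in T-SQL, with made-up table and column names (temp tables are created in tempdb, so this should not require write permission on the read-only database itself):

    -- holds the spreadsheet rows for the duration of the session
    CREATE TABLE #SpreadsheetData (Barcode varchar(50), Stock int);

    -- paste or bulk-insert the spreadsheet rows here
    INSERT INTO #SpreadsheetData (Barcode, Stock) VALUES ('12345', 10);

    -- compare against the read-only table
    SELECT p.Barcode, p.Stock AS DbStock, s.Stock AS SheetStock
    FROM dbo.Products AS p
    JOIN #SpreadsheetData AS s ON s.Barcode = p.Barcode
    WHERE p.Stock <> s.Stock;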
This is a question about how you would go about tackling the following kind of request.
Every week or so, we get some client delivering an Excel file that needs its contents to be uploaded to their CRM package. It's always something different. For instance, now it's a list of all of their product-barcodes and the current stock. They want us to update the stock of all of their products this one time.
Since it's always something different that a client requires, we haven't taken the time to automate this yet (there are other priorities) and we've been doing it by hand. We have already automated the most frequently received requests.
What we do now when such a request comes in is find the table that the data belongs to in the database, and then use Excel to create INSERT or UPDATE SQL scripts that we can copy-paste into SSMS to execute.
The way I would do it is by first writing my INSERT statement in one cell, and then using Excel functions on each row of data to concatenate that statement with all the values in that row.
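For illustration, the kind of formula I mean looks roughly like this (the table, columns and cell references are made up); it sits next to the first data row and is filled down, giving one statement per row to paste into SSMS:

    ="UPDATE Products SET Stock = " & B2 & " WHERE Barcode = '" & A2 & "';"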
This is quite error-prone and time-consuming, and I was wondering if anyone can offer tips on what they would do. How would you handle a request like that? Is there a quicker way of doing it that you can think of?
Mind you: it's always a different request. Today it has to do with products; tomorrow it could be a list of VAT numbers that they want uploaded so all of their clients have the correct VAT number.
I'm very curious how you would handle this.
Since the request is not about automation, I can suggest an alternative solution which is still manual but requires less work.
If you are using a database access tool like TOAD or SQL Developer, there is a facility to import data directly from Excel.
What you can do is import the data into a separate schema in production (or any other database) by creating a temporary table, then use SQL queries for any data massaging and to update the target table.
Here are two sample threads:
How to import excel data into Toad 9.5 table
SQL Developer for importing from Excel
Note: the threads refer to Oracle databases, but it's no different for MS SQL; the tools' capabilities are the same.
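As a rough sketch of the second step (T-SQL syntax, with made-up staging and target table names), once the spreadsheet has been imported into a staging table the massaging and the final update are plain SQL:

    -- example data massaging on the imported staging table
    UPDATE dbo.staging_products SET barcode = LTRIM(RTRIM(barcode));

    -- push the cleaned values into the target table
    UPDATE p
    SET p.Stock = s.stock
    FROM dbo.Products AS p
    JOIN dbo.staging_products AS s ON s.barcode = p.Barcode;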
I need to do some data migration between two Oracle databases that are on different servers. I've thought of some ways to do it, like writing a JDBC program, but I think the best way is to do it in SQL itself. I could also copy the entire tables over to the database I am migrating to, but these tables are big and that doesn't seem like an "elegant" solution.
Is it possible to open a connection to one DB in SQL Developer, then connect to the other one using SQL and write update/insert statements on tables as if they were both in the same connection?
I have read some examples on creating linked tables, but none seem to be Oracle-specific or tell me how to open the external connection by supplying the server hostname/port/SID/user credentials.
Thanks for the help!
If you create a database link, you can just select from a different database by querying TABLENAME@dblink.
You can create such a link using the CREATE DATABASE LINK statement.
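A minimal sketch, with placeholder credentials and connect string (the USING clause can also be a TNS alias instead of an easy-connect string):

    -- run on the local database
    CREATE DATABASE LINK remote_link
        CONNECT TO remote_user IDENTIFIED BY remote_password
        USING '//remotehost:1521/remoteservice';

    -- then query the remote table as if it were local
    SELECT * FROM employees@remote_link;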
It depends on whether it's a one-time thing or a regular process, and whether you need to do ETL (Extract, Transform and Load) or not, but I'll help you out based on what you explained.
From what I can gather from your explanation, what you're trying to accomplish is to copy a couple of tables from one DB to another. If they can reach one another, it's really simple: you could just create a DB link (http://www.dba-oracle.com/t_how_create_database_link.htm) and then do an INSERT ... SELECT from either side, using the DB link for one of the tables and the local table as the receiver or sender. It's pretty straightforward.
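For example, pulling a table from the remote side into a local one would look roughly like this (the table and link names are made up, and the link is assumed to exist already):

    INSERT INTO products
    SELECT * FROM products@remote_link;

    COMMIT;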
But if it's a one-time thing, I would just move the table with expdp and impdp, since that will be a lot faster and put a lot less strain on the DB.
If it's something you need to maintain and keep updated, why not just add the DB link and use that on both sides? This will be dependent on network performance, though.
If this is a bit out of your depth or you can't create DB links due to restrictions, SQL Developer has had a database copy option for a while and you can go as far as copying individual tables, but it's very heavy on the system where it's being run (http://deepak-sharma.net/2014/01/12/copy-database-objects-between-two-databases-in-oracle-using-sql-developer/).
These days I am importing quite a lot of databases from my server and working on them locally. In the process, I am making a number of changes to the table structure and using some complex SQL statements to add table columns.
Keeping track of everything in a separate file is beginning to be a pain, and I am wondering if there is a way to do this directly in SSMS so that I can store the instructions along with the database. Is there any way this can be done, or do I have to resort to writing documentation outside SQL Server?
Of course, I can always create a stub table called comments and put everything there, but I was looking for a way to associate comments with a particular database or its tables. Any suggestions would be greatly appreciated.
SQL Server handles commenting on database objects through Extended Properties:
http://msdn.microsoft.com/en-us/library/ms190243.aspx
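For example, a description can be attached to a table with sp_addextendedproperty and read back from sys.extended_properties (the schema, table name and comment text below are just placeholders):

    -- attach a comment to a table
    EXEC sys.sp_addextendedproperty
        @name = N'MS_Description',
        @value = N'Columns added during local restructuring; see import notes',
        @level0type = N'SCHEMA', @level0name = N'dbo',
        @level1type = N'TABLE',  @level1name = N'Products';

    -- read the comments back
    SELECT * FROM sys.extended_properties;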
Firstly, let me apologize for the title, as it probably isn't as clear as I think it is.
What I'm looking for is a way to keep sample data in a database (SQL Server 2005, 2008 and Express) that gets modified every so often. At present I have a handful of scripts to populate the database with a specific set of data, but every time the database changes all the scripts have to be more or less rewritten, and I was looking for some alternatives.
I've seen a number of tools and other software for creating sample data in a database, some free and some not. Are there any other methods I haven’t considered?
Thanks in advance for any input.
Edit: Also, if anyone has any advice at all in dealing with keeping data in sync with a changing application or database, that would be of some help as well.
If you are looking for tools for SQL Server, go visit Red Gate Software; they have the best tools. They have a data compare tool that you can use to keep lookup-type tables up to date and a SQL compare tool that you can use to keep the tables synced up between two databases. So using SQL Data Compare, create a database with all the sample data you want, then periodically refresh your testing DB (or your prod DB, if these are strictly lookup-type tables) using the compare tool.
I also like the alternative of having a script (you can use Red Gate's tool to create scripts) because that means you can store this info in your source control and use it as part of a deployment package to other servers.
You could save them in another database, or in the same DB in different tables distinguished by name, like employee_test.
Joseph,
Do you need to keep just the data in sync, or the schema as well?
One solution to the data question would be SQL Server snapshots. You create a snapshot of your initial configuration, so any changes to the "real" database don't show up in the snapshot. Then, when you need to reset the table, select from the snapshot into a new table. I'm not sure how it will work if the schema changes, but it might be worth a try.
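A rough sketch of the snapshot idea, assuming a database called SampleDb whose logical data file is also named SampleDb (note that database snapshots are an Enterprise edition feature on 2005/2008, so this won't apply to the Express instances):

    -- take the snapshot while the data is in its clean, initial state
    CREATE DATABASE SampleDb_Snapshot
        ON (NAME = SampleDb, FILENAME = 'C:\Snapshots\SampleDb.ss')
    AS SNAPSHOT OF SampleDb;

    -- later, pull the original rows back out of the snapshot
    SELECT * INTO dbo.Products_Reset
    FROM SampleDb_Snapshot.dbo.Products;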
For generation of sample data, the Database project in Visual Studio has functionality that will create fake/random data.
Let me know if this makes sense.
Erick
In our web app, we create a session table in the database to store temporary data, so a temp table is created and destroyed for every user. I have some 300 users for this web app, so these tables are constantly being created and destroyed.
I heard that this design is not good due to performance issues.
I am using MS SQL Server 2005. Is there any way to store a result set temporarily without creating any table?
Please suggest me some solution.
Thanks.
Either:
use a single permanent database table for all users, with a UserID column to filter on (see the sketch after this list)
or
just use the session handling ability of your web platform to store the info
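A minimal sketch of the first option, with made-up table and column names:

    -- one shared, permanent table instead of per-user tables; UserID distinguishes the rows
    CREATE TABLE dbo.UserSessionData (
        UserID  int           NOT NULL,
        KeyName varchar(100)  NOT NULL,
        Value   varchar(4000) NULL
    );

    DECLARE @UserID int;
    SET @UserID = 42;   -- supplied by the application

    SELECT KeyName, Value
    FROM dbo.UserSessionData
    WHERE UserID = @UserID;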
It sounds as if you are creating and dropping permanent tables. Have you tried using real temp tables (those with table names beginning with #), or table variables if you have a small data set? Either of these can work quite well. If you use real temp tables, you need to make sure your tempdb is sized large enough to accommodate the usual number of users; growing tempdb can cause delays.
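For example, either of these stores the rows without touching the permanent schema (names are just illustrative):

    -- a real temp table: lives in tempdb and is dropped automatically when the session ends
    CREATE TABLE #SessionResults (ItemID int, Qty int);
    INSERT INTO #SessionResults VALUES (1, 5);

    -- a table variable: fine for small result sets
    DECLARE @SessionResults TABLE (ItemID int, Qty int);
    INSERT INTO @SessionResults VALUES (1, 5);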
I think a solution like GenerateData is what you are looking for. You can create test/sample databases there and delete them when needed.
Depending on what you're actually doing (and whether you can refactor it), it may be more appropriate to use table variables, which are generally very performant.
There is a question of whether the DB is really an appropriate place to even be trying to persist data sets if it's for your application's benefit. If the question isn't just academic, perhaps it would be better to keep the object representation of the data in your app's memory?