I want to practice some SQL locally on specific tables that I have.
What I need is simply to take a table, load it into some software I can run SQL against, and work with it. Nothing more: no servers, no other users.
I tried a few different products but just can't find one that allows this option without creating a server and setting up connections.
Please help :)
Thanks!
I think something like SQLite would work well for your purpose. SQLite is serverless.
You can then use a shell or DOS prompt to create a database file, create your table(s), and load your data into them.
https://www.sqlite.org/quickstart.html
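A minimal session might look like this (just a sketch; the database, table, and column names are placeholders, and data.csv is assumed to be a headerless CSV file in the current directory):

sqlite3 practice.db
sqlite> CREATE TABLE sales (id INTEGER, region TEXT, amount REAL);
sqlite> .mode csv
sqlite> .import data.csv sales
sqlite> SELECT region, SUM(amount) AS total FROM sales GROUP BY region;

The first command creates practice.db if it doesn't already exist, and everything after that is ordinary SQL plus the sqlite3 shell's dot-commands for importing.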
SQL Fiddle might be what you are looking for.
Related
I need to do some data migration between two Oracle databases that are on different servers. I've thought of some ways to do it, like writing a JDBC program, but I think the best way is to do it in SQL itself. I could also copy the entire table over to the database I am migrating to, but these tables are big and that doesn't seem like an "elegant" solution.
Is it possible to open a connection to one DB in SQL Developer, then connect to the other one using SQL and write update/insert statements on tables as if they were both in the same connection?
I have read some examples on creating linked tables, but none seem to be Oracle-specific or tell me how to open the external connection by supplying it the server hostname/port/SID/user credentials.
Thanks for the help!
If you create a Database Link, you can just select from the other database by querying TABLENAME@dblink.
You can create such a link using the CREATE DATABASE LINK statement.
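A minimal sketch (the link name, credentials, and connect string are placeholders):

CREATE DATABASE LINK remote_db
  CONNECT TO scott IDENTIFIED BY tiger
  USING 'remote_tns_alias';

SELECT * FROM employees@remote_db;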
It depends on whether it's a one-time thing or a recurring process, and whether you need to do ETL (Extract, Transform and Load) or not, but I'll help you out based on what you explained.
From what I can gather from your explanation, what you want to accomplish is to copy a couple of tables from one DB to another. If they can reach one another, it's really simple: you can just create a DBLINK (http://www.dba-oracle.com/t_how_create_database_link.htm) and then do an INSERT ... SELECT from either side, using the DBLINK for one of the tables and the local table as the receiver or sender. It's pretty straightforward.
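For instance, on the receiving side it would look something like this (a sketch only; the table names, link name, and filter are placeholders):

INSERT INTO employees_archive
  SELECT * FROM employees@remote_db
  WHERE hire_date < DATE '2010-01-01';
COMMIT;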
But if it's a one-time thing, I would just move the table with expdp and impdp, since that will be a lot faster and put a lot less strain on the DB.
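At the OS prompt that looks roughly like this (a sketch; the credentials, TNS aliases, directory object, and table name are all placeholders):

expdp scott/tiger@src_db tables=BIG_TABLE directory=DATA_PUMP_DIR dumpfile=big_table.dmp
impdp scott/tiger@dest_db tables=BIG_TABLE directory=DATA_PUMP_DIR dumpfile=big_table.dmp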
If it's something you need to maintain and keep updated, why not just add the DBLINK and use that on both sides? This will depend on network performance, though.
If this is a bit out of your depth, or you can't create DBLINKs due to restrictions, SQL Developer has had a database copy option for a while, and you can go as far as copying individual tables; but it's very heavy on the system where it's being run (http://deepak-sharma.net/2014/01/12/copy-database-objects-between-two-databases-in-oracle-using-sql-developer/).
Hi, I have a requirement to create a database from which no data can be taken out, neither in CSV format nor as a dump file.
If MySQL crashes, the data should be gone; no recovery should exist.
It may look like a stupid idea to implement, but that is exactly the client's requirement.
So can anyone help me restrict the mysqldump client program and INTO OUTFILE commands for all users except root? Other users will have SELECT, INSERT, UPDATE, DELETE, and other database-level privileges, but no global-level privileges.
Can anyone help me with this?
I'm not sure what you are looking for, but if you have SSH access to the server, I propose using a filesystem backup or a tool like innobackupex instead of mysqldump.
For big data, mysqldump isn't a good solution.
You must make sure every new MySQL user is denied the global privileges in mysql.user (Select_priv, Lock_tables_priv, File_priv, Alter_priv, Create_tmp_table_priv, Execute_priv, Create_priv), so the user can't do any of these things globally; then even with mysqldump they can't export everything. You can check the mysql.user table directly, or use a tool like Navicat.
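A sketch of such a restricted account (the user, host, schema, and password are placeholders). Without the global FILE privilege, SELECT ... INTO OUTFILE is refused for this account, though as the next answer explains, this cannot stop a client-side dump:

CREATE USER 'appuser'@'%' IDENTIFIED BY 'secret';
GRANT SELECT, INSERT, UPDATE, DELETE ON appdb.* TO 'appuser'@'%';
-- deliberately no GRANT FILE ON *.*, so INTO OUTFILE fails for appuser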
You can't. From the perspective of the MySQL server, mysqldump is just another client that runs a lot of SELECT statements. What it happens to do with them (generate a dump file) is not something that the server can control.
For what it's worth, this sounds like an incredibly stupid idea, as it means that there will be no way to restore your client's data from backups if a disaster occurs (e.g., a hard drive fails, MySQL crashes and corrupts tables, someone accidentally deletes a bunch of data, etc.). The smartest thing to do here is to tell the client that you can't do it - anything else is setting yourself up for failure.
I have an application that stores data in an Oracle database. I want to copy selected rows from a table in this database to a table in a Sybase database (archiving records). Can I do this directly (i.e. without storing and loading results from a file)?
I've mostly looked into SQL*Plus:
SQL*Plus COPY Command (http://docs.oracle.com/cd/B19306_01/server.102/b14357/apb.htm)
Copying Data from the Oracle Database Server to Sybase (http://docs.oracle.com/cd/A95432_01/a80982/ch5.htm#153526)
Copy Command (http://www.oracleutilities.com/SQLPLus/copy.html)
Oracle® Database Gateway for Sybase User's Guide (http://docs.oracle.com/cd/B28359_01/gateways.111/b31048/toc.htm)
I also understand the following: "However, INSERT is the only option supported when copying to Sybase. The SQL*Plus COPY command does not support copying to tables with lowercase table names." Even so, I haven't been able to get this to work in SQL*Plus. I'll keep trying, but if anyone has an example of how to do it, I'd very much appreciate it.
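For reference, the form I understand the command to take is something like this (a sketch only; the connect strings and table names are placeholders, and the trailing hyphens are SQL*Plus line continuations):

COPY FROM scott/tiger@orcl -
TO sybuser/sybpass@sybs -
INSERT archive_table -
USING SELECT * FROM source_table WHERE archived = 'Y'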
If this is not possible, is Oracle Data Pump (http://www.oracle.com/technetwork/database/enterprise-edition/index-093639.html) my best alternative?
Thank you!
Sincerely,
Deepyaman
Your best bet may be to use some form of ETL tool to handle this if the size of your data is reasonable, rather than getting into the details of setting up the gateways, etc., between systems.
There are many options - Talend Open Studio (free), Informatica, or Microsoft SSIS all should be able to handle this.
The robust way to do this is to create a flat file (txt, csv) or an INSERT SQL script from your "COPY_FROM_DATABASE", and then load it into the corresponding table. You might have to do a bit of formatting on this SQL in order to run it on a different server. I personally like the INSERT SQL approach better.
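For example, SQL*Plus can spool such an INSERT script (a sketch; the table and column names are placeholders, and the quoting gets fiddlier with dates and more column types):

SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 1000
SPOOL archive_inserts.sql
SELECT 'INSERT INTO archive_table VALUES (' || id || ', ''' ||
       REPLACE(name, '''', '''''') || ''');'
FROM source_table WHERE archived = 'Y';
SPOOL OFF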
I was asked in an interview to name the different ways of exporting a database from one SQL Server to another. I knew only about creating a .bak file and then restoring it to the other SQL Server, which I told them. However, they asked me about a single SQL INSERT command which would perform this task.
I have googled it and cannot find it. Please tell me if there is any such command.
I have never heard of such a command. The MS support article that tells you how to move databases between servers gives three options, none of which is a single INSERT statement; the closest is using sp_detach_db and sp_attach_db.
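For reference, the detach/attach route looks roughly like this (a sketch; the database name and file paths are placeholders, and both procedures are deprecated in recent SQL Server versions):

EXEC sp_detach_db 'MyDb';
-- copy MyDb.mdf and MyDb_log.ldf to the target server, then run there:
EXEC sp_attach_db 'MyDb', 'D:\Data\MyDb.mdf', 'D:\Data\MyDb_log.ldf';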
Well, with a SQL statement you can do a backup and a restore. Doing it with one SQL INSERT... I've never heard of anything like this. Maybe for one table, but not for the whole database.
The other way would be to use the "Copy Database Wizard".
I also conduct interviews, and sometimes you just ask about stuff that does not exist or does not work, to see what happens.
If you had a linked server already, I would guess you could use sp_msforeachtable around an INSERT INTO server2.tbl SELECT * FROM tbl.
But that's not going to handle referential integrity order dependencies or scenarios where you might need IDENTITY INSERT, disabling triggers or whatever. Handling trivial cases is usually, by definition, trivial.
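A sketch of that guess (SERVER2 and TargetDb are placeholders; sp_msforeachtable is undocumented, and its ? marker expands to the schema-qualified table name):

EXEC sp_msforeachtable
  'INSERT INTO SERVER2.TargetDb.? SELECT * FROM ?';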
You needed to say "linked server":
http://www.databasejournal.com/features/mssql/article.php/3085211/Linked-Servers-on-MS-SQL-Part-1.htm
http://www.databasejournal.com/features/mssql/article.php/3691721/Setting-up-a-Linked-Server-for-a-Remote-SQL-Server-Instance.htm
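Setting one up is a couple of procedure calls (a sketch; the server name, database, table, and credentials are placeholders):

EXEC sp_addlinkedserver @server = N'SERVER2', @srvproduct = N'SQL Server';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'SERVER2', @useself = N'FALSE',
     @locallogin = NULL, @rmtuser = N'remote_user', @rmtpassword = N'remote_pass';
-- four-part naming then works against the remote instance:
SELECT TOP 5 * FROM SERVER2.TargetDb.dbo.SomeTable;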
I'm trying to find out if this is possible, but so far I haven't found any good solutions. What I would like to achieve is to write a stored procedure that can clone a database, but without the stored data. That means all tables, views, constraints, keys and indexes should be included, but without any data. Can it be done?
Sure - your stored proc would have to read the system catalog views to find out what objects are in the database, determine their potential dependencies, and then create a single SQL script (or a collection of them) that re-creates the database, and execute it.
It's possible, but neither nice nor easy to do. Especially the dependencies between objects might cause more headaches than first meets the eye...
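To give a flavor of the catalog-view approach on SQL Server, here is a deliberately minimal sketch that generates bare CREATE TABLE statements; it ignores lengths, precision, nullability, constraints, and indexes, which is exactly where the headaches start:

SELECT 'CREATE TABLE ' + s.name + '.' + t.name + ' (' +
       STUFF((SELECT ', ' + c.name + ' ' + ty.name
              FROM sys.columns c
              JOIN sys.types ty ON ty.user_type_id = c.user_type_id
              WHERE c.object_id = t.object_id
              ORDER BY c.column_id
              FOR XML PATH('')), 1, 2, '') + ');'
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id;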
You could also:
use something like SQL Server Management Studio (if you're on SQL Server - you didn't specify) and create the scripts manually, and just re-execute them on a separate server
use a "diff" tool like Redgate SQL Compare to compare two servers and have the second one brought up to date
I've successfully used the Microsoft SQL Server Database Publishing Wizard for this purpose. It's pretty straightforward, no coding needed. Here's a sample call:
sqlpubwiz script -d DatabaseName -S ServerName -schemaonly C:\Projects2\Junk\DatabaseName.sql
I believe the default is to create both data and schema, but you can use the schemaonly parameter.
Download it here
In SQL Server you can roll through the system tables (sys.tables, sys.columns, etc.) and construct things one at a time. It's going to be very manual and error-prone at the beginning, but it should become systematic pretty quickly.
Another way to do it is to write something in .Net using SMO. Check out this link:
http://www.sqlteam.com/article/scripting-database-objects-using-smo-updated