DBeaver (or similar): is there a way to deactivate a row temporarily? - sql

I have an application that accesses data from an SQL database, and I would like to test what would happen if some of the rows weren't present. I'm visualizing the data using DBeaver, and it would be great if I could mark some rows as 'deactivated', i.e. delete them from the DB but keep them locally as new rows that I can re-add later on.
Is DBeaver able to do that? Is there another tool which can?

You can simply export your data from the table and import it again when you finish your test. As far as I know, there is no way to hide rows.
To export data from your table:
Right-click on the selected table
Export
Choose your preferred export format
You could also make a backup of the table in its current state, and when you have finished your work you can roll back to it.
To back up:
Right-click on the selected table
Instrument
Backup
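If a plain-SQL workaround is acceptable, you can also park the rows you want to hide in a side table and restore them after the test. A minimal sketch, assuming a hypothetical table named orders with primary key id (CREATE TABLE ... AS SELECT works in PostgreSQL/MySQL; on SQL Server use SELECT ... INTO instead):

    -- create an empty copy of the table's structure
    CREATE TABLE orders_parked AS SELECT * FROM orders WHERE 1 = 0;
    -- move the rows you want to 'deactivate' into the side table
    INSERT INTO orders_parked SELECT * FROM orders WHERE id IN (101, 102);
    DELETE FROM orders WHERE id IN (101, 102);
    -- when the test is over, put them back and clean up
    INSERT INTO orders SELECT * FROM orders_parked;
    DROP TABLE orders_parked;

The 'deactivated' rows live on the server rather than locally in DBeaver, but this gives you the delete-now, re-add-later behaviour you describe.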

Related

How do I transfer a schema and all of its tables to a new database?

I have database a with schema foo, which contains 20 tables. I want to move all of the contents of schema foo into database b without overwriting the current content of database b.
Is there also a way to do it in pgAdmin?
I found this link and perhaps it will be quite similar. But this particular link is for transferring a table.
Copy a table from one database to another in Postgres
You can script the first database with all its data; once it is scripted, you can run that script within the other database. It should work as long as you don't have tables in the second database with the same name.
So in pgAdmin, follow these steps to script the database:
- Right-click on the database and click Backup.
- Select a file path and file name where you want to save your script.
- Select Plain as the format in the format dropdown.
- Go to Options and check "schema and data" in tab #1.
- Then click Backup.
- Then click Done.
- Then right-click on your second database and create a new query.
- Find where you saved the script and copy it into the query.
- Run the query and it should all be good (the excerpt below shows the kind of script you'll be running).
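For reference, the Plain format backup is just an SQL script. A trimmed, hypothetical excerpt of what it might contain for one table in schema foo (the real dump also includes SET statements, and by default uses COPY blocks rather than INSERTs unless you pick the insert-commands option):

    CREATE SCHEMA foo;

    CREATE TABLE foo.customers (
        id integer NOT NULL,
        name text
    );

    INSERT INTO foo.customers (id, name) VALUES (1, 'Alice');
    INSERT INTO foo.customers (id, name) VALUES (2, 'Bob');

    ALTER TABLE ONLY foo.customers
        ADD CONSTRAINT customers_pkey PRIMARY KEY (id);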
If you are unsure about this, just create two practice databases and practice on those before you do it on the main one.

How to Synch SPSS and SQL Data

I imported SQL data to SPSS for analysis and made lots of changes in the field names, types and ...
However, it often happens that some records in the SQL database are added/removed, and these changes must be reflected in my SPSS file. I know that I can export my SPSS data back to SQL, make the changes there in SQL, and import the SQL data back into SPSS. But every time I import it, all the labels, classifications and ... revert back.
So I am wondering if there is any way to sync records (or even fields) from SQL without importing a new SQL table into a fresh SPSS file?
Thanks
You need to re-import the table, but as long as your changes are made in syntax, you can just re-execute that syntax to reapply the changes. Alternatively, if you can identify the changes based on date or other criteria, you could just pull those changes, apply your syntax, and merge with the main file using MATCH, ADD, or UPDATE.

Better way to export data from one database to another (SQL Server)

I'm importing a big flat file (about 400,000 rows and 255+ columns) into SQL Server through the SQL Server Management Studio import wizard.
To get the right variable types I use Suggest Types, but I have found that I need to have it scan through all the rows to get the types right, and that takes a very long time. Is there a way to avoid this or do it faster?
Furthermore, my real goal is to transfer data from one SQL Server database to another on another computer. I do this by exporting it as a flat file. But maybe this is stupid, since I lose the information about the correct format?
Thanks!
According to Copy one database to another database:
There are several ways to do this, below are two options:
Option 1
Right click on the database you want to copy
Choose 'Tasks' > 'Generate scripts'
'Select specific database objects'
Check 'Tables'
Mark 'Save to new query window'
Click 'Advanced'
Set 'Types of data to script' to 'Schema and data'
Next, Next
You can now run the generated query on the new database.
Option 2
Right click on the database you want to copy
'Tasks' > 'Export Data'
Next, Next
Choose the database to copy the tables to
Mark 'Copy data from one or more tables or views'
Choose the tables you want to copy
Finish
Back up your database and restore it on the other server (the target server must be an equal or higher version), or simply copy the database files to the other server and attach them (when copying database files, you must ensure that you have either detached the database or stopped the SQL Server service).
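If you go the backup/restore route, a minimal T-SQL sketch (SourceDb and the file paths are placeholders; adjust them for your servers):

    -- on the source server
    BACKUP DATABASE SourceDb TO DISK = 'C:\Backups\SourceDb.bak';

    -- copy the .bak file to the target machine, then on the target server
    RESTORE DATABASE SourceDb FROM DISK = 'C:\Backups\SourceDb.bak';

If the data/log file locations differ between the two servers, add WITH MOVE clauses to the RESTORE to relocate the files.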

Applying changes easily in Access Database

I have got a backup of a live database (a copy of an ACCDB format Access database) in which I've worked, adding new fields to existing tables and creating whole new tables.
How do I take these changes and apply them quickly to the running database?
In MS SQL Server, I'd right-click > Script Table As > Alter To, save the query, and run it wherever I desire. Is there a way as easy as that to do it in an Access database?
Details:
It's an ACCDB MS-Access database created on Access 2007, copied and edited in Access 2007, in which I need to get some "alter" scripts to run on the other database so that it has all the new columns and tables I've created on my copy.
For new tables, just import them from one database into the other. In the "External Data" section of the ribbon, choose the Access icon above "Import". That choice starts an import wizard to allow you to select which objects you want imported. You will have a choice to import just the table structure, or both structure and data.
Remou is right that you can use DDL ALTER TABLE statements to add new columns. However, DDL might not support every feature you want for your new columns. And if you want not just the empty columns added but also any data for those new columns, you will probably need to run UPDATE statements to get it into your new columns.
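A minimal sketch of that approach, with a hypothetical Customers table and a new Region column (TEXT(50) is Access DDL for a short text field):

    ALTER TABLE Customers ADD COLUMN Region TEXT(50);
    UPDATE Customers SET Region = 'Unknown' WHERE Region IS NULL;

You can paste statements like these into an Access query in SQL view and run them, or execute them from VBA.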
As for "Script Table As", see whether OmBelt's Export Table to SQL tool for MS Access can do what you want.
Edit: Allen Browne has sample ALTER TABLE statements. See CreateFieldDDL and the following one, CreateFieldDDL2.
You can run DDL in Access. I think it would be easiest to run the SQL with VBA, in this case.
There is a product called DbWeigher that can compare Access database schemas and synchronize them. You can get a free trial (30 days). DbWeigher will write a script of all schema differences and write it out as DDL. The script is thorough and includes relationships, indexes, validation rules, allow zero length, etc.
A free tool from the same developer, DBWConsole, will let you execute a DDL script against any Access database. If you wrote your own DDL scripts this would be an easy way to apply the changes to your live database. It even handles some DDL that I don't know how to process in VBA (so it must be magic). DBWConsole is included if you downloaded the trial version of DBWeigher. Be aware that you can't make schema changes to a table in a shared Access database if anyone has the table open.
DbWeigher creates a script of all differences between the two files. It can be a lot to manually parse through if you just want a few of the changes. I built a parser for DbWeigher script files so they could be filtered by table, to extract just the parts I wanted. I contacted the DbWeigher author about it but never heard back. It's safe to say that I have no affiliation with this developer.

Import from Excel to MySQL database using SQuirrel

I have an Excel spreadsheet with a few thousand entries in it. I want to import the table into a MySQL 4 database (that's what I'm given). I am using SQuirrel for GUI access to the database, which is being hosted remotely.
Is there a way to load the columns from the spreadsheet (which I can name according to the column names in the database table) to the database without copying the contents of a generated CSV file from that table? That is, can I run the LOAD command on a local file instructing it to load the contents into a remote database, and what are the possible performance implications of doing so?
Note, there is an auto-generated field in the table for assigning IDs to new values, and I want to make sure that I don't override that ID, since it is the primary key of the table (alongside other compound keys).
If you only have a few thousand entries in the spreadsheet then you shouldn't have performance problems (unless each row is very large of course).
You may have problems with some of the Excel data, e.g. currencies, best to try it and see what happens.
Re-reading your question: you will have to export the Excel sheet into a text file that is stored locally, but there shouldn't be any problems loading a local file into a remote MySQL database. I'm not sure whether you can do this with SQuirreL; you would need access to the MySQL command line to run the LOAD command.
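For reference, a minimal sketch of the LOAD command, assuming a hypothetical CSV exported from Excel to /tmp/entries.csv and a target table entries whose auto-increment id column is simply left out of the column list so MySQL assigns it itself (the other column names here are made up):

    LOAD DATA LOCAL INFILE '/tmp/entries.csv'
    INTO TABLE entries
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (name, amount, created_at);

The LOCAL keyword makes the client read the file from your machine and send it to the remote server, which is what you want for a remotely hosted database.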
The best way to do this would be to use Navicat, if you have the budget to make a purchase.
I made this tool where you can paste in the contents of an Excel file and it generates the CREATE TABLE and INSERT statements, which you can then just run. (I'm assuming SQuirreL lets you run a SQL script?)
If you try it, let me know if it works for you.