I have been using DBeaver as a replacement for SQL Server Management Studio, and I am loving it. The only thing I cannot figure out is how to associate a script with a database. For example, in SSMS I can right click a database (let's call it A) in the object explorer and click "New Query" (or ctrl+n) to open a script that is active within database A. If I open another script in a different database (B), that script is associated with the database B. When I switch back to the original script, I am back to working with database A without having to manually select from the database list or executing a USE statement. Obviously, switching back to the second script will make database B active again.
Unfortunately, in DBeaver, there appears to be only one active database for all scripts. Is there a way to set this up in DBeaver to act like SSMS in this manner?
Edit: DBeaver refers to individual databases as catalog/schema. That is what I am trying to associate with individual scripts.
I am using DBeaver Community and I work with multiple databases. You just have to select the database from the drop-down to change it, and the selection applies to the currently open script.
I don't currently have my company system connected, so here is a sample image I found on the internet.
I don't know that DBeaver has this capability, but you could always preface your scripts with a USE DBNAME statement...
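For example, a script prefaced this way will always run against the intended database (the database and table names here are just placeholders):

-- Minimal sketch: DatabaseA and dbo.SomeTable are placeholders
USE DatabaseA;
GO
SELECT TOP (10) * FROM dbo.SomeTable;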
UPDATE
Version 6.3.1 (2019-12-22) now supports this by default! Here is the first note of the description for 6.3.1:
SQL editor:
Active database/schema change now affects current editor only
Note: This does seem to have changed some other behavior as well, such as "Set active" from the Database Navigator and the "Auto-sync xx with navigator" option. These two used to affect the active database/schema; now they are tied to the current connection.
Original Answer
For anyone who finds this in the future, I did find a workaround that provides the desired behavior. The answer is to use projects. Unfortunately, this means creating a duplicate connection to the server.
There is also another catch. If you want to set a schema for a specific script, avoid setting the schema via the Database Navigator. Setting an active schema there does work if the script is already in the active project, but if it is not, it will change the active schema for all open scripts associated with that server. To keep it simple, I try to avoid the Database Navigator altogether.
All is not lost by avoiding the Database Navigator. By default, the "Projects" window is a tab right next to the Database Navigator. If you expand (click the + next to the name) [Project Name] -> Connections -> [Connection Name], you will have your list of databases/schema right there. Use this as your new Database Navigator and you are all set.
Related
Our project is moving from MySQL to MS SQL, and after a long time working with MySQL Workbench I really miss some features in SQL Server Management Studio (2014).
Do you know whether they exist in SSMS, or whether there is an alternative/replacement application for SSMS to work with the database?
Functions are listed below:
Generate a data-update script that I can review and copy-paste, rather than updating the data as soon as I move to another row when the table is open for editing.
Some changes are still made directly in the database in our project, and sometimes it's easier to add some rows manually in 5 tables, get the script, test it, and run the script in the production environment. I don't want to write a script for each update, and I don't want to make a mistake when copying data to the production server using the edit-table option.
Review the update-table script BEFORE the changes are made, not after (I am talking about Tools - Options - Designer - Auto generate change scripts).
Upload a file into a binary field using a select-file dialog.
Again, I know about using the OPENROWSET function; I'm just interested in how to do it the way I used to (a minimal OPENROWSET sketch follows this list).
Ability to view large text fields in a convenient way in SSMS. Now I have to copy data from a field and paste it into notepad. (For example, error message with a long trace log)
Save a few tabs with some useful scripts and open all of them when I open SSMS.
Is there any way to organize tabs to be able to work with 10+ tabs more effectively? Now only 6 of them can be shown on the screen (compare that to 15 tabs in MySQL WB).
Simple 'search field' (like Ctrl+F in Excel) to be able to search data in all fields displayed on the screen.
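For the binary-field item above, the OPENROWSET approach I mean looks roughly like this (the table, column, and file path are only placeholders):

INSERT INTO dbo.Documents (FileName, FileData)
SELECT 'error_log.txt', BulkColumn
FROM OPENROWSET(BULK N'C:\temp\error_log.txt', SINGLE_BLOB) AS f;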
I would appreciate any ideas.
Thank you.
Consider the following scenario:
One PC is running an Access database. An old legacy script will copy over all the contents from this Access database to a SQL Server (A) over LAN. But before it does this, it will delete all contents from the destination database, so it doesn't have to deal with existing records. Previously existing records are never altered.
On the SQL Server (A), replication is defined and it acts as a publisher. It will publish/replicate the data to SQL Server (B) over WAN.
This all works very well; the only issue I'm facing is that when something goes wrong while copying the contents to SQL Server (A), SQL Server (B) ends up empty or missing records, even records that were replicated a long time ago.
There are two solutions I've already considered:
Adapt the legacy script to only copy over new records, since the updates are incremental in nature.
Configure the replication to avoid DELETE statements.
The first solution is not possible in this scenario. The application is closed-source, and there's really nothing we can change.
The second solution would be ideal, but (A) would try to replicate records that already exist on (B), and I'm not quite sure how to handle that.
Surely there's a sound approach to this problem; I just haven't figured it out yet.
In SSMS go to Replication -> Local Publications and right click on your publication and select Properties. In the Publication Properties window click on 'Articles' and select the relevant article. Go to "Article Properties" and select "Set Properties of Highlighted Table Article". In the article properties window, change the "DELETE delivery format" to "Do not replicate DELETE statements".
After the change, click OK and you will see a prompt. Because the article property has been changed, the subscriptions need to be reinitialized. Click "Mark for Reinitialization", which causes the snapshot to be applied to the subscriber.
In SSMS, navigate to Replication and right click and select "Launch Replication Monitor". Go to your publication and click View Details to see the snapshot progress.
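Alternatively, the same article property can be changed with T-SQL via sp_changearticle; something along these lines should work (the publication and article names are placeholders):

EXEC sp_changearticle
    @publication = N'MyPublication',
    @article = N'MyArticle',
    @property = N'del_cmd',
    @value = N'NONE',            -- do not replicate DELETE statements
    @force_invalidate_snapshot = 1,
    @force_reinit_subscription = 1;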
I have two versions of the same database, say DB1 and DB2. DB1 is a copy made from the .mdf and the log file a month ago. The database structure and data have changed since then. I need to switch back and forth between these two copies in SQL Server Management Studio.
The structure of the Customer table in these versions is different. So it is easy to see which version is loaded in Management Studio.
I detach DB1 and attach DB2 and do select * on Customer and see the structure still belonging to DB1. How do I switch to DB2 properly?
I am using the right USE DB statement and have the right DB selected in the drop-down on the left-hand side for selecting databases.
The drop-down at the top controls which DB you are using, if the DBs are on the same instance of SQL Server. If they are different instances, use the "change connection" button at the top left, then pick your DB from the drop-down.
Somehow Management Studio was caching the location of the file. When attaching the database, I had to go to the "Current File Path", edit the wrong path, and point it to the right one. This had to be done even though, after clicking the Add button, I had already chosen the right .mdf file with the right path.
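If it helps, attaching via T-SQL with explicit file paths sidesteps the remembered location entirely (the database name and paths below are placeholders):

-- Attach with explicit file locations so no cached path is used
CREATE DATABASE DB2
ON (FILENAME = N'C:\Data\DB2.mdf'),
   (FILENAME = N'C:\Data\DB2_log.ldf')
FOR ATTACH;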
I've got an .rpt file that I did not write and can find no documentation about. I want to be able to review the SQL that is generated from this report so that I can figure out, well, what data it was pulling and what WHERE clause parameters were used.
I can open it up and see the report layout. But when I select Database|Show SQL Query... the report tries to connect to the data source. The problem is, the data source being used is unknown to me, probably an ODBC connection used by whoever wrote the query. All I can do at that stage is 'Cancel' and I'm back to looking at the report designer.
Am I missing something? Can I get to the SQL query without connecting to the datasource? It seems like viewing the selection criteria shouldn't be dependent on a data connection.
Thanks.
version: Crystal Reports 2008
I know that this is an old thread, but I encountered this same problem. Effectively, we used to have a database/application that has since been acquired by an external agency.
Although they now have the database/application they don't have access to crystal reports, so we can't just send them the old report that we used to run. Likewise we can't run it as we don't even have the database set up anywhere.... So instead our plan was just to extract the SQL code generated by the report and forward that on.
We experienced the same problem, but the solution is actually pretty simple.
If you don't have access to the original data source, just create a new 'blank' datasource (such as an ODBC connection). As long as the connection to the datasource works (i.e. it is some kind of valid datasource), it works fine. When running the 'Show SQL' option, point the report to this datasource. As long as you don't try to actually run the report (and only show the SQL), the operation won't fail. This worked for our situation anyway. (Crystal Reports 2008)
(I can give more details if it helps in any way.)
It should be possible to find out some details about the existing datasource by selecting Database > Set Datasource Location...
As well as enabling you to change the datasource location, this should show you some information about the current datasource, such as which type of datasource is being used, and possibly (dependent on the type of driver) the name of the database. It is likely to be less helpful if (as you surmise) the datasource is ODBC, but if it uses a native driver there may be something useful.
Without the password, I'm not sure how much you can do. It seems "Show SQL Query" requires the report to run first, then generate the SQL plan.
It's not ideal, but you could go to Database > Visual Linking Expert to at least see the tables and how they are joined, and then go to the Record Selection Formula Editor and see what the custom WHERE statements are.
Viewing the SQL of a Command in a Crystal Report File
There are times you have just the report file, but not the associated database structure that the report uses.
This is common when dealing with example reports of functionality you wish to mimic.
This is a workaround ONLY to allow you to see the SQL of a Command that a Crystal Report is based on, when you don't have the underlying database connection that the report is based on.
In essence, the dialog box has to be satisfied before it will show the SQL, so we fool it with a legitimate Data Source, just not one that would work with the SQL that is actually in the SQL Command.
Why does a report use a command? Doesn't Crystal Reports have the ability to link tables?
When a Crystal Report is based on a record set that is too complex for the table linking functionality within Crystal Reports, the report can instead be based on a SQL Query, usually developed/tested in another editor tool and pasted into the command. This allows advanced SQL functions to be utilized.
If you don't already have a Data Source on your computer set up that you can connect to, you will need to build one first.
A simple Microsoft Access .mdb file saved in a simple location will suffice.
I placed mine with the path C:\A_test\test.mdb to make it easy to find.
If you don't have one, google for a sample mdb file and download it, saving it with a name and location you can remember. (You won't ever actually open this file, but just connect to it.)
Once you have the file saved, open the ODBC Administrator and create a New Data Source.
(you can get to the ODBC Administrator quickly from Start > type ODBC in the Search)
On the User DSN tab, click the Add button.
Scroll down the driver list to Microsoft Access Driver (*.mdb), select it and click the Finish button.
In the Data Source Name box, type a name (I used MyTest).
Click the Select Button and select the mdb file you saved from a previous step, click OK.
Click OK again. You will see your new Data Source listed by the name you gave it. Click OK.
You now have the data source you will need for the next steps.
Open the Crystal Report you want to see the SQL command for, and click on Database Expert button or Database>Database Expert Menu.
Under Selected Tables, right-click on the Command and choose View Command.
The Data Source Selection Box appears. Select the Data Source you created (or one you already use) and click the Finish button. The View Command box should open with the SQL in the left pane. Copy the SQL into your favorite text editor.
What's happening is that Crystal Reports needs a database to connect to, regardless of whether it's the original source DB or not.
Create a local database or use a database stored on a server, add it to your ODBC Data Sources, and use it when connecting. After a successful connection you should be able to view the SQL query without an error.
So basically I'm building an app for my company and it NEEDS to be built using MS Access and it needs to be built on SQL Server.
I've drawn up most of the plans but am having a hard time figuring out a way to handle the auditing system.
Since it is being used internally only, and you won't even be able to touch the DB from outside the building, we are not using a login system; the program will only be used once a user has already logged in to our internal network via Active Directory. Knowing this, we automatically detect the name of the Active Directory user and, using their permissions in one of the DB tables, decide what they can or cannot do.
So the actual audit table will have 3 columns (this design may change, but for this question it doesn't matter): who (Active Directory user), when (time of addition/deletion/edit), what (what was changed).
My question is how I should handle this. Ideally, I know I should be using a trigger so that it is impossible for the database to be updated without an audit being logged; however, I don't know how I could grab the Active Directory user that way. An alternative would be to code it directly into the Access source so that whenever something changes I run an INSERT statement. Obviously that is flawed, because if something happens to Access, or the database is touched by something else, the audit will not be logged.
Any advice, examples or articles that may help me would be greatly appreciated!
Does this work for you?
select user_name(), suser_sname()  -- database user name and Windows login for the current connection
Doh! I forgot to escape my code.
Ok, it's working here. I'm seeing my windows credentials when I update my tables. So, I bet we missed a step. Let me put together a 1,2,3 sequence of what I did and maybe we can track down where this is breaking for you.
Create a new MSAccess database (empty)
Click on the tables section
Select external data
Pick ODBC database
Pick Link to the datasource by creating a linked table
Select Machine datasource
Pick New...
System Datasource
Pick SQL Server from the list and click Next, Finish.
Give the new datasource a name and description, and select (local) for the server. Click Next.
Pick "With Windows NT authentication using the network login ID". Click Next.
Check Change the default database to, and pick the DB. Click Next. Click Finish.
Test the datasource.
Pick the table that the Trigger is associated with and click OK.
Open the table in Access and modify one of the entries (the trigger doesn't fire on Insert, just Update)
Select * from your audit table
If you specify SSPI in your connection string to SQL Server, I think your Windows credentials are provided.
I tried playing with Access a bit to see if I could find a way for you. I think you can specify a new datasource to your SQL table, and select Windows NT Authentication as your connection type.
Sure :)
There should be a section in Access called "External Data" (I'm running a new version of Access, so the menu choice might be different).
From this there should be an option to specify an ODBC connection.
I get an option to Link to the datasource by creating a linked table.
I then created a Machine datasource. I selected SqlServer from the drop down list. Then when I click Next, I'm prompted for how I want to authenticate.
CREATE TRIGGER testtrigger1
ON testdatatable
AFTER UPDATE
AS
BEGIN
    -- Log when the change happened, the database user, and the Windows login
    INSERT INTO testtable (datecol, usercol1, usercol2)
    VALUES (GETDATE(), USER_NAME(), SUSER_SNAME());
END
GO
We also have a database system that is used exclusively within the organisation and uses Windows NT logins. This function returns the current user's login name:
CREATE FUNCTION dbo.UserName() RETURNS varchar(50)
AS
BEGIN
    -- Look up the Windows (NT) user name for the current connection (@@SPID)
    RETURN (SELECT nt_username FROM master.dbo.sysprocesses WHERE spid = @@SPID)
END
You can use this function in your triggers.
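For example, a trigger along these lines could call it (the audit table, trigger, and column names here are only illustrative):

CREATE TRIGGER trg_Customer_Audit
ON dbo.Customer
AFTER UPDATE
AS
BEGIN
    -- Record when the change happened and who made it, using the function above
    INSERT INTO dbo.AuditLog (AuditDate, NTUserName)
    VALUES (GETDATE(), dbo.UserName());
END
GO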
It should be
select user_name(), suser_sname()
i.e. replace the spaces with underscores (the underscores were stripped by the formatting).
You need to connect with integrated security, a.k.a. a trusted connection; see http://www.connectionstrings.com/?carrier=sqlserver
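A typical trusted-connection string looks roughly like this (the server and database names are placeholders):

Server=MyServer;Database=MyDatabase;Integrated Security=SSPI;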
How many users of the app will there be? Is there possibility of using windows integrated authentication for SQL authentication?
Updated: If you can give each user a SQL login (Windows integrated), then you can pick up the logged-on user using the SYSTEM_USER function.
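As a quick check of what it returns under Windows authentication:

SELECT SYSTEM_USER;  -- e.g. DOMAIN\username when connected with Windows authentication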
My solution would be not to let Access modify the data with linked tables.
I would only create the UI in Access and create an ADO connection to the server using Windows authentication in the connection string. Compile your Access application (MDE/ACCDE) to protect the VBA code.
I would not issue SQL statements directly; instead I would call stored procedures to perform the changes in the database and create the audit log entry in an atomic transaction.
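A rough sketch of such a procedure, with the data change and the audit entry committed together (all procedure, table, and column names here are assumptions, not part of the original design):

CREATE PROCEDURE dbo.usp_UpdateCustomerName
    @CustomerID int,
    @NewName    nvarchar(100)
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;  -- any error rolls back the whole transaction

    BEGIN TRANSACTION;

    UPDATE dbo.Customer
    SET    Name = @NewName
    WHERE  CustomerID = @CustomerID;

    -- The audit entry commits (or rolls back) together with the change
    INSERT INTO dbo.AuditLog (AuditDate, NTUserName, Action)
    VALUES (GETDATE(), SYSTEM_USER, 'Updated Customer ' + CAST(@CustomerID AS varchar(10)));

    COMMIT TRANSACTION;
END
GO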
The UI (Access) does not need to know the inner workings of the server. All it needs to do is request data and perform updates/inserts/deletes using the stored procedures you would create for this purpose. The server should handle the work.
Retrieve a record set with ADO using a view with the NOLOCK hint defined on the server, and cache this data in Access for local display. Or retrieve a single record and lock only that row for editing.
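A minimal example of such a read-only view (the view, table, and column names are placeholders):

CREATE VIEW dbo.vw_CustomerReadOnly
AS
SELECT CustomerID, Name
FROM dbo.Customer WITH (NOLOCK);
GO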
With linked tables, your users will end up blocking each other.
With ADO connections you will not have the trouble of setting up ODBC DSNs on every single client.
Create a table to hold the server status. Your application will check it before any action; you can use it to close the server to the application when you need to perform changes or maintenance.
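Something as simple as this would do (the table and column names are only illustrative):

CREATE TABLE dbo.ServerStatus
(
    IsOpen             bit           NOT NULL CONSTRAINT DF_ServerStatus_IsOpen DEFAULT (1),
    MaintenanceMessage nvarchar(200) NULL
);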
Access is a great tool. But it should only handle its local data and not be allowed to mess with the precious server.