How to capture Firebird SQL queries?

Is there any way to capture the SQL queries transmitted by an old application
created in Delphi/C++Builder + Firebird?
I don't have the source code of that client app or access to the (remote) database server.

Firebird 2.5 added the trace API, which can be used to track the prepare and execution of statements and a number of other things. The tools included in Firebird for using the trace API are rather basic, but they might well be sufficient for your needs. Be aware that by default the trace API limits the size of statements captured and logged, and it might take some time to tweak the trace configuration to get all the information you need.
An example configuration is:
<database mydatabase.fdb>
    enabled true
    log_statement_prepare true
    time_threshold 0
    max_sql_length 65536
</database>
This should capture all statement prepares with the full SQL query in the database mydatabase.fdb.
For more information, see Audit and Trace Services in Firebird 2.5.
There are several vendors who provide tools that utilize the trace API (for example FB TraceManager by Upscene Productions). As already mentioned in the comments, there is also FBScanner by IBSurgeon, which acts as a proxy between the client and a Firebird server and allows you to record the traffic (including statements).

Firebird includes a utility fbtracemgr.exe that can be used for tracing. Here's a sample command line:
cd "C:\Program Files\Firebird\Firebird_3_0"
fbtracemgr -start -service localhost/3050:service_mgr -config c:\temp\fb-trace.config -user sysdba -password <secret> >c:\temp\fb-trace.log
Discussion of parameters:
The -start parameter instructs the tool to start a trace session. There are other parameters; just run fbtracemgr.exe without any arguments to see a list of the possible ones.
The -service parameter tells the tool which service to trace. It is essential that you use the same connection method as the client that you want to monitor.
Let's say you use FlameRobin. In that case you probably have defined a database connection that uses TCP/IP and connects to localhost on the default TCP port 3050. To match this you have to prefix the service name with "localhost/3050".
If you want to trace an isql.exe session, then you probably let isql.exe connect without using localhost. In this case you have to omit the "localhost/port" prefix and just specify -service service_mgr.
The -config parameter specifies the path to the config file that contains the settings to be used for this trace session. Tracing must be configured with settings that define all the details of the trace, including what to trace; these settings can only be specified in the form of a configuration file.
The Firebird engine performs tracing of its own - the System Audit session - and for this purpose it includes a trace configuration file located in its program folder: C:\Program Files\Firebird\Firebird_3_0\fbtrace.conf. Use this file as inspiration/template; it contains many commented options explaining the purpose and syntax of each option. A minimal example is shown after this list.
The -user and -password parameters are necessary only if you want to monitor a TCP/IP connection. If you want to monitor direct connections without authentication (e.g. isql.exe) then you can omit the credentials.
The user you specify for tracing must, obviously, have the rights to "spy" on the connection being traced.
The example uses "sysdba", which of course has all the rights. The user of the connection being traced should also be fine.
The last part of the command redirects output to a trace log file. This is optional, but you'll probably want to do it because there can be a lot of output. You can open the trace log file in a text editor such as Notepad++, which will alert you when new content is written to the file.
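To tie this together, here is a minimal sketch of what c:\temp\fb-trace.config could contain for the Firebird 3 command above. This is an assumption based on the Firebird 3 configuration style, which differs from the 2.5 syntax shown in the previous answer; check the shipped fbtrace.conf for the authoritative option list.

database
{
    # Applies to all databases; use 'database = <regex>' to narrow it to one.
    enabled = true
    # Log the SQL text when statements are prepared and when execution starts.
    log_statement_prepare = true
    log_statement_start = true
    # 0 = log everything regardless of execution time.
    time_threshold = 0
    # Raise the default cap so long statements are not truncated.
    max_sql_length = 65536
}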

Sorry for necroposting :) but I had the same question. Nowadays there is also a trace/audit tool in the IBExpert IDE; it can be found in the Services menu.

Related

SSIS flat file folder permission error when NOT running from SQL Server Agent

Setup: A pretty standard data export SSIS package (SQL Server 2016 compatible), created in VS2019/Data Tools and deployed using the SSIS Project Deployment model to the Integration Services Catalog of a SQL Server 2016 instance. The package creates files in a network folder before sending the file out via FTP and putting a copy of the file in a Sent folder.
The project requirements include having the package running on a schedule using "default" parameter values, as well as allowing users to manually run the package using "non-default" parameter values from within a stand-alone application.
Current behavior: the package behaves correctly when run from a SQL Server Agent Job that is configured with a SQL proxy and credentials mapped to a domain login with the proper permissions for the network folder.
Problem: the Data Flow task fails to create the file with a "Cannot open the datafile" error when running the package directly using any of the following methods (even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job):
Using SSMS to right-click on the package and selecting Execute
Using the DTEXEC command-line utility
Using the SSISDB.catalog.start_execution SQL Server stored procedure
As far as I'm aware, these are the only methods capable of starting an SSIS package and changing the package's parameter values. I either need to get one of the latter two methods to work, find another option that allows changing the parameter values while launching the package, or use one of two techniques I'm aware of (detailed below) that would add yet another failure point to the process, as well as other potential issues.
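For illustration, a parameterized run of a catalog-deployed package via dtexec looks roughly like the following; the server, folder, project, and parameter names here are placeholders, not the actual ones from this project:

dtexec /ISServer "\SSISDB\MyFolder\MyProject\MyPackage.dtsx" /Server "MyServer" /Par "$Package::OutputFolder";"\\fileserver\export"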
Note: If the process is changed to initially create the file on the SQL Server's local hard drive, then the Data Flow task succeeds, but the later copy-to-Sent-folder task fails with a very similar permissions error.
Alternative #1: this technique requires creating a new table, loading the parameter values into the table, and changing the package to check the table and potentially set its parameters/variables based on what it finds (a sketch follows Alternative #2 below). The package can then be launched using a SQL Server Agent Job (for which there are multiple methods of manual launch), and if the calling object has correctly populated the table, the package will behave as if its parameters were changed at runtime; otherwise it will run with the default values.
Alternative #2: Change all folders used by the package to point to folders local to the SQL Server instance and then create a separate scheduled task/application/whatever, with the valid credentials, that would synchronize or move the files to their proper network folders.
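A minimal sketch of what Alternative #1's table could look like; the table and column names below are purely illustrative, not part of the existing design:

-- Hypothetical staging table: the caller writes the desired values here,
-- then starts the Agent job; the package's first task reads the unconsumed
-- row(s) and overrides its variables accordingly.
CREATE TABLE dbo.PackageRunRequest (
    RequestId      INT IDENTITY PRIMARY KEY,
    ParameterName  NVARCHAR(128)  NOT NULL,
    ParameterValue NVARCHAR(4000) NOT NULL,
    Consumed       BIT NOT NULL DEFAULT 0
);

-- The caller queues a non-default value before launching the job:
INSERT INTO dbo.PackageRunRequest (ParameterName, ParameterValue)
VALUES (N'OutputFolder', N'\\fileserver\export\adhoc');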
even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job
This is probably because the account is not logged on locally at the SQL Server, and so it's a Double-Hop Impersonation scenario, and would require Kerberos Constrained Delegation to be configured.
And you are correct in assessing the options. The general solution is to invoke catalog.start_execution from a session running on the SQL Server, and an Agent Job is the simplest built-in way to do this (the others being xp_cmdshell, Service Broker Activation, or SQL CLR).
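For reference, a minimal sketch of that invocation as run from an Agent job step (or any other session on the server); the folder, project, package, and parameter names are placeholders:

DECLARE @execution_id BIGINT;

-- Create an execution for the catalog-deployed package.
EXEC SSISDB.catalog.create_execution
    @folder_name  = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'MyPackage.dtsx',
    @execution_id = @execution_id OUTPUT;

-- Override a package parameter for this run (object_type 30 = package parameter).
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 30,
    @parameter_name  = N'OutputFolder',
    @parameter_value = N'\\fileserver\export';

EXEC SSISDB.catalog.start_execution @execution_id;

Run from an Agent job step under the proxy, this inherits the proxy's local logon, which is what avoids the double-hop problem.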

SQL Server: mapping network drive - Insufficient system resources exist to complete the requested service

Hello, I am trying to create a new maintenance plan on SQL Server to back up all my databases.
My goal is to back them up to a network drive, so that if I have trouble with my server, I will be able to restore the databases to another server from the backups on the network drive.
When my plan is executed, I get errors, so I tried to execute the corresponding query manually.
After some investigation, it seems even the net use command doesn't work from SQL Server (whereas it works when I run it from cmd):
EXEC XP_CMDSHELL 'net use Z: \\ServerName\loggin /user:loggin password'
The error is:
System error 1450 has occurred. Insufficient system resources exist to complete the requested service.
Besides, I have another server where this works, so I suppose some configuration is missing, but I can't find it.
As my network drive is also accessible via FTP, I chose this way to do the job: create a batch file that runs WinSCP, and use this batch file in a SQL Agent job. I needed to grant the SQL Server Agent account rights to the batch file, and I also had to define a credential and a proxy to be used in the job (a sketch follows below).
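For completeness, the credential and proxy setup mentioned above could look roughly like this; the account and object names are placeholders:

-- Credential holding the Windows account that has rights to the batch file and share.
CREATE CREDENTIAL FtpBackupCredential
    WITH IDENTITY = N'DOMAIN\backup_user', SECRET = N'<password>';

-- Agent proxy bound to that credential...
EXEC msdb.dbo.sp_add_proxy
    @proxy_name      = N'FtpBackupProxy',
    @credential_name = N'FtpBackupCredential';

-- ...and allowed to run CmdExec job steps (subsystem_id 3 = CmdExec).
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'FtpBackupProxy',
    @subsystem_id = 3;

The CmdExec job step that calls the batch file is then configured to run as FtpBackupProxy instead of the Agent service account.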

Execute Spoon job with software

I have a job built in Spoon which executes without problems from the command line, but I would like to know if there is any software in which I can execute these jobs and watch the execution visually. The idea is to make running these tasks more pleasant for the operations area.
You have two solutions:
Carte:
Use the Carte server, which is shipped with PDI. Install PDI on any server and launch Carte (specifying the port); then you can execute/view/stop/restart jobs/transformations from any browser. Documentation is here.
Of course you can also launch a job/transformation from your own PDI. Just define a new Slave server (left panel, View tab; default username/password = cluster/cluster). Then each time you run a job/transformation, choose the Carte server instead of Pentaho/local in the Run configuration.
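As a quick illustration, starting Carte and watching it from a browser could look like this (the host and port are arbitrary placeholders):

cd data-integration
carte.bat 127.0.0.1 8081

Once it is up, http://127.0.0.1:8081/kettle/status/ (default login cluster/cluster) lists every job/transformation the server is running.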
Logging
If you just want to follow a job/transformation, you may use database logging: right-click anywhere, then Parameters, Logging, Job/Transformation, and define a database, a table, and a logging interval of, say, 2 seconds.
Then every two seconds, fields such as lines_read, lines_written, errors, and log_field are written to that database table. The table can be read by an external process and displayed on screen or in a browser.
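For example, a monitoring process could poll the log table with something like the following; the table name is whatever you configured in the Logging tab, and the column names follow PDI's default transformation log table layout:

-- Show the most recent runs first, with progress counters and the captured log text.
SELECT ID_BATCH, TRANSNAME, STATUS, LINES_READ, LINES_WRITTEN, ERRORS, LOG_FIELD
FROM trans_log
ORDER BY ID_BATCH DESC;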
This method is used in the github/ETL-pilot project, which uses Tomcat (because you probably already have a Tomcat running with a Pentaho server), but it can easily be adapted to Node.js or any other server. (If you do it and open-source it, please add a link to your work on our GitHub.)

Exception has been thrown by the target of an invocation in SSIS

I have a SQL job created on SQL Server Agent with Type: Operating System (CmdExec).
The generated log file contains the following error:
Source: ST_CheckSrcFile
Description: Exception has been thrown by the target of an invocation.
The same command line executes correctly when run directly on the server through cmd.
Anybody have a clue why this could be happening?
Yes, you have a coding issue. What that issue is cannot be determined from your question's current lack of detail.
Since it works fine outside of Agent but fails from within, my prime assumption would be that you are accessing a file or network resource, and the account SQL Agent uses, or the designated proxy for Job Steps of type CmdExec, does not have access to that resource.
It could be resolved by simply using a UNC path instead of a mapped drive letter (e.g. \\fileserver\share\file.txt rather than Z:\file.txt), by granting the account rights to the file system on the local machine, or by a myriad of other approaches, but unless we know what the code is doing, we can't be more specific than this.

Using Web Deploy (msdeploy) to publish a WebMatrix site

I started building my site in WebMatrix and then switched to using VS2010 so I could have better Intellisense and debugging. I've been loading WebMatrix to deploy and it's been working fine.
However, loading WebMatrix is a PITA and I actually want more flexibility over the web deployment process.
So I started learning about msdeploy.exe and how to use it. I was able to successfully get the site to sync as I wanted with the following command line:
msdeploy.exe
-verb:sync
-dest:iisApp=MySite,wmsvc=www.mysite.com,username=administrator,password=blahblahblah
-allowUntrusted
-skip:absolutePath=webdeploy.cmd
-skip:absolutePath=web.config
-skip:objectName=dirPath,absolutePath="App_Data"
-skip:objectName=dirPath,absolutePath="bin"
-skip:absolutePath=vwd.webinfo
-source:iisApp="C:\Users\charlie\Documents\Visual Studio 2010\WebSites\MySite"
I had to use -allowUntrusted because the cert on the server uses a different host name than www. No biggie. I have some -skips for stuff I don't want to write to the dest as well.
It all works great.
I use SQL Server (Express) on my host (a WebMatrix AMI on AWS).
I want to have the ability to push my database to the host as well. I am trying to use the following msdeploy command:
msdeploy.exe
-verb:sync
-source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:....\MySite.mdf;User instance=true"
-dest:dbFullSql="Server=www.mysite.com\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=<pwd>"
This gives me:
Error: The database 'webmatrix_db' could not be created.
Error: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. ...
I think my problem is the connection string. I copied Server=.\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user;Pwd=<pwd> from the WebMatrix UI and prepended it with www.mysite.com, thinking it needed my hostname somewhere.
Obviously this is not correct and I can't find any examples of connection strings that work either.
Note that SQL is not exposed directly by this server. I assume WebMatrix's invocation of msdeploy is connecting using my admin credentials (not the SQL credentials) first and then msdeploy invokes the SQL commands on the remote host. I need something like the ...wmsvc=www.mysite.com,username=administrator,password=blahblahblah in the -dest option of the first example I gave above.
It would be awesome if I could see a log of how WebMatrix was invoking msdeploy.
What is the correct msdeploy command to do what I want?
[UPDATE - ANSWER]
One of the best things about StackOverflow is that posting a question really makes you think about what you are doing. Shortly after I posted the above, I realized the wmsvc=www.mysite.com,username=administrator,password=blahblahblah parameter in the -dest parameter was the key. The question became how to correctly add it to my specific example.
This msdeploy command line now connects correctly:
msdeploy.exe -verb:sync -source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:\Users\charlie\Documents\Visual Studio 2010\WebSites\Fiinom\App_Data\MySite.mdf;User instance=true" -dest:dbFullSql="Server=.\SQLEXPRESS;Initial Catalog=webmatrix_db;Uid=webmatrix_user; Pwd=rI2vP3rK6hV8nN8",wmsvc=www.mysite.com,username=administrator,password=blahblahblah -allowUntrusted
Now that msdeploy is connecting successfully and executing commands, I need to figure out how to make it actually merge the database. Right now it's giving me an error that a table already exists and can't create it...
This is related to your side comment on merging the database...
Currently Web Deploy does not have any provider that supports database merges - the dbFullSql provider uses SQL Server Management Objects ("SMO") to script out the db contents, which we then apply on the other side. Effectively, Web Deploy will only overwrite the destination db with the source db.
If you are okay with that as a "merge" you can get around that table already exists error by using SMO scripting options - this is what WebMatrix does to make the db publishing/downloading work. To your source just add:
,scriptDropsFirst=true
(this scripts out drops for all the objects in your source database so that if they exist on the destination they will get dropped and won't block you)
You might also need:
,copyAllUsers=false
(if you aren't a sysadmin on the remote SQL database, chances are you won't be able to create logins, which are a server-level action. Typically if you don't use this setting, you'll get an error about creating a login, or that a login doesn't exist, because SMO scripts out your database user as "for LOGIN " and that login doesn't exist on the server)
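Putting both options together, the -source argument of the working command above would become something like this (connection string shortened here for readability):

-source:dbFullSql="data source=.\SQLEXPRESS;Integrated Security=SSPI;AttachDBFilename=C:\...\MySite.mdf;User instance=true",scriptDropsFirst=true,copyAllUsers=false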
Hope that helps!
Kristina