Endeca Forge Process - SQL

I am using a queries.xml file which contains 5 SQL statements.
I would like to know whether the 5 SQL queries are executed at the same time or one query at a time when the forge process runs.

The queries are executed one at a time as the file is processed. Also, the file is processed top down.

Related

How to check if an SQL Script executed successfully in MS SQL Server?

I have created multiple SQL DB maintenance scripts which I am required to run in a defined order. I have two scripts, and I want to run the second script only on successful execution of the first. The scripts contain queries that create tables, stored procedures, SQL jobs, etc.
Please suggest an optimal way of achieving this. I am using MS SQL Server 2012.
I am trying to implement this without using a SQL job.
I'm sure I'm stating the obvious, and it's probably because I don't fully understand what you meant by "executed successfully", but if you meant no SQL error while running:
The optimal way to achieve this is to create a job for your scripts and then create two steps - one for the first script and one for the second. Once both steps are there, go to the advanced options of step 1 and set it up to your needs.
(Screenshot of the job step's advanced options omitted.)
Can you create a SQL Server Agent job? You could set the steps to be Step 1: run the first script, Step 2: run the second script. In the agent job setup you can decide what to do when step 1 fails - just have it not run step 2. You can also set it up to email you, skip to another step, run some other code, etc. If anything the first script did failed with an error message, your second script would not run. If you really need to avoid a job, you could add some IF EXISTS checks to your second script, but that will get very messy very fast.
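A minimal T-SQL sketch of the agent-job approach described in the answers above, assuming hypothetical job, step, and script-file names; @on_success_action = 3 means "go to the next step" and @on_fail_action = 2 means "quit reporting failure", so the second script only runs when the first one succeeds:

    USE msdb;
    -- Hypothetical job name and script paths; adjust to your environment.
    EXEC dbo.sp_add_job @job_name = N'Maintenance - ordered scripts';

    EXEC dbo.sp_add_jobstep
        @job_name = N'Maintenance - ordered scripts',
        @step_name = N'Run script 1',
        @subsystem = N'CMDEXEC',
        -- sqlcmd -b makes the step fail when the script raises a SQL error.
        @command = N'sqlcmd -S . -E -b -i "C:\Scripts\Script1.sql"',
        @on_success_action = 3,   -- go to the next step
        @on_fail_action = 2;      -- quit the job reporting failure

    EXEC dbo.sp_add_jobstep
        @job_name = N'Maintenance - ordered scripts',
        @step_name = N'Run script 2',
        @subsystem = N'CMDEXEC',
        @command = N'sqlcmd -S . -E -b -i "C:\Scripts\Script2.sql"',
        @on_success_action = 1,   -- quit reporting success
        @on_fail_action = 2;

    EXEC dbo.sp_add_jobserver @job_name = N'Maintenance - ordered scripts';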
If the two scripts are in different files:
Add a statement to the first script that logs its completion and the date into a table. Change the second script to read this table first and exit if the first script did not succeed (a sketch follows below).
If both are in the same file:
Ensure they run in a transaction, read @@TRANCOUNT at the start of the second script, and exit if it is less than 1.
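A rough sketch of the log-table approach for separate files, using a hypothetical dbo.ScriptRunLog table:

    -- End of script 1: record successful completion (hypothetical table).
    IF OBJECT_ID('dbo.ScriptRunLog') IS NULL
        CREATE TABLE dbo.ScriptRunLog (ScriptName sysname, CompletedAt datetime2);
    INSERT INTO dbo.ScriptRunLog (ScriptName, CompletedAt)
    VALUES (N'Script1', SYSDATETIME());

    -- Start of script 2: exit unless script 1 logged a completion today.
    IF NOT EXISTS (SELECT 1 FROM dbo.ScriptRunLog
                   WHERE ScriptName = N'Script1'
                     AND CompletedAt >= CAST(GETDATE() AS date))
    BEGIN
        RAISERROR('Script 1 has not completed successfully today; aborting.', 16, 1);
        RETURN;
    END;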
SQL Server 2005’s job scheduling subsystem, SQL Server Agent, maintains a set of log files with warning and error messages about the jobs it has run, written to the %ProgramFiles%\Microsoft SQL Server\MSSQL.1\MSSQL\LOG directory. SQL Server will maintain up to nine SQL Server Agent error log files. The current log file is named SQLAGENT.OUT, whereas archived files are numbered sequentially. You can view SQL Server Agent logs by using SQL Server Management Studio.
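If you prefer to check the agent log from a query window rather than Management Studio, the (undocumented) xp_readerrorlog procedure can read it; the first argument selects the log file (0 = current) and the second the log type (1 = SQL Server, 2 = SQL Server Agent):

    -- Read the current SQL Server Agent error log.
    EXEC master.dbo.xp_readerrorlog 0, 2;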

Talend's tOracleInput does not read data

My colleague created a project in Talend to read data from an Oracle database.
I am using his project, so I have his job context with the connection parameters to the Oracle DB, and Talend connects successfully on that computer.
I've created a trivial job composed of two components: tOracleInput, which should read the data, and tLogRow, which should redirect the output to Talend's terminal.
The problem is that when I start the job, no data is output to the terminal; instead of the number of rows output per second I only see the Starting... status.
Could it be a connection issue, an inappropriate Java version on my computer, or something else?
The Starting... status means that the query is being executed. It usually takes a few seconds to execute a simple query against the database, because Oracle starts returning data before it has completed a full table scan. To benefit from this behaviour you can use joins and filters, but not GROUP BY / ORDER BY.
On the other hand, if you're querying a view, executing a complex query, or simply using DISTINCT, the query execution can take a few minutes, because the Oracle database has to generate the whole result set on the database side before returning any records.
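As a rough illustration with hypothetical tables, the first query below can start streaming rows as soon as they are found, while the second forces Oracle to materialize the full result set before the first row reaches Talend:

    -- Joins and filters only: rows can be returned as they are found.
    SELECT o.order_id, c.customer_name
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    WHERE o.order_date > DATE '2015-01-01';

    -- DISTINCT / GROUP BY / ORDER BY require the whole result set to be
    -- built first, so tOracleInput stays in "Starting..." much longer.
    SELECT DISTINCT c.customer_name
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    ORDER BY c.customer_name;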

SSIS Query Takes time

We have a situation where a query run through SSIS packages takes a long time during the weekend run, whereas it runs in 3 minutes during weekday runs. There is a considerable increase in the record count, but even with that it should run in at most 15 minutes.
We have found a temporary solution, but it is a manual effort that has to be performed before every weekend run.
The temporary solution is to run the SQL task's source query in SSMS before the package gets triggered from the job.
The manually run query also takes a long time to execute, but we abort it; this leaves an execution plan cached on that DB server.
After this, when the query runs from the package it executes in 3 minutes regardless of the number of records.
Kindly let us know whether a permanent fix can be done for this.
Thanks,
SANDY

Modify statement prior execution

I use an application connected to a SQL database. Using the profiler, I found that the application runs an update query with a syntax error. I don't have access to the application's source code. The result is that the record is not updated. Is there a way to modify the query every time it is executed, with something like a trigger? I can't use INSTEAD OF because no record is ever updated or inserted.
This answer
https://stackoverflow.com/a/3319031/1359088
suggests a way to log all the errors to a text file. You could write a little utility and schedule it to run every hour or so; it could read through this log, find the erroneous SQL statements, fix them, and then run them itself.
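As an alternative sketch (not the approach from the linked answer, and the session name and file path are assumptions), an Extended Events session can capture the text of failing statements so a scheduled utility can pick them up later:

    -- Capture the text of statements that raise errors (severity >= 16).
    CREATE EVENT SESSION capture_failed_statements ON SERVER
    ADD EVENT sqlserver.error_reported (
        ACTION (sqlserver.sql_text, sqlserver.client_app_name)
        WHERE severity >= 16)
    ADD TARGET package0.event_file (SET filename = N'C:\Temp\failed_statements.xel');
    ALTER EVENT SESSION capture_failed_statements ON SERVER STATE = START;

    -- Later, read the captured events (e.g. from the scheduled utility).
    SELECT CAST(event_data AS xml) AS event_data
    FROM sys.fn_xe_file_target_read_file('C:\Temp\failed_statements*.xel', NULL, NULL, NULL);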

SQL Job Occurring every Less than 10 seconds

Is it possible to make a job schedule that occurs more often than every 10 seconds?
SQL Server doesn't seem to allow that.
The schedule type is "Recurring" and Occurs is "Daily".
Select Occurs "Daily" and run every 10 seconds. Keep in mind that if your job takes longer than the specified interval to run, the agent won't invoke it again until the running task completes.
See the comments: the UI tool doesn't allow input of less than 10 seconds. In your case you could schedule two job tasks offset by some number of seconds. The issue then is that the jobs could overlap, as they are distinct tasks as far as SQL Agent knows.
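A rough sketch of the two-offset-jobs idea, with hypothetical job and schedule names: each schedule repeats every 10 seconds (the minimum the agent accepts), and starting the second one 5 seconds later gives an effective 5-second cadence, with the overlap caveat noted above.

    USE msdb;
    -- First schedule: every 10 seconds, starting at 00:00:00.
    EXEC dbo.sp_add_schedule
        @schedule_name = N'Every10s_offset0',
        @freq_type = 4, @freq_interval = 1,                  -- daily
        @freq_subday_type = 2, @freq_subday_interval = 10,   -- every 10 seconds
        @active_start_time = 000000;                         -- HHMMSS: 00:00:00

    -- Second schedule: every 10 seconds, starting at 00:00:05.
    EXEC dbo.sp_add_schedule
        @schedule_name = N'Every10s_offset5',
        @freq_type = 4, @freq_interval = 1,
        @freq_subday_type = 2, @freq_subday_interval = 10,
        @active_start_time = 000005;                         -- HHMMSS: 00:00:05

    -- Attach each schedule to one of two copies of the same job.
    EXEC dbo.sp_attach_schedule @job_name = N'MyTask_A', @schedule_name = N'Every10s_offset0';
    EXEC dbo.sp_attach_schedule @job_name = N'MyTask_B', @schedule_name = N'Every10s_offset5';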
I tried to do something similar (a SQL Server job with precise timing), but the only reliable way I found was a custom Windows service executing the SQL statements.