How to display the result of a procedure running on the server side in Oracle? - sql

I am trying to automate a daily monitoring activity in which a set of scripts (all SELECT statements) has to be executed. I am in the process of creating a procedure that runs these scripts, and a scheduler will run it once a day. My problem is that, since all of this happens on the server side (server backbone), how do I save the results? Previously we ran all the scripts manually and saved the output in a notepad. Is there any way to do the same with the automated version, such as saving to our PC or to SQL Developer, instead of logging in to the server and searching for the path where the file is saved? I thought of saving the results in a table, but I am looking for a better option. Please suggest...

Generally it is a good idea to save the results in a table, as this gives you flexibility when querying the results or exporting them in multiple formats (a minimal sketch of this approach follows the list below).
There are multiple options to get the data to the client:
Query the table with the results from the client
Generate an HTML page from the results table and make it accessible from an HTTP server.
You can also create a web PL/SQL package and generate the HTML within (http://docs.oracle.com/cd/B28359_01/appdev.111/b28424/adfns_web.htm#i1006207)
Export the data from the results table to a file and put it in a shared directory that is accessible by the client.
Email the results from the PL/SQL package.
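As a minimal sketch of the "results table" option (the table name daily_check_results, the procedure run_daily_checks and the sample check are assumptions, not anything from the question), each monitoring SELECT simply becomes an INSERT ... SELECT inside a procedure that DBMS_SCHEDULER runs once a day:

    -- results table; names are placeholders
    CREATE TABLE daily_check_results (
      run_date    DATE DEFAULT SYSDATE,
      check_name  VARCHAR2(100),
      result_text VARCHAR2(4000)
    );

    CREATE OR REPLACE PROCEDURE run_daily_checks IS
    BEGIN
      -- each monitoring SELECT is rewritten as an INSERT ... SELECT
      INSERT INTO daily_check_results (check_name, result_text)
      SELECT 'invalid_objects', object_name || ' (' || object_type || ')'
      FROM   user_objects
      WHERE  status = 'INVALID';
      COMMIT;
    END;
    /

    -- run it once a day
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'DAILY_CHECKS_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'RUN_DAILY_CHECKS',
        repeat_interval => 'FREQ=DAILY; BYHOUR=6',
        enabled         => TRUE);
    END;
    /

From the client you then only need SELECT * FROM daily_check_results WHERE run_date >= TRUNC(SYSDATE), which SQL Developer can export to CSV or Excel.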

I thought of saving the results in a table but i am looking for a better option.
What exactly is the issue with the "table" option?
Regarding "saving in our PC or SQL Developer": a couple of problems with a PC/app screen are:
a PC is usually less resilient to reboots, crashes, etc.;
it's intended for private use. Unless you're working alone, these logs may be of interest to other people.
Other options: the job can be made to send e-mail; copy the file to a well-known place (including one that is directly mounted on your PC); write to a database table (as already suggested); and more.
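For the "copy the file to a well-known place" option, here is a hedged sketch using UTL_FILE; the directory object MONITOR_DIR, the server path, the file name and the daily_check_results table are all assumptions, and the directory must point to a location that is shared with (or mounted on) your PC:

    -- one-time setup: directory object pointing to a shared server path (placeholder)
    CREATE OR REPLACE DIRECTORY monitor_dir AS '/u01/app/monitoring';

    DECLARE
      f UTL_FILE.FILE_TYPE;
    BEGIN
      f := UTL_FILE.FOPEN('MONITOR_DIR', 'daily_checks.txt', 'w');
      FOR r IN (SELECT check_name, result_text
                FROM   daily_check_results
                WHERE  run_date >= TRUNC(SYSDATE)) LOOP
        UTL_FILE.PUT_LINE(f, r.check_name || ': ' || r.result_text);
      END LOOP;
      UTL_FILE.FCLOSE(f);
    END;
    /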

Related

SQL database copied and updated

I have a main system that I use to add records to and run multiple routines on an SQL database using MS Access. Some routines take days to run.
I want to build a second PC whose copy of the database I can hopefully update easily, and then run the long routines on it while continuing to keep up the day-to-day activities on the main system.
How easy (or feasible) is it to take a copy of an SQL database from one computer and update it on another?
Those processing times sound rather large. I would suggest that you consider building some server-side routines that run on SQL Server – they will run much faster and, more importantly, reduce if not eliminate most of the network traffic. Keep in mind that working with lots of data by using Access against SQL Server can and often will run SLOWER than just using Access without SQL Server.
As for moving the SQL database to another computer? The idea and concept are very much like moving an Access file to another computer. From SQL Server Management Studio on the first computer you simply create a backup file (it will be a single file – choose a device, and then add a file location). You will also find that such .bak files zip REALLY well, so if you are using FTP or some other internet transfer, zip the file before you transmit it.
Regardless, you can even transfer that .bak file with a jump drive or whatever. You then restore the database on the other computer – and you're off to the races (on the target computer, simply choose "restore" and restore the database from the .bak file you transferred to that machine running SQL Server). So the whole database, with its many tables etc., is turned into a single file – much like an Access database is.
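A minimal sketch of that backup/restore round trip in T-SQL; the database name, file paths and logical file names are placeholders:

    -- on the source machine
    BACKUP DATABASE MyAppDb
    TO DISK = 'C:\Backups\MyAppDb.bak'
    WITH INIT;

    -- copy MyAppDb.bak to the second PC, then on that machine
    RESTORE DATABASE MyAppDb
    FROM DISK = 'C:\Backups\MyAppDb.bak'
    WITH MOVE 'MyAppDb'     TO 'C:\Data\MyAppDb.mdf',
         MOVE 'MyAppDb_log' TO 'C:\Data\MyAppDb_log.ldf',
         REPLACE;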
So moving and making copies of an SQL database is a common task, and once you get the hang of this process, it's not really much harder than moving an Access file between computers.
I would, however, question why the processes are taking so long. They may well be complex, but the use of stored procedures and pass-through queries would substantially speed up your application as opposed to processing the data with a local client like Access + VBA. So try adopting more T-SQL and stored procedures – they will run "many" times faster; often 1 hour can be cut down to, say, 1 minute or less. So moving is easy, but you might cut a whole "day" of processing down to a few minutes if you can run the processing routines server-side as opposed to client-side, which is what happens when Access is the client. (The Access client can most certainly call T-SQL routines that run server-side – the main trick here is to get those processing routines running on the server.)
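As a hypothetical illustration only (the table, column and procedure names are invented, not from the question), a row-by-row Access/VBA loop often collapses into one set-based statement inside a stored procedure, which Access can then call via a pass-through query (EXEC dbo.RecalculateInvoiceTotals):

    CREATE PROCEDURE dbo.RecalculateInvoiceTotals
    AS
    BEGIN
        SET NOCOUNT ON;

        -- one set-based UPDATE instead of looping over a recordset in VBA
        UPDATE i
        SET    i.Total = d.LineTotal
        FROM   dbo.Invoices i
        JOIN  (SELECT InvoiceID, SUM(Qty * UnitPrice) AS LineTotal
               FROM   dbo.InvoiceDetails
               GROUP  BY InvoiceID) d
              ON d.InvoiceID = i.InvoiceID;
    END;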
Last time I used MS Access, it was just a file, like an Excel file, which you could simply copy to the other machine and do whatever you want with.
If you are fairly comfortable with SQL/administration and only need a one-way copy (main system to second PC), then you could migrate MS Access to MySQL:
http://www.kitebird.com/articles/access-migrate.html
(Telling Microsoft Access to Export Its Own Tables)
This process should be fairly easy to automate if you need to do this regularly.

SQL Server database: amalgamate 90 database update scripts into a single script

I have an application that has been released for several years. I have updated the SQL Server database around 90 times and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler, is there an easy way to do this? I presume I could simply grab the latest database after it has run through all 90 updates via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C# / .NET, which looks for embedded SQL scripts named in the format XX_update.sql on startup and calls each script one by one, i.e.
1_update.sql - this creates the tables and initial data etc. This was my initial release database.
2_update.sql - updates to the initial database, such as adding a new SP or changing a column datatype, etc.
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand-new database by running through all 90 update scripts, then take this database and convert it into a single script with which I can replace all 90 scripts above.
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
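One hedged sketch of how to keep the individual updates safe to walk through (and safe to paste into one combined file) is a version-tracking table; dbo.SchemaVersion and the script number 57 are assumptions, not part of the original setup:

    IF OBJECT_ID('dbo.SchemaVersion') IS NULL
        CREATE TABLE dbo.SchemaVersion (
            VersionNo INT      NOT NULL PRIMARY KEY,
            AppliedOn DATETIME NOT NULL DEFAULT GETDATE()
        );
    GO

    -- pattern used at the top of, e.g., 57_update.sql
    IF NOT EXISTS (SELECT 1 FROM dbo.SchemaVersion WHERE VersionNo = 57)
    BEGIN
        -- ... the actual changes for update 57 go here ...
        INSERT INTO dbo.SchemaVersion (VersionNo) VALUES (57);
    END
    GO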
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts when they had to be deployed by another person, with no problems.
In SQL Server, the rule is that as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem in amalgamating them, because they still look like separate scripts to SQL Server (instead of being sent to SQL Server as one big script, the tool sends them in batches, using GO as the batch separator).
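For example, a combined file might look like the sketch below (the contents of each script are placeholders); SSMS and sqlcmd split the file on GO, so each original script still runs as its own batch:

    -- contents of 1_update.sql
    CREATE TABLE dbo.Customer (CustomerID INT PRIMARY KEY, Name NVARCHAR(100));
    GO
    -- contents of 2_update.sql
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(255) NULL;
    GO
    -- contents of 3_update.sql
    CREATE PROCEDURE dbo.GetCustomer @CustomerID INT AS
        SELECT CustomerID, Name, Email FROM dbo.Customer WHERE CustomerID = @CustomerID;
    GO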
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; maybe there is an option to change this). That is why, when possible, I prefer to run them separately: to err on the safe side, stop if there is a problem, and easily locate the problematic command.
Also, for new deployments your best option is what you suggest: use a database that has already been updated instead of taking the original one and applying all the updates. And to avoid being in this situation again, you can keep an empty template database that is not used for any testing and update it whenever you update the rest of the databases.
Are you running your updates manually on the server? Instead, you can create a .bat file, a PowerShell script, or an exe. The update scripts in your folder can be numbered, and the script can just pick up the updates one by one and execute them over the database connection instead of you doing it.
If you have multiple script files and you want to combine them into one: rename them as .txt, combine them, and rename the resulting file as .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
This is easy. Download SQL Server Data Tools and use that to create the deployment script. If there's no existing database, it will create all the objects; if targeting an older version, it will perform a diff against the target database and create scripts that create the missing objects and alter the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it as part of an install.

Programmatic Export with Indeterminate Table Structure from SQL Server 2008 using T-SQL

I have searched for an answer to this, and one seems not to exist.
Problem:
A website is querying a database and is unable to return results (as an export to Excel) in a timely fashion, primarily because of the result set size. I'd like to set up a background process to 'ping' for waiting queries and execute them one by one, dumping the data into a location it can be downloaded from. The 'pinging' task can be handled in a whole host of ways. My original ideal solution was a trigger (alternatively, a SQL Server Agent task) that exported the data to the filesystem, but I have run into an issue: I don't know how to set up an amorphous output to the filesystem with a simple T-SQL statement.
SSIS is apparently the standard solution to this. I don't know enough about SSIS to know whether it will handle what I want it to do, but I have been told the queries are too great in number and too varied in output for that to be a feasible solution.
xp_cmdshell can be run to do a BCP export. This works fine, but apparently opens a security hole.
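For reference, a minimal sketch of that xp_cmdshell + BCP approach (the database, query and output path are placeholders; xp_cmdshell must be enabled first, which is exactly the security concern):

    DECLARE @cmd VARCHAR(4000) =
        'bcp "SELECT * FROM MyDb.dbo.BigReport" queryout "D:\exports\bigreport.csv" -c -t, -T -S localhost';
    EXEC master..xp_cmdshell @cmd;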
Previous solutions:
A solution I used years ago, DTS passing data straight to the operating system, seems to have been disabled in SQL Server 2008/2012. I also used to be able to use sp_makewebtask to export data directly to the filesystem, but I can no longer do that either.
Current solution
I am writing a PowerShell script tied to some SQL tables and stored procedures to manage execution. This seems like a non-ideal solution; I'm curious as to whether I have missed something. Is there an easy way to set up SSIS to export data without a structure? A way to create an Excel file on the fly and fill it with data?
The answer seems to be No.
You can export to CSV instead of Excel (Excel opens CSV files easily), but CSV files don't carry any formatting. You can set up SSIS (or BCP in a scheduled task) to export into the CSV file a single column that already contains the commas and the text delimiters, so the data will be presented by Excel in separate columns.
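A hedged sketch of that "single pre-formatted column" idea (table and column names are invented): the SELECT builds each CSV line itself, so BCP or SSIS only has one column to export:

    SELECT '"' + REPLACE(CustomerName, '"', '""') + '",'
         + CONVERT(VARCHAR(10), OrderDate, 120) + ','
         + CONVERT(VARCHAR(20), OrderTotal) AS csv_line
    FROM   dbo.Orders;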

Race condition caused by cursors persisted to temp files - is it possible?

I'm troubleshooting a problem with a Visual Fox Pro application (built with the Visual Fox Express framework) which I suspect is being caused by a race condition. The application is being hosted on a Citrix XenApp server and under certain conditions, data displayed on a certain form appears to be incorrect, and changes to something other than what the user is entering.
The form in question displays a list of records returned from a query on a SQL Server database based on certain information entered by the user.
If this is what is happening I suspect the sequence of events is something like this:
1) User 1 enters data and causes the form to display a grid of results returned from the database.
2) User 2 opens the same form in a different Citrix session and enters data, causing the form to display a grid of results returned from the database. This cursor gets persisted to disk and overwrites, or somehow conflicts with, User 1's cursor for that form.
3) Some FoxPro cursor mechanism in User 1's instance sees the changed data in the cursor (from User 2) and updates the screen with data from that cursor.
I don't know much about how FoxPro works, but from what I understand, in some circumstances a cursor will be persisted to a temp file. On our Citrix application server this temp folder may be shared by between 10 and 50 users. I'm looking for information about whether a race condition caused by a cursor written to a file in the temp folder is even possible, so that I can continue researching down that path or rule it out definitively.
I know there are ways to make the FoxPro temp files get written to a different folder for each user, and I am working on making that change, but I would like to find out if anyone else has seen a similar problem or thinks that what I suspect is actually possible.
It does sound strange, but yes, FoxPro creates temp tables for the cursors it uses for display and query results, such as local or remote data access. However, when created, they are created as read-only or read-write, but ONLY for that person, per connection. When a cursor is created, FoxPro generates a random file name for the results and uses that as the .dbf cursor it presents to the user.
COULD IT be a race condition? I doubt that, but not knowing the specifics of the quite old Visual FoxExpress framework, I don't know what or where you would configure to have it dynamically use a different location for temp files. It should be going to the temp-file path from the Windows environment variables. So, if users of the Citrix connection are using the same user/password for multiple sessions, yes, it would go to the same location; but when trying to generate the temp file, FoxPro would fail to get an exclusive handle and would try again with the next random file name.
I'd say very unlikely that temp files are implicated here. Each cursor you create uses a different temp file; I don't see how two users, even in a Citrix-type situation, would share a single temp file.

Free Software to Allow Users to Run Oracle SQL Scripts but NOT create them

We would like to allow users to run custom Oracle 11g SQL scripts that have been created for them, complete with parameter prompts, and get a CSV extract of the resulting dataset. Right now, I just use SQL*Plus and SQL Developer to do those things, but those tools would allow the creation of custom scripts as well, and we do not want users to try to create custom queries.
In many cases we intend to fulfill this need with Crystal Reports/Crystal Server, but we use CR XI, and sometimes very WIDE extracts are difficult to create because of the page-size limitations. It also has a limit on the number of concurrent users, and sometimes we may need more.
Does anyone know of a FREE tool that can allow users to execute Oracle SQL Scripts and get file exports as a result and yet will NOT allow them to create new Scripts?
NOTE: We have a Citrix environment and therefore are able to limit where the script files are located and what access users have to those files and folders.
Given that a SQL script is just a text file, I'm not sure I see how this could be possible but perhaps I'm missing something about how you see something like SQL*Plus allowing the creation of custom scripts. If you give me any tool that runs SQL scripts, I can always open my favorite text editor, write a SQL script, and have your tool run it (assuming that you allow users to create new files in your Citrix environment or to map a file from their local machines).
Personally, I'd probably create a small APEX application in the database that presents a menu letting users pick an export. Behind the scenes, the APEX app would run whatever SELECT was necessary (I'd generally create a CLOB in the database rather than a file on the file system, unless you're making a great deal of use of SQL*Plus formatting commands in your scripts) and would allow the user to download the file (or use some alternate file delivery mechanism such as email).
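A rough sketch of the CLOB idea (the function name, the emp demo table and its columns are placeholders; an APEX page or process would supply the parameter prompts and the actual download):

    CREATE OR REPLACE FUNCTION employee_extract_csv RETURN CLOB IS
      l_csv CLOB := 'EMPNO,ENAME,SAL' || CHR(10);
    BEGIN
      -- append one CSV line per row
      FOR r IN (SELECT empno, ename, sal FROM emp) LOOP
        l_csv := l_csv || r.empno || ',' || r.ename || ',' || r.sal || CHR(10);
      END LOOP;
      RETURN l_csv;
    END;
    /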
I use jasper reports for that: http://jasperforge.org/index.php?q=project/jasperreports