I'm looking to see if there is a way to code PB to load a SQL script and save the output. I know how to code PB by writing the SQL directly in it, but I want to program it so I can do something like click Browse, select a .sql file, and save the results in a text file. I've been searching but can't seem to find what I'm looking for. Any guidance would be much appreciated. Thank you.
In a nutshell: open the desired file, read its contents into a string, use the SQL string to create a DataStore (see the SyntaxFromSQL method), execute the SQL via Retrieve, then save the results with the SaveAs method.
I have worked in SAS for much of my career and always found it easy to write a dataset out to CSV using PROC EXPORT. I am now working in SQL Server and am not finding any similar functionality. Everything I find on the web refers to copying/pasting the data into Excel or using an export wizard. I don't like those options, as the end goal is to automate this query and have the data output where it can be used by other programs. Is there any code-based way to achieve this?
Are you using SSMS? Press Ctrl+T, then execute your query and you'll get a plain-text result (the output format, including comma-delimited, is configurable under Tools -> Options -> Query Results). Alternatively, use Query -> Results To -> Results to Text, or right-click the results grid and select Save Results As.
For automation, you can use SQL Server Agent to schedule a job that writes the output of a query to a file.
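For a fully scripted route, the bcp command-line utility that ships with SQL Server can stream a query's result set straight to a flat file. The server, database, and query below are made-up placeholders, and the echo makes this a dry run you can inspect before executing for real:

```shell
# Hypothetical names -- replace with your own server, database and query.
SERVER="MYSERVER"
DATABASE="MyDb"
QUERY="SELECT col1, col2 FROM dbo.MyTable"

# -c = character mode, -t, = comma field terminator, -T = trusted connection
CMD="bcp \"$QUERY\" queryout output.csv -c -t, -S $SERVER -d $DATABASE -T"
echo "$CMD"   # dry run: remove the echo (or run the printed line) to export
```

A SQL Server Agent job step of type Operating system (CmdExec) can then run the same command on a schedule.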
I'm trying to find a specific string in my database. I'm currently using FlameRobin to open the FDB file, but this software doesn't seem to have a proper feature for this task.
I tried the following SQL query, but it didn't work:
SELECT
*
FROM
*
WHERE
* LIKE '126278'
So, what is the best way to do this? Thanks in advance.
You can't do such a thing. But you can convert your FDB file to text files such as CSV, so you can search for your string in all the tables/files at the same time.
1. Download a database converter
First, you need software to convert your database file. I recommend Full Convert for this. Just get the free trial and download it. It is really easy to use, and it exports each table to a separate CSV file.
2. Find your string in multiple files at the same time
For that task you can use the Find in Files feature of Notepad++ to search for the string in all the CSV files located in the same folder.
3. Open the desired table in FlameRobin
When Notepad++ finds the string, it shows which file it is in and the line number. Full Convert saves each CSV with the same name as the original table, so you can find the table easily in whatever database manager you are using.
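If you don't have Notepad++ handy, any shell gives you the same find-in-files step with grep. The CSV contents below are made-up stand-ins for the exported tables; the pattern is the string from the question:

```shell
# Stand-in CSV exports so the example is self-contained.
printf '1;"foo"\n2;"126278"\n' > CUSTOMERS.csv
printf '1;"bar"\n'             > ORDERS.csv

# -n prints the line number, so you get both the table and the row.
grep -n '126278' *.csv
# → CUSTOMERS.csv:2:2;"126278"
```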
Here is Firebird documentation: https://www.firebirdsql.org/file/documentation/reference_manuals/fblangref25-en/html/fblangref25.html
You need to read about:
stored procedures of the "selectable" kind,
the EXECUTE STATEMENT command, including its FOR EXECUTE STATEMENT variant,
the system tables with "RELATION" in their names (RDB$RELATIONS, RDB$RELATION_FIELDS).
Then, in your SP, you enumerate all the tables, then all the columns in those tables, and for each of them you run a usual
select 'tablename', 'columnname', columnname
from tablename
where columnname containing '12345'
over every field of every table.
But practically speaking, it would most probably be better to avoid SQL commands and just extract the whole database into a long SQL script, open that script in Notepad (or any other text editor), and search there for the string you need.
I have exported a table from another db into an .sql file as insert statements.
The exported file has around 350k lines in it.
When I try to simply run them, I get a "not enough memory" error before the execution even starts.
How can I import this file easily?
Thanks in advance,
Orkun
You have to manually split the SQL file into smaller pieces. Use Notepad++ or some other editor capable of handling huge files.
Also, since you wrote that you have ONE table, you could try a utility or editor that can automatically split the file into pieces of a predefined size.
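On any system with the standard split utility, no editor is needed at all. The file and chunk size below are tiny stand-ins for the real 350k-line export; note this only works safely because every INSERT sits on its own line:

```shell
# Build a small stand-in for the exported script.
for i in 1 2 3 4 5 6 7 8 9 10; do
  echo "INSERT INTO staff VALUES ($i);"
done > dump.sql

# Cut it into pieces of 4 lines each, named part_aa, part_ab, part_ac.
split -l 4 dump.sql part_

wc -l part_*
```

Each resulting piece is then small enough to run on its own (in order, if the inserts depend on each other).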
Use the SQLCMD utility (search the Microsoft documentation). With it you just need to supply a few parameters, one of them being the file path, e.g. sqlcmd -S <server> -d <database> -i script.sql -o results.txt. No need to go through the pain of splitting and other jugglery.
I want to ask: is it possible to read the data in an *.xls file and then update a *.dbf file based on the staff ID using VB.NET? If yes, do I need to convert the xls to dbf first and then do the update, or can I just read the xls and update the dbf file without converting?
You can just read the xls file using COM and then update the database with the data you read in from Excel; no need to convert anything.
That said, it might be simpler to convert the Excel data to an Access database and just use queries to update the data from there. It depends on how complex your data is and how often you have to do this task.
By the way, VB.NET is not a scripting language; it is a compiled language. I assume that by saying "script" you are thinking of VBA? What you are trying to do could just as well be done in VBA, though...
UPDATE:
Here are some details on how I think you should proceed with this. You can google all the stuff required for each step.
Reading in the data from Excel:
If you can, you should really save yourself a whole load of work: export the data to a CSV file and read it using a StreamReader.
If you really need to read it straight from Excel, you need to go the COM automation route, and you should have a look at the Microsoft.Office.Interop.Excel namespace. It is very similar to automating Excel via VBA, so if you are familiar with that it should be relatively easy.
Writing the data to your database:
This part is quite simple. Use the System.Data.OleDb namespace to create a connection to your database, then generate an OleDbCommand with the correct query to insert your data and, hey presto, you're done. If you have a complex database and want to manipulate a lot of data, you should look into adapting the LINQ to SQL framework to work with Access, or into how to use the IQToolkit.
I hope this helps.
UPDATE 2:
Just remembered that I wrote a method to write data to Excel a couple of weeks ago. It's not the same as reading data in, but it should be very similar... In fact reading data should be even simpler. See if this helps you get started:
You need Imports Excel = Microsoft.Office.Interop.Excel at the top of the file for this to work
Private Function ExportToExcelFile(ByVal FileName As String) As Boolean
    With New Excel.Application
        With .Workbooks.Add()
            ' Delete every sheet after the first; loop backwards so the
            ' collection's indexing is not disturbed while deleting.
            For i As Integer = .Worksheets.Count To 2 Step -1
                Call CType(.Worksheets(i), Worksheet).Delete()
            Next
            With CType(.Worksheets(1), Worksheet).Range("A1")
                ' Do stuff with the cells in your spreadsheet here
            End With
            Call .SaveAs(FileName)
        End With
        Call .Quit()
    End With
    Return True
End Function
For writing stuff to your database here is an example which I provided to somebody else recently.
I would like to speed up the process of reading data from a txt file. The txt file looks as follows:
"NameA";"407;410;500"
"NameB";"407;510"
"NameC";"407;420;500;600"
and I would like to have it as :
"NameA";"407"
"NameA";"410"
"NameA";"500"
"NameB";"407"
"NameB";"510"
"NameC";"407"
"NameC";"420"
"NameC";"500"
"NameC";"600"
Any thoughts on performing the task with a SQL stored procedure?
Thanks
Any thoughts? Yes... don't do it!
PL/SQL is only good for handling "database" objects - tables, rows, columns etc. Don't try to get it to do something it's not suited for - it will only lead to frustration and a bad solution.
Instead, use a shell script/command to massage your input file into a form directly and conveniently usable by your database.
I would suggest using sed or awk.
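As a sketch of the awk route, the whole unpivoting step is only a few lines. The field layout and separators are taken from the sample in the question:

```shell
# Recreate the sample input so the example is self-contained.
cat > input.txt <<'EOF'
"NameA";"407;410;500"
"NameB";"407;510"
"NameC";"407;420;500;600"
EOF

# Split each line on the '";"' divider, then fan the second part
# out on ';' so every value gets its own  "name";"value"  row.
awk -F'";"' '{
  name = substr($1, 2)                  # drop the leading quote
  vals = substr($2, 1, length($2) - 1)  # drop the trailing quote
  n = split(vals, a, ";")
  for (i = 1; i <= n; i++)
    printf "\"%s\";\"%s\"\n", name, a[i]
}' input.txt > output.txt

cat output.txt
```

Running this produces the nine "name";"value" rows shown in the question, and the result can then be bulk-loaded into the database directly.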