Reading a txt file using a stored procedure and manipulating it - sql

I would like to speed up a process of reading data from a txt file. The txt file looks like the following:
"NameA";"407;410;500"
"NameB";"407;510"
"NameC";"407;420;500;600"
and I would like to have it as:
"NameA";"407"
"NameA";"410"
"NameA";"500"
"NameB";"407"
"NameB";"510"
"NameC";"407"
"NameC";"420"
"NameC";"500"
"NameC";"600"
Any thoughts on performing the task with a SQL stored procedure?
Thanks

Any thoughts? Yes... don't do it!
PL/SQL is only good for handling "database" objects - tables, rows, columns, etc. Don't try to get it to do something it's not suited for - it will only lead to frustration and a bad solution.
Instead, use a shell script/command to massage your input file into a form directly and conveniently usable by your database.
I would suggest using sed or awk.
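For example, a minimal awk sketch (assuming the input is exactly as shown above; input.txt and flattened.txt are placeholder names):

    # strip the quotes, then emit one name/code pair per code
    awk -F';' '{
        gsub(/"/, "")                          # line becomes NameA;407;410;500
        for (i = 2; i <= NF; i++)
            printf "\"%s\";\"%s\"\n", $1, $i   # "NameA";"407" etc.
    }' input.txt > flattened.txt

The flattened file can then be bulk-loaded with your database's native loader (for Oracle, SQL*Loader or an external table).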

Related

When should we use Infile statement

I need a little help from SAS experts. I am trying to import Excel files into the SAS environment using the INFILE statement, but I am facing errors. Could you please enlighten me as to the situations in which the INFILE statement works properly?
Thanks in advance
An infile statement in a data step is typically used for reading text files, e.g. csv or fixed-width data. You can in theory read any file you like with one, but for things like Excel files that have a complex internal structure, you should use the tools that already exist in SAS for that purpose.
In your situation you should use a libname statement. This might be useful to you:
https://www.lexjansen.com/pharmasug-cn/2014/PT/PharmaSUG-China-2014-PT09.pdf
INFILE statements are for reading data from text files. XLSX files are zipped XML files.
There are two primary ways to read/import an XLSX file. One is to use PROC IMPORT and import the file directly into SAS. The second is to use the LIBNAME statement, which allows you to treat it more like a SAS data set from the start.
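A minimal sketch of both approaches (the file path, sheet name, and libref are placeholders, not from the question):

    /* Option 1: PROC IMPORT reads the workbook straight into a data set */
    proc import datafile="/data/report.xlsx"
                out=work.report
                dbms=xlsx
                replace;
        sheet="Sheet1";
    run;

    /* Option 2: the XLSX LIBNAME engine exposes each sheet as a data set */
    libname xl xlsx "/data/report.xlsx";
    data work.report;
        set xl.Sheet1;
    run;
    libname xl clear;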

Can you upload a sql script in powerbuilder and output the results?

I'm looking to see if there is a way to code PB to upload a sql script and to save the output. I know how to code PB by writing the sql directly in it, but I want to program it so I can do something like 'click browse', select .sql file, and save the results in a text file. I've been searching but can't seem to find what I'm looking for. Any guidance would be much appreciated. Thank you.
In a nutshell: you can open the desired file, read the contents into a string, use the SQL string to create a datastore (see the SyntaxFromSQL method), execute the SQL via a Retrieve, then save the results with the SaveAs method.
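A rough PowerScript sketch of those steps (untested; it assumes a connected SQLCA transaction object and a script containing a single SELECT with no retrieval arguments):

    // let the user pick a .sql file
    string ls_path, ls_name, ls_sql, ls_syntax, ls_err
    blob lblb_data
    integer li_file
    datastore lds

    if GetFileOpenName("Select SQL Script", ls_path, ls_name, &
            "sql", "SQL Files (*.sql),*.sql") <> 1 then return

    // read the whole file into a string
    li_file = FileOpen(ls_path, StreamMode!)
    FileReadEx(li_file, lblb_data)
    FileClose(li_file)
    ls_sql = String(lblb_data)

    // build a datastore from the SQL, retrieve, and save the results
    ls_syntax = SQLCA.SyntaxFromSQL(ls_sql, "style(type=grid)", ls_err)
    lds = create datastore
    lds.Create(ls_syntax, ls_err)
    lds.SetTransObject(SQLCA)
    lds.Retrieve()
    lds.SaveAs("results.txt", Text!, true)
    destroy lds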

What is the best way to import data using insert statements into a table in MS SQL?

I have exported a table from another db into an .sql file as insert statements.
The exported file has around 350k lines in it.
When I try to simply run them, I get a "not enough memory" error before the execution even starts.
How can I import this file easily?
Thanks in advance,
Orkun
You have to manually split the sql file into smaller pieces. Use Notepad++ or some other editor capable of handling huge files.
Also, since you wrote that you have ONE table, you could try a utility or editor which can automatically split the file into pieces of a predefined size.
Use the SQLCMD utility (search the Microsoft documentation). With it, you just need to give some parameters, one of them being the file path. No need to go through the pain of splitting and other jugglery.
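For example, something along these lines (server and database names are placeholders; -E uses Windows authentication and -i names the input script):

    sqlcmd -S myserver -d mydatabase -E -i inserts.sql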

Generate DDL SQL create table statement after scanning CSV file

Are there any command line tools (Linux, Mac, and/or Windows) that I could use to scan a delimited file and output a DDL create table statement with the data type determined for me?
Did some googling, but couldn't find anything. Was wondering if others might know, thanks!
DDL-generator can do this. It can generate DDLs from YAML, JSON, CSV, Pickle and HTML (although I don't know how the last one works). I just tried it on some data exported from Salesforce and it worked pretty well. Note you need to use it with Python 3; I could not get it to work with Python 2.7.
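The invocation looks roughly like this (the file name is a placeholder; check the project's README for the exact CLI):

    pip3 install ddlgenerator            # needs Python 3, per the note above
    ddlgenerator postgresql mydata.csv   # prints a CREATE TABLE statement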
You can also try https://github.com/mshanu/idli. It can take a csv file as input and generate a create statement with appropriate types. It can generate for mysql, oracle and postgres. I am actively working on this and am happy to receive feedback for future improvements.

Import part of MySQL dump (not all of it)

I'm going to do some stress tests and right now I have a really really huge MySQL dump file in hand that could be used as the benchmark.
There's only one table inside the dump.
What's awkward is that my server doesn't have that much disk space to actually hold this table. So I would like to just import some random part of the dump, not all of it.
Is it possible? If yes, what does the command line look like?
I have created a shell script for this. If you are on a unix based system, use
https://github.com/JoyceBabu/MySQL-Dump-Table-Extractor
Invoke the script using ./extract_table.sh sqlfile.sql
To extract a single table, type the table name.
To extract all tables from table1 to table2, type table1 table2.
To view a list of all available tables, type LIST.
MySQL dump files are simply text files full of SQL statements. Write a simple program to read the dump file and write random parts of it to a new dump file.
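For instance, a simple awk sketch (it keeps the first 1000 INSERTs rather than random ones for simplicity, and preserves the schema and trailing statements so the partial dump still loads cleanly; file names are placeholders):

    awk '/^INSERT INTO/ { n++; if (n > 1000) next } { print }' huge_dump.sql > partial_dump.sql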
Couldn't you just manually split the file? These are just flat text files... so open it up in your favorite text editor and delete half of the file (or however much you want).