When should we use Infile statement - file-io

I need a little help from SAS experts. I am trying to import Excel files into the SAS environment using the INFILE statement, but I am facing errors. Could you please enlighten me as to which situations the INFILE statement works properly in?
Thanks in Advance

An INFILE statement in a DATA step is typically used for reading text files, e.g. CSV or fixed-width data. You can in theory read any file you like with one, but for things like Excel files, which have a complex internal structure, you should use the tools that already exist in SAS for that purpose.
In your situation you should use a LIBNAME statement. This might be useful to you:
https://www.lexjansen.com/pharmasug-cn/2014/PT/PharmaSUG-China-2014-PT09.pdf

INFILE statements are for reading data from text files. XLSX files are zipped XML files.
There are two primary ways to read/import an XLSX file. One is to use PROC IMPORT and import the file directly into SAS. The second is to use the LIBNAME statement, which allows you to treat the file more like a SAS data set from the start.
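A minimal sketch of both approaches, assuming the workbook lives at `C:\data\sales.xlsx` and contains a sheet named `Sheet1` (both names are placeholders):

```sas
/* Option 1: PROC IMPORT copies the sheet into a SAS data set */
proc import datafile="C:\data\sales.xlsx"
    out=work.sales
    dbms=xlsx
    replace;
    sheet="Sheet1";
run;

/* Option 2: the XLSX LIBNAME engine exposes each sheet as a data set */
libname xl xlsx "C:\data\sales.xlsx";

data work.sales;
    set xl.Sheet1;
run;

libname xl clear;
```

With the LIBNAME approach you can reference the sheet in any DATA step or procedure without a separate import step, which is convenient when the workbook is refreshed regularly.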

Related

How to create format files using bcp from flat files

I want to use a format file to help import a comma-delimited file using bulk insert. I want to know how you generate format files from a flat file source. The Microsoft guidance on this subject makes it seem as though you can only generate a format file from a SQL table. But I want it to look at a text file and tell me what the delimiters are in that file.
Surely this is possible.
Thanks
The format file can, and usually does, include more than just the delimiters. It also frequently includes column data types, which is why it can only be automatically generated from the table or view the data is being retrieved from.
If you need to find the delimiters in a flat file, I'm sure there are a number of ways to write a script that could accomplish that, as well as create a format file.
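To illustrate both halves: bcp generates a format file from the table side with something like `bcp MyDb.dbo.MyTable format nul -c -t, -f MyTable.fmt -T` (database, table, and file names are placeholders). For detecting the delimiter in the flat file itself, a small script is enough; here is a sketch using Python's `csv.Sniffer`:

```python
import csv

def detect_delimiter(path, candidates=",;\t|"):
    """Guess the field delimiter of a flat file by sampling its first bytes.

    `candidates` restricts the sniffer to plausible delimiters so stray
    punctuation inside values does not confuse it.
    """
    with open(path, newline="") as f:
        sample = f.read(4096)
    dialect = csv.Sniffer().sniff(sample, delimiters=candidates)
    return dialect.delimiter
```

Once you know the delimiter, you can hand it to the `-t` switch of bcp (or `FIELDTERMINATOR` of `BULK INSERT`) when building the format file.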

What is the best way to import data using insert statements into a table in MS SQL?

I have exported a table from another db into an .sql file as insert statements.
The exported file has around 350k lines in it.
When I try to simply run them, I get a "not enough memory" error before the execution even starts.
How can I import this file easily?
Thanks in advance,
Orkun
You have to manually split the SQL file into smaller pieces. Use Notepad++ or some other editor capable of handling huge files.
Also, since you wrote that you have ONE table, you could try a utility or editor which can automatically split the file into pieces of a predefined size.
Use the SQLCMD utility; see the Microsoft documentation. With it you just need to give a few parameters, one of which is the file path, so there is no need to go through the pain of splitting and other jugglery.
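A sketch of both suggestions from the command line; the `split` chunk sizes and the server/database names in the commented sqlcmd call are placeholders:

```shell
# Demo: create a small stand-in for the exported script
# (the question's real file has ~350k lines).
seq 1 1000 | sed 's/^/INSERT INTO t VALUES (/;s/$/);/' > inserts.sql

# Split it into 250-line chunks named part_aa, part_ab, ...
split -l 250 inserts.sql part_

# Each chunk can now be opened or executed separately. Alternatively,
# skip splitting entirely and let sqlcmd stream the original file from
# disk instead of loading it into the editor's memory:
# sqlcmd -S myserver -d mydb -E -i inserts.sql
```

sqlcmd reads the input file incrementally, which is why it avoids the "not enough memory" error that Management Studio hits when opening the whole script at once.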

Writing data flow to postgresql

I know that by doing:
COPY test FROM '/path/to/csv/example.txt' DELIMITER ',' CSV;
I can import csv data to postgresql.
However, I do not have a static CSV file. My CSV file gets downloaded several times a day, and it includes data which has previously been imported into the database. So, to get a consistent database I would have to leave out this old data.
My best-case idea would be something like the above; the worst case would be a Java program that manually checks each entry of the database against the CSV file. Any recommendations for the implementation?
I really appreciate your answer!
You can load the latest data into a temp table using the COPY command and then merge the temp table into the live table.
If you are using a Java program to execute the COPY command, try the CopyManager API.
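A sketch of the staging approach, assuming the live table `test` has a primary key column `id` (an assumption; adjust to your real key). On PostgreSQL versions without `MERGE` (pre-15), `INSERT ... ON CONFLICT` does the deduplication:

```sql
-- Stage the fresh download next to the live table.
CREATE TEMP TABLE staging (LIKE test INCLUDING ALL);

COPY staging FROM '/path/to/csv/example.txt' DELIMITER ',' CSV;

-- Rows that were imported by an earlier download hit the primary-key
-- conflict and are silently skipped; only genuinely new rows land.
INSERT INTO test
SELECT * FROM staging
ON CONFLICT (id) DO NOTHING;

DROP TABLE staging;
```

This keeps all the comparison work inside the database, so a Java client only needs to run these statements (or feed the COPY via CopyManager) rather than diff the file row by row.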

Reading txt file using stored procedure and manipulation with the same

I would like to speed up a process of reading data from a txt file. The txt file looks as follows:
"NameA";"407;410;500"
"NameB";"407;510"
"NameC";"407;420;500;600"
and I would like to have it as:
"NameA";"407"
"NameA";"410"
"NameA";"500"
"NameB";"407"
"NameB";"510"
"NameC";"407"
"NameC";"420"
"NameC";"500"
"NameC";"600"
Any thoughts on performing the task with a SQL stored procedure?
Thanks
Any thoughts? Yes... don't do it!
PL/SQL is only good for handling "database" objects - tables, rows, columns etc. Don't try to get it to do something it's not suited for - it will only lead to frustration and a bad solution.
Instead, use a shell script/command to massage your input file into a form directly and conveniently useable by your database.
I would suggest using sed or awk.
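For this particular reshaping, a one-line awk program is enough. A sketch, using a throwaway input file with the sample rows from the question (`names.txt` is a placeholder name):

```shell
# Recreate the sample input from the question.
cat > names.txt <<'EOF'
"NameA";"407;410;500"
"NameB";"407;510"
"NameC";"407;420;500;600"
EOF

# Strip the quotes, then emit one "name";"code" pair per inner value.
# gsub on $0 re-splits the record, so the inner semicolons become fields.
awk -F';' '{ gsub(/"/, ""); for (i = 2; i <= NF; i++) printf "\"%s\";\"%s\"\n", $1, $i }' names.txt
```

The flattened output can then be bulk-loaded into the database directly, which is usually far faster than parsing and splitting the strings inside a stored procedure.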

import csv to sql

I have to import a CSV file into a SQL database table which is already created (empty, with the same number of named columns). It would be great if you could suggest any tutorials or give some tips.
I assume you are using Microsoft SQL Server. Do you need to do this in a program or manually? There is a tutorial on using the bcp command for that, or alternatively a SQL command. If you need to parse the CSV file for your own code, see this previous SO question.
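If the manual route is acceptable, the SQL-command option is a one-statement `BULK INSERT` into the existing table. A sketch; the table name, file path, and header assumption are all placeholders to adapt:

```sql
BULK INSERT dbo.MyTable
FROM 'C:\data\import.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the CSV
    ROWTERMINATOR   = '\n',  -- line ending of the file
    FIRSTROW        = 2      -- skip the header row, if the file has one
);
```

Since the target table already exists with matching columns, no format file is needed as long as the CSV's column order matches the table's.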