Bash: creating a CSV from values - SQL

I'm trying to create a CSV file in my bash script from values I'm getting out of another, non-CSV file.
The problem is that the values themselves contain commas (,).
Because of that the CSV file comes out wrong: a value with commas in it gets split into two or more fields.
Is there any way around this, or another way to build a CSV in a bash script? I can create another file type too; it just needs to be compatible with a standard SQL import.
Thanks

I have now added quotation marks before and after every value and it works great. The CSV looks like it should.
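For the record, a minimal sketch of that fix (the variable and file names here are made up), which also doubles any quote character inside a value, as standard CSV (RFC 4180) expects:

csv_field() {
  local v=${1//\"/\"\"}   # escape any embedded " as ""
  printf '"%s"' "$v"      # wrap the whole field in quotes
}
printf '%s,%s\n' "$(csv_field "$name")" "$(csv_field "$address")" >> out.csv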

Related

Importing and maintaining multiple csv files into PostgreSQL

I am new to using SQL, so please bear with me.
I need to import several hundred CSV files into PostgreSQL. My web searches have only turned up how to import many CSV files into one table. However, most of my CSV files have different column types (all have one-line headers). Is it possible to run a loop of some kind and have each CSV imported into a table with the same name as the CSV? Creating each table manually and specifying columns is not an option. I know that COPY will not work, as the table needs to already be specified.
Perhaps this is not feasible in PostgreSQL? I would like to accomplish this in pgAdmin III or the psql console, but I am open to other ideas (using something like R to change the CSVs into a format more easily loaded into PostgreSQL?).
I am using PostgreSQL on a Windows 7 computer. It was requested that I use PostgreSQL, thus the focus of the question.
The desired result is a database full of tables that I will then join with a spreadsheet that includes specific site data. Thanks!
Use pgfutter.
The general syntax looks like this:
pgfutter csv <file.csv>
In order to run this on all csv files in a directory from Windows Command Prompt, navigate to the desired directory and enter:
for %f in (*.csv) do pgfutter csv %f
Note that the directory containing the downloaded pgfutter executable must be added to the PATH environment variable.
EDIT:
Here is the command line code for Linux users
Run it over each file with a shell loop:
for f in *.csv; do pgfutter csv "$f"; done
Or, if that won't do, pick the files up recursively:
find . -iname '*.csv' -exec pgfutter csv {} \;
Alternatively, in the terminal, use nano to create a script that loops over the CSV files in a directory and loads each one into the Postgres DB:
nano run_pgfutter.sh
The content of run_pgfutter.sh:
#!/bin/bash
# Import every CSV under /mypath into Postgres, one pgfutter call per file
for i in /mypath/*.csv
do
    ./pgfutter csv "$i"
done
Then make the file executable:
chmod u+x run_pgfutter.sh
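Then run it:
./run_pgfutter.sh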

How to create format files using bcp from flat files

I want to use a format file to help import a comma-delimited file using BULK INSERT. I want to know how you generate a format file from a flat-file source. The Microsoft guidance on this subject makes it seem as though you can only generate a format file from a SQL table. But I want it to look at a text file and tell me what the delimiters in that file are.
Surely this is possible.
Thanks
A format file can, and usually does, include more than just delimiters. It also frequently includes column data types, which is why it can only be generated automatically from the table or view the data is being retrieved from.
If you need to find the delimiters in a flat file, I'm sure there are a number of ways to write a script that could accomplish that, as well as create the format file itself.
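For what it's worth, the documented way to have bcp generate a format file is from an existing table, not from the flat file itself (the database, table, and server names below are placeholders):

bcp MyDatabase.dbo.MyTable format nul -c -t, -f MyTable.fmt -S myserver -T

Here -c requests character mode, -t, sets the comma as the field terminator, -f names the output format file, and -T uses a trusted connection.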

How to create an Excel file in iOS

I am trying to create an Excel file in iOS. I managed to do this by simply creating a string and writing it to a file with the extension .csv. That works, but the problem is that all the data ends up in a single cell. Can anyone help me out with some code?
You can just insert a comma in the string between all separate columns and a newline for every row, then write it out as CSV and it will work.
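For example, a file with the following content (the values are made up) will open as three columns and three rows, the first being the header:

name,age,city
Alice,30,Paris
Bob,25,Lyon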
Good luck

Writing a data flow to PostgreSQL

I know that by doing:
COPY test FROM '/path/to/csv/example.txt' DELIMITER ',' CSV;
I can import CSV data into PostgreSQL.
However, I do not have a static CSV file. My CSV file gets downloaded several times a day, and it includes data that has previously been imported into the database. So, to get a consistent database, I would have to leave out this old data.
My best case would be something like the COPY above. The worst case would be a Java program that manually checks each entry of the DB against the CSV file. Any recommendations for the implementation?
I really appreciate your answer!
You can load the latest data into a temp table using the COPY command and then merge the temp table with the live table.
If you are using a Java program to execute the COPY command, try the CopyManager API.
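A minimal sketch of that staging approach (assuming the target table is test from the COPY example and that it has a primary key column id; ON CONFLICT requires PostgreSQL 9.5+):

CREATE TEMP TABLE staging (LIKE test);
COPY staging FROM '/path/to/csv/example.txt' DELIMITER ',' CSV;
INSERT INTO test
SELECT * FROM staging
ON CONFLICT (id) DO NOTHING;  -- rows imported earlier are skipped
DROP TABLE staging;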

Importing a CSV file into an SQLite table

I've imported a CSV file into an SQLite table. Everything went fine, except that the import kept the quotes " " around the data in the fields. I'm not sure why, because there are no quotes in the CSV file.
Does anyone know how to avoid this or get rid of the quotes somehow?
Here's a screenshot of the Firefox SQLite import settings I'm using:
Thanks for any help.
I would guess that there really ARE double quotes around the data. If you use Windows, a CSV will automatically open in Excel, and it looks like there are no quotes because Excel interprets the file properly. However, I bet that if you open the file in Notepad, there will be quotes around the strings.
If so, then, as pointed out in the discussion above, truncate your SQLite table and re-import the data, this time indicating that the fields are enclosed by double quotes.
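If the GUI import keeps misbehaving, the same fix can be sketched in the sqlite3 command-line shell (the database, table, and file names are made up); its CSV mode strips enclosing double quotes by itself:

sqlite3 mydb.sqlite
sqlite> DELETE FROM mytable;
sqlite> .mode csv
sqlite> .import data.csv mytable

If the file has a header row, recent sqlite3 versions can skip it with .import --csv --skip 1 data.csv mytable.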