I need to print a table column out to a text file in postgres, just a simple SELECT "column_name" FROM "table." I'm pretty sure there is a way to do this but I can't remember the syntax. Can anyone help?
Use COPY.
If you need to copy an entire table, you can specify a table name:
COPY country TO '/usr1/proj/bray/sql/country_data';
You can also copy a query result:
COPY (SELECT column_name FROM country WHERE country_name LIKE 'A%')
TO '/usr1/proj/bray/sql/a_list_countries.copy';
The same syntax can be used to import a table:
COPY country FROM '/usr1/proj/bray/sql/country_data';
It is possible to specify additional options, such as the delimiter and format. In my daily work, I often use:
COPY country TO '/usr1/proj/bray/sql/country_data' DELIMITER ',' CSV;
For a full description of the COPY statement, refer to the PostgreSQL documentation for COPY.
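If you are scripting the export from outside psql, the same pattern (query result written to a delimited text file) can be reproduced with any client library. Below is a minimal sketch using Python's stdlib sqlite3 and csv modules as a stand-in database; the table contents and output filename are made up for illustration:

```python
import csv
import sqlite3

# Build a throwaway in-memory table (stands in for the real database).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE country (country_name TEXT)")
conn.executemany("INSERT INTO country VALUES (?)",
                 [("Austria",), ("Albania",), ("Brazil",)])

# Export one column of a query result to a CSV text file,
# mirroring COPY (SELECT ...) TO 'file' CSV.
rows = conn.execute(
    "SELECT country_name FROM country WHERE country_name LIKE 'A%'")
with open("a_list_countries.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

After running this, a_list_countries.csv contains one country name per line, exactly as COPY would produce with the CSV option.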
Related
Is there any query that returns source table names and target table names for a mapping or mapping ID in Informatica? This has been very hard and challenging.
For example, when we run SELECT * FROM opb_mapping WHERE mapping_name LIKE '%CY0..%'
it returns some details, but I cannot find the source table names and target table names. Help if you can.
Thanks.
You can use the view below to get the data (assuming you have full access to the metadata tables):
select source_name, source_field_name,
       target_name, target_column_name,
       mapping_name, subject_name AS folder
from REP_FLD_MAPPING
where mapping_name like '%xx%';
The only issue I can see is that if you have an SQL override, you need to check that SQL to find the true source.
I am new to SQL and I was wondering if there is any way to search a table for matches against values from a column of an external table (a CSV file). To put it more clearly, this is what I'm working on: one column in the CSV file contains latitudes and another contains longitudes, and I want to search a table that holds information about several locations, with their latitudes and longitudes specified in the table.
I want to retrieve the information about a particular location, taking the latitudes and longitudes as input from the CSV file.
Would it look something like this? :
CREATE TABLE MyTable(
latitude DECIMAL(5, 2) NOT NULL,
longitude DECIMAL(5, 2) NOT NULL
);
LOAD DATA INFILE 'C:\Users\Admin\Desktop\Catalog.csv'
INTO TABLE MyTable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SELECT
main.object_id,
main.latitude,
main.longitude,
main.Some_Information
FROM
location_info AS main,
MyTable AS temp
WHERE
main.latitude = temp.latitude AND
main.longitude = temp.longitude
I also tried using psql's \copy like:
\copy MyTable FROM 'C:\Users\Admin\Desktop\Catalog.csv' WITH CSV;
As given here -> http://postgresguide.com/utilities/copy.html.
But this didn't work either; there was a syntax error at or near "\", though that could be because I'm using an older version of psql.
Also I am not a Superuser, hence the use of \copy and not COPY FROM.
I also tried using a temporary table and using \copy alongside it. It gave the same error as above.
PostgreSQL does not support the LOAD DATA syntax you're using. You'll need to use COPY instead.
Your workflow should look more like:
CREATE TABLE MyTable(
latitude DECIMAL(5, 2) NOT NULL,
longitude DECIMAL(5, 2) NOT NULL
);
COPY MyTable(latitude, longitude)
FROM 'C:\Users\Admin\Desktop\Catalog.csv' WITH CSV;
SELECT
main.object_id,
main.latitude,
main.longitude,
main.Some_Information
FROM
location_info AS main
JOIN MyTable AS temp
on main.latitude = temp.latitude
and main.longitude = temp.longitude
There are three main things to notice here.
I've removed your \ from the COPY command.
I've specified the columns that you're trying to insert into with the COPY command. If the columns are in a different order in the CSV file, simply reorder them in the COPY expression.
I've changed the syntax of your join to a standard ANSI join. The logic is the same, but this is a better standard to use for readability/compatibility reasons.
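The same load-then-join workflow can be sketched end to end with Python's stdlib sqlite3 and csv modules standing in for PostgreSQL; the table contents, the `info` column, and the inline CSV here are made up for illustration:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE location_info "
             "(object_id INTEGER, latitude REAL, longitude REAL, info TEXT)")
conn.execute("INSERT INTO location_info VALUES (1, 12.5, 56.75, 'site A')")
conn.execute("INSERT INTO location_info VALUES (2, 90.0, 0.0, 'site B')")

# Stand-in for Catalog.csv (the real file stays on your machine).
catalog = io.StringIO("12.5,56.75\n11.25,22.5\n")

# Load the CSV into the lookup table, mirroring COPY ... FROM ... WITH CSV.
conn.execute("CREATE TABLE MyTable (latitude REAL, longitude REAL)")
conn.executemany("INSERT INTO MyTable VALUES (?, ?)",
                 ((float(a), float(b)) for a, b in csv.reader(catalog)))

# ANSI join, same logic as the COPY-based answer above.
matches = conn.execute("""
    SELECT main.object_id, main.latitude, main.longitude, main.info
    FROM location_info AS main
    JOIN MyTable AS temp
      ON main.latitude = temp.latitude
     AND main.longitude = temp.longitude
""").fetchall()
```

Only the locations whose latitude/longitude pair appears in the CSV come back from the join.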
I am VERY new to SQL and I am having a little trouble.
Let's say I have a table, called data, with two columns, class, and name.
I want to create the column math if it doesn't exist, and give it a value of John.
I can do this with:
INSERT INTO data VALUES ('math','John')
But if I change John to Steve, I want math to have a value of "John","Steve".
But instead, it creates another row called "math" with a value of "Steve". How can I make this insert into the same column?
Thanks
I would strongly recommend against storing a CSV list of names in your table. CSV is hard to query, update, and maintain. Actually, there is nothing at all wrong with your current approach:
class | name
math | John
math | Steve
Data in this format can easily be queried, because it is relational. And if you need to add, update, or remove a name associated with a class, it becomes a one record affair without having to deal with CSV. Note that if you really need a CSV representation, you can still achieve that using SQLite's GROUP_CONCAT() function:
SELECT class, GROUP_CONCAT(name, ',') AS names
FROM yourTable
GROUP BY class
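Here is a minimal, runnable sketch of that query using Python's stdlib sqlite3 (which embeds SQLite); the table contents are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (class TEXT, name TEXT)")
conn.executemany("INSERT INTO data VALUES (?, ?)",
                 [("math", "John"), ("math", "Steve"), ("art", "Ann")])

# One row per class; GROUP_CONCAT collapses the names into a CSV string
# only at query time -- the stored rows stay fully relational.
rows = conn.execute("""
    SELECT class, GROUP_CONCAT(name, ',') AS names
    FROM data
    GROUP BY class
    ORDER BY class
""").fetchall()
```

Note that SQLite does not guarantee the order of names inside the concatenated string, so treat it as an unordered list.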
I'm not totally sure about SQLite, but the normal command is usually something like:
ALTER TABLE table_name
ADD column_name datatype
Then INSERT INTO as usual.
I want to search for a text value in a table without knowing its attributes.
Example: I have a table Customer, and I want to search for a record which contains 'mohit' in any field, without knowing the column name.
You are looking for Full Text Indexing.
Example using CONTAINS:
select ColumnName from TableName
Where Contains(Col1,'mohit') OR contains(col2,'mohit')
NOTE - You can convert the above full-text query into a dynamic query using the column names retrieved from the sys.columns query below.
Also check below
FIX: Full-Text Search Queries with CONTAINS Clause Search Across Columns
You can also check all the column names with the query below:
Select Name From sys.Columns Where Object_Id =
(Select Object_Id from sys.Tables Where Name = 'TableName')
Double-WildCard LIKE statements will not speed up the query.
If you want to do a full search on the table, you must surely know the structure of the table. Assuming the table has the fields id, name, age, and address, your SQL query would look like:
SELECT * FROM `Customer`
WHERE `id` LIKE '%mohit%'
OR `name` LIKE '%mohit%'
OR `age` LIKE '%mohit%'
OR `address` LIKE '%mohit%';
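The dynamic-query idea mentioned above can be sketched generically: enumerate the table's columns from the catalog, then build the OR chain instead of typing it by hand. Here is a minimal sketch with Python's stdlib sqlite3, using PRAGMA table_info in place of sys.columns; the table and its contents are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customer "
             "(id TEXT, name TEXT, age TEXT, address TEXT)")
conn.execute("INSERT INTO Customer VALUES ('1', 'mohit', '30', 'Delhi')")
conn.execute("INSERT INTO Customer VALUES ('2', 'alice', '25', 'Pune')")

# Enumerate the column names from the catalog instead of hard-coding them.
cols = [row[1] for row in conn.execute("PRAGMA table_info(Customer)")]

# Build "col1 LIKE ? OR col2 LIKE ? ..." and bind the search term per column.
where = " OR ".join('"{}" LIKE ?'.format(c) for c in cols)
term = "%mohit%"
rows = conn.execute("SELECT * FROM Customer WHERE " + where,
                    [term] * len(cols)).fetchall()
```

Binding the search term as a parameter (rather than pasting it into the string) keeps the dynamic query safe from injection; only the column names, which come from the catalog, are interpolated.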
Mohit, I'm glad you devised the solution by yourself.
Anyway, whenever you again face an unknown table or database, I think the code snippet I posted here will be very welcome.
Ah, one more thing: the answers given did not address your problem, did they?
I'm often given a large list of, say, IDs that I need to search for in our database. I've been manually putting them into a SQL statement like the following, which can take a while since each value needs single quotes around it followed by a comma. I was hoping someone has an easy way of doing this for me. Or am I just being a bit lazy?
select * from blah where idblah in ('1234-A', '1235-A', '1236-A' ................)
You can use the world's simplest code generator.
Just paste in the list of values, setup the pattern and voila... you have a set of quoted values.
I have also used Excel in the past, using the CONCAT function with smart paste.
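If you'd rather script it than use a code generator or Excel, a few lines of stdlib Python do the same quoting; the input list here is made up for illustration:

```python
# Turn a raw newline-separated list of IDs into a quoted, comma-separated
# string ready to paste into an IN (...) clause.
raw = """1234-A
1235-A
1236-A"""

quoted = ", ".join("'{}'".format(line.strip())
                   for line in raw.splitlines() if line.strip())
sql = "select * from blah where idblah in ({})".format(quoted)
```

This produces `select * from blah where idblah in ('1234-A', '1235-A', '1236-A')`, with blank lines in the input skipped.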
I would set aside a table to hold the values and have my queries JOIN against that table. Set up a simple import script (don't forget to clear out the table at the start) and something like this is a breeze. Run the import, run the query. You never have to touch the query again or regenerate any code.
As an example:
CREATE TABLE Search_ID_List (
id VARCHAR(20) NOT NULL,
CONSTRAINT PK_Search_ID_List PRIMARY KEY CLUSTERED (id)
)
and:
SELECT
<column list>
FROM
Search_ID_List SIL
INNER JOIN Blah B ON
B.id = SIL.id
If you want to be able to save past search criteria or have multiple searches available to you at the same time then you can just add an identifying column which gets filled in by your import. It can be the file from where the ids came, some descriptive code/name, or whatever. Then just add that to the WHERE clause of your query and you're all set.
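That workflow (clear the search table, import the IDs, run the join) can be sketched with Python's stdlib sqlite3; all table names and contents here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Blah (id TEXT, payload TEXT)")
conn.executemany("INSERT INTO Blah VALUES (?, ?)",
                 [("1234-A", "x"), ("1235-A", "y"), ("9999-Z", "z")])
conn.execute("CREATE TABLE Search_ID_List (id TEXT PRIMARY KEY)")

def run_search(ids):
    # Clear out the table at the start, as the answer suggests,
    # then import the fresh list and run the unchanged join query.
    conn.execute("DELETE FROM Search_ID_List")
    conn.executemany("INSERT INTO Search_ID_List VALUES (?)",
                     [(i,) for i in ids])
    return conn.execute("""
        SELECT B.id, B.payload
        FROM Search_ID_List SIL
        INNER JOIN Blah B ON B.id = SIL.id
    """).fetchall()

result = run_search(["1234-A", "1235-A"])
```

The query never changes between searches; only the contents of Search_ID_List do.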
You could do something like this.
select * from blah where ',' + '1234-A,1235-A,1236-A' + ',' LIKE '%,' + idblah + ',%'
This pattern is super useful when you're being passed a comma delimited list of values to filter by, but I think would be applicable here as well.
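The trick is to wrap both the list and the id in commas, so a short id like '234-A' cannot false-match inside '1234-A'. A runnable sketch of the pattern with Python's stdlib sqlite3 (SQLite concatenates with || rather than SQL Server's +; the table contents are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE blah (idblah TEXT)")
conn.executemany("INSERT INTO blah VALUES (?)",
                 [("1234-A",), ("1236-A",), ("234-A",)])

# Delimiter-wrapped LIKE: only ids that appear as a whole
# comma-separated token in the list match.
id_list = "1234-A,1235-A,1236-A"
rows = conn.execute(
    "SELECT idblah FROM blah "
    "WHERE ',' || ? || ',' LIKE '%,' || idblah || ',%'",
    (id_list,)).fetchall()
```

Here '234-A' is correctly excluded even though it is a substring of '1234-A'.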