Query all tables in an SQL database with Python

I am using Python with sqlite3, and I have a problem reading out the tables in a database I created myself.
The database is created using the code snippet below:
con = sqlite3.connect(folderdir + dbFileName)
#Create a cursor
c = con.cursor()
#The code below creates the table. I later need to extract all table names in my database
c.execute('CREATE TABLE IF NOT EXISTS ' + tablename + ' (Bord text, Date text, Time text, ID integer, Menu text, Qty integer, Price real, Time_kitchen text, TableActive integer)')
c.execute('INSERT INTO ' + tablename + ''' (Bord, Date, Time, TableActive)
          VALUES(?,?,?,?)''',
          (tablename, date, tabletime, 1))
con.commit()
con.close()
If the above code is called many times with different tablename values, I get a database.db file containing many tables. My problem is that, even using sqlite_master, I cannot read out all the table names. I use the code below to read them out:
import sqlite3
#I have verified that I can connect successfully to the database file
con = sqlite3.connect(dbFilePath + "/" + dbFileName + ".db")
c = con.cursor()
c.execute("SELECT ALL Name FROM sqlite_master")
rows = c.fetchall()
for row in rows:
    print(row)
The print returned nothing, although I know for sure that the database contains multiple tables. What am I doing wrong?

When the file whose name is given to connect() does not exist, SQLite will happily create a new, empty database for you.
Ensure that the file name is correct, and that it is not a relative path, to avoid a dependency on the current directory.
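As a quick check, the sketch below (it assumes the same dbFilePath and dbFileName variables as in the question) builds an absolute path, refuses to continue if the file does not exist (so sqlite3 cannot silently create a fresh, empty database), and then lists the table names from sqlite_master:

import os
import sqlite3

db_path = os.path.abspath(os.path.join(dbFilePath, dbFileName + ".db"))
if not os.path.isfile(db_path):
    raise FileNotFoundError(db_path)  # connecting would silently create an empty database

con = sqlite3.connect(db_path)
c = con.cursor()
c.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
for (name,) in c.fetchall():
    print(name)
con.close()

If this prints nothing, the file at db_path genuinely contains no tables and is most likely not the file the first script wrote to.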

Related

How to build a function to add items to different tables

I currently have several functions that do the same thing. Each function adds a value to a different table. I have a subject table, a format table, a studio table, a language table, etc. Each has its own function to add to it. These tables are then used to pull their values into separate dropdowns (GUI).
I would like to create a single function that inserts values into each of these tables. I use a lambda to pass the name of the column. From this I add an 's' to get the table name (each table has the same name as its column, but plural). Then I use the variable in different places in the function.
Take the "subject" function: the database table is called "subjects", the column is called "subject", and the input box on the GUI is called "subject_add". Therefore, to pull the value from the input box I use "subject_add.get()". The same applies to all the other dropdowns, just with different wording.
What I don't know is how to make the "subject_add.get()" work if I use a variable for the word "subject". Here is the function I am currently working on:
def add_to_dropdown(column):
    table_name = column + "s"
    element = column + "_add.get()"
    if element:
        conn = sqlite3.connect(conndb.data_source)
        c = conn.cursor()
        c.execute(f'INSERT INTO {table_name}({column}) VALUES (?)', (element,))
        messagebox.showinfo("Interest", f"A new {column} has been added to the database")
        conn.commit()
        c.close()
        conn.close()
    else:
        messagebox.showwarning("Warning", f"{column} field is empty!")
    column_add.delete(0, tk.END)
The table_name variable looks fine. The element variable looks fine, but the "if element:" check doesn't work, and neither does "element" in the SQL INSERT command.
Here is the button with its lambda command that passes the word "subject" to the function.
subject_add_btn = ttk.Button(frame_add_items, text="add ", style='a.TButton', image=arrow, compound=tk.RIGHT, command=lambda: add_to_dropdown("subject"))
subject_add_btn.place(x=480, y=50, height=35, width=90)
Hopefully, this makes sense to you. It's not very easy to explain in words. It's easier to show the code.
To resolve the issue I used the eval() function. Here are the changes I made for it to work well:
def add_to_dropdown(column):
    # This function handles all dropdowns
    table_name = column + "s"
    element = column + "_add.get()"
    if eval(element):
        conn = sqlite3.connect(conndb.data_source)
        c = conn.cursor()
        c.execute(f'INSERT INTO {table_name}({column}) VALUES (?)', (eval(element),))
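As a side note, eval() runs whatever string it is given, so it can be fragile. A safer pattern for the same idea - this is just a sketch, and the format_add/studio_add widget names are hypothetical stand-ins for your other entry boxes - is to map column names to their widgets in a dictionary and look them up instead:

# hypothetical mapping from column name to its Tkinter Entry widget
entry_widgets = {
    "subject": subject_add,
    "format": format_add,
    "studio": studio_add,
}

def add_to_dropdown(column):
    table_name = column + "s"
    element = entry_widgets[column].get()  # the same value eval(element) was fetching
    ...

The rest of the function stays the same; the dictionary lookup simply replaces the string-built attribute access.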

Most efficient way of deleting records from a view with multiple base tables?

I need to write a VB function that deletes all records from the base tables of a view that have been initialised with default values/default constraint key values.
My understanding is that the only way to do this is to delete the records from each base table individually, but I am not sure if there is an easier, more efficient way of doing this than what I am trying to attempt. I would like some guidance/advice if possible.
This is the only way I can think of doing this:
Run a query that returns base table names from the view:
DECLARE @vn as nvarchar(max) = 'dbo.TABLE_NAME'
SELECT referenced_server_name, referenced_database_name, referenced_entity_name as SourceTable,referenced_minor_name as SourceColumn, referenced_minor_id as depnumber
FROM sys.dm_sql_referenced_entities (@vn, 'OBJECT')
where referenced_minor_name IS NOT NULL
ORDER BY referenced_entity_name, referenced_minor_id
Run sp_helpconstraint N'<table_name>', which gives me a list of all default values/default constraint types, as well as the column names. I need these so I can compare the default values to the values in each table and determine whether or not a record should be deleted.
Questions
Is there an easier/more efficient way of trying to delete a record from a view?
NOTE: the full function I ended up writing has been added to the answers below for anyone interested in the answer
I am not sure if there is an easier, more efficient way of doing this than what I am trying to attempt. I would like some guidance/advice if possible
TLDR; there isn't
You need to understand that a view is NOT data in the database; it's a stored SQL query that is run every time you select from the view.
It might even be the case that SQL Server takes your query, mixes it with the query that defines the view, optimizes the combination, and runs that. So it's not necessarily the case that it runs the view query, gets all million records that the view represents, and then rifles through them looking for the one guy called Constantinople Ernhardt. SQL Server might consider it better to silently and transparently rewrite the query you gave so that it is planned and run completely differently from what you might think; it does this for every query, in a process called optimization.
Your view is:
CREATE VIEW MyView AS
SELECT * FROM Person p JOIN Address a on p.AddressId = a.Id
You write:
SELECT * FROM MyView WHERE Name = 'Abc' and HouseName = 'def'
You might think it does this (and conceptually, you're right):
SELECT * FROM
(
SELECT * FROM Person p JOIN Address a on p.AddressId = a.Id
) x WHERE Name = 'Abc' and HouseName = 'def'
But it probably gets rewritten to this:
SELECT * FROM Person p JOIN Address a on p.AddressId = a.Id WHERE Name = 'Abc' and HouseName = 'def'
So now that's out of the way, and you can see that a view is just a query on top of a table that gets run every time you select from it - how do you delete data from a query?
You can't, because queries don't have data; they retrieve data from tables.
The only way to "delete data from a view" is to delete data from the table(s) the view selects its data from.
You can only do this with DELETE statements on the table(s) concerned.
There is a facility where you can write an INSTEAD OF trigger on a view, and then delete from the view, and SQL Server will run the trigger (which deletes from the underlying tables). It might look like you're deleting data from the view, but really, you're just causing a mechanism to remove data from the underlying tables, in the same way that the view is a mechanism that drags data out of those tables.
You could write a stored procedure that deletes data, but again, that's just a mechanism to delete data from the underlying table.
Choose any method you like, as suits your business goals and desire for encapsulating your software in a certain way. For example, in the past I've had software I couldn't change (lost the source code or whatever) that was hard-wired to SELECT FROM users or DELETE FROM users - we wanted the software to carry on working even though the users table was being renamed to members. We renamed the table, then created a view called users that just did SELECT * FROM members - that allowed the app to carry on working and reading data. Then we created INSTEAD OF triggers to update and delete data in the members table when the app attempted to do those operations on the users view (which the app still thought was a table).
So why is it so hard? Well the data that comes out of your view might not even relate to a table row any more. Here's a simple version:
CREATE VIEW MyView AS
SELECT MAX(SUBSTRING(Name, 4, 999)) MaxFirstName FROM Person GROUP BY Gender
Suppose there were two people called Mr Lee Smith and Ms Lee Smith. You've taken the max of the function output and got Lee Smith, and now you want to delete Lee Smith out of the Person table by analyzing the view and deleting... what out of the Person table? Which record? Grouping mushed all the records up together: the MAX name from one and the MIN birthdate from another.
Here's another example, a bit more ridiculous but imaginable:
CREATE VIEW MyView AS
SELECT name as x FROM person
UNION
SELECT street FROM address
This could easily produce the distinct value "Penny Lane" - but is it a person, or a road? Which should be deleted if we delete from this view where x = 'Penny Lane'?
There isn't a magic bullet where you can run it and it says "this view uses these 3 tables" so you can delete from them. It wouldn't even be a good premise. Your view might select from one data table and one lookup table, and deleting Gender type 1 from the lookup table just because you're deleting Donald Trump from the users table would be a bad call.
If you want to provide delete facilities on a view, you need to code something up; there isn't an automagical solution that can work out what data from what tables should be deleted and what should remain. Just imagine how difficult it would be to analyze a view that joins 9 tables with a mix of join styles, plus another 8 lists of VALUES, 3 calls to a cross-applied table-valued function that parses some JSON, a couple of row-generating recursive CTEs slung in there, and a pivot too...
There is absolutely no chance that anyone has written the magic button that picks all that apart into "work out the list of base tables and what data should be deleted from which to satisfy DELETE FROM MyView WHERE output_of_parsing_function = 'Hello'".
This is the final working function I ended up using. (feel free to suggest a better way)
If DI.CommonDataFunctions.IsThisAView(BaseTableName, Globals.dif) Then
Dim viewColumnNames As New List(Of String)
Dim tableName As String
Dim viewTables As New List(Of String)
Dim isDefault
Dim primaryKeyName As String
'return table names from view
Dim qp As New List(Of SqlParameter)
qp.Add(New SqlParameter("@vn", $"dbo.{BaseTableName}"))
Dim sql As String = "
SELECT
referenced_entity_name as SourceTable,referenced_minor_name as SourceColumn
FROM
sys.dm_sql_referenced_entities (@vn, 'OBJECT')
WHERE
referenced_minor_name IS NOT NULL
ORDER BY
referenced_entity_name, referenced_minor_id"
Using dr As New DataReader(Globals.dif.GetDBDetails)
Dim constraintKeys As New Dictionary(Of String, String)()
Dim primaryKeyList As New List(Of Int32)
Dim table As String
dr.ExecuteReader(sql, qp)
Do While dr.Read
tableName = dr.Item("SourceTable").ToString.ToUpper.Trim
viewColumnNames.Add(dr.Item("SourceColumn").ToString.ToUpper.Trim)
If Not viewTables.Contains(tableName) Then
viewTables.Add(tableName)
End If
Loop
For Each table In viewTables
Dim columnName As String
Dim defaultConstraintValue
isDefault = True
table = table
dr.ExecuteReader("
SELECT Col.Column_Name from
INFORMATION_SCHEMA.TABLE_CONSTRAINTS Tab,
INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE Col
WHERE
Col.Constraint_Name = Tab.Constraint_Name
AND Col.Table_Name = Tab.Table_Name
AND Constraint_Type = 'PRIMARY KEY'
AND Col.Table_Name = '" + table + "'")
While dr.Read
primaryKeyName = dr.Item(0)
End While
'return default constraints
dr.ExecuteReader("
SELECT
ColumnName = c.name,
TableName = t.name,
df.definition
FROM
sys.default_constraints df
INNER JOIN
sys.tables t ON df.parent_object_id = t.object_id
INNER JOIN
sys.columns c ON c.object_id = df.parent_object_id AND df.parent_column_id = c.column_id
WHERE
t.Name = N'" + table + "'")
While dr.Read
defaultConstraintValue = dr.Item("definition").ToString
'delete "(( ))" or "( )" from default constraint
If defaultConstraintValue.StartsWith("((") AndAlso defaultConstraintValue.EndsWith("))") Then
defaultConstraintValue = defaultConstraintValue.Substring(0, defaultConstraintValue.Length - 2)
defaultConstraintValue = defaultConstraintValue.Substring(2)
ElseIf defaultConstraintValue.StartsWith("(") AndAlso defaultConstraintValue.EndsWith(")") Then
defaultConstraintValue = defaultConstraintValue.Substring(0, defaultConstraintValue.Length - 1)
defaultConstraintValue = defaultConstraintValue.Substring(1)
End If
If defaultConstraintValue.StartsWith("'") AndAlso defaultConstraintValue.EndsWith("'") Then
defaultConstraintValue = defaultConstraintValue.Substring(0, defaultConstraintValue.Length - 1)
defaultConstraintValue = defaultConstraintValue.Substring(1)
If Not IsNumeric(defaultConstraintValue) Then
defaultConstraintValue = "'" + defaultConstraintValue + "'"
End If
End If
columnName = dr.Item("ColumnName").ToString.ToUpper.Trim
constraintKeys.Add(columnName, defaultConstraintValue)
End While
Next
Dim sql2 = "SELECT " + primaryKeyName + " FROM " + BaseTableName
If constraintKeys IsNot Nothing Then
Dim isFirstFilter = True
sql2 &= " WHERE "
For Each constraintKey In constraintKeys
If viewColumnNames.Contains(constraintKey.Key) AndAlso constraintKey.Key <> "FAMILY_UID" Then
If isFirstFilter = False Then
sql2 &= " And "
End If
If IsNumeric(constraintKey.Value) Then
Dim intConverted = CInt(constraintKey.Value)
sql2 &= constraintKey.Key + " = " + intConverted.ToString + " "
If isFirstFilter = True Then
isFirstFilter = False
End If
Else
sql2 &= constraintKey.Key + " = " + constraintKey.Value + " "
If isFirstFilter = True Then
isFirstFilter = False
End If
End If
End If
Next
End If
dr.ExecuteReader(sql2)
While dr.Read
primaryKeyList.Add(dr.Item(primaryKeyName))
End While
If primaryKeyList.Count > 0 Then
For Each table In viewTables
Dim isFirstFilter = True
Dim sql3 = "DELETE FROM " + table + " WHERE " + primaryKeyName + " IN ("
For Each primaryKey In primaryKeyList
sql3 &= primaryKey.ToString
If Not primaryKey = primaryKeyList(primaryKeyList.Count - 1) Then
sql3 &= ", "
End If
Next
sql3 &= ")"
Using CEx As New CommandExecutor(Globals.dif)
CEx.ExecuteNonQuery(sql3)
End Using
Next
End If
End Using
End If

How can I get the correct column names from the database with Npgsql?

This is my table in the database:
CREATE TABLE public.test
(
id integer NOT NULL DEFAULT nextval('test_id_seq'::regclass),
hc character varying(30),
"Hc" character varying(30),
"HC" character varying(30),
f character varying(30),
"F" character varying(30),
f1 character varying(30),
te numeric(2,2),
CONSTRAINT test_pkey PRIMARY KEY (id)
)
If I get the table definition via Npgsql from VB.NET:
select * from test where null = null
the result is that some columns have changed names, e.g. Hc => Hc1, HC => HC2.
How can I get the correct column names from the database with Npgsql?
It seems that you (directly or indirectly) use DbDataAdapter.Fill which renames columns as follows if necessary:
If the DbDataAdapter encounters duplicate columns while populating a
DataTable, it generates names for the subsequent columns, using the
pattern "columnname1", "columnname2", "columnname3", and so on.
Apparently, this deduplication process treats column names in a case-insensitive way which is why the columns also get renamed in your example. There is no way to turn off this behavior directly (see AdapterUtil.cs,2402).
A workaround would be to use an additional NpgsqlCommand and use NpgsqlDataReader.GetName to obtain the exact column names, and then change the columns of the DataTable accordingly. This could be done as follows:
Dim query = "select * from test where null = null"
' setup connection
Dim connection As New NpgsqlConnection(
    String.Format("Server={0};Port={1};Database={2};User Id={3};Password={4};",
                  host, port, database, user, password))
connection.Open()
' fill data table from query
Dim table As New DataTable
Using adapter = New NpgsqlDataAdapter(query, connection)
    adapter.Fill(table)
End Using
' correct column names
Using command = New NpgsqlCommand(query, connection)
    Using reader = command.ExecuteReader()
        For i = 0 To table.Columns.Count - 1
            table.Columns(i).ColumnName = reader.GetName(i)
        Next
    End Using
End Using
' display table in DataGridView
view.DataSource = table

Reading data from a CSV file and inserting it into SQL using VB.NET

Hi, I have a test folder into which we copy one CSV file daily; we don't use any code to copy the file, we just drag and drop it from a local machine. The CSV file has 11 columns, but I only want 3 columns of data in SQL, so I created 3 columns in SQL. My aim is to read the file from the folder and insert those 3 columns of data into SQL. I will run the task daily using Task Scheduler: if a file is found in the folder, it needs to import the data into SQL.
This will do it for you. Simply choose the fields (columns) you want to load.
Protected Sub uploadButton_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles uploadButton.Click
    ' declare CsvDataReader object which will act as a source of data for SqlBulkCopy
    Using csvData = New CsvDataReader(New StreamReader(fileUpload.PostedFile.InputStream, True))
        ' will read in first record as a header row and
        ' name columns based on the values in the header row
        csvData.Settings.HasHeaders = True
        ' must define data types to use while parsing data
        csvData.Columns.Add("varchar")  ' First
        csvData.Columns.Add("varchar")  ' Last
        csvData.Columns.Add("datetime") ' Date
        csvData.Columns.Add("money")    ' Amount
        ' declare SqlBulkCopy object which will do the work of bringing in data from
        ' CsvDataReader object, connecting to SQL Server, and handling all mapping
        ' of source data to destination table.
        Using bulkCopy = New SqlBulkCopy("Data Source=.;Initial Catalog=Test;User ID=sa;Password=")
            ' set the name of the destination table that data will be inserted into.
            ' table must already exist.
            bulkCopy.DestinationTableName = "Customer"
            ' mappings required because we're skipping the customer_id column
            ' and letting SQL Server handle auto incrementing of primary key.
            ' mappings not required if order of columns is exactly the same
            ' as destination table definition. here we use source column names that
            ' are defined in header row in file.
            bulkCopy.ColumnMappings.Add("First", "first_name")   ' map First to first_name
            bulkCopy.ColumnMappings.Add("Last", "last_name")     ' map Last to last_name
            bulkCopy.ColumnMappings.Add("Date", "first_sale")    ' map Date to first_sale
            bulkCopy.ColumnMappings.Add("Amount", "sale_amount") ' map Amount to sale_amount
            ' call WriteToServer which starts import
            bulkCopy.WriteToServer(csvData)
        End Using ' dispose of SqlBulkCopy object
    End Using ' dispose of CsvDataReader object
End Sub ' end uploadButton_Click

Generate insert SQL statements from a CSV file

I need to import a csv file into Firebird and I've spent a couple of hours trying out some tools and none fit my needs.
The main problem is that all the tools I've been trying like EMS Data Import and Firebird Data Wizard expect that my CSV file contains all the information needed by my Table.
I need to write some custom SQL in the insert statement. For example, I have a CSV file with the city name, but as my database already has all the cities in another table (normalized), I need to write a subselect in the insert statement to look up the city and write its ID. I also have a stored procedure to create GUIDs.
My insert statement would be something like this:
INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES((SELECT NEW_GUID FROM CREATE_GUID), :NAME, (SELECT CITY_ID FROM CITY WHERE NAME = :CITY_NAME))
How can I approach this?
It's a bit crude - but for one-off jobs, I sometimes use Excel.
If you import the CSV file into Excel, you can create a formula which creates an INSERT statement by using string concatenation in the formula. So - if your CSV file has 3 columns that appear in columns A, B, and C in Excel, you could write a formula like...
="INSERT INTO MyTable (Col1, Col2, Col3) VALUES (" & A1 & ", " & B1 & ", " & C1 & ")"
Then you can replicate the formula down all of your rows, and copy, and paste the answer into a text file to run against your database.
Like I say - it's crude - but it can be quite a 'quick and dirty' way of getting a job done!
Well, if it's a CSV, and this is a one-time process, open up the file in Excel, write formulas to populate your data in any way you desire, and then write a simple Concat formula to construct your SQL, and copy that formula for every row. You will get a large number of SQL statements which you can execute anywhere you want.
Fabio,
I've done what Vaibhav has done many times, and it's a good "quick and dirty" way to get data into a database.
If you need to do this a few times, or on some type of schedule, then a more reliable way is to load the CSV data "as-is" into a work table (i.e. customer_dataload) and then use standard SQL statements to populate the missing fields.
(I don't know Firebird syntax - but something like...)
UPDATE person
SET id = (SELECT newguid() FROM createguid)
UPDATE person
SET cityid = (SELECT cityid FROM cities WHERE person.cityname = cities.cityname)
etc.
Usually, it's much faster (and more reliable) to get the data INTO the database and then fix the data than to try to fix the data during the upload. You also get the benefit of transactions to allow you to ROLLBACK if it does not work!!
I'd do this with awk.
For example, if you had this information in a CSV file:
Bob,New York
Jane,San Francisco
Steven,Boston
Marie,Los Angeles
The following command will give you what you want, run in the same directory as your CSV file (named name-city.csv in this example).
$ awk -F, '{ print "INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES ((SELECT NEW_GUID FROM CREATE_GUID), '\''"$1"'\'', (SELECT CITY_ID FROM CITY WHERE NAME = '\''"$2"'\''))" }' name-city.csv
Type awk --help for more information.
Two online tools which helped me in 2020:
https://numidian.io/convert/csv/to/sql
https://www.convertcsv.com/csv-to-sql.htm
The second one is based on JS and does not upload your data (at least not at the time I am writing this)
You could import the CSV file into a database table as is, then run an SQL query that does all the required transformations on the imported table and inserts the result into the target table.
Assuming the CSV file is imported into temp_table with columns n, city_name:
insert into target_table
select t.n, c.city_id as city
from temp_table t, cities c
where t.city_name = c.city_name
Nice tip about using Excel, but I also suggest getting comfortable with a scripting language like Python, because for some tasks it's easier to just write a quick python script to do the job than trying to find the function you need in Excel or a pre-made tool that does the job.
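For instance, here is a small sketch along those lines (it assumes an input file laid out like the name-city.csv example above and the PERSON/CITY schema from the question) that prints one INSERT statement per CSV row:

import csv

# assumes rows like: Bob,New York
with open("name-city.csv", newline="") as f:
    for name, city in csv.reader(f):
        name = name.replace("'", "''")   # escape single quotes for SQL string literals
        city = city.replace("'", "''")
        print("INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES ("
              "(SELECT NEW_GUID FROM CREATE_GUID), "
              f"'{name}', "
              f"(SELECT CITY_ID FROM CITY WHERE NAME = '{city}'));")

Redirect the output to a .sql file and run it with your usual Firebird client.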
You can use the free csvsql to do this.
Install it using these instructions
Now run a command like so to import your data into your database. More details at the links above, but it'd be something like:
csvsql --db firebird:///d=mydb --insert mydata.csv
The following works with sqlite, and is what I use to convert data into an easy to query format
csvsql --db sqlite:///dump.db --insert mydata.csv
Use the CSV file as an external table. Then you can use SQL to copy the data from the external table to your destination table - with all the possibilities of SQL.
See http://www.firebirdsql.org/index.php?op=useful&id=netzka
Just finished this VBA script which might be handy for this purpose. All you should need to do is change the Insert statement to include the table in question and the list of columns (obviously in the same sequence they appear in the Excel file).
Function CreateInsertStatement()
    'Output file location and start of the insert statement
    SQLScript = "C:\Inserts.sql"
    cStart = "Insert Into Holidays (HOLIDAY_ID, NAT_HOLDAY_DESC, NAT_HOLDAY_DTE) Values ("
    'Open file for output
    Open SQLScript For Output As #1
    Dim LoopThruRows As Boolean
    Dim LoopThruCols As Boolean
    nCommit = 1 'Commit Count
    nCommitCount = 100 'The number of rows after which a commit is performed
    LoopThruRows = True
    nRow = 1 'Current row
    While LoopThruRows
        nRow = nRow + 1 'Start at second row - presuming there are headers
        nCol = 1 'Reset the columns
        If Cells(nRow, nCol).Value = Empty Then
            Print #1, "Commit;"
            LoopThruRows = False
        Else
            If nCommit = nCommitCount Then
                Print #1, "Commit;"
                nCommit = 1
            Else
                nCommit = nCommit + 1
            End If
            cLine = cStart
            LoopThruCols = True
            While LoopThruCols
                If Cells(nRow, nCol).Value = Empty Then
                    cLine = cLine & ");" 'Close the SQL statement
                    Print #1, cLine 'Write the line
                    LoopThruCols = False 'Exit the cols loop
                Else
                    If nCol > 1 Then 'add a preceding comma for all bar the first column
                        cLine = cLine & ", "
                    End If
                    If Right(Left(Cells(nRow, nCol).Value, 3), 1) = "/" Then 'Format for dates
                        cLine = cLine & "TO_DATE('" & Cells(nRow, nCol).Value & "', 'dd/mm/yyyy')"
                    ElseIf IsNumeric(Left(Cells(nRow, nCol).Value, 1)) Then 'Format for numbers
                        cLine = cLine & Cells(nRow, nCol).Value
                    Else 'Format for text, including apostrophes
                        cLine = cLine & "'" & Replace(Cells(nRow, nCol).Value, "'", "''") & "'"
                    End If
                    nCol = nCol + 1
                End If
            Wend
        End If
    Wend
    Close #1
End Function
Option 1:
1- Have you tried IBExpert? IBExpert \ Tools \ Import Data (Trial or Customer Version).
Option 2:
2- Upload your CSV file to a temporary table with F_BLOBLOAD.
3- Create a stored procedure which uses 3 functions (f_stringlength, f_strcopy, f_MID) to walk through each string, pulling out your fields to build your INSERT INTO.
links:
2: http://freeadhocudf.org/documentation_english/dok_eng_file.html
3: http://freeadhocudf.org/documentation_english/dok_eng_string.html
A tool I recently tried that worked outstandingly well is FSQL.
You write an IMPORT command, paste it into FSQL and it imports the CSV file into the Firebird table.
You can use the shell:
sed "s/,/','/g" file.csv > tmp
sed "s/$/'),(/g" tmp > tmp2
sed "s/^./'&/g" tmp2 > insert.sql
and then add
INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES(
...
);