ActiveRecord: List columns in table from console - sql

I know that you can ask ActiveRecord to list tables in console using:
ActiveRecord::Base.connection.tables
Is there a command that would list the columns in a given table?

This will list the column names of a table:
Model.column_names
e.g. User.column_names

This gets the column objects, not just the column names, and uses ActiveRecord::Base.connection, so no models are necessary. Handy for quickly outputting the structure of a db.
ActiveRecord::Base.connection.tables.each do |table_name|
  puts table_name
  ActiveRecord::Base.connection.columns(table_name).each do |c|
    puts "- #{c.name}: #{c.type} #{c.limit}"
  end
end
Sample output: http://screencast.com/t/EsNlvJEqM
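For reference, the output looks roughly like this (table and column names here are purely illustrative; limit is blank where the adapter doesn't set one):
users
- id: integer
- email: string 255
- created_at: datetime
posts
- id: integer
- user_id: integer
- title: string 255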

Using Rails 3 you can just type the model name:
> User
gives:
User(id: integer, name: string, email: string, etc...)
In Rails 4, you need to establish a connection first:
irb(main):001:0> User
=> User (call 'User.connection' to establish a connection)
irb(main):002:0> User.connection; nil # the trailing nil stops the REPL printing the (long) connection object
=> nil
irb(main):003:0> User
User(id: integer, name: string, email: string, etc...)

If you are comfortable with SQL commands, you can enter your app's folder and run rails db, which is short for rails dbconsole. It drops you into the shell of your database, whether it is SQLite or MySQL.
Then you can query the table columns with a SQL command like:
pragma table_info(your_table);
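Note that pragma table_info is SQLite-specific; the MySQL equivalents would be:
SHOW COLUMNS FROM your_table;
-- or its DESCRIBE shorthand
DESCRIBE your_table;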

Complementing this useful information, here are examples using rails console or rails dbconsole:
Student is my model; using rails console:
$ rails console
> Student.column_names
=> ["id", "name", "surname", "created_at", "updated_at"]
> Student
=> Student(id: integer, name: string, surname: string, created_at: datetime, updated_at: datetime)
Another option, using SQLite through Rails:
$ rails dbconsole
sqlite> .help
sqlite> .table
ar_internal_metadata relatives schools
relationships schema_migrations students
sqlite> .schema students
CREATE TABLE "students" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar, "surname" varchar, "created_at" datetime NOT NULL, "updated_at" datetime NOT NULL);
Finally, for more information:
sqlite> .help
Hope this helps!

You can run rails dbconsole in your command-line tool to open the SQLite console. Then type .tables to list all the tables and .fullschema to get a list of all tables with column names and types.

To list the columns in a table I usually go with this:
Model.column_names.sort.
e.g. Orders.column_names.sort
Sorting the column names makes it easy to find what you are looking for.
For more information on each of the columns use this:
Model.columns.map{|column| [column.name, column.sql_type]}.to_h.
This will provide a nice hash, for example:
{
  "id" => "int(4)",
  "created_at" => "datetime"
}

For a more compact format and less typing, just:
Portfolio.column_types
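On newer Rails versions, where column_types may no longer be available, a similarly compact view can be had from columns_hash; a small sketch (the Portfolio model and the example output are illustrative):
Portfolio.columns_hash.transform_values(&:sql_type)
# => {"id"=>"bigint", "name"=>"character varying", "created_at"=>"timestamp without time zone"}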

I am using Rails 6.1 and have built a simple rake task for this.
You can invoke it from the CLI using rails db:list[users] if you want a simple output with field names, or rails db:list[users,1] if you want all the details.
I based this on the question How to pass command line arguments to a rake task, and built on #aaron-henderson's answer above.
# run like `rails db:list[users]`, `rails db:list[users,1]`, `RAILS_ENV=development rails db:list[users]` etc
namespace :db do
  desc "list fields/details on a model"
  task :list, [:model, :details] => [:environment] do |task, args|
    model = args[:model]
    if !args[:details].present?
      model.camelize.constantize.column_names.each do |column_name|
        puts column_name
      end
    else
      ActiveRecord::Base.connection.tables.each do |table_name|
        next if table_name != model.underscore.pluralize
        ActiveRecord::Base.connection.columns(table_name).each do |c|
          puts "Name: #{c.name} | Type: #{c.type} | Default: #{c.default} | Limit: #{c.limit} | Precision: #{c.precision} | Scale: #{c.scale} | Nullable: #{c.null} "
        end
      end
    end
  end
end

Related

SQL statement to convert jsonb hash to json string

I have a Rails data migration (Postgres DB) where I have to use pure SQL to convert the data due to some model restrictions. The data is stored as JSON in a string column, but I need it to be a usable hash for other purposes.
My migration works to convert it to the hash. However, my down method ends up just deleting the data or leaving it as an empty {}. To clear up any confusion: my column is actually named data, in the Games table.
Based on my up method, how would I properly reverse the migration using SQL only?
class ConvertGamesDataToJson < ActiveRecord::Migration[6.0]
  def up
    statement = <<~SQL
      update games set data = regexp_replace(trim(both '"' from data::text), '\\\\"', '"', 'g')::jsonb;
    SQL
    ActiveRecord::Base.connection.execute(statement)
    # this part works!
  end

  def down
    statement = <<~SQL
      update games set data = to_json(data::text)::jsonb;
    SQL
    ActiveRecord::Base.connection.execute(statement)
  end
end
Here is how it looks after properly converting it:
data: {
"id"=>"d092a-f2323",
"recent"=>'yes',
"note"=>"some text",
"order"=>1
}
And here is how it is before the migration, and what it needs to roll back to:
data:
"{
\"id\":\"d092a-f2323\",
\"recent\":\"yes\",
\"note\":\"some text\",
\"order\":1,
}"
If you're displaying a data structure in the rails console, those \" aren't really there. They're just formatting because the console has wrapped the string in ". For example...
[2] pry(main)> %{"up": "down"}
=> "\"up\": \"down\""
But if we print it...
[3] pry(main)> puts %{"up": "down"}
"up": "down"
Given that is a JSON string, you can simply change the type of the column to jsonb and be done with it.
-- up
alter table games alter column data type jsonb USING data::jsonb;
-- down
alter table games alter column data type text;
Postgres doesn't know how to automatically cast text to jsonb, so we need to tell it: using data::jsonb does a simple cast of the text to jsonb. Going the other way it can cast jsonb to text just fine, so the down direction needs no USING clause.
You can do this in a migration with change_column (shown here against the games table and data column from the question):
def up
  change_column :games, :data, :jsonb, using: 'data::jsonb'
end

def down
  change_column :games, :data, :text
end

How to INSERT a reference to UUID from another table in PostgreSQL?

I'm learning to use Sequelize with a PostgreSQL database. All of the following is happening in a dev environment. It came up while manually trying to insert data into my tables to check whether things are set up correctly through Sequelize, to check on failing unit tests, etc.
I've made two tables with Sequelize models: User and Publication. Both tables generate UUIDv4 ids. I've associated User hasMany Publications, and Publication belongsTo User (you may reference the extra info below).
On my psql shell, I've inserted the following record to my User table (rest of the data cut out for brevity):
| id                                   | firstName | lastName | .. |
|--------------------------------------|-----------|----------|----|
| 8c878e6f-ee13-4a37-a208-7510c2638944 | Aiz       | ....     | ...|
Now I'm trying to insert a record into my Publication table while referencing my newly created user above. Here's what I entered into the shell:
INSERT INTO "Publications"("title", "fileLocation", ..., "userId")VALUES('How to Pasta', 'www.pasta.com', ..., 8c878e6f-ee13-4a37-a208-7510c2638944);
It fails and I receive the following error:
ERROR: syntax error at or near "c878e6f"
LINE 1: ...8c878e6f-ee...
(it points to the second character on the terminal in LINE 1 reference - the 'c').
What's wrong here? Are we supposed to enter UUIDs another way if we want to do it manually in psql? Do we paste the referenced UUID as a string? Is there a correct way I'm missing from my own research?
Some extra info if it helps:
From my models:
Publication.associate = function(models) {
  // associations can be defined here
  Publication.belongsTo(models.User, {
    foreignKey: "userId"
  });
};
and
User.associate = function(models) {
  // associations can be defined here
  User.hasMany(models.Publication, {
    foreignKey: "userId",
    as: "publications"
  });
};
Here's how I've defined userId in Publication:
userId: {
  type: DataTypes.UUID,
  references: {
    model: "User",
    key: "id",
    as: "userId"
  }
}
If it's worth anything, my (primaryKey) id on both models are type: DataTypes.UUID, defaultValue: DataTypes.UUIDV4 (I don't know if this is an issue).
Surround your UUID with apostrophes (write it as a string) and Postgres will convert it to a uuid.
Starting and ending the string with {} is optional.
E.g.
INSERT INTO "Publications"("title", "fileLocation", ..., "userId")VALUES('How to Pasta', 'www.pasta.com', ..., '8c878e6f-ee13-4a37-a208-7510c2638944');
Or
INSERT INTO "Publications"("title", "fileLocation", ..., "userId")VALUES('How to Pasta', 'www.pasta.com', ..., '{8c878e6f-ee13-4a37-a208-7510c2638944}');
Source (I don't do pgsql much, so I cast around for another person who wrote some working pgsql. If this doesn't work out for you, let me know and I'll remove the answer): PostgreSQL 9.3: How to insert upper case UUID into table

SQL injections in Rails 4 issue

I'm trying to learn about SQL injections and have tried to implement these, but when I put this code in my controller:
params[:username] = "johndoe') OR admin = 't' --"
#user_query = User.find(:first, :conditions => "username = '#{params[:username]}'")
I get the following error:
Couldn't find all Users with 'id': (first, {:conditions=>"username = 'johndoe') OR admin = 't' --'"}) (found 0 results, but was looking for 2)
I have created a User Model with the username "johndoe", but I am still getting no proper response. BTW I am using Rails 4.
You're using an ancient Rails syntax. Don't use
find(:first, :conditions => <condition>) ...
Instead use
User.where(<condition>).first
find accepts a list of IDs to look up records for. You're giving it an ID of :first and an ID of the conditions hash, which aren't going to match any records.
User.where(attr1: value, attr2: value2)
or for single items
User.find_by(attr1: value, attr2: value)
Bear in mind that while doing all this, it is valuable to check what the actual SQL statement is by appending .to_sql to the query (from what I remember, find_by just adds LIMIT 1).
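To see the escaping concretely, here is a quick console sketch using the payload from the question (the exact quoting in the generated SQL varies by adapter, so the output shown is only indicative):
malicious = "johndoe') OR admin = 't' --"
# Safe: the value is bound by ActiveRecord, so the embedded quote is escaped
User.where(username: malicious).to_sql
# => SELECT "users".* FROM "users" WHERE "users"."username" = 'johndoe'') OR admin = ''t'' --'
# Unsafe: string interpolation injects the raw value straight into the SQL
User.where("username = '#{malicious}'").to_sql
# => SELECT "users".* FROM "users" WHERE (username = 'johndoe') OR admin = 't' --')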

Ruby statement with embedded SQL INSERT - Syntax error

I have the following Ruby script: it creates a database, reads a CSV file, and inserts each row into the database.
require "sqlite3"
require "csv"
require "pp"
begin
db = SQLite::Database.new("myDB.db")
db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE")
myData = {}
CSV.foreach("test_file.csv", :headers=>true, :header_converters => :symbol, :converters => :all)
do |row|
row.to_hash.each do |key, value|
mydata[key.to_sym] = value
end
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
myDATA[:stations], myData[:dayparts], myData[:age], myData[:rtg], mydata[:dlyrch000])
end
rescue SQLite3::Exception => e
puts "Exception occured"
puts e
ensure
db.close if db
end
When I run this script with static data, that is, when this line:
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
myDATA[:stations], myData[:dayparts], myData[:age], myData[:rtg], mydata[:dlyrch000])
is replaced by this:
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
test, test, 33, .8989, .23434)
A database is created with this data.
But when I try this script as above it throws an exception:
syntax error, unexpected ",", expecting ')' ... rtg, reach) VALUES (?,?,?,?,?)",
---> myData [:stations], myData[....
etc.
I have tried different options but cannot seem to get around this. Can someone please help me with this?
There are three syntax errors I can see.
The CREATE TABLE SQL statement has a missing closing parenthesis before the closing quote. It should look like
db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE)")
The block for CSV.foreach has to start on the same line as the closing parenthesis for the method call, like this
CSV.foreach("test_file.csv", :headers => true, :header_converters => :symbol, :converters => :all) do |row|
...
end
The database constructor uses SQLite when it should be SQLite3. Like this
db = SQLite3::Database.new("myDB.db")
I can't see anything wrong with the part of your code that is raising an error, but I presume you aren't showing the current version of your program as it is a long way from getting as far as that.
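For reference, here is a corrected sketch of the whole script with those three fixes applied. It also normalizes the hash name to myData (the original mixes mydata, myData and myDATA, which Ruby treats as different identifiers) and inserts into MYTABLE rather than myDB (MYTABLE is the table; myDB.db is the database file):
require "sqlite3"
require "csv"
begin
  db = SQLite3::Database.new("myDB.db")
  db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
    stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE)")
  myData = {}
  # the block must start on the same line as the closing parenthesis
  CSV.foreach("test_file.csv", :headers => true, :header_converters => :symbol, :converters => :all) do |row|
    row.to_hash.each do |key, value|
      myData[key.to_sym] = value
    end
    db.execute("INSERT INTO MYTABLE(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
      myData[:stations], myData[:dayparts], myData[:age], myData[:rtg], myData[:dlyrch000])
  end
rescue SQLite3::Exception => e
  puts "Exception occurred"
  puts e
ensure
  db.close if db
end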

Rails3: SQL execution with hash substitution like .where()

With a simple model like this
class Model < ActiveRecord::Base
# ...
end
we can do queries like this:
Model.where(["name = :name and updated_at >= :D", \
{ :D => (Date.today - 1.day).to_datetime, :name => "O'Connor" }])
Where the values in the hash will be substituted into the final SQL statement with proper escaping depending on the underlying database engine.
I would like to know of a similar feature for raw SQL execution, like:
ActiveRecord::Base.connection.execute( \
["update models set name = :name, hired_at = :D where id = :id;"], \
{ :id => 73465, :D => DateTime.now, :name => "O'My God" }] \
) # THIS CODE IS A FANTASY. NOT WORKING.
(Please do not solve the example with loading a Model object, modifying and then saving! The example is only an illustration for the feature I would like to have / know. Concentrate on the subject!)
The original problem is that I want to insert a large amount of data (many thousands of rows) into the database. I want to use some features of the SQL abstraction of the ActiveRecord framework, but I don't want to use model objects based on ActiveRecord::Base because they are damn slow! (8 queries per second for my current problem.)
query = ActiveRecord::Base.connection.raw_connection.prepare("INSERT INTO users (name) VALUES(:name)")
query.execute(:name => 'test_name')
query.close
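Note that the placeholder style depends on the driver behind raw_connection; the :name form above matches SQLite-style named parameters. As a rough sketch of the equivalent with the pg driver (the statement name insert_user is arbitrary):
raw = ActiveRecord::Base.connection.raw_connection
raw.prepare("insert_user", "INSERT INTO users (name) VALUES ($1)")  # prepare once under a chosen name
raw.exec_prepared("insert_user", ["test_name"])                     # then execute it with positional values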
Extending the #peufeu solution with a concrete code example for bulk insert:
users_places = []
users_values = []
timestamp = Time.now.strftime('%Y-%m-%d %H:%M:%S')
params[:users].each do |user|
  users_places << "(?,?,?,?)"
  users_values << user[:name] << user[:punch_line] << timestamp << timestamp
end
bulk_insert_users_sql_arr = ["INSERT INTO users (name, punch_line, created_at, updated_at) VALUES #{users_places.join(", ")}"] + users_values
begin
  sql = ActiveRecord::Base.send(:sanitize_sql_array, bulk_insert_users_sql_arr)
  ActiveRecord::Base.connection.execute(sql)
rescue
  "something went wrong with the bulk insert sql query"
end
Here is the reference to the sanitize_sql_array method in ActiveRecord::Base; it generates the proper query string by escaping the single quotes in the strings. For example, the punch_line "Don't let them get you down" will become "Don\'t let them get you down".
Yes, you could do raw SQL, but check out the ar-extensions gem, which helps with batch inserts:
https://github.com/zdennis/ar-extensions
Here's a post on it, and various other techniques:
http://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
For INSERTs, batching them using a long VALUES clause (as shown by Simon's link) is the fastest way (unless you want to generate a text file and load it in your database with MySQL's LOAD DATA INFILE). But you have to be very careful about escaping your text values (which is not done in the example).
I was asking "what database are you using" because it does matter for mass UPDATEs.
For instance, you can do this on Postgres (and I believe SQL Server has an equivalent construct, with the VALUES columns named colX instead of columnX):
UPDATE foo
SET bar = v.column2
FROM (VALUES (1,2),(3,4), ... long list) AS v
WHERE foo.id = v.column1
And you can update a load of rows using a single statement, very fast.
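Tying that back to the Rails side of the question, here is a sketch of building such a statement with sanitize_sql_array (the table foo, its columns id and bar, and the sample rows are illustrative):
# pairs of [id, new value for bar], applied in one statement
rows = [[1, "first"], [2, "second"], [3, "third"]]
placeholders = rows.map { "(?, ?)" }.join(", ")
sql = ActiveRecord::Base.send(:sanitize_sql_array,
  ["UPDATE foo SET bar = v.column2 FROM (VALUES #{placeholders}) AS v WHERE foo.id = v.column1",
   *rows.flatten])
ActiveRecord::Base.connection.execute(sql)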
If you don't need Ruby to perform some Ruby-specific magic on your data, the fastest way to transfer data from one DB to a different one is to export as a text file (CSV or tab separated), load it on the other DB (LOAD DATA INFILE on MySQL), perhaps in a temporary table, and bulk process using SQL.
EDIT: Here's how I do this in Python:
sql = ["INSERT INTO foo (column list) VALUES "]
values = []
for row in tuple_list:              # each row is a tuple like (a1, b1, c1, d1)
    sql.append("(?,?,?,?)")
    values.extend(row)
query = sql[0] + ",".join(sql[1:])  # join the placeholders after the INSERT prefix
Joining sql into a string this way gives you "INSERT INTO foo (column list) VALUES (?,?,?,?),(?,?,?,?),(?,?,?,?)" with the "(?,?,?,?)" repeated as many times as you have lines to insert.
Then "values" contains a list of (a1,b1,c1,d1,a2,b2,c2,d2,a3,b3,c3,d3) with an,bn,cn,dn being the tuples you want to insert for line n. Each one corresponds to a placeholder in the sql string.
Then pass this to the usual "execute query with parameters" function which will handle quoting and escaping as usual.
I encountered a similar issue recently when trying to insert 100K+ records into a MySQL database for a Rails 4 app using the mysql2 gem. The data included characters that had to be sanitized prior to insert.
The solution I ended going with was a slightly modified version of Option 3 described at https://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
Here's the relevant code block from the above link:
TIMES = 10000
inserts = []
TIMES.times do
  inserts.push "(3.0, '2009-01-23 20:21:13', 2, 1)"
end
sql = "INSERT INTO user_node_scores (`score`, `updated_at`, `node_id`, `user_id`) VALUES #{inserts.join(", ")}"
The modification I made was using the public method ActiveRecord::Base.sanitize() on values that required it.
inserts = []
created = Time.now.strftime "%Y-%m-%d %H:%M:%S"
params[:audits].each do |audit|
  inserts.push "(#{audit.user_id}, '#{created}', " + ActiveRecord::Base.sanitize(audit.comment) + ", #{audit.status})"
end
sql = "INSERT INTO user_audits (`user_id`, `created_at`, `comment`, `status`) VALUES #{inserts.join(", ")}"