How to insert custom value after validation in rails model - ruby-on-rails-3

This has been really difficult to find information on. The crux of it all is that I've got a Rails 3.2 app that accesses a MySQL database table with a column of type POINT. Without third-party code, Rails doesn't know how to interpret this type, which is fine because I only use it in internal DB queries.
The problem, however, is that the value gets cast to an integer and forced to NULL when blank. MySQL doesn't allow NULL for this field because there's an index on it, and an integer isn't valid spatial data, so this effectively means that I can't create new records through Rails.
I've been searching for a way to change the value just before insertion into the db, but I'm just not up enough on my rails lit to pull it off. So far I've tried the following:
...
after_validation :set_geopoint_blank

def set_geopoint_blank
  raw_write_attribute(:geopoint, '') if geopoint.blank?
  # this results in a NULL value in the INSERT statement
end
---------------------------
# thing_controller.rb
...
def create
  @thing = Thing.new
  @thing.geopoint = 'GeomFromText("POINT(' + lat + ' ' + lng + ')")'
  @thing.save
  # This also results in NULL and an error
end
---------------------------
# thing_controller.rb
...
def create
  @thing = Thing.new
  @thing.geopoint = '1'
  @thing.save
  # This results in `1` being inserted, but fails because that's invalid spatial data.
end
To me, the ideal would be to be able to force Rails to put the string 'GeomFromText(...)' into the INSERT statement that it generates, but I don't know how to do that.
Awaiting the thoughts and opinions of the all-knowing community....

OK, I ended up using the first link in Steve Klein's comment to just insert raw SQL. Here's what my code looks like in the end:
def create
  # Create a Thing instance and assign it the POSTed values
  @thing = Thing.new
  @thing.assign_attributes(params[:thing], :as => :admin)
  # Check to see if all the passed values are valid
  if @thing.valid?
    # If so, start a DB transaction
    ActiveRecord::Base.transaction do
      # Insert the minimum data, plus the geopoint
      sql = 'INSERT INTO `things`
               (`thing_name`,`thing_location`,`geopoint`)
             VALUES (
               "tmp_insert",
               "tmp_location",
               GeomFromText("POINT(' + params[:thing][:lat].to_f.to_s + ' ' + params[:thing][:lng].to_f.to_s + ')")
             )'
      id = ActiveRecord::Base.connection.insert(sql)
      # Then load the newly created Thing instance and update its values with the passed values
      @real_thing = Thing.find(id)
      @real_thing.update_attributes(params[:thing], :as => :admin)
    end
    # Notify the user of success
    flash[:message] = { :header => 'Thing successfully created!' }
    redirect_to edit_admin_thing_path(@real_thing)
  else
    # If the passed values aren't valid, alert and re-render the form
    flash[:error] = { :header => 'Oops! You\'ve got some errors:', :body => @thing.errors.full_messages.join("</p><p>").html_safe }
    render 'admin/things/new'
  end
end
Not beautiful, but it works.
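As an aside, the same statement could be built through Rails' own quoting rather than manual string concatenation. Here's a rough sketch (not the code I'm actually running), assuming the same things table and lat/lng params:
lat = params[:thing][:lat].to_f
lng = params[:thing][:lng].to_f
# sanitize_sql_array is protected in Rails 3.2, hence the .send
sql = ActiveRecord::Base.send(:sanitize_sql_array,
        ["INSERT INTO `things` (`thing_name`, `thing_location`, `geopoint`)
          VALUES ('tmp_insert', 'tmp_location', GeomFromText(?))",
         "POINT(#{lat} #{lng})"])
id = ActiveRecord::Base.connection.insert(sql)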

Related

SQL statement to convert jsonb hash to json string

I have a Rails data migration (Postgres DB) where I have to use pure SQL to convert the data due to some model restrictions. The data is stored as JSON in a string column, but I need it to be a usable hash for other purposes.
My migration works to convert it to the hash. However, my down method ends up just deleting the data or leaving it as an empty {}. By the way, to clear up any confusion, the column is actually named data in the Games table.
Based on my up method, how would I properly reverse the migration using SQL only?
class ConvertGamesDataToJson < ActiveRecord::Migration[6.0]
  def up
    statement = <<~SQL
      update games set data = regexp_replace(trim(both '"' from data::text), '\\\\"', '"', 'g')::jsonb;
    SQL
    ActiveRecord::Base.connection.execute(statement)
    # this part works!
  end

  def down
    statement = <<~SQL
      update games set data = to_json(data::text)::jsonb;
    SQL
    ActiveRecord::Base.connection.execute(statement)
  end
end
Here is how it looks after properly converting it:
data: {
"id"=>"d092a-f2323",
"recent"=>'yes',
"note"=>"some text",
"order"=>1
}
How it is before the migration, and what it needs to roll back to:
data:
"{
\"id\":\"d092a-f2323\",
\"recent\":\"yes\",
\"note\":\"some text\",
\"order\":1,
}"
If you're displaying a data structure in the Rails console, those \" aren't really there. They're just formatting because the console has wrapped the string in double quotes. For example...
[2] pry(main)> %{"up": "down"}
=> "\"up\": \"down\""
But if we print it...
[3] pry(main)> puts %{"up": "down"}
"up": "down"
Given that it is a JSON string, you can simply change the type of the column to jsonb and be done with it.
-- up
alter table games alter column data type jsonb USING data::jsonb;
-- down
alter table games alter column data type text;
Postgres doesn't know how to automatically cast text to jsonb, so we need to tell it: using data::jsonb does a simple cast of the text to jsonb. It can cast jsonb back to text just fine on its own, which is why the down direction needs no USING clause.
You can do this in a migration with change_column.
def up
  change_column :games, :data, :jsonb, using: 'data::jsonb'
end

def down
  change_column :games, :data, :text
end

Rails with Devise: Set Unique Username from First Name

In my app I'm trying to automatically populate a username column from the first_name field that I already have tied in to my Devise login system. In theory, it should just be the user's first_name if they are the only one with that name, but it should be something like "Mal the 4th" or "Jayne the 3rd" if there are other users with that first name already.
So far, in googling and consulting other SO posts (like this one) I have this basic structure in my registrations_controller:
before_create :set_username

private

def set_username
  @users = User.where(first_name: self.first_name)
  same_first_name_array = []
  @users.each do |u|
    same_first_name_array << u.first_name
  end
  if same_first_name_array.size == 0
    self.username = first_name
  else
    self.username = first_name + " the " + (same_first_name_array.size + 1).ordinalize
  end
end
But I'm struggling to fill in the blanks. So far it looks like the best way to do it is an if statement that checks whether that first_name is unique and, if it's not, ordinalizes some kind of count, but please let me know if there's a better or "more Ruby" way. Any help getting this to work would be appreciated!
I think your way is correct; however, it can definitely be optimized. You can avoid the step in which you create an array and populate it with all the matching names.
Also, see if you can create an index on the first_name column; this will optimize the query (a migration sketch follows the rewrite below).
self.username already points to the name the user has entered; you only need to change it if there are multiple occurrences of that first name, so there's no need for an if/else.
You can rewrite it like this:
before_create :set_username

private

def set_username
  @users = User.where(first_name: self.first_name)
  self.username = @users.count.eql?(0) ? self.first_name : first_name + " the " + (@users.count + 1).ordinalize
end
Note: generating usernames like this will create performance issues when your application scales and has a large number of users.
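For the index suggested above, a minimal migration sketch (table and column names assume the Devise users setup from the question, written in the pre-Rails-5 migration style):
class AddIndexToUsersFirstName < ActiveRecord::Migration
  def change
    # Speeds up the User.where(first_name: ...) lookup in the callback
    add_index :users, :first_name
  end
end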

SQL injections in Rails 4 issue

I'm trying to learn about SQL injection and have tried to implement one, but when I put this code in my controller:
params[:username] = "johndoe') OR admin = 't' --"
@user_query = User.find(:first, :conditions => "username = '#{params[:username]}'")
I get the following error:
Couldn't find all Users with 'id': (first, {:conditions=>"username = 'johndoe') OR admin = 't' --'"}) (found 0 results, but was looking for 2)
I have created a User Model with the username "johndoe", but I am still getting no proper response. BTW I am using Rails 4.
You're using an ancient Rails syntax. Don't use
find(:first, :conditions => <condition>) ...
Instead use
User.where(<condition>).first
find accepts a list of IDs to look up records for. You're giving it an ID of :first and an ID of {:conditions => ...}, which aren't going to match any records.
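For the injection experiment itself, here's a sketch of what it might look like with the non-deprecated syntax (the payload drops the stray closing parenthesis, since the condition here is a plain username = '...'):
# Deliberately vulnerable: raw interpolation into the SQL fragment
params[:username] = "johndoe' OR admin = 't' --"
@user_query = User.where("username = '#{params[:username]}'").first
# roughly: SELECT `users`.* FROM `users` WHERE (username = 'johndoe' OR admin = 't' --') LIMIT 1

# The parameterized forms escape the input instead of executing it
@user_query = User.where("username = ?", params[:username]).first
@user_query = User.where(username: params[:username]).first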
User.where(attr1: value, attr2: value2)
or for single items
User.find_by(attr1: value, attr2: value)
Bear in mind that while doing all this, it is valuable to check what the actual SQL statement is by adding .to_sql to the end of the query method (from what I remember, find_by just adds LIMIT 1).
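For example, a quick way to eyeball the generated statement without running it:
puts User.where(username: params[:username]).to_sql
# find_by(username: ...) builds the same query with LIMIT 1 and returns the first match or nil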

Ruby statement with embedded SQL INSERT - Syntax error

I have the following Ruby script: it creates a database, reads a CSV file, and inserts each row into the database.
require "sqlite3"
require "csv"
require "pp"
begin
db = SQLite::Database.new("myDB.db")
db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE")
myData = {}
CSV.foreach("test_file.csv", :headers=>true, :header_converters => :symbol, :converters => :all)
do |row|
row.to_hash.each do |key, value|
mydata[key.to_sym] = value
end
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
myDATA[:stations], myData[:dayparts], myData[:age], myData[:rtg], mydata[:dlyrch000])
end
rescue SQLite3::Exception => e
puts "Exception occured"
puts e
ensure
db.close if db
end
When I run this script with static data, that is, when this line:
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
myDATA[:stations], myData[:dayparts], myData[:age], myData[:rtg], mydata[:dlyrch000])
is replaced by this:
db.execute("INSERT INTO myDB(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
test, test, 33, .8989, .23434)
A database is created with this data.
But when I try this script as above it throws an exception:
syntax error, unexpected ",", expecting ')' ... rtg, reach) VALUES (?,?,?,?,?)",
---> myData [:stations], myData[....
etc.
I have tried different options but cannot seem to get around this. Can someone please help me with this?
There are three syntax errors I can see.
The CREATE TABLE SQL statement has a missing close parenthesis before the closing quote. It should look like this:
db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE)")
The block for CSV.foreach has to start on the same line as the closing parenthesis of the method call, like this:
CSV.foreach("test_file.csv", :headers => true, :header_converters => :symbol, :converters => :all) do |row|
  ...
end
The database constructor uses SQLite when it should be SQLite3. Like this
db = SQLite3::Database.new("myDB.db")
I can't see anything wrong with the part of your code that is raising the error, but I presume you aren't showing the current version of your program, as it is a long way from getting that far.
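For reference, here's a sketch of the script with those three fixes applied (I've also normalized the inconsistent myData/mydata/myDATA capitalization and pointed the INSERT at MYTABLE, which I assume is the intended table):
require "sqlite3"
require "csv"

begin
  db = SQLite3::Database.new("myDB.db")
  db.execute("CREATE TABLE IF NOT EXISTS MYTABLE(Id INTEGER PRIMARY KEY AUTOINCREMENT,
              stations TEXT, dayparts TEXT, age TEXT, rtg DOUBLE, reach DOUBLE)")
  CSV.foreach("test_file.csv", :headers => true, :header_converters => :symbol, :converters => :all) do |row|
    myData = row.to_hash  # symbol keys thanks to :header_converters => :symbol
    db.execute("INSERT INTO MYTABLE(stations, dayparts, age, rtg, reach) VALUES(?,?,?,?,?)",
               [myData[:stations], myData[:dayparts], myData[:age], myData[:rtg], myData[:dlyrch000]])
  end
rescue SQLite3::Exception => e
  puts "Exception occurred"
  puts e
ensure
  db.close if db
end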

Rails3: SQL execution with hash substitution like .where()

With a simple model like this
class Model < ActiveRecord::Base
  # ...
end
we can do queries like this:
Model.where(["name = :name and updated_at >= :D", \
{ :D => (Date.today - 1.day).to_datetime, :name => "O'Connor" }])
Where the values in the hash will be substituted into the final SQL statement with proper escaping depending on the underlying database engine.
I would like to know of a similar feature for raw SQL execution, like:
ActiveRecord::Base.connection.execute(
  ["update models set name = :name, hired_at = :D where id = :id",
   { :id => 73465, :D => DateTime.now, :name => "O'My God" }]
) # THIS CODE IS A FANTASY. NOT WORKING.
(Please do not solve the example with loading a Model object, modifying and then saving! The example is only an illustration for the feature I would like to have / know. Concentrate on the subject!)
The original problem is that I want to insert a large amount of data (many thousands of rows) into the database. I want to use some features of the SQL abstraction of the ActiveRecord framework, but I don't want to use model objects based on ActiveRecord::Base because they are damn slow! (8 queries per second for my current problem.)
query = ActiveRecord::Base.connection.raw_connection.prepare("INSERT INTO users (name) VALUES(:name)")
query.execute(:name => 'test_name')
query.close
Extending @peufeu's solution with a concrete code example for bulk insert:
users_places = []
users_values = []
timestamp = Time.now.strftime('%Y-%m-%d %H:%M:%S')
params[:users].each do |user|
  users_places << "(?,?,?,?)"
  users_values << user[:name] << user[:punch_line] << timestamp << timestamp
end
bulk_insert_users_sql_arr = ["INSERT INTO users (name, punch_line, created_at, updated_at) VALUES #{users_places.join(", ")}"] + users_values
begin
  sql = ActiveRecord::Base.send(:sanitize_sql_array, bulk_insert_users_sql_arr)
  ActiveRecord::Base.connection.execute(sql)
rescue
  "something went wrong with the bulk insert sql query"
end
Here is the reference to the sanitize_sql_array method in ActiveRecord::Base; it generates the proper query string by escaping the single quotes in the strings. For example, the punch_line "Don't let them get you down" will become "Don\'t let them get you down".
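Incidentally, sanitize_sql_array also understands the named-placeholder form the question asks for, so the "fantasy" snippet from the question can be expressed roughly like this (a sketch; sanitize_sql_array is protected, hence the .send):
sql = ActiveRecord::Base.send(:sanitize_sql_array,
        ["UPDATE models SET name = :name, hired_at = :D WHERE id = :id",
         { :id => 73465, :D => DateTime.now, :name => "O'My God" }])
ActiveRecord::Base.connection.execute(sql)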
Yes, you could do raw SQL, but check out the ar-extensions gem that helps with batch inserts:
https://github.com/zdennis/ar-extensions
Here's a post on it, and various other techniques:
http://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
For INSERTs, batching them using a long VALUES clause (as shown by Simon's link) is the fastest way (unless you want to generate a text file and load it in your database with MySQL's LOAD DATA INFILE). But you have to be very careful about escaping your text values (which is not done in the example).
I was asking "what database are you using" because it does matter for mass UPDATEs.
For instance, you can do this on Postgres (and, I believe, SQL Server, changing "columnX" to "colX"):
UPDATE foo
SET bar = v.column2
FROM (VALUES (1,2),(3,4),... long list) v
WHERE foo.id = v.column1
And you can update a load of rows using a single statement, very fast.
If you don't need Ruby to perform some Ruby-specific magic on your data, the fastest way to transfer data from one DB to a different one is to export as a text file (CSV or tab separated), load it on the other DB (LOAD DATA INFILE on MySQL), perhaps in a temporary table, and bulk process using SQL.
EDIT: Here's how I do this in Python:
placeholders = []
values = []
for row in tuple_list:
    placeholders.append("(?,?,?,?)")
    values.extend(row)
sql = "INSERT INTO foo (column list) VALUES " + ",".join(placeholders)
Joining the placeholders gives you "INSERT INTO foo (column list) VALUES (?,?,?,?),(?,?,?,?),(?,?,?,?)", with "(?,?,?,?)" repeated as many times as you have rows to insert.
Then "values" contains a list of (a1,b1,c1,d1,a2,b2,c2,d2,a3,b3,c3,d3) with an,bn,cn,dn being the tuples you want to insert for line n. Each one corresponds to a placeholder in the sql string.
Then pass this to the usual "execute query with parameters" function which will handle quoting and escaping as usual.
I encountered a similar issue recently when trying to insert 100K+ records into a MySQL database for a Rails 4 app using the mysql2 gem. The data included characters that had to be sanitized prior to insert.
The solution I ended going with was a slightly modified version of Option 3 described at https://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
Here's the relevant code block from the above link:
TIMES = 10000
inserts = []
TIMES.times do
  inserts.push "(3.0, '2009-01-23 20:21:13', 2, 1)"
end
sql = "INSERT INTO user_node_scores (`score`, `updated_at`, `node_id`, `user_id`) VALUES #{inserts.join(", ")}"
The modification I made was using the public method ActiveRecord::Base.sanitize() on values that required it.
inserts = []
created = Time.now.strftime "%Y-%m-%d %H:%M:%S"
params[:audits].each do |audit|
  inserts.push "(#{audit.user_id}, '#{created}', " + ActiveRecord::Base.sanitize(audit.comment) + ", #{audit.status})"
end
sql = "INSERT INTO user_audits (`user_id`, `created_at`, `comment`, `status`) VALUES #{inserts.join(", ")}"