I'm having trouble using Ruby with DBI. I'm trying to do a select and put the results in an array, but with no luck.
require 'dbi'
db = DBI.connect('DBI:OCI8:database', XXXX, XXXX)
#Gets Consumer Id Number you want to create accounts for
numberOfAccounts = []
puts("Please enter a CID")
NewCID = gets.chomp()
numberOfAccounts << db.execute("select T_NBR from T_CBA where C_ID='#{NewCID}'").fetch
My array ends up like this:
[[#<BigDecimal:fc115f8,'0.80001692022E11',12(16)>]]
where I would like to have several different numbers like [222, 3232, 2323] etc.
I've searched online but to no avail.
DBI has probably determined that the underlying column can contain integers too large to fit in a regular int type, based on the data field. Or it may just use BigDecimal for all integer types to avoid worrying about it.
If you know that your values are all small enough to fit into a regular integer, you can convert the array to integers after you've populated it, like so:
1.9.3-p194 :014 > numberOfAccounts
=> [[#<BigDecimal:119cd90,'0.123E3',9(36)>], [#<BigDecimal:119cd18,'0.456E3',9(36)>]]
1.9.3-p194 :015 > numberOfAccounts.flatten!.collect!(&:to_i)
=> [123, 456]
1.9.3-p194 :016 > numberOfAccounts
=> [123, 456]
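For completeness, here's a hedged sketch of pulling the whole result set and converting it in one pass, assuming Ruby DBI's select_all and ? placeholder binding work here as documented; T_NBR, T_CBA and NewCID are taken from the question:
# select_all returns every matching row; each row is a one-element array here
rows = db.select_all("select T_NBR from T_CBA where C_ID = ?", NewCID)
# take the first (only) column of each row and convert the BigDecimal to an integer
numberOfAccounts = rows.map { |row| row[0].to_i }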
I have a PostgreSQL table with YAML data stored in a text field.
I'm attempting to find all instances of where a key has been changed from false to true.
audited_changes: {"hide_on_map"=>[false, true]}
I can easily find all instances of this key with a like query on the attribute hide_on_map
[3] pry(main)> like_query = ActiveRecord::Base.send(:sanitize_sql_like, 'hide_on_map')
Audited::Audit.where(auditable_type: 'Lot').where('audited_changes like ?', "%#{like_query}%").count
(245.8ms) SELECT COUNT(*) FROM "audits" WHERE "audits"."auditable_type" = $1 AND (audited_changes like '%hide\_on\_map%') [["auditable_type", "Lot"]]
=> 1710
However, adding double quotes breaks this
[4] pry(main)> like_query = ActiveRecord::Base.send(:sanitize_sql_like, '"hide_on_map"')
Audited::Audit.where(auditable_type: 'Lot').where('audited_changes like ?', "%#{like_query}%").count
(238.5ms) SELECT COUNT(*) FROM "audits" WHERE "audits"."auditable_type" = $1 AND (audited_changes like '%"hide\_on\_map"%') [["auditable_type", "Lot"]]
=> 0
Let alone the full query
[5] pry(main)> like_query = ActiveRecord::Base.send(:sanitize_sql_like, '"hide_on_map"=>[false, true]')
Audited::Audit.where(auditable_type: 'Lot').where('audited_changes like ?', "%#{like_query}%").count
(245.0ms) SELECT COUNT(*) FROM "audits" WHERE "audits"."auditable_type" = $1 AND (audited_changes like '%"hide\_on\_map"=>[false, true]%') [["auditable_type", "Lot"]]
=> 0
I started going down a rabbit hole of converting to JSONB, but that adds several complications I'd rather not have to solve. Any suggestions on a properly formed LIKE clause?
For those asking, two examples of this query directly in SQL at the psql prompt.
select count(*) from audits where audited_changes like '%"hide\_on\_map"%';
select count(*) from audits where audited_changes like '%\"hide\_on\_map\"%';
Both resulted in 0 results.
The problem is that there is not a unique way to express that data in YAML:
hide_on_map:
- no
- yes
"hide_on_map": [false, true]
are both valid YAML representations of your data.
I fear you cannot avoid using some native type, or at least a "compacted" JSON text (which would contain literally '"hide_on_map":[false,true]').
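If you want to stay on LIKE, one possible workaround is to match the block-style YAML form rather than the quoted inline form. This is only a sketch and assumes the stored text really contains "hide_on_map:" followed by "- false" and "- true" on their own lines with no extra indentation; check a sample row before relying on it:
# hypothetical fragment matching Psych's default block style for {"hide_on_map"=>[false, true]}
yaml_fragment = "hide_on_map:\n- false\n- true"
Audited::Audit.where(auditable_type: 'Lot')
              .where('audited_changes like ?', "%#{yaml_fragment}%")
              .count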
I'm trying to do native SQL in Doctrine. Basically I have 2 parameters:
CANDIDATE_ID - user for who we delete entries,
list of FILE_ID to keep
So I do this:
$this->getEntityManager()->getConnection()->executeUpdate(
    "DELETE FROM FILE WHERE CANDIDATE_ID = :ID AND NOT ID IN :KEEPID",
    array(
        "ID" => $candidate->id,
        "KEEPID" => array(2)
    )
);
But Doctrine fails:
Notice: Array to string conversion in D:\xampp\htdocs\azk\vendor\doctrine\dbal\lib\Doctrine\DBAL\Connection.php on line 786
Is this a bug in Doctrine? I'm doing a select with IN somewhere else, but with QueryBuilder, and it works. Maybe someone could suggest a better way of deleting entries, with QueryBuilder for example?
$stmt = $conn->executeQuery('SELECT * FROM articles WHERE id IN (?)',
array(array(1, 2, 3, 4, 5, 6)),
array(\Doctrine\DBAL\Connection::PARAM_INT_ARRAY)
);
From Doctrine's documentation.
You can't pass an array of IDs as a single ordinary parameter. That works for scalar values, but even if the array had a 'toString', it wouldn't produce what you want.
String concatenation is one method,
"DELETE FROM FILE WHERE CANDIDATE_ID = :ID AND NOT ID IN (". implode(",", $list_of_ids) .")"
But this method bypasses bound parameters entirely, so it suffers in terms of readability and safety, and it is limited to a maximum statement length, which varies between databases.
Another approach is to write a function returning a table result, which takes a string of IDs as a parameter.
You could also solve this with a join to a table containing the IDs to keep.
It's a problem I've seen many times with few good answers, but it's usually caused by a misunderstanding in the way the database is modelled. This is a 'code smell' for database access.
I have several hundred documents with this information:
=> User(id: integer, email: string, amount: string)
How can I sum the amount variable of all the documents?
I am using Rails + Mongoid.
thanks
Normally with an integer field, you could do this:
User.sum(:amount)
# => 142.0
However, since your amount field is a string, it'll just concatenate strings instead. For example, if you had 3 users with amounts of 12, 30, and 100, you would get "01230100" as a result. In this case, you may need to use something like Ruby's inject instead:
User.all.inject(0) { |sum, user| sum + user.amount.to_f }
# => 142.0
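An equivalent sketch using map plus sum (Array#sum needs Ruby 2.4+), assuming the amount strings are plain decimal numbers; BigDecimal can be swapped in for to_f if you care about exact money arithmetic:
total = User.all.map { |user| user.amount.to_f }.sum
# or, for exact decimal arithmetic (needs require 'bigdecimal'):
# total = User.all.map { |user| BigDecimal(user.amount) }.sum.to_f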
With a simple model like this
class Model < ActiveRecord::Base
# ...
end
we can do queries like this:
Model.where(["name = :name and updated_at >= :D", \
{ :D => (Date.today - 1.day).to_datetime, :name => "O'Connor" }])
The values in the hash will be substituted into the final SQL statement with proper escaping, depending on the underlying database engine.
I would like a similar feature for raw SQL execution, something like:
ActiveRecord::Base.connection.execute( \
  ["update models set name = :name, hired_at = :D where id = :id;", \
   { :id => 73465, :D => DateTime.now, :name => "O'My God" }] \
) # THIS CODE IS A FANTASY. NOT WORKING.
(Please do not solve the example with loading a Model object, modifying and then saving! The example is only an illustration for the feature I would like to have / know. Concentrate on the subject!)
The original problem is that I want to insert a large amount of data (many thousands of rows) into the database. I want to use some features of the SQL abstraction of the ActiveRecord framework, but I don't want to use model objects based on ActiveRecord::Base because they are damn slow! (8 queries per second for my current problem.)
query = ActiveRecord::Base.connection.raw_connection.prepare("INSERT INTO users (name) VALUES(:name)")
query.execute(:name => 'test_name')
query.close
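Note that the exact placeholder syntax depends on the database driver behind raw_connection. With the mysql2 gem, for example, prepared statements use positional ? placeholders; a hedged sketch under that assumption:
# assumes a mysql2-backed connection (mysql2 >= 0.4 supports prepared statements)
stmt = ActiveRecord::Base.connection.raw_connection.prepare("INSERT INTO users (name) VALUES (?)")
stmt.execute('test_name')
stmt.close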
Extending @peufeu's solution with a concrete code example for bulk insert:
users_places = []
users_values = []
timestamp = Time.now.strftime('%Y-%m-%d %H:%M:%S')
params[:users].each do |user|
  users_places << "(?,?,?,?)"
  users_values << user[:name] << user[:punch_line] << timestamp << timestamp
end
bulk_insert_users_sql_arr = ["INSERT INTO users (name, punch_line, created_at, updated_at) VALUES #{users_places.join(", ")}"] + users_values
begin
  sql = ActiveRecord::Base.send(:sanitize_sql_array, bulk_insert_users_sql_arr)
  ActiveRecord::Base.connection.execute(sql)
rescue => e
  # log the failure instead of silently swallowing it
  Rails.logger.error "something went wrong with the bulk insert sql query: #{e.message}"
end
For reference, the sanitize_sql_array method on ActiveRecord::Base generates the proper query string by escaping the single quotes in the strings. For example, the punch_line "Don't let them get you down" will become "Don\'t let them get you down".
Yes, you could do raw SQL, but check out the ar-extensions gem, which helps with batch inserts:
https://github.com/zdennis/ar-extensions
Here's a post on it, and various other techniques:
http://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
For INSERTs, batching them using a long VALUES clause (as shown by Simon's link) is the fastest way (unless you want to generate a text file and load it in your database with MySQL's LOAD DATA INFILE). But you have to be very careful about escaping your text values (which is not done in the example).
I was asking "what database are you using" because it does matter for mass UPDATEs.
For instance, you can do this on Postgres (and I believe SQL Server, changing "columnX" to "colX"):
UPDATE foo
SET bar = v.column2
FROM (VALUES (1,2),(3,4) /* ... long list */) AS v
WHERE foo.id = v.column1;
And you can update a load of rows using a single statement, very fast.
If you don't need Ruby to perform some Ruby-specific magic on your data, the fastest way to transfer data from one DB to a different one is to export as a text file (CSV or tab separated), load it on the other DB (LOAD DATA INFILE on MySQL), perhaps in a temporary table, and bulk process using SQL.
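For what it's worth, here's a hedged Ruby sketch of that text-file route for MySQL. It assumes a users table with a name column, a mysql2 connection configured with local_infile enabled, and /tmp/users.csv as a writable scratch path; all of those are assumptions, not part of the original answer:
require 'csv'

# dump one column per row into a temporary CSV file
CSV.open('/tmp/users.csv', 'w') do |csv|
  params[:users].each { |user| csv << [user[:name]] }
end

# bulk load it in a single statement (MySQL only)
ActiveRecord::Base.connection.execute(
  "LOAD DATA LOCAL INFILE '/tmp/users.csv' INTO TABLE users FIELDS TERMINATED BY ',' (name)"
)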
EDIT : Here's how I do this in Python :
sql = [ "INSERT INTO foo (column list) VALUES " ]
values = []
for tuple in tuple_list:
append "(?,?,?,?)" to sql
extend values list with tuple
Joining the placeholders gives you "INSERT INTO foo (column list) VALUES (?,?,?,?),(?,?,?,?),(?,?,?,?)", with "(?,?,?,?)" repeated as many times as you have rows to insert.
Then "values" contains a list of (a1,b1,c1,d1,a2,b2,c2,d2,a3,b3,c3,d3), with an,bn,cn,dn being the values of the tuple you want to insert for row n. Each one corresponds to a placeholder in the sql string.
Then pass this to the usual "execute query with parameters" function which will handle quoting and escaping as usual.
I encountered a similar issue recently when trying to insert 100K+ records into a MySQL database for a Rails 4 app using the mysql2 gem. The data included characters that had to be sanitized prior to insert.
The solution I ended up going with was a slightly modified version of Option 3 described at https://www.coffeepowered.net/2009/01/23/mass-inserting-data-in-rails-without-killing-your-performance/
Here's the relevant code block from the above link:
TIMES = 10000
inserts = []
TIMES.times do
  inserts.push "(3.0, '2009-01-23 20:21:13', 2, 1)"
end
sql = "INSERT INTO user_node_scores (`score`, `updated_at`, `node_id`, `user_id`) VALUES #{inserts.join(", ")}"
The modification I made was using the public method ActiveRecord::Base.sanitize() on values that required it.
inserts = []
created = Time.now.strftime "%Y-%m-%d %H:%M:%S"
params[:audits].each do |audit|
  inserts.push "(#{audit.user_id}, '#{created}', " + ActiveRecord::Base.sanitize(audit.comment) + ", #{audit.status})"
end
sql = "INSERT INTO user_audits (`user_id`, `created_at`, `comment`, `status`) VALUES #{inserts.join(", ")}"
Per section 2.2 of the Rails guide on the Active Record query interface, it seems that I can pass a string specifying the condition(s), then an array of values that should be substituted at some point while the arel is being built. So I've got a statement that generates my conditions string, which can be a varying number of attributes chained together with either AND or OR between them, and I pass an array as the second arg to the where method, and I get:
ActiveRecord::PreparedStatementInvalid: wrong number of bind variables (1 for 5)
which leads me to believe I'm doing this incorrectly. However, I'm not finding anything on how to do it correctly. To restate the problem another way: I need to pass the where method a string such as "table.attribute = ? AND table.attribute1 = ? OR table.attribute1 = ?", with an unknown number of these conditions ANDed or ORed together, and then pass something (what I thought would be an array) as the second argument to substitute the values into that conditions string. Is this the correct approach, or am I just missing some huge concept somewhere and coming at this all wrong? I'd think that somehow this has to be possible, short of just generating a raw SQL string.
This is actually pretty simple:
Model.where(attribute: [value1,value2])
Sounds like you're doing something like this:
Model.where("attribute = ? OR attribute2 = ?", [value, value])
Whereas you need to do this:
# notice the lack of an array as the last argument
Model.where("attribute = ? OR attribute2 = ?", value, value)
Have a look at http://guides.rubyonrails.org/active_record_querying.html#array-conditions for more details on how this works.
Instead of passing the same parameter multiple times to where() like this
User.where(
"first_name like ? or last_name like ? or city like ?",
"%#{search}%", "%#{search}%", "%#{search}%"
)
you can easily provide a hash
User.where(
"first_name like :search or last_name like :search or city like :search",
{search: "%#{search}%"}
)
that makes your query much more readable for long argument lists.
Sounds like you're doing something like this:
Model.where("attribute = ? OR attribute2 = ?", [value, value])
Whereas you need to do this:
# notice the lack of an array as the last argument
Model.where("attribute = ? OR attribute2 = ?", value, value)
Have a look at http://guides.rubyonrails.org/active_record_querying.html#array-conditions for more details on how this works.
That answer was really close. You can turn an array into a list of arguments with *my_list.
Model.where("id = ? OR id = ?", *["1", "2"])
OR
params = ["1", "2"]
Model.where("id = ? OR id = ?", *params)
Should work
If you want to chain together an open-ended list of conditions (attribute names and values), I would suggest using an arel table.
It's a bit hard to give specifics since your question is so vague, so I'll just explain how to do this for a simple case of a Post model and a few attributes, say title, summary, and user_id (i.e. a user has_many posts).
First, get the arel table for the model:
table = Post.arel_table
Then, start building your predicate (which you will eventually use to create an SQL query):
relation = table[:title].eq("Foo")
relation = relation.or(table[:summary].eq("A post about foo"))
relation = relation.and(table[:user_id].eq(5))
Here, table[:title], table[:summary] and table[:user_id] are representations of columns in the posts table. When you call table[:title].eq("Foo"), you are creating a predicate, roughly equivalent to a find condition (get all rows whose title column equals "Foo"). These predicates can be chained together with and and or.
When your aggregate predicate is ready, you can get the result with:
Post.where(relation)
which will generate the SQL:
SELECT "posts".* FROM "posts"
WHERE (("posts"."title" = "Foo" OR "posts"."summary" = "A post about foo")
AND "posts"."user_id" = 5)
This will get you all posts that have either the title "Foo" or the summary "A post about foo", and which belong to a user with id 5.
Notice the way arel predicates can be endlessly chained together to create more and more complex queries. This means that if you have (say) a hash of attribute/value pairs, and some way of knowing whether to use AND or OR on each of them, you can loop through them one by one and build up your condition:
relation = table[:title].eq("Foo")
hash.each do |attr, value|
  relation = relation.and(table[attr].eq(value))
  # or relation = relation.or(table[attr].eq(value)) for an OR predicate
end
Post.where(relation)
Aside from the ease of chaining conditions, another advantage of arel tables is that they are independent of database, so you don't have to worry whether your MySQL query will work in PostgreSQL, etc.
Here's a Railscast with more on arel: http://railscasts.com/episodes/215-advanced-queries-in-rails-3?view=asciicast
Hope that helps.
You can use a hash rather than a string. Build up a hash with however many conditions and corresponding values you are going to have and put it into the first argument of the where method.
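A minimal sketch of that approach (city and state are made-up column names for illustration); note that hash conditions are combined with AND and plain equality (or IN when the value is an array), so they won't cover LIKE or OR:
conditions = { city: "brooklyn", state: "ny" }
Post.where(conditions)
# roughly: SELECT * FROM posts WHERE city = 'brooklyn' AND state = 'ny'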
WRONG
This is what I used to do for some reason.
keys = params[:search].split(',').map!(&:downcase)
# keys are now ['brooklyn', 'queens']
query = 'lower(city) LIKE ?'
if keys.size > 1
  # I need something like this depending on the number of keys
  # 'lower(city) LIKE ? OR lower(city) LIKE ? OR lower(city) LIKE ?'
  query_array = []
  keys.size.times { query_array << query }
  # ['lower(city) LIKE ?', 'lower(city) LIKE ?']
  query = query_array.join(' OR ')
  # which gives me 'lower(city) LIKE ? OR lower(city) LIKE ?'
end
# now I can query my model
# if keys size is one then keys is just 'brooklyn',
# in this case it is 'brooklyn', 'queens'
# @posts = Post.where('lower(city) LIKE ? OR lower(city) LIKE ?', 'brooklyn', 'queens')
@posts = Post.where(query, *keys)
Now, however, it's very simple. As nfriend21 mentioned,
Model.where(attribute: [value1,value2])
does the same thing