Ruby 1.9: gem crypt19: blowfish.encrypt_string: problem with encoding - ruby-on-rails-3

I am building a Rails app where I serialize a hash to JSON and then encrypt it using the crypt19 gem's Blowfish implementation. I'm using Rails 3.0.9, Ruby 1.9.2-p180, the latest crypt19 (v1.2.1), and I'm developing on Windows 7. In my Rails controller, I do:
require 'crypt/blowfish'

h = { "thing" => "12345", "thang" => "abcdefghijklmnopqrstuvwxyz" }
blowfish = Crypt::Blowfish.new("SECRET")
encrypted_thingy = blowfish.encrypt_string(h.to_json)
I assign encrypted_thingy to a string attribute on the model:

my_model.string_thing = encrypted_thingy
my_model.save

but when I save the model, an ArgumentError is raised:

invalid byte sequence in US-ASCII

(When I assign a plain old string instead, e.g. my_model.string_thing = "xxxxxxxx", everything works fine.)
My eventual plan is to store encrypted_thingy in the database via the model, and then later decrypt it, parse out JSON, and get the values for "thing" and "thang".
Searching the 'net suggested that I need to change the string's encoding, but it's not clear how to do that with the result of the crypt19/Blowfish encryption.
Is there any way to store this encrypted string as a string just like any other string I store?
Or is there a way to apply some magic (along with reversible magic) to turn that funky string into a real string which I can pass around in an email?

I was able to make it work. There is a gem called "hex_string" which converts binary strings (with their strange encodings) into plain hex strings, and back.
First I had to do
gem install hex_string
Then the code looked like this
require 'crypt/blowfish'
require 'hex_string'

h = { "thing" => "12345", "thang" => "abcdefghijklmnopqrstuvwxyz" }
blowfish = Crypt::Blowfish.new("SECRET")
encrypted_thingy = blowfish.encrypt_string(h.to_json).to_hex_string.split(" ").join
The "encrypted_thingy" was now a string which I could pass around easily in my web app, store in a database, debug, etc without worrying about character encoding, etc.
To reverse the process, I did this:
decrypted_string = blowfish.decrypt_string(encrypted_thingy.to_byte_string)
The decrypted string could then be JSON-parsed to extract the data in the original hash.
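For what it's worth, the same hex round trip can be done with Ruby's built-in String#unpack / Array#pack, so the extra gem isn't strictly required. A minimal sketch of the full cycle, using the same hypothetical key and data as above:

require 'crypt/blowfish'
require 'json'

h = { "thing" => "12345", "thang" => "abcdefghijklmnopqrstuvwxyz" }
blowfish = Crypt::Blowfish.new("SECRET")

# Binary ciphertext -> hex string that is safe to store, log, or email.
hex = blowfish.encrypt_string(h.to_json).unpack("H*").first

# Hex string -> binary ciphertext -> plaintext -> original hash.
decrypted = blowfish.decrypt_string([hex].pack("H*"))
original  = JSON.parse(decrypted)
original["thing"] # => "12345"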

Related

Making Comeonin's hashpwsalt() deterministic (Phoenix)

In my Phoenix project, I am using:
{:comeonin, "~> 4.0"},
{:bcrypt_elixir, "~> 0.12.0"},
I see a lot of examples of user creation/authentication in which Comeonin.Bcrypt.hashpwsalt(password) is called with one argument. However, when I run this (or add_hash/1) from iex, the outputs appear to be nondeterministic:
iex(10)> password = Comeonin.Bcrypt.hashpwsalt("password")
"$2b$12$QUL1ytej8UqTvpU34E2oieshgOonf0RRZI0nva6T3HlK2RQ2JT74O"
iex(11)> password = Comeonin.Bcrypt.hashpwsalt("password")
"$2b$12$jz3sb5rLrmdHVRr7Nvq0te9He0Wt00DYy4kM.t9LFp6ZSx.siovJC"
iex(12)> password = Comeonin.Bcrypt.add_hash("password")
%{password: nil,
  password_hash: "$2b$12$4Ih30p4LbNk5LQStMDtah.ht0AQSO8mhhfCUeRQlFSNuI9vEgKI/q"}
iex(13)> password = Comeonin.Bcrypt.add_hash("password")
%{password: nil,
  password_hash: "$2b$12$92oe9Ccovrwi1GuHK5Zo3uaxbQEXEvgyqEx6o4tsW2J8TEsc/LrtS"}
Why does this occur, and how can I guarantee a deterministic hash from a given input?
hashpwsalt generates a random salt each time, so the resulting hash is going to be different every time; the salt is embedded in the resulting hash string, which is why verification still works. This is the recommended way of generating a password hash. You then use check_pass or checkpw to check whether a password matches the stored hash. If for some reason you want the same hash every time, you can use the underlying library directly. For an example, see here:
https://github.com/riverrun/bcrypt_elixir/blob/master/lib/bcrypt.ex#L84
I can't think of a reason you would want to do this, though. It's more likely that you're making a mistake.
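The hash-then-verify flow is the same in every bcrypt binding; here is a quick analogy with Ruby's bcrypt gem (not Comeonin's API) showing why two hashes of one password differ yet both verify:

require 'bcrypt'

hash1 = BCrypt::Password.create("password")
hash2 = BCrypt::Password.create("password")

hash1.to_s == hash2.to_s   # => false: each hash embeds a fresh random salt
hash1 == "password"        # => true: == re-hashes using the salt embedded in hash1
BCrypt::Password.new(hash1.to_s) == "password"  # => true: same check from a stored string

Comeonin's check_pass/checkpw do the analogous re-hash-and-compare against the stored hash.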

Ruby mongodb: Three newly created objects don't appear to exist during testing

I'm using mongodb to store some data, and I have a function that gets the object with the latest timestamp and the one with the oldest. I haven't experienced any issues with this method during development or production, but when I implement a test for it, the test fails roughly 20% of the time: I create three objects with different timestamps but get a nil response because my dataset contains 0 objects. I'm using rspec to test this method, and I'm not using mongoid or mongomapper. I have read a lot of articles about write_concern and that "unsafe writes" might be the problem, but I have tried almost all the combinations of those parameters (w, fsync, j, wtimeout) without success. Does anyone have any idea how to solve this? Perhaps I have focused too much on the write_concern track and the problem lies somewhere else.
This is the method that fetches the latest and oldest timestamp.
def first_and_last_timestamp(customer_id, system_id)
  last = collection(customer_id).
    find({ sid: system_id }).
    sort(["t", Mongo::DESCENDING]).
    limit(1).next()
  first = collection(customer_id).
    find({ sid: system_id }).
    sort(["t", Mongo::ASCENDING]).
    limit(1).next()
  { min: first["t"], max: last["t"] }
end
I'm inserting data using this method, where data is a JSON object:
def insert(customer_id, data)
  collection(customer_id).insert(data)
end
I have reverted to the default connection setup:
Mongo::MongoClient.new(mongo_host, mongo_port)
I'm using the gem mongo (1.10.2). I'm not using any fancy setup for my mongo database. I've just installed mongo using brew on my mac and started it. The version of my mongo database is v2.6.1.
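For reference (a sketch, not a confirmed fix), with the legacy 1.x driver an acknowledged, journaled write can be requested explicitly, either client-wide or per operation:

# Client-wide write concern: wait for server acknowledgement (:w => 1)
# and for the journal commit (:j => true) on every write.
client = Mongo::MongoClient.new(mongo_host, mongo_port, w: 1, j: true)

# ...or per operation, overriding the client default:
def insert(customer_id, data)
  collection(customer_id).insert(data, w: 1, j: true)
end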

JClouds S3: Specify Content Length when uploading file

I wrote an application using jclouds 1.6.2 with file upload code like this:
java.io.File file = ...
blobStore.putBlob(containerName,
    blobStore.blobBuilder(name)
        .payload(file)
        .calculateMD5()
        .build());
This worked perfectly well.
Now, in jclouds 1.7, calculateMD5() is deprecated. Furthermore, even when I calculate the MD5 hash manually (using Guava's Hashing) and pass it with contentMD5(), I get the following error:
java.lang.IllegalArgumentException: contentLength must be set, streaming not supported
So obviously I also have to set the content length.
What is the easiest way to calculate the correct content length?
I don't believe jclouds would suddenly drop features and make uploading files much more difficult. Is there a way to let jclouds calculate the MD5 and/or content length?
You should work with Guava's ByteSource, which offers several helper methods:
import com.google.common.hash.Hashing;
import com.google.common.io.ByteSource;
import com.google.common.io.Files;

ByteSource byteSource = Files.asByteSource(new File(...));
Blob blob = blobStore.blobBuilder(name)
    .payload(byteSource)
    .contentLength(byteSource.size())
    .contentMD5(byteSource.hash(Hashing.md5()).asBytes())
    .build();
blobStore.putBlob(containerName, blob);
jclouds made these changes to remove functionality duplicated by Guava and to make the costs of some operations, e.g., hashing, more obvious.

Weird encoding error when saving ActiveRecord record

I have the following situation: my webservice receives JSON data and creates models (a typical REST scenario). Sometimes, when saving the records, I get an

Encoding::CompatibilityError Exception: incompatible character encodings: ASCII-8BIT and UTF-8

error, which seems to be tied to two particular attributes. Firing up the debugger, setting EITHER of those two attributes to an empty string makes the save work:
model = Model.new(params[:model])
model.save! # Fails with the above error

model = Model.new(params[:model])
model.attribute1 = ""
model.save! # Works

model = Model.new(params[:model])
model.attribute2 = ""
model.save! # Works too!
The params are parsed from the HTTP request, so how can they depend on each other?
Has anyone seen the same scenario?
Edit:
We've found the reason for the compatibility error: https://github.com/jruby/activerecord-jdbc-adapter/issues/229 The JDBC adapter has some bugs around UTF-8 encoding, issues which were fixed long ago in MRI.
As added in the edit to my original question, the problem is a bug in JRuby's JDBC adapter (which I forgot to mention as a constraint, my bad!): https://github.com/jruby/activerecord-jdbc-adapter/issues/229
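Until the adapter is fixed or upgraded, one common stopgap (a sketch, not the upstream fix) is to normalize binary-flagged param strings to UTF-8 before assigning them to the model:

# Re-tag ASCII-8BIT (binary) param values that are actually UTF-8
# text coming off the wire; force_encoding mutates in place.
params[:model].each_value do |value|
  if value.is_a?(String) && value.encoding == Encoding::ASCII_8BIT
    value.force_encoding(Encoding::UTF_8)
  end
end

model = Model.new(params[:model])
model.save!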

invalid byte sequence in UTF-8 on page request

I'm getting "invalid byte sequence in UTF-8" on page requests (permalinks) and I have no idea why nor can I reproduce it but I do get a lot of exceptions like this:
A ArgumentError occurred in products#index:
invalid byte sequence in UTF-8
activesupport (3.0.4) lib/active_support/core_ext/object/blank.rb:68:in `=~'
-------------------------------
Request:
-------------------------------
* URL : http://www.mysite.com/category/category-name-\x8E~ice
* Parameters: {"page"=>1, "controller"=>"products", "action"=>"index", "category"=>"category-name-\x8E~ice"}
The string at the end ("-\x8E~ice") should not be there. Any idea why it shows up, or what I can do to debug/reproduce it?
Thanks
We created a Rails middleware that filters out all the strange encodings that cannot be handled within our app.
The problem we encountered is that some requests arrive with strange encodings, for example Cp1252 / Windows-1252; when Ruby 1.9 tries to match those strings against UTF-8 regexps, it blows up.
I tried various ways of dealing with this using Iconv, but solutions that worked on my Mac didn't work on the servers, so the simplest approach is probably the best...
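A minimal sketch of such a middleware (the class name is made up; see the linked gists below for the authors' actual code) that scrubs invalid bytes out of the request fields Rails matches against UTF-8 regexps:

class Utf8Sanitizer
  def initialize(app)
    @app = app
  end

  def call(env)
    %w[PATH_INFO QUERY_STRING REQUEST_URI REQUEST_PATH].each do |key|
      value = env[key]
      next unless value.is_a?(String)
      # Round-tripping through UTF-16 drops invalid byte sequences on
      # Ruby 1.9; on Ruby 2.1+ String#scrub("") does the same job.
      env[key] = value.dup.force_encoding("UTF-8")
                      .encode("UTF-16", invalid: :replace, undef: :replace, replace: "")
                      .encode("UTF-8")
    end
    @app.call(env)
  end
end

# config/application.rb:
# config.middleware.insert_before 0, "Utf8Sanitizer"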
I've just posted a new gem called UTF8Cleaner which is heavily based on #phoet's and #pithyless' work. It includes a Railtie, so you can just drop it into your Gemfile and forget about those "invalid byte sequence" errors.
https://github.com/singlebrook/utf8-cleaner
Similar to #phoet, I also used a Rails middleware to solve similar encoding issues.
Tested on Ruby 1.9.3 (no Iconv):
https://gist.github.com/3639014
If you are using Apache (with mod_rails) you can prevent these invalid URL requests from hitting your Rails application entirely by following this answer:
https://stackoverflow.com/questions/13512727/how-can-i-configure-apache-to-respond-400-when-request-contains-an-invalid-byte/13527812#13527812