I am importing Gmail contacts. Some users have a huge number of contacts, so saving them to the database takes a long time. How can I use Delayed::Job to run this in the background asynchronously?
I am using the delayed_job gem.
Here is the code I wrote:
token = Google::Authorization.exchange_singular_use_for_session_token(params[:token])
unless token == false
  @contacts = Google::Contact.all(token)
  @contacts.each do |contact|
    next if contact.email.nil?
    c = {
      :user_id => current_user.id,
      :source  => 'gmail',
      :name    => contact.name,
      :email   => contact.email
    }
    record = Contact.find_or_initialize_by_email(c[:email])
    record.update_attributes(c)
  end
end
Add these gems to the Gemfile:
gem 'ghazel-daemons'
gem 'delayed_job'
then run
bundle install
rails g delayed_job:active_record
rake db:migrate
Then use the delay method provided by Delayed::Job to run the call in the background:
record = Contact.find_or_initialize_by_email(c[:email])
record.delay.update_attributes(c)
Start the Delayed::Job worker process from the project root directory using the command:
rake jobs:work
For automating start/stop/restart after deployment, refer to the documentation:
https://github.com/collectiveidea/delayed_job/wiki/Rails-3-and-Capistrano
For more options on how to use the Delayed::Job methods, you can check https://github.com/collectiveidea/delayed_job
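One thing to note: calling delay on each record enqueues one job per contact, which for huge contact lists means thousands of tiny jobs. A minimal sketch of wrapping the whole import in a single job object instead (ContactImporter is a hypothetical name, not from the original code); Delayed::Job will serialize any object that responds to perform and run it in the worker:

```ruby
# A single background job for the whole import. Enqueue it with:
#   Delayed::Job.enqueue(ContactImporter.new(current_user.id, contacts))
ContactImporter = Struct.new(:user_id, :contacts) do
  # Upserts each contact keyed by email; the in-memory store stands in
  # for Contact.find_or_initialize_by_email + update_attributes.
  def perform(store = {})
    contacts.each do |contact|
      email = contact[:email]
      next if email.nil?
      store[email] = { :user_id => user_id, :source => 'gmail',
                       :name => contact[:name], :email => email }
    end
    store
  end
end
```

A single job keeps the queue short, and if the import fails halfway, Delayed::Job retries the whole batch rather than leaving thousands of orphaned per-contact jobs.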
I'm trying to write a standalone script so I can run a regular cron job that will update the database as well as cache some data to a file locally so we don't have to wait for query times. In doing so, I am using ActiveRecord. I have the following code:
require "rubygems"
require "active_record"
require "./lib/queries/my_query_file"

def my_method
  # get stored query string from my_query_file
  sql = MY_QUERY
  @sql_con = ActiveRecord::Base.establish_connection(
    :adapter  => "sqlserver",
    :host     => "my_host",
    :port     => "my_port",
    :username => "my_user",
    :password => "my_pass",
    :database => "my_db",
    :timeout  => "100000"
  )
  @query_result = @sql_con.connection.select_all(sql)
  @query_result.each do |row|
    # do something
  end
end
When I try to run the above script I get the following error:
Specified 'sqlserver' for database adapter, but the gem is not loaded. Add gem '' to your Gemfile. (Gem::LoadError)
Any idea on what the issue could be? I've exhausted my search options to the point where I've gotten a headache from searching for answers. I finally caved in to post a question to see if there are any experts that help or folks that have encountered this issue before that might recall the solution.
You are using :adapter => "sqlserver", which tells ActiveRecord that SQL Server is the database you are trying to use, so it tries to look up a Ruby gem that provides the adapter connection for sqlserver.
When we use the mysql gem, for example, we can see there are library extensions written in C that help us connect over the network to a running MySQL server.
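In other words, to keep using ActiveRecord you need the adapter gem for SQL Server in your Gemfile. A sketch of the entries (the exact versions you need depend on your Rails release, so treat this as an assumption to verify):

```ruby
# Gemfile: the ActiveRecord SQL Server adapter, which connects through
# the FreeTDS-based tiny_tds driver
gem 'tiny_tds'
gem 'activerecord-sqlserver-adapter'
```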
That was kinda lame. I tried what user944938 suggested and that worked. I was just hoping I could get it to work with ActiveRecord, since that is what I am using elsewhere. I updated my code to look like this now:
require "tiny_tds"
require "./lib/queries/my_query_file"

def my_method
  sql = MY_QUERY
  client = TinyTds::Client.new(
    :username => "my_user",
    :password => "my_pass",
    :host     => "my_host",
    :database => "my_db"
  )
  @build_query = client.execute(sql)
  @build_query.each do |row|
    # do something
  end
end
My project is on Rails 3.2 and refinerycms v2.0.10.
I just generated a new engine, ran my bundle and rails generate commands, and ran my migration. Now, per the docs, I need to run db:seed, but I don't want to execute a db:seed at the app level because I have several other engines and I don't want to re-seed them.
It is related to this question:
Rails engine / How to use seed?
but the answer there is to run db:seed at the app level.
So how would I say something like rake myNewEngine:db:seed? I know it can be done, but my google fu is apparently too weak to dredge it up.
You can just generate your own rake task. Create a your_engine.rake file and make sure it is loaded in your Rakefile.
namespace :your_engine do
  namespace :db do
    task :seed => :environment do
      YourEngine::Engine.load_seed
    end
  end
end
Edit the YOUR_ENGINE/lib/tasks/YOUR_ENGINE_tasks.rake
namespace :db do
  namespace :YOUR_ENGINE do
    desc "loads all seeds in db/seeds.rb"
    task :seeds => :environment do
      YOUR_ENGINE::Engine.load_seed
    end

    namespace :seed do
      Dir[Rails.root.join('YOUR_ENGINE', 'db', 'seeds', '*.rb')].each do |filename|
        task_name = File.basename(filename, '.rb')
        desc "Seed #{task_name}, based on the file with the same name in `db/seeds/*.rb`"
        task task_name.to_sym => :environment do
          load(filename) if File.exist?(filename)
        end
      end
    end
  end
end
Then in your main app you can execute your custom seed tasks, running any seed file individually:
$ rake -T | grep YOUR_ENGINE
rake db:YOUR_ENGINE:seed:seed1 # Seed seed1, based on the file with the same name in `db/seeds/*.rb`
rake db:YOUR_ENGINE:seeds # loads all seeds in db/seeds.rb
This question is an expanded version of Facebook Real-time updated does not call our servers, that seems to be dead. Also, Realtime updates internal server error on Heroku using Koala is not helpful because I'm subscribing from the heroku console as pjaspers suggested.
I have an app (ruby 1.9.2p290 and Rails 3.1.3) that connects to Facebook to get data about the current user. Everything is working OK with the koala gem (v1.2.1), but I'm polling the Facebook servers every time the user logs in. I would like to use Facebook real-time updates, and I have read the following:
Koala manual on fb realtime updates: https://github.com/arsduo/koala/wiki/Realtime-Updates
Facebook page on realtime: https://developers.facebook.com/docs/reference/api/realtime/
I have set up the system in test mode and deployed to heroku successfully. I can subscribe to the user object and I get the GET request to my server, but no POST with updated information is ever received from facebook. If I issue a POST to my server manually everything works.
More information:
routes.rb
get '/realtime' => 'realtime#verify'
post '/realtime' => 'realtime#change'
generating
realtime GET /realtime(.:format) {:controller=>"realtime", :action=>"verify"}
POST /realtime(.:format) {:controller=>"realtime", :action=>"change"}
The controller (mock version, only to test if it's working):
class RealtimeController < ApplicationController
  def verify
    render :text => params["hub.challenge"]
  end

  def change
    puts params.inspect
    render :nothing => true
  end
end
The subscription from the heroku console:
irb(main):004:0> @updates = Koala::Facebook::RealtimeUpdates.new(:app_id => ENV['FACEBOOK_APP_ID'], :secret => ENV['FACEBOOK_APP_SECRET'])
=> #<Koala::Facebook::RealtimeUpdates:0x00000004f5bca8 @app_id="XXXXXXX", @app_access_token="XXXXXXX", @secret="XXXXXXX", @graph_api=#<Koala::Facebook::API:0x00000004a8d7a8 @access_token="XXXXXXX">>
irb(main):005:0> @updates.list_subscriptions
=> [{"object"=>"user", "callback_url"=>"http://blah-blah-0000.herokuapp.com/realtime", "fields"=>["education", "email", "friends", "name", "website", "work"], "active"=>true}]
I don't know what to do next...
Maybe I am not triggering the correct change events?
How do I see the list of users of my app? (right now it's a test app and the only user would be me)
Anyone with this kind of issue?
Is something wrong in the code?
Is facebook down? Is it the end of Internet?
Thank you for the help :)
You need to respond to the GET request with a challenge response. I have the same route for both POST and GET requests and use the following code:
route:
match "facebook/subscription", :controller => :facebook, :action => :subscription, :as => 'facebook_subscription', :via => [:get,:post]
controller:
def realtime_request?(request)
  (request.method == "GET" && params['hub.mode'].present?) ||
    (request.method == "POST" && request.headers['X-Hub-Signature'].present?)
end

def subscription
  if realtime_request?(request)
    case request.method
    when "GET"
      challenge = Koala::Facebook::RealtimeUpdates.meet_challenge(params, 'SOME_TOKEN_HERE')
      if challenge
        render :text => challenge
      else
        render :text => 'Failed to authorize facebook challenge request'
      end
    when "POST"
      case params['object']
      when 'user'
        # Do logic here...
      end
      render :text => 'Thanks for the update.'
    end
  end
end
That should get you going with things... Note that to make a subscription I am using this:
@access_token ||= Koala::Facebook::OAuth.new(FACEBOOK_API_KEY, FACEBOOK_API_SECRET).get_app_access_token
@realtime = Koala::Facebook::RealtimeUpdates.new(:app_id => FACEBOOK_API_KEY, :app_access_token => @access_token)
@realtime.subscribe('user', 'first_name,uid,etc...', facebook_subscription_url, 'SOME_TOKEN_HERE')
I think the key is that you properly respond to the GET request from Facebook. They use this to verify that they are contacting the correct server prior to sending confidential info about their users.
Also -- it's been a while since I've looked at this, but if I remember correctly, I seem to recall having issues with anything besides default-protocol port specifications in the callback URL. Ex: http://www.something.com:8080/subscription did not work -- it had to be http://www.something.com/subscription
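For reference, the GET-side handshake that meet_challenge handles boils down to something like the following sketch (a simplification, not Koala's actual source): echo back hub.challenge only when the verify token matches.

```ruby
# Returns the challenge string when the verify token matches, else false.
def meet_challenge(params, verify_token)
  if params['hub.mode'] == 'subscribe' &&
     params['hub.verify_token'] == verify_token
    params['hub.challenge']
  else
    false
  end
end
```

Facebook uses this round trip to verify it is contacting the right server before it starts sending user data, which is why a wrong or missing response here silently disables updates.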
Not sure if this is the case for you, but make sure that your app has permission to access the object (user, permissions, page) properties (location, email, etc.) you subscribe to.
In my case, I was trying to get notifications for changes to the location property without my app requiring the user_location permission. For the user object, look at the Fields section on this page: https://developers.facebook.com/docs/reference/api/user/
Does Facebook know where your application is? Does it have a URL that it can resolve?
In my app, when a user signs up, they are sent a confirmation email. I use delayed_job to make the process of sending email run in the background.
But the disadvantage of using delayed_job is having a worker running all the time, and having a worker for this is expensive.
Is there something other than delayed_job that will make email sending run in the background?
Here is my controller code snippet.
def create
  @user = User.new(params[:user])
  respond_to do |format|
    if @user.save
      UserMailer.delay.registration_confirmation(@user)
      format.html { redirect_to @user, notice: 'User was successfully created.' }
      format.js
The point is, I am having 20-40 signups a day. That means at most the queue is busy for about 60 seconds, yet I will have to pay for the entire day, which is very impractical. Is there some other, nicer approach?
I recommend setting up a cron that will run once every hour and send out e-mails. You'll have to create a new model, QueuedEmail or something similar, and then instead of using ActionMailer right away, save it as a QueuedEmail. This is basically the same thing that delayed_job does, but you'll have more control over when it gets run, and crons don't take up much memory.
The cron should invoke script/rails runner, and you should have a method in your QueuedEmail model that sends out all pending e-mails. I recommend whenever for generating crontabs (quick to set up and very easy to use). The cron will look something like this (this example is set to run once a day at 2am; you can look up how to adjust the intervals, I don't know them off the top of my head):
0 2 * * * /bin/bash -l -c 'cd /RAILS_ROOT/ && script/rails runner -e production '\''QueuedEmail.send_pending'\'' >> log/cron.log 2>&1'
Or in Whenever:
every :day, :at => '2 am' do
runner "QueuedEmail.send_pending"
end
QueuedEmail.send_pending
class QueuedEmail < ActiveRecord::Base
  def self.send_pending
    all.each do |email|
      params = your_parsing_method(email.params)
      record = your_parsing_method(email.record)
      email.destroy if UserMailer.registration_confirmation(record, params).deliver
    end
  end
end
UserController#create
if @user = User.create(params[:user])
  User.delay_email(@user, params[:user])
  redirect_to user_path(@user), :notice => "Created User. E-mail will be sent within the hour."
end
User.delay_email
def self.delay_email(record, params)
  QueuedEmail.create(:record => your_db_formatting_method(record), :params => your_db_formatting_method(params.to_s)) # or use something built in, like to_s
end
None of this code is tested and is probably quite broken, but it's the general idea that matters. You could also go one step further and extend the Rails ActionMailer, but that is far more advanced.
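The queue-and-drain pattern described above can be sketched with a plain Ruby class (EmailQueue is a made-up stand-in; in the real app QueuedEmail is an ActiveRecord model and the block is the mailer's deliver call):

```ruby
class EmailQueue
  def initialize
    @pending = []
  end

  # Stands in for QueuedEmail.create in User.delay_email.
  def enqueue(record)
    @pending << record
  end

  # Called by the cron runner: deliver each pending email and drop it
  # from the queue only if delivery reported success.
  def send_pending(&deliver)
    @pending.delete_if { |record| deliver.call(record) }
  end

  def size
    @pending.size
  end
end
```

Because a record is removed only when delivery succeeds, failed emails stay queued and are retried on the next cron run.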
If you are using a job only to send signup mails, you are probably better off not using a separate job process for that (considering the very low volume of email you have to send).
This will simplify your configuration and reduce your costs.
A separate job process is useful when you have a task that takes a long time to complete or a very busy site; since neither is the case here, it's a good idea to just send regular, synchronous emails.
I have looked for a solution for 2 days now and am beating my head against the wall. I have tried both the normal Authlogic build and the fork:
gem 'authlogic' or gem 'authlogic', :git => 'git://github.com/odorcicd/authlogic.git', :branch => 'rails3'
I have everything working except that when I create a UserSession it will not take the current_account details, so I'm left with every login allowing you to log into any of the subdomains. I can't seem to find a solution to this issue.
def new
  @user_session = @current_account.user_sessions.new
end
I think this is what you are looking for (from Not able to set a current_user using Authlogic on Rails 3.0.1)
class UserSession < Authlogic::Session::Base
  allow_http_basic_auth false
end