I'm trying to debug a test while it runs: I have my test and I'm trying to print something so I can see the values of a tuple when mix test is run. I've tried doing this:
require Logger
test "creates element", %{conn: conn} do
Logger.debug "debugging #{inspect conn}"
conn = post conn, v1_content_path(conn, :create), content: #valid_attrs
...
...
end
But nothing is printed! It's driving me nuts!
Here is where I read about this approach: How to pretty print conn content?
Edit: I also tried with:
IO.puts "debugging #{inspect conn}"
Edit: Here are the contents of my test_helper.exs:
ExUnit.start
Mix.Task.run "ecto.create", ~w(-r TestApp.Repo --quiet)
Mix.Task.run "ecto.migrate", ~w(-r TestApp.Repo --quiet)
Ecto.Adapters.SQL.begin_test_transaction(TestApp.Repo)
Edit: Here is my whole testing file:
defmodule TestApp.ContentControllerTest do
  require Logger
  use TestApp.ConnCase

  @valid_attrs %{title: "Content Title", url: "http://www.content.com"}
  @invalid_attrs %{}

  setup %{conn: conn} do
    conn = put_req_header(conn, "accept", "application/json")
    {:ok, conn: conn}
  end

  test "my first test", %{conn: conn} do
    Logger.debug "debugging #{inspect conn}"
  end
end
Edit: Here is the output of mix test:
$ mix test
.
Finished in 2.5 seconds (0.6s on load, 1.9s on tests)
1 tests, 0 failures
Randomized with seed 685273
compile_time_purge_level
As pointed out in some comments to your question, the compile_time_purge_level can be reduced to the :debug level for the test environment by changing the :logger config in config/test.exs.
test.exs
config :logger,
  backends: [:console],
  compile_time_purge_level: :debug
Run the tests again:
mix test
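Note that the purge level is applied at compile time: Logger.debug calls below the configured level are removed from the compiled code entirely. If your test files were already compiled with the old setting, forcing a recompile should pick up the change (a precaution that may not always be needed):

MIX_ENV=test mix compile --force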
Related
I'm currently working on my first big Elixir project and wanted to properly utilize testing this time.
However, if I add my modules to the "normal" supervisor, I cannot start them again with start_supervised! and all tests fail with Reason: already started: #PID<0.144.0>
Here is my code:
(application.ex)
defmodule Websocks.Application do
  # See https://hexdocs.pm/elixir/Application.html
  # for more information on OTP Applications
  @moduledoc false

  use Application

  def start(_type, _args) do
    children = [
      {Websocks.PoolSupervisor, []},
      {Websocks.PoolHandler, %{}}
      # {Websocks.Worker, arg}
    ]

    # See https://hexdocs.pm/elixir/Supervisor.html
    # for other strategies and supported options
    opts = [strategy: :one_for_one, name: Websocks.Supervisor]
    Supervisor.start_link(children, opts)
  end
end
Some of my tests:
defmodule PoolHandlerTest do
  use ExUnit.Case, async: true
  alias Websocks.PoolHandler
  doctest PoolHandler

  setup do
    start_supervised!({PoolHandler, %{}})
    %{}
  end

  test "adding two pools and checking if they are there" do
    assert PoolHandler.add(:first) == :ok
    assert PoolHandler.add(:second) == :ok
    assert PoolHandler.get_pools() == {:ok, %{:first => nil, :second => nil}}
  end
end
and the pool handler:
defmodule Websocks.PoolHandler do
  use GenServer

  # Client
  def start_link(default) when is_map(default) do
    GenServer.start_link(__MODULE__, default, name: __MODULE__)
  end

  # Server (callbacks)
  @impl true
  def init(arg) do
    {:ok, arg}
  end
end
(I cut out the parts I think are not necessary, but the complete code is on GitHub.)
Thanks in advance for any help I get!
As @Everett mentioned in the comments, your application will already be started for you when you run mix test, so there is no need to start your GenServers again. It seems like you're interacting with the global instance in your test, so if that's what you want, it should just work.
However, if you'd like to start a separate instance just for your test, you need to start an unnamed one. For example, you could add an optional pid argument to your wrapper functions:
defmodule Websocks.PoolHandler do
  # ...
  def add(server \\ __MODULE__, value) do
    GenServer.call(server, {:add, value})
  end
  # ...
end
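For completeness, here is a minimal sketch of what the matching get_pools wrapper and server-side callbacks could look like (assumed, since the question's snippet cuts them out; the shapes follow the assertions in the test above):

defmodule Websocks.PoolHandler do
  # ... client API as above ...

  # Same optional server argument, mirroring add/2:
  def get_pools(server \\ __MODULE__) do
    GenServer.call(server, :get_pools)
  end

  @impl true
  def handle_call({:add, value}, _from, pools) do
    # Register the pool with no data attached yet.
    {:reply, :ok, Map.put(pools, value, nil)}
  end

  @impl true
  def handle_call(:get_pools, _from, pools) do
    {:reply, {:ok, pools}, pools}
  end
end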
Then, instead of using start_supervised! like you do, you can start an unnamed instance in your setup and use it in your tests like so:
setup do
  {:ok, pid} = GenServer.start_link(PoolHandler, %{})
  {:ok, %{handler: pid}}
end

test "adding two pools and checking if they are there", %{handler: handler} do
  PoolHandler.add(handler, :first)
  # ...
end
I need to ask for user input from a Ruby script on a remote server. I managed to do it in bash with the following code:
class ConfirmHandler
  def on_data(command, stream_name, data, channel)
    puts "data received: #{data}"
    if data.to_s =~ /\?$/
      prompt = Net::SSH::Prompt.default.start(type: 'confirm')
      response = prompt.ask "Please enter your response (y/n)"
      channel.send_data "#{response}\n"
    end
  end
end

require 'sshkit'
require 'sshkit/dsl'
include SSHKit::DSL

on '<ssh-server-name>' do |host|
  cmd = <<-CMD
    echo 'Do something?';
    read response;
    echo response=$response
  CMD
  capture(cmd.squish, interaction_handler: ConfirmHandler.new)
end
When I run this script on my local machine, I see:
data received: Do something?
Please enter your response (y/n)
data received: response=y
I tried to wrap the bash CMD code in a Ruby script:
on '<ssh-server-name>' do |host|
  cmd = <<-CMD
    ruby -e "
      puts 'Do something?';
      require 'open3';
      response = Open3.capture3('read response; echo $response');
      puts 'response=' + response.to_s;
    "
  CMD
  capture(cmd.squish, interaction_handler: ConfirmHandler.new)
end
and get the following result:
data received: Do something?
Please enter your response (y/n)
data received: response=["\n", "", #<Process::Status: pid 9081 exit 0>]
I wrote the code above following the Interactive commands section of the SSHKit GitHub home page.
How can I capture the user's response from a Ruby script with SSHKit on the remote server?
I was able to capture the user's response from a Ruby script on a remote server with the following code:
# ask_response.rb
puts 'Do something?';
response = `read response; echo $response`;
puts 'response=' + response.to_s;
ask_response.rb is a Ruby script located on the remote server. Locally, I run:
on '<ssh-server-name>' do |host|
  capture("ruby ask_response.rb", interaction_handler: ConfirmHandler.new)
end
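The array output in the earlier attempt comes from Open3.capture3 itself: it returns a [stdout, stderr, status] triple, while backticks return just the stdout string, which is why response.to_s now prints cleanly. A small illustration of the difference:

require 'open3'

stdout, stderr, status = Open3.capture3('echo hello')
# stdout == "hello\n", stderr == "", status is a Process::Status

out = `echo hello`
# out == "hello\n" -- a plain String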
I've got an integration test that is run through Capybara. It visits a webpage, creates an object, and renders the results. When the object is created, the controller enqueues several jobs that create some related objects.
When I run my integration test, I want to be able to examine the rendered page as if those jobs had finished. The two obvious solutions are:
Set the queue adapter to :inline
Manually execute/clear the enqueued jobs after creating the object.
For 1), I've attempted to set the queue adapter to :inline in a before(:each), but this does not change the adapter; it continues to use the test adapter (which is set in my test.rb config file):
before(:each) { ActiveJob::Base.queue_adapter = :inline }
after(:each) { ActiveJob::Base.queue_adapter = :test }

it "should work" do
  puts ActiveJob::Base.queue_adapter
end
which outputs: #<ActiveJob::QueueAdapters::TestAdapter:0x007f93a1061ee0 @enqueued_jobs=[], @performed_jobs=[]>
For 2), I'm not sure if this is actually possible. ActiveJob::TestHelper provides perform_enqueued_jobs, but this method isn't helpful, as it seems to work only for jobs explicitly referenced in the passed-in block.
Assuming you're using RSpec, the easiest way to use perform_enqueued_jobs is with an around block. Combining that with metadata tags, you can do something like:
RSpec.configure do |config|
  config.include(RSpec::ActiveJob)

  # clean out the queue after each spec
  config.after(:each) do
    ActiveJob::Base.queue_adapter.enqueued_jobs = []
    ActiveJob::Base.queue_adapter.performed_jobs = []
  end

  config.around :each, perform_enqueued: true do |example|
    @old_perform_enqueued_jobs = ActiveJob::Base.queue_adapter.perform_enqueued_jobs
    ActiveJob::Base.queue_adapter.perform_enqueued_jobs = true
    example.run
    ActiveJob::Base.queue_adapter.perform_enqueued_jobs = @old_perform_enqueued_jobs
  end

  config.around :each, perform_enqueued_at: true do |example|
    @old_perform_enqueued_at_jobs = ActiveJob::Base.queue_adapter.perform_enqueued_at_jobs
    ActiveJob::Base.queue_adapter.perform_enqueued_at_jobs = true
    example.run
    ActiveJob::Base.queue_adapter.perform_enqueued_at_jobs = @old_perform_enqueued_at_jobs
  end
end
Note: you need to specify the queue_adapter as :test in your config/environments/test.rb if it's not already set
You can then tag a test with the :perform_enqueued metadata and any jobs enqueued in it will be run:
it "should work", :perform_enqueued do
# Jobs triggered in this test will be run
end
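For what it's worth, option 2) from the question is also workable: ActiveJob::TestHelper's perform_enqueued_jobs performs any job enqueued inside the block, not only jobs referenced explicitly in it. A minimal sketch (the route and button are placeholders for your Capybara flow):

include ActiveJob::TestHelper

it "renders the page as if the jobs had finished" do
  perform_enqueued_jobs do
    visit new_widget_path   # placeholder route that triggers the controller
    click_button "Create"   # the create action enqueues the jobs
  end
  # the jobs enqueued inside the block have now run,
  # so the related objects exist and can be asserted on
end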
I'm having an issue where my integration tests do not seem to find the log_in_as method from my test_helper.rb
I have been following Michael Hartl's Rails Tutorial, so I was hoping not to massively refactor my code to get this to work. I would like to continue through the book without having to exclude the tests, since it is pretty test-heavy after all.
Error:
UsersLoginTest#test_login_with_remembering:
NoMethodError: undefined method `log_in_as' for #<UsersLoginTest:0x00000005b18460>
test/integration/users_login_test.rb:43:in `block in <class:UsersLoginTest>'
users_login_test.rb:
require 'test_helper.rb'
class UsersLoginTest < ActionDispatch::IntegrationTest
  .
  .
  .
  test "login with remembering" do
    log_in_as(@user, remember_me: '1')
    assert_not_empty cookies['remember_token']
  end

  test "login without remembering" do
    # Log in to set the cookie.
    log_in_as(@user, remember_me: '1')
    # Log in again and verify that the cookie is deleted.
    log_in_as(@user, remember_me: '0')
    assert_empty cookies['remember_token']
  end
end
test_helper.rb:
ENV['RAILS_ENV'] ||= 'test'

class ActiveSupport::TestCase
  fixtures :all

  # Returns true if a test user is logged in.
  def is_logged_in?
    !session[:user_id].nil?
  end

  # Log in as a particular user.
  def log_in_as(user)
    session[:user_id] = user.id
  end
end

class ActionDispatch::IntegrationTest
  # Log in as a particular user.
  def log_in_as(user, password: 'password', remember_me: '1')
    post login_path, params: { session: { email: user.email,
                                          password: password,
                                          remember_me: remember_me } }
  end
end
I had this same issue. There were two problems I had to fix:
Make sure there is only one test_helper.rb file in the project, and
make sure test_helper.rb is in the right folder (test/test_helper.rb).
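For comparison, the test_helper.rb from the Rails Tutorial starts by loading the Rails environment and test harness before reopening the test classes. If those requires are missing (they don't appear in the helper shown above), or a stray duplicate helper is the one being loaded, ActionDispatch::IntegrationTest won't gain the log_in_as method. The usual preamble looks like this:

ENV['RAILS_ENV'] ||= 'test'
require File.expand_path('../../config/environment', __FILE__)
require 'rails/test_help'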
Hope this helps!
I know I can open a ssh connection to a remote server:
:ssh.start
:ssh.connect("11.22.33.44", 22, user: "my_login123")
But how can I actually send a command and receive a response from it? I don't mean the interactive mode, I want to just send a command and receive a reply.
It might be easier to use an Elixir library such as SSHEx: it uses the Erlang :ssh library underneath but provides a much nicer interface and makes it simpler to accomplish what you are after.
E.g. from the readme:
{:ok, conn} = SSHEx.connect ip: '123.123.123.123', user: 'myuser'
SSHEx.cmd! conn, 'mkdir -p /path/to/newdir'
res = SSHEx.cmd! conn, 'ls /some/path'
The value of res will be the response from the command.
EDIT
However, if you are set on using :ssh, then you would need to use the :ssh_connection module's exec function, which takes the :ssh connection as a parameter.
See the :ssh_connection documentation for more detail on how to do this.
Here is an example that uses only :ssh and no external libraries. To run it you will need to have public key login set up on your target host. For more information, read the Erlang SSH User's Guide.
ssh-connect.exs
#! /usr/bin/env elixir

:ssh.start()

{:ok, conn} = :ssh.connect('raspi', 22,
  silently_accept_hosts: true,
  user: System.get_env("USER") |> to_charlist(),
  user_dir: Path.join(System.user_home!(), ".ssh") |> to_charlist(),
  user_interaction: false
)

{:ok, chan} = :ssh_connection.session_channel(conn, :infinity)
:success = :ssh_connection.exec(conn, chan, 'uname -a', :infinity)

for _ <- 0..3 do
  receive do
    {:ssh_cm, ^conn, value} -> IO.inspect(value)
  end
end

:ok = :ssh.close(conn)
Sample output
{:data, 0, 0, "Linux raspberrypi 4.4.50+ #970 Mon Feb 20 19:12:50 GMT 2017 armv6l GNU/Linux\n"}
{:eof, 0}
{:exit_status, 0, 0}
{:closed, 0}
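If you need to tell stdout from stderr in those messages, the data tuples carry a type code (0 for stdout, 1 for stderr) that the receive clauses can match on. A small sketch, assuming the same conn as above:

receive do
  {:ssh_cm, ^conn, {:data, _chan, 0, data}} ->
    IO.write(data)          # type code 0 = stdout
  {:ssh_cm, ^conn, {:data, _chan, 1, data}} ->
    IO.write(:stderr, data) # type code 1 = stderr
  {:ssh_cm, ^conn, {:exit_status, _chan, status}} ->
    IO.puts("exit status: #{status}")
end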
The SSHEx library makes it very convenient to build an SSH connection.
Here is an example:
defmodule SshDemo do
  @moduledoc false

  def connect do
    {:ok, conn} = SSHEx.connect ip: 'xxx.xxx.xxx.xxx', user: 'root', password: 'xxxxx'
    SSHEx.cmd! conn, 'mkdir -p newdir'
  end
end
If you used mix to create your project, just add the dependency to your mix.exs file and run mix deps.get:
defp deps do
  [{:sshex, "2.1.2"}]
end
Then compile the dependency with mix deps.compile.
Running the example above will create a folder named newdir in the ~/ path.
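A quick way to try it out (the IP and credentials above are placeholders):

$ iex -S mix
iex> SshDemo.connect
#=> ""  # mkdir -p prints nothing, so cmd! should return an empty string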