Elixir - testing a full script

I'm writing a test to check a function (called automatically by a GenServer when a new file enters a folder). It calls other functions in the same module through pipes in order to read a file, process its content, insert it where needed, and return a list of :error and :ok entries.
The result looks like:
[
  error: "Data not found",
  ok: %MyModule{
    field1: field1data,
    field2: field2data
  },
  ok: %MyModule{
    field1: field1data,
    field2: field2data
  },
  error: "Data not found"
]
The code:
def processFile(file) do
  insertResultsMap =
    File.read!(file)
    |> getLines()
    |> extractMainData()
    |> Enum.map(fn x -> insertLines(x) end)
    |> Enum.group_by(fn x -> elem(x, 0) end)

  handleErrors(Map.get(insertResultsMap, :error))
  updateAnotherTableWithLines(Map.get(insertResultsMap, :ok))
end
defp getLines(docContent) do
  String.split(docContent, "\n")
end

defp extractMainData(docLines) do
  Enum.map(docLines, fn x -> String.split(x, ",") end)
end
defp insertLines([field1, field2, field3, field4]) do
  attrs = %{
    field1: String.trim(field1),
    field2: String.trim(field2),
    field3: String.trim(field3),
    field4: String.trim(field4)
  }

  mymodule.create_stuff(attrs)
end
defp handleErrors(errors) do
  {:ok, file} = File.open(@errorsFile, [:append])
  saveErrors(file, errors)
  File.close(file)
end
defp saveErrors(_, []), do: :ok

defp saveErrors(file, [{:error, changeset} | rest]) do
  changes =
    for {key, value} <- changeset.changes do
      "#{key} #{value}"
    end

  errors =
    for {key, {message, _}} <- changeset.errors do
      "#{key} #{message}"
    end

  errorData = "data: #{Enum.join(changes, ", ")} \nErrors: #{Enum.join(errors, ", ")}\n\n"
  IO.binwrite(file, errorData)
  saveErrors(file, rest)
end
defp updateAnotherTableWithLines(insertedLines) do
  Enum.map(insertedLines, fn {:ok, x} -> updateOtherTable(x) end)
end

defp updateOtherTable(dataForUpdate) do
  "CLOSE" -> otherModule.doStuff(dataForUpdate.field1, dataForUpdate.field2)
end
I have several questions, and some will be pretty basic since I'm still learning:
What do you think of the code? Any advice? (Take into account that I voluntarily obfuscated names.)
If I want to test this, is it right to test only the processFile function? Or should I make more of them public and test them individually?
When I test the processFile function, I check that I'm receiving a list. Is there any way to make sure this list contains only the elements I'm expecting, i.e. error: "String" or ok: %{}?

What do you think of the code? Any advice? (Take into account that I voluntarily obfuscated names.)
Opinion-based.
If I want to test this, is it right to test only the processFile function?
Yes.
Or should I make more of them public and test them individually?
No, these are implementation details and testing them is an anti-pattern.
When I test the processFile function, I check that I'm receiving a list. Is there any way to make sure this list contains only the elements I'm expecting, i.e. error: "String" or ok: %{}?
You receive a keyword list. To check for an explicit value, one might use:
foo = processFile(file)
assert not is_nil(foo[:ok])
OTOH, I’d rather return a map from there and pattern match it:
assert %{ok: _} = processFile(file)
To assert that the result does not contain anything besides :ok and :error keys, one might use list subtraction:
assert Enum.uniq(Keyword.keys(result)) -- [:ok, :error] == []
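For a fuller shape check, a minimal ExUnit sketch could also walk every element and fail on anything unexpected (assuming processFile/1 is public on MyModule; the fixture path here is just a placeholder):
test "processFile/1 returns only :ok and :error entries" do
  # hypothetical fixture file with a few valid and invalid lines
  result = MyModule.processFile("test/fixtures/sample.csv")

  assert is_list(result)

  Enum.each(result, fn
    {:ok, %MyModule{}} -> :ok
    {:error, message} when is_binary(message) -> :ok
    other -> flunk("unexpected element: #{inspect(other)}")
  end)
end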

How to write a test for Plug error handling

I'm trying to use Plug.Test to test error handling implemented with Plug.ErrorHandler -- with assert conn.status == 406 and the like.
I have defined defp handle_errors (containing a single send_resp statement) and it seems to be called; however, my tests still fail with the same exception (as if handle_errors had no effect).
A reference to a sample advanced Plug (not Phoenix) app would also be appreciated.
Try something like this (not tested):
defmodule NotAcceptableError do
  defexception plug_status: 406, message: "not_acceptable"
end

defmodule Router do
  use Plug.Router
  use Plug.ErrorHandler

  plug :match
  plug :dispatch

  get "/hello" do
    raise NotAcceptableError
    send_resp(conn, 200, "world")
  end

  def handle_errors(conn, %{kind: _kind, reason: reason, stack: _stack}) do
    send_resp(conn, conn.status, reason.message)
  end
end
test "error" do
conn = conn(:get, "/hello")
assert_raise Plug.Conn.WrapperError, "** (NotAcceptableError not_acceptable)", fn ->
Router.call(conn, [])
end
assert_received {:plug_conn, :sent}
assert {406, _headers, "not_acceptable"} = sent_resp(conn)
end
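Note that conn/3 and sent_resp/1 above come from Plug.Test, so the test module has to pull those helpers in; a minimal skeleton (RouterTest is a placeholder name) would be:
defmodule RouterTest do
  use ExUnit.Case, async: true

  import Plug.Test   # provides conn/3 and sent_resp/1

  # the "error" test from above goes here
end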
Use assert_error_sent/2 to assert that you raised an error and it was wrapped and sent with a particular status. Match against its {status, headers, body} return value to assert the rest of the HTTP response met your expectations.
response = assert_error_sent 404, fn ->
  get(build_conn(), "/users/not-found")
end

assert {404, [_h | _t], "Page not found"} = response

Elixir - Using variables in doctest

In my application there is a GenServer, which can create other processes. All process IDs are saved to a list.
def create_process do
  GenServer.call(__MODULE__, :create_process)
end

def handle_call(:create_process, _from, processes) do
  {:ok, pid} = SomeProcess.start_link([])
  {:reply, {:ok, pid}, [pid | processes]}
end
There is also a function to get the list of PIDs.
def get_processes do
  GenServer.call(__MODULE__, :get_processes)
end

def handle_call(:get_processes, _from, processes) do
  {:reply, processes, processes}
end
I tried to write a doctest for the get_processes function like this:
#doc """
iex> {:ok, pid} = MainProcess.create_process()
iex> MainProcess.get_processes()
[pid]
"""
However, the test runner doesn't seem to see the pid variable, and I get an undefined function pid/0 error.
I know this could be simply solved with a regular test, but I want to know if it is possible to solve in a doctest.
The problem is the [pid] line: the expected result of a doctest must be a literal value, so you can't reference a variable there. You can work around it by doing the comparison on an iex> line instead:
iex> {:ok, pid} = MainProcess.create_process()
iex> [pid] === MainProcess.get_processes()
true
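If a doctest isn't a hard requirement, a plain ExUnit test can assert against the bound variable directly (a sketch, assuming MainProcess is already started, e.g. by the application supervision tree):
test "create_process/0 registers the new pid" do
  {:ok, pid} = MainProcess.create_process()
  assert pid in MainProcess.get_processes()
end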

Elixir - Manipulating a 2 dimensional list

Hope everybody is having a beautiful 2019 even though we're just a day in.
I am currently working on a small Phoenix app where I'm manipulating PDF files (in the context of this question I'm splitting them) and then uploading them to S3. Later on I have to delete the temporary files created by pdftk (a pdf tool I use to split them up) and also show the s3 links in the response body, since this is an API request.
I have structured this as follows:
Inside my Split module where the core business logic is:
filenames = []
s3_links = []

Enum.map(pages, fn(item) ->
  split_filename = item
  |> split(filename)

  link = split_filename
  |> FileHelper.result_file_bytes()
  |> ManageS3.upload()
  |> FileHelper.save_file(work_group_id, pass)

  [filenames ++ split_filename, s3_links ++ link]
end)
|> transform()

{filenames, s3_links}
The important things are split_filename and link
This is what I'm getting when I call an IO.inspect in the transform() method:
[
  ["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf",
   "Some_S3_LINK00"],
  ["0ab460ca-5019-4864-b0ff-343966c7d72a.pdf",
   "Some_S3_LINK01"]
]
The structure is [[filename, s3_link], [filename, s3_link]], whereas the desired outcome would be [[list of all filenames], [list of s3 links]].
If anybody can lend a hand I would be super grateful. Thanks in advance!
Sidenotes:
Assigning filenames = []; s3_links = [] at the very beginning makes zero sense. Enum.map already maps the input. What you need is probably Enum.reduce/3.
Don’t use the pipe |> operator when the pipe consists of only a single call; it is considered an anti-pattern by the Elixir core team.
Always start pipes with a term.
Solution:
Reduce the input into the result using Enum.reduce/3 directly to what you need.
pages
|> Enum.reduce([[], []], fn item, [files, links] ->
  split_filename = split(item, filename)

  link =
    split_filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  [[split_filename | files], [link | links]]
end)
|> Enum.map(&Enum.reverse/1)
|> IO.inspect(label: "Before transform")
|> transform()
You did not provide the input to test it, but I believe it should work.
Instead of working on lists of lists, you may want to consider using tuples with lists. Something like the following should work for you.
List.foldl(pages, {[], []}, fn(item, {filenames, links}) ->
  filename = split(item, filename)

  link =
    filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  {[filename | filenames], [link | links]}
end)
This will return a value that looks like
{
  ["87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf",
   "0ab460ca-5019-4864-b0ff-343966c7d72a.pdf"],
  ["Some_S3_LINK00",
   "Some_S3_LINK01"]
}
Though, depending on how you are using these values, maybe a list of tuples would be more appropriate. Something like
Enum.map(pages, fn(item) ->
  filename = split(item, filename)

  link =
    filename
    |> FileHelper.result_file_bytes()
    |> ManageS3.upload()
    |> FileHelper.save_file(work_group_id, pass)

  {filename, link}
end)
would return
[
  {"87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "Some_S3_LINK00"},
  {"0ab460ca-5019-4864-b0ff-343966c7d72a.pdf", "Some_S3_LINK01"}
]
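If you go with the list of tuples but later need the two separate lists again, Enum.unzip/1 converts between the two shapes; a small sketch using the sample values above:
[
  {"87cdcd73-5b27-4757-a472-78aaf6cc6864.pdf", "Some_S3_LINK00"},
  {"0ab460ca-5019-4864-b0ff-343966c7d72a.pdf", "Some_S3_LINK01"}
]
|> Enum.unzip()
# => {["87cdcd73-...pdf", "0ab460ca-...pdf"], ["Some_S3_LINK00", "Some_S3_LINK01"]}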

pipes and error handling

Say you have the following function:
def get_city_temp(city_id) do
  "blahblahcityforcastfortoday.com/request/#{city_id}"
  |> HTTPoison.get()
  |> parse_body()
  |> get_forecast()
  |> get_temp()
end
Now say the GET fails, so that the response is:
{:ok, %HTTPoison.Response{status_code: 400, ...}}
but get_forecast is expecting a well-formed body, and therefore will error, complaining about the structure passed to it, a missing key, etc. What's the best way to handle errors like this? In other languages I would just wrap all the function calls in try/catch and return a tuple with the success report. In this situation, I'm not sure how to structure my code to best report the error to the user.
This is exactly what the with/1 macro is meant for. Assuming parse_body and the other functions also return {:ok, _} on success and {:error, _} on failure, you can do:
with {:ok, response} <- HTTPoison.get(...),
     {:ok, parsed} <- parse_body(response),
     {:ok, forecast} <- get_forecast(parsed),
     {:ok, temp} <- get_temp(forecast), do: {:ok, temp}
If any pattern match fails, this whole thing returns that value. For example, if get_forecast returned {:error, :foo} after all previous functions returned {:ok, _}, the with will return {:error, :foo}.
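Putting it together, a sketch of get_city_temp/1 (assuming parse_body/1, get_forecast/1, and get_temp/1 follow that {:ok, _} / {:error, _} convention) could also normalize unexpected HTTP statuses in an else block:
def get_city_temp(city_id) do
  url = "blahblahcityforcastfortoday.com/request/#{city_id}"

  with {:ok, %HTTPoison.Response{status_code: 200, body: body}} <- HTTPoison.get(url),
       {:ok, parsed} <- parse_body(body),
       {:ok, forecast} <- get_forecast(parsed),
       {:ok, temp} <- get_temp(forecast) do
    {:ok, temp}
  else
    # non-200 responses fall through the first pattern and end up here
    {:ok, %HTTPoison.Response{status_code: status}} -> {:error, {:http_status, status}}
    {:error, reason} -> {:error, reason}
  end
end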

elixir dynamic module call

How can I call the function func() in a module called App.Reporting.Name, based on the string "name", which is not known until runtime?
Using String.to_atom or String.to_existing_atom does not work:
alias App.Reporting.Name
module = "name" |> String.capitalize() |> String.to_atom()
apply(module, :func, [])
Without the alias, this does not work either:
module = "App.Reporting.Name" |> String.to_atom()
apply(module, :func, [])
I get an (UndefinedFunctionError) and (module :"App.Reporting.Name" is not available).
Thanks.
Your second approach is almost correct; you just need to prefix it with Elixir., because App.Reporting.Name is equal to :"Elixir.App.Reporting.Name", not :"App.Reporting.Name". Elixir prefixes all module names (names starting with an uppercase letter) with Elixir. before turning them into atoms:
iex(1)> App.Reporting.Name == :"App.Reporting.Name"
false
iex(2)> App.Reporting.Name == :"Elixir.App.Reporting.Name"
true
So, this code should work:
module = "Elixir.App.Reporting.Name" |> String.to_atom
apply(module, :func, [])
and so should this:
module = Module.concat(App.Reporting, "name" |> String.capitalize |> String.to_atom)
apply(module, :func, [])
The reason yours isn't working is that String.to_atom does just that: it turns the string into an atom. Because there is no module called :"App.Reporting.Name" (the module's actual name is most likely :"Elixir.App.Reporting.Name"), it errors.
Not sure if this is the best way to do this, just one that sprang to mind. But you could do something like this:
iex(2)> module = "Casing"
"Casing"
iex(3)> Module.concat(String, "#{module}") |> apply(:upcase, ["test sentence"])
"TEST SENTENCE"
Another solution could be to create a macro that automatically does this process; however, that is not something I am that great at, so you will have to go through the docs for that one.
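As a defensive variant (a sketch; the :unknown_report error value is just a placeholder), you can combine Module.concat/2 with Code.ensure_loaded?/1 and function_exported?/3 so an unknown string doesn't blow up at the apply call:
module = Module.concat(App.Reporting, String.capitalize("name"))

if Code.ensure_loaded?(module) and function_exported?(module, :func, 0) do
  apply(module, :func, [])
else
  {:error, :unknown_report}
end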