Robot Framework Suite - api

I have 15 separate API Tests in Robot Framework. I want to create a Robot Test Suite and have all 15 tests within this suite so that I can run the .robot suite once and have all 15 tests run.
So suppose I have 3 separate tests:
1. Login.robot
2. Get_customer_data.robot
3. Get_product_info.robot
I want to put all these tests in one suite so that when I run the suite, it runs the three tests at once. Also, we use mocked data instead of a database, so all the mocked data files currently live within the respective test folders.
For example, there's a folder called Login which contains login_data (data files) and Login.robot (the Robot file).
I was thinking I would create a suite with all the .robot files and keep the data files in their respective folders for the suite to access them when the suite is run. Is this the right approach, and is there a document that could help me figure this out? Could someone kindly help?

Your question mentions having 15 different tests, but based on a comment you wrote, it appears that what you really have are 15 different robot test suites (i.e. 15 different files with the .robot suffix).
The simplest way to combine them into a suite that you can run all at once is to put them in a directory. You can then tell robot to run the directory and it will find all of the .robot files and run them.
For example:
tests
└── api
    ├── Get_customer_data
    │   ├── customer_data
    │   └── Get_customer_data.robot
    ├── Get_product_info
    │   ├── Get_product_info.robot
    │   └── product_data
    ├── Login
    │   ├── login_data
    │   └── Login.robot
    └── ...
If you cd to the tests folder, you can do robot api and it will run all of the .robot files in that folder. Or, from the parent of the tests folder you can do robot tests/api.
Test suite directories are thoroughly documented in the Robot Framework User Guide, in a section titled Test suite directories.
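If the combined suite ever needs shared setup, teardown, or metadata, you can optionally drop an __init__.robot file into the api directory. A minimal sketch, where the documentation text and the setup keyword are placeholders you would replace with your own:
*** Settings ***
Documentation    Suite of all API tests (Login, Get_customer_data, Get_product_info).
Suite Setup      Log    Starting the API suite
This is not required for the directory suite to work; robot picks up the .robot files either way.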

There's actually an easy way to do this from the command line using pybot (in newer Robot Framework versions the launcher is simply called robot). I've verified this with PyCharm.
First, put all of your .robot files in a single folder. Then on the command line, instead of typing pybot TestSuite/Test1.robot, type pybot TestSuite/*. This will run all of the tests in (I observed) alphabetical order.
It may only be alphabetical order because that's the order the files were in my folder, not because robot guarantees it. To pin the order down, prefix each test with a number, like 1_Login.robot, 2_Get_customer_data.robot, 3_Get_product_info.robot; this will execute everything in numerical order. If you have more than 10 files, use 01, 02, 03, etc. so the ordering still works properly.
To have a one-click action to run this, you could put that line into a .cmd file, but that may be outside the scope of your question.

Related

Use relative path in init function cannot be found in test

I'm running into an annoying problem: a relative path used in the init function cannot be found by the unit test.
Say I have a project structured like:
.
├── conf
│   └── blacklist
├── filter
│   ├── filter.go
│   └── filter_test.go
In the init function of filter.go, I try to load the blacklist using the relative path conf/blacklist, to avoid loading it multiple times.
Since the default working directory is the project root directory, this works well with the compiled binary. However, filter_test.go panics with
panic: open conf/blacklist: no such file or directory, because go test always uses the package directory as the working directory.
I also tried this solution, which dynamically builds the relative path to find the config file. It worked well until I added the covermode=set flag to do some coverage testing. It seems that covermode rewrites the code and generates intermediate packages like _obj, which breaks the dynamically built relative path.
Has anyone run into the same problem before? I've only been using Go for a couple of months. Any suggestions would be appreciated.
A simple solution is to run the tests with something like SRC_ROOT=$PWD go test and then, inside the tests that want to access files, use os.Getenv("SRC_ROOT").
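A minimal sketch of that idea, assuming the tests live in the filter package; the repoRoot helper and the test name are illustrative, not taken from the original code:
package filter

import (
    "os"
    "path/filepath"
    "testing"
)

// repoRoot returns the project root passed in via SRC_ROOT
// (run as: SRC_ROOT=$PWD go test ./filter/), falling back to
// the package directory when the variable is not set.
func repoRoot() string {
    if root := os.Getenv("SRC_ROOT"); root != "" {
        return root
    }
    return "."
}

func TestBlacklistIsReadable(t *testing.T) {
    path := filepath.Join(repoRoot(), "conf", "blacklist")
    if _, err := os.Stat(path); err != nil {
        t.Fatalf("cannot access blacklist at %s: %v", path, err)
    }
}
If init in filter.go does the same SRC_ROOT lookup before falling back to the plain relative path, both the compiled binary (which already runs from the project root) and go test resolve conf/blacklist consistently.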

Why are `__init__.py` and `BUILD` needed inside TensorFlow's `models/tutorials/rnn/translate`?

Inside the tensorflow/models/tutorials/rnn/translate folder, we have a few files including __init__.py and BUILD.
Without __init__.py and BUILD files, the translate script can still manage to run.
What is the purpose of __init__.py and BUILD here? Are we supposed to install or build it using these two files?
The BUILD file supports using Bazel for hermetic building and testing of the model code. In particular, a BUILD file is present in that directory to define the integration test translate_test.py and its dependencies, so that we can run it on a continuous integration system (e.g. Jenkins).
The __init__.py file causes Python to treat that directory as a package. See this question for a discussion of why __init__.py is often present in a Python source directory. While this file is not strictly necessary to invoke translate.py directly from that directory, it is necessary if we want to import the code from translate.py into a different module.
(Note that when you run a Python binary through Bazel, the build system will automatically generate __init__.py files if they are missing. However, TensorFlow's repositories often have explicit __init__.py files in Python source directories so that you can run the code without invoking Bazel.)
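As a small, hedged illustration of the import point above (assuming models/tutorials/rnn is the current working directory or otherwise on sys.path): with the __init__.py in place, a sibling module can do
import translate.translate as translate_module
whereas without it, Python 2 and pre-3.3 interpreters would not treat the translate directory as an importable package.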

GO in IntelliJ IDEA. Multiple File and Error Undefined: Data

I want to use IntelliJ IDEA Community Edition to write code in Go. I installed the right plugin and installed all the tools needed to build the application.
My application consists of the two files below. Both are in the directory ../EventServer.
Main.go
Data.go
If I try to run the project from IntelliJ using Run (Ctrl+Shift+F10), I get the error below:
/usr/lib/go/bin/go build -o "/tmp/Build Main.go and run0go" -gcflags "-N -l" /my/home/blah/EventServer/Main.go
# command-line-arguments
./Main.go:11: undefined: Data
I can compile the code from the terminal without any problem by going into the project directory and running:
:~/Pulpit/EventServer$ go build
./EventServer
Hello
dane w strukturze someone
The directory tree and files look like this:
EventServer$ tree -a
.
├── Data.go
├── EventServer
├── EventServer.iml
├── .idea
│   ├── compiler.xml
│   ├── copyright
│   │   └── profiles_settings.xml
│   ├── libraries
│   │   └── GOPATH__EventServer_.xml
│   ├── misc.xml
│   ├── modules.xml
│   ├── .name
│   ├── vcs.xml
│   └── workspace.xml
└── Main.go
I suppose the run command is wrong, because the compiler is trying to build the program with only the one file Main.go and not with all the files. The right command should be
$ go run *.go
but I do not know where I can set this.
I also set GOPATH to:
export GOPATH=$HOME/Pulpit/EventServer
This also hasn't helped.
CODE
Main.go
package main
import (
    "fmt"
)

func main() {
    fmt.Println("Hello")
    abcd := Data{"someone", "1234567"}
    fmt.Printf("dane w strukturze %s ", abcd.Name)
}
And Data.go
package main
type Data struct {
    Name  string
    Phone string
}
SYSTEM: LINUX
-------------------- SOLVED --------------------
Steps:
The project must live in a directory like for/example/MyProject/src/HERE_DIRECTORY_WITH_YOUR_SOURCE_GO_FILE; the src subdirectory is important (see the sketch after these steps).
Go to Run --> Edit Configurations.
Find the Run Kind field and change it to Package.
In the Package field, enter the folder that contains your code (it should be highlighted if the path is correct).
Click the plus icon in the top-left corner and add a Go Application configuration.
Apply the changes.
In the top-right corner of the main IDE window you will see a small Play icon.
Choose the Go Application you defined earlier (mine is named Unnamed).
Click Play.
And enjoy :D
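A concrete sketch of that layout for this project; the exact location is an assumption based on the GOPATH the asker exported:
$HOME/Pulpit/EventServer        <- GOPATH
└── src
    └── EventServer
        ├── Data.go
        └── Main.go
From there the whole package also builds together on the command line:
$ cd $HOME/Pulpit/EventServer/src/EventServer
$ go build && ./EventServer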
Let's say you have a project with a src/ subdirectory and two .go files inside, hello.go and typetest.go, and both declare the same package "main". One of them (hello.go) also implements func main().
To make it compile as a whole, you need to make sure two things are configured properly: the GOPATH and the Run/Debug configuration.
Open the Project Libraries/GOPATH settings:
For Gogland
File -> Settings -> Go
For Intellij IDEA
File -> Settings -> Languages & Frameworks -> Go -> Go Libraries
Make sure the GOPATH for the project looks something like this:
(screenshot: GOPATH settings)
Next, open Run -> Edit Configurations and make sure your configuration looks similar to this:
(screenshot: Run/Debug configuration)
You can add the second file (in your case Data.go) in the Go tool arguments field in Run/Debug Configurations. I read about this here: https://github.com/go-lang-plugin-org/go-lang-idea-plugin/issues/2013, near the end of the discussion. The example below (I used the 2 files HelGo.go and aaa.go) worked for me:
(screenshot: Go Run Configuration with 2 files)
IntelliJ objected when these 2 files were in different folders, so both of them have to be in the same package (main).
On the other hand, I couldn't make any of the advice on this page work at all.
I solved the same problem with a complete checklist. Usually, there are multiple problems combined with each other.
First of all, check your GOPATH settings; there are two ways:
Use the command line to check your Go environment:
$ go env
You will get the global environment settings; make sure your Go folders are correct. If you are using some local packages, be aware of them too.
Check your build tool settings and whether you have added all your resource files to your dependencies. This is another way to set up your project GOPATH, and it does not affect the global settings.
Secondly, check your Run/Debug configuration and make sure it points to the main package or main file. No matter which kind of configuration you use, this is always the starting point.
Sarp Kaya, just follow Mbded's steps. An additional step is to ensure that your additional GOPATH is there.
For example, this is the GOPATH in our ~/.profile:
export GOPATH=$HOME/lib/Go:$HOME/Development/Go
The first path is used by go get processes etc., while your active Go development directory goes into the second path.
According to our config, the exact path of RightApp should be $HOME/Development/Go/src/RightApp.

Testing Chef roles and environments

I'm new to Chef and have been using Test Kitchen to test the validity of my cookbooks, which works great. Now I'm trying to ensure that environment-specific attributes are correct on production nodes prior to running Chef initially. These would be defined in a role.
For example, I may have recipes that converge using a Vagrant box with dev settings, which validates the cookbook. I want to be able to test a production node's role as well. I think I want these tests to be the source of truth describing my environment. Looking at Test Kitchen's documentation, this seems beyond its scope.
Is my assumption correct? Is there a better approach to test a cookbook before the first time Chef is run on a production node to ensure it has the correct settings?
I pleasantly discovered that chef_zero uses the "test/integration" directory as its chef repository.
Just create your roles under
test/integration/roles
Example
Standard Chef cookbook layout.
├── attributes
│   └── default.rb
├── Berksfile
├── Berksfile.lock
├── chefignore
├── .kitchen.yml
├── metadata.rb
├── README.md
├── recipes
│   └── default.rb
└── test
    └── integration
        ├── default
        │   └── serverspec
        │       ├── default_spec.rb
        │       └── spec_helper.rb
        └── roles
            └── demo.json
.kitchen.yml
---
driver:
  name: vagrant
provisioner:
  name: chef_zero
platforms:
  - name: ubuntu-14.04
suites:
  - name: default
    run_list:
      - role[demo]
    attributes:
Notes:
Provisioner is chef_zero
The runlist is configured to use a role
recipes/default.rb
file "/opt/helloworld.txt" do
content "#{node['demo']['greeting']}"
end
attributes/default.rb
default['demo']['greeting'] = "hello world"
Notes:
The cookbook won't compile without a default value for the attribute
test/integration/default/serverspec/default_spec.rb
require 'spec_helper'

describe file('/opt/helloworld.txt') do
  it { should be_file }
  its(:content) { should match /this came from my role/ }
end
Notes:
Integration test is looking for the content that is set by the role attribute
test/integration/roles/demo.json
{
  "name": "demo",
  "default_attributes": {
    "demo": {
      "greeting": "this came from my role"
    }
  },
  "run_list": [
    "recipe[demo]"
  ]
}
You can set both roles and environments in your .kitchen.yml, so you certainly can test this with Test Kitchen.
....
provisioner:
  roles_path: path/to/your/role/files
  client_rb:
    environment: your_environment
.....
That said, I personally prefer to use role cookbooks. If you have a fixed set of environments, as we do, then you can also use simple conditionals in the attributes files of your role cookbook to adjust attributes based on the environment. That way, you have a single cookbook that defines the entire configuration of your node by wrapping other cookbooks and setting variables. With that setup, it is very easy to set up kitchen tests that validate the exact production system.
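A hedged sketch of what such a conditional might look like in the role cookbook's attributes file; the attribute names, URLs, and environment names are made up for illustration:
# attributes/default.rb of the role cookbook
case node.chef_environment
when 'production'
  default['myapp']['api_url'] = 'https://api.example.com'
else
  default['myapp']['api_url'] = 'http://localhost:8080'
end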
When it comes to validating attributes, the part of the testing toolchain you should be using is ChefSpec.
You can define a complete run list in a spec file and ensure the rendered files are correct.
There's a part of the ChefSpec documentation about it here.
Another way to do this is to have a "role cookbook": instead of using a role on the Chef server, you define the attributes you wish to set in an attributes file and make this cookbook depend on what the role run list would have been.
The role cookbook's recipe would only contain include_recipe calls referencing the recipes you would have put in the role run list.
The main advantage here is that you can include your specs in this cookbook independently of the referenced cookbooks.
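A minimal ChefSpec sketch of that approach; here role_demo is a hypothetical role cookbook whose attributes file sets node['demo']['greeting'] and whose default recipe just does include_recipe 'demo' from the earlier example:
# spec/default_spec.rb
require 'chefspec'

describe 'role_demo::default' do
  let(:chef_run) do
    ChefSpec::SoloRunner.new(platform: 'ubuntu', version: '14.04')
                        .converge(described_recipe)
  end

  it 'renders the greeting file with the role attribute' do
    expect(chef_run).to create_file('/opt/helloworld.txt')
      .with(content: 'this came from my role')
  end
end
Because the spec lives in the role cookbook, it can be run with rspec on its own, independently of the cookbooks it wraps.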

How to use grunt-contrib-copy to copy to root AND change directory structure

I have an Express app with my dev views in /assets/views. I figure I need to separate development and production views because in production I'll be editing the HTML when I use grunt-contrib-usemin to concat/uglify scripts.
So here's the problem. My current tree:
assets/views
├── 404.html
├── index.html
├── layout.html
├── question_ask.html
└── question_display.html
Ideally, I want my production-ready views to live at the same level as assets. Using grunt-contrib-copy, it seems to copy the whole tree. I'm currently putting it into public since I'm not sure how to set my dest to the root of the project.
copy: {
  views: {
    src: ['assets/views/*.html'],
    dest: 'public/'
  }
}
So there are a few questions here:
Is it bad practice to have dev views and production views? If so, is there another way of producing a view that has references to concat/uglified scripts?
How the heck can I use grunt-contrib-copy to copy to the root of my project? I don't want assets/views obviously, I just want a views/* folder that has the contents of what's in assets/views/*.
Thanks!
You need to specify the flatten option, which will remove the directory structure from the destination path. See my answer here: Copy all files from directory to another with Grunt.js copy.
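A hedged sketch of that config against the tree in the question; the views/ destination directory is an assumption about where you want the copies to land:
copy: {
  views: {
    expand: true,            // enable dynamic src-to-dest mapping
    flatten: true,           // drop the assets/views/ prefix from dest paths
    src: ['assets/views/*.html'],
    dest: 'views/'           // ends up as views/404.html, views/index.html, ...
  }
}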