How do you run groups of tests by directory or regular expression in Rust? - testing

In Rust, running cargo test runs all tests (unit, doc, integration, etc.). You can also run specific tests using filters (2). I don't see a way to run specific groups of tests unless all the tests are in one file.
For example, with Node.js and Mocha you can run tests in groups. If I only want to run end-to-end tests, I set up a custom mocha-config-test-e2e.js file and set the test directory with spec: 'src/**/test-e2e/*.test.js'. Then I add something like "test-e2e": "NODE_ENV=test mocha --config ./mocha-config-test-e2e.js" to package.json. Now when I want to run all end-to-end tests, I can run npm run test-e2e from the terminal. I can repeat this for any group I create.
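For concreteness, a minimal sketch of that Mocha setup, using the file names mentioned above (only the spec glob is strictly required; the timeout is just an example option):

// mocha-config-test-e2e.js
module.exports = {
  spec: 'src/**/test-e2e/*.test.js', // only pick up the end-to-end group
  timeout: 30000                     // example option; adjust as needed
};

// package.json (scripts section)
// "scripts": {
//   "test-e2e": "NODE_ENV=test mocha --config ./mocha-config-test-e2e.js"
// }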
How do I do the same thing in Rust? How do I run groups of tests by directory or regular expression?
Basically, I'm looking to run cargo test-e2e or cargo test-e2e-feature_group1 or something similar, where a command runs all the tests specified by a directory or regular expression. This reference shows how to make tests in sub-directories run, but doesn't reveal how to run specific groups of tests.
├── src
├── tests
│   ├── e2e
│   │   ├── feature_group1
│   │   │   ├── example1.rs
│   │   │   ├── example2.rs
│   │   │   ├── example3.rs
│   │   ├── feature_group2
│   │   │   ├── example1.rs
│   │   │   ├── example2.rs
│   │   │   ├── example3.rs
│   ├── integration
│   │   ├── example1.rs
│   │   ├── example2.rs
│   ├── e2e_feature_group1.rs
├── Cargo.toml
The closest I have been able to get is with something like this in tests/e2e_feature_group1.rs:
#[path = "e2e/feature_group1/example1.rs"]
mod example1;
#[path = "e2e/feature_group1/example2.rs"]
mod example2;
...
Then running cargo test --test e2e_feature_group1 runs all the tests in tests/e2e/feature_group1 that I add to the e2e_feature_group1.rs file. But that requires creating a tests/[group_name].rs for every group and listing every test that is in the directory, instead of using an expression to catch all the tests in a directory.
Instead, it would be nice to be able to call cargo test --regex "tests/e2e/feature_group1/*" or cargo test --config "test_config_e2e1.rs", and put the regex #[path_regex = "e2e/feature_group1/*"] and any other test configuration I want to run in that file.
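For reference, the closest thing I've found to a dedicated per-group command is to combine the harness above with a Cargo alias, so the group can at least be run as one word (a sketch; the alias name and harness name are just the ones used above, and each module still has to be listed by hand):

# .cargo/config.toml
[alias]
test-e2e-feature_group1 = "test --test e2e_feature_group1"

# then, from the terminal:
#   cargo test-e2e-feature_group1   # runs only the tests compiled into tests/e2e_feature_group1.rs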

Related

Vite build fails on Netlify because of an import, but works locally

I have a problem: there is no build error locally, but as soon as I deploy with Netlify using Git deploy I get the following error:
[vite]: Rollup failed to resolve import "/dist/css/app.pcss" from "src/main.js".
This is most likely unintended because it can break your application at runtime.
My folder structure is as follows:
.
├── README.md
├── dist
│   ├── assets
│   │   ├── favicon
│   │   ├── fonts
│   │   ├── images
│   │   ├── index.e1359b3f.js
│   │   └── index.fb503937.css
│   ├── css
│   │   ├── app.pcss
│   │   └── base
│   │       ├── fonts.pcss
│   │       ├── global.pcss
│   │       ├── headings.pcss
│   │       └── modern-css-reset.pcss
│   └── index.html
├── index.html
├── package-lock.json
├── package.json
├── postcss.config.cjs
├── public
│   ├── assets
│   │   ├── favicon
│   │   ├── fonts
│   │   └── images
│   └── css
│       ├── app.pcss
│       └── base
│           ├── fonts.pcss
│           ├── global.pcss
│           ├── headings.pcss
│           └── modern-css-reset.pcss
├── src
│   ├── App.vue
│   ├── _components
│   ├── _pages
│   ├── main.js
│   └── router
├── tailwind.config.cjs
└── vite.config.js
And I'm importing /dist/css/app.pcss in src/main.js like this:
import '/dist/css/app.pcss'
I will be grateful if you have any ideas:)
The goal is a successful build on Netlify via Git deploy, and thus a successful import of the app.pcss file after the build.
Rather than importing from dist, the OP moved the style files to /src/css and it's now working successfully on Netlify.
Indeed, dist is mainly used as a build target for a production app and never as an asset source.
public, meanwhile, is more for hosting a light resource (like a .pdf) or similar, something that a visitor may need to download onto their computer at some point.
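A minimal sketch of the resulting change, assuming the stylesheets now live under src/css as described above (only the import path changes):

// src/main.js
import './css/app.pcss' // was: import '/dist/css/app.pcss'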

How to encapsulate Redis?

I am using Redis in Telegram bots and now I want to deploy a few bots on one cloud server. Is it possible to encapsulate the Redis data for each project to prevent mistakes?
My directory structure:
/webapps/
├── bot1                        <= virtualenv for the application Hello
│   ├── bin
│   │   ├── activate
│   │   ├── gunicorn            <= Hello app's gunicorn
│   │   ├── gunicorn_start      <= Hello app's gunicorn start script
│   │   └── python
│   ├── hello                   <= Hello app's bot project directory
│   │   └── hello
│   │       ├── settings.py     <= hello.settings
│   │       └── wsgi.py         <= hello.wsgi
│   ├── logs                    <= Hello app's logs will be saved here
│   ├── media
│   ├── run                     <= Gunicorn's socket file will be placed here
│   └── static
└── bot2                        <= analogous virtualenv for the application Goodbye
    ├── bin
    │   ├── activate
    │   ├── gunicorn
    │   ├── gunicorn_start
    │   └── python
    ├── goodbye
    │   └── goodbye
    │       ├── settings.py
    │       └── wsgi.py
    ├── logs
    ├── media
    ├── run
    └── static
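One common way to keep each bot's data separate (an assumption on my part, not something stated in this thread) is to give each bot its own Redis logical database, or at least its own key prefix. A minimal redis-py sketch, with names borrowed from the Hello/Goodbye apps above:

import redis

# Same Redis server, different logical databases, so the bots' keys never collide.
# The db indexes are just examples; pick one per bot in its settings.
hello_store = redis.Redis(host="localhost", port=6379, db=0)
goodbye_store = redis.Redis(host="localhost", port=6379, db=1)

# Alternatively, share one database and namespace every key per project:
def hello_key(name):
    return f"hello:{name}"

hello_store.set(hello_key("last_update"), "done")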

Master report is not being generated for all suites with HTML reporter (wdio5)

I have set up a configuration in 'wdio.conf.js' for the "rpii html reporter", but it's not generating a master report for all suites.
const { ReportAggregator, HtmlReporter } = require('@rpii/wdio-html-reporter');

exports.config = {
    reporters: ['spec', [HtmlReporter, {
        debug: true,
        outputDir: './reports/html-reports/',
        filename: 'report.html',
        reportTitle: 'Test Report Title',
        showInBrowser: true
    }]],

    onPrepare: function (config, capabilities) {
        let reportAggregator = new ReportAggregator({
            outputDir: './reports/html-reports/',
            filename: 'master-report.html',
            reportTitle: 'Master Report'
        });
        reportAggregator.clean();
        global.reportAggregator = reportAggregator;
    },

    onComplete: function (exitCode, config, capabilities, results) {
        (async () => {
            await global.reportAggregator.createReport({
                config: config,
                capabilities: capabilities,
                results: results
            });
        })();
    }
}
I expect a single report with multiple test cases, but instead I'm getting a separate report for each test case.
The topic is pretty old at this point, but I just addressed a similar issue in my project: I could not generate the report at all. In most cases it is just a matter of configuration, but there is no solid documentation or guideline for this painful wdio reporter configuration. So here I am, after a whole week of research and testing: these are the config pieces you will need, for anyone else out there who is facing the same issue.
First, let's assume your project structure is something like the tree below.
.
├── some_folder1
│   ├── some_sub_folder1
│   ├── some_sub_folder2
├── some_folder2
├── #report
│   ├── html-reports
│   ├── template
│   │   ├── sanity-mobile-report-template.hbs
│   │   ├── wdio-html-template.hbs
├── specs
│   ├── test1
│   │   ├── test1.doSuccess.spec.js
│   │   ├── test1.doFail.spec.js
│   ├── test2
│   │   ├── test2.doSuccess.spec.js
│   │   ├── test2.doFail.spec.js
├── node-modules
├── package.json
Second, you should have templates for your reports; in my case they are located in #report/template: wdio-html-template.hbs and sanity-mobile-report-template.hbs, for the HtmlReporter and the ReportAggregator respectively. As Rich Peters noted:
Each suite is executed individually and an html and json file are generated. wdio does not aggregate the suites, so this is done by the report aggregator collecting all the files and creating an aggregate file when complete.
The HtmlReporter actually needs to find its template to generate the content for each .spec file, and then another template is requested by the ReportAggregator.
Third, you need correct specs and suites declarations in your wdio config: a generic glob for specs, and specific files for suites.
Finally, run your tests using the --suite parameter; see the wdio guideline for reference.
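A sketch of the specs/suites part, with globs that just mirror the example tree above (adjust the paths to your own layout):

// wdio.conf.js (excerpt)
exports.config = {
    // generic glob so every spec is still discoverable
    specs: ['./specs/**/*.spec.js'],
    // explicit files per suite, so each group can be run on its own
    suites: {
        test1: ['./specs/test1/test1.doSuccess.spec.js', './specs/test1/test1.doFail.spec.js'],
        test2: ['./specs/test2/test2.doSuccess.spec.js', './specs/test2/test2.doFail.spec.js']
    },
    // ...reporters, onPrepare and onComplete as shown in the question...
};

// run a single suite:
//   npx wdio wdio.conf.js --suite test1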
My final project structure looks like this; notice the changes:
.
├── some_folder1
│   ├── some_sub_folder1
│   ├── some_sub_folder2
├── some_folder2
├── #report
│   ├── html-reports
│   │   ├── screenshots
│   │   ├── suite-0-0
│   │   │   ├── 0-0
│   │   │   │   ├── report.html
│   │   │   │   ├── report.json
│   │   │   ├── 0-1
│   │   │   │   ├── report.html
│   │   │   │   ├── report.json
│   │   ├── master-report.html
│   │   ├── master-report.json
│   ├── template
│   │   ├── sanity-mobile-report-template.hbs
│   │   ├── wdio-html-template.hbs
├── specs
│   ├── test1
│   │   ├── test1.doSuccess.spec.js
│   │   ├── test1.doFail.spec.js
│   ├── test2
│   │   ├── test2.doSuccess.spec.js
│   │   ├── test2.doFail.spec.js
├── node-modules
├── package.json
Each suite is executed individually and an html and json file are generated. wdio does not aggregate the suites, so this is done by the report aggregator collecting all the files and creating an aggregate file when complete.

How do I find the path to the top of my ROOT source tree?

I'm trying to build ROOT on my MacBook using the terminal; I'm very much a novice when it comes to programming and installing these things, and I haven't been able to find anything that explains what I need to do. This is what I've done so far: downloaded and unpacked ROOT, and installed CMake and Emacs. I've just been following the instructions CERN has on their website, Building ROOT.
I made a directory to contain the build, but now I'm at the step which says "Execute the cmake command on the shell replacing path/to/source with the path to the top of your ROOT source tree." However, I have no idea what the path to the top of my ROOT source tree is, nor do I even know what that is, to be honest. I'm trying to use ROOT with Xcode because it's really the only compiler I'm familiar with.
How can I find what the path is to the top of my ROOT source tree?
"Tree" here means the directory tree, so the "directory" or "folder" into which you unpacked ROOT.
So, if your directory structure looks like this:
Downloads
├── PlaneTicket
├── oldThings
│   ├── Pictures
│   ├── Movies
│   ├── PDF-Documents
├── backup
│   ├── data
│   └── dataset
├── backup2
│   ├── data
│   └── dataset
├── build (<- I assume your build directory is there)
├── ROOT
│   ├── bindings
│   │   ├── doc
│   │   ├── pyroot
│   │   ├── r
│   │   └── ruby
│   ├── build
│   │   ├── misc
│   │   ├── package
│   │   ├── rmkdepend
│   │   ├── unix
│   │   └── win
│   ├── cmake
│   │   ├── modules
│   │   ├── patches
│   │   └── scripts
│   ├── config
│   ├── core
│   │   ...
│   ├── doc
│   │   ...
│   ├── documentation
│   │   ...
│   ...
└── very_old_files
Then your cmake command should look like this:
cmake ../ROOT
If you've followed CERN's instructions that far, you can simply use the command 'dir' or 'ls' and the path to the source will be printed to the screen. Type 'cmake', then copy/paste that path after it to build ROOT.
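For example, if the layout matches the tree above (Downloads/ROOT holding the unpacked sources and Downloads/build being the empty build directory), the terminal steps would look roughly like this:

cd ~/Downloads/build    # the empty directory created for the build
cmake ../ROOT           # path to the top of the unpacked ROOT source tree
cmake --build .         # compile (equivalent to running make here)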

Silex: documentation's folder structure

I've been working with Silex a few times now, and I like it, but sometimes the documentation confuses me, simply because it uses a different folder structure.
Can anyone tell me exactly which folder structure they use in Silex 2.0?
Documentation
├── composer.json
├── composer.lock
├── vendor
│   └── ...
└── web
    └── index.php
Where are the views, controllers etcetera stored?
Silex is not a "convention over configuration" framework: it does not prescribe nor care what the structure of your file system or application organisation is; that is why there's no mention of such things in the docs.
Just organise things the way that best suits your own needs.
Just as an example, here is a directory structure I usually use:
├── config
│   └── dev.php
│   └── test.php
│   └── ...
├── src                        PSR-4 compatible directory structure (see the composer.json sketch below)
│   └── Component              Customized components (Symfony's components or any other)
│       └── Security
│           └── ...
│       └── Validator
│           └── ...
│       └── ...
│   └── Controller
│   └── DataFixtures
│   └── Migrations
│   └── Provider               My service providers
│   └── Service                My services
│       └── Auth
│           └── ...
│       └── Registration
│           └── ...
│       └── ...
│   └── Application.php        Customized Silex application class
├── tests
├── var
│   └── cache
│   └── log
│   └── ...
├── vendor
│   └── ...
├── web
│   └── index.php
│   └── index-test.php
├── composer.json
├── composer.lock
And here is my implementation on GitHub. It's currently a WIP, but you can use it as a boilerplate for your Silex application.
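For the "PSR-4 compatible" src directory mentioned in the tree, the matching composer.json autoload entry would look something like this (the App namespace is just a placeholder):

{
    "autoload": {
        "psr-4": {
            "App\\": "src/"
        }
    }
}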