I have a question regarding fixtures that I wasn't able to find answered in the docs. Can we pass a particular JSON fixture on the command line? For example, I have two JSON files, user1.json and user2.json, and at runtime I want to pass the fixture I require for the test.
Can I do something like npx cypress run --fixture name that dynamically passes the JSON file?
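One possible workaround (my own suggestion, not something the question confirms): as far as I know Cypress has no fixture flag on the CLI, but you can pass the file name through --env and read it with Cypress.env() inside the test. The fixtureFile key and the user.name field below are made up for illustration.

// Run with, e.g.:  npx cypress run --env fixtureFile=user2.json
describe('user flow', () => {
  it('uses the fixture chosen on the command line', () => {
    // Fall back to user1.json when no --env value is given.
    const fixtureFile = Cypress.env('fixtureFile') || 'user1.json';

    cy.fixture(fixtureFile).then(user => {
      // Use the loaded JSON however the test needs it.
      cy.log(`Loaded user: ${user.name}`);
    });
  });
});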
I have an app that can show many popups in various scenarios, and I would like to verify their text using XCUITest, ideally with no extra effort for multiple text configurations, for multiple languages for instance.
Is there a way to pass arguments through the .xctestrun file or through the "xcodebuild test-without-building" command? Some way to pass a dictionary, or a file that I can parse at the beginning of the XCTestCase to know the correct text values to expect? Preferably without the need to rebuild the project.
Found the answer.
The test host (and your XCTestCases) can read its arguments the same way the test target does, using NSProcessInfo.processInfo.environment and NSProcessInfo.processInfo.arguments.
In the scheme's "Test" action in Xcode, you can add arguments and environment variables that the test host itself can read through the process info mentioned above.
Another way to do this would be by editing the xctestrun file for your test. In it, you can add the key CommandLineArguments as an array of strings for the process info arguments, or add EnvironmentVariables as a dictionary from key to string value.
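For illustration, the relevant part of an edited xctestrun entry might look like the following (the key names match the format described above; the target name and values are invented):

<key>MyAppUITests</key>
<dict>
    <!-- ...keys generated by Xcode stay as they are... -->
    <key>CommandLineArguments</key>
    <array>
        <string>-popupTextConfig</string>
        <string>es</string>
    </array>
    <key>EnvironmentVariables</key>
    <dict>
        <key>POPUP_TEXT_FILE</key>
        <string>/tmp/popup_texts_es.json</string>
    </dict>
</dict>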
An easy way to add the arguments/variables to the xctestrun file manually is to first add them to the Test scheme in Xcode, look at the resulting changes in the xctestrun file, and then modify them as needed.
Other xctestrun keys are described at https://www.manpagez.com/man/5/xcodebuild.xctestrun/
I have been building UI automation frameworks with Cypress for some time, but always using the Cypress-Cucumber-Preprocessor.
Now I need to build one without Cucumber, just plain ol' Mocha, but I have run into a problem: it seems I can't use tagged hooks to execute code for specific tests (scenarios in Cucumber).
The scenario is basically this: I have a spec file with several tests and a "before" hook that seeds test data into a MongoDB, and eventually I might need to add a hook or hooks to execute something (whatever) before a specific test.
With Cucumber you can tag a given scenario (@tag) and then create a hook that will be executed ONLY before or after that specific scenario:
@tag
Scenario: Tagged scenario
  Given condition
  When I do this
  Then I should see that

Before({ tags: '@tag' }, () => {
  // code that runs only for scenarios tagged @tag
})
I haven't found a way to do this with Mocha in Cypress... Has anyone found a way?
thx
You can use beforeEach or before, which do predominantly the same thing in Mocha.
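If you need the hook to run only for certain tests, one approach (my own suggestion, not a built-in Mocha "tags" feature) is to put a marker in the test title and check it from a beforeEach hook via Mocha's this.currentTest. The runForTagged helper below is hypothetical.

// Emulate Cucumber-style tagged hooks in plain Mocha by checking the title
// of the test that is about to run. The hook must be a function () so that
// `this` is the Mocha context.
const runForTagged = (tag, fn) =>
  function () {
    if (this.currentTest.title.includes(tag)) {
      return fn.call(this);
    }
  };

describe('my spec', () => {
  beforeEach(runForTagged('@seed', () => {
    // seed the Mongo db only for tests whose title contains @seed
  }));

  it('creates a user @seed', () => {
    // runs after the seeding hook body
  });

  it('does something that needs no seeding', () => {
    // the tagged hook body is skipped for this test
  });
});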
I'm using TestCafe for my functional tests.
My project makes a lot of XHR requests and I don't want to waste time generating every single mock by hand.
Is there an automocker like this one, https://github.com/scottschafer/cypressautomocker, for TestCafe?
TestCafe does not provide the described functionality out of the box. However, you can use a combination of RequestLogger and RequestMock.
The idea is that on the first run you create a JSON file with the request results using the RequestLogger.
Then, based on the results of the first run, you configure your RequestMock object to respond with the results from that file on all subsequent runs.
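A rough sketch of that record-then-replay idea (my own illustration, not an official recipe): the /api/ URL filter, the page URL, the selector, and the api-recording.json file name are all assumptions, and in practice you would split or guard the two fixtures rather than run both every time.

import fs from 'fs';
import { RequestLogger, RequestMock } from 'testcafe';

const RECORDING = 'api-recording.json';

// First run: log every /api/ response body so it can be saved to disk.
const logger = RequestLogger(/\/api\//, {
    logResponseBody:       true,
    stringifyResponseBody: true
});

fixture `Record API responses`
    .page('https://example.com')
    .requestHooks(logger);

test('record', async t => {
    await t.click('#load-data');

    const entries = logger.requests.map(r => ({
        url:    r.request.url,
        status: r.response.statusCode,
        body:   r.response.body
    }));

    fs.writeFileSync(RECORDING, JSON.stringify(entries, null, 2));
});

// Subsequent runs: build a RequestMock from the recorded file instead of
// hitting the real backend.
const recorded = fs.existsSync(RECORDING)
    ? JSON.parse(fs.readFileSync(RECORDING, 'utf8'))
    : [];

let mock = RequestMock();

for (const entry of recorded)
    mock = mock.onRequestTo(entry.url).respond(entry.body, entry.status);

fixture `Replay API responses`
    .page('https://example.com')
    .requestHooks(mock);

test('replay', async t => {
    await t.click('#load-data');
});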
I want to generate tests on the fly from JSON fetched from an API: an array of data that tells me what I should do in each test and how many tests I need to run.
I tried to put the fetch part in "beforeAll", but that does not work because Jest wants all tests (the it blocks) to be registered when the file is loaded.
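One workaround I'd consider (not from the question): fetch the data before Jest registers the tests, for example in a globalSetup module that writes it to disk, then read the file synchronously in the spec and feed it to test.each. The URL, the test-cases.json file name, and the input field are invented, and global fetch assumes Node 18+.

// jest.config.js
// module.exports = { globalSetup: './global-setup.js' };

// global-setup.js
// const fs = require('fs');
// module.exports = async () => {
//   const res = await fetch('https://example.com/api/test-cases');
//   fs.writeFileSync('test-cases.json', JSON.stringify(await res.json()));
// };

// dynamic.spec.js
const fs = require('fs');

// Read synchronously at load time, so the it blocks exist when Jest collects them.
const cases = JSON.parse(fs.readFileSync('test-cases.json', 'utf8'));

describe('API-driven tests', () => {
  test.each(cases)('case %#', ({ input }) => {
    // Placeholder assertion; replace with the real steps for each case.
    expect(input).toBeDefined();
  });
});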
I have written a custom reporter for Mocha. I would like to be able to pass variables into it; is there a way to do that?
For example, I would like to pass in a project and a suite. My custom reporter reports results to a third-party application, but I need to include the suite and project in the test pass and test fail handlers.
Thanks
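One way that should work (a sketch, assuming you run Mocha from the CLI): Mocha forwards --reporter-options key=value pairs to the reporter constructor as options.reporterOptions, so project and suite can be read there. The reporter below is a minimal illustration, and the console.log calls stand in for the third-party reporting.

// Run with, e.g.:
//   mocha --reporter ./my-reporter.js --reporter-options project=Billing,suite=Smoke
'use strict';
const Mocha = require('mocha');
const { EVENT_TEST_PASS, EVENT_TEST_FAIL } = Mocha.Runner.constants;

class MyReporter extends Mocha.reporters.Base {
  constructor(runner, options) {
    super(runner, options);

    // Values parsed from the --reporter-options key=value pairs.
    const { project, suite } = options.reporterOptions || {};

    runner.on(EVENT_TEST_PASS, test => {
      // Send to the third-party application here; console.log stands in for that call.
      console.log(`[${project}/${suite}] PASS ${test.fullTitle()}`);
    });

    runner.on(EVENT_TEST_FAIL, (test, err) => {
      console.log(`[${project}/${suite}] FAIL ${test.fullTitle()}: ${err.message}`);
    });
  }
}

module.exports = MyReporter;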