What is the equivalent of the method
Yii::getLogger()->getExecutionTime()
in Yii2, which presents the total time for serving the current request?
You should simply use:
Yii::getLogger()->getElapsedTime();
Read more: http://www.yiiframework.com/doc-2.0/yii-log-logger.html#getElapsedTime()-detail
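For instance, a minimal sketch of reading the value at the end of a request (hooking into the response's EVENT_AFTER_SEND event is just one possible place to do this, not something prescribed by the API):

use yii\web\Response;

Yii::$app->response->on(Response::EVENT_AFTER_SEND, function () {
    // getElapsedTime() returns the seconds elapsed since the application started.
    $seconds = Yii::getLogger()->getElapsedTime();
    Yii::info(sprintf('Request served in %.4f s', $seconds), 'profiling');
});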
In my Ruby on Rails application there are two localhost servers running. I am writing test cases for the first server, so I have to mock the second server.
For this I am using VCR to record the responses I get from the second server and play back the recorded cassette while running the tests on the first server.
I am stuck at the part where the first server makes a request to the second server (the session_id in the URL changes each time), and I want the response to be the same every time it makes a request.
Using VCR you can match requests on any parameters you wish (method, host, path, etc.) using the match_requests_on cassette option or a fully custom matcher - https://relishapp.com/vcr/vcr/v/3-0-3/docs/request-matching
I made this work by ignoring params, so something like this could work for you:
VCR.use_cassette('name_of_your_cassette', match_requests_on: [:method, VCR.request_matchers.uri_without_params('session_id')]) do
  # here is your http query
end
In my case it was a query parameter that was changing, so I ignored it in the VCR request matcher.
I am running some load tests (all endpoints) and we have a known issue in our code: if multiple POST requests are sent at the same time, we get a duplicate error based on a timestamp field in our database.
All I want to do is count the timeouts (based on the message received, "Service is not available. Request timeout") in a variable and accept this as normal behavior (i.e. not fail the tests).
For now I've added a Response Assertion for this (in order to keep the tests running), but I cannot tell if, or how many, timeouts actually happen.
How can I count this?
Thank you
I would recommend doing this as follows:
Add JSR223 Listener to your Test Plan
Put the following code into the "Script" area:
if (prev.getResponseDataAsString().contains('Service is not available. Request timeout')) {
    prev.setSampleLabel('False negative')
}
That's it: if a sampler contains Service is not available. Request timeout in its response body, JMeter will change its label to False negative.
You can even mark it as passed by adding a prev.setSuccessful(true) line to your script. See the Apache Groovy - Why and How You Should Use It article for more information on what else you can do with Groovy in JMeter tests.
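If you also want a running count of how many of these timeouts occurred, a rough sketch of the same JSR223 Listener could look like the following (the property name timeoutCount is only an illustration, not a JMeter built-in):

if (prev.getResponseDataAsString().contains('Service is not available. Request timeout')) {
    prev.setSuccessful(true)              // accept the timeout as "normal" behaviour
    prev.setSampleLabel('Known timeout')  // relabel so it is easy to spot in listeners

    // keep a running total in a JMeter property (shared across threads)
    int count = (props.getProperty('timeoutCount') ?: '0') as int
    props.put('timeoutCount', String.valueOf(count + 1))
    log.info('Known timeouts so far: ' + (count + 1))
}

You can then read props.getProperty('timeoutCount') back at the end of the run, for example in a tearDown Thread Group. Note that the read-then-write above is not atomic, so under very heavy concurrency a few increments could be lost.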
If you just need to find the count based on the response message, you can save the performance results to a CSV file using a Simple Data Writer (configured for CSV only) and then filter the CSV by the response message to get the required count. Alternatively, you can use the "Errors" only display option to capture all the errors and then filter by the expected error message.
If you need the count at runtime, you can use an Aggregate Report listener and its "Errors" checkbox to get the failure count, but this will include other failures as well.
However, if you need to capture the count at runtime in order to use it later, that is a different case; I am assuming that is not what you need.
Thanks,
I've just created an extractor with import.io. This extractor uses chaining: first I extract some URLs from one page, and with these extracted URLs I extract the detail pages. When the detail-page extraction finishes, I want to get the results. But how can I be sure that the extraction is complete? Is there an API endpoint for checking the status of an extraction?
I found the "GET /store/connector/{id}" endpoint in the legacy docs, but when I try it I get a 404. You can take a look at the screenshot.
Another question: I want to schedule my extractor to run twice a day. Is this possible?
Thanks
Associated with each Extractor are Crawl Runs. A crawl run represents the running of an extractor with a specific configuration (training, list of URLs, etc.). Each crawl run's state can have one of the following values:
STARTED => Currently running
CANCELLED => Started but cancelled by the user
FINISHED => Run was complete
Additional metadata that is included is as follows:
Started At - When the run started
Stopped At - When the run finished
Total URL Count - Total number of URLs in the run
Success URL Count - # of successful URLs queried
Failed URL Count - # of failed URLs queried
Row Count - Total number of rows returned in the run
The REST API call to get the list of crawl runs associated with an extractor is as follows:
curl -s -X GET "https://store.import.io/store/crawlrun/_search?_sort=_meta.creationTimestamp&_page=1&_perPage=30&extractorId=$EXTRACTOR_ID&_apikey=$IMPORT_IO_API_KEY"
where
$EXTRACTOR_ID - the extractor whose crawl runs you want to list
$IMPORT_IO_API_KEY - the import.io API key from your account
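To know when your chained extraction has finished, one option is to poll that endpoint until the newest crawl run reports FINISHED. A rough sketch (it assumes the search response exposes the run state in a state field and that jq is installed; adjust the filter to the actual response shape you get back):

#!/usr/bin/env bash
# Poll the newest crawl run of an extractor until it reports FINISHED.
EXTRACTOR_ID="your-extractor-id"
IMPORT_IO_API_KEY="your-api-key"

while true; do
  STATE=$(curl -s -X GET \
    "https://store.import.io/store/crawlrun/_search?_sort=_meta.creationTimestamp&_page=1&_perPage=1&extractorId=$EXTRACTOR_ID&_apikey=$IMPORT_IO_API_KEY" \
    | jq -r '[.. | .state? | select(. != null)] | first')
  echo "Latest crawl run state: $STATE"
  [ "$STATE" = "FINISHED" ] && break
  sleep 60
done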
In my Laravel application I use Auth::user() in multiple places. I am just worried that Laravel might be running a query on each call to Auth::user().
Kindly advise.
No, the user model is cached. Let's take a look at Illuminate\Auth\Guard#user:
public function user()
{
    if ($this->loggedOut) return;

    // If we have already retrieved the user for the current request we can just
    // return it back immediately. We do not want to pull the user data every
    // request into the method because that would tremendously slow an app.
    if ( ! is_null($this->user))
    {
        return $this->user;
    }

    // ... otherwise the user is retrieved from the session / recaller cookie ...
}
As the comment says, after the user is retrieved for the first time, it is stored in $this->user and simply returned on subsequent calls.
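If you want to verify this for yourself, a quick sanity check using Laravel's query log (run it in a route closure or a tinker session, for example) could look like this:

DB::enableQueryLog();

$first  = Auth::user();  // may run a single query to load the user
$second = Auth::user();  // returns the cached instance, no extra query

dd(count(DB::getQueryLog())); // expect at most 1 query recorded here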
Within the same request, if you call Auth::user() multiple times, it will only run one query, not several.
But if you call Auth::user() again in another request, it will run one query again.
The result cannot be cached across requests after the first one has been made, from a security point of view.
So it runs one query per request, irrespective of how many times you call it.
I have seen the session used here to avoid running the query repeatedly, so you can try the code at http://laravel.usercv.com/post/16/using-session-against-authuser-in-laravel-4-and-5-cache-authuser
Thanks
I am trying to integrate the SEOMOZ API with VCR.
As the API request to SEOMOZ contains parameters that change over time for the same request, I need to implement a custom matcher.
Here is what my API call looks like:
http://lsapi.seomoz.com/linkscape/url-metrics/#{self.url}?Cols=#{cols}&AccessID=#{moz_id}&Expires=#{expires}&Signature=#{url_safe_signature}
I also make calls to other endpoints such as Twitter, Facebook, etc., for which the default matcher does the job well.
How can I override the matcher behavior just for SEOMOZ? Also, on what parameters should I best match the request in this case?
You'll want to match on all parameters except Signature and Expires.
Another option you might consider (we use it internally when using VCR with this sort of API) is to record the time of the test in a file with the cassettes, and use Timecop or something equivalent to ensure you're re-running the recorded test at the "same time" every time you run it.
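A rough sketch of that second approach (the .recorded_at file name and the with_recorded_time helper below are only illustrative, not part of VCR or Timecop):

require 'time'
require 'timecop'
require 'vcr'

# Persist the time the cassette was first recorded, then freeze the clock to
# that time on every later run so Expires and Signature are regenerated
# exactly as they were when the cassette was recorded.
def with_recorded_time(cassette_name)
  timestamp_file = File.join('spec', 'cassettes', "#{cassette_name}.recorded_at")

  recorded_at =
    if File.exist?(timestamp_file)
      Time.parse(File.read(timestamp_file))
    else
      Time.now.tap { |t| File.write(timestamp_file, t.iso8601) }
    end

  Timecop.freeze(recorded_at) do
    VCR.use_cassette(cassette_name) { yield }
  end
end

Wrap the code that hits the SEOMOZ endpoint in with_recorded_time('seomoz'), and combine it with a matcher that ignores Signature and Expires.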
You can use the VCR.request_matchers.uri_without_params custom matcher; see https://relishapp.com/vcr/vcr/v/2-5-0/docs/request-matching/uri-without-param-s
You would use it like this in your case:
VCR.configure do |c|
  # ...
  c.default_cassette_options = {
    # ...
    # the default is: match_requests_on: [:method, :uri]
    match_requests_on: [:method, VCR.request_matchers.uri_without_params(:Signature, :Expires)]
  }
end