Why do we upload a CSV file for performance testing in BlazeMeter?

I am new to the testing world and have just started working on performance and load testing. When we record a script in BlazeMeter and upload a JMX file for a test, why do we also upload a CSV file with it, and what data do we need to enter in that CSV file? Please help.
Thank you

You can generate data for testing (e.g. user names, passwords, etc.), save it in CSV format, and then read it from the CSV file and use it in your test scenarios if needed. Please refer to the BlazeMeter documentation - Using CSV Data Set Config
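As a concrete illustration, the CSV is usually just rows of test data whose columns map to JMeter variables, e.g. a `username,password` header followed by one credential pair per line. A minimal sketch of generating such a file (the file name, column names, and `user<n>`/`pass<n>` values are illustrative, not anything BlazeMeter requires):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class CsvDataGenerator {
    // Builds a header row plus `count` rows of dummy credentials.
    static List<String> buildRows(int count) {
        List<String> rows = new ArrayList<>();
        rows.add("username,password");
        for (int i = 1; i <= count; i++) {
            rows.add("user" + i + ",pass" + i);
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        // Write the rows to users.csv, which a CSV Data Set Config can then read.
        Path out = Paths.get("users.csv");
        Files.write(out, buildRows(3));
        System.out.println(String.join(System.lineSeparator(), Files.readAllLines(out)));
    }
}
```

In the CSV Data Set Config you would then point the filename at `users.csv`, set the variable names to `username,password`, and reference the values in your samplers as `${username}` and `${password}`.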

Related

How to do a load test for registration in JMeter

I am very new to JMeter. How do I test registration and login for multiple users? Is it possible using a CSV file or any other method?
Can anyone please guide me?
Depending on what you're trying to achieve:
If you have a file with username/password combinations - the most commonly used approach is reading the file with the CSV Data Set Config
Alternatives are:
JDBC test elements - if the credentials are in a database
Redis Data Set - if the credentials are in Redis
etc.
If you're testing registration and a "run and forget" approach is good enough for you - you can generate random usernames and passwords using suitable JMeter Functions like __RandomString(), __Random(), __time(), etc.
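For example, the random-data approach could look like this in an HTTP request's parameters (the lengths, character sets, and field names below are illustrative):

```
username: user_${__RandomString(8,abcdefghijklmnopqrstuvwxyz,)}
password: ${__RandomString(12,abcdefghijklmnopqrstuvwxyz0123456789,)}
email:    ${__RandomString(8,abcdefghijklmnopqrstuvwxyz,)}@example.com
```

Each virtual user evaluates the functions independently, so every registration request gets fresh, unique-enough values without any external data file.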
You can use the Random CSV Data Set Config plugin in addition to the solutions given by Dmitri.
I have demonstrated the use of the CSV Data Set Config element and the Random CSV Data Set Config in this video.

What is the best way or tool to open and read raw clickstream blob storage data in Azure?

I have clickstream data in blob storage with average file sizes of about 800 MB, and when I open a file it defaults to a text file. How do I open and read the data, possibly in JSON or column format? I would also like to understand whether I can build an API to consume that data. I recently built an Azure Function app with an HTTP trigger, but the file is too large to open and the function times out. Any suggestions on those two points would be appreciated.
Thank you

I can't see my transformation files (PDI) in the Pentaho User Console

I have CSV files in the local file system that contain the output results of a program. I want to use Pentaho CDE to visualize that data in graphs (pie charts, etc.).
I can do that if I upload my CSV files directly as a data source, but I would like to read from the files in real time as data arrives.
I saw that I have to use PDI, so I created a transformation from the input CSV file to an output file, and I can visualize the output.
I saved the transformation file (.ktr) in the pentaho-solutions directory, but I can't see it in the Pentaho User Console. I think I have to use a "kettle over kettleTransFromFile" data source, but I can't see my transformation file to load it! I refreshed the cache but still can't see it.
Am I doing this wrong?
Thank you for your time

Is there a way to read test data from XLS for a Gherkin (cucumber-jvm) feature file ?

I would like to have a layered design for tests and test data, and I'm looking for any option to read test data from XLS or other files.
If there is an API available in cucumber-jvm, please let me know.
Using Apache POI you should be able to read data from an XLS file.
All you need to do is add the logic to your feature step definitions.
Apache POI - http://poi.apache.org/
Hope this helps
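A sketch of that approach, assuming Apache POI is on the classpath (the class name, sheet layout, and file path below are hypothetical, and the first sheet is assumed to have a header row):

```java
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

// Reads each data row of the first sheet into a map keyed by the header row,
// so step definitions can look up values by column name.
public class XlsTestData {
    public static List<Map<String, String>> read(String path) throws Exception {
        try (Workbook wb = WorkbookFactory.create(new FileInputStream(path))) {
            Sheet sheet = wb.getSheetAt(0);
            DataFormatter fmt = new DataFormatter();
            Row header = sheet.getRow(0);
            List<Map<String, String>> rows = new ArrayList<>();
            for (int r = 1; r <= sheet.getLastRowNum(); r++) {
                Row row = sheet.getRow(r);
                if (row == null) continue;
                Map<String, String> record = new HashMap<>();
                for (int c = 0; c < header.getLastCellNum(); c++) {
                    record.put(fmt.formatCellValue(header.getCell(c)),
                               fmt.formatCellValue(row.getCell(c)));
                }
                rows.add(record);
            }
            return rows;
        }
    }
}
```

A step definition could then call something like `XlsTestData.read("src/test/resources/testdata.xls")` in a `@Given` step and pick out the record it needs by column name.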

Uploading a file in ColdFusion

We would like to know if there is a way to upload a file using ColdFusion. Our requirement is to upload a CSV file. Initially we move the CSV file's data to a temporary table, and then we should write a stored procedure and call it to kick off the actual upload process.
The simple answer is yes. ColdFusion makes this quite a simple process. Just read the CF9 docs concerning cffile (http://help.adobe.com/en_US/ColdFusion/9.0/CFMLRef/WSc3ff6d0ea77859461172e0811cbec22c24-738f.html)
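A minimal sketch of that flow in CFML (the form field, datasource, table, column, and procedure names here are all placeholders, and the CSV is assumed to have exactly two columns with no embedded commas):

```cfml
<!--- Accept the uploaded CSV from a form field named "csvFile" --->
<cffile action="upload"
        fileField="csvFile"
        destination="#getTempDirectory()#"
        nameConflict="makeUnique"
        result="uploadResult">

<!--- Stage each line of the file into a temporary table --->
<cfloop file="#uploadResult.serverDirectory#/#uploadResult.serverFile#" index="line">
    <cfset cols = listToArray(line, ",")>
    <cfquery datasource="myDsn">
        INSERT INTO temp_upload (col1, col2)
        VALUES (<cfqueryparam value="#cols[1]#" cfsqltype="cf_sql_varchar">,
                <cfqueryparam value="#cols[2]#" cfsqltype="cf_sql_varchar">)
    </cfquery>
</cfloop>

<!--- Kick off the actual load with the stored procedure --->
<cfstoredproc procedure="process_csv_upload" datasource="myDsn" />
```

Using `cfqueryparam` for the staged values keeps the inserts parameterized rather than concatenating raw file contents into SQL.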