I am trying to upload a file from my local machine to a web page. I'm able to select the file using the Robot class, but I'm stuck on one part. When the file uploads, I see a message in the UI that the file is being "scanned for viruses", but the scan never completes. And I can't move forward, because the upload is not complete while the file is still being scanned. (Screenshot attached)
However, when I open a session manually in my application and upload a file by hand, it's scanned for at most 2-3 seconds, and after that I see the upload is complete. Only when the session is opened by Katalon does the upload get stuck on scanning.
Can you please help me figure out why it would behave differently when driven by an automated script as compared to a manual run? Is it something related to the browser settings Katalon uses when it opens the session? And how do I handle this part in an automated script?
Please have a look at this. Looking forward to a reply.
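One way to cope with a slow scan in a script, whatever the underlying cause, is to wait explicitly for the scanning banner to disappear instead of proceeding after a fixed delay. Below is a minimal sketch in Python with Selenium (Katalon's Groovy API has an equivalent in WebUI.waitForElementNotPresent); the URL, locator, and timeout here are assumptions:

```python
# Sketch: wait for the "scanning for viruses" banner to disappear before
# moving on. The XPath locator and 120 s timeout are assumptions; adjust
# them to match the real page.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://example.com/upload")  # hypothetical URL

# ... select the file here (e.g. via the Robot class or send_keys) ...

wait = WebDriverWait(driver, 120)
wait.until(EC.invisibility_of_element_located(
    (By.XPATH, "//*[contains(text(), 'scanning for viruses')]")))
# The banner is gone (or the wait timed out), so it is safe to proceed.
```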
We have a process in place, built on Excel VBA, that uploads a file to an FTP server. On the other side, our client downloads it. Every so often, they complain that the file they received is blank (though the file name is the same). We then check on our end and see that the file we uploaded was never blank. So here comes the problem: we're always arguing over whether the error was ours or theirs.
I figure there might be a couple of reasons behind it, but I have a few questions to ask before jumping to conclusions:
If, say, the file was never uploaded (a possibility), what happens when the client runs the download process at their end? Can that download process generate a blank file with the same name as our output file? It sounds impossible to me, but since the client keeps following up on this issue, I have to ask this silly question.
How does the mechanism work? That is, what steps happen on the FTP server the moment my process finishes uploading the file? I sometimes see that as soon as I upload the file, a 0 KB file is created, and then a second later (or less) the file with the right size appears. Could their download process be running right before the full file is actually written?
Thank you in advance for your help!
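If that second scenario is the culprit, a common safeguard is to upload under a temporary name and rename the file only after the transfer completes, so the real name never exists in a half-written state. Here is a minimal sketch using Python's ftplib rather than VBA, purely for illustration; the host, credentials, and file names are placeholders:

```python
# Sketch: upload under a temporary name, then rename. A client polling the
# directory never sees a half-written "report.csv". Host, credentials and
# file names are placeholders.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:
    ftp.login("user", "password")
    with open("report.csv", "rb") as f:
        ftp.storbinary("STOR report.csv.tmp", f)
    # Rename only after the transfer has fully completed.
    ftp.rename("report.csv.tmp", "report.csv")
```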
I have Googled this subject a lot over the past few days, but I cannot find a best-practice solution. My question is basically: how do I script a file upload in LoadRunner (LR)? My app consists of a browse button, a pop-up that lets me locate the file and closes once I have selected it, and finally an upload button to upload the file.
My script is recorded using URL mode, and I guess I need to create some kind of custom request? URL mode creates somewhat complex scripts, and placing custom requests inside these scripts is challenging.
BTW: I have not yet tried to record and play back the file-upload process described above, and using URL mode might just solve it without further customization? Or has someone actually made file upload work using LR and URL mode? A small example would be greatly appreciated!
Different applications go about uploading a file from a client to a server in different ways. Your best bet is to record your application doing the upload and take a look at what LoadRunner records.
Mark the points before and after the upload as you record by creating a transaction, so you can easily find the spot in your code where the upload actually happens.
It would be far too much for me to ask for a full solution. However, could you point me in the right direction as to what I need to look up and learn? This is the first time I am going to attempt something like this.
What I want to do in my Mac application is have a list of items, which are files that I want to store online. From inside the application, the user can then download any of the items stored at that online location. If I add new items online, I want the app to automatically add them to the list of available downloads.
Does that make sense? Anyway, it's the first time I have done anything like this, using an online server and accessing it via an app, so any support would be hugely appreciated.
Sounds like you want an FTP-type server; you can then get a list of remote files and upload or download files. If you search for Cocoa FTP, I am sure you will find that someone has written a nice wrapper class for FTP. There are even complete open-source FTP apps whose code you can examine, such as FileZilla. Alternatively, you could just use NSTask and call the ftp command-line tool that ships with every Mac.
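For illustration only, the list-then-download flow described above looks like this with Python's ftplib (the Cocoa wrappers and the command-line tool expose the same operations); the server, credentials, and directory are placeholders:

```python
# Sketch: fetch the remote file list, then download each entry.
# Server, credentials and remote directory are placeholders.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:
    ftp.login("user", "password")
    names = ftp.nlst("uploads")  # list of remote file names
    for name in names:
        # Save each remote file under its base name locally.
        with open(name.split("/")[-1], "wb") as out:
            ftp.retrbinary(f"RETR {name}", out.write)
```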
I'm looking to write an automated script that:
Opens a browser instance with a specific URL
Prints the page as PDF output to a pre-defined location and document name
Simulates a click event on the web page that goes to the next report
Repeats steps 2 and 3 a fixed number of times.
I'm not sure how to start on this. I thought of using JavaScript, but it won't be able to automate the printing process.
I have no control over the server, so I cannot use a query to get the collection of those reports.
The reason for the script is that there are many such reports, and the server can be very slow at times; it would be better to have them available locally.
UPDATE: I forgot to mention that logging in to the server is required.
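For what it's worth, one browser-driven approach is Selenium driving headless Chrome, which can render the current page to PDF through the DevTools Page.printToPDF command and then click through to the next report. A rough sketch; the URL, login step, link locator, and report count are all assumptions:

```python
# Sketch: open a report, save it as PDF, click "next", repeat.
# URL, login step, locator and report count are assumptions.
import base64
from selenium import webdriver
from selenium.webdriver.common.by import By

options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
driver = webdriver.Chrome(options=options)

driver.get("https://example.com/reports/first")  # hypothetical URL
# ... perform the login here, e.g. by filling in the login form ...

for i in range(10):  # fixed number of reports
    pdf = driver.execute_cdp_cmd("Page.printToPDF", {"printBackground": True})
    with open(f"report_{i}.pdf", "wb") as f:
        f.write(base64.b64decode(pdf["data"]))
    driver.find_element(By.LINK_TEXT, "Next").click()  # hypothetical locator

driver.quit()
```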
I think scripting an off-the-shelf browser is very much the Hard Way to solve your problem. If you can at all predict the URLs for the individual reports, use a command-line tool such as wget or curl to download them, and then look at this community wiki for rendering the downloaded HTML as PDF.
Or do you even need to go to PDF? If all you're interested in is having the reports available locally, why not keep them as HTML and view them in a browser (with a file: URL) rather than a PDF viewer?
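If the URLs are predictable, the login requirement is also easy to satisfy with a session-aware HTTP client. A minimal sketch with Python's requests, in the same spirit as the wget/curl suggestion; the login URL, form field names, and report URL pattern are assumptions:

```python
# Sketch: log in once, then download each report over the same session.
# Login URL, form field names and report URL pattern are assumptions.
import requests

session = requests.Session()
session.post("https://example.com/login",
             data={"username": "user", "password": "secret"})

for i in range(1, 11):
    resp = session.get(f"https://example.com/reports/{i}")
    resp.raise_for_status()
    with open(f"report_{i}.html", "wb") as f:
        f.write(resp.content)
```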
Remote clients will upload images (and perhaps some instructional files in specially formatted text) to a "drop folder." Once an upload is complete, we need to begin processing these images. It would be an easy, but flawed, solution to just have a script automatically begin processing any files in the folder every few seconds (the files can be moved out of the folder once processed); but problems would arise when attempting to process large images that have only been partially transferred.
What are some tricks I can use to ensure the files are fully uploaded before processing them?
A few of my own thoughts:
The script can check the validity of the file; i.e., a partial JPEG would result in an error, and the script could respond to that error, though this would be fairly CPU-intensive. Some formats have special markers at the end of the file, but I can't count on this, as I'm not sure which formats I'll be dealing with.
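For image formats, that validity check is cheap to prototype. A sketch using Pillow, whose verify() raises on many (though not all) truncated or corrupt files, so treat it as a heuristic rather than a guarantee:

```python
# Sketch: reject files Pillow cannot parse cleanly. verify() catches many
# truncated or corrupt images, but not every partial transfer.
from PIL import Image

def looks_complete(path):
    try:
        with Image.open(path) as img:
            img.verify()
        return True
    except Exception:
        return False
```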
I've heard of "file handles" but haven't really figured out the basics of what they are or how I can tell whether there is a handle open on a particular file. The idea is that the FTP daemon (actually a Windows service, in my case) would keep a handle on the file while it's being uploaded, so you would know not to process that file yet. These are just a few of my thoughts, but I'm not really sure whether they will work or whether there are better or more accepted ways of solving this problem.
If you have a server-side script upload system (PHP, ASP, JSP, whatever), you could instruct the script to call another script to process the files, or to create a flag file indicating that the upload is done.
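A minimal sketch of the flag-file idea on the processing side, assuming the uploader writes an empty "<name>.done" marker after the real file finishes; the folder path and process() function are placeholders:

```python
# Sketch: only process files whose ".done" marker has appeared.
# DROP_FOLDER and process() are placeholders.
import os

DROP_FOLDER = "/srv/drop"

def process(path):
    # placeholder for the real image-processing step
    print("processing", path)

for name in os.listdir(DROP_FOLDER):
    if not name.endswith(".done"):
        continue
    data_path = os.path.join(DROP_FOLDER, name[: -len(".done")])
    if os.path.exists(data_path):
        process(data_path)
        os.remove(os.path.join(DROP_FOLDER, name))  # consume the marker
```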
If your server is Linux-based, you can use lsof to check whether the file is open. Since your FTP daemon/script/CGI closes the file when the upload completes, lsof will no longer show the file in its list.
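From a script, lsof's exit status is enough: it exits 0 when some process has the file open and non-zero when none does. A small sketch:

```python
# Sketch: a file is safe to process once no process holds it open.
# lsof exits 0 if it finds an open handle on the path, non-zero otherwise.
import subprocess

def still_open(path):
    result = subprocess.run(["lsof", path],
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)
    return result.returncode == 0
```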
If your server is Windows-based, you can use Process Explorer to list the open files.
By what method are your users uploading the images?