Alternatives to JUpload for very large (> 2GB) files

Our customers need to upload very large files (around 10GB or so) to our web service. Ignoring any server-side complexities, what options exist for the client side? JUpload seems to be the most common, but its file selector is very unfriendly-looking. Our clients would be happy to install a browser extension, so it doesn't need to be limited to Java, and something that uses the native OS file selector would be preferred. Browsers used by customers include the usual (IE, Firefox, Safari and Chrome), but we may be able to limit clients to a particular browser, so something that is not cross-browser may be acceptable, but is certainly not preferred.

We use https://transloadit.com/ for this purpose, though some web browsers may not support files larger than 2GB.
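For what it's worth, the usual way services get around per-request size ceilings like that 2GB limit is to split the file into chunks on the client and reassemble it on the server (this is also the idea behind Transloadit's tus resumable-upload protocol). A rough Python sketch of the client half, purely to show the idea - the endpoint, chunk size and Content-Range convention here are assumptions, not a real API:

    # Sketch only: chunked upload client. URL, chunk size and header
    # convention are made up; a real server must agree on all three.
    import os
    import requests

    PATH = "big_video.mp4"                  # placeholder file
    URL = "https://example.com/upload"      # hypothetical endpoint
    CHUNK = 50 * 1024 * 1024                # 50 MB per request

    size = os.path.getsize(PATH)
    with open(PATH, "rb") as f:
        offset = 0
        while offset < size:
            data = f.read(CHUNK)
            end = offset + len(data) - 1
            headers = {"Content-Range": "bytes %d-%d/%d" % (offset, end, size)}
            requests.put(URL, data=data, headers=headers).raise_for_status()
            offset += len(data)

Because no single request ever exceeds the chunk size, the client's 2GB limit never comes into play.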

Related

Site functionality diminished over VPN/company network

We currently experience diminished site functionality with one of our customers at our main production site. All subpages and resources seem to be affected as well.
The customer reports a completely broken experience for themselves with the site not working correctly at all, mostly due to assets not loading correctly.
We already started investigating and have found that - so far - nothing seems to be wrong with the site itself.
Quick rundown:
The production site has a Cloudflare layer and almost all of its assets are delivered either via cdnjs or Amazon's CloudFront (behind Cloudflare) - all assets are reachable via HTTP as well
The site uses SSL and enforces it (the dynamic cert from Cloudflare)
We were able to capture a HAR for a request to one of our sites; the request times are extremely long. If you would like to take a look, here is an online HAR viewer - be sure to uncheck validation of the file.
The customer uses Internet Explorer 8 and Chrome (39). While the site is not optimized for IE8, it should run fine in Chrome; in fact, it runs just fine in most browsers above IE9 for all of us.
Notes
We already ruled out:
Virtual delivery problems (there could be physical limitations we are not aware of)
General faultiness of our setup (We tried three different open VPNs to verify this)
Being on the customer's blacklist by accident (although we cannot be entirely sure of this)
SSL Server name indication (SNI) problems
(Potentially) a general problem with the customer's network; the customer does not report any problems with "the rest of the internet".
The customer will not give us access to their VPN or disclose security details, so we cannot really test the situation ourselves. We suspect that the customer uses an internal proxy that might cause the problems described, but we are not sure.
Questions
My questions here are:
Is there any known problem caused by internal networking in conjunction with our setup that can cause this behaviour?
Are there potential problems on our end that we could have overlooked or things that we do different from other sites?
It seems the connection is being made (or routed) through a low-bandwidth, high-latency link (or a very congested one). Most of the DNS lookups and connects seem to be taking ~10s.
In the HAR you can see that it affects fonts.googleapis.com and cdnjs.cloudflare.com. https://www.google-analytics.com/analytics.js has no data captured. To me the claim that the customer does not report any problems with "the rest of the internet" seems kind of dubious, seeing that in this HAR the page hasn't been able to load the analytics JS and access to the usual CDNs is very slow.
My guesses (pick one or more):
they are testing on a machine different from the one that has no problems with "the rest of the internet"
this machine is very, very slow
it has some kind of content filtering, antivirus, or other web filter (perhaps with an SSL certificate installed in order to forge & inspect HTTPS traffic)
the access is done through a congested route, or a low bandwidth high latency link
Two hotspots:
CDN access points can sometimes be inconsistent; I spent a lot of time understanding this issue. How? In a live session with the client, when I opened each loaded resource one by one, I realised there were differences between CDN access points (mine in eastern Europe, his in central Europe). The CDN host was one of the biggest US players in the world; anyhow, we fixed this by invalidating (deleting) all files from the CDN so that new/correct ones were loaded.
You need a CDN that supports serving files over HTTPS, then use that CDN for the SSL requests.

Recording scripts - "page not found" because of single protocol?

I have found a strange issue which I do not completely understand. When I run LoadRunner with just a single protocol, the browser (when recording starts) is launched but says "page not found" (as if the proxy was not set).
How come? The protocols specify what traffic will be captured, but I assumed it just does not record the ones not specified. So why can the browser not find the page in single-protocol mode but can in multi-protocol mode?
I've found that the single protocol mode (I assume web here) is somewhat erratic and does not work all the time. The workaround is to use the multiple protocol mode, but select only Web (HTTP/HTML). This works much better.
The actual reasons for why this is the case are unknown, but at least give it a try!
As for other issues:
Check that your PROXY settings are correct when you invoke IE for recording. Your issue sounds a little like a proxy issue, but please post more details if none of the above works.
Over 90% of recording issues can be traced to environment items. Specifically: do you have the right match-up between your version of LR and the version/manufacturer of your browser? Are you signed in with the proper credentials? Do you have any conflicting software packages loaded, such as antivirus, which could be impacting the recording mechanism?
Where to start?
Make sure you are signed in with Administrative credentials
Disable any antivirus running locally
Validate your browser manufacturer and version with the requirements for your version of LoadRunner

How to stress test simulating heavy load using Selenium

I have a system to test, which is a video ads distribution technology. I need to load every video for about 1-2 minutes to serve the ads. The videos are played in a Flash client and streamed as FLV streams, as on YouTube.
The reason why I need to test it only via browsers -- and every other method won't work -- is to stress test both the video streaming servers and the ads servers simultaneously while displaying ads in real time.
I have used Selenium, WatiN, Automation Anywhere and many other automation tools. However, when I am trying to start like 10000 browsers on my machine (32GB RAM, 16-core CPU), none of them are able to do the job.
With Selenium, I am able to start the most Firefox instances so far, but that's still too few: half of the instances don't run the test.
Any suggestions to do with Selenium?
You aren't going to run 10,000 browsers on your machine. That would give 3.2MB of physical memory per browser instance, and I'm pretty sure Firefox just won't like that.
You could create a JMeter script that hits your server with many threads. It won't interact with the UI but would simulate the load of many clients hitting whatever URLs you tell it. I believe it also includes the ability to record a session and play it back for easy setup of your sessions.
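If a full JMeter test plan feels heavyweight, the core idea is just many workers hitting your URLs concurrently. A toy Python sketch of the same many-threads approach - the URL and client count here are placeholders:

    # Sketch: simulate N clients fetching the same resource in parallel.
    # This exercises the server, not the browser UI.
    import concurrent.futures
    import urllib.request

    URL = "http://example.com/stream.flv"   # placeholder target
    CLIENTS = 100                           # simulated client count

    def hit(_):
        try:
            with urllib.request.urlopen(URL, timeout=30) as resp:
                return resp.status
        except Exception as exc:
            return exc

    with concurrent.futures.ThreadPoolExecutor(max_workers=CLIENTS) as pool:
        print(list(pool.map(hit, range(CLIENTS))))

JMeter does the same thing with thread groups, plus ramp-up control, assertions and reporting, so use it for anything beyond a smoke test.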
Selenium isn't really optimized for load/stress testing, especially if you're running your browsers locally. Running 1000+ browsers is going to choke even the beefiest server. Though RAM is an obvious bottleneck, you also have limited CPU resources and bandwidth; the latter is a primary concern if you are loading videos.
Not to mention you'd be testing from a single IP with 10k browsers, so load balancing may not kick in properly, and neither may the actual distribution of video ads to specific virtual users.
If you want to stick with existing Selenium tests, I've had good experiences with BrowserMob. They basically have a huge grid to do real browser load-testing, distributed across AWS.
Another recommendation would be an actual performance testing tool. I'd recommend Soasta CloudTest. They have a free version that runs 100 users so you can see if it will be a good fit for you. I have found that scripting for CloudTest is relatively simple.
Disclaimer: My experiences with both companies have been as a paying customer and I have never worked for either.
If you are using a Windows machine then, in my experience, there is a limit on the number of browser windows that can be opened. In my last test, it topped out somewhere between 100 and 150 browser windows.
I would recommend using a headless robot, which doesn't require opening a browser window; I think the latest version of Selenium has that capability. But since this looks more like a load test, as you are trying to simulate 10,000+ user instances, I would recommend a load-testing tool like JMeter or LoadRunner.
It looks to me that you are trying to verify what the client will see based on high traffic, no?
In that case, Joel is quite correct. If you absolutely have to see what the client sees, you could use threaded hits and just dump the results in a database. That'll show you anything the client would see anyway, and it's a lot easier to sort through than thousands of browser instances.
Either way, your client will not see errors if there are no errors present on the server side. If you're testing functionality in bandwidth-restricted, CPU-intensive, or memory-intensive environments, those conditions are much more easily achieved than by running thousands of browser instances.
Your post smells of some form of ad-based fraud to me, but either way: have you considered using different web browsers besides Firefox? PhantomJS is a headless WebKit-based browser that is compatible with Selenium. It supports all the core browser features like DOM handling, CSS selectors, JavaScript and Canvas. I do not know if it supports Flash.
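For illustration, driving PhantomJS from Selenium's Python bindings looks something like this - assuming an older Selenium release (webdriver.PhantomJS was later deprecated and removed) and the phantomjs binary on your PATH; the URL is a placeholder:

    from selenium import webdriver

    # Headless WebKit: no window is opened, so per-window OS limits
    # don't apply the way they do with visible Firefox instances.
    driver = webdriver.PhantomJS()               # assumes phantomjs on PATH
    driver.get("http://example.com/video-page")  # placeholder URL
    print(driver.title)
    driver.quit()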
This post has a decent list of other headless and automatable web-browsers that you might consider.
Also, if each browser instance is instantiating a Flash plugin, don't neglect the possibility that the issue could be with Flash and not Firefox. Alternatively, why instantiate several different Firefox processes? Can you accomplish what you want through the use of tabs instead?
The in-house way to do this with Selenium is to use BrowserMob Proxy and multiple browser agents to recreate the experience of different users; changing the IP is more difficult because it requires changing your home network.
Here is a good example
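A rough sketch of that setup with the browsermob-proxy Python bindings and an older Selenium that still supports FirefoxProfile proxies - the launcher path and URL are placeholders:

    from browsermobproxy import Server
    from selenium import webdriver

    # Start the BrowserMob Proxy server and create one proxy per user.
    server = Server("/path/to/browsermob-proxy")  # placeholder launcher path
    server.start()
    proxy = server.create_proxy()

    # Route a Firefox instance through the proxy.
    profile = webdriver.FirefoxProfile()
    profile.set_proxy(proxy.selenium_proxy())
    driver = webdriver.Firefox(firefox_profile=profile)

    proxy.new_har("ads")                          # capture timings as a HAR
    driver.get("http://example.com/video-page")   # placeholder URL
    print(proxy.har)

    driver.quit()
    server.stop()

Each simulated user gets its own proxy (and, with more work, its own user agent), but as noted above they still share your machine's IP.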

Automatic file selection for upload

Is it possible for a website to automatically find a folder on a USB stick and upload all the files in it to the web server by clicking only one button?
The problem is that I don't know how to make the upload form automatically detect the USB stick, as the drive name (e.g. G:, F:, etc.) may vary from computer to computer, so hard-coding the path is not possible.
PS: I'm using the Yii framework for site development, but I can add a new page that will handle this in any other language, as the client really wants this feature.
Web sites are not allowed to set default files to upload (it's a major security risk!). Also, web sites cannot scan the hard drive/enumerate what file systems exist on a system, again, for security purposes.
It might be possible to do this with Flash/Silverlight/Java. Java seems the most likely to allow a web developer to do this (the Java plugin seems to be quite willing to give out every permission under the Sun).
Short answer: No.
Long answer: Allowing automatic uploads in web browsers would be a huge security hole so the browsers intentionally prevent it. Even if you manage to find a hole that permits it, the browser makers will break it as soon as they find out.
However, if you have an environment where an actual separate program can be installed on the end user's computer, you could easily write a program to do automated uploads of specified directories when launched.
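For example, a small Python program along those lines: it probes Windows drive letters (since the stick's letter varies per machine), looks for a marker folder, and POSTs every file in it - the folder name and endpoint are assumptions about your setup:

    import os
    import string
    import requests

    FOLDER = "to_upload"                    # marker folder assumed on the stick
    URL = "https://example.com/upload"      # hypothetical upload endpoint

    # Probe drive letters, since the stick may mount as G:, F:, etc.
    for letter in string.ascii_uppercase:
        root = os.path.join("%s:\\" % letter, FOLDER)
        if os.path.isdir(root):
            for name in os.listdir(root):
                path = os.path.join(root, name)
                if os.path.isfile(path):
                    with open(path, "rb") as f:
                        requests.post(URL, files={"file": (name, f)})
            break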

Is there an HTTP proxy tool that can substitute browsed content?

What I'm looking for is some sort of a proxy tool that will allow me to specify a local file to load instead of the one specified in the web page being browsed. I have tried Burp Suite, which is almost working - it allows us to intercept a file and replace it by pasting the contents of the replacement file into an input field. The file content is compiled code (Flash content) so we are pasting in bytecode, but something isn't working.
The reason is we are a 3rd-party software developer without access to our client's development or testing environments. Our content must interact correctly with the rest of the content on their webpage (there are elements on their page that communicate with our content), and testing any changes we make takes several hours of turnaround to get our files uploaded to their servers. So what we need is some sort of hacking tool to let us test our work with their web pages, hence the requirement to specify a file in a webpage to swap with a local version.
The autoresponder feature in Fiddler Web Debugging Proxy might do what you need, if it's only static content.
I've been using HTTP::Proxy for a long time, and it has always helped me fiddle with things on the fly.
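Along the same lines, mitmproxy lets you script the substitution in Python. A sketch for recent mitmproxy versions - the matched URL fragment and local path are placeholders; run it with mitmproxy -s replace.py and point the browser at the proxy:

    # replace.py - serve a local build instead of the remote asset.
    from mitmproxy import http

    TARGET = "client.example.com/player.swf"  # placeholder URL fragment
    LOCAL = "/path/to/local/player.swf"       # placeholder local build

    def request(flow: http.HTTPFlow) -> None:
        if TARGET in flow.request.pretty_url:
            with open(LOCAL, "rb") as f:
                body = f.read()
            # Short-circuit: the real server is never contacted for this asset.
            flow.response = http.Response.make(
                200, body, {"Content-Type": "application/x-shockwave-flash"}
            )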
You might be able to do this with Greasemonkey but I'm not sure if the tests will be totally reliable.
http://diveintogreasemonkey.org/patterns/replace-element.html
And if Greasemonkey seems plain wrong for you, I would take it as the perfect excuse to try out mouseHole. Now I have to admit that I've never tried it, but since _why also made Hpricot I expect it to be fun, productive, and different.