Avoiding Safari's repeated requests for IndexedDB permission beyond 50 MB - safari

I'm trying to get Safari to run a series of tests (web-platform-tests) served from my local machine. The tests create a large amount of data in IndexedDB, for which Safari requires permission (the requests are for more than 50 MB), but approving permission every time becomes far too cumbersome when cycling through hundreds of tests.
In Preferences->Privacy->Cookies and website data->Manage Website Data..., there is an entry, "Local documents on your computer (Databases)", apparently indicating the presence of this data, but it provides no configuration options, and setting Preferences->Privacy->Cookies and website data to "Always allow" does not stop the prompting.
Is there any other configuration that would let me get around the need for manual permission? (I'm asking here instead of Super User, as I don't know whether there might also be an API which can persist past the limit.)
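On the API angle: the Storage API exposes navigator.storage.persist(), which asks the browser to treat the origin's storage as persistent. Whether a given Safari version honors it (or stops prompting because of it) is uncertain, so treat this as a hedged sketch rather than a guaranteed fix:

```javascript
// Ask the browser to mark this origin's storage as persistent (Storage API).
// Support varies by browser and version; Safari may still prompt regardless.
// `nav` is parameterized so the helper can also be exercised outside a browser.
async function ensurePersistentStorage(nav) {
  if (nav.storage && typeof nav.storage.persist === 'function') {
    // Resolves true if the browser granted persistence, false otherwise.
    return nav.storage.persist();
  }
  // Storage API not available: report non-persistent storage.
  return false;
}

// In a real page you would call: ensurePersistentStorage(navigator)
```

You can check the current state first with navigator.storage.persisted(), which does not trigger any prompt.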

Related

Static files as API GET targets

I'm creating a RESTful backend API for eventual use by a phone app, and am toying with the idea of making some of the API read functions nothing more than static files, created and periodically updated by my server-side code, that the app will simply GET directly.
Is this a good idea?
My hope is to significantly reduce the CPU and memory load on the server by not requiring any code to run at all for many of the API calls. However, there could be a huge number of these files (at least one per user of the phone app, which will be a public app listed in the app stores and which I naturally hope gets lots of downloads), and I'm wondering whether that alone will lead to the latency issues I'm trying to avoid.
Here are more details:
It's an Apache server
The hardware is a hosting provider's VPS with about 1 GB of memory and 20 GB of free disk space
The average file size (in terms of content, not disk footprint) will probably be < 1 KB
I imagine my server-side code might update a given user's data once a day or so at most.
The app will probably do GETs on these files just a few times a day. (There's no real-time interaction going on.)
I might password-protect the directory the files will be in at the .htaccess level. There's no personal or proprietary information in any of the files, so maybe I don't need to, but if I do, will that make a difference to the main question of feasibility and performance?
Thanks for any help you can give me.
This is generally a good thing to do: anything that can be static rather than dynamic is a win for performance and cost (it's why we do caching!). The main issue is with authorization, which you'll still need to handle for each incoming request.
You might also want to consider using a cloud service for storage of the static data (e.g., Amazon S3 or Google Cloud Storage). There are neat ways to provide temporary authorized URLs that you can pass to users so that they can read the data for a short time and then must re-authorize to continue having access.
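On the .htaccess question from the original post: password-protecting the directory with HTTP Basic Auth is straightforward, though it adds a small per-request cost and the credentials would have to ship inside the app. A minimal sketch (paths and names below are placeholders):

```apache
# .htaccess in the directory holding the static API files
AuthType Basic
AuthName "API"
# Password file created beforehand with: htpasswd -c /path/to/.htpasswd appuser
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Since the files carry no sensitive data, this is optional; the performance impact is minor compared to running application code per request.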

JMeter load test: server down issues

I ran a load of 100 users using the Ultimate Thread Group, executing in non-GUI mode.
The execution ran for only about 5 minutes; after that, my test environment shut down. I am not able to drill down into the issue. What could be the reason for the server going down? My environment is supposed to support 500 users.
How do you know your environment supports 500 users?
100 threads don't necessarily map to 100 real users; you need to consider a lot of things while designing your test, in particular:
Real users don't hammer the server non-stop, they need some time to "think" between operations. So make sure you add Timers between requests and configure them to represent reasonable think times.
Real users use real browsers, and real browsers download embedded resources (images, scripts, styles, fonts, etc.), but they do it only once; on subsequent requests the resources are returned from cache and no actual request is made. Make sure to add an HTTP Cache Manager to your Test Plan.
You need to add the load gradually; this way you will be able to state at what number of threads (virtual users) response times start exceeding acceptable values or errors start occurring. Generate an HTML Reporting Dashboard, look into the metrics, and correlate them with the increasing load.
Make sure that your application under test has enough headroom to operate in terms of CPU, RAM, Disk space, etc. You can monitor these counters using JMeter PerfMon Plugin.
Check your application logs, most probably they will have some clue to the root cause of the failure. If you're familiar with the programming language your application is written in - using a profiler tool during the load test can tell you the full story regarding what's going on, what are the most resources consuming functions and objects, etc.
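As a practical note, the HTML Reporting Dashboard mentioned above can be generated directly from a non-GUI run; a typical invocation (file and directory names below are placeholders) looks like:

```shell
# -n: non-GUI mode, -t: test plan, -l: results file,
# -e/-o: generate the HTML dashboard into the given (empty) directory
jmeter -n -t testplan.jmx -l results.jtl -e -o ./dashboard-report
```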

Test website program performance under pressure visualization

I want to know how I can test my website's (web-based program's) performance, in terms of speed and response time, when using MS SQL Server and ASP.NET.
Specifically, I want to know how speed and performance change when my user count grows to 1,000,000 and beyond.
Thank you
There are a number of tools for running load tests against web sites; I like JMeter (http://jmeter.apache.org/) - open source, free, easy to use - but there are lots of others - Google "web performance testing" and take your pick.
All those tools allow you to specify a number of concurrent users, wait times between page requests, and then specify one or more user journeys through the site. They will give you a report showing response times as the number of users changes.
You can install the load testing applications on any machine; most have the concept of a "controller" and "load agents". The controller orchestrates the load test, while the load agents execute the tests. Generating the equivalent load of 1 million visitors is likely to require significant horsepower - you may need to use one of the cloud providers of load testing solutions. Again, Google is your friend here.
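For the controller/load-agent setup described above, JMeter's remote (distributed) mode is one concrete example: the controller drives one or more jmeter-server agents. A sketch (hostnames below are placeholders):

```shell
# On each load agent machine: start the JMeter server process
jmeter-server

# On the controller: run the plan against the listed remote agents
jmeter -n -t testplan.jmx -R agent1.example.com,agent2.example.com -l results.jtl
```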

Storage limit for indexeddb on IE10

We are building a web app that stores lots of files as blobs in IndexedDB. If a user uses our app to its maximum, we could store as much as 15 GB of files in IndexedDB.
We ran into a problem with IE10, that I strongly suspect is a quota issue.
After some files have been saved successfully, a new call to store.put(data, key); never completes.
The function is called, but neither a success event nor an error event ever fires.
If I look into IE 10's IndexedDB folder, I see a handful of what look like temporary files (of 512 kB each) getting created and removed indefinitely.
Looking at the "Cache and Database" parameters window, I see that my site's database has reached 250 MB.
Looking further, I found this blog entry http://msdnrss.thecoderblogs.com/2012/12/using-html5javascript-in-windows-store-apps-data-access-and-storage-mechanism-ii/ which incidentally says that the storage limit for Windows Store apps is 250 MB.
I am not using any Windows Store mechanism, but I figured I could be victim of the same arbitrary limit.
So, my questions are:
Is there any way to bypass this limit? The user is asked for permission to exceed a 10 MB limit, but I saw no prompt to the user when the 250 MB limit was reached.
Alternatively, is there any other way to store more than 250 MB of data with IE10?
Thanks, I'll take any clues.
I'm afraid you can't. Providing the storage limit and asking the user to allow more space is the responsibility of the browser vendor, so I don't think the first option is applicable.
I know the user can allow a website to exceed a given limit (Internet Options > General > Browsing history > Settings > Caches and databases), but I don't know if that will overrule the 250 MB. It may be that this is a hardcoded limit you can't exceed.
This limit is bound to a domain, meaning you can't work around it by creating multiple databases. The only solution would be to store on multiple domains, but in that case you can't access them across domains. Also, as far as I can see, the 250 MB limit applies to the IndexedDB API and the File API combined.
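Since the hung put() fires neither success nor error, one defensive pattern (a sketch for detecting the stall, not a fix for the 250 MB ceiling itself) is to race the request against a timeout so the app can at least warn the user instead of waiting forever:

```javascript
// Wrap an IndexedDB-style request (an object exposing onsuccess/onerror)
// in a Promise that rejects if neither callback fires within `ms` milliseconds.
// Useful on IE10, where a put() past the quota can hang silently.
function requestWithTimeout(request, ms) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error('IndexedDB request timed out (quota reached?)')),
      ms
    );
    request.onsuccess = () => {
      clearTimeout(timer);
      resolve(request.result);
    };
    request.onerror = () => {
      clearTimeout(timer);
      reject(request.error);
    };
  });
}

// Usage in a browser (10 s is an arbitrary example threshold):
//   requestWithTimeout(store.put(data, key), 10000)
//     .then(() => console.log('stored'))
//     .catch(err => console.warn('store failed or stalled:', err));
```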

How do I stop spammers from using my site's FTP, bandwidth and MySQL?

THE PROBLEM
My host gave me an ultimatum (3 business days):
"We regret to say that your database is currently consuming excessive resources on our servers, which degrades performance for the other database-driven sites hosted on the same server. The database/table/query statistics are provided below:
Avg queries / logged / killed
79500 / 0 / 0
There are several reasons why the number of queries can increase. Unused plugins will increase the number of queries. If plugins are not causing the issue, you can go ahead and block the IP addresses of the spammers, which will reduce the queries. Also, look for any spam content in the database and clean it up.
You need to check the top hitters in the Stats page. Depending on the bandwidth used, the top hits and the IPs, you need to take specific action on them to reduce the database queries. You need to block the unknown robots (identified by 'bot*'). These bots are scraping content from your website, spamming your blog comment area, harvesting email addresses, sniffing for security holes in your scripts, and trying to use your mail form scripts as relays to send spam email. The .htaccess Editor tool is available to block IP addresses."
THE BACKGROUND
The site was built entirely by us in VB.NET and MySQL on a Windows platform (except for the Snitz Forum). The only point from which we received spam was a comment form, which now has a captcha. We're talking about more than 4,000 files (tools, articles, forums, etc.) for a total of 19 GB of space. Just uploading it takes me two weeks.
STATISTICS OF ROBOTS
Awstats reports for the month of February 2012:
ROBOTS AND SPIDERS
Googlebot: 2,572,945 (+303) accesses, 5.35 GB
Unknown robot (identified by 'bot*'): 772,520 (+2,740) accesses, 259.55 MB
BaiDuSpider: 96,639 (+95) accesses, 320.02 MB
Google AdSense: 35,907 accesses, 486.16 MB
MJ12bot: 33,567 (+1,208) accesses, 844.52 MB
Yandex bot: 18,876 (+104) accesses, 433.84 MB
[...]
STATISTICS OF IP
IP 41.82.76.159: 11,681 pages, 12,078 accesses, 581.68 MB
IP 87.1.153.254: 9,807 pages, 10,734 accesses, 788.55 MB
[...]
Other: 249,561 pages, 4,055,612 accesses, 59.29 GB
THE SITUATION
Help! I don't know how to block IPs with .htaccess, and I don't even know which IPs to block! Awstats is missing the last 4 days!
In the past I already tried changing the FTP and account passwords, to no effect. I think these are generic attacks aimed at obtaining backlinks and redirects (which often don't work)!
This isn't really an .htaccess issue. Look at your own stats: you've had ~4M hits generating some 12 KB per hit in the last 4 days. I ran the OpenOffice.org user forums for 5 years, and this sort of access rate can be typical for a busy forum. I used to run on a dedicated quad-core box, but migrated to a modern single-core VM, and when tuned, it took this sort of load.
The relative bot volumetrics are also not surprising as a percentage of these volumes, nor are the 75K DB queries.
I think what your hosting provider is pointing out is that you are using an unacceptable amount of system (DB) resources for your type of account. You either need to upgrade your hosting plan or examine how you can optimise your database use. E.g., are your tables properly indexed, and do you routinely do a Check/Analyze/Optimize of all tables? If not, then you should!
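That routine table maintenance can be run from any MySQL client; for a hypothetical posts table (the table name is a placeholder for your own schema):

```sql
-- Verify the table for corruption, refresh the optimizer's index statistics,
-- then defragment the table and reclaim unused space
CHECK TABLE posts;
ANALYZE TABLE posts;
OPTIMIZE TABLE posts;
```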
It may well be that spammers are exploiting your forum for SPAM link posts, but you need to look at the content in the first instance to see if this is the case.
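If inspection does show abuse, blocking offenders at the .htaccess level is simple. The IPs below are the ones from the Awstats report quoted in the question, and the bot name is just an example (Apache 2.2 syntax):

```apache
# Deny the heaviest offenders from the Awstats IP report
Order Allow,Deny
Allow from all
Deny from 41.82.76.159
Deny from 87.1.153.254

# Flag and deny a misbehaving crawler by its User-Agent string
SetEnvIfNoCase User-Agent "MJ12bot" bad_bot
Deny from env=bad_bot
```

Be careful blocking by User-Agent pattern: a pattern that is too broad can also shut out legitimate search-engine crawlers.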