Why is my Azure App Configuration instance being throttled?

I have a basic instance of Azure App Configuration. It's supposed to allow 1000 requests per day.
The usage chart for the previous day looks like this:
But I can no longer open Configuration Explorer or Feature Manager. Getting these errors:
I don't understand why requests started being throttled, since the usage is nowhere near 1,000 per day.

Throttling is based on the 24-hour day in UTC time (not local time). Please make sure the whole UTC day is counted.
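To make that concrete, here is a minimal Python sketch (with hypothetical request counts and an assumed UTC-7 local time zone) showing how two spikes that fall on different local calendar days can land in the same UTC day and together exceed a 1,000-requests-per-day quota:

    from datetime import datetime, timedelta, timezone

    # Hypothetical request-count samples with timezone-aware timestamps,
    # as you might export them from the usage metrics (UTC-7 local time).
    samples = [
        (datetime(2023, 5, 1, 22, 30, tzinfo=timezone(timedelta(hours=-7))), 400),
        (datetime(2023, 5, 2, 9, 15, tzinfo=timezone(timedelta(hours=-7))), 700),
    ]

    def utc_day(ts):
        """Return the UTC calendar date a timestamp falls on."""
        return ts.astimezone(timezone.utc).date()

    # Group counts by UTC day -- the window the quota is enforced on --
    # rather than by local calendar day, which is what the chart may show.
    totals = {}
    for ts, count in samples:
        totals[utc_day(ts)] = totals.get(utc_day(ts), 0) + count

    for day, total in sorted(totals.items()):
        print(day, total)  # both samples land on 2023-05-02 UTC: 1,100 requests, over the quota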

Related

Time Travel error when updating Access Control on BigQuery dataset

We use python to programmatically grant authorized view / routine access to a large number of views to various datasets.
However, since this week we have been receiving the following error:
Dataset time travel window can only be modified once in 1 hours. The previous change happened 0 hours ago
This is blocking our current deployment process.
So far we have not been able to find a workaround for this error. Note that we do not touch the time travel configuration at all as part of our process.
This seems to be an issue with the BigQuery API.
Google have said that they will be rolling back the breaking change to restore functionality within the day.
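For context, the kind of update described above usually looks something like the following sketch, which uses the google-cloud-bigquery Python client to append an authorized view to a dataset's access list (the project, dataset, and view names here are hypothetical). The error quoted above is raised by the final update call even though only the access entries are being modified:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical IDs -- substitute your own project/dataset/view names.
    source_dataset = client.get_dataset("my-project.source_dataset")
    view = {
        "projectId": "my-project",
        "datasetId": "reporting_dataset",
        "tableId": "my_view",
    }

    # Append the view as an authorized entity on the source dataset.
    entries = list(source_dataset.access_entries)
    entries.append(bigquery.AccessEntry(role=None, entity_type="view", entity_id=view))
    source_dataset.access_entries = entries

    # Only the "access" field is sent in the update; the time travel settings
    # are not touched, yet the API currently rejects this with the error above.
    client.update_dataset(source_dataset, ["access"])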

Azure S2 Database Automatically creating and deleting a database every day

I currently have an Azure S2 database running via the new Azure Portal.
I noticed my billing was higher than it should be, and after investigating further, I found that new databases were appearing every day and then disappearing.
Basically, something is running a CreateDatabase and DeleteDatabase event every evening, and I'm being charged an extra hour each day.
Microsoft's response is:
"Our Operations Team investigated the issue and found that these databases did indeed exist in a 1 hour windows at midnight PST every day. It looks like you may have some workload which is doing this unknowingly or an application with permissions which is unknowingly creating these databases and then dropping them. "
I haven't set up any scripts to do this, and I have no apps running that could be doing this.
How can I find out what's happening?
Regards
Ben

Financial App - Recurring

I'm making a financial app and I've run into some problems with recurring money such as fixed payments, salary, bank savings, and so on. I tried to add these payments on a certain day by comparing the current day with the payment day. The code is something like this:
If Date.Now.Day = GetPayDate(payDate) Then
    ' code here: apply the recurring payment
End If
It's in a startup event and it works, but the problem is that if users don't open the app on that day, the payment is skipped and nothing gets added.
I'm using ADO.NET with a SQL database. It's a local client app without real-time data.
For this to work correctly, users don't have to log on, but the app must be run, so I tried to fix it by adding an auto-start function. But that's not an option either, because users may not use the computer for a few days.
Is there any other way to resolve this problem? I just need some solutions or ideas, so that even if users don't use the app for 2 or 3 months, it still calculates everything once they open it.
Sounds like you really need a Windows service that runs on startup, or a scheduled task. A Windows service is a type of C# / VB.NET application that's designed to run in the background and has no UI. The Windows Task Scheduler can start a program on a regular basis.
For more info about Windows services, see https://msdn.microsoft.com/en-us/library/zt39148a%28v=vs.110%29.aspx. For more information on scheduled tasks, see http://www.7tutorials.com/task-scheduler. For a discussion about which is better, see Which is better to use for a recurring job: Service or Scheduled Task?
Or, if you don't mind applying payments a few days late, you could check whether the current date is >= the pay date instead of requiring an exact match.
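If a full Windows service is overkill, a simpler catch-up approach is to store the date of the last successful run and, on startup, process every pay date that has passed since then. The original app is VB.NET with ADO.NET; the sketch below is just a Python illustration of the idea, and names like missed_pay_dates and the pay_day value are hypothetical:

    from datetime import date, timedelta

    def missed_pay_dates(last_processed, today, pay_day):
        """Yield every pay date that fell between the last run and today.

        pay_day is the day of the month the payment is due (e.g. 25).
        """
        current = last_processed + timedelta(days=1)
        while current <= today:
            if current.day == pay_day:
                yield current
            current += timedelta(days=1)

    # On startup: load last_processed from the database, apply every missed
    # payment, then write today's date back so the next run starts from there.
    last_processed = date(2024, 1, 10)  # hypothetical value loaded from the DB
    for due in missed_pay_dates(last_processed, date.today(), pay_day=25):
        print("apply recurring payment for", due)  # insert the transaction here

Because the loop is driven by dates stored in the database rather than by the moment the check runs, it works the same whether the app has been closed for three days or three months.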

How to identify process that generates data transfer out in EC2?

I am hosting a small web-based application with the Apache web server on EC2. On my monthly bill I usually see ~40 GB of data transfer out, which costs about $5 or so a month.
Although this is not a lot of money, I am curious how this data transfer out is being generated. I am sure that at midnight there won't be anyone actually visiting the web application, and yet there is still data transfer out at ~50 MB per hour (as I can see from Amazon's detailed report).
Is there any way to figure out which process actually generates that data transfer out (even at midnight when no one uses the web application)?
thanks!
J.
Have you looked at Boundary? Maybe they can help. They can monitor data going in and out of your EC2 instance (networking). You can see details like which ports the packets are coming from and where they are going.
You have to install an agent on your machine and sign up for a trial.
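If you only want a quick, free look at which processes are holding outbound connections at midnight, a small script on the instance itself can also help. This sketch uses the psutil library, which is my assumption and not part of the answer above; it shows who is talking to the outside, not per-process byte counts:

    import psutil  # assumed dependency: pip install psutil; run as root to see all processes

    # Print each process that currently holds an established network connection,
    # together with the remote endpoint it is talking to.
    for conn in psutil.net_connections(kind="inet"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            try:
                name = psutil.Process(conn.pid).name() if conn.pid else "?"
            except psutil.NoSuchProcess:
                continue
            print(f"{name:<20} pid={conn.pid} -> {conn.raddr.ip}:{conn.raddr.port}")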

Robust and accurate IIS reporting tool for failover IIS services

Good day,
I need to be able to produce IIS usage reports for our custom SharePoint 2007 application. The application runs on a two-server IIS 6 farm for load-balancing/failover purposes.
Here is the list of requirements that my management poses:
1. Daily visitors (per farm).
2. Daily hits (per farm).
3. Daily activity (hits, page views, visitors, avg. session duration).
4. Activity by hour of day (for the whole farm).
5. Activity by day of week (for the whole farm combined).
6. Activity by month.
7. Page access statistics / most popular pages.
8. Top authenticated users.
9. Browser usage statistics.
10. Client OS usage statistics.
So I need to combine report results from the two IIS boxes in the load-balanced rotation.
So far I have tried these tools:
1. Web Log Expert - produces the desired report types and can combine IIS logs from multiple locations, but the tool has some major bugs, such as:
a. Some important information is missing from the report: in the authenticated users report, the test user I log into the application with is missing, even though that user is not in the ignore filter and does appear in the IIS logs.
b. A bug with times and dates: even though there is an option to adjust the time from GMT, that change is not respected by the software. It can be worked around by converting the W3C log files into NCSA format with the convlog utility, but in that case the browser and OS usage data is lost from the report.
2. Samurize - I am a bit perplexed about how to configure it to report on the W3C log files. There is also a lack of good tutorials or other information on that software.
Please recommend tools that have worked for you and that ideally cover at least some of the requirements above.
Thanks.
Nintex has a program - no idea if it's any good (but their workflow does rock).