lfd High 5 minute load average alert - 6.43 - apache

I hope I'm in the right place.
I'm getting a notification from my VPS server about a high load, with an apachestatus.html file containing the info below:
Srv PID Acc M CPU SS Req Dur Conn Child Slot Client Protocol VHost Request
0-5 30706 4/420/7207 K 74.78 2 276 6462984 19.8 8.93 330.71 162.158.134.120 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 0/417/7174 W 74.58 0 0 6900859 0.0 12.54 333.76 172.68.118.172 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 4/408/7155 K 74.77 2 327 6759891 19.4 8.37 253.45 172.70.131.21 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/503/7197 _ 74.66 5 426 6972175 0.0 10.28 374.51 141.101.68.26 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 1/399/7251 K 74.77 2 266 6528061 8.1 14.55 291.27 172.69.69.41 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/476/7234 _ 74.64 5 523 6451687 0.0 10.01 262.18 172.68.118.76 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 2/434/7173 K 74.76 3 265 6972079 12.3 11.78 306.76 108.162.242.9 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 1/425/7154 K 74.76 3 476 6313351 7.6 9.34 342.79 162.158.118.180 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 1/414/7221 K 74.81 0 720 6698788 8.1 9.59 273.30 162.158.62.32 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/468/7240 _ 74.60 8 586 7918329 0.0 10.25 293.01 172.69.69.199 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/425/7207 _ 74.61 7 248 6691162 0.0 10.99 414.17 172.70.222.194 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 6/411/7239 K 74.82 0 955 6807718 27.3 10.01 357.15 172.69.69.185 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 1/452/7217 K 74.81 0 1123 6568176 8.1 10.20 293.10 172.70.110.180 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 2/414/7224 K 74.80 1 369 7153413 11.4 11.21 331.23 172.70.82.54 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 1/402/7203 K 74.74 4 566 6535237 8.1 7.79 263.32 172.70.114.138 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 2/428/7259 K 74.76 4 717 6779567 11.9 11.26 392.65 172.70.178.78 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 2/416/7125 K 74.76 3 527 6258884 12.3 9.67 251.22 162.158.134.142 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/422/7143 _ 74.69 2 849 6620206 0.0 25.50 448.90 141.101.69.169 http/1.1 site1.example.com:443 GET /yf3q920TPF? HTTP/1.1
0-5 30706 0/395/7124 _ 74.64 6 494 6426369 0.0 132.05 358.19 172.70.54.252 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 0/417/7151 _ 74.67 4 1589 6614715 0.0 10.91 286.53 108.162.216.190 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/399/7163 _ 74.67 3 1056 6955007 0.0 7.52 291.21 172.70.54.2 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/409/7272 W 74.55 2 0 6460489 0.0 26.08 257.75 172.70.54.16 http/1.1 site1.example.com:443 POST / HTTP/1.1
0-5 30706 2/438/7279 K 74.82 0 927 6925756 11.4 15.17 278.39 172.70.127.11 http/1.1 site1.example.com:443 GET /auth/signin HTTP/1.1
0-5 30706 0/399/7107 _ 74.61 8 683 6887484 0.0 9.19 385.11 172.70.130.56 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
0-5 30706 0/427/7302 W 74.60 0 0 7626256 0.0 10.43 305.48 127.0.0.1 http/1.1 server.example.com:80 GET /whm-server-status HTTP/1.1
1-5 30734 0/391/7145 _ 72.13 2 869 6507383 0.0 8.44 312.92 172.70.54.2 http/1.1 site1.example.com:443 POST /auth/signin HTTP/1.1
1-5 30734 1/406/7146 K 72.16 4 2000 7151012 87.6 9.10 435.53 172.71.114.58 http/1.1 site2.example.com:443 POST / HTTP/1.1
It looks like an attack, but the source addresses are Cloudflare IPs
(I'm using the Cloudflare service to protect my websites).
How can I prevent this from happening again?

Is there a way to retrieve the latest of the first `status` value from the table?

I have been trying to write a query that returns the first matching status record after the latest of another status. Say I have the table of records below; I would like to retrieve the APP_ID and INSERTED_AT columns where the STATUS is the first 'Offer Sent' that occurs after the latest 'Selectable' status, and any 'Offer Declined' rows should be ignored.
ID  APP_ID  INSERTED_AT                    UPDATED_AT                     STATUS
1   209377  2021-05-13 20:57:30.000 +0530  2021-05-13 20:57:30.000 +0530  Selectable
2   209377  2021-05-13 20:57:58.000 +0530  2021-05-13 20:57:58.000 +0530  Selectable
3   209377  2021-05-14 18:40:08.000 +0530  2021-05-14 18:40:08.000 +0530  Offer Eligible
4   209377  2021-05-14 18:40:14.000 +0530  2021-05-14 18:40:14.000 +0530  Offer Sent
5   209377  2021-05-15 18:57:50.000 +0530  2021-05-15 18:57:50.000 +0530  Offer Sent
6   209377  2021-05-15 20:44:29.000 +0530  2021-05-15 20:44:29.000 +0530  Offer Sent
7   209377  2021-05-17 22:45:13.000 +0530  2021-05-17 22:45:13.000 +0530  Offer Accepted
8   110011  2021-05-13 20:57:30.000 +0530  2021-05-13 20:57:30.000 +0530  Selectable
9   110011  2021-05-13 20:57:58.000 +0530  2021-05-13 20:57:58.000 +0530  Offer Eligible
10  110011  2021-06-14 18:40:08.000 +0530  2021-05-14 18:40:08.000 +0530  Offer Sent
11  300110  2021-05-14 18:40:14.000 +0530  2021-05-14 18:40:14.000 +0530  Selectable
12  300110  2021-05-15 18:57:50.000 +0530  2021-05-15 18:57:50.000 +0530  Offer Eligible
13  300110  2021-05-15 20:44:29.000 +0530  2021-05-15 20:44:29.000 +0530  Offer Sent
14  300110  2021-05-17 22:45:13.000 +0530  2021-05-17 22:45:13.000 +0530  Offer Declined
Here are the expected results I am looking for:
APP_ID  INSERTED_AT                    STATUS
209377  2021-05-14 18:40:14.000 +0530  Offer Sent
110011  2021-05-14 18:40:08.000 +0530  Offer Sent
300110  nil                            Offer Declined
Try this:
-- "records" stands for your table name; for each APP_ID this returns the first
-- 'Offer Sent' row that comes after that APP_ID's latest 'Selectable' row
SELECT APP_ID, INSERTED_AT, STATUS
FROM (
    SELECT APP_ID, INSERTED_AT, STATUS,
           ROW_NUMBER() OVER (PARTITION BY APP_ID ORDER BY INSERTED_AT) AS rn
    FROM records t
    WHERE STATUS = 'Offer Sent'
      AND INSERTED_AT > (SELECT MAX(INSERTED_AT)
                         FROM records s
                         WHERE s.APP_ID = t.APP_ID
                           AND s.STATUS = 'Selectable')
) x
WHERE rn = 1

How to select data for specific time intervals after using Pandas' resample function?

I used Pandas' resample function to calculate the sales of a list of products every 6 months.
I used resample('6M') together with apply({"column-name": "sum"}).
Now I'd like to create a table with the sum of the sales for the first six months.
How can I extract the sum of the first 6 months, given that all products have records for more than 3 years and none of them have the same start date?
Thanks in advance for any suggestions.
Here is an example of the data:
Product Date sales
Product 1 6/30/2017 20
12/31/2017 60
6/30/2018 50
12/31/2018 100
Product 2 1/31/2017 30
7/31/2017 150
1/31/2018 200
7/31/2018 300
1/31/2019 100
While waiting for your data, I worked on this. See if this is something that will be helpful for you.
import pandas as pd

df = pd.DataFrame({'Date': ['2018-01-10','2018-02-15','2018-03-18',
                            '2018-07-10','2018-09-12','2018-10-14',
                            '2018-11-16','2018-12-20','2019-01-10',
                            '2019-04-15','2019-06-12','2019-10-18',
                            '2019-12-02','2020-01-05','2020-02-25',
                            '2020-03-15','2020-04-11','2020-07-22'],
                   'Sales': [200,300,100,250,150,350,150,200,250,
                             200,300,100,250,150,350,150,200,250]})
# First break the data down into yearly quarters
df['YQtr'] = pd.PeriodIndex(pd.to_datetime(df.Date), freq='Q')
# Next create a column identifying the half year - H1 for Jan-Jun & H2 for Jul-Dec
df.loc[df['YQtr'].astype(str).str[-2:].isin(['Q1','Q2']), 'HYear'] = df['YQtr'].astype(str).str[:-2] + 'H1'
df.loc[df['YQtr'].astype(str).str[-2:].isin(['Q3','Q4']), 'HYear'] = df['YQtr'].astype(str).str[:-2] + 'H2'
# Do a cumulative sum per half year to get sales by H1 & H2 for each year
df['HYear_cumsum'] = df.groupby('HYear')['Sales'].cumsum()
# Now keep only the rows holding the max value; that's the H1 & H2 sales figure
df1 = df[df.groupby('HYear')['HYear_cumsum'].transform('max') == df['HYear_cumsum']]
print(df)
print(df1)
The output of this will be:
Source Data + Half Year cumulative sum:
Date Sales YQtr HYear HYear_cumsum
0 2018-01-10 200 2018Q1 2018H1 200
1 2018-02-15 300 2018Q1 2018H1 500
2 2018-03-18 100 2018Q1 2018H1 600
3 2018-07-10 250 2018Q3 2018H2 250
4 2018-09-12 150 2018Q3 2018H2 400
5 2018-10-14 350 2018Q4 2018H2 750
6 2018-11-16 150 2018Q4 2018H2 900
7 2018-12-20 200 2018Q4 2018H2 1100
8 2019-01-10 250 2019Q1 2019H1 250
9 2019-04-15 200 2019Q2 2019H1 450
10 2019-06-12 300 2019Q2 2019H1 750
11 2019-10-18 100 2019Q4 2019H2 100
12 2019-12-02 250 2019Q4 2019H2 350
13 2020-01-05 150 2020Q1 2020H1 150
14 2020-02-25 350 2020Q1 2020H1 500
15 2020-03-15 150 2020Q1 2020H1 650
16 2020-04-11 200 2020Q2 2020H1 850
17 2020-07-22 250 2020Q3 2020H2 250
The final cumulative sum for each half year, i.e. the half-year totals:
Date Sales YQtr HYear HYear_cumsum
2 2018-03-18 100 2018Q1 2018H1 600
7 2018-12-20 200 2018Q4 2018H2 1100
10 2019-06-12 300 2019Q2 2019H1 750
12 2019-12-02 250 2019Q4 2019H2 350
16 2020-04-11 200 2020Q2 2020H1 850
17 2020-07-22 250 2020Q3 2020H2 250
I will look at your sample data and work on it later tonight.
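For the original question (the total over each product's first six months), one simple option is to take the earliest resampled row per product, since each product starts on a different date. Here is a minimal sketch, assuming a flat frame with the Product, Date and sales columns from the example (the values are just the sample data re-typed):
import pandas as pd

# Hypothetical frame in the shape of the question's 6-monthly example
six_monthly = pd.DataFrame({
    'Product': ['Product 1']*4 + ['Product 2']*5,
    'Date': pd.to_datetime(['2017-06-30', '2017-12-31', '2018-06-30', '2018-12-31',
                            '2017-01-31', '2017-07-31', '2018-01-31', '2018-07-31', '2019-01-31']),
    'sales': [20, 60, 50, 100, 30, 150, 200, 300, 100]})

# "First six months" = the earliest 6-month bin of each product
first_period = (six_monthly.sort_values('Date')
                           .groupby('Product', as_index=False)
                           .first())
print(first_period)
Because the data is already aggregated into 6-month bins, taking the first row per product is all that is needed.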

Pandas - Slice between two indexes

I need to process data with TensorFlow for classification, so I need to create a DataFrame for each unit that was processed in my machine. The machine continuously writes process data and also logs when a unit enters and leaves the machine.
A value in 'uid_in' means the unit with the logged number entered the machine; a value in 'uid_out' means the unit left the machine.
I need to create a DataFrame like this for each unit processed by the machine.
[...]
time uhz1 uhz2 lhz1 lh2 uid_in uid_out
5 08:05:00 201 200 101 100 1.0 NaN #Unit1 enters the machine
6 08:06:00 201 200 99 101 2.0 NaN
[...]
14 08:14:00 199 199 99 101 10.0 NaN
15 08:15:00 201 201 100 100 11.0 1.0 #Unit1 leaves the machine
[...]
How can I create the DataFrame df.loc[enter:leave] for each unit automatically?
When I try to pass the following selections to df.loc, it does not work:
start = df[df.uid_in.isin([123])]
end = df[df.uid_out.isin([123])]
unit1_df = df.loc[start:end]
Your code almost worked out!
Original DataFrame:
time uhz1 uhz2 lhz1 lh2 uid_in uid_out
0 08:00:00 201 199 100 100 NaN NaN
1 08:01:00 199 199 100 99 NaN NaN
[...]
5 08:05:00 201 200 101 100 1.0 NaN
[...]
55 08:55:00 241 241 140 140 NaN 41.0
[...]
58 08:58:00 244 244 143 143 NaN NaN
59 08:59:00 245 245 144 144 NaN NaN
New code:
start = df[df.uid_in.eq(1.0)].index[0]
end = df[df.uid_out.eq(1.0)].index[0]
unit1_df = df.loc[start:end]
print(unit1_df)
Output
time uhz1 uhz2 lhz1 lh2 uid_in uid_out
5 08:05:00 201 200 101 100 1.0 NaN
6 08:06:00 201 200 99 101 2.0 NaN
[...]
14 08:14:00 199 199 99 101 10.0 NaN
15 08:15:00 201 201 100 100 11.0 1.0
I think you were pretty close. I modified your statements to pick out the index labels of start and end, as Ian indicated.
""" time uhz1 uhz2 lhz1 lh2 uid_in uid_out
5 08:05:00 201 200 101 100 1.0 NaN
6 08:06:00 201 200 99 101 2.0 NaN
14 08:14:00 199 199 99 101 10.0 NaN
15 08:15:00 201 201 100 100 11.0 1.0
"""
import pandas as pd

df = pd.read_clipboard()  # the data block above, copied to the clipboard
start = df[df.uid_in.eq(1.0)].index[0]   # index label where unit 1 enters
end = df[df.uid_out.eq(1.0)].index[0]    # index label where unit 1 leaves
unit1_df = df.loc[start:end]
unit1_df
Output:
time uhz1 uhz2 lhz1 lh2 uid_in uid_out
5 08:05:00 201 200 101 100 1.0 NaN
6 08:06:00 201 200 99 101 2.0 NaN
14 08:14:00 199 199 99 101 10.0 NaN
15 08:15:00 201 201 100 100 11.0 1.0
One-liner:
unit1_df = df.loc[df[df.uid_in.eq(1.0)].index[0]:df[df.uid_out.eq(1.0)].index[0]]
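To cover the "for each unit automatically" part of the question, a small loop over the logged unit ids can collect one slice per unit. This is only a sketch, assuming the column names from the question and that every unit that enters also leaves exactly once:
# Build one DataFrame per unit, keyed by unit id
unit_frames = {}
for uid in df['uid_in'].dropna().unique():
    entered = df[df['uid_in'].eq(uid)].index
    left = df[df['uid_out'].eq(uid)].index
    if len(entered) and len(left):
        unit_frames[uid] = df.loc[entered[0]:left[0]]

# e.g. unit_frames[1.0] is the unit-1 slice shown above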

How can we keep the index while calculating a 3-day moving average?

I have a data set like the one below and want to calculate the max of the 3-day moving average. I tried this code:
pd.rolling_mean(data['prec'], 3).max()
This code gives the moving average, but without the dates.
year month day prec
0 1981 1 1 1.5
1 1981 1 2 0.0
2 1981 1 3 0.0
3 1981 1 4 0.4
4 1981 1 5 0.0
5 1981 1 6 1.0
6 1981 1 7 1.9
7 1981 1 8 0.6
8 1981 1 9 3.7
9 1981 1 10 0.0
10 1981 1 11 0.0
11 1981 1 12 0.0
12 1981 1 13 0.0
13 1981 1 14 12.2
14 1981 1 15 1.7
15 1981 1 16 0.6
16 1981 1 17 0.9
17 1981 1 18 0.6
18 1981 1 19 0.4
19 1981 1 20 0.2
20 1981 1 21 1.4
21 1981 1 22 3.2
22 1981 1 23 0.0
The format I want is:
year month day prec
.... .. .. ...
Can anyone help solve this problem?
Assign the result of pd.rolling_mean or pd.rolling_max to a DataFrame column:
import pandas as pd
df = pd.read_table('data', sep='\s+')
df['moving average'] = pd.rolling_mean(df['prec'], 3)
df['max of moving average'] = pd.rolling_max(df['moving average'], 3)
yields
In [32]: df
Out[32]:
year month day prec moving average max of moving average
0 1981 1 1 1.5 NaN NaN
1 1981 1 2 0.0 NaN NaN
2 1981 1 3 0.0 5.000000e-01 NaN
3 1981 1 4 0.4 1.333333e-01 NaN
4 1981 1 5 0.0 1.333333e-01 0.500000
5 1981 1 6 1.0 4.666667e-01 0.466667
6 1981 1 7 1.9 9.666667e-01 0.966667
7 1981 1 8 0.6 1.166667e+00 1.166667
8 1981 1 9 3.7 2.066667e+00 2.066667
9 1981 1 10 0.0 1.433333e+00 2.066667
10 1981 1 11 0.0 1.233333e+00 2.066667
11 1981 1 12 0.0 1.480297e-16 1.433333
12 1981 1 13 0.0 1.480297e-16 1.233333
13 1981 1 14 12.2 4.066667e+00 4.066667
14 1981 1 15 1.7 4.633333e+00 4.633333
15 1981 1 16 0.6 4.833333e+00 4.833333
16 1981 1 17 0.9 1.066667e+00 4.833333
17 1981 1 18 0.6 7.000000e-01 4.833333
18 1981 1 19 0.4 6.333333e-01 1.066667
19 1981 1 20 0.2 4.000000e-01 0.700000
20 1981 1 21 1.4 6.666667e-01 0.666667
21 1981 1 22 3.2 1.600000e+00 1.600000
22 1981 1 23 0.0 1.533333e+00 1.600000
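Note that pd.rolling_mean and pd.rolling_max were removed in later pandas releases; in current versions the same computation is written with the rolling() method. A sketch of the equivalent, assuming the same 'data' file as above:
import pandas as pd

df = pd.read_table('data', sep=r'\s+')
df['moving average'] = df['prec'].rolling(3).mean()
df['max of moving average'] = df['moving average'].rolling(3).max()
print(df)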

WCF : Moving from IIS 7 to IIS 8

I have moved my WCF service from IIS 7 to IIS 8. I can browse to the .svc file, but I cannot browse to any of the other methods through GET or POST; it shows the error below:
The server encountered an error processing the request. See server logs for more details.
The log file is shown below:
Software: Microsoft Internet Information Services 8.5
Version: 1.0
Date: 2014-12-17 04:25:48
Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status sc-substatus sc-win32-status time-taken
2014-12-17 04:25:48 (ipaddress) GET /service - 786 - (ipaddress) Mozilla/5.0+(Windows+NT+6.3;+WOW64;+Trident/7.0;+rv:11.0)+like+Gecko - 301 0 0 120
2014-12-17 04:25:48 (ipaddress) GET /service/ - 786 - (ipaddress) Mozilla/5.0+(Windows+NT+6.3;+WOW64;+Trident/7.0;+rv:11.0)+like+Gecko - 200 0 0 3
2014-12-17 04:25:53 (ipaddress) GET /service/MposService.svc - 786 - (ipaddress) Mozilla/5.0+(Windows+NT+6.3;+WOW64;+Trident/7.0;+rv:11.0)+like+Gecko (ipaddress):786/service/ 200 0 0 904
2014-12-17 04:27:42 (ipaddress) GET /service/MposService.svc - 786 - publicip Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/39.0.2171.95+Safari/537.36 - 200 0 0 628
2014-12-17 04:27:42 (ipaddress) GET /favicon.ico - 786 - public ip Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/39.0.2171.95+Safari/537.36 - 404 0 2 470
2014-12-17 04:28:24 (ipaddress) GET /service/MposService.svc/getCustomer section=s1 786 - 117.213.26.161 Mozilla/5.0+(Windows+NT+6.1;+WOW64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/39.0.2171.95+Safari/537.36 - 400 0 0 640