I'm trying to geocode a lot of data. I have a lot of machines across which to spread the load (so that I won't go over the 2,500 requests per IP address per day). I am using a script to make the requests with either wget or cURL. However, both wget and cURL yield the same "request denied" message. That being said, when I make the request from my browser, it works perfectly. An example request is:
wget http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=true
And the resulting output is:
[1] 93930
05:00 PM ~: --2011-12-19 17:00:25-- http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA
Resolving maps.googleapis.com... 72.14.204.95
Connecting to maps.googleapis.com|72.14.204.95|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [application/json]
Saving to: `json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA'
[ <=> ] 54 --.-K/s in 0s
2011-12-19 17:00:25 (1.32 MB/s) - `json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA' saved [54]
The file it wrote to only contains:
{
"results" : [],
"status" : "REQUEST_DENIED"
}
Any help is much appreciated.
The '&' character that separates the address and sensor parameters isn't being passed to wget; instead it tells your shell to run wget in the background. The resulting query is therefore missing the required 'sensor' parameter, which must be set to true or false. Quote the URL so the shell treats it as a single argument:
wget "http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=false"
Related
When running an application in EKS, the container logs are coming in the format of:
{
  log: 2021-08-04 12:28:24,803INFO hostname com.application.RequestListener Sent response with request_id=1
  stream: stdout
  time: 2021-08-04T12:28:24.803533339Z
}
As I am only interested in the value relating to log, is there a way I can extract only:
2021-08-04 12:28:24,803INFO hostname com.application.RequestListener Sent response with request_id=1
I have tried setting source to log in outputs.conf; however, this does not seem to work and I still see the JSON format in the Splunk search head.
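This doesn't address the Splunk/outputs.conf side, but for reference, pulling just the log field out of a record like that is straightforward once it is valid JSON, for example with jq (the sample record below is reconstructed from your output, so treat it as an illustration):
echo '{"log":"2021-08-04 12:28:24,803INFO hostname com.application.RequestListener Sent response with request_id=1","stream":"stdout","time":"2021-08-04T12:28:24.803533339Z"}' | jq -r '.log'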
(Sorry for my bad English.)
I followed these instructions:
https://support.google.com/domains/answer/6147083?hl=it
(I made a dynamic DNS entry.)
I made a script in Linux with:
https://username:password@domains.google.com/nic/update?hostname=www.systemcamera.org
I hoped it would be OK, but the domain doesn't work: when I type www.systemcamera.org nothing comes up, but when I type the IP address it works.
In the script I typed:
wget https://username:pswd@domains.google.com/nic/update?hostname=www.systemcamera.org -O dns_update_result$
I ran it (./script), but I can't find my website.
This is the result when I run the script:
--2018-06-26 13:14:31-- https://5RodFTlBOsTBB1gL:password@domains.google.com/nic/update?hostname=www.systemcamera.org
Resolving domains.google.com (domains.google.com)... 172.217.23.110, 2a00:1450:4002:800::200e
Connecting to domains.google.com (domains.google.com)|172.217.23.110|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Authentication selected: Basic realm="Google Domains Dynamic Dns (www.systemcamera.org)"
Reusing existing connection to domains.google.com:443.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/plain]
Saving to: ‘dns_update_results.txt’
dns_update_results.txt [ <=> ] 19 --.-KB/s in 0s
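For comparison, a minimal version of the update call, with the URL quoted and the generated credentials passed through wget's own options, might look like the following sketch (USERNAME and PASSWORD are placeholders for the values Google Domains generates for the host):
wget -q -O - --user=USERNAME --password=PASSWORD "https://domains.google.com/nic/update?hostname=www.systemcamera.org"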
I have a URL like
http:\/\/23.246.50.145\/?o=AQFsBQ9SB1l-S4Ch0gw3lM0zSs4ReWo3_PsOqoF35YR0eHrAqxQ7GIRonzVp_nrrJ4m9cKer-YAmV-rgYJXHJ1NE9JjLqf78Jp7l9Y-z2njJVV2CpXDAPNoh91iqrDA2vRuNdlvYbbSJ4Sj5Mp-xiegEeAKXnQ&v=3&e=1494051078&t=XUSkPrEHFE2Tas3F9KxR7CPINPI
If I type this URL in a browser, it starts downloading a file that is named just download, which is actually a music file, so I just rename it to .mp4.
If I try to use wget, it won't work:
wget "http:\/\/23.246.50.145\/?o=AQFsBQ9SB1l-S4Ch0gw3lM0zSs4ReWo3_PsOqoF35YR0eHrAqxQ7GIRonzVp_nrrJ4m9cKer-YAmV-rgYJXHJ1NE9JjLqf78Jp7l9Y-z2njJVV2CpXDAPNoh91iqrDA2vRuNdlvYbbSJ4Sj5Mp-xiegEeAKXnQ&v=3&e=1494051078&t=XUSkPrEHFE2Tas3F9KxR7CPINPI"
and I get this error:
--2017-05-04 09:10:51-- http://23.246.50.145/?o=AQFsBQ9SB1l-S4Ch0gw3lM0zSs4ReWo3_PsOqoF35YR0eHrAqxQ7GIRonzVp_nrrJ4m9cKer-YAmV-rgYJXHJ1NE9JjLqf78Jp7l9Y-z2njJVV2CpXDAPNoh91iqrDA2vRuNdlvYbbSJ4Sj5Mp-xiegEeAKXnQ&v=3&e=1494051078&t=XUSkPrEHFE2Tas3F9KxR7CPINPI
Connecting to 23.246.50.145:80... connected.
HTTP request sent, awaiting response... 420
2017-05-04 09:10:51 ERROR 420: (no description).
Why am I getting this error?
I got a forbidden error as well. You must use ' instead of ", like this:
wget -d 'http://23.246.50.145/?o=AQFsBQ9SB1l-S4Ch0gw3lM0zSs4ReWo3_PsOqoF35YR0eHrAqxQ7GIRonzVp_nrrJ4m9cKer-YAmV-rgYJXHJ1NE9JjLqf78Jp7l9Y-z2njJVV2CpXDAPNoh91iqrDA2vRuNdlvYbbSJ4Sj5Mp-xiegEeAKXnQ&v=3&e=1494051078&t=XUSkPrEHFE2Tas3F9KxR7CPINPI'
I'm trying to get an Authentication Token in order to start querying some information.
The problem is that when I execute the Token Script in Linux and I type my name and password, the server doesn't give me the token. I just get an empty space.
Does anybody know how to proceed?
EDIT: I just type
wget --no-check-certificate https://raw.githubusercontent.com/fgalan/oauth2-example-orion-client/master/token_script.sh
bash token_script.sh
in the command window in Ubuntu. The script asks me for a username and a password. When I enter them, I just receive an empty space.
Thank you very much
[root@fi-next /]# wget --no-check-certificate https://raw.githubusercontent.com/fgalan/oauth2-example-orion-client/master/token_script.sh
--2017-08-01 09:59:27-- https://raw.githubusercontent.com/fgalan/oauth2-example-orion-client/master/token_script.sh
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 288 [text/plain]
Saving to: ‘token_script.sh’
100%[====================================>] 288 --.-K/s in 0s
2017-08-01 09:59:27 (25.1 MB/s) - ‘token_script.sh’ saved [288/288]
[root@fi-next /]# bash token_script.sh
Username: fernando.mendez.external@atos.net
Password:
Token: xxxxxxxxxxxxxxxxxxxxxxxxxxx
Hi, it worked for me.
You must have an account in FIWARE Lab.
Regards
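For context, token_script.sh in that repository is just a small wrapper that sends your FIWARE Lab credentials as a password-grant request; a rough manual equivalent might look like the following (the endpoint URL and field names here are assumptions based on that example, so check the script itself before relying on them):
curl -s --insecure -X POST "https://orion.lab.fiware.org/token" -H "Content-Type: application/x-www-form-urlencoded" -d "grant_type=password&username=YOUR_EMAIL&password=YOUR_PASSWORD"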
Currently we have an Apache 2.2.3 server with mod_ssl 2.2.3 running Django, with users authenticating by using an x509 certificate.
So far the system is running perfectly except for a single user who, when trying to upload a file, receives a 400 Bad Request error; the contents of the ssl_error_log regarding this operation are:
[<date>] [error] [client <client ip>] request failed: error reading the headers, referer: <referrer url>
The contents of the ssl_access_log are:
<client ip> - - [<date>] "POST <target page> HTTP/1.1" 400 321
Also, the user's browser is Firefox as far as I know.
I am completely unable to reproduce this bug and so far none of the other users have experienced it. Could you point out some reasons for this to happen?
I've experienced connectivity that cuts off the upstream after X bytes are sent. X was a pretty low value: enough to request some simple pages, but not enough to handle AJAX requests, much less upload files. As far as I recall, this connectivity problem occurred only when tethering (from one specific Android phone; I didn't even test other phones).
So if the upstream gets interrupted and the upload stalls, it makes sense that Apache would return this error, according to this post: "Apache waits a time equal to the Timeout directive (defaults to 5 minutes if not defined) for a response from the client. It is likely Apache is waiting for the CRLF that indicates the end of the headers, yet it is never received."
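If you want to test that theory, one option (a sketch assuming Apache 2.2 and that the client really is stalling mid-request, not a confirmed fix) is to lower the Timeout directive so the failing upload errors out quickly enough to correlate with that user's reports:
# httpd.conf (Apache 2.2): the default Timeout is 300 seconds (5 minutes)
Timeout 60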