HttpWebRequest SSL Authorization form - ssl

I've never tried this before, but now I really need to get through the authorization on Sprint's site (www.sprint.com).
Could you guys help me understand how this actually works?
I'm trying the code below, but I'm obviously missing something, whether it's about cookies,
SSL, or something else; I don't know.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(
    "https://sso.sprintpcs.com/sso/Login.do");
CookieContainer cookieContainer = new CookieContainer();
webRequest.CookieContainer = cookieContainer;
webRequest.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; chromeframe; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Tablet PC 2.0; .NET4.0C; .NET4.0E)";
webRequest.Accept = "image/jpeg, application/x-ms-application, image/gif, application/xaml+xml, image/pjpeg, application/x-ms-xbap, application/x-shockwave-flash, application/vnd.ms-excel, application/msword, */*";
webRequest.Method = "POST";
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.Host = "manage.sprintpcs.com";
string strUserId = "kindauser";
string strPass = "kindapass";
ASCIIEncoding encoding = new ASCIIEncoding();
string postData = "userid=" + strUserId + "&password="
    + strPass + "&userExperince=USC&allowlogin=false";
byte[] data = encoding.GetBytes(postData);
using (Stream requestStream = webRequest.GetRequestStream())
{
    requestStream.Write(data, 0, data.Length);
}
HttpWebResponse myHttpWebResponse = (HttpWebResponse)webRequest.GetResponse();

I would do the following, and this applies to all cases where you want to interact with a website.
1) Get Firefox, along with the Firebug extension.
2) Clear Firefox's content and cookie cache.
3) Use Firefox to walk through the scenario, e.g. logging into the website.
4) At this point, Firebug shows you the exact sequence of requests sent, along with the cookie headers, etc.
5) Now try to replicate this using code.
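In code, that last step usually reduces to the pattern below. This is only a sketch: the URL and the form-field names are placeholders for whatever Firebug actually captured.

```csharp
using System.IO;
using System.Net;
using System.Text;

// Sketch of replaying a captured login POST. The URL and field names are
// placeholders -- use exactly what Firebug showed for the real site.
var cookies = new CookieContainer();

var request = (HttpWebRequest)WebRequest.Create("https://example.com/login");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.CookieContainer = cookies;   // shared container for the whole session

byte[] body = Encoding.ASCII.GetBytes("user=me&pass=secret");
request.ContentLength = body.Length;
using (Stream stream = request.GetRequestStream())
{
    stream.Write(body, 0, body.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
{
    // The server's Set-Cookie headers land in the CookieContainer
    // automatically; reuse the same container for every follow-up request.
}
```

The key point is that one CookieContainer instance is assigned to every request in the sequence, so the session cookies from the login carry over.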

Related

Cannot load page with either WebClient or HttpWebRequest

Regardless of whether I use WebClient or HttpWebRequest, loading this page times out. What am I doing wrong? It can't be https, since other https sites load just fine.
Below is my latest attempt, which adds all headers that I see in Firefox's inspector.
One interesting behavior is that I cannot monitor this with Fiddler, because everything works properly when Fiddler is running.
Using client As WebClient = New WebClient()
    client.Headers(HttpRequestHeader.Accept) = "text/html, image/png, image/jpeg, image/gif, */*;q=0.1"
    client.Headers(HttpRequestHeader.UserAgent) = "Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12"
    client.Headers(HttpRequestHeader.AcceptLanguage) = "en-US;en;q=0.5"
    client.Headers(HttpRequestHeader.AcceptEncoding) = "gzip, deflate, br"
    client.Headers(HttpRequestHeader.Referer) = "http://www.torontohydro.com/sites/electricsystem/Pages/foryourhome.aspx"
    client.Headers("DNT") = "1"
    client.Headers(HttpRequestHeader.KeepAlive) = "keep-alive"
    client.Headers(HttpRequestHeader.Upgrade) = "1"
    client.Headers(HttpRequestHeader.CacheControl) = "max-age=0"
    Dim x = New Uri("https://css.torontohydro.com/")
    Dim data As String = client.DownloadString(x)
End Using
All of this is excess code. Boiling it down to just a couple of lines causes the same hang.
Using client As WebClient = New WebClient()
    Dim data As String = client.DownloadString("https://css.torontohydro.com")
End Using
And this is the HttpWebRequest code, in a nutshell, which also hangs getting the response.
Dim getRequest As HttpWebRequest = CreateWebRequest("https://css.torontohydro.com/")
getRequest.CachePolicy = New Cache.RequestCachePolicy(Cache.RequestCacheLevel.BypassCache)
Using webResponse As HttpWebResponse = CType(getRequest.GetResponse(), HttpWebResponse)
    'no need for any more code, since the above line is where things hang
End Using
So this ended up being due to the project still targeting .NET 3.5. Because the URL is HTTPS, .NET was trying to negotiate the connection with the old SSL 3.0/TLS 1.0 defaults, which this server no longer accepts. Adding this line fixed the problem:
ServicePointManager.SecurityProtocol = 3072
I had to use the raw value 3072, since .NET 3.5 does not contain a definition for SecurityProtocolType.Tls12.
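Written out fully, the fix needs an explicit cast on older frameworks, since 3072 is simply the underlying value that later framework versions expose as a named enum member. A sketch (on .NET 4.5+ the named member can be used directly):

```csharp
using System.Net;

// On .NET 3.5 the Tls12 enum member does not exist, so cast the raw value.
// 3072 is the underlying value of SecurityProtocolType.Tls12 on .NET 4.5+.
ServicePointManager.SecurityProtocol = (SecurityProtocolType)3072;

// On .NET 4.5 or later, prefer the named member instead:
// ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
```

This is a process-wide setting, so it affects every HttpWebRequest and WebClient created afterwards in the same AppDomain.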

How to handle Compressed Request in WCF service

We have a WCF REST service hosted on IIS 7 with .NET Framework 4.5. The client sends data in GZip-compressed format with these request headers:
Content-Encoding: gzip
Content-Type: application/xml
But we get a Bad Request from the server whenever the request body is compressed. We enabled request decompression with an IHttpModule implementation that filters/modifies incoming requests. From my understanding, this fails because WCF uses the original content length (that of the compressed data) instead of the length of the decompressed data. So here are my questions:
Is there any way we can fix this content-length issue in IIS 7/.NET 4.5? My HTTP module implementation is given below:
httpApplication.Request.Filter = New GZipStream(httpApplication.Request.Filter, CompressionMode.Decompress)
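For context, wiring that filter into a complete module might look roughly like this. This is a sketch with a hypothetical class name, assuming the classic ASP.NET pipeline; note that it deliberately shows the limitation described above, since setting Request.Filter does not change the Content-Length header.

```csharp
using System;
using System.IO.Compression;
using System.Web;

// Hypothetical module name; register it in web.config under
// system.webServer/modules for the integrated pipeline.
public class GZipDecompressModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpRequest request = ((HttpApplication)sender).Request;
            if (string.Equals(request.Headers["Content-Encoding"], "gzip",
                              StringComparison.OrdinalIgnoreCase))
            {
                // Decompress the body transparently as it is read.
                // NOTE: this does NOT update the Content-Length header,
                // which still reflects the compressed size -- exactly the
                // mismatch that WCF trips over.
                request.Filter = new GZipStream(request.Filter,
                                                CompressionMode.Decompress);
            }
        };
    }

    public void Dispose() { }
}
```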
If fixing the content-length issue is not possible on the server side, is there any way I can send the original content length from the client along with the compressed request? The client-side implementation is as follows:
if (useCompression)
{
    // Headers must be set before the request stream is opened,
    // otherwise the header never makes it onto the wire.
    serviceRequest.Headers.Add("Content-Encoding", "gzip");
}
using (Stream requeststream = serviceRequest.GetRequestStream())
{
    if (useCompression)
    {
        using (GZipStream zipStream = new GZipStream(requeststream, CompressionMode.Compress))
        {
            zipStream.Write(bytes, 0, bytes.Length);
        }
    }
    else
    {
        requeststream.Write(bytes, 0, bytes.Length);
    }
}
I was able to get gzip working in WCF using wsHttpBinding and this as the base of the web request:
private HttpWebRequest GetWebRequest()
{
    HttpWebRequest objHTTPReq = (HttpWebRequest)WebRequest.CreateDefault(_URI);
    objHTTPReq.ContentType = "text/xml; charset=\"utf-8\"";
    objHTTPReq.Method = "POST";
    objHTTPReq.Accept = "gzip, deflate";
    objHTTPReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.2; OfficeLiveConnector.1.3;OfficeLivePatch.0.0; Zune 3.0; MS-RTC LM 8)";
    objHTTPReq.Headers.Add("SOAPAction", "http://xxx.yyyy.zzzz");
    return objHTTPReq;
}
So give that a try. Good luck.

VB.NET Update Cookies from POST

I've been working on this problem for the last few days and finally made some progress. Today I managed to force the cookie through with the request, and the server finally authenticated it. However, I am unable to capture the updated cookies and carry the authenticated cookies over to the next few pages.
'post form data to page
strUrl = "https://e926.net/user/authenticate"
webRequest2 = HttpWebRequest.Create(strUrl)
webRequest2.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.3) Gecko/20100401 Firefox/4.0 (.NET CLR 3.5.30729)"
webRequest2.AllowAutoRedirect = True
webRequest2.Method = WebRequestMethods.Http.Post
webRequest2.ContentType = "application/x-www-form-urlencoded"
webRequest2.CookieContainer = cookies
webRequest2.ContentLength = postData.Length
requestWriter = New StreamWriter(webRequest2.GetRequestStream())
requestWriter.Write(postData)
requestWriter.Close()
Dim response2 As HttpWebResponse = CType(webRequest2.GetResponse(), HttpWebResponse)
Dim strCookies2 As String = response2.Headers("Set-Cookie")
MsgBox(strCookies2)
'pull the e926 session value out of the Set-Cookie header
strCookies2 = System.Text.RegularExpressions.Regex.Split(strCookies2, "((e926=.*))")(1)
strCookies2 = strCookies2.Split(";"c)(0)
strCookies2 = strCookies2.Replace("e926=", "")
'rebuild the cookie and stash it in the shared container
Dim cookie As New Cookie()
cookie.Name = "e926"
cookie.Value = strCookies2
cookie.Domain = ".e926.net"
cookie.HttpOnly = True
cookie.Path = "/"
cookies.Add(cookie)
'receive authenticated cookie
response2.Close()
This is the page code that actually submits the login details and handles the login request. I can see in Fiddler that the 'user' cookie is sent and the 'e926/auth' cookie is updated, but I have been unable to get the updated cookies from the headers or by any other method I've tried.
The page is PHP and doesn't allow GET requests; those wouldn't help anyway, since the cookies never seem to transfer properly, and the cookies have to be updated from this request.
So my question is: how do I get the updated cookies from the page in VB.NET?
All I had to do was change AllowAutoRedirect from True to False, and it forced the cookies to be gathered from the 'auth' page as opposed to the 'home' page.
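The reason this works: with auto-redirect enabled, GetResponse() returns the final page after the redirect, so the Set-Cookie headers of the intermediate authentication response are no longer visible on the response object. A sketch of the manual-redirect pattern (C#, with a placeholder URL):

```csharp
using System.Net;

// Sketch: disable auto-redirect so the Set-Cookie headers of the
// intermediate authentication response remain visible, instead of being
// consumed during the redirect hop.
var cookies = new CookieContainer();
var request = (HttpWebRequest)WebRequest.Create("https://example.com/login");
request.Method = "POST";
request.CookieContainer = cookies;
request.AllowAutoRedirect = false;   // the key change

using (var response = (HttpWebResponse)request.GetResponse())
{
    // For a 302, these headers come from the auth step itself.
    string setCookie = response.Headers["Set-Cookie"];
    string nextUrl = response.Headers["Location"];
    // Follow the redirect manually with a new request that reuses
    // the same CookieContainer.
}
```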

Basic Authentication Webpage Login VB.NET

Hey all, I am having an issue trying to automate our UPS installations. The web page uses basic authentication and prompts for a login when it loads. We do not have access to the registry to enable this feature in IE, since it was disabled. I have tried using an HttpWebRequest and response to pull the cookie, but the server never appears to send one back. My plan was to feed that cookie to the web browser control so it wouldn't ask for the login. Here is my code for that:
Dim request As HttpWebRequest = DirectCast(WebRequest.Create("http://10.106.206.249"), HttpWebRequest)
Dim mycache = New CredentialCache()
mycache.Add(New Uri("http://10.106.206.249"), "Basic", New NetworkCredential("User", "Pass"))
request.Credentials = mycache
request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.14) Gecko/20080404 Firefox/2.0.0.14"
request.CookieContainer = New CookieContainer()
Dim response As HttpWebResponse = CType(request.GetResponse(), HttpWebResponse)
Dim cook As Cookie
For Each cook In response.Cookies
    Console.WriteLine("Cookie:")
    Console.WriteLine("{0} = {1}", cook.Name, cook.Value)
    Console.WriteLine("Domain: {0}", cook.Domain)
    Console.WriteLine("Path: {0}", cook.Path)
    Console.WriteLine("Port: {0}", cook.Port)
    Console.WriteLine("Secure: {0}", cook.Secure)
    Console.WriteLine("When issued: {0}", cook.TimeStamp)
    Console.WriteLine("Expires: {0} (expired? {1})", cook.Expires, cook.Expired)
    Console.WriteLine("Don't save: {0}", cook.Discard)
    Console.WriteLine("Comment: {0}", cook.Comment)
    Console.WriteLine("Uri for comments: {0}", cook.CommentUri)
    Console.WriteLine("Version: RFC {0}", IIf(cook.Version = 1, "2109", "2965"))
    ' Show the string representation of the cookie.
    Console.WriteLine("String: {0}", cook.ToString())
Next cook
I know this works to some extent, because if I use incorrect creds an unauthorized error is thrown. So it appears that either I am not catching the cookie or one is not being sent.
Another way I have tried is sending a header with a regular Web.Navigate, but that just acts like it is loading the page and prompts for the login:
'Note: the scheme is "Basic " followed by the base64 credentials; a colon
'after "Basic" would make the header invalid.
Dim authHeader As String = "Authorization: Basic " & System.Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes("User:Pass")) & Chr(13) & Chr(10)
Web.Navigate("http://10.106.206.249", False, Nothing, authHeader)
Anyone have any insight to see if maybe I am just doing something wrong here?
A simpler solution would be this:
Web.Navigate("http://Administrator:retail#10.106.206.249")
Note that if you have an #-sign in your password, you'll have to URL-encode it. (I'm not 100% sure whether the password will still work then.)
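For the #-sign case, the encoding can be sketched like this (C#; the credentials below are made up for illustration):

```csharp
using System;

// Sketch: embedding Basic-auth credentials in the URL. The password must be
// URL-encoded if it contains reserved characters such as '#', '@' or ':'.
string user = "Administrator";
string pass = "ret#ail";  // hypothetical password containing '#'
string url = "http://" + Uri.EscapeDataString(user) + ":"
           + Uri.EscapeDataString(pass) + "@10.106.206.249";
// -> http://Administrator:ret%23ail@10.106.206.249
```

Without the encoding, everything after the '#' would be treated as a URL fragment rather than as part of the password.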

Login to gmail account

I need to be able to log in to my Gmail account; then I'll get the cookies and have access to other Google services. But I can't log in to my Gmail (or any Google) account. I found some posts on this site about how to do it, but none of them works for me. I do:
string formUrl = "https://www.google.com/accounts/ServiceLoginAuth";
string formParams = string.Format("Email={0}&Passwd={1}&signIn={2}&PersistentCookie={3}&GALX={4}",
"autokuzov.top", "1QAZ2wsx", "Sign in", "yes", "CfFosrEhu-0");
string cookieHeader;
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Referer = "https://www.google.com/accounts/ServiceLoginAuth";
req.Method = "POST";
req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; ru; rv:1.9.2.7) Gecko/20100713 Firefox/3.6.7";
req.AllowAutoRedirect = false;
req.CookieContainer = new CookieContainer();
req.Headers.Add(HttpRequestHeader.CacheControl, "no-cache=set-cookie");
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
{
string s = sr.ReadToEnd();
}
The response returns: "Your browser's cookie functionality is turned off. Please turn it on."
I also tried adding req.Headers.Add(HttpRequestHeader.CacheControl, "no-cache=set-cookie"); but that was unsuccessful too.
Does anybody know what the problem is?
"Your browser's cookie functionality
is turned off. Please turn it on."
You will probably need to have third-party cookies enabled in your browser; these are off by default in some browsers. You get the same warning in Firefox when using the Gmail Manager plugin if you disable third-party cookies.