WIF: LocalSTS not authenticating - asp.net-mvc-4

I'm totally new to WIF, and to start playing with it I tried the simplest "F5 experience" with an MVC4 application. According to the tutorials I found, which sadly for the most part refer to releases before .NET 4.5, I should just create an MVC app, configure it with the Identity and Access tool, and hit F5 to get up and running with a local STS. Yet I'm probably missing something obvious, because when I try to access a restricted page I always end up bumped back to the homepage.
Here is what I did; you can easily repro the issue with these steps (VS2012 on Win8 with the WIF SDK; be sure to launch VS with admin rights):
create a new ASP.NET MVC4 Internet application. Set its port to 7777 (just picking the port number used in most code samples, for convenience).
update all the NuGet packages (this is optional).
right click the solution, choose Identity and Access, set the IP to Local STS, and click OK. Then reopen the Identity and Access popup, choose to generate a controller, and click OK.
add an [Authorize] attribute to the About action of the Home controller.
hit F5 and click the About link. As expected, the login view appears, prompting me to log in; the only option is of course Local STS. When I click it, I am returned to the homepage and no authentication occurs. I can repeat the process, but nothing changes, so I can never access the secured About page.
The link underlying the localSTS anchor is:
http://localhost:14743/wsFederationSTS/Issue?wa=wsignin1.0&wtrealm=http%3a%2f%2flocalhost%3a7777%2f&wctx=rm%3d0%26id%3d664ff3c2-95b1-40b3-b538-a8357233ea7e%26ru%3dhttp%253a%252f%252flocalhost%253a7777%252f&wct=2013-03-10T13%3a39%3a32Z
AFAIK, its parameters look OK.
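As a quick sanity check, the sign-in URL can be decoded programmatically; a small Python sketch (the URL is the one quoted above):

```python
# Decode the WS-Federation sign-in URL and inspect its parameters.
from urllib.parse import urlsplit, parse_qs

url = ("http://localhost:14743/wsFederationSTS/Issue?wa=wsignin1.0"
       "&wtrealm=http%3a%2f%2flocalhost%3a7777%2f"
       "&wctx=rm%3d0%26id%3d664ff3c2-95b1-40b3-b538-a8357233ea7e"
       "%26ru%3dhttp%253a%252f%252flocalhost%253a7777%252f"
       "&wct=2013-03-10T13%3a39%3a32Z")

params = parse_qs(urlsplit(url).query)
print(params["wa"][0])       # wsignin1.0 (sign-in action)
print(params["wtrealm"][0])  # http://localhost:7777/ (the relying-party realm)

# wctx is itself a URL-encoded query string; decode it one more level:
ctx = parse_qs(params["wctx"][0])
print(ctx["ru"][0])          # http://localhost:7777/ (return URL after sign-in)
```

The wa, wtrealm, and nested ru values all point back at the relying party on port 7777, consistent with the parameters looking OK.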
If I examine the network traffic, I cannot see any relevant item (if I understand correctly, I would expect a response setting some cookies for the current session, representing the IClaimsPrincipal).
(BTW, looking at the web.config, I can see that under modules the WSFederationAuthenticationModule is referenced from System.IdentityModel.Services, which is NOT referenced by the solution after configuring Identity and Access; I suppose this is a bug in the tool. Anyway, I added a reference to it, but nothing changed.)
Update
Thank you for the reply! If I examine the traffic, here are the relevant GETs/POSTs; I get no cookie. I tried recreating the whole test solution, even skipping step #2 above to keep it minimal, but nothing changed.
As for your suggestion #2, I tried to add this in Global.asax:
FederatedAuthentication.WSFederationAuthenticationModule.SecurityTokenValidated
+= (sender, e) => FederatedAuthentication.SessionAuthenticationModule.IsReferenceMode = true;
but the SessionAuthenticationModule is null at the time this code executes, so a corresponding exception is thrown. I cannot find up-to-date code samples or articles about this, yet WIF seems a very promising technology, and I'd like it to be easy for security newbies like me. My main purpose is applying it to a site providing both MVC controllers and Web API controllers, to a wide range of consumers (JS code, mobile apps, WinRT apps, the site pages themselves...). Any suggestions?
(1) a GET which receives a 307 Temporary Redirect:
GET /wsFederationSTS/Issue?wa=wsignin1.0&wtrealm=http%3a%2f%2flocalhost%3a7777%2f&wctx=rm%3d0%26id%3d664ff3c2-95b1-40b3-b538-a8357233ea7e%26ru%3dhttp%253a%252f%252flocalhost%253a7777%252f&wct=2013-03-10T13%3a39%3a32Z HTTP/1.1
Host: localhost:14743
Connection: keep-alive
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.22 (KHTML, like Gecko) Chrome/25.0.1364.160 Safari/537.22
Referer: http://localhost:7777/HrdAuthentication/Login?ReturnUrl=%2fHome%2fAbout
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie:
(2) a GET with the sign-in request:
GET /wsFederationSTS/Issue/?wa=wsignin1.0&wtrealm=http%3a%2f%2flocalhost%3a7777%2f&wctx=rm%3d0%26id%3d664ff3c2-95b1-40b3-b538-a8357233ea7e%26ru%3dhttp%253a%252f%252flocalhost%253a7777%252f&wct=2013-03-10T13%3a39%3a32Z HTTP/1.1
Host: localhost:14743
Connection: keep-alive
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.22 (KHTML, like Gecko) Chrome/25.0.1364.160 Safari/537.22
Referer: http://localhost:7777/HrdAuthentication/Login?ReturnUrl=%2fHome%2fAbout
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie:
(3) a POST to the homepage: the response is of course the homepage content; no cookies are set.
POST / HTTP/1.1
Host: localhost:7777
Connection: keep-alive
Content-Length: 7063
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Origin: http://localhost:14743
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.22 (KHTML, like Gecko) Chrome/25.0.1364.160 Safari/537.22
Content-Type: application/x-www-form-urlencoded
Referer: http://localhost:14743/wsFederationSTS/Issue/?wa=wsignin1.0&wtrealm=http%3a%2f%2flocalhost%3a7777%2f&wctx=rm%3d0%26id%3d664ff3c2-95b1-40b3-b538-a8357233ea7e%26ru%3dhttp%253a%252f%252flocalhost%253a7777%252f&wct=2013-03-10T13%3a39%3a32Z
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Cookie:
A working variation
I found a way to make it work; maybe this will be useful to someone else. If you follow the above procedure without switching from the local IIS server to the VS development server, it works: I'm still redirected to the Home page (I wonder why), but as an authenticated user, and at that point I can click the About link again and actually enter the page.

I reproduced your steps and everything worked. The Chrome debugger shows that there must be two steps:
GET http://localhost:12263/wsFederationSTS/Issue/?wa=wsignin1.0&wtrealm=http%3a%2f%2flocalhost%3a54306%2f&wctx=rm%3d0%26id%3dc6e46b99-417b-49b6-96a0-40efcead898f%26ru%3dhttp%253a%252f%252flocalhost%253a54306%252f&wct=2013-03-10T18%3a14%3a19Z
POST http://localhost:54306/ with wa=wsignin1.0 and a wresult containing the trust:RequestSecurityTokenResponseCollection
The result of the POST is a Set-Cookie header with cookies named FedAuth and FedAuth1. The cookie is split because of the cookie size limit.
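The splitting itself is mechanical; a rough Python sketch of the idea (illustrative only — the ~2 KB chunk size is an assumption, not WIF's exact limit):

```python
def split_cookie(name, value, chunk_size=2000):
    """Split an oversized cookie value into name, name1, name2, ... chunks."""
    chunks = [value[i:i + chunk_size] for i in range(0, len(value), chunk_size)]
    return [(name if i == 0 else f"{name}{i}", chunk)
            for i, chunk in enumerate(chunks)]

# A 3000-character token does not fit in one cookie, so two are emitted:
cookies = split_cookie("FedAuth", "x" * 3000)
print([n for n, _ in cookies])  # ['FedAuth', 'FedAuth1']
```

On the next request, the module reassembles the original value by concatenating the chunks in order.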
Please check this in debugger.
P.S. I once saw the same behavior, where the cookie was not set normally. The problem was the cookie size, and it was solved by switching to reference mode. Don't forget to register it in Application_Start:
FederatedAuthentication.WSFederationAuthenticationModule.SessionSecurityTokenCreated
    += this.WSFederationAuthenticationModule_SessionSecurityTokenCreated;
// the handler should set e.SessionToken.IsReferenceMode = true;

Related

Servicestack UnAuthorized

I am getting "The remote server returned an error: (401) Unauthorized." when using the ServiceStack utils to call a remote API. It requires Basic auth or JWT. Calling the same API from Postman works fine.
var json = $"http://localhost:5000/API/Proxy/IsValidGuidForSetupDevice?Guid=82870f2ca21148739bff0854b306415c".GetJsonFromUrl(requestFilter: webReq => { webReq.AddBasicAuth("DevAdmin", "test1"); });
If I call the URL below with the same user/pass in a browser window, I get a valid connection and a bearer token back, so the user/pass is good.
http://localhost:5000/auth/credentials?username=DevAdmin&password=test1&format=json
Am I missing something in the request filter of the util? Maybe I should be calling it differently and using a bearer token?
Update after tracing with fiddler
I moved the code into a unit test in the same project as the ServiceStack service, just to remove any variables.
Fiddler shows me that no Proxy-Authenticate header is present, and a WWW-Authenticate header is present: Basic realm="/auth/apikey"
Raw View
GET http://localhost:5000/auth HTTP/1.1
Host: localhost:5000
Connection: keep-alive
Accept: application/json, text/javascript, */*; q=0.01
DNT: 1
X-Requested-With: XMLHttpRequest
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36
Sec-Fetch-Site: same-origin
Sec-Fetch-Mode: cors
Referer: http://localhost:5000/
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cookie: ss-opt=perm; loginLevel=none; X-UAId=2; ss-id=XPc7ivcCrXuN5tEWwARG; ss-pid=TDeEjUiKck82foJGLGtX
Playing around with it, I can get it to log in by calling the URL directly with user and pass. Then I get the bearer token back and I am able to pass it:
var json = $"http://localhost:5000/auth/credentials?username=DevAdmin&password=test1".GetJsonFromUrl();
var o = JsonObject.Parse(json);
var token = o.Get<string>("bearerToken");
var jsonr = $"http://localhost:5000/API/Proxy/IsValidGuidForSetupDevice?Guid=bc464658d6a640febbd53ba17c351919".GetJsonFromUrl(
requestFilter: webReq => { webReq.AddBearerToken(token); });
I still can't call this in one call with auth headers and I still don't know why.
Whenever diagnosing different behavior between HTTP requests from different clients, you should inspect the raw HTTP traffic using an HTTP packet sniffer like Fiddler or Wireshark, or, if the request is made from a browser, its built-in network inspector tools.
Seeing and comparing the raw HTTP request/responses is the best way to identify how the requests differ.
The AddBasicAuth() extension method adds HTTP Basic auth to the web request; if you want to send the JWT token instead, use the AddBearerToken() method.
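The difference is just the Authorization header each one puts on the wire; a small Python sketch of the two schemes (illustrative, not ServiceStack's implementation):

```python
import base64

def basic_auth_header(user, password):
    # Basic auth: "Authorization: Basic base64(user:password)"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def bearer_auth_header(jwt):
    # JWT: "Authorization: Bearer <token>"
    return f"Bearer {jwt}"

print(basic_auth_header("DevAdmin", "test1"))
```

Comparing that header against what the working Postman request sends is usually the fastest way to spot the mismatch.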

Recreate this web request in VB.NET

I'm trying to automatically set an image in a Pinterest account with WebClient.
I'd like to recreate this http request:
(Request-Line) POST /upload-image/?img=Desert.jpg HTTP/1.1
Host www.pinterest.com
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:39.0) Gecko/20100101 Firefox/39.0
Accept text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8
Accept-Language it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding gzip, deflate
X-Requested-With XMLHttpRequest
X-File-Name Desert.jpg
Cache-Control no-cache
X-CSRFToken RqwJCawJyAGYIZfzob51qxrEGj4GJcSA
Referer https://www.pinterest.com
Content-Length 846128
Content-Type multipart/form-data; boundary=---------------------------5431268530037
Cookie _pinterest_cm=TWc9PSY1YlkwcmtVRGlNRzRQZXpiZXJseVl6TnFHYnEvZlhpNDZPcExCQnhKN3UvdUUveWI0c3p4bWJKUmhoZy9YRG9sS3dNZTZFSFNhN2V3VWhJM1JkbUlxbC92VjhHUGFldlRTVVJTNlA1L1M0SDE5QXhLcHVWS2ZrSUh3NTN2ODA0WSZ5dnpJQkVRUmx5TVJGTEdmQm5EVmRGQXNqbDQ9; csrftoken=RqwJCawJyAGYIZfzob51qxrEGj4GJcSA; _pinterest_sess="TWc9PSYvRm93OFNxbGkxWTJ5bzVoZUFudHJVVDI4bndmL2I5SFVIQjZkVk1KWFJ3WDBmNndOWFR0QnBPazltZFRmcnJpcGU5akhQZS8vcEZWTzM5ODJWNVdKS2syekwwc1p1TVVNeEt3Z3NYa1lsMVFXcFpXYUdkRlE1RElQYTBYeXhyQkFkYVFmSHZnVkRyYWhYcURDYzhhWEpuR2dvekE1SlB6cXp4akNNdzJ6QysrR2MzRGNyRXJKczRuRHZDTm1uQkdLMUJrUnF6UjdZakhDUGNVRnQ4T1ZoQUFIQWJSU1VNUHNjUHV5VjlZbk1INU1FMFNhdGJiVFZRdUNDWFNlMGJvcDBFeXk4a3cxT25ROHpSOXFzcTZ6NFBHekFjNkFNQUtnaktQNGQ1VkhnNDdlSXNQTGhmTzhDWm5UaDNoZzdqbEFHQWQ1RjJXWVo5bjNXVkVUWnVUWXNiL1JLTFdqNDBvMWllT0VyRDRNN1lXN0diQmlWRjdGdWF5UGUzYkNLYlMvamJUSVFwcFZoVVVUL2ROVkFIQUNYODQxR3R5eDFrQ0VpTGhmaGZ1Y2VOdGt6aUdLQmtCTkRYdkpkVGhmLzMvMnVOWHAwQVdZWEs2alE4eTUwd3E1SlJPRWFDc3VKTXByb2tISm8rcldRQT0mejAwN0hvdlRhbU8zYmNJT0lsSm9PSldheGpJPQ=="; sessionFunnelEventLogged=1; __utma=229774877.448600758.1436737610.1436739423.1436745191.3; __utmz=229774877.1436737610.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); c_dpr=1; __utmc=229774877; _b="ARLbRMvYKUdKiaBWDA2Oxko87z7iIN4MuGnJALvZK8vehgzT11AKeoa13PH4l9VjVMU="; _pinterest_pfob=disabled; __utmb=229774877.3.9.1436745219732; __utmt=1
Connection keep-alive
Pragma no-cache
I have tried this code, but I can't set Content-Length and Content-Type.
Dim wc As New WebClient
wc.UseDefaultCredentials = True
wc.Credentials = New NetworkCredential("pippomio#yahoo.com", "88Y71nR3764")
wc.Headers.Add("Host", "www.pinterest.com")
wc.Headers.Add("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5")
wc.Headers.Add("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8")
wc.Headers.Add("Accept-Language", "it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3")
wc.Headers.Add("Accept-Encoding", "gzip, deflate")
wc.Headers.Add("X-Requested-With", "XMLHttpRequest")
wc.Headers.Add("X-File-Name", "Hydrangeas.jpg")
wc.Headers.Add("Cache-Control", "no-cache")
wc.Headers.Add("X-CSRFToken", token)
wc.Headers.Add("Referer", "https://www.pinterest.com")
wc.Headers.Add("Connection", "keep-alive")
wc.Headers.Add("Pragma", "no-cache")
Dim Response As Byte() = wc.UploadFile("https://www.pinterest.com/upload-image/?img=Hydrangeas.jpg", "POST", "Hydrangeas.jpg")
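For what it's worth, Content-Type (with its boundary) and Content-Length are derived from the multipart body itself rather than set as plain headers; a Python sketch of how such a body is assembled (illustrative only, with a hypothetical field name):

```python
import uuid

def multipart_body(field, filename, data, content_type="image/jpeg"):
    """Build a multipart/form-data body and the two headers derived from it."""
    boundary = uuid.uuid4().hex
    head = (f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
            f"Content-Type: {content_type}\r\n\r\n").encode()
    body = head + data + f"\r\n--{boundary}--\r\n".encode()
    headers = {
        "Content-Type": f"multipart/form-data; boundary={boundary}",
        "Content-Length": str(len(body)),
    }
    return headers, body

headers, body = multipart_body("img", "Desert.jpg", b"\xff\xd8 fake jpeg bytes")
print(headers["Content-Type"])
```

WebClient.UploadFile computes both of these for you; .NET generally treats Content-Length as a restricted header that cannot be set through the Headers collection anyway.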
How can I make this request in VB.NET?
Thanks
First, I recommend you check this and this tutorial to learn how to send/receive HTTP requests the correct way.
Second, you should not replay web browser actions in your program; it is usually bad practice, since the frontend architecture may change unexpectedly at any time. Instead, you should check the Pinterest API, especially the Users API, which can help you achieve your plans. API interfaces are usually not subject to random changes; they are more reliable and stable than replaying front-end operations, and better equipped to handle load.
(The Pinterest API documentation seems to work only from Firefox; if you get an empty area on the right side with a big "None" text, browse the link from Firefox, which seems to handle the page.)

Logout via Set-Cookie fails

We moved our website a while ago to a new hoster and sporadically experience issues where people cannot log out anymore. I am not sure whether that has anything to do with the hosting environment or with a code change.
This is the Wireshark log of the relevant bit - all is happening in the same TCP stream.
Logout request from the browser (note the authentication cookie):
GET /cirrus/logout HTTP/1.1
Host: subdomain.domain.com
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:26.0) Gecko/20100101 Firefox/26.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: http://subdomain.domain.com/cirrus/CA/Admin/AccountSwitch
Cookie: USER.AUTH=AOvDEjH3w6xIxUC0sYNOAQR5BZ7pPmEF0RMxqohERN87Ti03Eqxd7rQC/BveqmaszmFg8QoSonP+Z+mtQQivKpvloFsQYretYKR8ENubj+moUBF479K5e4albKxS9mBEWT5Xy/XCnEyCPqLASGLY09ywkmIilNU1Ox4J3fCtYXHelE/hyzuKe9y3ui5AKEbbGs3sN9q1zYjVjHKKiNIGaHvjJ2zn7ZUs042B82Jc9RHzt0JW8dnnrl3mAkN1lJQogtlG+ynQSCyQD8YzgO8IpOnSXLJLaCMGMQcvSyX4YKJU/9sxgA5r5cZVCkHLsReS3eIJtXoxktMO6nxVZJY6MX1YwuJOgLRQvwBy9FFnQ6ye
X-LogDigger-CliVer: client-firefox 2.1.5
X-LogDigger: logme=0&reqid=fda96ee5-2db4-f543-81b5-64bdb022d358&
Connection: keep-alive
Server response. It clears the cookie value and redirects
HTTP/1.1 302 Found
Server: nginx
Date: Fri, 22 Nov 2013 14:40:22 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 124
Connection: keep-alive
Cache-Control: private, no-cache="Set-Cookie"
Location: /cirrus
Set-Cookie: USER.AUTH=; expires=Fri, 22-Jul-2005 14:40:17 GMT; path=/cirrus
X-Powered-By: ASP.NET
X-UA-Compatible: chrome=IE8
<html><head><title>Object moved</title></head><body>
<h2>Object moved to here.</h2>
</body></html>
Browser follows the redirection, but with the old cookie value:
GET /cirrus HTTP/1.1
Host: subdomain.domain.com
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:26.0) Gecko/20100101 Firefox/26.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: http://subdomain.domain.com/cirrus/CA/Admin/AccountSwitch
Cookie: USER.AUTH=AOvDEjH3w6xIxUC0sYNOAQR5BZ7pPmEF0RMxqohERN87Ti03Eqxd7rQC/BveqmaszmFg8QoSonP+Z+mtQQivKpvloFsQYretYKR8ENubj+moUBF479K5e4albKxS9mBEWT5Xy/XCnEyCPqLASGLY09ywkmIilNU1Ox4J3fCtYXHelE/hyzuKe9y3ui5AKEbbGs3sN9q1zYjVjHKKiNIGaHvjJ2zn7ZUs042B82Jc9RHzt0JW8dnnrl3mAkN1lJQogtlG+ynQSCyQD8YzgO8IpOnSXLJLaCMGMQcvSyX4YKJU/9sxgA5r5cZVCkHLsReS3eIJtXoxktMO6nxVZJY6MX1YwuJOgLRQvwBy9FFnQ6ye
X-LogDigger-CliVer: client-firefox 2.1.5
X-LogDigger: logme=0&reqid=0052e1e1-2306-d64d-a308-20f9fce4702e&
Connection: keep-alive
Is there anything obvious missing in the Set-Cookie header which could prevent the browser from deleting the cookie?
To change the value for an existing cookie, the following cookie parameters must match:
name
path
domain
name and path are set explicitly; the domain is not. Could that be the problem?
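Browsers key stored cookies by (name, domain, path), and an expiring Set-Cookie only removes a cookie whose key matches; a toy Python model of that rule (illustrative only — note also that omitting the Domain attribute creates a host-only cookie keyed to the exact request host):

```python
# Toy model of a browser cookie jar keyed by (name, domain, path).
jar = {}

def set_cookie(name, value, domain, path):
    jar[(name, domain, path)] = value

def delete_cookie(name, domain, path):
    # An expired Set-Cookie only removes an exact key match.
    return jar.pop((name, domain, path), None) is not None

set_cookie("USER.AUTH", "AOvDEjH3...", "subdomain.domain.com", "/cirrus")

print(delete_cookie("USER.AUTH", "subdomain.domain.com", "/"))        # False: path differs
print(delete_cookie("USER.AUTH", "subdomain.domain.com", "/cirrus"))  # True: exact match
```

If the original cookie and the deleting cookie disagree on any of the three components, the old value survives, which matches the behavior in the trace.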
Edit: As it has been asked why the expiration date is set in the past, a bit more background.
This is using a slight modification of the AppHarbor Security plug-in: https://github.com/appharbor/AppHarbor.Web.Security
The modification is to include the path to the cookie. Please find here the modified logout method:
public void SignOut(string path)
{
    _context.Response.Cookies.Remove(_configuration.CookieName);
    _context.Response.Cookies.Add(new HttpCookie(_configuration.CookieName, "")
    {
        Expires = DateTime.UtcNow.AddMonths(-100),
        Path = path
    });
}
The expiration date in the past is done by the AppHarbor plug-in and is common practice. See http://msdn.microsoft.com/en-us/library/ms178195(v=vs.100).aspx
At a guess I'd say the historical expiry date is causing the whole Set-Cookie line to be ignored (why set a cookie that expired 8 years ago?):
expires=Fri, 22-Jul-2005
We have had issues with deleting cookies in the past, and yes, the domain and path must match those of the cookie you are trying to delete.
Try setting the correct domain and path in the HttpCookie.
Great question, and excellent notes. I've had this problem recently also.
There is one fail-safe approach to this, beyond what you ought to already be doing:
Set expiration in the past.
Set a path and domain.
Put bogus data in the cookie being removed!
Set-Cookie: USER.AUTH=invalid; expires=Fri, 22-Jul-2005 14:40:17 GMT; path=/cirrus; domain=subdomain.domain.com
The fail-safe approach goes like this:
Add a special string to all cookies, at the end. Unless that string exists, reject the cookie and forcibly reset it. For example, all new cookies must look like this:
Set-Cookie: USER.AUTH=AOvDEjH3w6xIxUC0sYNOAQR5BZ7pPmEF0RMxqohERN87Ti03Eqxd7rQC/BveqmaszmFg8QoSonP+Z+mtQQivKpvloFsQYretYKR8ENubj+moUBF479K5e4albKxS9mBEWT5Xy/XCnEyCPqLASGLY09ywkmIilNU1Ox4J3fCtYXHelE/hyzuKe9y3ui5AKEbbGs3sN9q1zYjVjHKKiNIGaHvjJ2zn7ZUs042B82Jc9RHzt0JW8dnnrl3mAkN1lJQogtlG+ynQSCyQD8YzgO8IpOnSXLJLaCMGMQcvSyX4YKJU/9sxgA5r5cZVCkHLsReS3eIJtXoxktMO6nxVZJY6MX1YwuJOgLRQvwBy9FFnQ6ye|1386510233; expires=Fri, 22-Jul-2005 14:40:17 GMT; path=/cirrus; domain=subdomain.domain.com
Notice the change: That extremely long string stored in USER.AUTH ends with |1386510233, which is the unix epoch of the moment when the cookie was set.
This adds a simple extra step to cookie parsing. Now you need to test for the presence of | and discard the unix epoch, unless you care to know when the cookie was set. To make it faster, you can just check for string[length-10] == '|' rather than parsing the whole string. The way I do it, I split the string at | and check for two values after the split; this is language-specific and really just a matter of preference. If you plan to discard the value, just check the specific index where you expect the | to be.
In the future, if you change hosts again, you can test that unix epoch and reject cookies older than a certain point in time. At most this adds two extra steps to your cookie handler: removing the |unixepoch and, if desired, checking when that time was so you can reject the cookie after another host change. This adds about 0.001s to a page load, or less, which is worth it compared to customer-service failures and mass brain damage.
Your new cookie strategy allows you to immediately reject all cookies without the |unixepoch, because you know they are old. And yes, people might complain about this approach, but it is the only way to truly know the cookie is valid. You cannot rely on the client side to provide valid cookies, and you cannot keep a record of every single cookie out there unless you want to warehouse a ton of data. If you warehouse every cookie and check it every time, that can add 0.01s to a page load versus 0.001s with this strategy, so the warehousing route is not worth it.
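The stamp-and-check logic is only a few lines; a Python sketch of the idea (the cutoff epoch would be whatever moment you migrated hosts):

```python
import time

def stamp(value, now=None):
    """Append |unixepoch to a cookie value when it is set."""
    return f"{value}|{int(now if now is not None else time.time())}"

def accept(cookie, cutoff):
    """Return the bare value, or None if the cookie must be reset."""
    value, sep, epoch = cookie.rpartition("|")
    if not sep or not epoch.isdigit():
        return None          # legacy cookie without the suffix: force a reset
    if int(epoch) < cutoff:
        return None          # issued before the host migration: reject
    return value             # valid; the epoch is discarded

print(accept(stamp("AOvDEjH3...", now=1386510233), cutoff=1380000000))  # AOvDEjH3...
print(accept("AOvDEjH3...", cutoff=1380000000))                         # None
```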
An alternative approach is to use USER.AUTHENTICATION rather than USER.AUTH as your new cookie name, but that is perhaps more invasive, and you don't gain the benefits described above if/when you change hosts again.
Good luck with your transition. I hope you get this sorted out. Using the strategy above, I was able to.

NSURLConnection and Authenticating to webservices behind ssl?

I'm currently trying to connect to a web service located at https://xxx.xxx.xx/myapp
It has anonymous access and SSL enabled for testing purposes at the moment.
When trying to connect from the 3G network, I get Status 403: Access denied. You do not have permission to view this directory or page using the credentials that you supplied.
I get these headers while trying to connect to the webservice locally:
Headers
Request URL:https://xxx.xxx.xx/myapp
Request Method:GET
Status Code:200 OK
Request Headers
GET /myapp/ HTTP/1.1
Host: xxx.xxx.xxx
Connection: keep-alive
Authorization: Basic amViZTAyOlE3ZSVNNHNB
User-Agent: Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8
Accept-Encoding: gzip,deflate,sdch
Accept-Language: sv-SE,sv;q=0.8,en-US;q=0.6,en;q=0.4
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response Headers
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Server: Microsoft-IIS/7.0
X-Powered-By: ASP.NET
Date: Thu, 16 Feb 2012 12:26:13 GMT
Content-Length: 622
But when accessing it from outside the local area, we get the big ol' 403, which in turn wants credentials to grant the user access to the web service.
I've tried using the ASIHTTPRequest library without success, and that project has been abandoned; its maintainers suggest going back to NSURLConnection.
And I have no clue where to start, not even which direction to take.
- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
The above delegate method of NSURLConnection doesn't even trigger, so I have no idea whatsoever how to authenticate myself.
All i get is the parsed results of the xml elements of the 403-page.
I needs dem seriouz helps! plx.
This was all just a major f-up.
The site had SSL required and enabled, and setting SSL as required for the virtual directories does some kind of super-duper meta-blocking.
So, by disabling required SSL for the virtual directories, it runs over SSL and no longer blocks 3G access.

Why does Yandex return 405 when Google returns 200 OK?

I have the following problem with the site http://huti.ru. When trying to add any of its pages at http://webmaster.yandex.ru/addurl.xml (Yandex is a Russian search engine), it reports "The server returns a status code http 405 (expected code 200)." What can cause such different behavior for browsers and the Yandex crawler? (Google indexes it normally.)
Environment: Tomcat, Java 6
Your server does not allow HEAD requests. It seems the robot first tries a HEAD before the actual GET.
As http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html states, HEAD should be identical to GET, except that it never returns a message body, only the response headers for the request.
Note: I did a simple
HEAD / HTTP/1.0
request. Same with HTTP/1.1 + Host: huti.ru.
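The GET/HEAD contract is easy to observe against a local server; a self-contained Python sketch (standard library only, unrelated to huti.ru's actual Tomcat setup):

```python
import http.client
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    """Answers GET and HEAD identically, except HEAD sends no body."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()
        self.wfile.write(b"hello")
    def do_HEAD(self):
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()
    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

results = {}
for method in ("GET", "HEAD"):
    conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
    conn.request(method, "/")
    resp = conn.getresponse()
    results[method] = (resp.status, len(resp.read()))
    conn.close()
server.shutdown()

print(results)  # {'GET': (200, 5), 'HEAD': (200, 0)}
```

A server that instead answers 405 to the HEAD request is exactly what the robot would trip over.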
Check your server logs for the actual content of the response to the Yandex request.
HTTP 405 is Method Not Allowed, and is usually returned if the user agent has used an HTTP verb not supported for the particular resource.
For example, using Fiddler, I issued several requests to http://huti.ru, and I got a 200 response for HEAD, GET, and POST, but 405 for TRACE. It's conceivable that Yandex issues either TRACE or OPTIONS before requesting the actual page, as a form of ping to determine whether the page exists.
Note: #smilingthax mentioned that your server returns 405 on HEAD. However, issuing the following request from Fiddler worked for me:
HEAD http://huti.ru/ HTTP/1.1
Host: huti.ru
Proxy-Connection: keep-alive
Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.23 Safari/534.10
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Thus, your problem might be specific to HEAD requests with particular headers.
I think that 405 means that the page has already been indexed.