FtpWebRequest 530 Not Logged In only with EnableSsl - c++-cli

I'm trying to bolt some C++/CLI FtpWebRequest code onto a legacy C++ application (VS2008). The first snippet of code below works fine WITHOUT EnableSsl, but as soon as I add EnableSsl I get 530 Not Logged In. So, the EXACT same code, credentials, URI, etc., just with or without the EnableSsl line.
All the other answers I could find ended up being server configuration or credential problems, but the FileZilla client works fine with explicit FTPS, the second snippet of code below (C# in VS2015) works fine with EnableSsl, and the first snippet works fine without EnableSsl.
How can I get FtpWebRequest working with EnableSsl in my VS2008 C++/CLI application?
EDIT: In VS2008, the C# code also shows the same "works without EnableSsl / 530 with EnableSsl" behavior as the C++/CLI code. So, C# vs C++/CLI is no longer a data point, but I'm still hoping someone knows how to get EnableSsl working in VS2008.
// First snippet, C++/CLI VS2008, works fine WITHOUT EnableSsl line, 530 WITH EnableSsl line
FtpWebRequest^ ftpRequest = dynamic_cast<FtpWebRequest^>(WebRequest::Create(gcnew Uri(_T("ftp://server/path/dst_file.ext"))));
ftpRequest->Credentials = gcnew NetworkCredential(_T("username"),_T("password"));
ftpRequest->Method = WebRequestMethods::Ftp::UploadFile;
ftpRequest->UseBinary = true;
ftpRequest->EnableSsl = true;
StreamReader^ srcStream = gcnew StreamReader(_T("src_file.ext"));
array<Byte>^ fileData = Encoding::UTF8->GetBytes(srcStream->ReadToEnd());
srcStream->Close();
ftpRequest->ContentLength = fileData->Length;
Stream^ reqStream = ftpRequest->GetRequestStream();
reqStream->Write(fileData,0,fileData->Length);
reqStream->Close();
FtpWebResponse^ ftpResponse = dynamic_cast<FtpWebResponse^>(ftpRequest->GetResponse());
ftpResponse->Close();
// Second snippet, C# VS2015, works fine WITH EnableSsl
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://server/path/dst_file.ext");
request.Credentials = new NetworkCredential("username", "password");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.UseBinary = true;
request.EnableSsl = true;
StreamWriter writer = new StreamWriter(request.GetRequestStream());
writer.Write(new StreamReader("src_file.ext").ReadToEnd());
writer.Close();
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
response.Close();

It seems the problem is that VS2008 only targets .NET Framework versions that lack support for the newer TLS protocol versions the server requires, and there isn't any practical way to work around it without upgrading (https://blogs.perficient.com/microsoft/2016/04/tsl-1-2-and-net-support/).
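For reference, on projects targeting .NET 4.5 or later the usual fix is to opt in to the newer protocols before making the first request. This is only a sketch, and it assumes the server really is rejecting the older protocol versions; the SecurityProtocolType.Tls11/Tls12 values do not exist in the .NET 3.5 framework that VS2008 targets, which is the crux of the problem:
// Opt in to TLS 1.1/1.2 before creating the FtpWebRequest (.NET 4.5+ only).
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11;
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://server/path/dst_file.ext");
request.Credentials = new NetworkCredential("username", "password");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.EnableSsl = true;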

Related

Attaching an Image to Work item in Visual Studio Team Services (was Visual Studio Online)

I'm sending an attachment through the Visual Studio Team Services API and it all looks like it's fine, until I look at the attachment on the work item.
The attachment should be a picture, but it's a little black box with a white cross.
Has anyone had this issue and does anyone know what I've done wrong?
I get the image and convert it to a Base64 string:
FileInfo info = new FileInfo(attachment.Path);
byte[] bytes = File.ReadAllBytes(info.FullName);
String file = Convert.ToBase64String(bytes);
Then I send it to the API. This returns a message saying it's been successful.
using (System.Net.Http.HttpClient client = new System.Net.Http.HttpClient())
{
    client.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Basic",
            Convert.ToBase64String(System.Text.ASCIIEncoding.ASCII.GetBytes(getConnectionDetails())));
    using (System.Net.Http.HttpResponseMessage response = client.PostAsync(SetURL(url),
        new StringContent(binaryString, Encoding.UTF8, "application/json")).Result)
    {
        response.EnsureSuccessStatusCode();
        responseString = await response.Content.ReadAsStringAsync();
    }
}
I think it's something small that I'm missing!
This is the link to the document I have used:
API document
Try it this way:
...
string uri = "https://xxxxxx.visualstudio.com/_apis/wit/attachments?fileName=test.jpg&api-version=1.0";
string filepath = "C:\\images\\test.jpg";
FileStream files = new FileStream(filepath,FileMode.Open);
StreamContent streamcontent = new StreamContent(files);
...
HttpResponseMessage response = hc.PostAsync(uri, streamcontent).Result;
...
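A slightly fuller sketch of that approach: the attachments endpoint wants the raw file bytes in the request body rather than a Base64 string wrapped in JSON, which is what produces the black-box image. The octet-stream content type is an assumption here, and getConnectionDetails() is reused from the question:
using (HttpClient hc = new HttpClient())
using (FileStream files = new FileStream("C:\\images\\test.jpg", FileMode.Open))
{
    hc.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Basic",
            Convert.ToBase64String(Encoding.ASCII.GetBytes(getConnectionDetails())));
    StreamContent streamcontent = new StreamContent(files);
    streamcontent.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream"); // assumption
    string uri = "https://xxxxxx.visualstudio.com/_apis/wit/attachments?fileName=test.jpg&api-version=1.0";
    HttpResponseMessage response = hc.PostAsync(uri, streamcontent).Result;
    response.EnsureSuccessStatusCode();
    string json = response.Content.ReadAsStringAsync().Result; // the returned JSON contains the attachment id and url
}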

Having problems when trying to log in to a website using C++/CLI

I've been searching all over the internet for answers to my problem, but I still couldn't figure out how to log in to a website properly.
Firstly, I'm going to explain what I've done up to this point.
» I opened the website I want to log in to: http://side.utad.pt/cursos/einformatica/
» After opening its source code, I found that the URL the login form posts to is:
https://side.utad.pt/side-secure3/login.pl
I tried to open it, but an Internal Error came up, so I tried to access that URL without /login.pl instead, but I don't have permission to. As I can't get this URL working, I thought about using the first link itself.
» By using the Tamper Data extension (Firefox) I found that there are 3 POST arguments: sessionid, username and password. Username and password are input by the user.
To get sessionid, I simply searched for it inside the page source and took it from there:
String^ formUrl = "http://side.utad.pt/cursos/einformatica/";
String^ pageSource;
WebClient^ client = gcnew WebClient();
pageSource = client->DownloadString(formUrl);
delete client;
client = nullptr;
int index = pageSource->IndexOf("sessionid");
int startIndex = index + 34;
String^ _sessionid = pageSource->Substring(startIndex, 32);
Up to here, everything was fine apart from the URL problem.
» I started formatting all the data gathered (which I believe is the correct way):
String^ formParams;
// format data
formParams = "sessionid="+ _sessionid+"&username="+username+"&password="+password;
» After that, I started working with the "body" of the code:
WebRequest^ req = WebRequest::Create(formUrl);
// encode our data
array<Byte>^ bytes = System::Text::Encoding::ASCII->GetBytes(formParams);
req->ContentType = "application/x-www-form-urlencoded";
req->Method = "POST";
req->ContentLength = bytes->Length;
Stream^ os = req->GetRequestStream();
os->Write(bytes,0,bytes->Length);
os->Close();
Am I doing it correctly up to here?
» I wanted to check whether I'm logged in or not, so I thought about getting the page source again, this time for the post-login page (it can be accessed without logging in, but we're always redirected to that page after logging in):
// this code is added below os->Close();
WebResponse^ resp = req->GetResponse();
String^ cookieHeader;
cookieHeader = resp->Headers["Set-cookie"];
WebRequest^ getRequest = WebRequest::Create("http://side.utad.pt/cursos/einformatica/principal"); // Exception 1
getRequest->Headers->Add("Cookie", cookieHeader);
WebResponse^ getResponse = getRequest->GetResponse();
StreamReader^ sr = gcnew StreamReader(getRequest->GetRequestStream()); // Exception 2
pageSource = ""; // reset
pageSource = sr->ReadToEnd();
Firstly, the first line raises an exception most of the time, but I don't know the cause: 'The server committed a protocol violation. Section=ResponseStatusLine'.
Secondly and lastly, when that line doesn't raise an exception, this one does ('Cannot send a content-body with this verb-type'):
StreamReader^ sr = gcnew StreamReader(getRequest->GetRequestStream());
Any ideas to get this working?
I think that the problem here is related to the cookies. I might not have saved them properly.
Thanks
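For comparison, here is an untested sketch of the cookie handling, written in C# to match the other snippets on this page but mapping directly to C++/CLI. It reuses formUrl and formParams from the question; sharing a CookieContainer between the login POST and the follow-up GET avoids copying the Set-Cookie header by hand, and the page source is read from the response stream rather than GetRequestStream(), which is what triggers the 'cannot send a content-body with this verb-type' error:
// Sketch only - not verified against this site.
CookieContainer cookies = new CookieContainer();

HttpWebRequest req = (HttpWebRequest)WebRequest.Create(formUrl);
req.CookieContainer = cookies;               // session cookie is captured here
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
    os.Write(bytes, 0, bytes.Length);
using (WebResponse resp = req.GetResponse()) { }   // complete the login request

HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create("http://side.utad.pt/cursos/einformatica/principal");
getRequest.CookieContainer = cookies;        // same container, so the cookie is sent back
using (WebResponse getResponse = getRequest.GetResponse())
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
    string pageSource = sr.ReadToEnd();      // response stream, not GetRequestStream()
}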

Stream text to client via handler ASP.NET

To get around Twitter's streaming API not having a crossdomain file so it can be accessed from the client side (in this case Silverlight), I have made a Generic Handler file in a web project which basically downloads the stream from Twitter and, as it reads it, writes it to the client.
Here is the handler code:
context.Response.Buffer = false;
context.Response.ContentType = "text/plain";
WebRequest request = WebRequest.Create("http://stream.twitter.com/1/statuses/filter.json?locations=-180,-90,180,90");
request.Credentials = new NetworkCredential("username", "password");
StreamReader responseStream = new StreamReader(request.GetResponse().GetResponseStream(), Encoding.GetEncoding("utf-8"));
while (!responseStream.EndOfStream)
{
    string line = "(~!-/" + responseStream.ReadLine() + "~!-/)";
    context.Response.BinaryWrite(Encoding.UTF8.GetBytes(line));
}
And this does work, but the problem is that once the client disconnects the handler just carries on downloading. So how do I tell whether the client is still busy receiving the response and, if not, end the while loop?
Also, my second problem is that on the client side doing a ReadLine() does nothing, presumably because it is counting the entire stream as one line and so never gets the full response. To work around that I read it byte by byte, and when it sees "(~!-/" around something it knows that is one line. VERY hacky, I know.
Thanks!
Found the answer!
while (context.Response.IsClientConnected)
:)
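For reference, a minimal sketch of the handler loop with that check folded in (same names as in the question's code):
// Stop forwarding as soon as the client disconnects.
while (!responseStream.EndOfStream && context.Response.IsClientConnected)
{
    string line = "(~!-/" + responseStream.ReadLine() + "~!-/)";
    context.Response.BinaryWrite(Encoding.UTF8.GetBytes(line));
}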

Why am I getting System.FormatException: String was not recognized as a valid Boolean on a fraction of our customers machines?

Our C#/.NET software connects to an online app to deal with accounts and a shop. It does this using HttpWebRequest and HttpWebResponse.
An example of this interaction, and one area where the exception in the title has come from is:
var request = HttpWebRequest.Create(onlineApp + string.Format("isvalid.ashx?username={0}&password={1}", HttpUtility.UrlEncode(username), HttpUtility.UrlEncode(password))) as HttpWebRequest;
request.Method = "GET";
using (var response = request.GetResponse() as HttpWebResponse)
using (var ms = new MemoryStream())
{
    var responseStream = response.GetResponseStream();
    byte[] buffer = new byte[4096];
    int read;
    do
    {
        read = responseStream.Read(buffer, 0, buffer.Length);
        ms.Write(buffer, 0, read);
    } while (read > 0);
    ms.Position = 0;
    return Convert.ToBoolean(Encoding.ASCII.GetString(ms.ToArray()));
}
The online app will respond with either 'true' or 'false'. In all our testing it gets one of these values, but for a couple of customers (out of hundreds) we get this exception: System.FormatException: String was not recognized as a valid Boolean, which sounds like the response is being garbled by something. If we ask them to go to the online app in their web browser, they see the correct response. The clients are usually on school networks, which can be fairly restrictive and often sit behind proxy servers, but most cope fine once they've put the proxy details in or added a firewall exception. Is there something that could be messing up the response from the server, or is something wrong with our code?
Indeed, it's possible that the return result is somehow different.
Is there any particular reason you are using such an elaborate method of reading the response there? Why not:
string data;
using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
{
    StreamReader str = new StreamReader(response.GetResponseStream());
    data = str.ReadToEnd();
    str.Close();
}
string cleanResult = data.Trim().ToLower();
// log this
return Convert.ToBoolean(cleanResult);
First thing to note is I would definitely use something like:
bool myBool = false;
Boolean.TryParse(Encoding.ASCII.GetString(ms.ToArray()), out myBool);
return myBool;
It's not some localisation issue, is it? It's expecting the Swahili version of 'true' and getting confused. Are all the sites in one country, with the same language, etc.?
I'd add logging, as suggested by others, and see what results you're seeing.
I'd also lean towards changing the code as silky suggested, though with a few further changes from me (code 'smell' issues, IMO): use a using block around the stream reader as well as the response.
Also, I don't think the use of 'as' is appropriate in this instance. If the response can't be cast to HttpWebResponse (which, admittedly, is unlikely, but still), you'll get a NullReferenceException on the response.GetResponseStream() call, which is both a vague error and loses the original line number. Using (HttpWebResponse)request.GetResponse() will give you a more accurate error and the correct line number of the actual failure.
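Putting those suggestions together, a sketch of what the reading code could look like (the logging call is just a placeholder):
string data;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())        // explicit cast instead of 'as'
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    data = reader.ReadToEnd();
}
string cleanResult = data.Trim().ToLower();
// Log(cleanResult);   // placeholder: record what actually came back from those customers
bool result;
if (!bool.TryParse(cleanResult, out result))
{
    // Response was garbled (proxy error page, BOM, etc.) - handle it instead of throwing.
    result = false;
}
return result;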

IOException when making HttpWebRequest to local ASHX file

Greetings, all. Here is my situation. I am attempting to make an HttpWebRequest to a local handler file and I keep getting the following exception:
Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
Now, I'm using a local handler file because I am writing some integration code for a third party that the site will be using. Until I have a test environment available for me to make requests to, I'm basically mocking the process with a local handler file. Here is the relevant code. Thanks.
WebRequest code (the subRequest variable is the object passed to the method executing this code):
XmlSerializer serializer;
XmlDocument xmlDoc = null;
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(requestUrl);
webRequest.Method = "POST";
webRequest.ContentType = "text/xml";
webRequest.KeepAlive = true;
webRequest.Accept = "*/*";
serializer = new XmlSerializer(subRequest.GetType());
XmlWriter writer = new XmlTextWriter(webRequest.GetRequestStream(), Encoding.UTF8);
serializer.Serialize(writer, subRequest);
writer.Close();
xmlDoc = new XmlDocument();
xmlDoc.Load(XmlReader.Create(webRequest.GetResponse().GetResponseStream()));
The "requestUrl" is defined as "http://localhost:2718/Handlers/MyHandler.ashx". I can hit the handler file just fine and have stepped through the code. All it does is assemble an XML response as a string and writes it out to the Response object:
context.Response.ContentType = "text/xml";
string newSubscriptionId = Utils.GetUniqueKey();
StringBuilder sb = new StringBuilder();
sb.Append("<?xml version=\"1.0\" encoding=\"utf-8\"?>");
// Assemble XML string here
context.Response.Write(sb.ToString());
As far as I can tell, this is all working just fine. But when my code hits the last line of the WebRequest chunk:
xmlDoc.Load(XmlReader.Create(webRequest.GetResponse().GetResponseStream()));
that is when the exception is thrown. Any ideas? Thanks in advance.
James
First, if you don't intend to reuse the connection, or there isn't going to be another request to the same scheme/server/port, I would set KeepAlive to false.
The problem is probably that XmlDocument.Load() either does not read the entire stream before the server closes the connection, or it keeps reading beyond the end of the response, and when the server's keep-alive timeout expires the connection gets closed by the server. Also, you never close the response stream. To verify that this theory is correct, do something like:
// Optional -> webRequest.KeepAlive = false;
string xml = null;
using (StreamReader reader = new StreamReader(webRequest.GetResponse().GetResponseStream()))
{
    xml = reader.ReadToEnd();
}
xmlDoc.LoadXml(xml);
and see if that fixes your problem.
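If that works, one further tidy-up worth considering (again just a sketch): keep the WebResponse itself in a using block so the connection is released even if reading the stream throws:
// Dispose the response as well as the reader so the connection is always released.
using (WebResponse webResponse = webRequest.GetResponse())
using (StreamReader reader = new StreamReader(webResponse.GetResponseStream()))
{
    xmlDoc.LoadXml(reader.ReadToEnd());
}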