How To Set useUnsafeHeaderParsing For .NET Compact Framework

In my Windows CE 6.0 app, I am communicating with a proprietary web server device that is returning bad header information (more specifically, it's returning NO header information).
I believe this lack of header information is the reason why my HttpWebRequest methods are not working properly.
I recall that the "regular" .NET Framework lets us programmatically tweak the System.Net configuration to tolerate invalid headers (useUnsafeHeaderParsing); the usual desktop workaround is sketched below for reference.
Unfortunately for me, the System.Net.Configuration types are not included in the Compact Framework.
Is there a similar configuration in CF that is exposed that allows us to programmatically allow for invalid headers?
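For reference, this is roughly what the desktop-Framework workaround looks like: a minimal sketch of the well-known reflection hack over internal members of System.dll (SettingsSectionInternal and its useUnsafeHeaderParsing field). The internal names may change between Framework versions, and they are exactly what the Compact Framework lacks:
using System.Reflection;

// Desktop .NET Framework only - a sketch of the common reflection workaround
// for toggling useUnsafeHeaderParsing. Not available on the Compact Framework.
public static bool TrySetUseUnsafeHeaderParsing(bool enabled)
{
    // SettingsSectionInternal is an internal type in System.dll
    Assembly netAssembly = Assembly.GetAssembly(typeof(System.Net.Configuration.SettingsSection));
    if (netAssembly == null)
        return false;

    Type settingsType = netAssembly.GetType("System.Net.Configuration.SettingsSectionInternal");
    if (settingsType == null)
        return false;

    object section = settingsType.InvokeMember("Section",
        BindingFlags.Static | BindingFlags.GetProperty | BindingFlags.NonPublic,
        null, null, new object[0]);
    if (section == null)
        return false;

    FieldInfo field = settingsType.GetField("useUnsafeHeaderParsing",
        BindingFlags.NonPublic | BindingFlags.Instance);
    if (field == null)
        return false;

    field.SetValue(section, enabled);
    return true;
}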

I was unable to find a workaround for setting UseUnsafeHeaderParsing, so I removed the HttpWebRequest implementation and used the TcpClient instead. The TcpClient ignores any problems that may exist with the HTTP headers - it doesn't even think in those terms.
Anyway, using the TcpClient I am able to get the data (including the HTTP headers) from the proprietary web server that I mentioned in my original post.
For the record, here is a sample of how to retrieve data from a web server via the TcpClient:
The code below essentially sends a hand-built client-side HTTP request to the web server and reads back everything that comes over the wire.
static string GetUrl(string hostAddress, int hostPort, string pathAndQueryString)
{
    // Requires System.Net.Sockets, System.IO and System.Text
    StringBuilder response = new StringBuilder();

    // Open a raw TCP connection to the web server
    TcpClient socket = new TcpClient();
    socket.Connect(hostAddress, hostPort);
    NetworkStream ns = socket.GetStream();

    // Write a minimal HTTP request to the stream. HTTP/1.1 expects a Host
    // header, and "Connection: close" makes the server close the socket when
    // it is done, so the read loop below terminates.
    StreamWriter sw = new StreamWriter(ns);
    sw.NewLine = "\r\n";
    sw.WriteLine(string.Format("GET /{0} HTTP/1.1", pathAndQueryString));
    sw.WriteLine(string.Format("Host: {0}:{1}", hostAddress, hostPort));
    sw.WriteLine("Connection: close");
    sw.WriteLine(); // blank line ends the request headers
    sw.Flush();

    // Save the data that lives in the stream (Ha! sounds like an activist!)
    StreamReader sr = new StreamReader(ns);
    string packet;
    while ((packet = sr.ReadLine()) != null)
    {
        response.AppendLine(packet);
    }
    socket.Close();
    return response.ToString();
}
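Calling it looks something like this (the device address, port, and query string are placeholders, not values from the original post):
// Hypothetical device address, port and path - substitute the real ones.
string reply = GetUrl("192.168.0.50", 80, "status?format=xml");
Console.WriteLine(reply); // raw HTTP status line and headers, followed by the body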

Related

Design Minimal API and use HttpClient to post a file to it

I have a legacy system interfacing issue that my team has elected to solve by standing up a .NET 7 Minimal API which needs to accept a file upload. It should work for small and large files (let's say at least 500 MiB). The API will be called from a legacy system using HttpClient in a .NET Framework 4.7.1 app.
I can't quite seem to figure out how to design the signature of the Minimal API and how to call it with HttpClient in a way that totally works. It's something I've been hacking at on and off for several days, and haven't documented all of my approaches, but suffice it to say there have been varying results involving, among other things:
4XX and 500 errors returned by the HTTP call
An assortment of exceptions on either side
Calls that throw and never hit a breakpoint on the API side
Calls that get through but the Stream on the API end is not what I expect
Errors being different depending on whether the file being uploaded is small or large
Text files being persisted on the server that contain some of the HTTP headers in addition to their original contents
On the Minimal API side, I've tried all sorts of things in the signature (IFormFile, Stream, PipeReader, HttpRequest). On the calling side, I've tried several approaches (messing with headers, using the Flurl library, various content encodings and MIME types, multipart, etc).
This seems like it should be dead simple, so I'm trying to wipe the slate clean here, start with an example of something that partially works, and hope someone might be able to illuminate the path forward for me.
Example of Minimal API:
// IDocumentStorageManager is an injected dependency that takes an int and a Stream and returns a string of the newly uploaded file's URI
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    async (PipeReader pipeReader, int documentId, IDocumentStorageManager documentStorageManager) =>
    {
        using var ms = new MemoryStream();
        await pipeReader.CopyToAsync(ms);
        ms.Position = 0;
        return await documentStorageManager.CreateDocument(documentId, ms);
    });
Call the Minimal API using HttpClient:
// filePath is the path on local disk, uri is the Minimal API's URI
private static async Task<string> UploadWithHttpClient2(string filePath, string uri)
{
    var fileStream = File.Open(filePath, FileMode.Open);
    var content = new StreamContent(fileStream);
    var httpRequestMessage = new HttpRequestMessage(HttpMethod.Post, uri);
    var httpClient = new HttpClient();
    httpRequestMessage.Content = content;
    httpClient.Timeout = TimeSpan.FromMinutes(5);
    var result = await httpClient.SendAsync(httpRequestMessage);
    return await result.Content.ReadAsStringAsync();
}
In the particular example above, a small (6 bytes) .txt file is uploaded without issue. However, a large (619 MiB) .tif file runs into problems on the call to httpClient.SendAsync which results in the following set of nested Exceptions:
System.Net.Http.HttpRequestException - "Error while copying content to a stream."
System.IO.IOException - "Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.."
System.Net.Sockets.SocketException - "An existing connection was forcibly closed by the remote host."
What's a decent way of writing a Minimal API and calling it with HttpClient that will work for small and large files?
Kestrel allows uploads of about 30 MB by default.
To upload larger files via Kestrel you need to increase the maximum request body size. One way is to add the "RequestSizeLimit" attribute; for example, for 1 GB:
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    [RequestSizeLimit(1_000_000_000)] async (PipeReader pipeReader, int documentId) =>
    {
        using var ms = new MemoryStream();
        await pipeReader.CopyToAsync(ms);
        ms.Position = 0;
        return "";
    });
You can also remove the size limit globally by setting
builder.WebHost.UseKestrel(o => o.Limits.MaxRequestBodySize = null);
This answer is good, but the RequestSizeLimit filter doesn't work for minimal APIs; it's an MVC filter. You can use the IHttpMaxRequestBodySizeFeature to set the limit instead (assuming you're not running on IIS). Also, I made a change to accept the body as a Stream, which avoids the memory stream copy before calling the CreateDocument API:
// IHttpMaxRequestBodySizeFeature lives in Microsoft.AspNetCore.Http.Features
app.MapPost(
    "DocumentStorage/CreateDocument2/{documentId:int}",
    async (Stream stream, int documentId, IDocumentStorageManager documentStorageManager) =>
    {
        return await documentStorageManager.CreateDocument(documentId, stream);
    })
    .AddEndpointFilter((context, next) =>
    {
        const int MaxBytes = 1024 * 1024 * 1024;
        var maxRequestBodySizeFeature = context.HttpContext.Features.Get<IHttpMaxRequestBodySizeFeature>();
        // The limit can only be changed while the feature is still writable,
        // i.e. before the request body has started being read.
        if (maxRequestBodySizeFeature is not null and { IsReadOnly: false })
        {
            maxRequestBodySizeFeature.MaxRequestBodySize = MaxBytes;
        }
        return next(context);
    });
If you're running on IIS, you also need to raise the request filtering limit (maxAllowedContentLength); see https://learn.microsoft.com/en-us/iis/configuration/system.webserver/security/requestfiltering/requestlimits/#configuration

Post request error when sending "application/octet-stream" to an ASP.NET Core Web API service

I need to create an ASP.NET Core 3 Web API that understands this URL
http://myapp.com/MyASPNetCore3WebApi/myController/myWebMethod?user=A0001
and a zip file that goes in the request content. This is the code that calls the API I need to create:
HttpWebRequest httpWebRequest = (HttpWebRequest)WebRequest.Create(URI);
httpWebRequest.Timeout = -1;
httpWebRequest.KeepAlive = false;
httpWebRequest.Method = "POST";
httpWebRequest.ProtocolVersion = HttpVersion.Version10;
httpWebRequest.ContentType = "application/octet-stream";
httpWebRequest.Accept = "application/octet-stream";
httpWebRequest.ContentLength = data.Length;
Stream requestStream = httpWebRequest.GetRequestStream();
requestStream.Write(data, 0, data.Length);
requestStream.Close();
HttpWebResponse httpWebResponse = (HttpWebResponse)httpWebRequest.GetResponse();
The code above works fine; it is used every day to send data to a Java web service. Now I am replacing that system with a new one in ASP.NET Core, and I can't change the caller's code, which is why I need to create a Web API that understands that URL.
I have written this code in my Web API, but I guess I am missing something that I can't figure out, because I get an error in the client (code above):
[HttpPost("myWebMethod")]
public FileStreamResult myWebMethod(string user, [FromBody] Stream compress)
{
    byte[] zip = ((MemoryStream)compress).ToArray();
    byte[] data = ZipHelper.Uncompress(zip);
    .....................
}
The error I get in the client is this:
[System.Net.WebException] {"The remote server returned an error: (415)
Unsupported Media Type."} System.Net.WebException
Thanks in advance for any help
If the goal is to read the raw request content, this can be done through the HttpContext controller property. HttpContext has a Request property that provides access to the actual HTTP request.
No additional model properties or controller arguments are needed to access the raw request stream. It's important to note that FromBody and FromForm binding should not be used in this case.
There are a couple of notes regarding the code in the example from the original question.
byte[] zip = ((MemoryStream)compress).ToArray();
byte[] data = ZipHelper.Uncompress(zip);
The HttpContext.Request.Body property does not return a MemoryStream; it returns its own implementation of Stream, so there is no ToArray method.
When reading the entire content of a request directly into the server's memory, it is better to check the content length first; otherwise a client can crash the server by sending a large enough request.
Using the *Async methods when reading the content of the request will improve performance.
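Putting those notes together, a minimal sketch of what the action could look like (the 100 MB cap is an arbitrary example value; ZipHelper is the helper from the question):
[HttpPost("myWebMethod")]
public async Task<IActionResult> myWebMethod(string user)
{
    // Arbitrary example cap - reject oversized payloads before buffering them.
    const long maxBytes = 100 * 1024 * 1024;
    if (Request.ContentLength == null || Request.ContentLength > maxBytes)
        return StatusCode(StatusCodes.Status413PayloadTooLarge);

    // Request.Body is not a MemoryStream, so copy it into one before calling ToArray().
    using var ms = new MemoryStream();
    await Request.Body.CopyToAsync(ms);

    byte[] zip = ms.ToArray();
    byte[] data = ZipHelper.Uncompress(zip); // ZipHelper comes from the question

    // ... process data and build the response as before ...
    return Ok();
}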

How to enable gzip compression for content encoding with Jersey (JAX-RS 2.0) client?

I have a Java application that uses the Jersey implementation of JAX-RS 2.0 and I want to enable gzip compression on the client side. The server has it enabled and I have verified that by looking in Chrome at the "Size/Content" in the Developer Tools for the specific URL the client is using.
I see a lot of information and documentation floating around the web about setting the HTTP Headers with filters and decoding response bodies with interceptors and I cannot decipher what I actually need to code in the client.
I have this code:
private synchronized void initialize() {
    Client client = ClientBuilder.newClient();
    client.register(new HttpBasicAuthFilter(username, password));
    WebTarget targetBase = client.target(getBaseUrl());
    ...
}
What should I add to enable compression?
Managed to do it with:
private synchronized void initialize() {
    Client client = ClientBuilder.newClient();
    client.register(new HttpBasicAuthFilter(username, password));
    client.register(GZipEncoder.class);
    client.register(EncodingFilter.class);
    WebTarget targetBase = client.target(getBaseUrl());
    ...
}
Pretty much the same as @Jason's answer, but EncodingFilter detects the GZipEncoder for me.
In my case (JAX-RS 2.x with Jersey, using multipart), none of the above worked, but this did:
Client client = ClientBuilder.newBuilder()
        .register(EncodingFilter.class)
        .register(GZipEncoder.class)
        .property(ClientProperties.USE_ENCODING, "gzip")
        .register(MultiPartFeature.class)
        .register(LoggingFilter.class)
        .build();
Essentially the same as the answers above, but I had to add that one property for "gzip".
Modify to look like:
private synchronized void initialize() {
    Client client = ClientBuilder.newClient();
    client.register(new HttpBasicAuthFilter(username, password));
    client.register(GZipEncoder.class);
    WebTarget targetBase = client.target(getBaseUrl());
    ...
    // new lines here:
    Invocation.Builder request = targetBase.request(MEDIA_TYPE);
    request.header(HttpHeaders.ACCEPT_ENCODING, "gzip");
    ...
}
In this example, there are some fields and methods being referenced that I don't include (such as MEDIA_TYPE); you'll have to fill those in yourself. It should be pretty straightforward.
I verified this worked by analyzing the response headers and monitoring the application network usage. I got a 10:1 compression ratio according to the network usage checks I did. That seems about right, yay!
Instead of registering EncodingFilter and GZipEncoder individually you can use EncodingFeature directly. With Jersey 2.32 I had problems with incomplete injections and resulting NullPointerExceptions otherwise.
Client client = ClientBuilder.newClient();
client.register(new EncodingFeature("gzip", GZipEncoder.class));
client.register(new HttpBasicAuthFilter(username, password));
WebTarget targetBase = client.target(getBaseUrl());
Note the difference between setting the useEncoding parameter
client.register(new EncodingFeature("gzip", GZipEncoder.class));
and not setting it
client.register(new EncodingFeature(GZipEncoder.class));
With the parameter, the client's initial request is itself gzip-encoded; without it, the client merely indicates to the server that it will understand a compressed reply.

.NET HttpClient hangs after several requests (unless Fiddler is active)

I am using System.Net.Http.HttpClient to post a sequence of requests from a console application to a REST API and to deserialize the JSON responses into strongly-typed objects. My implementation is like this:
using (var client = new HttpClient())
{
    var content = new StringContent(data, Encoding.UTF8, "text/html");
    var response = client.PostAsync(url, content).Result;
    response.EnsureSuccessStatusCode();
    return response.Content.ReadAsAsync<MyClass>().Result;
}
However, I am experiencing a problem very similar to one described in this question, whereby everything works fine when the requests are routed via Fiddler, but it hangs after the 4th or 5th request when Fiddler is disabled.
If the cause of the problem is the same, I assume I need to do something more with HttpClient to get it to fully release its resources after each request but I am unable to find any code samples that show how to do this.
Hoping somebody can point me in the right direction.
Many thanks,
Tim
You are not disposing of the HttpResponseMessage object. This can leave connections to the server held open, and once the limit of concurrent connections to an individual server is reached, no more requests can be sent.
using (var client = new HttpClient())
{
    var content = new StringContent(data, Encoding.UTF8, "text/html");
    using (var response = client.PostAsync(url, content).Result)
    {
        response.EnsureSuccessStatusCode();
        return response.Content.ReadAsAsync<MyClass>().Result;
    }
}

Reporting Services Authentication issue

I am trying to programmatically render a PDF using Azure Reporting Services. I suspect that the actual PDF retrieval is fine, but I cannot find a way to authenticate the connection before requesting the report (via URL). I am working in the services layer of my web application and I cannot use a web reference (might not work with Azure) and it doesn't make sense to use a ReportViewer control (since it's a service layer method).
I have all the details to connect, but I suspect that I require a cookie to authenticate and I'm not sure how to manually create this. Any suggestions/solutions?
Here's my code so far:
string userName = BJConfigurationManager.GetSetting("ReportingServiceUsername");
string password = BJConfigurationManager.GetSetting("ReportingServicePassword");
NetworkCredential networkCredential = new NetworkCredential(userName, password);
Domain.Report report = GetReportById(id);
int timeout = 30; //seconds
string url = "https://bleh.ctp.reporting.database.windows.net/ReportServer/Pages/ReportViewer.aspx?...";
string destinationFileName = @"C:\Temp.pdf";
// Create a web request to the URL
HttpWebRequest MyRequest = (HttpWebRequest)WebRequest.Create(url);
MyRequest.PreAuthenticate = true;
MyRequest.Credentials = networkCredential;
MyRequest.Timeout = timeout * 1000;
try
{
    // Get the web response -- THE RESPONSE COMES BACK AS UNAUTHENTICATED...
    HttpWebResponse MyResponse = (HttpWebResponse)MyRequest.GetResponse();
Check out the section titled "SOAP Management Endpoint Programmatic Access":
http://msdn.microsoft.com/en-us/library/windowsazure/771e88b6-ab0f-4910-a5fa-5facd8d56767#SOAPManagement.
It explains how to authenticate using a cookie container without a ReportViewer control.
I don't think that is going to work. Azure Reporting uses Forms Authentication and as I understand it, you aren't going to be able to match the Forms Auth cookie along with the MachineKey for encryption.
I was trying to accomplish the same task, but using a WebRequest turned out to be impossible.
I changed my approach and used the ServerReport class instead, like this:
ServerReport report;
report = new ServerReport();
report.ReportServerUrl = new Uri(reportServerName + "/ReportServer");
report.ReportPath = "/ReportPath";
report.ReportServerCredentials = new ReportServerCredentials();
report.SetParameters(new Microsoft.Reporting.WebForms.ReportParameter("param1", param1));
report.SetParameters(new Microsoft.Reporting.WebForms.ReportParameter("param2", param2));
return report.Render(reportParams.OutputFormat);
The ReportServerCredentials class must implement the IReportServerCredentials interface; a sketch of such an implementation is shown below.
More info about the IReportServerCredentials interface and its implementation can be found in the MSDN documentation.
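A minimal sketch of what that implementation might look like, assuming forms authentication against the Azure Reporting endpoint (the user name, password, and authority values are placeholders, not values from the original posts):
using System.Net;
using System.Security.Principal;
using Microsoft.Reporting.WebForms;

public class ReportServerCredentials : IReportServerCredentials
{
    // No Windows impersonation or network credentials are used here.
    public WindowsIdentity ImpersonationUser
    {
        get { return null; }
    }

    public ICredentials NetworkCredentials
    {
        get { return null; }
    }

    // Returning true tells the ServerReport to log on with these forms
    // credentials before rendering the report.
    public bool GetFormsCredentials(out Cookie authCookie, out string userName,
                                    out string password, out string authority)
    {
        authCookie = null;
        userName = "reportUser";                               // placeholder
        password = "reportPassword";                           // placeholder
        authority = "bleh.ctp.reporting.database.windows.net"; // placeholder server
        return true;
    }
}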