I have a web page that calls an external web service which returns some XML - calling it like this:
Resp = (HttpWebResponse)Req.GetResponse();
I put a try/catch around this in order to capture application errors such as timeouts. In order to test this, I entered the following in web.config:
<system.web>
<sessionState timeout="1" />
However this isn't working - I think this is 1 minute, not 1 second as I would like. I also tried setting it to 01.
Since I'm running in debug mode in VS 2010, I don't have IIS settings to mess with.
How can I test for exceptions?
The sessionState timeout is not what you are looking for. I would look at setting the HttpWebRequest.Timeout property instead:
Req.Timeout = 1; // Milliseconds
Resp = (HttpWebResponse)Req.GetResponse();
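To actually exercise your catch block, a 1 millisecond timeout will reliably make GetResponse() throw a WebException whose Status is Timeout. A minimal, self-contained sketch (the URL is a placeholder):

using System;
using System.IO;
using System.Net;

class TimeoutTest
{
    static void Main()
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://example.com/service"); // placeholder URL
        req.Timeout = 1; // 1 ms: forces a timeout so the catch path can be tested

        try
        {
            using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
            using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd());
            }
        }
        catch (WebException ex)
        {
            if (ex.Status == WebExceptionStatus.Timeout)
            {
                // This is the path your production error handling should cover.
                Console.WriteLine("Request timed out as expected.");
            }
        }
    }
}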
After implementing about 40-45 APIs, the authorization token has grown to around 40 KB. Now, after login, whenever any request is sent to the server it fails with "Bad Request, Header too long".
For the Kestrel server, the code below solved the error:
webBuilder.ConfigureKestrel(options =>
{
options.Limits.MaxRequestHeadersTotalSize = 1048576;
})
But for IIS I haven't found any solution.
The things I have tried:
Increased the RequestLimit in the web.config file.
Added MaxFieldLength and MaxRequestBytes under
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters
Added the below code in the ConfigureServices method:
services.Configure<IISOptions>(options => { options.AutomaticAuthentication = true; options.ForwardWindowsAuthentication = true; });
and many more attempts, but no final solution so far.
Please help if anyone can...
Take a look at this link:
https://learn.microsoft.com/el-GR/troubleshoot/iis/httpsys-registry-windows
There are some registry settings that limit the maximum header length IIS will accept; depending on your IIS version, this could vary.
The keys you should look for are probably:
MaxFieldLength (per header size)
MaxRequestBytes (total size of request)
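As a concrete illustration, both values can be set from an elevated command prompt (the numbers below are the documented maximums, not recommendations, and the HTTP service - or the machine - must be restarted for them to take effect):

reg add "HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters" /v MaxFieldLength /t REG_DWORD /d 65534
reg add "HKLM\SYSTEM\CurrentControlSet\Services\HTTP\Parameters" /v MaxRequestBytes /t REG_DWORD /d 16777216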
I have a REST service built in VB.NET running on IIS. I've noticed that the returned data is sanitized or escaped, e.g.:
<script>alert('testing1')</script>
is returned as:
&lt;script&gt;alert('testing1')&lt;/script&gt;
Can anyone tell me what is doing the escaping - is it .NET or IIS - and can it be switched on or off?
I'm pretty sure you could have a controller method like this:
Function ReturnJavascript() As Net.Http.HttpResponseMessage
    Dim resp As New Net.Http.HttpResponseMessage(Net.HttpStatusCode.OK)
    ' text/plain content is returned verbatim, bypassing the serializer's escaping
    resp.Content = New Net.Http.StringContent("<script>alert('testing1')</script>", System.Text.Encoding.UTF8, "text/plain")
    Return resp
End Function
Not sure what you are trying to accomplish. If you don't want the content escaped on the way out, you might try returning it as content with the type set to text/plain, as above.
I have a WinForms app with a picture gallery that uses FtpWebRequest to upload pictures, but after changing to .NET 4.0 I suddenly get a 550 error. The error occurs both when uploading files and when listing the directory.
As seen in my example code, I have implemented the MS solution from http://support.microsoft.com/kb/2134299.
I have checked the username, password and path - everything is correct.
Still, I get the error. I have scoured Google for solutions without any luck.
SetMethodRequiredCWD(); // workaround from KB 2134299: restores the pre-4.0 CWD behavior

FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri(pPath));
reqFTP.Credentials = new NetworkCredential(Properties.Settings.Default.FTPUser, Properties.Settings.Default.FTPPass);
reqFTP.Method = WebRequestMethods.Ftp.ListDirectory;
reqFTP.KeepAlive = false;

FtpWebResponse respFTP = (FtpWebResponse)reqFTP.GetResponse(); // the 550 error is raised here
Stream respStreamFTP = respFTP.GetResponseStream();
StreamReader streamReader = new StreamReader(respStreamFTP, Encoding.Default);
One approach I would recommend is to monitor the request/response exchange between the FTP client and server using e.g. Fiddler.
First, record a session in which the error does not manifest, by manually using a third-party client such as FileZilla to upload the file. Then record another session with your program as the client. Comparing the exchanged messages may yield some insight into what is wrong.
Try to enable Network Tracing: http://msdn.microsoft.com/en-us/library/a6sbz1dx%28v=vs.100%29.aspx
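For reference, a minimal app.config sketch for System.Net tracing along the lines of that article (the log file name is an arbitrary choice):

<configuration>
  <system.diagnostics>
    <sources>
      <source name="System.Net" tracemode="includehex" maxdatasize="1024">
        <listeners>
          <add name="System.Net" />
        </listeners>
      </source>
    </sources>
    <switches>
      <add name="System.Net" value="Verbose" />
    </switches>
    <sharedListeners>
      <add name="System.Net" type="System.Diagnostics.TextWriterTraceListener" initializeData="network.log" />
    </sharedListeners>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>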
I am trying to upload a photo to a SharePoint library. If I use a relatively small file (370 KB) then it works without any problems.
But if I try to upload a file that is about 3 MB, I get the error:
"Der Remoteserver hat einen Fehler zurückgegeben: NotFound."
translated:
"The remote server returned an error: NotFound."
I read that it should be possible to set the max message size, but I found no way to set such a thing in the ClientContext object.
This is the code I use:
private void UploadFileCallback(object state)
{
var args = (List<object>)state;
var itemContainer = (ISharepointItemContainer)args.ElementAt(0);
var fileInfo = (FileInfo)args.ElementAt(1);
var sharepointList = _context.Web.Lists.GetByTitle(itemContainer.ListName);
Microsoft.SharePoint.Client.File uploadFile;
FileCreationInformation newFile;
using (FileStream fs = fileInfo.OpenRead())
{
byte[] content = new byte[fs.Length];
newFile = new FileCreationInformation();
int dummy = fs.Read(content, 0, (int)fs.Length);
newFile.Content = content;
newFile.Url = itemContainer.AbsoluteUrl + "/" + fileInfo.Name;
uploadFile = sharepointList.RootFolder.Files.Add(newFile);
_context.Load(uploadFile);
}
_context.ExecuteQuery();
if (FileUploadCompleted != null)
{
FileUploadCompleted(this, EventArgs.Empty);
}
}
Does anyone have an idea on how to resolve this issue?
The first thing to try is to go to the Web Applications Management section in the Central Administration site for SharePoint. Select the General Settings for the web app that you are deploying to and increase the maximum upload size.
The second thing to try is to add this to your web.config:
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="52428800"/>
</requestFiltering>
</security>
</system.webServer>
This will let you set the size to something larger.
By default, SharePoint has a 50MB max limit per upload. IIS 7 (not sure about other versions) has a 30 MB max limit per upload. You will need to add the XML configuration that Ryan provided to your SharePoint website's web.config, in IIS. This is your front-end web server.
The limit you're reaching exists because the web service that handles Client Object Model requests has a maximum message size. You can increase that size, but another solution is to use WebDAV to upload the document; this will help if you don't have access to the server.
The .NET Client Object Model has a method File.SaveBinaryDirect() for that, and that's probably your best bet.
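A rough sketch of that approach, reusing _context and fileInfo from the question (the server-relative URL is a hypothetical example):

// SaveBinaryDirect streams the file over WebDAV, bypassing the
// client service's MaxReceivedMessageSize limit.
using (FileStream fs = fileInfo.OpenRead())
{
    Microsoft.SharePoint.Client.File.SaveBinaryDirect(
        _context,
        "/MyLibrary/" + fileInfo.Name, // hypothetical server-relative URL
        fs,
        true); // overwrite an existing file
}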
If you are using the Silverlight Client Object Model, that method is not available and you'll have to write some additional code: see this article, second part. The first part describes how to increase the maximum message size.
This should increase your maximum upload size to the one set in Central Admin (typically 50 MB), as pointed out in other posts.
The default upload size limit for the SharePoint client object model is 2 MB. You can change that limit by modifying the MaxReceivedMessageSize property of the service.
This can be done in two ways:
programmatically - as described in this link - though this won't work in Silverlight, for example
through PowerShell. On the server where you have SharePoint installed, fire up the SharePoint Management Shell (make sure you run it under the farm administrator account) and run the following commands.
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 52428800
$ws.Update()
This will change the upload limit to 52428800 bytes - or 50 MB. Now, restart the website hosting your SharePoint site (or the entire IIS) for the changes to take effect.
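For completeness, the programmatic route mentioned above uses the same API as the PowerShell; roughly (this must run as server-side code on the SharePoint box):

using Microsoft.SharePoint.Administration;

class RaiseUploadLimit
{
    static void Main()
    {
        // Same change as the PowerShell above: raise the client request
        // service's message size limit to 50 MB and persist it.
        SPWebService ws = SPWebService.ContentService;
        ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 52428800;
        ws.Update();
    }
}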
When making an HttpWebRequest within a CLR stored procedure (as per the code below), the first invocation after SQL Server is (re)started, or after a given (but indeterminate) period of time, waits for quite a length of time on the GetResponse() method call.
Is there any way to resolve this that doesn't involve a "hack" such as having a Sql Server Agent job running every few minutes to try and ensure that the first "slow" call is made by the Agent and not "real" production code?
[Microsoft.SqlServer.Server.SqlFunction]
public static SqlString MakeWebRequest(string address, string parameters, int connectTO)
{
    SqlString returnData;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(String.Concat(address, "?", parameters));
    request.Timeout = connectTO;
    request.Method = "GET";
    using (WebResponse response = request.GetResponse())
    {
        using (Stream responseStream = response.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(responseStream))
            {
                returnData = reader.ReadToEnd();
            }
        }
    } // the using blocks dispose everything; no explicit Close() needed
    return returnData;
}
(Error handling and other non-critical code has been removed for brevity.)
See also this SQL Server forums thread.
This was a problem for me using HttpWebRequest at first. It's due to the class looking for a proxy to use. If you set the object's Proxy property to null/Nothing, it'll zip right along.
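In terms of the question's code, that is a one-line change before GetResponse() is called:

// Skip automatic proxy detection, which can block the first call for a long time.
request.Proxy = null;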
Looks to me like code signing verification. The MS-shipped system DLLs are all signed, and SQL Server verifies the signatures at load time. Apparently the certificate revocation list is expired and the certificate verification engine times out retrieving a new list. I have blogged about this problem before in Fix slow application startup due to code sign validation, and the problem is also described in this TechNet article: Certificate Revocation and Status Checking.
The solution is pretty arcane and involves registry editing of the key: HKLM\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CertDllCreateCertificateChainEngine\Config:
ChainUrlRetrievalTimeoutMilliseconds - this is the timeout for each individual CRL check call. If it is 0 or not present, the default value of 15 seconds is used. Change this timeout to a reasonable value like 200 milliseconds.
ChainRevAccumulativeUrlRetrievalTimeoutMilliseconds - this is the aggregate CRL retrieval timeout. If set to 0 or not present, the default value of 20 seconds is used. Change this timeout to a value like 500 milliseconds.
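As a sketch, both values can be set from an elevated command prompt (using the example timeouts from above):

reg add "HKLM\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CertDllCreateCertificateChainEngine\Config" /v ChainUrlRetrievalTimeoutMilliseconds /t REG_DWORD /d 200
reg add "HKLM\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CertDllCreateCertificateChainEngine\Config" /v ChainRevAccumulativeUrlRetrievalTimeoutMilliseconds /t REG_DWORD /d 500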
There is also a more specific solution for Microsoft signed assemblies (this is from the Biztalk documentation, but applies to any assembly load):
Manually load Microsoft Certificate Revocation Lists

When starting a .NET application, the .NET Framework will attempt to download the Certificate Revocation List (CRL) for any signed assembly. If your system does not have direct access to the Internet, or is restricted from accessing the Microsoft.com domain, this may delay startup of BizTalk Server. To avoid this delay at application startup, you can use the following steps to manually download and install the code signing Certificate Revocation Lists on your system.

Download the latest CRL updates from http://crl.microsoft.com/pki/crl/products/CodeSignPCA.crl and http://crl.microsoft.com/pki/crl/products/CodeSignPCA2.crl.

Move the CodeSignPCA.crl and CodeSignPCA2.crl files to the isolated system.

From a command prompt, enter the following command to use the certutil utility to update the local certificate store with the CRL downloaded in step 1:

certutil -addstore CA c:\CodeSignPCA.crl

The CRL files are updated regularly, so you should consider setting a recurring task of downloading and installing the CRL updates. To view the next update time, double-click the .crl file and view the value of the Next Update field.
Not sure, but is the delay long enough that initial DNS lookups could be the culprit?
(How long is the delay versus a normal call?)
and/or
Is this URI internal to the network, or on a different internal network?
I have seen some weird networking delays from using load-balance profiles inside a network that isn't set up right; the firewalls, load balancers, and other network profiles might be "fighting" the initial connections...
I am not a great networking guy, but you might want to see what an SA has to say about this on serverfault.com as well...
Good luck.
There is always a delay the first time SQLCLR loads the necessary assemblies.
That should be the case not only for your function MakeWebRequest, but also for any .NET function in the SQLCLR.
HttpWebRequest lives in the System.Net namespace, which is not part of the supported libraries.
I'd recommend using the System.Web.Services library instead to make web service calls from inside the SQLCLR.
I have tested it, and my first cold run (after a SQL service restart) took 3 seconds (not 30 like yours); all subsequent runs took 0 seconds.
The code sample I've used to build a DLL:
using System;
using System.Data;
using System.Net;
using System.IO;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
namespace MySQLCLR
{
    public static class WebRequests
    {
        [Microsoft.SqlServer.Server.SqlProcedure]
        public static void MakeWebRequest(string address, string parameters, int connectTO)
        {
            string returnData;
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(String.Concat(address, "?", parameters));
            request.Timeout = connectTO;
            request.Method = "GET";
            using (WebResponse response = request.GetResponse())
            {
                using (Stream responseStream = response.GetResponseStream())
                {
                    using (StreamReader reader = new StreamReader(responseStream))
                    {
                        returnData = reader.ReadToEnd();
                    }
                }
            }
            // -1 = NVARCHAR(MAX); SqlMetaData only accepts lengths 1-4000 or -1 for NVarChar.
            SqlDataRecord rec = new SqlDataRecord(new SqlMetaData[] { new SqlMetaData("response", SqlDbType.NVarChar, -1) });
            rec.SetValue(0, returnData);
            SqlContext.Pipe.Send(rec);
        }
    }
}
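Assuming the DLL is built from the code above, deployment looks roughly like this (path and names are examples; EXTERNAL_ACCESS is required for outbound HTTP calls):

CREATE ASSEMBLY MySQLCLR FROM 'C:\path\to\MySQLCLR.dll' WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO

CREATE PROCEDURE dbo.MakeWebRequest
    @address nvarchar(4000), @parameters nvarchar(4000), @connectTO int
AS EXTERNAL NAME MySQLCLR.[MySQLCLR.WebRequests].MakeWebRequest;
GO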