Upload large file to SharePoint with Silverlight - silverlight-4.0

I am trying to upload a photo to a SharePoint library. With a relatively small file (370 KB) it works without any problems, but if I try to upload a file that is about 3 MB I get the error:
"Der Remoteserver hat einen Fehler zurückgegeben: NotFound."
translated:
"The remote server returned an error: NotFound."
I read that it should be possible to set the maximum message size, but I found no way to set such a thing on the ClientContext object.
This is the code I use:
private void UploadFileCallback(object state)
{
    var args = (List<object>)state;
    var itemContainer = (ISharepointItemContainer)args.ElementAt(0);
    var fileInfo = (FileInfo)args.ElementAt(1);

    var sharepointList = _context.Web.Lists.GetByTitle(itemContainer.ListName);
    Microsoft.SharePoint.Client.File uploadFile;
    FileCreationInformation newFile;

    using (FileStream fs = fileInfo.OpenRead())
    {
        byte[] content = new byte[fs.Length];
        newFile = new FileCreationInformation();
        int dummy = fs.Read(content, 0, (int)fs.Length);
        newFile.Content = content;
        newFile.Url = itemContainer.AbsoluteUrl + "/" + fileInfo.Name;
        uploadFile = sharepointList.RootFolder.Files.Add(newFile);
        _context.Load(uploadFile);
    }
    _context.ExecuteQuery();

    if (FileUploadCompleted != null)
    {
        FileUploadCompleted(this, EventArgs.Empty);
    }
}
Does anyone have an idea on how to resolve this issue?

The first thing to try is to go to the Web Applications Management section in the Central Administration site for SharePoint. Select the General Settings for the web app that you are deploying to and increase the maximum upload size.
The second thing to try is to add this to your web.config:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="52428800" />
    </requestFiltering>
  </security>
</system.webServer>
This lets you raise the request size limit; the value above is 52428800 bytes, i.e. 50 MB.

By default, SharePoint has a 50 MB limit per upload. IIS 7 (not sure about other versions) has a 30 MB limit per upload. You will need to add the XML configuration that Ryan provided to your SharePoint site's web.config in IIS, i.e. on your front-end web server.

The limit you're reaching exists because the web service that handles Client Object Model requests has a maximum message size. You can increase that size, but another solution is to upload the document via WebDAV, which helps if you don't have access to the server.
The .NET Client Object Model has the method File.SaveBinaryDirect() for that, and that's probably your best bet.
If you are using the Silverlight Client Object Model, that method is not available and you'll have to write some additional code: see the second part of this article. The first part describes how to increase the maximum message size.
This should increase your maximum upload size to the one set in Central Admin (typically 50 MB), as pointed out in other posts.
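For reference, a minimal sketch of an upload via SaveBinaryDirect with the .NET Client Object Model; the site URL, local path and server-relative URL are placeholders:
using System.IO;
using Microsoft.SharePoint.Client;

public static class LargeFileUpload
{
    // Streams the file over WebDAV, bypassing the 2 MB default message size
    // of the client.svc endpoint. All parameter values are placeholders.
    public static void Upload(string siteUrl, string localPath, string serverRelativeUrl)
    {
        using (var context = new ClientContext(siteUrl))
        using (FileStream fs = System.IO.File.OpenRead(localPath))
        {
            // Fully qualified to avoid clashing with System.IO.File.
            Microsoft.SharePoint.Client.File.SaveBinaryDirect(context, serverRelativeUrl, fs, true);
        }
    }
}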

The default upload size limit for the SharePoint client object model is 2 MB. You can change that limit by modifying the MaxReceivedMessageSize property of the service.
This can be done in two ways:
programmatically - as described in this link (a server-side C# sketch is shown after the PowerShell below) - though this won't work in Silverlight, for example
through PowerShell. On the server where SharePoint is installed, fire up the SharePoint Management Shell (make sure you run it under the farm administrator account) and run the following commands.
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 52428800
$ws.Update()
This will change the upload limit to 52428800 bytes - or 50 MB. Now, restart the website hosting your SharePoint site (or the entire IIS) for the changes to take effect.
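For reference, here is the same change as a server-side C# sketch (the programmatic route mentioned above); it must run on a SharePoint server under an account with farm administration rights:
using Microsoft.SharePoint.Administration;

public static class ClientServiceLimit
{
    // Raises the client.svc maximum message size to 50 MB, matching the PowerShell above.
    public static void Raise()
    {
        SPWebService contentService = SPWebService.ContentService;
        contentService.ClientRequestServiceSettings.MaxReceivedMessageSize = 52428800; // 50 MB
        contentService.Update();
    }
}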

Related

Bad Request, Header too long with error code 400 (.NET Core application running on IIS)

After the completion of about 40-45 APIs, the size of the authorization token is around 40 KB. Now, after login, whenever any request is sent to the server it fails with the error "Bad Request, Header too long".
For the Kestrel server, the code below solved the error:
webBuilder.ConfigureKestrel(options =>
{
    options.Limits.MaxRequestHeadersTotalSize = 1048576;
});
But for IIS I haven't found any solution.
What I have tried so far:
Increased the request limits in web.config.
Added MaxFieldLength and MaxRequestBytes under
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters
Added the following code in the ConfigureServices method:
services.Configure<IISOptions>(options => { options.AutomaticAuthentication = true; options.ForwardWindowsAuthentication = true; });
and many more attempts, but no final solution yet.
Please help if anyone can...
Take a look at this link:
https://learn.microsoft.com/el-GR/troubleshoot/iis/httpsys-registry-windows
There are registry settings there that limit the maximum header/request length for IIS; depending on your IIS version, the defaults can vary.
The keys you should look for are probably:
MaxFieldLength (per header size)
MaxRequestBytes (total size of request)
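If it helps, a hedged C# sketch of setting those two HTTP.sys values through the registry (the numbers are only illustrative, administrative rights are required, and the HTTP service or the machine must be restarted afterwards):
using Microsoft.Win32;

class HttpSysLimits
{
    static void Main()
    {
        const string httpParams = @"SYSTEM\CurrentControlSet\Services\HTTP\Parameters";
        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(httpParams, writable: true))
        {
            // MaxFieldLength: maximum size of a single header, in bytes (upper bound 65534).
            key.SetValue("MaxFieldLength", 65534, RegistryValueKind.DWord);
            // MaxRequestBytes: maximum combined size of the request line and headers.
            key.SetValue("MaxRequestBytes", 16777216, RegistryValueKind.DWord);
        }
    }
}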

Azure blob upload file bad request

Hi, I am new to Azure. I am trying to upload a file to an Azure container using:
static void UploadBlobFromFile(Uri blobEndpoint, string accountName, string accountKey)
{
    // Create service client for credentialed access to the Blob service.
    CloudBlobClient blobClient =
        new CloudBlobClient(blobEndpoint,
            new StorageCredentials(accountName, accountKey));
    // Get a reference to a container, which may or may not exist.
    CloudBlobContainer container = blobClient.GetContainerReference("StackOverflowAnalysis");
    // Create a new container, if it does not exist
    //container.CreateIfNotExist();
    // Get a reference to a blob, which may or may not exist.
    CloudBlockBlob blob = container.GetBlockBlobReference("QueryResults.csv");
    // Upload content to the blob, which will create the blob if it does not already exist.
    using (var filst = System.IO.File.OpenRead(@"c:\users\hmohamed\Downloads\QueryResults.csv"))
    { blob.UploadFromStream(filst); }
}
I am getting the error "Bad Request (400)". I am trying this in an MVC app; I have also tried it with a console application, where I got the error "the process cannot access the file because it is being used by another process". Responses to similar posts advise running the netstat command to fix the problem, but I do not know how to use it or what parameters to supply. Can someone please help?
All letters in a container name must be lowercase. So, please use "stackoverflowanalysis" as your container name.
For more information on naming, please refer to Naming and Referencing Containers, Blobs, and Metadata.
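Applying that, here is a sketch of the same upload with a valid, all-lowercase container name (this assumes the 2.x storage client library; the namespaces and CreateIfNotExist() differ slightly in 1.x):
using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

static class BlobUploadSketch
{
    public static void UploadBlobFromFile(Uri blobEndpoint, string accountName, string accountKey)
    {
        CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint,
            new StorageCredentials(accountName, accountKey));

        // Container names must be 3-63 characters: lowercase letters, digits and hyphens.
        CloudBlobContainer container = blobClient.GetContainerReference("stackoverflowanalysis");
        container.CreateIfNotExists();   // CreateIfNotExist() in the 1.x client library

        CloudBlockBlob blob = container.GetBlockBlobReference("QueryResults.csv");
        using (var fileStream = System.IO.File.OpenRead(@"c:\users\hmohamed\Downloads\QueryResults.csv"))
        {
            blob.UploadFromStream(fileStream);
        }
    }
}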

FtpWebResponse GetResponse() gives "The remote server returned an error: (550) File unavailable (e.g., file not found, no access)."

I have a WinForms app with a picture gallery that uses FtpWebRequest to upload pictures, but after changing to .NET 4.0 I suddenly get a 550 error. The error occurs both when uploading files and when listing the directory.
As seen in my example code, I have implemented the MS solution from http://support.microsoft.com/kb/2134299.
I have checked the username, password and path - everything is correct.
Still, I get the error. I have scoured Google for a solution without any luck.
SetMethodRequiredCWD();
FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri(pPath));
reqFTP.Credentials = new NetworkCredential(Properties.Settings.Default.FTPUser, Properties.Settings.Default.FTPPass);
reqFTP.Method = WebRequestMethods.Ftp.ListDirectory;
reqFTP.KeepAlive = false;
FtpWebResponse respFTP = (FtpWebResponse)reqFTP.GetResponse();
Stream respStreamFTP = respFTP.GetResponseStream();
StreamReader streamReader = new StreamReader(respStreamFTP, Encoding.Default);
One approach I would recommend is to monitor the request/response exchange between the ftp-client and -server using e.g. Fiddler.
First, record a session in which the error does not manifest by manually using a third party client such as Filezilla to upload the file. Then, record another session with your program as the client. Comparing the exchanged messages may yield some insight to what is wrong.
Try to enable Network Tracing: http://msdn.microsoft.com/en-us/library/a6sbz1dx%28v=vs.100%29.aspx

VEMap and a GeoRSS feed (hosted separately)

The scenario is as follows:
A WCF web service exists that outputs a valid GeoRSS feed. This lives in its own domain as a number of different applications have access to it.
A web page (on a different site) has been created with an instance of a VEMap (Bing/Virtual Earth map object).
Now, VEMap can accept an input feed in this format via the following:
var layer = new VEShapeLayer();
var veLayerSpec = new VEShapeSourceSpecification(VEDataType.GeoRSS, "someurl", layer);
map.ImportShapeLayerData(veLayerSpec, onComplete, true);
onComplete is a callback function I'm using to replace the default pin graphic with something custom.
The question is in regards to "someurl", which is a path to a local XML file containing the geographic information (GeoRSS simple format). I've realized this feed and the map must be hosted in the same domain, so I've created a generic handler that reads the remote feed and returns it in the same format.
var veLayerSpec = new VEShapeSourceSpecification(VEDataType.GeoRSS, "/somelocalhandler.ashx", layer);
When I do this, I get the VEMap error ("z is null"). This is the same error one would receive when trying to access a remote feed. When I copy the feed into a local XML file (i.e. "feed.xml") there is no error.
The order of operations is currently: remote feed -> local handler -> VEMap import
If I'm over complicating this procedure, let me know! I'm a bit new to the Bing Maps API and might have missed something. Any assistance is appreciated.
The format I have above is actually very close to what I needed. A similar solution was found by Mike McDougall. Although I was passing the RSS feed directly through the handler (writing the read stream directly to the response), I just needed to specify the following from within the handler:
context.Response.ContentType = "text/xml";
context.Response.ContentEncoding = System.Text.Encoding.UTF8;
With the above fix, I'm able to have a remote GeoRSS feed successfully load a separately hosted Virtual Earth map instance.
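For completeness, a sketch of such a pass-through handler; the remote feed URL is a placeholder:
using System.Net;
using System.Web;

// Same-domain pass-through handler for the remote GeoRSS feed.
public class GeoRssProxyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";
        context.Response.ContentEncoding = System.Text.Encoding.UTF8;

        using (var client = new WebClient())
        {
            // Read the remote feed and write it straight through to the response.
            byte[] feed = client.DownloadData("http://example.com/service/georss");
            context.Response.OutputStream.Write(feed, 0, feed.Length);
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}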

HttpWebRequest runs slowly first time within SQLCLR

When making an HttpWebRequest within a CLR stored procedure (as per the code below), the first invocation after SQL Server is (re)started, or after a given (but indeterminate) period of time, waits for quite a length of time on the GetResponse() method call.
Is there any way to resolve this that doesn't involve a "hack" such as having a SQL Server Agent job running every few minutes to try and ensure that the first "slow" call is made by the Agent and not by "real" production code?
[Microsoft.SqlServer.Server.SqlFunction]
public static SqlString MakeWebRequest(string address, string parameters, int connectTO)
{
    SqlString returnData;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(String.Concat(address.ToString(), "?", parameters.ToString()));
    request.Timeout = (int)connectTO;
    request.Method = "GET";
    using (WebResponse response = request.GetResponse())
    {
        using (Stream responseStream = response.GetResponseStream())
        {
            using (StreamReader reader = new StreamReader(responseStream))
            {
                SqlString responseFromServer = reader.ReadToEnd();
                returnData = responseFromServer;
            }
        }
    }
    return returnData;
}
(Error handling and other non-critical code has been removed for brevity.)
See also this SQL Server forums thread.
This was a problem for me using HttpWebRequest at first. It's due to the class looking for a proxy to use. If you set the object's Proxy value to null/Nothing, it'll zip right along.
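In code, that is simply (url being whatever address you request):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
// Skip automatic proxy detection, which can add several seconds to the first request.
request.Proxy = null;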
Looks to me like code-signing verification. The MS-shipped system DLLs are all signed, and SQL Server verifies the signatures at load time. Apparently the certificate revocation list is expired and the certificate verification engine times out retrieving a new list. I have blogged about this problem before (Fix slow application startup due to code sign validation), and the problem is also described in this TechNet article: Certificate Revocation and Status Checking.
The solution is pretty arcane and involves editing the registry key HKLM\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CertDllCreateCertificateChainEngine\Config:
ChainUrlRetrievalTimeoutMilliseconds - the timeout for each individual CRL check call. If it is 0 or not present, the default value of 15 seconds is used. Change this timeout to a reasonable value like 200 milliseconds.
ChainRevAccumulativeUrlRetrievalTimeoutMilliseconds - the aggregate CRL retrieval timeout. If it is set to 0 or not present, the default value of 20 seconds is used. Change this timeout to a value like 500 milliseconds.
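A hedged C# sketch of that registry change (values in milliseconds, taken from the description above; requires administrative rights):
using Microsoft.Win32;

class CrlTimeoutTweak
{
    static void Main()
    {
        const string configKey =
            @"SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CertDllCreateCertificateChainEngine\Config";
        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(configKey))
        {
            // Per-URL CRL retrieval timeout (default 15 seconds when absent or 0).
            key.SetValue("ChainUrlRetrievalTimeoutMilliseconds", 200, RegistryValueKind.DWord);
            // Aggregate CRL retrieval timeout (default 20 seconds when absent or 0).
            key.SetValue("ChainRevAccumulativeUrlRetrievalTimeoutMilliseconds", 500, RegistryValueKind.DWord);
        }
    }
}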
There is also a more specific solution for Microsoft signed assemblies (this is from the Biztalk documentation, but applies to any assembly load):
Manually load Microsoft Certificate Revocation Lists
When starting a .NET application, the .NET Framework will attempt to download the Certificate Revocation List (CRL) for any signed assembly. If your system does not have direct access to the Internet, or is restricted from accessing the Microsoft.com domain, this may delay startup of BizTalk Server. To avoid this delay at application startup, you can use the following steps to manually download and install the code signing Certificate Revocation Lists on your system.
Download the latest CRL updates from http://crl.microsoft.com/pki/crl/products/CodeSignPCA.crl and http://crl.microsoft.com/pki/crl/products/CodeSignPCA2.crl.
Move the CodeSignPCA.crl and CodeSignPCA2.crl files to the isolated system.
From a command prompt, enter the following command to use the certutil utility to update the local certificate store with the CRLs downloaded in step 1:
certutil -addstore CA c:\CodeSignPCA.crl
The CRL files are updated regularly, so you should consider setting up a recurring task of downloading and installing the CRL updates. To view the next update time, double-click the .crl file and view the value of the Next Update field.
Not sure, but is the delay long enough that initial DNS lookups could be the culprit?
(How long is the delay versus a normal call?)
and/or
Is this URI internal to the network, or on a different internal network?
I have seen some weird networking delays from using load-balance profiles inside a network that isn't set up right; the firewalls, load balancers, and other network profiles might be "fighting" the initial connections...
I am not a great networking guy, but you might want to see what an SA has to say about this on serverfault.com as well...
Good luck.
There is always a delay the first time SQLCLR loads the necessary assemblies.
That should be the case not only for your function MakeWebRequest, but also for any .NET function in the SQLCLR.
HttpWebRequest is part of the System.Net assembly, which is not part of the supported libraries.
I'd recommend using the library System.Web.Services instead to make web service calls from inside the SQLCLR.
I have tested this and my first cold run (after a SQL service restart) took 3 seconds (not 30 as in your case); all subsequent runs took 0 seconds.
The code sample I've used to build a DLL:
using System;
using System.Data;
using System.Net;
using System.IO;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

namespace MySQLCLR
{
    public static class WebRequests
    {
        public static void MakeWebRequest(string address, string parameters, int connectTO)
        {
            string returnData;
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(String.Concat(address.ToString(), "?", parameters.ToString()));
            request.Timeout = (int)connectTO;
            request.Method = "GET";
            using (WebResponse response = request.GetResponse())
            {
                using (Stream responseStream = response.GetResponseStream())
                {
                    using (StreamReader reader = new StreamReader(responseStream))
                    {
                        returnData = reader.ReadToEnd();
                        reader.Close();
                    }
                    responseStream.Close();
                }
                response.Close();
            }
            SqlDataRecord rec = new SqlDataRecord(new SqlMetaData[] { new SqlMetaData("response", SqlDbType.NVarChar, 10000000) });
            rec.SetValue(0, returnData);
            SqlContext.Pipe.Send(rec);
        }
    }
}