What is the maximum message size that can be stored in RabbitMQ? - rabbitmq

We are uploading a 23 MB text file to RabbitMQ. We convert that file to a FileStream and then bind that message to a JSONObject.
var path = Server.MapPath("~/App_Data/" + fileName);
var excelFile = new FileInfo(path);
FileStream stream = System.IO.File.Open(path, FileMode.Open, FileAccess.Read);
// Added Code for CommandComponent changes Start
byte[] fileMessage = new byte[stream.Length];
stream.Read(fileMessage, 0, fileMessage.Length); // note: a single Read is not guaranteed to fill the whole buffer
stream.Close();
TempData["FileMessage"] = fileMessage;
TempData["FileType"] = fileType;
System.IO.File.Delete(path);
// Added Code for CommandComponent changes End
return Json(new { Result = true }, JsonRequestBehavior.AllowGet);

By the AMQP specification there is no limit: the body is a buffer into which you can put whatever you prefer.
Obviously there is the network between your application and RabbitMQ, and you can't send a big file with a single publish.
You have to implement some kind of streaming, for example by splitting the file into chunks.
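As a rough illustration of that chunking idea, here is a minimal sketch using the RabbitMQ .NET client (RabbitMQ.Client). The queue name, chunk size and header keys are invented for the example, and it assumes a consumer on the other side that reorders the chunks by index and concatenates the bodies.
using System;
using System.Collections.Generic;
using System.IO;
using RabbitMQ.Client;

static void PublishFileInChunks(string path)
{
    const int chunkSize = 1024 * 1024; // 1 MB per message (illustrative)
    var factory = new ConnectionFactory { HostName = "localhost" };
    using (var connection = factory.CreateConnection())
    using (var channel = connection.CreateModel())
    using (var stream = File.OpenRead(path))
    {
        channel.QueueDeclare(queue: "file_chunks", durable: true,
                             exclusive: false, autoDelete: false, arguments: null);
        long totalChunks = (stream.Length + chunkSize - 1) / chunkSize;
        var buffer = new byte[chunkSize];
        int read, index = 0;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Headers let the consumer reassemble the file in the right order.
            var props = channel.CreateBasicProperties();
            props.Headers = new Dictionary<string, object>
            {
                ["file-name"] = Path.GetFileName(path),
                ["chunk-index"] = index++,
                ["chunk-count"] = totalChunks
            };
            var body = new byte[read];
            Array.Copy(buffer, body, read);
            channel.BasicPublish(exchange: "", routingKey: "file_chunks",
                                 basicProperties: props, body: body);
        }
    }
}
Publishing many medium-sized messages like this keeps any single message small, and it also lets the consumer start processing before the whole file has arrived.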

Related

Detecting file size with MultipartFormDataStreamProvider before file is saved?

We are using MultipartFormDataStreamProvider to save files uploaded by clients. I have a hard requirement that the file size must be greater than 1 KB. The easiest thing to do would of course be to save the file to disk and then look at it; unfortunately I can't do it like that. After I save the file to disk I don't have the ability to access it, so I need to look at the file before it is saved to disk. I've been looking at the properties of the stream provider to try to figure out the size of the file, but so far I've been unsuccessful.
The test file I'm using is 1025 bytes.
MultipartFormDataStreamProvider.BufferSize is 4096
Headers.ContentDisposition.Size is null
ContentLength is null
Is there a way to determine file size before it's saved to the file system?
Thanks to Guanxi I was able to formulate a solution. I used his code in the link as the basis and just added a little more async/await goodness. I wanted to add the solution in case it helps anyone else:
private async Task SaveMultipartStreamToDisk(Guid guid, string fullPath)
{
    var user = HttpContext.Current.User.Identity.Name;
    var multipartMemoryStreamProvider = await Request.Content.ReadAsMultipartAsync();
    foreach (var content in multipartMemoryStreamProvider.Contents)
    {
        using (content)
        {
            if (content.Headers.ContentDisposition.FileName != null)
            {
                var existingFileName = content.Headers.ContentDisposition.FileName.Replace("\"", string.Empty);
                Log.Information("Original File name was {OriginalFileName}: {guid} {user}", existingFileName, guid, user);
                using (var st = await content.ReadAsStreamAsync())
                {
                    var ext = Path.GetExtension(existingFileName.Replace("\"", string.Empty));
                    List<string> validExtensions = new List<string>() { ".pdf", ".jpg", ".jpeg", ".png" };
                    // 1024 = 1 KB
                    if (st.Length > 1024 && validExtensions.Contains(ext, StringComparer.OrdinalIgnoreCase))
                    {
                        var newFileName = guid + ext;
                        using (var fs = new FileStream(Path.Combine(fullPath, newFileName), FileMode.Create))
                        {
                            await st.CopyToAsync(fs);
                            Log.Information("Completed writing {file}: {guid} {user}", Path.Combine(fullPath, newFileName), guid, HttpContext.Current.User.Identity.Name);
                        }
                    }
                    else
                    {
                        if (st.Length < 1025)
                        {
                            Log.Warning("File of length {FileLength} bytes was attempted to be uploaded: {guid} {user}", st.Length, guid, user);
                        }
                        else
                        {
                            Log.Warning("A file of type {FileType} was attempted to be uploaded: {guid} {user}", ext, guid, user);
                        }
                        var responseMessage = new HttpResponseMessage(HttpStatusCode.BadRequest)
                        {
                            Content = st.Length < 1025
                                ? new StringContent($"file of length {st.Length} does not meet our minimum file size requirements")
                                : new StringContent($"a file extension of {ext} is not an acceptable type")
                        };
                        throw new HttpResponseException(responseMessage);
                    }
                }
            }
        }
    }
}
You can also read the request contents without using MultipartFormDataStreamProvider. In that case all of the request contents (including files) would be in memory. I have given an example of how to do that at this link.
In that case you can read the headers for the file size, or read the stream and check the file size, and only write the file to the desired location if it satisfies your criteria.
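As a rough sketch of that in-memory approach (the 1 KB threshold, the target folder and the error message below are placeholders for the example), an async ApiController action could buffer the parts with the default MultipartMemoryStreamProvider and check the length before anything touches disk:
var provider = await Request.Content.ReadAsMultipartAsync();
foreach (var content in provider.Contents)
{
    if (content.Headers.ContentDisposition.FileName == null)
        continue; // skip ordinary form fields, keep only file parts

    // The whole part is buffered in memory, so its size is known before saving.
    byte[] bytes = await content.ReadAsByteArrayAsync();
    if (bytes.Length <= 1024)
    {
        throw new HttpResponseException(new HttpResponseMessage(HttpStatusCode.BadRequest)
        {
            Content = new StringContent($"file of length {bytes.Length} is below the 1 KB minimum")
        });
    }
    File.WriteAllBytes(Path.Combine(@"C:\uploads", Guid.NewGuid().ToString("N")), bytes);
}
The trade-off compared with the streaming provider is memory: every uploaded file is held entirely in RAM while the request is processed.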

Can I write streams or bytes to an Apache Commons Compress Tarfile?

The Apache Commons compress library seems focused around writing a TarArchiveEntry to TarArchiveOutputStream. But it looks like the only way to create a TarArchiveEntry is with a File object.
I don't have files to write to the tar; I have byte[]s in memory, or preferably streams. And I don't want to write a bunch of temp files to disk just so that I can build a tar.
Is there any way I can do something like:
TarEntry entry = new TarEntry(int size, String filename);
entry.write(byte[] fileContents);
TarArchiveOutputStream tarOut = new TarArchiveOutputStream();
tarOut.write(entry);
tarOut.flush();
tarOut.close();
Or, even better....
InputStream nioTarContentsInputStream = .....
TarEntry entry = new TarEntry(int size, String filename);
entry.write(nioTarContentsInputStream);
TarArchiveOutputStream tarOut = new TarArchiveOutputStream();
tarOut.write(entry);
tarOut.flush();
tarOut.close();
Use the following code:
import java.io.FileOutputStream;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveOutputStream;

// Create the entry by name only (no File object needed) and set its size explicitly.
byte[] test1Content = new byte[] { /* Some data */ };
TarArchiveEntry entry1 = new TarArchiveEntry("test1.txt");
entry1.setSize(test1Content.length);

TarArchiveOutputStream out = new TarArchiveOutputStream(new FileOutputStream("out.tar"));
out.putArchiveEntry(entry1); // writes the entry header
out.write(test1Content);     // writes the entry body from the in-memory bytes
out.closeArchiveEntry();     // pads the entry to the tar record boundary
out.close();
This builds the desired tar file with a single file in it, with the contents from the byte[].

Uploading big files to SkyDrive folder from Windows Phone 8

I'm trying to upload big files to SkyDrive from a Windows Phone 8 device.
In order to avoid an OutOfMemory exception I have to write the file bytes manually (without the SDK) to the request stream with AllowWriteStreamBuffering = false:
var putUrl = string.Format("https://apis.live.net/v5.0/{0}/files/{1}?suppress_response_codes=true&suppress_redirects=true&overwrite=true&access_token={2}", _folderId, fileName, _session.AccessToken);
var webRequest = (HttpWebRequest) WebRequest.Create(putUrl);
webRequest.ContentType = string.Empty;//Empty is required for PUT method.
webRequest.Method = "PUT";
webRequest.AllowWriteStreamBuffering = false;
webRequest.ContentLength = fileStream.Length;
webRequest.BeginGetRequestStream(iar =>
{
    using (var requestStream = webRequest.EndGetRequestStream(iar))
    {
        var buffer = new byte[1024 * 100];
        var read = 0;
        while ((read = fileStream.Read(buffer, 0, buffer.Length)) <= buffer.Length)
        {
            requestStream.Write(buffer, 0, read); // !!! IT HANGS HERE WITHOUT ANY ERROR !!!
            if (read < buffer.Length)
                break;
        }
    }
}, webRequest);
It works fine for small and medium files (<50 MB), but for large files (100-500 MB) the upload hangs at requestStream.Write (at a different percentage each time, sometimes 30%, sometimes 50%) and execution never continues.
I appreciate any help.
Thanks,
Alexey Strakh

Differences between emulator's SD Card and real phone's SD Card

My goal is to read a file on the SD card and then process it in my program.
Everything works fine in the Android emulator!
Unfortunately, on my real smartphone it doesn't work at all.
public void receiveVideoRawData() throws IOException {
    byte[] buf_rcv = new byte[153600];
    File file = new File("/mnt/sdcard/Bluetooth/ardrone.raw");
    ByteArrayOutputStream ous = new ByteArrayOutputStream();
    InputStream ios = new FileInputStream(file);
    int read = 0;
    while ((read = ios.read(buf_rcv)) != -1) {
        ous.write(buf_rcv, 0, read);
    }
    ous.close();
    ios.close();

    ReadRawFileImage readMyRawData = new ReadRawFileImage();
    image = readMyRawData.readUINT_RGBImage(buf_rcv);

    File outputfile = new File("/mnt/sdcard/Bluetooth/ardroneCVT1.jpg");
    OutputStream _outStream = new FileOutputStream(outputfile);
    Bitmap pBitmap = image;
    pBitmap.compress(Bitmap.CompressFormat.JPEG, 90, _outStream);
    _outStream.flush();
    _outStream.close();
}
You are not supposed to hardcode the path to external storage, because it may differ between devices; use Environment.getExternalStorageDirectory() instead:
File file = new File(Environment.getExternalStorageDirectory()
.getAbsolutePath(), filename);

vb.NET Network.UploadFile Limit

I use the My.Computer.Network.UploadFile method to upload files to FTP, but I have a problem with the upload speed.
For example, with an FTP program (FileZilla) my upload speed is 4 Mb/s, but with My.Computer.Network.UploadFile it seems limited to about 1.20 Mb/s.
Why is this method limited? Can I increase the upload speed?
Try this code and let me know if it helps:
using System.Net;

// Get the object used to communicate with the server.
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create("ftp://XXXXXXXXXXXXXXXXXXXXX/" + "C:/XXXXX.zip");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("User", "PassWord");

// Copy the contents of the file to the request stream.
Stream ftpStream = request.GetRequestStream();
FileStream file = File.OpenRead("C:/XXXXX.zip");

int length = 1024;
byte[] buffer = new byte[length];
int bytesread = 0;
do
{
    bytesread = file.Read(buffer, 0, length);
    ftpStream.Write(buffer, 0, bytesread);
}
while (bytesread != 0);

file.Close();
ftpStream.Close();
MessageBox.Show("Uploaded Successfully");
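Whether this actually raises throughput depends on the network and the FTP server, but one variation worth trying is the same FtpWebRequest approach with a larger copy buffer than the 1 KB used above. This is only a sketch; the URL, credentials and file paths are placeholders.
using System;
using System.IO;
using System.Net;

static void Upload(string ftpUrl, string localPath, string user, string password)
{
    var request = (FtpWebRequest)WebRequest.Create(ftpUrl);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential(user, password);
    request.UseBinary = true; // binary mode for zip and other binary files

    using (var ftpStream = request.GetRequestStream())
    using (var file = File.OpenRead(localPath))
    {
        file.CopyTo(ftpStream, 64 * 1024); // 64 KB copy buffer instead of 1 KB
    }

    using (var response = (FtpWebResponse)request.GetResponse())
    {
        Console.WriteLine("Upload finished: {0}", response.StatusDescription);
    }
}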