Canceled WCF REST downloads

I have a REST service that returns large files via a Stream as the return type. However, I need a way to know how much of the file was transferred, even if the download is canceled by the client. What would be the best way to accomplish this?
So far I've come up with the following:
public Message GetFile(string path)
{
    UsageLog log = new UsageLog();
    return WebOperationContext.Current.CreateStreamResponse((stream) =>
    {
        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            byte[] buffer = new byte[_BufferSize];
            int bytesRead = 0;
            long totalBytesRead = 0;
            while (true)
            {
                bytesRead = fs.Read(buffer, 0, _BufferSize);
                if (bytesRead == 0)
                {
                    break; // end of file
                }
                try
                {
                    stream.Write(buffer, 0, bytesRead);
                }
                catch (CommunicationException)
                {
                    // client disconnected mid-transfer; stop writing
                    break;
                }
                totalBytesRead += bytesRead;
            }
            log.TransferCompletedUtc = DateTime.UtcNow;
            log.BytesTransferred = totalBytesRead;
        }
    },
    "application/octet-stream");
}
I'm interested in hearing any other solutions for accomplishing something like this.
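One alternative sketch, assuming a log-on-dispose approach is acceptable: wrap the outgoing stream in a small counting decorator (the CountingStream class below is hypothetical; UsageLog is the same type as above), so the byte count is recorded however the transfer ends:

public class CountingStream : Stream
{
    private readonly Stream _inner;
    private readonly UsageLog _log;
    private long _totalBytesWritten;

    public CountingStream(Stream inner, UsageLog log)
    {
        _inner = inner;
        _log = log;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _inner.Write(buffer, offset, count);
        _totalBytesWritten += count; // count only bytes that were actually written
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            // Runs whether the transfer completed or the client canceled it.
            _log.TransferCompletedUtc = DateTime.UtcNow;
            _log.BytesTransferred = _totalBytesWritten;
        }
        base.Dispose(disposing);
    }

    // Remaining Stream members delegate or are unsupported for this write-only wrapper.
    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() => _inner.Flush();
    public override int Read(byte[] buffer, int offset, int count) => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}

Inside the CreateStreamResponse delegate, the copy loop then reduces to fs.CopyTo(...) on a using-wrapped CountingStream, and the totals are logged even when a CommunicationException aborts the copy.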

iTextSharp Document Isn't Working After Deployment

I'm using iTextSharp to create a PDF document and then add it as an attachment to an email sent with SendGrid.
The code works locally, but after deploying the project to Azure this function stopped working. I tried to analyze the problem, and I think the document isn't being fully created or attached, possibly due to the connection, but I can't pinpoint the exact issue. Any opinions or discussion are appreciated.
Action:
public async Task<IActionResult> GeneratePDF(int? id, string recipientEmail)
{
    // validate id
    if (id == null)
    {
        return NotFound();
    }
    var story = await _db.Stories.Include(s => s.Child).Include(s => s.Sentences).ThenInclude(s => s.Image).FirstOrDefaultAsync(s => s.Id == id);
    if (story == null)
    {
        return NotFound();
    }
    var webRootPath = _hostingEnvironment.WebRootPath;
    var path = Path.Combine(webRootPath, "dump"); // folder name
    try
    {
        using (System.IO.MemoryStream memoryStream = new System.IO.MemoryStream())
        {
            iTextSharp.text.Document document = new iTextSharp.text.Document(iTextSharp.text.PageSize.A4, 10, 10, 10, 10);
            PdfWriter writer = PdfWriter.GetInstance(document, memoryStream);
            document.Open();
            string usedFont = Path.Combine(webRootPath + "\\fonts\\", "arial.TTF");
            BaseFont bf = BaseFont.CreateFont(usedFont, BaseFont.IDENTITY_H, BaseFont.EMBEDDED);
            iTextSharp.text.Font titleFont = new iTextSharp.text.Font(bf, 40);
            iTextSharp.text.Font sentencesFont = new iTextSharp.text.Font(bf, 15);
            iTextSharp.text.Font childNamewFont = new iTextSharp.text.Font(bf, 35);
            PdfPTable T = new PdfPTable(1);
            // Hide the table border
            T.DefaultCell.BorderWidth = 0;
            T.DefaultCell.HorizontalAlignment = 1;
            T.DefaultCell.PaddingTop = 15;
            T.DefaultCell.PaddingBottom = 15;
            // Set RTL mode
            T.RunDirection = PdfWriter.RUN_DIRECTION_RTL;
            // Add our text
            if (story.Title != null)
            {
                T.AddCell(new iTextSharp.text.Paragraph(story.Title, titleFont));
            }
            if (story.Child != null)
            {
                if (story.Child.FirstName != null && story.Child.LastName != null)
                {
                    T.AddCell(new iTextSharp.text.Phrase(story.Child.FirstName + story.Child.LastName, childNamewFont));
                }
            }
            if (story.Sentences != null)
            {
                // ... (elided)
            }
            document.Add(T);
            writer.CloseStream = false;
            document.Close();
            byte[] bytes = memoryStream.ToArray();
            // note: "MM" in the format string is months; minutes would be "mm"
            var fileName = path + "\\PDF" + DateTime.Now.ToString("yyyyMMdd-HHMMss") + ".pdf";
            using (FileStream fs = new FileStream(fileName, FileMode.Create))
            {
                fs.Write(bytes, 0, bytes.Length);
            }
            memoryStream.Position = 0;
            memoryStream.Close();
            // Send the generated pdf as an attachment.
            var attachment = Convert.ToBase64String(bytes);
            var client = new SendGridClient(Options.SendGridKey);
            var msg = new SendGridMessage();
            msg.From = new EmailAddress(SD.DefaultEmail, SD.DefaultEmail);
            msg.Subject = story.Title;
            msg.PlainTextContent = "................";
            msg.HtmlContent = "..................";
            msg.AddTo(new EmailAddress(recipientEmail));
            msg.AddAttachment("Story.pdf", attachment);
            try
            {
                await client.SendEmailAsync(msg);
            }
            catch (Exception ex)
            {
                Console.WriteLine("{0} First exception caught.", ex);
            }
            // Remove from root
            if (System.IO.File.Exists(fileName))
            {
                System.IO.File.Delete(fileName);
            }
        }
    }
    catch (FileNotFoundException e)
    {
        Console.WriteLine($"The file was not found: '{e}'");
    }
    catch (DirectoryNotFoundException e)
    {
        Console.WriteLine($"The directory was not found: '{e}'");
    }
    catch (IOException e)
    {
        Console.WriteLine($"The file could not be opened: '{e}'");
    }
    return RedirectToAction("Details", new { id = id });
}
Try editing the usedFont variable as below:
var usedFont = Path.Combine(webRootPath, @"\fonts\arial.TTF");
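One caveat with that suggestion: Path.Combine discards earlier segments when a later one is rooted, so a leading backslash drops webRootPath entirely:
// Path.Combine returns the last rooted segment as-is:
Path.Combine(@"C:\site\wwwroot", @"\fonts\arial.TTF");   // -> "\fonts\arial.TTF"
// Without the leading slash, the segments join as expected:
Path.Combine(@"C:\site\wwwroot", "fonts", "arial.TTF");  // -> "C:\site\wwwroot\fonts\arial.TTF"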
It turns out that the problem is far from iTextSharp. I did remote debugging following this article.
Two parts of the code were causing the problem.
First, for some reason the "dump" folder was not being created in the Azure wwwroot folder, while locally it is, so I added these lines:
var webRootPath = _hostingEnvironment.WebRootPath;
var path = Path.Combine(webRootPath, "dump");
if (!Directory.Exists(path)) // Here
    Directory.CreateDirectory(path);
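As an aside, Directory.CreateDirectory is a no-op when the folder already exists, so the existence check is optional:
// Safe to call unconditionally; creates the folder only if it is missing.
Directory.CreateDirectory(path);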
Second, debugging showed that creating the file was failing every time, so I replaced the following lines:
using (FileStream fs = new FileStream(fileName, FileMode.Create))
{
    fs.Write(bytes, 0, bytes.Length);
}
memoryStream.Position = 0;
memoryStream.Close();
With:
using (FileStream fs = new FileStream(fileName, FileMode.Create))
using (var binaryWriter = new BinaryWriter(fs))
{
    binaryWriter.Write(bytes, 0, bytes.Length);
    binaryWriter.Close();
}
memoryStream.Close();
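As a further simplification, offered untested: since the attachment is already built from the in-memory byte array, the temporary file under wwwroot could be skipped entirely (all names as in the code above):
byte[] bytes = memoryStream.ToArray();
var msg = new SendGridMessage();
msg.From = new EmailAddress(SD.DefaultEmail, SD.DefaultEmail);
msg.Subject = story.Title;
msg.AddTo(new EmailAddress(recipientEmail));
// Attach directly from memory; no file on disk, so nothing to delete afterwards.
msg.AddAttachment("Story.pdf", Convert.ToBase64String(bytes));
await client.SendEmailAsync(msg);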
Hope this post helps someone.

Why are results in the browser using Google AutoML so different from an exported tflite file in my Android app?

I'm training a model using Google AutoML. I then exported a TFLite model to run in an Android app.
With the same image used in the app and in the browser (Google AutoML lets you test your model in the browser), I get a very accurate answer in the browser, but in my app I get a much less accurate answer.
Any thoughts on why this could be happening?
Relevant code:
public void predict(String imagePath, final Promise promise) {
    imgData.order(ByteOrder.nativeOrder());
    labelProbArray = new byte[1][RESULTS_TO_SHOW];
    try {
        labelList = loadLabelList();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    Log.w("FIND_ ", imagePath);
    Bitmap bitmap = BitmapFactory.decodeFile(imagePath);
    convertBitmapToByteBuffer(bitmap);
    try {
        tflite = new Interpreter(loadModelFile());
    } catch (Exception ex) {
        Log.w("FIND_exception in loading tflite", "1");
    }
    tflite.run(imgData, labelProbArray);
    promise.resolve(getResult());
}

private WritableNativeArray getResult() {
    WritableNativeArray result = new WritableNativeArray();
    //ArrayList<JSONObject> result = new ArrayList<JSONObject>();
    try {
        for (int i = 0; i < RESULTS_TO_SHOW; i++) {
            WritableNativeMap map = new WritableNativeMap();
            map.putString("label", Integer.toString(i));
            float output = (float)(labelProbArray[0][i] & 0xFF) / 255f;
            map.putString("prob", String.valueOf(output));
            result.pushMap(map);
            Log.w("FIND_label ", Integer.toString(i));
            Log.w("FIND_prob ", String.valueOf(labelProbArray[0][i]));
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return result;
}

private List<String> loadLabelList() throws IOException {
    Activity activity = getCurrentActivity();
    List<String> labelList = new ArrayList<String>();
    BufferedReader reader =
            new BufferedReader(new InputStreamReader(activity.getAssets().open(LABEL_PATH)));
    String line;
    while ((line = reader.readLine()) != null) {
        labelList.add(line);
    }
    reader.close();
    return labelList;
}

private void convertBitmapToByteBuffer(Bitmap bitmap) {
    if (imgData == null) {
        return;
    }
    imgData.rewind();
    Log.w("FIND_bitmap width ", String.valueOf(bitmap.getWidth()));
    Log.w("FIND_bitmap height ", String.valueOf(bitmap.getHeight()));
    // Note: this assumes the decoded bitmap is already DIM_IMG_SIZE_X x DIM_IMG_SIZE_Y;
    // BitmapFactory.decodeFile returns the image's native size, so a larger photo
    // would feed the model only its top rows.
    bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    // Copy RGB bytes into the input buffer (quantized model: one byte per channel).
    int pixel = 0;
    //long startTime = SystemClock.uptimeMillis();
    for (int i = 0; i < DIM_IMG_SIZE_X; ++i) {
        for (int j = 0; j < DIM_IMG_SIZE_Y; ++j) {
            final int val = intValues[pixel++];
            imgData.put((byte) ((val >> 16) & 0xFF));
            imgData.put((byte) ((val >> 8) & 0xFF));
            imgData.put((byte) (val & 0xFF));
            Log.w("FIND_i", String.valueOf(i));
            Log.w("FIND_j", String.valueOf(j));
        }
    }
}

private MappedByteBuffer loadModelFile() throws IOException {
    Activity activity = getCurrentActivity();
    AssetFileDescriptor fileDescriptor = activity.getAssets().openFd(MODEL_PATH);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}
One of the things I'm not sure is correct is how I turn the output data into a probability. In the getResult() method, I write this:
float output = (float)(labelProbArray[0][i] & 0xFF) / 255f;
I divide by 255 in order to get a number between 0 and 1, but I don't know if this is correct.
The .tflite model is a quantized uint8 model if that helps.
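For reference, the standard dequantization for a quantized uint8 output is real_value = scale * (quantized_value - zero_point), so dividing by 255 is equivalent only when scale = 1/255 and zero_point = 0. A minimal sketch of the arithmetic (the scale and zeroPoint values below are assumptions; the real ones come from the model's quantization parameters):
// Hypothetical quantization parameters; read the actual ones from the model.
float scale = 1f / 255f;
int zeroPoint = 0;
byte rawOutput = 200; // example raw output byte
float probability = scale * ((rawOutput & 0xFF) - zeroPoint); // ~0.784 here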

How to reduce the size of multiple images before saving to the database using the ASP.NET file uploader

I have built a multiple-file uploader and save the images in the database as bytes.
While retrieving all of those images, an out-of-memory exception was thrown, so I want to reduce the size of the uploaded files programmatically (C#, ASP.NET).
The problem is reducing the sizes of multiple images from the file uploader in .NET before saving them to the database.
protected void DataUploading()
{
    try
    {
        int count;
        if (uploader.HasFile)
        {
            string s = System.IO.Path.GetExtension(uploader.FileName);
            if ((s == ".JPG") || (s == ".JPEG") || (s == ".PNG") || (s == ".BMP") || (s == ".jpg") || (s == ".jpeg") || (s == ".png") || (s == ".bmp") || (s == ".jpe"))
            {
                count = ReferrenceCount();
                HttpFileCollection fileCollection = Request.Files;
                if (count == 0)
                {
                    count = 0;
                }
                for (int i = 0; i < fileCollection.Count; i++)
                {
                    HttpPostedFile uploadfile = fileCollection[i];
                    int fileSize = uploadfile.ContentLength;
                    if (fileSize <= 31457280) // 3145728-5242880
                    {
                        string fileName = Path.GetFileName(uploadfile.FileName);
                        if (uploadfile.ContentLength > 0)
                        {
                            string contentType = uploader.PostedFile.ContentType;
                            using (Stream fs = uploadfile.InputStream)
                            {
                                using (BinaryReader br = new BinaryReader(fs))
                                {
                                    byte[] bytes = br.ReadBytes((Int32)fs.Length);
                                    byte[] newbytes = reducesize(uploadfile);
                                    // Note: the original bytes are stored below;
                                    // newbytes (the resized image) is never used.
                                    imagelist.Add(bytes);
                                    refcounts.Add(count);
                                    count += 1;
                                }
                            }
                        }
                    }
                    else
                    {
                        // image compression code to be written here
                        ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "alertMessage", "alert('FILE SIZE SHOULD BE LIMIT TO 5MB')", true);
                    }
                }
            }
            else
            {
                ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "alertMessage", "alert('ONLY JPEG,BMP,PNG ARE ALLOWED')", true);
            }
            for (int ij = 0; ij < imagelist.Count; ij++)
            {
                using (con = new SqlConnection(constr))
                {
                    string query = "INSERT INTO INVENTORY_IMAGES(CLAIM_NUMBER,PHOTOS,REF_NO,UPDATEDATE,MODIFIEDDATE,MODIFIEDBY,CHASSIS) values (@Name, @Data, @REF, @DATE, @datetime, @user, @chassis)";
                    using (cmd = new SqlCommand(query))
                    {
                        cmd.Connection = con;
                        cmd.Parameters.AddWithValue("@Name", txtclaimnumber.Text);
                        cmd.Parameters.AddWithValue("@Data", imagelist[ij]);
                        cmd.Parameters.AddWithValue("@REF", refcounts[ij].ToString());
                        cmd.Parameters.AddWithValue("@DATE", DateTime.Now.ToString("dd/MM/yyyy"));
                        cmd.Parameters.AddWithValue("@datetime", DateTime.Now.ToString("dd/MM/yyyy HH:mm:ss"));
                        cmd.Parameters.AddWithValue("@user", Session["user"].ToString());
                        cmd.Parameters.AddWithValue("@chassis", txtchasisno.Text);
                        con.Open();
                        cmd.ExecuteNonQuery();
                    }
                }
            }
            imagelist.Clear();
            refcounts.Clear();
        }
        else
        {
            ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "alertMessage", "alert('PLEASE SELECT A FILE')", true);
        }
    }
    catch (Exception ee)
    {
        WriteLog(ee.Message);
        throw; // rethrow without resetting the stack trace
    }
}
private byte[] reducesize(HttpPostedFile HttpPostedFiles)
{
    // variable to store the image content;
    // its size is initialized to the file content length
    byte[] imageBytes = new byte[HttpPostedFiles.ContentLength];
    // read the image stream from the post and store it in imageBytes
    HttpPostedFiles.InputStream.Read(imageBytes, 0, (int)HttpPostedFiles.ContentLength);
    // resize image to a thumbnail
    MemoryStream thumbnailPhotoStream = ResizeImage(HttpPostedFiles);
    byte[] thumbnailImageBytes = thumbnailPhotoStream.ToArray();
    return thumbnailImageBytes;
}

private MemoryStream ResizeImage(HttpPostedFile HttpPostedFiless)
{
    try
    {
        // Create a bitmap from the uploaded file content
        Bitmap originalBMP = new Bitmap(HttpPostedFiless.InputStream);
        // get the original dimensions
        int origWidth = originalBMP.Width;
        int origHeight = originalBMP.Height;
        // calculate the current aspect ratio
        // (note: integer division truncates, e.g. 1920/1080 gives 1 and any
        // portrait image gives 0, so the computed ratio is usually wrong)
        int aspectRatio = origWidth / origHeight;
        // if the aspect ratio is less than or equal to 0, default to 1
        if (aspectRatio <= 0)
            aspectRatio = 1;
        // new width of the thumbnail image
        int newWidth = 100;
        // calculate the height based on the aspect ratio
        int newHeight = newWidth / aspectRatio;
        // Create a new bitmap to store the new image
        Bitmap newBMP = new Bitmap(originalBMP, newWidth, newHeight);
        // Create a graphics object based on the new bitmap
        Graphics graphics = Graphics.FromImage(newBMP);
        // Set the properties for the new graphic file
        graphics.SmoothingMode = SmoothingMode.AntiAlias;
        graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
        // Draw the new graphic based on the resized bitmap
        graphics.DrawImage(originalBMP, 0, 0, newWidth, newHeight);
        // save the bitmap into a memory stream
        System.IO.MemoryStream stream = new System.IO.MemoryStream();
        newBMP.Save(stream, GetImageFormat(System.IO.Path.GetExtension(HttpPostedFiless.FileName)));
        // dispose drawing objects
        originalBMP.Dispose();
        newBMP.Dispose();
        graphics.Dispose();
        return stream;
    }
    catch (Exception ee)
    {
        WriteLog(ee.Message);
        throw;
    }
}

private System.Drawing.Imaging.ImageFormat GetImageFormat(string extension)
{
    // Path.GetExtension returns the leading dot (".jpg"), so strip it
    // before matching; otherwise every case falls through to Jpeg.
    switch (extension.ToLower().TrimStart('.'))
    {
        case "jpg":
            return System.Drawing.Imaging.ImageFormat.Jpeg;
        case "bmp":
            return System.Drawing.Imaging.ImageFormat.Bmp;
        case "png":
            return System.Drawing.Imaging.ImageFormat.Png;
    }
    return System.Drawing.Imaging.ImageFormat.Jpeg;
}
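As a side note, the integer division in the aspect-ratio math above truncates (1920/1080 gives 1, and any portrait image gives 0, which the guard then forces to 1), so most thumbnails come out 100x100 regardless of shape. A minimal corrected sketch using floating-point math (same System.Drawing types; the 100px width mirrors the code above):
// Sketch: proportional thumbnail using a floating-point aspect ratio.
// Requires System.Drawing and System.Drawing.Drawing2D.
private Bitmap ResizeProportionally(Bitmap original, int newWidth = 100)
{
    // double so 1920/1080 gives ~1.78 instead of truncating to 1
    double aspectRatio = (double)original.Width / original.Height;
    int newHeight = Math.Max(1, (int)Math.Round(newWidth / aspectRatio));
    Bitmap resized = new Bitmap(newWidth, newHeight);
    using (Graphics graphics = Graphics.FromImage(resized))
    {
        graphics.SmoothingMode = SmoothingMode.AntiAlias;
        graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
        graphics.DrawImage(original, 0, 0, newWidth, newHeight);
    }
    return resized;
}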

VB.NET - FTP upload/download integrity issues

I have a Windows service that periodically uploads files to an FTP server and then redownloads each file to check that the bytes match exactly. For most files it seems OK, but one of the files we're using recently is a 10 MB CSV file, and the two copies always differ slightly (by a couple of bytes).
Below are the upload, download and file-compare methods. Note that the download sets UseBinary to true but the upload doesn't - could this be why? (See the note after the code.)
public string Upload(FileInfo fi, string targetFilename)
{
    // copy the specified file to the target: target can be a full path or just a filename (uses current dir)
    // 1. check target
    string target;
    if (targetFilename.Trim() == "")
    {
        // Blank target: use source filename & current dir
        target = this.CurrentDirectory + fi.Name;
    }
    else if (targetFilename.Contains("/"))
    {
        // If it contains / treat as a full path
        target = AdjustDir(targetFilename);
    }
    else
    {
        // otherwise treat as filename only, use current directory
        target = CurrentDirectory + targetFilename;
    }
    string URI = Hostname + target;
    // perform copy
    System.Net.FtpWebRequest ftp = GetRequest(URI);
    // Set request to upload a file -- note: NOT in binary mode, unlike the download
    ftp.Method = System.Net.WebRequestMethods.Ftp.UploadFile;
    ftp.UseBinary = false;
    // Notify FTP of the expected size
    ftp.ContentLength = fi.Length;
    // buffer for reading the file
    const int BufferSize = 2048;
    byte[] content = new byte[BufferSize];
    int dataRead;
    string result = null;
    // open file for reading
    using (FileStream fs = fi.OpenRead())
    {
        try
        {
            // open request stream to send
            using (Stream rs = ftp.GetRequestStream())
            {
                do
                {
                    dataRead = fs.Read(content, 0, BufferSize);
                    rs.Write(content, 0, dataRead);
                } while (!(dataRead < BufferSize));
                rs.Close();
            }
        }
        catch (Exception x)
        {
            result = URI + " - " + x.ToString();
        }
        finally
        {
            // ensure file closed
            fs.Close();
        }
    }
    ftp = null;
    return result;
}
#endregion
public bool Download(string sourceFilename, FileInfo targetFI, bool PermitOverwrite)
{
    // 1. check target
    if (targetFI.Exists && !(PermitOverwrite))
    {
        throw (new ApplicationException("Target file already exists"));
    }
    // 2. check source
    string target;
    if (sourceFilename.Trim() == "")
    {
        throw (new ApplicationException("File not specified"));
    }
    else if (sourceFilename.Contains("/"))
    {
        // treat as a full path
        target = AdjustDir(sourceFilename);
    }
    else
    {
        // treat as filename only, use current directory
        target = CurrentDirectory + sourceFilename;
    }
    string URI = Hostname + target;
    // 3. perform copy
    System.Net.FtpWebRequest ftp = GetRequest(URI);
    // Set request to download a file in binary mode
    ftp.Method = System.Net.WebRequestMethods.Ftp.DownloadFile;
    ftp.UseBinary = true;
    ftp.UsePassive = false;
    // open request and get response stream
    using (FtpWebResponse response = (FtpWebResponse)ftp.GetResponse())
    {
        using (Stream responseStream = response.GetResponseStream())
        {
            // loop to read & write to file
            // (note: FileInfo.OpenWrite does not truncate an existing file,
            // so overwriting a longer old file would leave trailing bytes)
            using (FileStream fs = targetFI.OpenWrite())
            {
                try
                {
                    byte[] buffer = new byte[2048];
                    int read = 0;
                    do
                    {
                        read = responseStream.Read(buffer, 0, buffer.Length);
                        fs.Write(buffer, 0, read);
                    } while (!(read == 0));
                    responseStream.Close();
                    fs.Flush();
                    fs.Close();
                }
                catch (Exception)
                {
                    // catch error and delete the partially downloaded file
                    fs.Close();
                    targetFI.Delete();
                    throw;
                }
            }
            responseStream.Close();
        }
        response.Close();
    }
    return true;
}
Public Function FileCompare(ByVal file1 As String, ByVal file2 As String) As Boolean
    ' Checks to see if two files are the same
    Dim file1byte As Integer
    Dim file2byte As Integer
    Dim fs1 As FileStream
    Dim fs2 As FileStream
    If file1 = file2 Then
        Return True
    End If
    fs1 = New FileStream(file1, FileMode.Open)
    fs2 = New FileStream(file2, FileMode.Open)
    ' Simple length test
    If fs1.Length <> fs2.Length Then
        fs1.Close()
        fs2.Close()
        Return False
    End If
    Do
        file1byte = fs1.ReadByte()
        file2byte = fs2.ReadByte()
    Loop While file1byte = file2byte And file1byte <> -1
    fs1.Close()
    fs2.Close()
    Return ((file1byte - file2byte) = 0)
End Function
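A likely culprit, given the question already points at it: UseBinary = false makes the upload an ASCII-mode transfer, and ASCII mode is permitted to translate line endings (e.g. LF to CRLF), which would explain a 10 MB CSV differing by a couple of bytes. The minimal fix is to upload in binary mode as well:
// In Upload(): transfer byte-for-byte, matching the download side.
ftp.Method = System.Net.WebRequestMethods.Ftp.UploadFile;
ftp.UseBinary = true; // was false; ASCII mode may rewrite line endings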

Downloading an Azure blob in chunks using WCF

I have been asked to provide a WCF service that allows a blob (potentially 1 GB) to be downloaded in chunks as a byte[] at a given offset, for consumption by a Silverlight application. Essentially, the operation takes a byte offset and the maximum number of bytes to return; nothing complex, I think.
The code I have so far is:
[OperationContract]
public byte[] Download(String url, int blobOffset, int bufferSize)
{
    var blob = new CloudBlob(url);
    using (var blobStream = blob.OpenRead())
    {
        var buffer = new byte[bufferSize];
        blobStream.Seek(blobOffset, SeekOrigin.Begin);
        int numBytesRead = blobStream.Read(buffer, 0, bufferSize);
        if (numBytesRead != bufferSize)
        {
            // fewer bytes than requested (e.g. end of blob): trim the buffer
            var trimmedBuffer = new byte[numBytesRead];
            Array.Copy(buffer, trimmedBuffer, numBytesRead);
            return trimmedBuffer;
        }
        return buffer;
    }
}
I have tested this (albeit with relatively small files, under 2 MB) and it does work, but my questions are:
Can someone suggest improvements to the code?
Is there a better approach given the requirement?
One approach reads the blob in batches with a retry loop:
using (BlobStream blobStream = blob.OpenRead())
{
    bool getSuccess = false;
    int getTries = 0;
    rawBytes = new byte[blobStream.Length];
    blobStream.Seek(0, SeekOrigin.Begin);
    int blockSize = 4194304; // Start at 4 MB per batch
    int index = 0;
    int documentSize = rawBytes.Length;
    while (getTries <= 10 && !getSuccess)
    {
        try
        {
            int batchSize = blockSize;
            while (index < documentSize)
            {
                if ((index + batchSize) > documentSize)
                    batchSize = documentSize - index;
                // Note: Read may return fewer bytes than requested;
                // the return value should really be checked here.
                blobStream.Read(rawBytes, index, batchSize);
                index += batchSize;
            }
            getSuccess = true;
        }
        catch (Exception)
        {
            if (getTries > 9)
                throw; // rethrow, preserving the stack trace
            else
                blockSize = blockSize / 2; // Reduce by half for each attempt
        }
        finally
        {
            getTries++;
        }
    }
}
You could return the blob as a stream instead of a byte array. There is a code sample in a related question: Returning Azure BLOB from WCF service as a Stream - Do we need to close it?
Note that there are some restrictions on which bindings you can use when returning a stream.
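A minimal sketch of that stream-returning shape, assuming the same CloudBlob API as above (the operation name is just a placeholder, and the binding must use a streamed TransferMode):
[OperationContract]
public Stream DownloadStream(string url)
{
    var blob = new CloudBlob(url);
    // WCF pulls from this stream as the client reads, so the whole
    // blob never has to be buffered in server memory.
    return blob.OpenRead();
}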