File downloaded from Google Cloud Storage is randomly corrupted - google-bigquery

I am trying to download BigQuery data through Google Cloud Storage. I am able to export the data from BigQuery to GCS, but when I download the files from GCS to load them, they are randomly corrupted.
// Download the object straight to the Hadoop file system, then verify its MD5 hash
getObject.getMediaHttpDownloader().setDirectDownloadEnabled(true);
out = fs.create(pathDir, true);
getObject.executeMediaAndDownloadTo(out);
boolean match = ismd5HashValid(o.getMd5Hash(), pathDir);
and this is the method I use to check the MD5 checksum:
private boolean ismd5HashValid(String md5hash, String path) {
    org.apache.hadoop.fs.Path pathDir = new org.apache.hadoop.fs.Path(path);
    org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
    InputStream is = null;
    try {
        FileSystem fs = FileSystem.get(conf);
        MessageDigest md = MessageDigest.getInstance("MD5");
        is = fs.open(pathDir);
        byte[] bytes = new byte[1024];
        int numBytes;
        while ((numBytes = is.read(bytes)) != -1) {
            md.update(bytes, 0, numBytes);
        }
        byte[] digest = md.digest();
        String result = new String(Base64.encodeBase64(digest));
        Log.info("Source file md5hash {} Downloaded file md5hash {}", md5hash, result);
        if (md5hash.equals(result)) {
            Log.info("md5hash check is valid");
            return true;
        }
    } catch (IOException e) {
        Log.warn(e.getMessage(), e);
    } catch (NoSuchAlgorithmException e) {
        Log.warn(e.getMessage(), e);
    } finally {
        IOUtils.closeQuietly(is);
    }
    return false;
}
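Since the corruption is intermittent, one workaround worth trying is to wrap the download in a small retry loop and only accept the file once the MD5 check above passes. This is just a sketch, not a confirmed fix: it reuses the getObject request, the Hadoop FileSystem, the Log instance and the ismd5HashValid method shown above, and the retry count of 3 is arbitrary.
// Sketch: re-download until the local MD5 matches the MD5 reported in the GCS
// object metadata, or give up after a few attempts. Storage.Objects.Get,
// StorageObject, FileSystem and ismd5HashValid are the same as in the snippets above.
private boolean downloadWithVerification(Storage.Objects.Get getObject, StorageObject o,
                                         FileSystem fs, org.apache.hadoop.fs.Path pathDir) throws IOException {
    final int maxAttempts = 3; // arbitrary retry budget
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
        try (OutputStream out = fs.create(pathDir, true)) {
            getObject.getMediaHttpDownloader().setDirectDownloadEnabled(true);
            getObject.executeMediaAndDownloadTo(out);
        }
        if (ismd5HashValid(o.getMd5Hash(), pathDir.toString())) {
            return true;
        }
        Log.warn("MD5 mismatch on attempt {}, retrying download", attempt);
    }
    return false;
}
Toggling setDirectDownloadEnabled(false) is another variable worth testing, since it changes how the media is fetched (chunked rather than direct).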

Related

Dropbox Java API Upload File

How do I upload a file publicly and get a link? I am using the Dropbox Java Core API. Here is my code:
public static void Yukle(File file) throws DbxException, IOException {
    // Open the local file and upload it, overwriting any existing copy
    try (InputStream in = new FileInputStream(file)) {
        UploadBuilder metadata = clientV2.files().uploadBuilder("/" + file.getName());
        metadata.withMode(WriteMode.OVERWRITE);
        metadata.withClientModified(new Date());
        metadata.withAutorename(false);
        metadata.uploadAndFinish(in);
        System.out.println(clientV2.files());
    }
}
I use the following code to upload files to Dropbox:
public DropboxAPI.Entry uploadFile(final String fullPath, final InputStream is, final long length, final boolean replaceFile) {
final DropboxAPI.Entry[] rev = new DropboxAPI.Entry[1];
rev[0] = null;
Thread t = new Thread(new Runnable() {
public void run() {
try {
if (replaceFile) {
try {
mDBApi.delete(fullPath);
} catch (Exception e) {
e.printStackTrace();
}
//! ReplaceFile is always true
rev[0] = mDBApi.putFile(fullPath, is, length, null, true, null);
} else {
rev[0] = mDBApi.putFile(fullPath, is, length, null, null);
}
} catch (DropboxException e) {
e.printStackTrace();
}
}
});
t.start();
try {
t.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
return rev[0];
}
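The question also asks how to get a public link after the upload. A minimal sketch, assuming clientV2 is the same initialized DbxClientV2 used in the first snippet and that the v2 sharing endpoint is available in the SDK version in use (the helper name is made up):
// Sketch: create a shared link for a file that has already been uploaded.
// Note that createSharedLinkWithSettings throws if a shared link already
// exists for the path; listSharedLinks can be used to look up the existing one.
public static String getPublicLink(String dropboxPath) throws DbxException {
    SharedLinkMetadata link = clientV2.sharing().createSharedLinkWithSettings(dropboxPath);
    return link.getUrl();
}
The returned URL is the normal share link; Dropbox documents query parameters such as dl=1 if a direct download link is what "public link" means here.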

Limiting upload file size in Playframework 2.2.x

Regarding file upload in Play Framework 2.2.3:
According to an update on this question,
adding the following to application.conf should enable uploads of files up to 10 MB:
parsers.MultipartFormData.maxLength = 10240K
This does not work for me; I get a "413 Request Entity Too Large" error for any file larger than 1 MB.
I also tried setting another field:
parsers.text.maxLength=10M
The upload still fails with 413.
I upload the files using XHR with a FormData object that can contain multiple files.
Upload controller code:
public Result uploadAttendeeFiles(){
try{
MultipartFormData body = request().body().asMultipartFormData();
List<FilePart> uploadedFiles = body.getFiles();
Map<String, String> returnMessages = new HashMap<String, String>();
String fileName ="", fileExtension="", fieldId = "";
int fileCounter = 0;
if (!CommonUtils.isEmpty(uploadedFiles)) {
for (FilePart filePar : uploadedFiles) {
try {
fileExtension = Files.getFileExtension(filePar.getFilename());
fieldId = body.asFormUrlEncoded().get("fieldId")[fileCounter];
fileName = body.asFormUrlEncoded().get("fileName")[fileCounter++];
InputStream in = new FileInputStream(filePar.getFile());
Object objectId = uploadService.loadFile(in, fileName, fileExtension);
if(objectId != null)
returnMessages.put(fieldId, objectId.toString());
else
returnMessages.put(fieldId, "failed-Failed to save file!");
} catch (IOException e) {
returnMessages.put(fileName+"."+fileExtension, "failed-Error while uploading file!");
e.printStackTrace();
}
}
}
else{
return ok("{\"errormessage\":\"No files selected!\"}");
}
return ok(Json.toJson(returnMessages));
} catch(Exception e) {
e.printStackTrace();
return ok("{\"errormessage\":\"Error while uploading files!\"}");
}
}
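Besides the global application.conf keys, the Play 2.2 Java API documents a per-action maxLength attribute on the @BodyParser.Of annotation. Below is a sketch of applying it to the action above, under the assumption that the annotation's limit also covers multipart bodies in this version; the class name UploadController is made up, since the original controller class is not shown.
import play.mvc.BodyParser;
import play.mvc.Controller;
import play.mvc.Result;

public class UploadController extends Controller {

    // Sketch: raise the multipart body limit to 10 MB for this action only,
    // instead of relying on the global parsers.* settings.
    @BodyParser.Of(value = BodyParser.MultipartFormData.class, maxLength = 10 * 1024 * 1024)
    public static Result uploadAttendeeFiles() {
        // ... same body as the uploadAttendeeFiles() method shown above ...
        return ok();
    }
}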

Adding a new revision for a document in Dropbox through the Android API

I want to add a new revision to a document (Test.doc) in Dropbox using the Android API. Can anyone share sample code or links? I tried:
FileInputStream inputStream = null;
try {
DropboxInputStream temp = mDBApi.getFileStream("/Test.doc", null);
String revision = temp.getFileInfo().getMetadata().rev;
Log.d("REVISION : ",revision);
File file = new File("/sdcard0/renamed.doc");
inputStream = new FileInputStream(file);
Entry newEntry = mDBApi.putFile("/Test.doc", inputStream, file.length(), revision, new ProgressListener() {
@Override
public void onProgress(long arg0, long arg1) {
Log.d("","Uploading.. "+arg0+", Total : "+arg1);
}
});
} catch (Exception e) {
System.out.println("Something went wrong: " + e);
} finally {
if (inputStream != null) {
try {
inputStream.close();
} catch (IOException e) {}
}
}
A new revision is created the first time. When I execute it again, another new revision is not created.
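One thing worth double-checking is that the rev passed as parentRev is really the latest revision at the moment of the second upload. Below is a sketch that looks the current rev up with the v1 metadata call instead of downloading the whole file first; it assumes mDBApi is the same DropboxAPI instance as above, and the helper name is made up.
// Sketch: fetch the latest rev of /Test.doc, then upload the local file
// against that rev so the upload is recorded as a new revision.
Entry uploadNewRevision(File localFile) throws Exception {
    Entry current = mDBApi.metadata("/Test.doc", 1, null, false, null);
    FileInputStream in = new FileInputStream(localFile);
    try {
        return mDBApi.putFile("/Test.doc", in, localFile.length(), current.rev, null);
    } finally {
        in.close();
    }
}
It may also be worth confirming that the local file content actually changed between runs before concluding that no revision was recorded.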

Save image file in specific directory in a JSF PrimeFaces project

I want to save a byte[] file into a specific directory.
I get it from this method:
public void setUploadedPicture(UploadedFile uploadedPicture)
{
System.out.println("set : "+uploadedPicture.getFileName()+" size : "+uploadedPicture.getSize());
this.uploadedPicture = uploadedPicture;
}
and I access the byte[] with :
uploadedPicture.getContents()
I tested this link but got no result.
How do I save it into a specific directory, either inside my project or outside?
Thank you.
*********EDIT**********
here is the code which works, but sometimes I get an error:
public void setUploadedPicture(UploadedFile uploadedPicture)
{
System.out.println("set : "+uploadedPicture.getFileName()+" size : "+uploadedPicture.getSize());
this.uploadedPicture = uploadedPicture;
InputStream inputStr = null;
try {
inputStr = uploadedPicture.getInputstream();
} catch (IOException e) {
e.printStackTrace();
}
//create destination File
String destPath = "C:\\"+uploadedPicture.getFileName();
File destFile = new File(destPath);
//use org.apache.commons.io.FileUtils to copy the File
try {
FileUtils.copyInputStreamToFile(inputStr, destFile);
} catch (IOException e) {
e.printStackTrace();
}
}
public void handleFileUpload(FileUploadEvent event) {
//get uploaded file from the event
UploadedFile uploadedFile = (UploadedFile)event.getFile();
//create an InputStream from the uploaded file
InputStream inputStr = null;
try {
inputStr = uploadedFile.getInputstream();
} catch (IOException e) {
//log error
}
//create destination File
String destPath = "your path here";
File destFile = new File(destPath);
//use org.apache.commons.io.FileUtils to copy the File
try {
FileUtils.copyInputStreamToFile(inputStr, destFile);
} catch (IOException e) {
//log error
}
}
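If pulling in commons-io only for the copy is undesirable, java.nio can do the same thing. The following is a sketch, assuming the PrimeFaces UploadedFile from the handler above and a made-up target directory C:\uploads that already exists and is writable.
// Sketch: stream the uploaded content straight to disk with java.nio,
// replacing any existing file with the same name.
public void saveToDirectory(UploadedFile uploadedFile) {
    // "C:\\uploads" is a placeholder directory, not part of the original question
    java.nio.file.Path target = java.nio.file.Paths.get("C:\\uploads", uploadedFile.getFileName());
    try (InputStream in = uploadedFile.getInputstream()) {
        java.nio.file.Files.copy(in, target, java.nio.file.StandardCopyOption.REPLACE_EXISTING);
    } catch (IOException e) {
        e.printStackTrace(); // log properly in real code
    }
}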

Managing trace files on SQL Server 2005

I need to manage the trace files for a database on SQL Server 2005 Express Edition. C2 audit logging is turned on for the database, and the files it creates are eating up a lot of space.
Can this be done from within Sql Server, or do I need to write a service to monitor these files and take the appropriate actions?
I found the [master].[sys].[traces] view with the trace file properties. Does anyone know the meaning of the columns in it?
Here's what I came up with; it works pretty well as a console application:
static void Main(string[] args)
{
try
{
Console.WriteLine("CcmLogManager v1.0");
Console.WriteLine();
// How long should we keep the files around (in months)? 12 is the PCI requirement.
var months = Convert.ToInt32(ConfigurationManager.AppSettings.Get("RemoveMonths") ?? "12");
var currentFilePath = GetCurrentAuditFilePath();
Console.WriteLine("Path: {0}", new FileInfo(currentFilePath).DirectoryName);
Console.WriteLine();
Console.WriteLine("------- Removing Files --------------------");
var fileInfo = new FileInfo(currentFilePath);
if (fileInfo.DirectoryName != null)
{
var purgeBefore = DateTime.Now.AddMonths(-months);
var files = Directory.GetFiles(fileInfo.DirectoryName, "audittrace*.trc.zip");
foreach (var file in files)
{
try
{
var fi = new FileInfo(file);
if (PurgeLogFile(fi, purgeBefore))
{
Console.WriteLine("Deleting: {0}", fi.Name);
try
{
fi.Delete();
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
}
Console.WriteLine("------- Files Removed ---------------------");
Console.WriteLine();
Console.WriteLine("------- Compressing Files -----------------");
if (fileInfo.DirectoryName != null)
{
var files = Directory.GetFiles(fileInfo.DirectoryName, "audittrace*.trc");
foreach (var file in files)
{
// Don't attempt to compress the current log file.
if (file.ToLower() == fileInfo.FullName.ToLower())
continue;
var zipFileName = file + ".zip";
var fi = new FileInfo(file);
var zipEntryName = fi.Name;
Console.WriteLine("Zipping: \"{0}\"", fi.Name);
try
{
using (var fileStream = File.Create(zipFileName))
{
var zipFile = new ZipOutputStream(fileStream);
zipFile.SetLevel(9);
var zipEntry = new ZipEntry(zipEntryName);
zipFile.PutNextEntry(zipEntry);
using (var ostream = File.OpenRead(file))
{
int bytesRead;
var obuffer = new byte[2048];
while ((bytesRead = ostream.Read(obuffer, 0, 2048)) > 0)
zipFile.Write(obuffer, 0, bytesRead);
}
zipFile.Finish();
zipFile.Close();
}
fi.Delete();
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
}
}
Console.WriteLine("------- Files Compressed ------------------");
Console.WriteLine();
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
Console.WriteLine("Press any key...");
Console.ReadKey();
}
public static bool PurgeLogFile(FileInfo fi, DateTime purgeBefore)
{
try
{
var filename = fi.Name;
if (filename.StartsWith("audittrace"))
{
filename = filename.Substring(10, 8);
var year = Convert.ToInt32(filename.Substring(0, 4));
var month = Convert.ToInt32(filename.Substring(4, 2));
var day = Convert.ToInt32(filename.Substring(6, 2));
var logDate = new DateTime(year, month, day);
return logDate.Date <= purgeBefore.Date;
}
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
return false;
}
public static string GetCurrentAuditFilePath()
{
const string connStr = "Data Source=.\\SERVER;Persist Security Info=True;User ID=;Password=";
var dt = new DataTable();
var adapter =
new SqlDataAdapter(
"SELECT path FROM [master].[sys].[traces] WHERE path like '%audittrace%'", connStr);
try
{
adapter.Fill(dt);
if (dt.Rows.Count >= 1)
{
if (dt.Rows.Count > 1)
Console.WriteLine("More than one audit trace file defined! Count: {0}", dt.Rows.Count);
var path = dt.Rows[0]["path"].ToString();
return path.StartsWith("\\\\?\\") ? path.Substring(4) : path;
}
}
catch (Exception ex)
{
Console.WriteLine(ex);
}
throw new Exception("No Audit Trace File in sys.traces!");
}
You can also set up SQL Trace to log to a SQL table. Then you can set up a SQL Agent task to auto-truncate records.
sys.traces has a record for every trace started on the server. Since SQL Express does not have SQL Agent and cannot set up jobs, you'll need an external process or service to monitor these. You'll have to roll your own everything (monitoring, archiving, trace retention policy, etc.). If you have C2 auditing in place, I assume you have policies that determine how long the audit data has to be retained.