Identify a file or directory on an SSH machine using the net.schmizz.sshj.sftp.SFTPClient API - ssh

I want to identify whether a given path is a valid path for a file or a directory using the net.schmizz.sshj.sftp.SFTPClient API, and based on that I need to decide: if it is a valid file path, I need to access its parent directory. My code looks like below:
SSHClient ssh = new SSHClient();
String rsaKey = "e3:27:12:a9:62:9a:46:cc:98:ee:0d:b7:38:72:a0:63";
String host = "10.235.1.154";
String uName = "root";
String pwd = "pspl#123";
String url = "/root/ram2.log/";
String testUrl = host + url;
ssh.addHostKeyVerifier(rsaKey);
List<FileItem> fileItems = new ArrayList<FileItem>();
1.try {
2. ssh.connect(host);
3. ssh.authPassword(uName,pwd);
4. SFTPClient sftp = ssh.newSFTPClient();
5.
6. if(testUrl.startsWith(host)){
7. String[] splitedStrings = testUrl.split(host);
8. String str = splitedStrings[1];
9. url = str;
10. }else{
11. url = url;
12. }
13.
14.
15. List<RemoteResourceInfo> fileInfoList = sftp.ls(url, new RemoteResourceFilter() {
16. public boolean accept(RemoteResourceInfo rrInfo) {
17. return rrInfo.getName().charAt(0) != '.';
18. }
19. });
20.
21.
22. for (RemoteResourceInfo fileInfo : fileInfoList) {
23. //files.add(str + "/" + fileInfo.getName());
24. String fileName = fileInfo.getName();
25. if (fileInfo.isDirectory()) {
FileItem childFileItem = new FileItem();
childFileItem.setPath(host + url + fileName);
fileItems.add(childFileItem);
} else {
int dotIndex = fileName.lastIndexOf('.');
String ext = dotIndex > 0 ? fileName.substring(dotIndex + 1) : "";
FileItem childFileItem = new FileItem();
childFileItem.setPath(host + url + fileName);
childFileItem.setDirectory(false);
fileItems.add(childFileItem);
}
}
} catch (IOException e) {
System.out.println("Couldn't resolve host : {} "+ host);
}
return fileItems;
Problem: Line no. 15 is throwing an error saying "no such file" if I give the path as "/root/ram2.log/", even though the file ram2.log does exist on the server.
Any help on this would be greatly appreciated.

You can use lstat to get information about file system objects.
FileAttributes attributes = sftp.lstat(url);
if (attributes.getType() == FileMode.Type.DIRECTORY) {
...
}
But I think your actual problem is that something is odd with the directory "/root/ram2.log/". Maybe you have no permission, maybe it's not visible to you, maybe it contains a file with a name that isn't encoded properly.
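For example, here is a minimal sketch of the check described in the question (it assumes an already-connected, authenticated SFTPClient named sftp; the path value and the parent-directory handling are illustrative only):
String path = "/root/ram2.log"; // note: no trailing slash when the target is a file
FileAttributes attrs = sftp.statExistence(path); // returns null if the path does not exist
if (attrs == null) {
    System.out.println("No such file or directory: " + path);
} else if (attrs.getType() == FileMode.Type.DIRECTORY) {
    // valid directory path: list it directly
    List<RemoteResourceInfo> entries = sftp.ls(path);
} else {
    // valid file path: derive the parent directory and list that instead
    String parent = path.substring(0, path.lastIndexOf('/'));
    if (parent.isEmpty()) {
        parent = "/";
    }
    List<RemoteResourceInfo> entries = sftp.ls(parent);
}
Also note that ls() expects a directory, so calling it on a file path (or on a file path with a trailing slash, as in "/root/ram2.log/") is a likely cause of the "no such file" error.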

Related

How to retain original Last Modified/Created date of the file on FTP when we try to download them using FTPClient Class and Selenium [duplicate]

I am using org.apache.commons.net.ftp.FTPClient for retrieving files from an FTP server. It is crucial that I preserve the last modified timestamp on the file when it's saved on my machine. Does anyone have a suggestion for how to solve this?
This is how I solved it:
public boolean retrieveFile(String path, String filename, long lastModified) throws IOException {
    File localFile = new File(path + "/" + filename);
    OutputStream outputStream = new FileOutputStream(localFile);
    boolean success = client.retrieveFile(filename, outputStream);
    outputStream.close();
    localFile.setLastModified(lastModified);
    return success;
}
I wish the Apache-team would implement this feature.
This is how you can use it:
List<FTPFile> ftpFiles = Arrays.asList(client.listFiles());
for (FTPFile file : ftpFiles) {
    retrieveFile("/tmp", file.getName(), file.getTimestamp().getTimeInMillis());
}
You can modify the timestamp after downloading the file.
The timestamp can be retrieved through the LIST command, or the (non-standard) MDTM command.
You can see here how to modify the timestamp: http://www.mkyong.com/java/how-to-change-the-file-last-modified-date-in-java/
When downloading a list of files, like all files returned by FTPClient.mlistDir or FTPClient.listFiles, use the timestamp returned with the listing to update the timestamp of the local downloaded files:
String remotePath = "/remote/path";
String localPath = "C:\\local\\path";
FTPFile[] remoteFiles = ftpClient.mlistDir(remotePath);
for (FTPFile remoteFile : remoteFiles) {
    File localFile = new File(localPath + "\\" + remoteFile.getName());
    OutputStream outputStream = new BufferedOutputStream(new FileOutputStream(localFile));
    if (ftpClient.retrieveFile(remotePath + "/" + remoteFile.getName(), outputStream))
    {
        System.out.println("File " + remoteFile.getName() + " downloaded successfully.");
    }
    outputStream.close();
    localFile.setLastModified(remoteFile.getTimestamp().getTimeInMillis());
}
When downloading a single specific file only, use FTPClient.mdtmFile to retrieve the remote file timestamp and update timestamp of the downloaded local file accordingly:
File localFile = new File("C:\\local\\path\\file.zip");
FTPFile remoteFile = ftpClient.mdtmFile("/remote/path/file.zip");
if (remoteFile != null)
{
OutputStream outputStream = new BufferedOutputStream(new FileOutputStream(localFile));
if (ftpClient.retrieveFile(remoteFile.getName(), outputStream))
{
System.out.println("File downloaded successfully.");
}
outputStream.close();
localFile.setLastModified(remoteFile.getTimestamp().getTimeInMillis());
}

Uploading file to Server using JSP

I was searching for a way to upload a file to the server and found the following code:
File file ;
int maxFileSize = 5000 * 1024;
int maxMemSize = 5000 * 1024;
ServletContext context = pageContext.getServletContext();
String filePath = context.getInitParameter("file-upload");
// Verify the content type
String contentType = request.getContentType();
if ((contentType.indexOf("multipart/form-data") >= 0)) {
DiskFileItemFactory factory = new DiskFileItemFactory();
// maximum size that will be stored in memory
factory.setSizeThreshold(maxMemSize);
// Location to save data that is larger than maxMemSize.
factory.setRepository(new File("/user2/tst/test"));
// Create a new file upload handler
ServletFileUpload upload = new ServletFileUpload(factory);
// maximum file size to be uploaded.
upload.setSizeMax( maxFileSize );
try{
// Parse the request to get file items.
List fileItems = upload.parseRequest(request);
// Process the uploaded file items
Iterator i = fileItems.iterator();
out.println("<html>");
out.println("<head>");
out.println("<title>JSP File upload</title>");
out.println("</head>");
out.println("<body>");
while ( i.hasNext () )
{
FileItem fi = (FileItem)i.next();
if ( !fi.isFormField () )
{
// Get the uploaded file parameters
String fieldName = fi.getFieldName();
String fileName = fi.getName();
boolean isInMemory = fi.isInMemory();
long sizeInBytes = fi.getSize();
// Write the file
if( fileName.lastIndexOf("\\") >= 0 ){
file = new File( filePath +
fileName.substring( fileName.lastIndexOf("\\"))) ;
}else{
file = new File( filePath +
fileName.substring(fileName.lastIndexOf("\\")+1)) ;
}
fi.write( file ) ;
out.println("Uploaded Filename: " + filePath +
fileName + "<br>");
}
}
out.println("</body>");
out.println("</html>");
}catch(Exception ex) {
System.out.println(ex);
}
}else{
out.println("<html>");
out.println("<head>");
out.println("<title>Servlet upload</title>");
out.println("</head>");
out.println("<body>");
out.println("<p>No file uploaded</p>");
out.println("</body>");
out.println("</html>");
}
This code is working perfectly fine. The only problem I am facing is that the uploaded file is getting stored in the apache/bin folder.
For this, it was specified to add the following to the web.xml that is available in ROOT/WEB-INF:
<context-param>
<description>Location to store uploaded file</description>
<param-name>file-upload</param-name>
<param-value>
/user2/tst
</param-value>
</context-param>
Even after this, the file is getting stored in the bin folder.
I need some help in this regard. Thanks in advance.
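A few things may be worth checking here (the sketch below is based on assumptions about the setup, not a verified fix): the <context-param> is only visible to the web application whose own WEB-INF/web.xml declares it (ROOT/WEB-INF/web.xml covers only the ROOT application), the value returned by getInitParameter() may still carry the whitespace and newlines surrounding it inside <param-value>, and a relative path resolves against the JVM's working directory, which for Tomcat started from its scripts is typically apache-tomcat/bin:
String filePath = context.getInitParameter("file-upload");
if (filePath == null) {
    // The param is not defined in this web app's web.xml.
    out.println("context-param 'file-upload' not found in this web application");
    return;
}
filePath = filePath.trim(); // strip the newlines/spaces that surround the value in web.xml
File uploadDir = new File(filePath);
if (!uploadDir.isAbsolute()) {
    // Relative paths end up under the server's working directory (often apache/bin).
    out.println("file-upload should be an absolute path, got: " + filePath);
    return;
}
if (!uploadDir.exists()) {
    uploadDir.mkdirs(); // create the upload directory if it does not exist yet
}
If the resolved path is absolute and trimmed, the fi.write(file) calls above should write into /user2/tst instead of the bin folder.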

Efficient Way to do batch import XMI in Enterprise Architect

Our team are using Enterprise Architect version 10 and SVN for the repository.
Because the EAP file size is quite big (e.g. 80 MB), we export each package into a separate XMI file and store it in SVN. The EAP file itself is committed after some milestone. The problem is that to synchronize the EAP file with work from a co-worker during development, we need to import lots of XMI files (e.g. 500 files in total).
I know that once the EAP file is updated, we can use Package Control -> Get All Latest. Therefore this problem occurs only during parallel development.
We have used keyboard shortcuts to do the import as follows:
Ctrl+Alt+I (Import package from XMI file)
Select the file name to import
Alt+I (Import)
Enter (Yes)
Repeat steps 2 to 4 until the module is finished
But still, importing hundreds of files is inefficient.
I've checked that Package Control has Batch Import/Export. The batch import/export works when I explicitly hard-code the XMI filename, but the options are not available when using version control (the batch import/export options are greyed out).
Is there any better ways to synchronize EAP and XMI files?
There is a scripting interface in EA. You might be able to automate the import using that. I've not used it, but it's probably quite good.
I'm not sure I fully understand your working environment, but I have some general points that may be of interest. If you use EA in a different way (especially my first point below), the need to batch import might go away.
Multiworker
First, multiple people can work on the same EAP file at a time. The EAP file is nothing more than an Access database file, and EA uses locking to stop multiple people editing the same package at the same time. But you can comfortably have multiple people editing different packages in one EAP file at the same time. Putting the EAP file on a file share somewhere is a good way of doing it.
Inbuilt Revision Control
Secondly, EA can interact directly with SVN (and other revision control systems). See this. In short, you can set up your EAP file so that individual packages (and everything below them) are SVN controlled. You can then check out an individual package, edit it, and check it back in. Or indeed you can check out the whole branch below a package (including sub-packages that are themselves SVN controlled).
Underneath the hood EA is importing and exporting XMI files and checking them in and out of SVN, whilst the EAP file is always the head revision. Just like what you're doing by hand, but automated. It makes sense given that you can all use the one single EAP file. You do have to be a bit careful rolling back - links originating from objects in older versions of one package might be pointing at objects that no longer exist (but you can look at the import log errors to see if this is the case). It takes a bit of getting used to, but it works pretty well.
There's also the built in package baselining functionality - that might be all you need anyway, and works quite well especially if you're all using the same EAP file.
Bigger Database Engine
Thirdly, you don't have to have an EAP file at all. The model's database can be in any suitable database system (MySQL, SQL Server, Oracle, etc.). That gives you all sorts of options for scaling up how it's used, what it's like over a WAN/Internet, and so on.
In short, Sparx have been quite sensible about how EA can be used in a multi-worker environment, and it's worth exploiting that.
I have created EA scripts using JScript to automate this.
Here is the script to do the export:
!INC Local Scripts.EAConstants-JScript
/*
* Script Name : Export List of SVN Packages
* Author : SDK
* Purpose : Export a package and all of its subpackages information related to version
* controlled. The exported file then can be used to automatically import
* the XMIs
* Date : 30 July 2013
* HOW TO USE : 1. Select the package that you would like to export in the Project Browser
* 2. Change the output filepath in this script if necessary.
* By default it is "D:\\EAOutput.txt"
* 3. Send the output file to your colleague who wanted to import the XMIs
*/
var f;
function main()
{
// UPDATE THE FOLLOWING OUTPUT FILE PATH IF NECESSARY
var filename = "D:\\EAOutput.txt";
var ForReading = 1, ForWriting = 2, ForAppending = 8;
Repository.EnsureOutputVisible( "Script" );
Repository.ClearOutput( "Script" );
Session.Output("Start generating output...please wait...");
var treeSelectedType = Repository.GetTreeSelectedItemType();
switch ( treeSelectedType )
{
case otPackage:
{
var fso = new ActiveXObject("Scripting.FileSystemObject");
f = fso.OpenTextFile(filename, ForWriting, true);
var selectedObject as EA.Package;
selectedObject = Repository.GetContextObject();
reportPackage(selectedObject);
loopChildPackages(selectedObject);
f.Close();
Session.Output( "Done! Check your output at " + filename);
break;
}
default:
{
Session.Prompt( "This script does not support items of this type.", promptOK );
}
}
}
function loopChildPackages(thePackage)
{
for (var j = 0 ; j < thePackage.Packages.Count; j++)
{
var child as EA.Package;
child = thePackage.Packages.GetAt(j);
reportPackage(child);
loopChildPackages(child);
}
}
function getParentPath(childPackage)
{
if (childPackage.ParentID != 0)
{
var parentPackage as EA.Package;
parentPackage = Repository.GetPackageByID(childPackage.ParentID);
return getParentPath(parentPackage) + "/" + parentPackage.Name;
}
return "";
}
function reportPackage(thePackage)
{
f.WriteLine("GUID=" + thePackage.PackageGUID + ";"
+ "NAME=" + thePackage.Name + ";"
+ "VCCFG=" + getVCCFG(thePackage) + ";"
+ "XML=" + thePackage.XMLPath + ";"
+ "PARENT=" + getParentPath(thePackage).substring(1) + ";"
);
}
function getVCCFG(thePackage)
{
if (thePackage.IsVersionControlled)
{
var array = new Array();
array = (thePackage.Flags).split(";");
for (var z = 0 ; z < array.length; z++)
{
var pos = array[z].indexOf('=');
if (pos > 0)
{
var key = array[z].substring(0, pos);
var value = array[z].substring(pos + 1);
if (key=="VCCFG")
{
return (value);
}
}
}
}
return "";
}
main();
And the script to do the import:
!INC Local Scripts.EAConstants-JScript
/*
* Script Name : Import List Of SVN Packages
* Author : SDK
* Purpose : Imports a package with all of its sub packages generated from
* "Export List Of SVN Packages" script
* Date : 01 Aug 2013
* HOW TO USE : 1. Get the output file generated by "Export List Of SVN Packages" script
* from your colleague
* 2. Get the XMIs in the SVN local copy
* 3. Change the path to the output file in this script if necessary (var filename).
* By default it is "D:\\EAOutput.txt"
* 4. Change the path to local SVN
* 5. Run the script
*/
var f;
var svnPath;
function main()
{
// CHANGE THE FOLLOWING TWO LINES ACCORDING TO YOUR INPUT AND LOCAL SVN COPY
var filename = "D:\\EAOutput.txt";
svnPath = "D:\\svn.xxx.com\\yyy\\docs\\design\\";
var ForReading = 1, ForWriting = 2, ForAppending = 8;
Repository.EnsureOutputVisible( "Script" );
Repository.ClearOutput( "Script" );
Session.Output("[INFO] Start importing packages from " + filename + ". Please wait...");
var fso = new ActiveXObject("Scripting.FileSystemObject");
f = fso.OpenTextFile(filename, ForReading);
// Read from the file and display the results.
while (!f.AtEndOfStream)
{
var r = f.ReadLine();
parseLine(r);
Session.Output("--------------------------------------------------------------------------------");
}
f.Close();
Session.Output("[INFO] Finished");
}
function parseLine(line)
{
Session.Output("[INFO] Parsing " + line);
var array = new Array();
array = (line).split(";");
var guid;
var name;
var isVersionControlled;
var xmlPath;
var parentPath;
isVersionControlled = false;
xmlPath = "";
for (var z = 0 ; z < array.length; z++)
{
var pos = array[z].indexOf('=');
if (pos > 0)
{
var key = array[z].substring(0, pos);
var value = array[z].substring(pos + 1);
if (key=="GUID") {
guid = value;
} else if (key=="NAME") {
name = value;
} else if (key=="VCCFG") {
if (value != "") {
isVersionControlled = true;
}
} else if (key=="XML") {
if (isVersionControlled) {
xmlPath = value;
}
} else if (key=="PARENT") {
parentPath = value;
}
}
}
// Quick check for target if already exist to speed up process
var targetPackage as EA.Package;
targetPackage = Repository.GetPackageByGuid(guid);
if (targetPackage != null)
{
// target exists, do not do anything
Session.Output("[DEBUG] Target package \"" + name + "\" already exist");
return;
}
var paths = (parentPath).split("/");
var packages = new Array(paths.length);
for (var i = 0; i < paths.length; i++)
{
packages[i] = null;
}
if (paths.length < 2)
{
Session.Output("[INFO] Skipped root or level1");
return;
}
packages[0] = selectRoot(paths[0]);
packages[1] = selectPackage(packages[0], paths[1]);
if (packages[1] == null)
{
Session.Output("[ERROR] Cannot find " + paths[0] + "/" + paths[1] + "in Project Browser");
return;
}
for (var j = 2; j < paths.length; j++)
{
packages[j] = selectPackage(packages[j - 1], paths[j]);
if (packages[j] == null)
{
Session.Output("[DEBUG] Creating " + packages[j].Name);
// create the parent package
var parent as EA.Package;
parent = Repository.GetPackageByGuid(packages[j-1].PackageGUID);
packages[j] = parent.Packages.AddNew(paths[j], "");
packages[j].Update();
parent.Update();
parent.Packages.Refresh();
break;
}
}
// Check if name (package to import) already exist or not
var targetPackage = selectPackage(packages[paths.length - 1], name);
if (targetPackage == null)
{
if (xmlPath == "")
{
Session.Output("[DEBUG] Creating " + name);
// The package is not SVN controlled
var newPackage as EA.Package;
newPackage = packages[paths.length - 1].Packages.AddNew(name,"");
Session.Output("New GUID = " + newPackage.PackageGUID);
newPackage.Update();
packages[paths.length - 1].Update();
packages[paths.length - 1].Packages.Refresh();
}
else
{
// The package is SVN controlled; import its XMI
Session.Output("[DEBUG] Need to import: " + svnPath + xmlPath);
var project as EA.Project;
project = Repository.GetProjectInterface;
var result;
Session.Output("GUID = " + packages[paths.length - 1].PackageGUID);
Session.Output("GUID XML = " + project.GUIDtoXML(packages[paths.length - 1].PackageGUID));
Session.Output("XMI file = " + svnPath + xmlPath);
result = project.ImportPackageXMI(project.GUIDtoXML(packages[paths.length - 1].PackageGUID), svnPath + xmlPath, 1, 0);
Session.Output(result);
packages[paths.length - 1].Update();
packages[paths.length - 1].Packages.Refresh();
}
}
else
{
// target exists, do not do anything
Session.Output("[DEBUG] Target package \"" + name + "\" already exist");
}
}
function selectPackage(thePackage, childName)
{
var childPackage as EA.Package;
childPackage = null;
if (thePackage == null)
return null;
for (var i = 0; i < thePackage.Packages.Count; i++)
{
childPackage = thePackage.Packages.GetAt(i);
if (childPackage.Name == childName)
{
Session.Output("[DEBUG] Found " + childName);
return childPackage;
}
}
Session.Output("[DEBUG] Cannot find " + childName);
return null;
}
function selectRoot(rootName)
{
for (var y = 0; y < Repository.Models.Count; y++)
{
root = Repository.Models.GetAt(y);
if (root.Name == rootName)
{
return root;
}
}
return null;
}
main();

How to take a log file backup automatically

How can I automatically back up a log file (.txt) when its size reaches a threshold level, say 5 MB? The backup file name should be like (log_file_name)_(system_date), and the original log file should be cleared (0 KB).
Please help. Thanks in advance.
Check your log file size using length(). Then, if it is bigger than 5 MB, call the extendLogFile() function.
This is C# code; you can easily convert it to Java.
Size check:
if (size > 5 * 1024 * 1024) // 5 MB threshold
{
    extendLogFile(Path);
}
Copy the old log file into an archive directory and create a new log file:
private static void extendLogFile(string lPath)
{
    string name = lPath.Substring(0, lPath.LastIndexOf("."));
    string UniquName = GenerateUniqueNameUsingDate(); // create a unique name for old log files like '12-04-2013-12-43-00'
    string ArchivePath = System.IO.Path.GetDirectoryName(lPath) + "\\Archive";
    if (!string.IsNullOrEmpty(ArchivePath) && !System.IO.Directory.Exists(ArchivePath))
    {
        System.IO.Directory.CreateDirectory(ArchivePath);
    }
    string newName = ArchivePath + "\\" + UniquName;
    if (!File.Exists(newName))
    {
        File.Copy(lPath, newName + ".txt");
        using (FileStream stream = new FileStream(lPath, FileMode.Create))
        using (TextWriter writer = new StreamWriter(stream))
        {
            writer.WriteLine("");
        }
    }
}
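Since the question asks for Java, here is a rough Java equivalent (a sketch only; the archive folder name, the date format, and the 5 MB threshold are assumptions you can adjust):
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.text.SimpleDateFormat;
import java.util.Date;

public class LogRotator {

    private static final long THRESHOLD = 5L * 1024 * 1024; // 5 MB

    // Copies the log to an "archive" folder as <name>_<date>.txt and empties the original.
    public static void rotateIfNeeded(String logPath) throws IOException {
        File logFile = new File(logPath);
        if (!logFile.exists() || logFile.length() <= THRESHOLD) {
            return; // below the threshold, nothing to do yet
        }
        String baseName = logFile.getName().replaceFirst("\\.txt$", "");
        String stamp = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss").format(new Date());
        String parentDir = (logFile.getParent() != null) ? logFile.getParent() : ".";
        Path archiveDir = Paths.get(parentDir, "archive");
        Files.createDirectories(archiveDir); // make sure the archive directory exists
        Path target = archiveDir.resolve(baseName + "_" + stamp + ".txt");
        Files.copy(logFile.toPath(), target, StandardCopyOption.REPLACE_EXISTING);
        Files.write(logFile.toPath(), new byte[0]); // truncate the original log back to 0 KB
    }
}
Call rotateIfNeeded(path) before each write, or from a scheduled task, so the check runs whenever the log can grow.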

ASP MVC 2 Uploading file to database (blob)

I am trying to upload a file via a form and then save it in SQL as a blob.
I already have my form working fine, my database is fully able to take the blob, and I have a controller that takes the file and saves it in a local directory:
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult FileUpload(int id, HttpPostedFileBase uploadFile)
{
//allowed types
string typesNonFormatted = "text/plain,application/msword,application/pdf,image/jpeg,image/png,image/gif";
string[] types = typesNonFormatted.Split(',');
//
//Starting security check
//checking file size
if (uploadFile.ContentLength == 0 || uploadFile.ContentLength > 10000000)
ViewData["StatusMsg"] = "Could not upload: File too big (max size 10mb) or error while transferring the file.";
//checking file type
else if(types.Contains(uploadFile.ContentType) == false)
ViewData["StatusMsg"] = "Could not upload: Illigal file type!<br/> Allowed types: images, Ms Word documents, PDF, plain text files.";
//Passed all security checks
else
{
string filePath = Path.Combine(HttpContext.Server.MapPath("../Uploads"),
Path.GetFileName(uploadFile.FileName)); //generating path
uploadFile.SaveAs(filePath); //saving file to final destination
ViewData["StatusMsg"] = "Uploaded: " + uploadFile.FileName + " (" + Convert.ToDecimal(uploadFile.ContentLength) / 1000 + " kb)";
//saving file to database
//
//MISSING
}
return View("FileUpload", null);
}
Now all I am missing is putting the file in the database. I could not find anything on the subject... I found a way to do it for a regular website but nothing for MVC 2.
Any kind of help would be welcome!
Thank you.
This could help: http://byatool.com/mvc/asp-net-mvc-upload-image-to-database-and-show-image-dynamically-using-a-view/
Since you have HttpPostedFileBase in your controller's method, all you need to do is:
int length = uploadFile.ContentLength;
byte[] tempImage = new byte[length];
myDBObject.ContentType = uploadFile.ContentType;
uploadFile.InputStream.Read(tempImage, 0, length);
myDBObject.ActualImage = tempImage ;
HttpPostedFileBase has an InputStream property.
Hope this helps.
Alright, thanks to kheit, I finally got it working. Here's the final solution; it might help someone out there.
This method takes all the files from a directory and uploads them to the database:
//upload all file from a directory to the database as blob
public void UploadFilesToDB(long UniqueId)
{
//directory path
string fileUnformatedPath = "../Uploads/" + UniqueId; //setting final path with unique id
//getting all files in directory ( if any)
string[] FileList = System.IO.Directory.GetFiles(HttpContext.Server.MapPath(fileUnformatedPath));
//for each file in directory
foreach (var file in FileList)
{
//extracting file from directory
System.IO.FileStream CurFile = System.IO.File.Open(file, System.IO.FileMode.Open);
long fileLength = CurFile.Length;
//converting file to a byte array (byte[])
byte[] tempFile = new byte[fileLength];
CurFile.Read(tempFile, 0, Convert.ToInt32(fileLength));
//creating new attachment
IW_Attachment CurAttachment = new IW_Attachment();
CurAttachment.attachment_blob = tempFile; //setting actual file
string[] filedirlist = CurFile.Name.Split('\\');//setting file name
CurAttachment.attachment_name = filedirlist.ElementAt(filedirlist.Count() - 1);//setting file name
//uploadind attachment to database
SubmissionRepository.CreateAttachment(CurAttachment);
//deleting current file from directory (close the stream before deleting it)
CurFile.Close();
System.IO.File.Delete(file);
}
//deleting directory , it should be empty by now
System.IO.Directory.Delete(HttpContext.Server.MapPath(fileUnformatedPath));
}
(By the way, IW_Attachment is the name of one of my database tables.)