Creating .doc on Google Drive - google-drive-android-api

My goal is to programmatically create .doc on Google Drive with text and generated shapes.
My first problem, which I cannot overcome, is that when I create a .doc file it is generated and saved on Google Drive, but it cannot be opened.
Previously, when I created plain text files, everything was OK.
private void createFile() {
final Task<DriveFolder> rootFolderTask = getDriveResourceClient().getRootFolder();
final Task<DriveContents> createContentsTask = getDriveResourceClient().createContents();
Tasks.whenAll(rootFolderTask, createContentsTask)
.continueWithTask(new Continuation<Void, Task<DriveFile>>() {
@Override
public Task<DriveFile> then(@NonNull Task<Void> task) throws Exception {
DriveFolder parent = rootFolderTask.getResult();
final Task<MetadataBuffer> Createlist = getDriveResourceClient().listChildren(parent);
DriveContents contents = createContentsTask.getResult();
OutputStream outputStream = contents.getOutputStream();
try (Writer writer = new OutputStreamWriter(outputStream)) {
writer.write("Witaj z nuclearhelperaa");
}
MetadataChangeSet changeSet = new MetadataChangeSet.Builder()
.setTitle("test6")
// .setMimeType("text/plain")
.setMimeType("application/msword")
.setStarred(true)
.build();
return getDriveResourceClient().createFile(parent, changeSet, contents);
}
})
.addOnSuccessListener(this,
new OnSuccessListener<DriveFile>() {
@Override
public void onSuccess(DriveFile driveFile) {
showMessage(getString(R.string.file_created,
driveFile.getDriveId().encodeToString()));
finish();
}
})
.addOnFailureListener(this, new OnFailureListener() {
@Override
public void onFailure(@NonNull Exception e) {
Log.e(TAG, "Unable to create file", e);
showMessage(getString(R.string.file_create_error));
finish();
}
});
// [END create_file]
}
Is my task possible? I mean, after creating a doc, is it possible to add some images, formatted text, and generated shapes (I need to create a very simple picture containing a couple of dots)? If yes, do you know any good libraries that would make this possible?
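The file most likely cannot be opened because the contents written above are plain text while the metadata declares application/msword; Word expects its own file format, not raw text. As a rough sketch (my own illustration, not from the original post), one option is to generate a real .docx with Apache POI's XWPF module and write it into the DriveContents stream before calling createFile; POI can also add images and simple drawings, which would cover the dots mentioned above. Note that using the full poi-ooxml module on Android may require an Android-compatible build of POI.
// Sketch, assuming the poi-ooxml dependency is available: build a valid .docx
// with Apache POI and stream it into the DriveContents instead of raw text.
DriveContents contents = createContentsTask.getResult();
try (XWPFDocument doc = new XWPFDocument();
        OutputStream outputStream = contents.getOutputStream()) {
    XWPFParagraph paragraph = doc.createParagraph();
    XWPFRun run = paragraph.createRun();
    run.setText("Witaj z nuclearhelperaa");
    doc.write(outputStream); // serializes a genuine OOXML Word document
}
MetadataChangeSet changeSet = new MetadataChangeSet.Builder()
        .setTitle("test6.docx")
        // .docx MIME type; "application/msword" is the legacy binary .doc format
        .setMimeType("application/vnd.openxmlformats-officedocument.wordprocessingml.document")
        .setStarred(true)
        .build();
return getDriveResourceClient().createFile(parent, changeSet, contents);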

Related

Is it possible to cancel a call to speakTextAsync?

I'm using the JavaScript SDK of the Microsoft Speech Synthesizer and calling speakTextAsync to convert text to speech.
This works perfectly, but sometimes the text is long and I want to be able to cancel it partway through; I cannot find any way to do this. The documentation doesn't seem to indicate any way to cancel. The name speakTextAsync suggests that it returns a Task that could be cancelled, but in fact the method returns undefined, and I can't find any other way to do it. How can this be done?
It seems there is no way to stop it while it is speaking. As a workaround, however, you can save the audio to a file and play the file yourself, so that you control everything. Try the code below:
import com.microsoft.cognitiveservices.speech.*;
import com.microsoft.cognitiveservices.speech.audio.AudioConfig;
import java.nio.file.*;
import java.io.*;
import javax.sound.sampled.*;
public class TextToSpeech {
public static void main(String[] args) {
try {
String speechSubscriptionKey = "key";
String serviceRegion = "location";
String audioTempPath = "d://test.wav"; //temp file location
SpeechConfig config = SpeechConfig.fromSubscription(speechSubscriptionKey, serviceRegion);
AudioConfig streamConfig = AudioConfig.fromWavFileOutput(audioTempPath);
SpeechSynthesizer synth = new SpeechSynthesizer(config, streamConfig);
String filePath = "....//test2.txt"; // .txt file for test with long text
Path path = Paths.get(filePath);
String text = Files.readString(path);
synth.SpeakText(text);
Thread thread = new Thread(new Speaker(audioTempPath));
thread.start();
System.out.println("play audio for 8s...");
Thread.sleep(8000);
System.out.println("stop play audio");
thread.stop();
} catch (Exception ex) {
System.out.println("Unexpected exception: " + ex);
assert (false);
System.exit(1);
}
}
}
class Speaker implements Runnable {
private String path;
public String getText(String path) {
return this.path;
}
public Speaker(String path) {
this.path = path;
}
public void run() {
try {
File file = new File(path);
AudioInputStream stream;
AudioFormat format;
DataLine.Info info;
Clip clip;
stream = AudioSystem.getAudioInputStream(file);
format = stream.getFormat();
info = new DataLine.Info(Clip.class, format);
clip = (Clip) AudioSystem.getLine(info);
clip.open(stream);
clip.start();
} catch (Exception e) {
System.out.println(e.getMessage());
}
}
}
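A note on the workaround above: Thread.stop() is deprecated and kills the playback thread abruptly. A gentler variant (my own sketch, not part of the original answer; the class and method names are illustrative) is to keep a reference to the Clip and call clip.stop(), which is the supported way to interrupt javax.sound.sampled playback:
import java.io.File;
import javax.sound.sampled.*;

// Illustrative helper: play a WAV file and allow it to be cancelled mid-way via Clip.stop().
class StoppablePlayer {
    private Clip clip;

    public void play(String path) throws Exception {
        AudioInputStream stream = AudioSystem.getAudioInputStream(new File(path));
        DataLine.Info info = new DataLine.Info(Clip.class, stream.getFormat());
        clip = (Clip) AudioSystem.getLine(info);
        clip.open(stream);
        clip.start(); // playback runs on the audio system's own thread
    }

    public void stop() {
        if (clip != null && clip.isRunning()) {
            clip.stop();  // cancels playback immediately
            clip.close();
        }
    }

    public static void main(String[] args) throws Exception {
        StoppablePlayer player = new StoppablePlayer();
        player.play("d://test.wav"); // same temp file path as in the answer above
        Thread.sleep(8000);
        player.stop();               // no Thread.stop() needed
    }
}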

Files uploaded but not appearing on server

I use the code described here to upload files through a Web API: http://bartwullems.blogspot.pe/2013/03/web-api-file-upload-set-filename.html. I also wrote the following API to list all the files I have:
[HttpPost]
[Route("sharepoint/imageBrowser/listFiles")]
[SharePointContextFilter]
public async Task<HttpResponseMessage> Read()
{
string pathImages = HttpContext.Current.Server.MapPath("~/Content/images");
DirectoryInfo d = new DirectoryInfo(pathImages);//Assuming Test is your Folder
FileInfo[] Files = d.GetFiles(); //Getting Text files
List<object> lst = new List<object>();
foreach (FileInfo f in Files)
{
lst.Add(new
{
name = f.Name,
type = "f",
size = f.Length
});
}
return Request.CreateResponse(HttpStatusCode.OK, lst);
}
When calling this API, all the uploaded files are listed. But when I go to Azure, I don't see any of them (Content.png is a file I manually uploaded to Azure).
Why are the files listed if they don't appear on Azure?
According to your description, I suggest you first use the Azure Kudu console to locate the right folder in the Azure web portal and check whether the image file is there.
Open the Kudu console:
In Kudu, click the Debug console and navigate to site\wwwroot\yourfilefolder.
If you find that your file still hasn't been uploaded successfully, there may be something wrong with your upload code. I suggest you try the code below.
Notice: You need to add an images folder inside the wwwroot folder.
{
public class UploadingController : ApiController
{
public async Task<HttpResponseMessage> PostFile()
{
// Check if the request contains multipart/form-data.
if (!Request.Content.IsMimeMultipartContent())
{
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}
string root = Environment.GetEnvironmentVariable("HOME").ToString() + "\\site\\wwwroot\\images";
//string root = HttpContext.Current.Server.MapPath("~/images");
var provider = new FilenameMultipartFormDataStreamProvider(root);
try
{
StringBuilder sb = new StringBuilder(); // Holds the response body
// Read the form data and return an async task.
await Request.Content.ReadAsMultipartAsync(provider);
// This illustrates how to get the form data.
foreach (var key in provider.FormData.AllKeys)
{
foreach (var val in provider.FormData.GetValues(key))
{
sb.Append(string.Format("{0}: {1}\n", key, val));
}
}
// This illustrates how to get the file names for uploaded files.
foreach (var file in provider.FileData)
{
FileInfo fileInfo = new FileInfo(file.LocalFileName);
sb.Append(string.Format("Uploaded file: {0} ({1} bytes)\n", fileInfo.Name, fileInfo.Length));
}
return new HttpResponseMessage()
{
Content = new StringContent(sb.ToString())
};
}
catch (System.Exception e)
{
return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, e);
}
}
}
public class FilenameMultipartFormDataStreamProvider : MultipartFormDataStreamProvider
{
public FilenameMultipartFormDataStreamProvider(string path) : base(path)
{
}
public override string GetLocalFileName(System.Net.Http.Headers.HttpContentHeaders headers)
{
var name = !string.IsNullOrWhiteSpace(headers.ContentDisposition.FileName) ? headers.ContentDisposition.FileName : Guid.NewGuid().ToString();
return name.Replace("\"", string.Empty);
}
}
}
Result:

Apache Lucene 4.3.1 - Index reader does not reach the last indexed document

In my app I have documents that represent my data for each category, and my application automatically indexes new and modified documents.
If I index all documents in one category, it works fine and retrieves correct results. The problem is that if I modify a document or create a new one, it will not be retrieved even if it matches my search query.
The search usually keeps returning all docs except the last modified one.
Any help, please?
I have this IndexWriter config :
private IndexWriter getIndexWriter() throws IOException {
Directory directory = FSDirectory.open(new File(filepath));
IndexWriterConfig config = new IndexWriterConfig(Version.LUCENE_43, IndexFactory.ANALYZER);
config.setRAMBufferSizeMB(350);
TieredMergePolicy tmp = new TieredMergePolicy();
tmp.setUseCompoundFile(false);
config.setMergePolicy(tmp);
ConcurrentMergeScheduler scheduler = (ConcurrentMergeScheduler) config.getMergeScheduler();
scheduler.setMaxThreadCount(2);
scheduler.setMaxMergeCount(20);
IndexWriter writer = new IndexWriter(directory, config);
writer.forceMerge(1);
return writer;
}
My Collector:
public void collect(int docNum) throws IOException {
try {
if ((getCount() == getMaxSearchLimit() + 1) && getMaxSearchResults() != null) {
setCounterExceededLimit(true);
return;
}
addDocKey();// method to add and render the matching docs by customize way
} catch(IOException exp) {
if (!getErrors().toArrayList(getApplication().getLocale()).contains(exp.getMessage())) {
getErrors().addError(exp.getMessage());
}
} catch (BusinessException bEx) {
if (!getErrors().containsError(bEx.getErrorNumber())) {
getErrors().addError(bEx);
}
} catch (CounterExceededLimitException counterEx) {
return;
}
}
@Override
public boolean acceptsDocsOutOfOrder() {
// TODO Auto-generated method stub
return true;
}
@Override
public void setNextReader(AtomicReaderContext context) throws IOException {
// TODO Auto-generated method stub
}
@Override
public void setScorer(Scorer scorer) throws IOException {
// TODO Auto-generated method stub
}
Actually, I have this business logic to save my doc; if the doc is saved successfully, I add it to the index process.
public boolean saveDocument(CategoryDocument doc) {
boolean saved = false;
// code to save my doc
if(saved) {
//add this document to the index process
IndexManager.getInstance().addToIndex(this);
}
return saved;
}
Then my index manager creates a new thread to handle indexing this doc.
Here is my process to index my data document:
private void processDocument(IndexDocument indexDoc, DocKey docKey, boolean addToIndex) throws SearchException, BusinessException {
CategorySetting catSetting = docKey.getCategorySetting();
Integer catID = catSetting.getID();
IndexManager manager = IndexManager.getInstance();
IndexWriter writer = null;
try {
//Delete the lock file in case previous index operation failed to delete it
File lockFile = new File(filepath, IndexWriter.WRITE_LOCK_NAME);
if (lockFile != null && lockFile.exists()) {
lockFile.delete();
}
if(!manager.isGlobalIndexingProcess(catID)) {
writer = getIndexWriter();
} else {
writer = manager.getGlobalIndexWriter(catID);
}
writer.forceMerge(1);
removeDocument(docKey, writer);
if (addToIndex) {
writer.addDocument(indexDoc.getLuceneIndexDoc());
}
} catch(IOException exp) {
throw new SearchException(exp.getMessage(), true);
} finally {
if(!manager.isGlobalIndexingProcess(catID)) {
if (writer != null) {
try {
writer.close(true);
} catch(IOException ex) {
throw new SearchException(ex);
}
}
}
}
}
Use Lucene search to search for the word or phrase that you edited in the document and let us know whether you get the correct hits or not. If you don't get any hits, then you are probably not indexing the edited or newly added documents.
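Another thing worth checking (an assumption on my part, not something stated in the question): in Lucene 4.x an IndexReader is a point-in-time snapshot, so a reader opened before the last document was added will never see it. After indexing, commit the writer and reopen the reader before searching. A minimal sketch:
import java.io.IOException;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.search.IndexSearcher;

// Sketch: 'writer' is the question's IndexWriter, 'reader' a previously opened DirectoryReader.
static DirectoryReader refreshReader(IndexWriter writer, DirectoryReader reader) throws IOException {
    writer.commit();                                               // make new/updated docs visible to fresh readers
    DirectoryReader newer = DirectoryReader.openIfChanged(reader); // null if nothing changed since 'reader' was opened
    if (newer == null) {
        return reader;
    }
    reader.close();
    return newer;                                                  // this reader includes the last indexed document
}

// Usage before each search:
// reader = refreshReader(writer, reader);
// IndexSearcher searcher = new IndexSearcher(reader);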

org.apache.commons.io.FileCleaningTracker does not delete temp files unless explicitly calling System.gc()?

I am working on an image upload feature for my web app, and I am having a strange issue with the FileCleaningTracker from Apache Commons FileUpload. I have an ImageUploadService with an instance variable FileCleaningTracker, and an upload method that creates an instance of DiskFileItemFactory which references the FileCleaningTracker. After the upload method completes successfully, I set the FileCleaningTracker of the DiskFileItemFactory to null, so I would expect the DiskFileItemFactory to be garbage collected, and the underlying subclass of PhantomReference in FileCleaningTracker to be notified and hence delete the temp file the DiskFileItemFactory created.
But that does not happen until I null the DiskFileItemFactory and call System.gc() at the end of the upload method (only nulling the DiskFileItemFactory does not help). This seems very strange to me. Here is my code:
@Override
public void upload(final HttpServletRequest request) {
ValidateUtils.checkNotNull(request, "upload request");
final File tmp = new File(this.tempFolder);
if (!tmp.exists()) {
tmp.mkdir();
}
DiskFileItemFactory fileItemFactory = new DiskFileItemFactory(this.sizeThreshold, tmp);
fileItemFactory.setFileCleaningTracker(this.fileCleaningTracker);
ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
List items;
try {
items = uploadHandler.parseRequest(request);
} catch (final FileUploadException e) {
throw new ImageUploadServiceException("Error parsing the http servlet request for image upload.", e);
}
final Iterator it = items.iterator();
while (it.hasNext()) {
final DiskFileItem item = (DiskFileItem) it.next();
if (item.isFormField()) {
// log message
} else {
final String fileName = item.getName();
final File destination = this.createFileForUpload(fileName, this.uploadFolder);
FileChannel outChannel;
try {
outChannel = new FileOutputStream(destination).getChannel();
} catch (final FileNotFoundException e) {
throw new ImageUploadServiceException(e);
}
FileChannel inChannel = null;
try {
inChannel = new FileInputStream(item.getStoreLocation()).getChannel();
outChannel.transferFrom(inChannel, 0, item.getSize());
} catch (final IOException e) {
throw new ImageUploadServiceException(String.format("Error uploading image to '%s/%s'.", this.uploadFolder, destination.getName()), e);
} finally {
IOUtils.closeChannel(inChannel);
IOUtils.closeChannel(outChannel);
}
}
}
fileItemFactory.setFileCleaningTracker(null);
}
With the above code, every upload creates a file in the temp folder, but the fileCleaningTracker does not remove it at the end, possibly because the DiskFileItemFactory instance is not garbage collected (I fail to see why it shouldn't be), or it has been GCed but the PhantomReference in fileCleaningTracker was not notified (how reliable is PhantomReference?).
I waited 10 minutes and the files are still there, so it shouldn't be because the GC has not run, and there are no exceptions.
Now if I add the following code, the temp files are removed every time after the upload:
fileItemFactory = null;
System.gc();
This looks very strange to me, as I would expect the fileItemFactory to be GCed without an explicit call to System.gc().
Any input will be appreciated.
Thank you.
I have the same problem. The temporary files are never removed even after server shutdown: the GC process had not been started, so FileCleaningTracker had no chance to get the tracked files to delete from the ReferenceQueue, and all the files remain on the hard drive.
Due to the specific behavior of my application I have to clean up after each upload (the files might be very big). Instead of using the standard org.apache.commons.io.FileCleaningTracker, I chose to override this class with my own implementation:
/**
* Cleaning tracker to clean files after each upload with special method invocation.
* Not thread safe and must be used with 1 factory = 1 thread policy.
*/
public class DeleteFilesOnEndUploadCleaningTracker extends FileCleaningTracker {
private List<String> filesToDelete = new ArrayList<>();
public void deleteTemporaryFiles() {
for (String file : filesToDelete) {
new File(file).delete();
}
filesToDelete.clear();
}
@Override
public synchronized void exitWhenFinished() {
deleteTemporaryFiles();
}
@Override
public int getTrackCount() {
return filesToDelete.size();
}
@Override
public void track(File file, Object marker) {
filesToDelete.add(file.getAbsolutePath());
}
@Override
public void track(File file, Object marker, FileDeleteStrategy deleteStrategy) {
filesToDelete.add(file.getAbsolutePath());
}
@Override
public void track(String path, Object marker) {
filesToDelete.add(path);
}
@Override
public void track(String path, Object marker, FileDeleteStrategy deleteStrategy) {
filesToDelete.add(path);
}
}
If this is the right approach for your case, just inject an instance of the class above into your DiskFileItemFactory:
DeleteFilesOnEndUploadCleaningTracker tracker = new DeleteFilesOnEndUploadCleaningTracker();
fileItemFactory.setFileCleaningTracker(tracker);
And don't forget to invoke the cleaning method after your work with uploaded items is done:
tracker.deleteTemporaryFiles();
Forgot to mention: I use commons-fileupload version 1.2.2 and commons-io version 1.3.2.
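To guarantee the cleanup runs even when processing the uploaded items throws, the call can sit in a finally block. A small usage sketch under the same assumptions as the answer above (commons-fileupload 1.2.2; sizeThreshold, tempDir, and request come from the surrounding upload method):
DeleteFilesOnEndUploadCleaningTracker tracker = new DeleteFilesOnEndUploadCleaningTracker();
DiskFileItemFactory fileItemFactory = new DiskFileItemFactory(sizeThreshold, tempDir);
fileItemFactory.setFileCleaningTracker(tracker);
ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
try {
    List items = uploadHandler.parseRequest(request); // raw List of FileItem, as in the question
    // ... copy each DiskFileItem to its destination exactly as in the upload() method above ...
} finally {
    tracker.deleteTemporaryFiles(); // temp files are removed even if parsing or copying throws
}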

Excel file upload using Apache Commons FileUpload

I am developing a testing automation tool on a Linux system. I don't have write permissions for the Tomcat directory, which is located on the server. I need to develop an application where we can select an Excel file so that its content is automatically stored in an already existing table.
For this purpose I have written a form to select a file, which is posted to a servlet, CommonsFileUploadServlet, where I store the uploaded file and then call the ReadExcelFile class, which reads the file path and creates a vector of the data in the file; this is used to store the data in the database.
My problem is that I am not able to store the uploaded file in the directory. Is it necessary to have write permissions for Tomcat to do this? Can I store the file on my system and pass the path to ReadExcelFile.class?
Please guide me.
My code is as follows:
Form in jsp
CommonsFileUploadServlet class code:
public void init(ServletConfig config) throws ServletException {
super.init(config);
}
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
PrintWriter out = response.getWriter();
response.setContentType("text/plain");
out.println("<h1>Servlet File Upload Example using Commons File Upload</h1>");
DiskFileItemFactory fileItemFactory = new DiskFileItemFactory ();
fileItemFactory.setSizeThreshold(1*1024*1024);
fileItemFactory.setRepository(new File("/home/example/Documents/Project/WEB-INF/tmp"));
ServletFileUpload uploadHandler = new ServletFileUpload(fileItemFactory);
try {
List items = uploadHandler.parseRequest(request);
Iterator itr = items.iterator();
while(itr.hasNext()) {
FileItem item = (FileItem) itr.next();
if(item.isFormField()) {
out.println("File Name = "+item.getFieldName()+", Value = "+item.getString());
} else {
out.println("Field Name = "+item.getFieldName()+
", File Name = "+item.getName()+
", Content type = "+item.getContentType()+
", File Size = "+item.getSize());
File file = new File("/",item.getName());
String realPath = getServletContext().getRealPath("/")+"/"+item.getName();
item.write(file);
ReadExcelFile ref= new ReadExcelFile();
String res=ref.insertReq(realPath,"1");
}
out.close();
}
}catch(FileUploadException ex) {
log("Error encountered while parsing the request",ex);
} catch(Exception ex) {
log("Error encountered while uploading file",ex);
}
}
}
ReadExcelFile code:
public static String insertReq(String fileName,String sno) {
//Read an Excel File and Store in a Vector
Vector dataHolder=readExcelFile(fileName,sno);
//store the data to database
storeCellDataToDatabase(dataHolder);
return "OK"; // placeholder: the method is declared to return String but the original snippet has no return statement
}
public static Vector readExcelFile(String fileName,String Sno)
{
/** --Define a Vector
--Holds Vectors Of Cells
*/
Vector cellVectorHolder = new Vector();
try{
/** Creating Input Stream**/
//InputStream myInput= ReadExcelFile.class.getResourceAsStream( fileName );
FileInputStream myInput = new FileInputStream(fileName);
/** Create a POIFSFileSystem object**/
POIFSFileSystem myFileSystem = new POIFSFileSystem(myInput);
/** Create a workbook using the File System**/
HSSFWorkbook myWorkBook = new HSSFWorkbook(myFileSystem);
int s=Integer.valueOf(Sno);
/** Get the first sheet from workbook**/
HSSFSheet mySheet = myWorkBook.getSheetAt(s);
/** We now need something to iterate through the cells.**/
Iterator rowIter = mySheet.rowIterator();
while(rowIter.hasNext())
{
HSSFRow myRow = (HSSFRow) rowIter.next();
Iterator cellIter = myRow.cellIterator();
Vector cellStoreVector=new Vector();
short minColIndex = myRow.getFirstCellNum();
short maxColIndex = myRow.getLastCellNum();
for(short colIndex = minColIndex; colIndex < maxColIndex; colIndex++)
{
HSSFCell myCell = myRow.getCell(colIndex);
if(myCell == null)
{
cellStoreVector.addElement(myCell);
}
else
{
cellStoreVector.addElement(myCell);
}
}
cellVectorHolder.addElement(cellStoreVector);
}
}catch (Exception e){e.printStackTrace(); }
return cellVectorHolder;
}
private static void storeCellDataToDatabase(Vector dataHolder)
{
Connection conn;
Statement stmt;
String query;
try
{
// get connection and declare statement
int z;
for (int i=1;i<dataHolder.size(); i++)
{
z=0;
Vector cellStoreVector=(Vector)dataHolder.elementAt(i);
String []stringCellValue=new String[10];
for (int j=0; j < cellStoreVector.size();j++,z++)
{
HSSFCell myCell = (HSSFCell)cellStoreVector.elementAt(j);
if(myCell==null)
stringCellValue[z]=" ";
else
stringCellValue[z] = myCell.toString();
}
try
{
//inserting into database
}
catch(Exception error)
{
String e="Error"+error;
System.out.println(e);
}
}
stmt.close();
conn.close();
System.out.println("success");
}
catch(Exception error)
{
String e="Error"+error;
System.out.println(e);
}
}
POI will happily open from any old InputStream; it needn't be a File one.
I'd suggest you look at the Commons FileUpload Streaming API and consider passing the Excel part straight to POI without touching the disk.
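For illustration, here is a rough sketch of that suggestion (my own code, assuming the commons-fileupload streaming API and POI HSSF are on the classpath): the servlet iterates over the multipart stream and hands each file item's InputStream straight to HSSFWorkbook, so nothing is ever written to the Tomcat directory.
import java.io.InputStream;
import java.util.Iterator;
import javax.servlet.http.HttpServletRequest;
import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

// Sketch: read the uploaded .xls directly from the request, no temp file on disk.
void readUploadedExcel(HttpServletRequest request) throws Exception {
    ServletFileUpload upload = new ServletFileUpload();      // no DiskFileItemFactory needed for streaming
    FileItemIterator iter = upload.getItemIterator(request);
    while (iter.hasNext()) {
        FileItemStream item = iter.next();
        if (!item.isFormField()) {
            InputStream in = item.openStream();
            try {
                HSSFWorkbook workbook = new HSSFWorkbook(in); // POI reads straight from the stream
                HSSFSheet sheet = workbook.getSheetAt(0);
                for (Iterator<?> rows = sheet.rowIterator(); rows.hasNext(); ) {
                    HSSFRow row = (HSSFRow) rows.next();
                    // ... build the same cell vectors / database inserts as ReadExcelFile does ...
                }
            } finally {
                in.close();
            }
        }
    }
}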