VBA macro create temporary file

I am using a macro which creates temporary docx files that are then assembled into one.
I then delete the temporary files.
These files still show up in the Recent Files list, even though they no longer exist.
How can I prevent these temp files from being recognized by Word as recent files?
Or is there a way to save the contents of the would-be temporary file in an array and then use that array to build the final file, so the temp file never actually exists?

The fifth parameter of Document.SaveAs is AddToRecentFiles. Set that to False.
https://msdn.microsoft.com/en-us/library/office/aa220734
Alternatively, you can create the temporary documents, combine them into the final one, and then close them without ever saving them, so no temp file is written to disk at all.
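
For example, here is a minimal sketch of the SaveAs approach; the temp path, the placeholder content, and the InsertFile assembly step are illustrations only, not your actual macro:

    Sub BuildFinalDocument()
        ' Minimal sketch: save a temporary part without adding it to Recent Files,
        ' pull its content into the final document, then delete it from disk.
        Dim tempDoc As Document
        Dim finalDoc As Document
        Dim tempPath As String

        tempPath = Environ$("TEMP") & "\part1.docx"   ' hypothetical temp location

        Set tempDoc = Documents.Add
        tempDoc.Range.Text = "Temporary content"

        ' AddToRecentFiles:=False keeps the file out of Word's Recent list.
        tempDoc.SaveAs FileName:=tempPath, _
                       FileFormat:=wdFormatXMLDocument, _
                       AddToRecentFiles:=False
        tempDoc.Close SaveChanges:=False

        Set finalDoc = Documents.Add
        finalDoc.Range.InsertFile FileName:=tempPath
        Kill tempPath   ' the temp file is gone and never appeared in Recent Files
    End Sub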

Related

how to read a tab delimited .txt file and insert into oracle table

I want to read a tab-delimited file using PL/SQL and insert the file data into a table.
A new file is generated every day.
I am not sure an external table will help here because the filename changes based on the date.
Filename: SPRReadResponse_YYYYMMDD.txt
Below is the sample file data.
An option that works on your own PC is SQL*Loader. As the file name changes every day, you'd use your operating system's batch script (on MS Windows, a .BAT file) to pass a different name when calling sqlldr (along with the control file).
An external table requires access to the database server and (at least) read privilege on the directory containing those .TXT files. Unless you're a DBA, you'll have to ask one to set up that environment. As for the changing file name, you could use ALTER TABLE ... LOCATION, which is rather inconvenient.
If you want full control, use UTL_FILE; yes, you still need access to that directory on the database server, but since you're writing a PL/SQL script, you can handle whatever you want, including the file name.
Or, a simpler option: first rename the input file to SPRReadResponse.txt, then load it and save yourself all that trouble.
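
A rough sketch of the UTL_FILE route; the directory object DATA_DIR, the target table SPR_READ_RESPONSE, and its three columns are assumptions standing in for your real names:

    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_line VARCHAR2(4000);
      l_name VARCHAR2(100) := 'SPRReadResponse_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.txt';
    BEGIN
      -- DATA_DIR is a directory object pointing at the folder on the DB server
      l_file := UTL_FILE.FOPEN('DATA_DIR', l_name, 'R');
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(l_file, l_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;   -- end of file
        END;
        -- split the tab-delimited line; assumes no empty fields in between
        INSERT INTO spr_read_response (col1, col2, col3)
        VALUES (REGEXP_SUBSTR(l_line, '[^' || CHR(9) || ']+', 1, 1),
                REGEXP_SUBSTR(l_line, '[^' || CHR(9) || ']+', 1, 2),
                REGEXP_SUBSTR(l_line, '[^' || CHR(9) || ']+', 1, 3));
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
      COMMIT;
    END;
    /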

Deleting NDF from file group completely?

I want a script that deletes an NDF file from a filegroup completely, without using the SHRINKFILE command.
A file can be removed from the database only if it is empty. Without SHRINKFILE, the implication is that the file must be the only file in a user-defined filegroup, and you must first drop or move all the objects (or partitions) from that filegroup to a different filegroup. The empty file can then be dropped with ALTER DATABASE ... REMOVE FILE.
It seems your objective is to delete data older than 6 months. It would be easier to just delete/truncate the data and not bother with files/filegroups at all.
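
A rough T-SQL sketch of that sequence; the database, table, index, file, and filegroup names below are all hypothetical:

    -- 1) Move the data off the filegroup, e.g. by rebuilding each table's
    --    clustered index onto another filegroup
    CREATE CLUSTERED INDEX IX_OldOrders_OrderId
        ON dbo.OldOrders (OrderId)
        WITH (DROP_EXISTING = ON)
        ON [PRIMARY];

    -- 2) Once the filegroup is empty, drop the file and then the filegroup itself
    ALTER DATABASE MyDb REMOVE FILE ArchiveData;
    ALTER DATABASE MyDb REMOVE FILEGROUP FG_ARCHIVE;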

When to delete a temp file?

Is there a standard or common practice for when to delete a temporary file?
I'm currently writing a script in which I'm using many of them, but when should I remove them?
Delete each file right after you've used it and won't need it again
Delete all temp files at the end of the script
I would give the temp files a dedicated directory or a specific file extension, and then delete them at the end of the script (everything inside the temp dir, or everything with that extension). I prefer doing it this way because:
1) If there are a lot of files, you might forget one if you delete them individually by name.
2) If you still need one of them later in the code, it's still there.
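
For instance, a short Python sketch of that pattern (the question doesn't name a language, so this is purely illustrative):

    import shutil
    import tempfile
    from pathlib import Path

    # One dedicated directory holds every scratch file the script creates.
    temp_dir = Path(tempfile.mkdtemp(prefix="myscript_"))
    try:
        for i in range(3):
            (temp_dir / f"part_{i}.txt").write_text(f"chunk {i}\n")

        # Any temp file can still be re-read later in the run...
        combined = "".join(p.read_text() for p in sorted(temp_dir.glob("part_*.txt")))
        print(combined)
    finally:
        # ...and one cleanup at the end removes them all, even forgotten ones.
        shutil.rmtree(temp_dir, ignore_errors=True)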

Dynamically populate external tables location

I'm trying to use Oracle external tables to load flat files into a database, but I'm having a bit of an issue with the LOCATION clause. The files we receive have several pieces of information appended to their names, including the date, so I was hoping to use wildcards in the LOCATION clause, but it doesn't look like I'm able to.
I think I'm right in assuming wildcards aren't allowed. Does anyone have a suggestion on how I can accomplish this without writing a large amount of code per external table?
Current thoughts:
The only way I can think of doing it at the moment is to have a shell watcher script and a parameter table. The user can specify the input directory, file mask, external table, etc. When a file is found in the directory, the shell script generates a list of files matching the file mask. For each file found, it issues an ALTER TABLE command to change the location on the given external table to that file and launches the rest of the PL/SQL associated with that file. This can be repeated for each file matching the mask. I guess the benefit of this is that I could also add the date to the end of the log and bad files after each run.
I'll post the solution I went with in the end, which appears to be the only way.
I have a file watcher that looks for files in a given input dir with a certain file mask. The lookup table also includes the name of the external table. I then simply issue an ALTER TABLE on the external table with the list of new file names.
For me this wasn't much of an issue, as I'm already using shell for most of the file watching and file manipulation. Hopefully this saves someone from searching for ages for a solution.
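
In other words, for each run the script ends up issuing something like this (the external table and file names are made up for illustration):

    -- Point the external table at the newly arrived file(s) before loading
    ALTER TABLE ext_daily_feed
      LOCATION ('daily_feed_20170101_0001.txt',
                'daily_feed_20170101_0002.txt');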

How to do loop in pentaho for getting file names?

I have 100 000 files.
I want to get the names of those files and put them in a database.
I have to do it like this:
get 10 file names;
update/insert the names into the database; and
move those 10 files to another directory;
and loop these three steps until no files are found.
Is this possible?
I'm attaching a working example (I tested it with ~400 text files on kettle 4.3.).
transformation.ktr
job.kjb
Both transformation and job contain detailed notes on what to set and where.
transformation.ktr reads the first 10 filenames from the given source folder and creates the destination filepath for the file move. It outputs the filenames to insert/update (I used a Dummy step as a placeholder) and uses "Copy rows to resultset" to output the needed source and destination paths for the file move.
job.kjb is where all the looping is done. It executes transformation.ktr (which does the insert/update for 10 files) and then moves those 10 files to the destination folder. After that, it checks whether there are any more files in the source folder. If there are, the process is repeated; if not, it declares success.
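
If it helps to see the loop logic outside Kettle, here is the same batch-of-10 flow as a plain Python sketch; the directories, database, and table are placeholders and not part of the attached example:

    import shutil
    import sqlite3
    from pathlib import Path

    SOURCE = Path("incoming")     # folder holding the 100 000 files
    DEST = Path("processed")
    BATCH = 10

    DEST.mkdir(exist_ok=True)
    conn = sqlite3.connect("files.db")
    conn.execute("CREATE TABLE IF NOT EXISTS file_names (name TEXT PRIMARY KEY)")

    while True:
        batch = sorted(SOURCE.iterdir())[:BATCH]          # "get 10 file names"
        if not batch:
            break                                         # nothing left: success
        conn.executemany(                                 # insert/update the names
            "INSERT OR REPLACE INTO file_names (name) VALUES (?)",
            [(f.name,) for f in batch],
        )
        conn.commit()
        for f in batch:                                   # move those 10 files
            shutil.move(str(f), str(DEST / f.name))

    conn.close()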