I have searched high and low and cannot find a way of doing this. I am writing a program that will run at logon and delete a directory within another directory. Our company has a software application that contains a directory that sometimes becomes corrupted. The directory's name consists of some static words followed by a randomly generated set of characters, hence the need to search for the static words and delete any directory whose name contains them. This is kicking my butt. Thanks for any help!
Edit:
My apologies for not adding some or all of the code that I've written thus far. I can delete the static directories, but not the dynamic ones. Again, I'm teaching myself and I'm sure there are better ways of doing what I need, but I don't know them. I'm also relatively certain that my code is messy. I would love some constructive criticism, but please don't bash me for trying. Please see below. Thanks!
Imports System.IO

Module Module1
    Public Sub Main()
        'I'm wanting to see the user name output in the console
        Dim user As String
        user = Environment.UserName
        Console.Write(user)
        'new line
        Console.WriteLine()
        Dim path1 As String
        path1 = "\appdata\local\DIRECTORY\APPLICATIONNAME.exe_Url_ny2thmvtmqmw4jiqk1yuytwfbddruu02"
        Dim path2 As String
        path2 = "\appdata\local\DIRECTORY\APPLICATIONNAME.exe_Url_r3joylqll52q54guz0002pxu4swqous0"
        Dim fullpath As String
        fullpath = "C:\Users\" & user & path1
        Dim fullpath2 As String
        fullpath2 = "C:\Users\" & user & path2
        Dim toplevel As String
        toplevel = "\appdata\local\APPLICATIONNAME\"
        Dim toplevel1 As String
        toplevel1 = "C:\Users\" & user & toplevel
        If Directory.Exists(fullpath) Then
            Directory.Delete(fullpath, True)
        ElseIf Directory.Exists(fullpath2) Then
            Directory.Delete(fullpath2, True)
        End If
        'I would like to keep the window open until I work the kinks out
        Console.WriteLine("Finished. You may now close this window.")
        Console.ReadKey()
    End Sub
End Module
This should do what you need. I've included parameter names to make it more readable. You can strip them out if you prefer the more concise approach...
' Will find all directories under C:\Root\Folder\
' (including subdirectories) with a name that starts
' with "SearchString", then delete them and their contents
System.IO.Directory.GetDirectories(
    path:="C:\Root\Folder\",
    searchPattern:="SearchString*",
    searchOption:=System.IO.SearchOption.AllDirectories).
    ToList().
    ForEach(Sub(x) System.IO.Directory.Delete(path:=x, recursive:=True))
That said, this is simply joining together two tasks:
Finding a list of directories
Deleting each one in turn
There are many tutorials and examples on the internet (and numerous questions on Stack Overflow) relating to these topics.
Edit: The concise version
Imports System.IO
Directory.GetDirectories("C:\Root\Folder\", "SearchString*", SearchOption.AllDirectories).
ToList().ForEach(Sub(x) Directory.Delete(x, True))
Try using something like the code below, which deletes every folder whose name contains one of the patterns listed in the string array.
Dim Words() As String = {"Word1", "Word3", "Word4"} '...through "Wordn": add as many patterns as needed
For Each iPatternWord As String In Words
    For Each iDir As System.IO.DirectoryInfo In New System.IO.DirectoryInfo("C:\").GetDirectories("*" & iPatternWord & "*")
        iDir.Delete(True) 'delete this folder and its contents
    Next
Next
I'm trying to improve a system I have that already works. However, I've run into a problem I can't seem to find an answer to using Google; perhaps I'm not searching for the right questions?
I'm using a For Each loop to gather files in a directory. I then try to get each file name and some other information and determine if I should move the file temporarily or delete the file.
Example of the code used:
For Each File In Directory.GetFiles(Profile)
    Dim tmpFile As System.IO.FileInfo = My.Computer.FileSystem.GetFileInfo(File) 'Never moves to the next file.
    Dim Name As String = tmpFile.Name
    Dim AccessDate As String = tmpFile.LastWriteTime.Date()
    Dim CurrentDate As String = My.Computer.Clock.LocalTime.Date()
    If AccessDate < CurrentDate Then
        My.Computer.FileSystem.DeleteFile(File) 'Moves to the next file without any issues.
        Threading.Thread.Sleep(150) 'If no sleep, program will lock up
    Else
        If Not Directory.Exists(Path) Then
            Directory.CreateDirectory(Path)
            My.Computer.FileSystem.MoveFile(File, Path & Name)
        Else
            My.Computer.FileSystem.MoveFile(File, Path & Name)
        End If
    End If
Next
I have noted in the code where my problem is. Forgive me if there is a better way to do this; I'm self-taught and still learning many things. If there's an easier way, please point me in that direction.
Essentially I hit an error saying "File not found" on the move because tmpFile is not moving to the next File in the directory.
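For what it's worth, two things in the snippet above are likely suspects: AccessDate and CurrentDate are declared As String, so the < comparison compares text rather than dates, and Path & Name produces a bad target path if Path lacks a trailing backslash. A minimal sketch of a restructured loop under those assumptions (destDir is a hypothetical stand-in for Path; Profile is the same source folder as in the question, with the question's Imports System.IO context assumed):
Dim destDir As String = "C:\Temp\Hold" 'hypothetical destination folder
For Each fileName As String In Directory.GetFiles(Profile)
    Dim info As New FileInfo(fileName)
    If info.LastWriteTime.Date < Date.Today Then
        info.Delete()
    Else
        Directory.CreateDirectory(destDir) 'does nothing if the folder already exists
        info.MoveTo(System.IO.Path.Combine(destDir, info.Name))
    End If
Next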
I need to find an Excel file. However, the extension of the file I'm looking for could be .xls or .xlsx. I was considering using FileExists, but I can't use a wildcard with that. Here's my attempt at using GetFiles; however, the .xls* part of my code does not work. I've never used GetFiles before, so can anyone give me some guidance on what I'm doing wrong?
Dim InputFormPath As String = "W:\TOM\ERIC\NET Dev\"
Dim wbNameXLSInputForm As String = StatVar.xlApp.Sheets("New Calculator Input").Range("D15").Text & ".xls*"
Dim XLSInputForm As String = wbNameXLSInputForm
Dim dirs As String() = Directory.GetFiles(InputFormPath, wbNameXLSInputForm)
If dirs.Length <> 0 Then
    'do something
End If
Take a look at this documentation. It says the following about the behavior of different lengths for the searchPattern parameter: "*.abc" returns files having an extension of .abc, .abcd, .abcde, .abcdef, and so on.
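That behavior cuts both ways: a pattern ending in ".xls*" will match .xls and .xlsx, but can also match .xlsm or .xlsb. If that matters, here is a minimal sketch of filtering the matches down to exactly those two extensions (variable names reused from the question; assumes Imports System.IO and Imports System.Linq):
Dim matches As String() = Directory.GetFiles(InputFormPath, wbNameXLSInputForm).
    Where(Function(f) f.EndsWith(".xls", StringComparison.OrdinalIgnoreCase) OrElse _
        f.EndsWith(".xlsx", StringComparison.OrdinalIgnoreCase)).
    ToArray()
If matches.Length <> 0 Then
    'the workbook exists; matches(0) is its full path
End If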
I'm trying to import a tab-delimited file into a table.
The issue is that SOMETIMES the file will include an awkward record with two "null values", which causes my program to throw an "unexpected end of file" error.
For example, each record should have 20 fields, but the last record will have only two fields (two null values); hence the unexpected EOF.
Currently I'm using a StreamReader.
I've tried counting the lines and telling bcp to stop reading before the "phantom nulls", but StreamReader gets an incorrect count of lines due to the "phantom nulls".
I've tried the following code (borrowed off the net) to get rid of the bogus rows, but it just replaces the fields with empty spaces (I'd like no line left behind).
Public Sub RemoveBlankRowsFromCVSFile2(ByVal filepath As String)
    If filepath = DBNull.Value.ToString() Or filepath.Length = 0 Then Throw New ArgumentNullException("filepath")
    If Not File.Exists(filepath) Then Throw New FileNotFoundException("Could not find CSV file.", filepath)

    Dim tempFile As String = Path.GetTempFileName()

    Using reader As New StreamReader(filepath)
        Using writer As New StreamWriter(tempFile)
            Dim line As String = reader.ReadLine()
            While line IsNot Nothing
                If Not line.Equals(" ") Then writer.WriteLine(line)
                line = reader.ReadLine()
            End While
        End Using
    End Using

    File.Delete(filepath)
    File.Move(tempFile, filepath)
End Sub
I've tried using SSIS, but it hits the same unexpected EOF error.
What am I doing wrong?
If you read the entire file into a string variable (using reader.ReadToEnd()), do you get the whole thing, or just the data up to those phantom nulls?
Have you tried using the Reader.ReadBlock() function to try and read past the file length?
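For reference, a minimal sketch of what a ReadBlock probe might look like (the path and buffer size are assumptions; requires Imports System.IO):
Using reader As New StreamReader("c:\tmp.csv")
    Dim buffer(4095) As Char
    Dim count As Integer = reader.ReadBlock(buffer, 0, buffer.Length)
    While count > 0
        'scan buffer(0 .. count - 1) here, e.g. for embedded Chr(0) characters
        count = reader.ReadBlock(buffer, 0, buffer.Length)
    End While
End Using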
At our company we do hundreds of imports every week. If a file is not sent in the correct, agreed-upon format for our automated process, we return it to the sender. If the last line is wrong, the file should not be processed, because it might be missing information or be corrupt in some other way.
One way to avoid the error is to use ReadAllLines, then process the array of file lines instead of progressing through the file. This is also a lot more efficient than a StreamReader.
Dim fileLines() As String
fileLines = File.ReadAllLines("c:\tmp.csv")
'...
For Each line In fileLines
    If Trim(line) <> "" Then writer.WriteLine(line)
Next line
You can also save the output lines in the same (or a different) string array and use File.WriteAllLines to write the file all at once.
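A minimal sketch of that whole-file variant, reusing the hypothetical c:\tmp.csv path from above (assumes Imports System.IO and Imports System.Linq):
Dim kept As String() = File.ReadAllLines("c:\tmp.csv").
    Where(Function(line) line.Trim() <> "").
    ToArray()
'rewrite the file in one call, with blank and whitespace-only lines dropped
File.WriteAllLines("c:\tmp.csv", kept)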
You could try the built-in .NET class for reading tab-delimited files: Microsoft.VisualBasic.FileIO.TextFieldParser.
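A minimal sketch of TextFieldParser on a tab-delimited file (the path and the 20-field record length are taken from the question and are assumptions):
Imports Microsoft.VisualBasic.FileIO

Using parser As New TextFieldParser("c:\tmp.csv")
    parser.TextFieldType = FieldType.Delimited
    parser.SetDelimiters(vbTab)
    While Not parser.EndOfData
        Dim fields() As String = parser.ReadFields()
        If fields.Length = 20 Then
            'process a complete record; short "phantom null" rows are skipped
        End If
    End While
End Using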
This was solved using a bit array, checking one bit at a time for the suspect bit.
I believe I have come up with a very efficient way to read very, very large files line-by-line. Please tell me if you know of a better/faster way or see room for improvement. I am trying to get better at coding, so any sort of advice you have would be nice. Hopefully this is something that other people might find useful, too.
From my tests, it appears to be something like 8 times faster than using Line Input.
'This function reads a file into a string.
'I found this in the book Programming Excel with VBA and .NET.
Public Function QuickRead(FName As String) As String
    Dim I As Integer
    Dim res As String
    Dim l As Long

    I = FreeFile
    l = FileLen(FName)
    res = Space(l)
    Open FName For Binary Access Read As #I
    Get #I, , res
    Close I
    QuickRead = res
End Function
'This procedure works like the Line Input statement
Public Sub QRLineInput( _
    ByRef strFileData As String, _
    ByRef lngFilePosition As Long, _
    ByRef strOutputString As String, _
    ByRef blnEOF As Boolean _
)
    On Error GoTo LastLine
    strOutputString = Mid$(strFileData, lngFilePosition, _
        InStr(lngFilePosition, strFileData, vbNewLine) - lngFilePosition)
    lngFilePosition = InStr(lngFilePosition, strFileData, vbNewLine) + 2
    Exit Sub
LastLine:
    blnEOF = True
End Sub
Sub Test()
    Dim strFilePathName As String: strFilePathName = "C:\Fld\File.txt"
    Dim strFile As String
    Dim lngPos As Long
    Dim blnEOF As Boolean
    Dim strFileLine As String

    strFile = QuickRead(strFilePathName) & vbNewLine
    lngPos = 1
    Do Until blnEOF
        Call QRLineInput(strFile, lngPos, strFileLine, blnEOF)
    Loop
End Sub
Thanks for the advice!
My two cents…
Not long ago I needed to read large files using VBA and noticed this question. I tested three approaches to reading data from a file, comparing their speed and reliability across a wide range of file sizes and line lengths. The approaches are:
Line Input VBA statement
Using the File System Object (FSO)
Using the Get VBA statement to read the whole file at once and then parsing the resulting string, as described in posts here
Each test case consists of three steps:
Test-case setup that writes a text file containing a given number of lines, each of the same given length and filled with a known character pattern.
Integrity test. Read each file line and verify its length and contents.
File-read speed test. Read each line of the file, repeated 10 times.
As you can see, Step #3 measures the true file-read speed (as asked in the question), while Step #2 verifies file-read integrity and therefore simulates real conditions where string parsing is needed.
The following chart shows the test results for the file-read speed test. The file size is 64 MB for all tests, and the tests differ in line length, which varies from 2 bytes (not including CRLF) to 8 MB.
CONCLUSION:
All three methods are reliable for large files with normal and abnormal line lengths (compare to Graeme Howard's answer)
All three methods produce almost equivalent file-reading speed for normal line lengths
The "superfast way" (Method #3) works fine for extremely long lines, while the other two don't
All of this applies across different Office versions, different PCs, and both VBA and VB6
You can use Scripting.FileSystemObject to do this.
From the Reference:
The ReadLine method allows a script to read individual lines in a text file. To use this method, open the text file, and then set up a Do Loop that continues until the AtEndOfStream property is True. (This simply means that you have reached the end of the file.) Within the Do Loop, call the ReadLine method, store the contents of the first line in a variable, and then perform some action. When the script loops around, it will automatically drop down a line and read the second line of the file into the variable. This will continue until each line has been read (or until the script specifically exits the loop).
And a quick example:
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile("C:\FSO\ServerList.txt", 1)
Do Until objFile.AtEndOfStream
    strLine = objFile.ReadLine
    MsgBox strLine
Loop
objFile.Close
Line Input works fine for small files. However, when file sizes reach around 90k, Line Input jumps all over the place and reads data in the wrong order from the source file.
I tested it with different file sizes:
49k = ok
60k = ok
78k = ok
85k = ok
93k = error
101k = error
127k = error
156k = error
Lesson learned - use Scripting.FileSystemObject
With that code you load the file into memory (as a big string) and then read that string line by line.
By using Mid$() and InStr() you actually read the "file" twice, but since it's in memory there is no problem.
I don't know if VB's String type has a length limit (probably not), but if the text files are hundreds of megabytes in size, you're likely to see a performance drop due to virtual memory usage.
I would think that in a large-file scenario, using a stream would be far more efficient, because memory consumption would be very small.
But your algorithm could alternate between using a stream and loading the entire thing in memory based on the file size. I wouldn't be surprised if one is only better than the other under certain criteria.
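For what it's worth, a minimal sketch of that size-based switch (the 10 MB threshold is an arbitrary assumption; QuickRead refers to the function posted above):
Sub ReadFileAdaptive(FName As String)
    Const SIZE_LIMIT As Long = 10485760 '10 MB, chosen arbitrarily
    If FileLen(FName) <= SIZE_LIMIT Then
        'small file: pull it all into memory and parse the string
        Dim strFile As String
        strFile = QuickRead(FName)
    Else
        'large file: stream it one line at a time
        Dim f As Integer
        Dim strLine As String
        f = FreeFile
        Open FName For Input As #f
        Do While Not EOF(f)
            Line Input #f, strLine
            'process strLine here
        Loop
        Close #f
    End If
End Sub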
You can modify the above to read the full file in one go and then display each line, as shown below.
Option Explicit

Public Function QuickRead(FName As String) As Variant
    Dim i As Integer
    Dim res As String
    Dim l As Long

    i = FreeFile
    l = FileLen(FName)
    res = Space(l)
    Open FName For Binary Access Read As #i
    Get #i, , res
    Close i
    'split the file on vbCrLf
    QuickRead = Split(res, vbCrLf)
End Function
Sub Test()
    'replace "C:\writename.txt" with any file name you desire
    Dim strFilePathName As String: strFilePathName = "C:\writename.txt"
    Dim v As Variant
    Dim i As Long

    v = QuickRead(strFilePathName)
    For i = 0 To UBound(v)
        MsgBox v(i)
    Next
End Sub
My take on it: obviously, you've got to do something with the data you read in. If it involves writing it to the sheet, that'll be deadly slow with a normal For loop. I came up with the following based upon a rehash of some of the items here, plus some help from the Chip Pearson website.
Reading in the text file (assuming you don't know the length of the range it will create, so only the startCell is given):
Public Sub ReadInPlainText(startCell As Range, Optional textfilename As Variant)
    If IsMissing(textfilename) Then textfilename = Application.GetOpenFilename("All Files (*.*), *.*", , "Select Text File to Read")
    If textfilename = "" Then Exit Sub

    Dim filelength As Long
    Dim filenumber As Integer
    filenumber = FreeFile
    filelength = FileLen(textfilename)

    Dim text As String
    Dim textlines As Variant

    Open textfilename For Binary Access Read As filenumber
    text = Space(filelength)
    Get #filenumber, , text

    'split the file on vbCrLf
    textlines = Split(text, vbCrLf)

    'output to range
    Dim outputRange As Range
    Set outputRange = startCell
    Set outputRange = outputRange.Resize(UBound(textlines), 1)
    outputRange.Value = Application.Transpose(textlines)

    Close filenumber
End Sub
Conversely, if you need to write out a range to a text file, this does it quickly in one Print statement (note: the file Open mode here is text, not binary, unlike the read routine above).
Public Sub WriteRangeAsPlainText(ExportRange As Range, Optional textfilename As Variant)
    If IsMissing(textfilename) Then textfilename = Application.GetSaveAsFilename(FileFilter:="Text Files (*.txt), *.txt")
    If textfilename = "" Then Exit Sub

    Dim filenumber As Integer
    filenumber = FreeFile
    Open textfilename For Output As filenumber

    Dim textlines() As Variant, outputvar As Variant
    textlines = Application.Transpose(ExportRange.Value)
    outputvar = Join(textlines, vbCrLf)
    Print #filenumber, outputvar

    Close filenumber
End Sub
Be careful when using Application.Transpose with a huge number of values. If you transpose values into a column, Excel will assume you transposed them from a row.
Since the maximum column count is smaller than the maximum row count, it will only handle the first (maximum column count) values, and anything after that will be "N/A".
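If that limit is a concern, one workaround is to skip Transpose entirely and fill a two-dimensional N-by-1 array yourself. A sketch, reusing textlines and outputRange from the read routine above:
Dim buffer() As Variant
Dim r As Long
ReDim buffer(1 To UBound(textlines) + 1, 1 To 1)
For r = 1 To UBound(textlines) + 1
    buffer(r, 1) = textlines(r - 1) 'Split arrays are zero-based
Next r
outputRange.Resize(UBound(buffer, 1), 1).Value = buffer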
I just wanted to share some of my results...
I have text files, which apparently came from a Linux system, so I only have a vbLF/Chr(10) at the end of each line and not vbCR/Chr(13).
Note 1:
This meant that the Line Input method would read in the entire file, instead of just one line at a time.
From my tests of small (152KB) and large (2778KB) files, both on and off the network, I found the following:
Open FileName For Input: Line Input was the slowest (See Note 1 above)
Open FileName For Binary Access Read: Input was the fastest for reading the whole file
FSO.OpenTextFile: ReadLine was fast, but a bit slower than Binary Input
Note 2:
If I just needed to check the file header (first 1-2 lines) to see whether I had the proper file/format, then FSO.OpenTextFile was the fastest, followed very closely by Binary Input. The drawback with Binary Input is that you have to know how many characters you want to read. On normal files, Line Input would also be a good option, but I couldn't test it due to Note 1.
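A minimal sketch of that Binary-mode header check (the 100-character read length is an arbitrary assumption):
Function ReadHeader(FName As String) As String
    Dim f As Integer
    Dim header As String
    f = FreeFile
    header = Space$(100) 'Get reads exactly Len(header) characters, so size the string first
    Open FName For Binary Access Read As #f
    Get #f, 1, header
    Close #f
    ReadHeader = header
End Function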
Note 3:
Obviously, the files on the network showed the largest difference in read speed. They also showed the greatest benefit from reading the file a second time (although there are certainly memory buffers that come into play here).