I'm working with ZIP Files and I'm using the following code:
Dim archive As ZipArchive = ZipFile.OpenRead(Filelocation)
Dim entry As ZipArchiveEntry
Dim counter As Integer = 0
Dim stopLoop As Boolean = True

Do While stopLoop
    entry = archive.Entries.Item(counter)
    If entry.FullName.Contains(".jpeg") OrElse entry.FullName.Contains(".jpg") OrElse entry.FullName.Contains(".png") Then
        stopLoop = False
    Else
        counter += 1
    End If
Loop
These zip files contain images and documents, but the images are in a specific order. The user can flip through the images, and sometimes my code works perfectly fine (images show 1 through 10), but other times the order is completely garbled (images show 2, 9, 3, 7, 8...). I understand that the entries are ordered by when they were added to the archive, but is there any way to sort them by FullName?
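Since the order of ZipArchive.Entries follows the order in which the entries were written to the archive, one option is to sort the image entries by name with LINQ before displaying them. A minimal sketch, using the extensions from the question (the archive path is assumed):

```vbnet
Imports System
Imports System.IO.Compression
Imports System.Linq

Module SortZipImages
    Sub Main()
        Dim exts = {".jpeg", ".jpg", ".png"}
        Using archive As ZipArchive = ZipFile.OpenRead("images.zip") ' path is an assumption
            Dim images = archive.Entries.
                Where(Function(e) exts.Any(Function(x) e.FullName.EndsWith(x, StringComparison.OrdinalIgnoreCase))).
                OrderBy(Function(e) e.FullName, StringComparer.OrdinalIgnoreCase).
                ToList()
            ' images(0), images(1), ... are now in name order regardless of archive order.
        End Using
    End Sub
End Module
```

Note that a plain string sort puts "10.jpg" before "2.jpg"; if the files are numbered without zero padding, either pad the names when creating the archive or parse the number out and sort numerically.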
I have a problem with resources. My project has 400 small images for a status indicator. The images are split into 4 groups (4 different folders), and the group is identified by the My.Settings.imagevalue parameter. The indicator can take a value from 1 to 100, and every new indicator value is displayed in MainForm.indication_Value.Text.
Because I have so many images, I would like to use DLLs with the images as an external resource. I have put all the images into 4 DLLs, 100 images in each one. At the top of the DLL structure is RCData, where all the images are keyed by value 1 to 100. I plan to use the same DLLs across a few apps seamlessly.
I use very simple code on text changes to change image:
Private Sub preview()
    Dim indication As String
    indication = MainForm.indication_Value.Text & ".jpg"
    Dim resource As String = IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, My.Settings.imagevalue, indication)
    If IO.File.Exists(resource) Then
        indication_preview.Image = Image.FromFile(resource)
    Else
        indication_preview.Image = My.Resources.Resources._error
    End If
End Sub
How can I use images from an external DLL with the logic I created before? I really don't want to write messy code for each image.
Thanks in advance for response!
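If the DLLs were built as .NET class libraries with the images added as embedded resources, they can be loaded by name at runtime, which keeps the existing lookup-by-value logic. A minimal sketch, assuming that setup (the DLL file name and the "IndicatorImages" resource prefix are made-up names; Assembly.GetManifestResourceNames() will show the real spelling):

```vbnet
Imports System.Reflection

Private Sub previewFromDll()
    ' Assumed layout: one DLL per group, named after My.Settings.imagevalue.
    Dim dllPath As String = IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory,
                                            My.Settings.imagevalue & ".dll")
    Dim asm As Assembly = Assembly.LoadFrom(dllPath)
    ' Embedded resource names look like "RootNamespace.42.jpg".
    Dim resName As String = "IndicatorImages." & MainForm.indication_Value.Text & ".jpg"
    Using s As IO.Stream = asm.GetManifestResourceStream(resName)
        If s IsNot Nothing Then
            Using tmp As Image = Image.FromStream(s)
                ' Copy into an independent Bitmap so the stream can be closed.
                indication_preview.Image = New Bitmap(tmp)
            End Using
        Else
            indication_preview.Image = My.Resources.Resources._error
        End If
    End Using
End Sub
```

If the DLLs are instead native resource DLLs with RCDATA entries (as "RCData" suggests), the .NET resource API above will not see them; reading those requires the Win32 LoadLibrary/FindResource/LoadResource functions via P/Invoke.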
I have a scanned-book review VB.Net app that displays two pages (TIF images) at a time and lets the user move forward or backward in the book. The books usually have between 100 and 600 pages and are stored on a network drive. It seems that the TIF images are being cached, because I can go backward in the book a lot faster than forward. The Task Manager also shows that the app uses more memory the further I go into the book.
Does the VB app automatically cache files that it reads and displays?
If so, how can I reduce the amount of time a file spends in cache?
Public fileNames(10) As String
Public p As Integer ' page number of right-hand image
Public totalImages As Integer

' --------
Dim TNpath As String = "S:\Audit\CP_TN-123456"
Dim txtFiles = Directory.EnumerateFiles(TNpath, "*.tif")
totalImages = txtFiles.Count
ReDim fileNames(totalImages)
' -------- read all fileNames from the Directory
p = 1 ' when starting a new book, the right-hand image is 1

' -------- moving right in the book
p = p + 1
Me.PicBoxRight.Image.Dispose()
Me.PicBoxRight.Image = Nothing
imgRight.Dispose()

' -------- loading next Right image
fileRight = TNpath + "\" + fileNames(p)
imgRight = Image.FromFile(fileRight)
Me.PicBoxRight.Image = imgRight
Note: Book pages are scanned at 300 DPI and saved in single image TIF files. Each color page is about 25MB and each black/white page is about 8MB. So cache fills up quickly. The app reads files, but does not write files...
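The backward-faster behavior is most likely the Windows file-system cache, not the app: Windows keeps recently read file data in RAM and evicts it under memory pressure, and the app cannot directly shorten how long a file stays cached. The growing working set inside the app, however, may come from Image.FromFile, which keeps the file handle open for the life of the Image. A sketch using the names from the snippet above, which closes the file immediately after reading:

```vbnet
' Read the TIF through a short-lived stream and copy it into an
' independent Bitmap, so the file handle is released right away.
' FileOptions.SequentialScan hints to Windows not to keep the data cached long.
Dim nextImage As Image
Using fs As New IO.FileStream(fileRight, IO.FileMode.Open, IO.FileAccess.Read,
                              IO.FileShare.Read, 4096, IO.FileOptions.SequentialScan)
    Using temp As Image = Image.FromStream(fs)
        nextImage = New Bitmap(temp) ' deep copy, still valid after fs closes
    End Using
End Using
If Me.PicBoxRight.Image IsNot Nothing Then Me.PicBoxRight.Image.Dispose()
Me.PicBoxRight.Image = nextImage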
Purpose of my question and of the VBA code:
Get specific data (a couple of columns) from each "table.csv" file in a network directory. Each networkdirectory/subfolders01/subfolders02 contains one "table.csv" file, but each network/subfolders01 also contains about 100 other subfolders. Those other folders are not needed; the only one we are interested in is the subfolder02 under each subfolder01. There are about 15000 subfolders01 in the network directory, but I only need the subfolders01 from Jan2020 to Apr2020, for example (about 200 subfolders).
The final purpose is to trend the data.
Issue:
I am trying to understand how I could improve the VBA code I am currently using.
This code goes through every subfolder one by one and then checks the date and file name.
I am wondering if there is a way to add search filter criteria for the subfolder date and name to get a faster loop.
How can we keep the code from going through every subfolder?
Please see below the code I am using.
I really appreciate your time and hope my request is clear.
Function GetFiles(startPath As String) As Collection
    Dim fso As Object, rv As New Collection, colFolders As New Collection, fpath As String
    Dim subFolder As Object, f, dMinfold, dtMod
    Set fso = CreateObject("Scripting.FileSystemObject")
    dMinfold = ThisWorkbook.Sheets("Enter_Date").Cells(2, 1)
    colFolders.Add startPath
    Do While colFolders.Count > 0
        fpath = colFolders(1)
        colFolders.Remove 1
        'process subfolders
        For Each subFolder In fso.GetFolder(fpath).SubFolders
            If subFolder.DateLastModified >= dMinfold Then
                colFolders.Add subFolder.Path
            End If
        Next subFolder
        'process files
        f = Dir(fso.BuildPath(fpath, "*Table.csv"), vbNormal)
        Do While f <> ""
            f = fso.BuildPath(fpath, f)
            dtMod = FileDateTime(f)
            If dtMod >= dMinfold And Right(f, 3) = "csv" Then
                rv.Add f
            End If
            f = Dir()
        Loop
    Loop
    Set GetFiles = rv
End Function
Then I have my code to get transfer data from each file.
Thank you.
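If the subfolders01 names encode the month (an assumption; adjust the pattern to the actual naming convention), the walk can be pruned up front by enumerating only the matching folders with Dir and a wildcard instead of testing all 15000:

```vba
' Sketch: collect only the subfolders01 whose name matches a pattern,
' e.g. "202001*" for Jan2020 (the naming convention is an assumption).
Function GetMonthFolders(startPath As String, pattern As String) As Collection
    Dim rv As New Collection, f As String
    f = Dir(startPath & "\" & pattern, vbDirectory)
    Do While f <> ""
        If f <> "." And f <> ".." Then
            ' Dir with vbDirectory also returns files, so confirm it is a folder.
            If (GetAttr(startPath & "\" & f) And vbDirectory) = vbDirectory Then
                rv.Add startPath & "\" & f
            End If
        End If
        f = Dir()
    Loop
    Set GetMonthFolders = rv
End Function
```

Calling this once per month (e.g. "202001*" through "202004*") and then looking only in each folder's subfolder02 avoids touching the unwanted folders entirely.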
I'll put in screenshots to clear up the Get & Transform method, since it is the GUI approach rather than code.
It is possible to filter before loading contents, which will speed things up significantly.
I tried with a few thousand subfolders filtered down to 20, loads instantly.
Here's the initial screen for Get Data from Folder.
You can then filter on the path. In your case it will be based on the date in the folder name.
Now that it's filtered, you can expand the content using the header button.
Inside the content, you'll have to expand again to convert from CSV to an Excel table.
Choose/rename columns as needed, then hit "Close and Load" to drop it into Excel.
The default is to load to a new table, but you can use "Load To" if something more custom is needed.
Here's your output. You can right-click refresh or refresh from VBA as needed.
Edit: Just noticed that I used .txt rather than .csv for the files. That might change how a step or two looks in the middle, but the general idea is the same.
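Refreshing the resulting query from VBA can be as simple as the sketch below (the connection name is an assumption; Power Query connections are typically named "Query - <query name>", and the exact name is visible under Data > Queries & Connections):

```vba
' Sketch: refresh the Get & Transform output after changing the filter.
Sub RefreshTableData()
    ' Either refresh everything in the workbook...
    ThisWorkbook.RefreshAll
    ' ...or just the one query's connection (name is an assumption):
    'ThisWorkbook.Connections("Query - Table").Refresh
End Sub
```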
Due to a company policy on how the PDM system operates, when the user checks in a file, the local copy is deleted from the user's cache. My macro checks files out, edits them, and checks them back in again. If I try to edit a file that has just been edited, I get a 'file not found' error (because it has been deleted from the cache). I have tried to get around this by writing a sub that gets the latest copy of a file immediately before editing it, to ensure there is always a file present, but the code doesn't seem to retrieve the file. The sub is below.
Sub GetLatest(fName As String)
    Dim vaultName As String
    Dim eVault As IEdmVault13
    Dim eFile As IEdmFile8
    Dim BG As IEdmBatchGet
    Dim files(1) As EdmSelItem

    'log into the vault
    vaultName = Config.ReadXMLElement(pathConfig, "vaultname")
    Set eVault = New EdmVault5
    If Not eVault.IsLoggedIn Then
        Call eVault.LoginAuto(vaultName, 0)
    End If

    'get the file to get latest
    Set eFile = eVault.GetFileFromPath(fName)

    'put the file in an array
    files(0).mlDocID = 0
    files(0).mlProjID = eFile.ID

    Set BG = eVault.CreateUtility(EdmUtil_BatchGet)
    Call BG.AddSelection(eVault, files())
    Call BG.CreateTree(0, EdmGetCmdFlags.Egcf_SkipExisting)
    Call BG.GetFiles(0, Nothing)
End Sub
If I manually 'get latest' in the EPDM browser before editing the file, the macro reads it fine. The code is slightly modified from that posted by Michael Dekoning at https://forum.solidworks.com/thread/51105
At first glance, it looks like you are populating the EdmSelItem properties incorrectly.
The mlDocID property is the database ID of the document, and mlProjID is the ID of the containing folder. For getting the latest version, you can use any containing folder, as it will be checked out in all folders. With EPDM, when a file is "shared" it can have multiple parent folders, and we can enumerate over them using the IEdmFile5 methods GetFirstFolderPosition and GetNextFolder.
You can refer to the documentation for further information and examples.
If you want to get a single file, try the following adjustment and see if that does it:
Set eFile = eVault.GetFileFromPath(fName)
Dim eFolder As IEdmFolder5
Dim Pos As IEdmPos5
Set Pos = eFile.GetFirstFolderPosition
Set eFolder = eFile.GetNextFolder(Pos)

'Get the file from the folder
files(0).mlDocID = eFile.ID
files(0).mlProjID = eFolder.ID
When you provide mlDocID = 0, it tells EPDM to get all the files in the specified folder, like so:
'Get all files from the folder
files(0).mlDocID = 0
files(0).mlProjID = eFolder.ID
I want to learn how to write and read data from files using Visual Basic in Visual Studio Express 2013 for Windows Desktop.
Specifically, I have a class called Pilots. The class includes text of a first and last name, and integers representing rank, skill, missions and status.
There is an ArrayList called Squadron that contains 18 (at start) Pilots.
What I want to do is save and then load all the data in Squadron to a file. And I'm not sure how to get it to work properly. I've read numerous books and sites, but somewhere along the line, I'm just not comprehending how to get it to work correctly.
At this point, I've been able to write the data to a file.... but I'm having no luck getting it back out of the file.
To write it out... I'm using this code:
Dim currentpilot As New Pilot("John", "Doe")
' Create the BinaryWriter and use File.Open to create the file.
Using writer As BinaryWriter = New BinaryWriter(File.Open("squadron.dat", FileMode.Create))
    ' Write the record count, then each pilot's fields.
    writer.Write(SquadronList.Count)
    For index = 0 To SquadronList.Count - 1
        currentpilot = CType(SquadronList(index), Pilot)
        writer.Write(currentpilot.pFirstName)
        writer.Write(currentpilot.pLastName)
        writer.Write(currentpilot.pSkill)
        writer.Write(currentpilot.pRank)
        writer.Write(currentpilot.pStatus)
        writer.Write(currentpilot.pMissions)
        writer.Write(currentpilot.pKills)
    Next
End Using
The writer.Write(SquadronList.Count) line is there to record the number of actual Pilot records in the file. I don't know if that's necessary or not.
The file does write to disk and just looking at it with notepad, it does seem to have the correct data.
The problem is getting it back out. This code fails quickly...
Using reader As BinaryReader = New BinaryReader(File.OpenRead("squadron.dat"))
    Dim index As Integer = reader.ReadInt32
    For counter = 0 To index - 1
        currentpilot.pFirstName = reader.ReadString
        currentpilot.pLastName = reader.ReadString
        currentpilot.pSkill = reader.ReadInt16
        currentpilot.pRank = reader.ReadInt16
        currentpilot.pStatus = reader.ReadInt16
        currentpilot.pMissions = reader.ReadInt16
        currentpilot.pKills = reader.ReadInt16
        SquadronList.Add(currentpilot)
    Next
End Using
So... any suggestions or guidance will be appreciated!
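Two things in the reading loop stand out. First, BinaryReader must read back exactly the types BinaryWriter wrote: if the fields are Integer, Write produced 32-bit values, so ReadInt16 will misalign the stream. Second, the same currentpilot object is reused every iteration, so every entry added to SquadronList points at the same pilot. A sketch of a corrected loop, assuming the fields are Integer and that Pilot has the two-name constructor shown above:

```vbnet
Using reader As BinaryReader = New BinaryReader(File.OpenRead("squadron.dat"))
    Dim count As Integer = reader.ReadInt32()
    For counter = 0 To count - 1
        ' Create a fresh Pilot each pass; reusing one object would make
        ' every list entry reference the same pilot.
        Dim p As New Pilot(reader.ReadString(), reader.ReadString())
        p.pSkill = reader.ReadInt32()    ' must match the type Write() used
        p.pRank = reader.ReadInt32()
        p.pStatus = reader.ReadInt32()
        p.pMissions = reader.ReadInt32()
        p.pKills = reader.ReadInt32()
        SquadronList.Add(p)
    Next
End Using
```

Writing the count first (as the existing code does) is exactly right: it tells the reader how many records to expect without having to probe for end-of-stream.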