How to run a command in VBScript at high priority or with more memory - PDF export

Saving or exporting a Word document as PDF takes a very long time using VBScript. Is there a way to run the save at high priority, or to allocate more memory to the command, so that it executes faster?
...
Set doc = wrdApp.Documents.Open(myfiledoc)
doc.ExportAsFixedFormat myfilepdf, 17, False, 0, 0, , , 0, False, True, 1  ' 17 = wdExportFormatPDF
...
Thanks
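One thing worth trying, sketched below, is to raise the priority of the Word process through WMI before calling the export (and to set wrdApp.ScreenUpdating = False while it runs). Whether this helps depends on the export being CPU-bound rather than disk-bound; the sketch assumes a single Word instance, and 128 is the "High" priority class accepted by Win32_Process.SetPriority.
' Sketch: bump WINWORD.EXE to the "High" priority class via WMI
Dim wmi, proc
Set wmi = GetObject("winmgmts:\\.\root\cimv2")
For Each proc In wmi.ExecQuery("SELECT * FROM Win32_Process WHERE Name = 'WINWORD.EXE'")
    proc.SetPriority 128  ' 32768 = Above Normal, 128 = High
Next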

Related

Error executing batch file in Excel VBA on Windows

I was trying to execute a batch file via the standard Call Shell() function.
This batch file is project-specific and is created automatically for each project by an external application. Its main function is to normalise around 40 files of statistical data used for my project; this data is acquired from Excel. Run manually, the complete process takes around 30 seconds and works just fine.
When I try to run it using Call Shell in VBA, a window just pops up for about 2 seconds and no output is generated by the batch file.
I am attaching my sample code below. I am just baby-stepping in VBA macros, so please excuse my coding practice.
Call Shell(Range("L8") & "\DSTAT$.BAT")
I also tried:
Dim Runcc
Runcc = Shell(Range("L8") & "\DSTAT$.BAT", 1)
Please let me know if any further information is required to sort this out.
Try:
Dim waitOnReturn As Boolean: waitOnReturn = True
Dim windowStyle As Integer: windowStyle = 1
With CreateObject("WScript.Shell")
    ' Quote the path in case it contains spaces, and wait for the batch to finish
    .Run """" & Range("L8") & "\DSTAT$.BAT""", windowStyle, waitOnReturn
End With
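Since Run returns the process exit code when waitOnReturn is True, a small extension of the same call (a sketch reusing the variables above) lets you detect a failing batch file:
Dim exitCode As Long
With CreateObject("WScript.Shell")
    ' Non-zero exit codes usually indicate failure
    exitCode = .Run("""" & Range("L8") & "\DSTAT$.BAT""", windowStyle, waitOnReturn)
End With
If exitCode <> 0 Then MsgBox "Batch file returned exit code " & exitCode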
I use this for converting PDFs to JPGs with IrfanView on Report_Open in MS Access.
Stolen from vba WScript.Shell run .exe file with parameter

Calculate the file size - VBA macro

I am working on a VBA macro which processes a CSV file by creating a temporary file.
I am using Microsoft Excel 2010 32-bit. The 32-bit versions have a 2 GB memory limit.
The CSV file being processed by the macro is > 1 GB.
As stated previously, while processing the CSV file we save it to a temporary *.xls file.
So the total size while processing is the size of the CSV file plus the size of the temporary .xls file created during processing.
The combined size of the two files exceeds the 2 GB limit, and hence Excel crashes.
We have 2 scenarios:
1. When the user opens a CSV file using the macro and its size is > 1 GB, I show a message box and close Excel.
2. When the user opens a CSV file whose size is < 1 GB and then adds data that pushes it over 1 GB, I have to show a message box as soon as the file size reaches 1 GB and close Excel without losing any data.
My question:
For the first case, I can use FileLen, which returns a Long value specifying the length of a file in bytes.
For the second case, I don't know how to calculate the file size while the user is editing the CSV file through the macro.
Need help,
Thanks in advance.
One character = 1 byte (assuming a single-byte encoding). So when the user modifies the CSV you can track changes in the following manner:
On the cell-value-changed event, add the number of characters in the cell + 1 (for the comma or other separator) as bytes to the file size measured at the start. This way you track the size while the file is being modified.
Keep in mind that if the cell already had something in it, you have to subtract that value (its number of characters/bytes), as it will be overwritten.
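A minimal sketch of that running-total idea, assuming a single-byte encoding and single-cell edits (Worksheet_Change cannot see the old value, so it is cached when the cell is selected):
Private baseSize As Double   ' initialise with FileLen when the CSV is opened
Private oldLen As Long       ' length of the cell's previous contents

Private Sub Worksheet_SelectionChange(ByVal Target As Range)
    oldLen = Len(Target.Cells(1, 1).Value)  ' cache before the edit
End Sub

Private Sub Worksheet_Change(ByVal Target As Range)
    ' add the new characters, subtract the overwritten ones
    baseSize = baseSize + Len(Target.Cells(1, 1).Value) - oldLen
    If baseSize > 1000000000# Then  ' ~1 GB
        MsgBox "The file has reached 1 GB"
    End If
End Sub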
The code below might help. I have used ~1 MB as the threshold; you can change it to 1 GB or whatever fits your requirements.
Private Sub Worksheet_SelectionChange(ByVal Target As Range)
    Call FileSize
End Sub

Sub FileSize()
    Dim LResult As Long
    Dim strFileFullName As String
    strFileFullName = ActiveWorkbook.FullName
    ' FileLen reports the size on disk, so unsaved edits are not counted
    LResult = FileLen(strFileFullName)
    If LResult > 1000000 Then  ' ~1 MB
        MsgBox "File size is large: " & LResult & " bytes"
        ActiveWorkbook.Close SaveChanges:=True
    End If
End Sub

Running SAS Stored Process in Excel fails after two runs

I hope this is an appropriate place to ask this question.
I have recently built a data analysis tool in Excel that works by submitting inputs to a SAS Stored Process (as an 'input stream'), running the process and displaying the results in Excel.
I also use some code to check for and remove all active stored processes from the workbook before running the process again.
This runs successfully the first 2 times, but it always fails on the third attempt and I can't figure out why.
Is there some kind of memory allocation for Excel VBA that's exhausted by this stage? Or some other buffer that's maxed out? I've stepped into every line of the VBA code and it appears to hang (on the third run) at the following line:
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Code used to initiate SAS Add-in for Microsoft Office:
Dim SAS As SASExcelAddIn
Set SAS = Application.COMAddIns.Item("SAS.ExcelAddIn").Object
Code used to delete stored processes from target output sheet:
Dim Processes As SASStoredProcesses
Set Processes = SAS.GetStoredProcesses(outputSheet)
Dim i As Integer
' Delete in reverse: removing an item reindexes the collection,
' so a forward loop would skip every other process
For i = Processes.Count To 1 Step -1
    ' MsgBox Processes.Item(i).DisplayName
    Processes.Item(i).Delete
Next i
Code used to insert and run stored process:
Dim inputStream As SASRanges
Set inputStream = New SASRanges
inputStream.Add "Prompts", inputSheet.Range("DrillDown_Input")
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Cheers
On reflection, my theory here is that you are hitting the limit of multibridge connections. Each multibridge connection represents a port, and the more ports you have, the more parallel connections are enabled. By default there are three; perhaps you have two, or you are kicking off another STP at the same time?
That would explain the behaviour. I had a spreadsheet that called STPs and it would always fail on the fourth call because the first three were still running. You can get around this by either a) increasing the number of multibridge connections or b) chaining your processes so they run sequentially.
I don't know if this will be useful, but I had a similar problem running a DLL written in C++ through VBA. It occurred because the function in the DLL returned a double value which I didn't need, so the VBA code did:
Call SomeProcessFromDLL()
But the process was returning a double floating-point value which was 'filling up' some buffer memory in VBA, and VBA's buffer is limited (I think it gave up after 8 tries). So the solution for me was:
Dim TempToDiscard As Double
TempToDiscard = SomeProcessFromDLL()
Maybe looking at the documentation of the process being called would help here, especially if it's returning some value to be discarded anyway, like a
Return 0;
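For reference, capturing the value this way requires the Declare statement to expose the DLL's true return type; a hypothetical declaration (library and function names assumed) would look like:
' Hypothetical Declare: the As Double return type is what lets VBA
' consume the value (add PtrSafe on 64-bit Office)
Declare Function SomeProcessFromDLL Lib "SomeLibrary.dll" () As Double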
I never liked using the IOM in VBA, mainly due to issues with references and having to do client installs when rolling out applications. Last year I found a MUCH better way to connect Excel and SAS: the Stored Process web application. Simply set up your server-side SAS process with streaming output, and pass your inputs via an Excel web query. No client installs, no worries about SAS version upgrades, hardly any code. I am surprised it's not used more often!
See: http://rawsas.blogspot.co.uk/2016/11/sas-as-service-easy-way-to-get-sas-into.html
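A sketch of the web-query side in VBA (the server URL and _program path are placeholders for your own stored process):
Sub RunStoredProcessViaWebQuery()
    Dim url As String
    url = "http://yourserver/SASStoredProcess/do?_program=/Your/Path/YourSTP&param1=42"
    With ActiveSheet.QueryTables.Add( _
            Connection:="URL;" & url, _
            Destination:=ActiveSheet.Range("A1"))
        .WebSelectionType = xlEntirePage
        .Refresh BackgroundQuery:=False  ' pull the streamed output into the sheet
    End With
End Sub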

Speeding up Bloomberg session start

I am using the following VBA code to begin a Bloomberg session:
Set Sess = New Session
Dim Opt As SessionOptions
Set Opt = Sess.CreateSessionOptions
Opt.ServerHost = "127.0.0.1"
Opt.ServerPort = 8194
Sess.SetSessionOptions Opt
Sess.Start
It works but takes 15-20 seconds. I can live with that if I must, but it seems odd because there was no such delay using their old Bloomberg Data control. Is there a way to speed things up, say, by connecting to an existing Bloomberg process, or some other option I don't know about?
It can take a while to start the first session after logging into the Terminal, but it should be quicker for subsequent sessions. If you want to avoid a long start in your application, you can run Excel and use the API for any retrieval of data. For example, type this formula in a cell in Excel: =BDP("IBM US Equity","PX_LAST") and then run your application.
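Along the same lines, a sketch that keeps one session alive at module level and reuses it, so the 15-20 second startup cost is paid only on the first call (same host and port as in the question):
Private Sess As Session

Public Function GetSession() As Session
    If Sess Is Nothing Then
        Dim Opt As SessionOptions
        Set Sess = New Session
        Set Opt = Sess.CreateSessionOptions
        Opt.ServerHost = "127.0.0.1"  ' local bbcomm process
        Opt.ServerPort = 8194         ' default API port
        Sess.SetSessionOptions Opt
        Sess.Start
    End If
    Set GetSession = Sess
End Function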

Unable to read and process large txt files using threading, background worker or string builder

In .NET I'm coding an app that needs to read large txt files, around 10 MB. My problem is reading the files using StreamReader, doing some string manipulation, and then adding the results to a ListBox.
If I use threading or a BackgroundWorker, processing becomes very slow. I also tried using StringBuilder, but with the same result.
Any solutions for this?
You can read a 10 MB text file very quickly using ReadAllLines:
Dim ss() As String
ss = System.IO.File.ReadAllLines(filename)
Then you can manipulate the strings in the array, ss in this case.
When you update the ListBox, you should use .BeginUpdate and .EndUpdate to make that part faster.
You can put Application.DoEvents in the loop to allow Windows messages to be processed. This may keep it from looking so much like the system is locked up.
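A sketch of both suggestions combined, in VB.NET to match the answer above (WinForms control names assumed):
Imports System.Windows.Forms

Module ListBoxLoader
    Sub FillListBox(filename As String, box As ListBox)
        Dim ss() As String = System.IO.File.ReadAllLines(filename)
        box.BeginUpdate()           ' suspend repainting while adding items
        Try
            For i As Integer = 0 To ss.Length - 1
                box.Items.Add(ss(i))
                ' let Windows messages through occasionally
                If i Mod 1000 = 0 Then Application.DoEvents()
            Next
        Finally
            box.EndUpdate()         ' resume painting: one repaint total
        End Try
    End Sub
End Module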