Speeding up Bloomberg session start - bloomberg

I am using the following VBA code to begin a Bloomberg session:
Dim Sess As Session
Dim Opt As SessionOptions
Set Sess = New Session
Set Opt = Sess.CreateSessionOptions
Opt.ServerHost = "127.0.0.1"
Opt.ServerPort = 8194
Sess.SetSessionOptions Opt
Sess.Start
It works but takes 15-20 seconds. I can live with that if I must, but it seems odd because there was no such delay using their old Bloomberg Data control. Is there a way to speed things up, say, by connecting to an existing Bloomberg process, or some other option I don't know about?

It can take a while to start the first session after logging into the Terminal, but subsequent sessions should be quicker. If you want to avoid a long start in your application, you can keep Excel running and use the API to retrieve some data first. For example, type this formula in a cell in Excel: =BDP("IBM US Equity","PX_LAST") and then run your application.
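If your application makes repeated requests, another way to pay the start-up cost only once is to keep a single session object alive for the life of the workbook instead of creating a new one per request. A minimal sketch, using the same blpapicom objects as above; GetBloombergSession is just a hypothetical helper name:
' Module-level session that is started once and reused
Private BBSess As Session

' Hypothetical helper: starts the session on first use, then
' hands back the already-running instance on every later call
Public Function GetBloombergSession() As Session
    If BBSess Is Nothing Then
        Dim Opt As SessionOptions
        Set BBSess = New Session
        Set Opt = BBSess.CreateSessionOptions
        Opt.ServerHost = "127.0.0.1"
        Opt.ServerPort = 8194
        BBSess.SetSessionOptions Opt
        BBSess.Start          ' the slow part happens only once
    End If
    Set GetBloombergSession = BBSess
End Function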

Related

How do I write a VB.NET program to remotely control a 6060B (electronic load)?

I am writing a vb.net program to remotely control the 6060B (electric load). So far I have successfully connected my pc to the 6060B and I am able to query information from the load. Below is the part of the code I wrote:
Dim mbSession As MessageBasedSession
mbSession = ResourceManager.GetLocalManager().Open("GPIB::6::INSTR")
Dim responseString As String = mbSession.Query("*idn?")
This returns the load's identification information -- responseString is "HEWLETT-PACKARD...". However, I don't know what I should do to change/set the current, voltage, etc. just as I normally would from the front panel. I searched the internet and found that I could use the HPSL programming language, but how should I remotely control the 6060B using VB.NET? I am using the NI-VISA .NET API.
You need to find the command reference. There is a standard, SCPI, which has commands generally in the form
MEASure:CURRent?
Again this is a standard, and you should find the specific command reference for your device, HP (Keysight / Agilent) 6060B 300 Watt DC Electronic Load.
A web search for hp 6060b manual should yield some good results. Look for an operating or programming manual, which usually contains the command reference.
This should work for you:
' This example sets the current level to 0.75 amps
' and then reads back the actual current value.
' set input off
mbSession.Write("INPUT OFF")
' set mode to current
mbSession.Write("MODE:CURR")
' set current range
mbSession.Write("CURR:RANG 1")
' set current value
mbSession.Write("CURR 0.75")
' set input on
mbSession.Write("INPUT ON")
' measure current
Dim result As String
result = mbSession.Query("MEAS:CURR?")
Dim measuredCurrent As Single = Single.Parse(result)
Example taken from page 70 of this operating manual I found.
In general, things are easier if you are provided with example code. I will usually use the example code to get a baseline working operation, then copy it into my project and adapt it as needed.

Preventing Scheduled Agent from executing when modified

I have a scheduled agent that runs Weekly at one particular time on a target of All new & modified documents.
If I modify this agent, even if I only save it, it runs again.
If I remember correctly from long long ago, I have to add code such as this:
Dim s As New NotesSession
Dim db As NotesDatabase
Dim agent As NotesAgent
Set db = s.CurrentDatabase
Set agent = db.GetAgent("myAgent")
If agent.HasRunSinceModified = False Then
    Exit Sub
End If
Am I remembering correctly? And I always wondered, why would an agent fire off after being modified? Makes no sense to me.
My response corresponds to your title: Preventing Scheduled Agent from executing when modified.
The solution is to move all your code to a script library and never change the agent (since there will be no need to), as sketched below.
When you modify the code in the script library, the agent is not re-triggered.
You can also read Notes Designer runs agent after saving, which suggests (I haven't tested it) setting Amgr_SkipPriorDailyScheduledRuns=1.
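A minimal sketch of that layout, with hypothetical library and sub names; the agent becomes a thin wrapper you never need to edit again:
' (Options) section of the scheduled agent
Option Declare
Use "WeeklyJobLibrary"    ' hypothetical script library holding the real logic

Sub Initialize
    ' All the work lives in the library, so future code changes happen
    ' there and never re-save (and therefore never re-trigger) this agent
    Call RunWeeklyJob()
End Sub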

Running SAS Stored Process in Excel fails after two runs

I hope this is an appropriate place to ask this question.
I have recently built a data analysis tool in Excel that works by submitting inputs to a SAS Stored Process (as an 'input stream'), running the processes and displaying the results in Excel.
I also use some code to check for and remove all active stored processes from the workbook before running the process again.
This runs successfully the first 2 times, but fails on the third attempt. It always fails on the third attempt and I can't figure out why.
Is there some kind of memory allocation for Excel VBA that's exhausted by this stage? Or some other buffer that's maxed out? I've stepped into every line of the VBA code and it appears to hang (on the third run) at the following line:
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Code used to initiate SAS Add-in for Microsoft Office:
Dim SAS As SASExcelAddIn
Set SAS = Application.COMAddIns.Item("SAS.ExcelAddIn").Object
Code used to delete stored processes from target output sheet:
Dim Processes As SASStoredProcesses
Set Processes = SAS.GetStoredProcesses(outputSheet)
Dim i As Integer
' Delete from the end so the remaining items keep valid indexes
For i = Processes.Count To 1 Step -1
    ' MsgBox Processes.Item(i).DisplayName
    Processes.Item(i).Delete
Next i
Code used to insert and run stored process:
Dim inputStream As SASRanges
Set inputStream = New SASRanges
inputStream.Add "Prompts", inputSheet.Range("DrillDown_Input")
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Cheers
On reflection, my theory here would be that you are hitting the limit of multibridge connections. Each multibridge connection represents a port, and the more ports you have the more parallel connections are enabled. By default there are three, perhaps you have two, or you are kicking off another STP at the same time?
This would explain the behaviour. I had a spreadsheet that called STPs and it would always fail on the fourth call because the first three were running. You can get around this by either a) increasing the number of multibridge connections or b) chaining your processes so they run sequentially.
I don't know if this'll be useful, but I had a similar problem running a DLL written in C++ through VBA. The problem occurred because the function in the DLL returned a double value, which I didn't need, so the VBA code did
Call SomeProcessFromDLL()
But the function was returning a double-precision floating point value which was 'filling up' some buffer memory in VBA, and VBA's buffer is limited (I think it gave up after 8 tries). So the solution for me was
Dim TempToDiscard as Double
TempToDiscard = SomeProcessFromDLL()
Maybe looking at the documentation of the process being called would help here, especially if it's returning some value to be discarded anyway, like a
Return 0;
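For reference, here is roughly how that looks end to end; the DLL and function names below are placeholders, not the actual library from the answer above:
' Hypothetical declaration -- the real DLL exports a function returning a Double
Private Declare Function SomeProcessFromDLL Lib "SomeLibrary.dll" () As Double

Sub RunProcessSafely()
    Dim TempToDiscard As Double
    ' Capture (and ignore) the return value instead of using Call,
    ' so the returned Double is consumed rather than left behind
    TempToDiscard = SomeProcessFromDLL()
End Sub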
I never liked using the IOM in VBA, mainly due to issues with references and having to do client installs when rolling out applications. Last year I found a MUCH better way to connect Excel and SAS - using the Stored Process web application. Simply set up your server-side SAS process with streaming output, and pass your inputs via an Excel web query. No client installs, no worries about SAS version upgrades, hardly any code - I'm surprised it's not used more often!
See: http://rawsas.blogspot.co.uk/2016/11/sas-as-service-easy-way-to-get-sas-into.html
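A minimal sketch of the Excel side, assuming a hypothetical stored process URL and a single prompt parameter; the server address, _program path and parameter names depend entirely on how your STP is defined:
' Pull the streaming output of a SAS Stored Process into a sheet via a web query
Sub RunStpWebQuery()
    Dim url As String
    Dim outputSheet As Worksheet

    Set outputSheet = ThisWorkbook.Worksheets("Output")   ' placeholder sheet name
    url = "http://sasserver:7980/SASStoredProcess/do?" & _
          "_program=/MyFolder/MyStp&region=EMEA"          ' placeholder URL and prompt

    With outputSheet.QueryTables.Add( _
            Connection:="URL;" & url, _
            Destination:=outputSheet.Range("A1"))
        .RefreshStyle = xlOverwriteCells
        .Refresh BackgroundQuery:=False   ' wait for the STP to finish
    End With
End Sub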

"System resource exceeded" when updating recordset, and possibly at other points

I'm experiencing an intermittent issue with an application that's Excel 2010 front-end, Access 2010 back end. It's in use by 5-10 users simultaneously. Recently, users have started intermittently receiving the following error:
Run-time error '3035': System resource exceeded.
Sometimes the Debug button is grayed out so I can't jump to the code that caused the error, but when it's available to click, it takes me to the following code:
'Open connection to back end DB
Set db = OpenDatabase(dbPath)
'Open a recordset of a table
Set RS = db.OpenRecordset(Tbl)
'loop through rows in a 2D array
For i = FR To LR
RS.AddNew
'loop through columns of the 2D array
For j = 1 to LC
'set values for various fields in the new record, using values from the array
Next
RS.Update
Next
Here, the RS.Update is marked as the line that's causing the error.
What's odd is that this problem comes and goes; users will repeatedly receive it when attempting to submit a certain data set, then, several hours later, when they try to submit the same data set again, the operation succeeds without the error. It's also perplexing that sometimes the Debug button is available and sometimes it isn't.
One issue might be the size of the Access back end; it's currently ~650 MB, and we didn't start getting these messages until it grew to around 600 MB.
Any ideas as to what could be causing this? Various Google hits indicate that this problem sometimes happens when a join query has too many fields, but this is just a recordset of a table, not a join query.
Ahh, methinks this is one of those strange errors that pop up when you write a lot to a back-end database that can't keep up with managing the lock file.
The solution is to make sure you keep a connection open to the back-end database from each of your clients, and that you hold onto that connection until you close the client.
Just open a recordset to a table (say a dummy table with only one record), and keep that recordset open until you close the application.
Resource-wise, keeping this connection open will have no detrimental effect on performance or memory consumption, but it will ensure that the lock file is not continuously created/deleted every time a connection is open then closed.
Keeping that connection open will also substantially increase the performance of your data access.
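A minimal sketch of that pattern, assuming a DAO back end and a hypothetical one-record dummy table called tblKeepAlive; open the connection once at start-up and close it only when the client shuts down:
' Module-level objects that live for the whole client session
Private m_db As DAO.Database
Private m_rsKeepAlive As DAO.Recordset

' Call once when the client starts (e.g. from Workbook_Open)
Public Sub OpenPersistentConnection(ByVal dbPath As String)
    Set m_db = OpenDatabase(dbPath)
    ' Holding a recordset open on a tiny dummy table means the lock file
    ' is created once and reused, not created/deleted per operation
    Set m_rsKeepAlive = m_db.OpenRecordset("tblKeepAlive", dbOpenTable)
End Sub

' Call once when the client closes (e.g. from Workbook_BeforeClose)
Public Sub ClosePersistentConnection()
    If Not m_rsKeepAlive Is Nothing Then m_rsKeepAlive.Close
    If Not m_db Is Nothing Then m_db.Close
    Set m_rsKeepAlive = Nothing
    Set m_db = Nothing
End Sub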
Edit:
You should be more explicit when using recordsets and specify exactly the mode of operation you need:
Set RS = db.OpenRecordset(Tbl, dbOpenTable, dbFailOnError)
or, faster if you are only appending data:
Set RS = db.OpenRecordset(Tbl, dbOpenTable, dbAppendOnly + dbFailOnError)
Also, make absolutely sure you close the recordset once you have finished appending the data:
Set RS = db.OpenRecordset(Tbl, dbOpenTable, dbAppendOnly + dbFailOnError)
With RS
    'loop through rows in a 2D array
    For i = FR To LR
        .AddNew
        'loop through columns of the 2D array
        For j = 1 To LC
            'set values for various fields in the new record,
            'using values from the array
        Next
        .Update
    Next
    .Close
End With
Set RS = Nothing
This is caused by running out of available virtual memory (VM), a.k.a. the swap file. A 32-bit app cannot use more than 2 GB, and for some reason Access uses a lot of VM; when it needs more and cannot get any, you run out of system resources.
The solution is to make sure your VM is at least 4 times the RAM and to restart your PC at least daily; only this clears the VM of garbage left lying around by other apps.
You would never have had this issue on a 32-bit OS; it's only now, with 64-bit OSes, that this happens.
Short Story
My 3035 error was caused by an AddNew to a table without a primary key.
Creating a copy of the table with a primary key assigned and replacing the key-less table appears to have corrected the problem.
Long Story
I had been experiencing the 3035 error on an AddNew to an existing back-end table with more than 81,000 records. After searching the web for ideas and coming up dry, I reflected on possible issues.
I compacted/repaired the back-end files to no effect, then decided to check the table design. It turned out there was no primary key assigned.
Assigning a primary key to the AutoNumber ID field caused the same 3035 error! So I copied the data structure to a new table, assigned a primary key to the new table, and then ran an append query from the original table into the new one. Finally I renamed the tables.
Using the new table appears to be working.
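For reference, the copy-and-re-key step can also be scripted; a rough sketch with hypothetical names (tblData for the key-less original, tblData_New for a structure-only copy, dbPath for the back-end path):
Dim db As DAO.Database
Set db = OpenDatabase(dbPath)   ' dbPath: placeholder path to the back-end database
' Add a primary key on the AutoNumber ID field of the structure-only copy
db.Execute "ALTER TABLE tblData_New ADD CONSTRAINT PK_tblData PRIMARY KEY (ID)", dbFailOnError
' Append the existing rows into the keyed copy
db.Execute "INSERT INTO tblData_New SELECT * FROM tblData", dbFailOnError
db.Close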

SSIS Excel RefreshAll() task: SQL job fails when no one is logged in

A few months ago, I made an SSIS package that is used to refresh a couple of Excel worksheets (plus a few other things).
Now, my colleague who is still on the project reports this weird behavior.
The SSIS package is scheduled as a SQL Agent job.
It runs hourly starting at 8:00 AM. At 8 AM it always fails; his conclusion: at 8 AM no one is logged on to the server.
The second time, at 9 AM, the job runs OK (RefreshAll worked). His conclusion: there is always someone logged on to the server via RDP at that moment.
(In fact, I don't know about the other runs later in the day.)
The task is a VB script task that calls the Excel Interop DLLs. I remember having difficulties getting it working until I installed Excel 2010 x86 on the server -> Excel is fully and legitimately installed on the server.
My guess at the time was that it sometimes went wrong somewhere and Excel did not close properly. When I opened Task Manager I found 10+ instances of Excel.exe running... This was during development.
My colleague did an interesting test: he scheduled the job every minute and logged on and off the server a few times. Every time no one was logged on to the server (via RDP), the job failed. When someone was logged on, the job ran OK!
Below is the code used in the 'RefreshAll' script task.
I also used Threading.Sleep because otherwise I got timeout errors. I found no other way.
Thanks in advance!!
L
Public Sub Main()
    Dts.TaskResult = ScriptResults.Success

    Dim oApp As New Microsoft.Office.Interop.Excel.Application
    oApp.Visible = False
    'oApp.UserControl = True

    Dim oldCI As System.Globalization.CultureInfo = _
        System.Threading.Thread.CurrentThread.CurrentCulture
    System.Threading.Thread.CurrentThread.CurrentCulture = _
        New System.Globalization.CultureInfo("en-US")

    Dim wb As Microsoft.Office.Interop.Excel.Workbook
    wb = oApp.Workbooks.Open(Dts.Variables("User::FileNameHandleFull1").Value.ToString)
    oApp.DisplayAlerts = False
    wb.RefreshAll()
    Threading.Thread.Sleep(10000)
    wb.Save()
    wb.Close()
    oApp.DisplayAlerts = True
    oApp.Quit()
    Runtime.InteropServices.Marshal.ReleaseComObject(oApp)
End Sub
End Class
Excel is a client application and normally requires an active user session on the machine it runs on. For a job like this, I would consider other approaches that don't involve driving the Excel process, either:
store the result in a SQL Server table, then use a linked table in the Excel sheet to pull the data
use export as .csv (comma separated values) - see the sketch after this list
use 3rd-party components that write the Excel format
they may fix it in the next version of Excel?
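As an illustration of the .csv route, a minimal sketch of a script-task sub that dumps a query straight to a text file; the connection string, query and output path are placeholders, and no Excel installation or interactive session is needed:
Imports System.Collections.Generic
Imports System.Data.SqlClient
Imports System.IO

Public Sub ExportToCsv()
    ' Placeholders: adjust the connection string, query and file path
    Dim connStr As String = "Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=True"
    Dim csvPath As String = "\\fileshare\reports\refresh_output.csv"

    Using conn As New SqlConnection(connStr), _
          cmd As New SqlCommand("SELECT * FROM dbo.ReportData", conn), _
          writer As New StreamWriter(csvPath, False)
        conn.Open()
        Using reader As SqlDataReader = cmd.ExecuteReader()
            Dim fields As New List(Of String)
            ' Header row from the column names
            For i As Integer = 0 To reader.FieldCount - 1
                fields.Add(reader.GetName(i))
            Next
            writer.WriteLine(String.Join(",", fields.ToArray()))
            ' Data rows (note: values containing commas would need quoting)
            While reader.Read()
                fields.Clear()
                For i As Integer = 0 To reader.FieldCount - 1
                    fields.Add(reader(i).ToString())
                Next
                writer.WriteLine(String.Join(",", fields.ToArray()))
            End While
        End Using
    End Using
End Sub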