VB.net Environment Changes But I Don't See Where - vb.net

So my code works fine on my system running Windows 7 Ultimate. It also runs fine on a laptop running the same. But my servers run either Windows Server OSes or, in one case, Windows 7 Professional.
Now, part of my code reads data from a website, but this data may or may not be there. I have a browser control reading it and I use String.Split to break it down. If certain parts are there, great. If not, I let a Catch set the values to 0.
So the start bit:
Dim parts As String() = webdata.Split(New String() {Environment.NewLine},
StringSplitOptions.None)
I had assumed this is where the problem was: Environment.NewLine could be different even under the slightly different Windows 7 server. So I ran a loop on both to dump the data.
Dim testloop As Integer = 0
Do Until testloop = 25
MsgBox(testloop & ") " & parts(testloop))
testloop = testloop + 1
Loop
They both return exactly the same results in exactly the same order. So unless I'm missing something it can't be an environment change there.
But this reads fine on my Windows systems just not the VPS:
Try
    Firstdata = parts(22)
    Seconddata = parts(24)
Catch
    Firstdata = 0
    Seconddata = 0
End Try
This might not be the neatest way to do it but it works consistently outside of the VPS and I can't see why an environment would change that.
Looking at the exact same data at the exact same time locally it looks at the table returning the data when it's there and 0's otherwise. On the VPS it just returns 0's.
I thought this would be a simple change to the String.Split, but the loop shows (22) and (24) as the data I'm looking for. I don't see why the Try would behave differently in a different environment.
Any help here would be much appreciated. I can set up a 7 Ultimate server but I'd rather understand what the heck is going on.
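For what it's worth, a more defensive sketch of the same logic (variable names taken from the question) that splits on both CRLF and LF and checks the array length explicitly, instead of relying on the Catch to hide out-of-range indexes:

```vbnet
' Split on both Windows (CRLF) and Unix (LF) line endings, so the
' index positions don't shift if the server returns different newlines.
Dim parts As String() = webdata.Split(
    New String() {vbCrLf, vbLf},
    StringSplitOptions.None)

' Check the length explicitly instead of letting a Catch swallow the
' real reason the indexes are missing.
If parts.Length > 24 Then
    Firstdata = parts(22)
    Seconddata = parts(24)
Else
    Firstdata = 0
    Seconddata = 0
End If
```

If the VPS always falls into the Else branch, logging parts.Length there will tell you whether the downloaded data itself differs, rather than the split.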

Related

Same Winforms .Net binary uses much more memory on one PC and doesn't release it (GC not working maybe?)

I have a problem with an application that's driving me crazy, and I really don't know what else I can do to investigate further.
I've got a simple Windows Forms application (32-bit, VB.net, Framework 4.5.1) behaving very strangely on only one PC, regarding memory allocation. On all other PCs I tried it's working fine! There MUST be something on that particular PC, but I'm at a loss how to figure out what it is, short of reinstalling everything one by one...
So, on all PCs except one, the memory utilization of the application is fine and doesn't go above 10-20 Megabytes. On one PC only, which happens to be my own development PC, the memory keeps increasing and never gets released, until it reaches the 2 GB Windows limit and throws an Out of Memory exception. I copied the exact same binaries on 6 different PCs (four Win7, two Win10), and confirmed that the application is behaving fine (and consistently) on all those PCs. On my own PC only, even when the application starts without initiating the transfer, it seems to be occupying at least 20 megabytes more compared to other PCs. So whatever it is, must be something on the PC which is causing this.
The app is using an SqlDataReader with SqlBulkCopy to transfer several gigabytes of data from one database to another. The transfer is done month by month. All the SqlDataReader, SqlBulkCopy, SqlConnection and SqlCommand objects are disposed after each month, since I'm enclosing them in a Using block. I'm even calling GC.Collect at the end of each iteration.
I even used Visual Studio's memory profiler to try and figure out what's happening, but all it's saying is that the majority of the memory is requested by the SqlBulkCopy.WriteToServer command. This doesn't seem to be particularly insightful.
Now here's another interesting note: if I use the WinAPI method EmptyWorkingSet after every iteration (not shown in the code below), the memory does seem to get released. So why on earth doesn't .NET's Garbage Collector do the same, as expected, and why only on one PC?
Here's the code. Some parts modified to remove confidential stuff.
It's a VB.Net Winforms application using .Net Framework 4.5.1, built with VS 2017 on Windows 10.
Any advice on how on earth to investigate this further, short of starting to remove Windows Updates and other software one by one from that particular workstation, would be greatly appreciated.
Many thanks!
Private Sub CopyData()
' Database connection strings for source and destination db
Dim conn_string_source As String = "<CONFIDENTIAL DATA MASKED>",
conn_string_destination As String = "<CONFIDENTIAL DATA MASKED>"
' For progress report.
Dim rows_total As Long,
rows_done As Long
'
' Find total rows to transfer.
' For progress report.
'
Using _
conn_src As New SqlConnection(conn_string_source),
cmd As New SqlCommand("Select Count(*) From dbo.SOURCE_TABLE", conn_src)
conn_src.Open()
rows_total = CLng(cmd.ExecuteScalar())
Console.WriteLine("{0:#,##0} total rows to transfer.", rows_total)
End Using
'
' Transfer rows for one year, one month at a time.
' Dispose connections, readers and bulkcopy class at end of each month.
'
For which_month = 1 To 12
Using _
conn_dest As New SqlConnection(conn_string_destination),
conn_src As New SqlConnection(conn_string_source),
bc = New SqlClient.SqlBulkCopy(conn_dest)
conn_src.Open()
conn_dest.Open()
'
' For progress report.
' Calculate rows for this month.
'
Dim rows_this_month As Long
Using cmd = New SqlCommand("Select Count(*)
From dbo.SOURCE_TABLE
Where posting_month = @mm", conn_src)
cmd.Parameters.AddWithValue("@mm", which_month)
rows_this_month = CLng(cmd.ExecuteScalar())
End Using
'
' Setup the Bulk Copy operation
'
bc.DestinationTableName = "dbo.TARGET_TABLE"
bc.BulkCopyTimeout = 0 ' Never timeout
bc.BatchSize = 3000 ' Rows per batch. This value seems efficient enough.
bc.NotifyAfter = 250 ' Raise the notification event after every this many rows copied.
' Microsoft says this helps memory usage, though tests indicate it doesn't make a difference.
bc.EnableStreaming = True
' Notify us every few rows to know how much was finished.
AddHandler bc.SqlRowsCopied,
Sub(sndr, rr)
Console.WriteLine(" ...{0:#,##0} / {1:#,##0}", rr.RowsCopied, rows_this_month)
End Sub
'
' Perform the actual transfer.
' Feed a reader to the SqlBulkCopy operation.
'
Using cmd As New SqlCommand("Select *
From dbo.SOURCE_TABLE
Where posting_month = @mm", conn_src)
cmd.Parameters.AddWithValue("@mm", which_month)
Using rdr = cmd.ExecuteReader()
bc.WriteToServer(rdr)
End Using
End Using
rows_done += rows_this_month
Console.WriteLine(" GRAND TOTAL: {0:#,##0} / {1:#,##0}", rows_done, rows_total)
End Using
GC.Collect()
Next
End Sub
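For reference, the EmptyWorkingSet workaround mentioned above would look roughly like this (a sketch, not part of the posted code; EmptyWorkingSet lives in psapi.dll and only trims the working set - the pages are paged out, not freed, so it masks rather than fixes a managed leak):

```vbnet
Imports System.Diagnostics
Imports System.Runtime.InteropServices

Module WorkingSetHelper
    ' P/Invoke declaration for the WinAPI function the question refers to.
    <DllImport("psapi.dll", SetLastError:=True)>
    Private Function EmptyWorkingSet(hProcess As IntPtr) As Boolean
    End Function

    ' Trim the current process's working set; useful here only as a
    ' diagnostic to distinguish working-set bloat from a genuine leak.
    Public Sub TrimWorkingSet()
        EmptyWorkingSet(Process.GetCurrentProcess().Handle)
    End Sub
End Module
```

Calling TrimWorkingSet after each monthly iteration would confirm whether the growth on the problem PC is merely working-set bloat or genuinely unreleased managed memory.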

Why has my VB.net program decided to be case fussy?

I have a small VB.net program that opens some text files, 2 Excel files and a PDF file, and creates a new Word document, or opens it if it already exists. Nothing clever, no databases, and it works just fine.
I have just bought a Lima device (https://meetlima.com) and have moved my data & program onto the HD attached to it. This shows up on my PC as the "L" drive and everything seems to work EXCEPT that the program has now decided to be fussy about the case of the file names!
For instance, this code worked just fine before I moved it from my "D" drive to "L", when trying to use the file Tes1 while the value of myLeftCB is TES:
myRARfile = myFolder + myLeftCB + mySession + ".zip"
Now I have had to edit the code to this
myRARfile = myFolder + StrConv(myLeftCB, VbStrConv.ProperCase) + mySession + ".zip"
The trouble is, there are a number of similar checks and I don't really want to change all of them. If possible I'd just like to know why something has changed, and where, so I can hopefully change it back!
Because Lima is a separate device acting as a file system it's almost certain to be running some form of *nix and that means a case sensitive file system. In order to map case insensitive Windows filenames onto case-sensitive storage they must be doing some magic, leading to your unexpected results.
I agree with @Steve that you're going to have to seek answers from the vendor documentation for this one. In particular, examine the technical documentation to see if there are settings that control how case sensitivity is managed.
If you've got a number of places where this code is duplicated then I'd say that you've already got a weak point in your code. I suggest replacing all of those lines with call to a function that returns the filename. That's a pain to do but it replaces the code duplication so if you have to change the code in future you're only changing it in one place. Your code is easier to maintain and when the Lima vendors change their API in two months you'll only have to change the code in one place to fix it.
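As a sketch of that refactor (the function name is hypothetical; the ProperCase normalization is the workaround from the question):

```vbnet
' One place to build the archive path, so any future case-handling
' change (e.g. for the Lima drive) only has to be made here.
Function BuildRarPath(folder As String, leftCB As String, session As String) As String
    Return folder & StrConv(leftCB, VbStrConv.ProperCase) & session & ".zip"
End Function
```

Each of the duplicated lines then becomes a single call, e.g. `myRARfile = BuildRarPath(myFolder, myLeftCB, mySession)`.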

Running SAS Stored Process in Excel fails after two runs

I hope this is an appropriate place to ask this question.
I have recently built a data analysis tool in Excel that works by submitting inputs to a SAS Stored Process (as an 'input stream'), running the processes and displaying the results in Excel.
I also use some code to check for and remove all active stored processes from the workbook before running the process again.
This runs successfully the first 2 times, but it always fails on the third attempt and I can't figure out why.
Is there some kind of memory allocation for Excel VBA that's exhausted by this stage? Or some other buffer that's maxed out? I've stepped into every line of the VBA code and it appears to hang (on the third run) at the following line:
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Code used to initiate SAS Add-in for Microsoft Office:
Dim SAS As SASExcelAddIn
Set SAS = Application.COMAddIns.Item("SAS.ExcelAddIn").Object
Code used to delete stored processes from target output sheet:
Dim Processes As SASStoredProcesses
Set Processes = SAS.GetStoredProcesses(outputSheet)
Dim i As Integer
For i = 1 To Processes.Count
' MsgBox Processes.Item(i).DisplayName
Processes.Item(i).Delete
Next i
Code used to insert and run stored process:
Dim inputStream As SASRanges
Set inputStream = New SASRanges
inputStream.Add "Prompts", inputSheet.Range("DrillDown_Input")
SAS.InsertStoredProcess processLoc, _
outputSheet.Range("A1"), , , inputStream
Cheers
On reflection, my theory here would be that you are hitting the limit of multibridge connections. Each multibridge connection represents a port, and the more ports you have the more parallel connections are enabled. By default there are three, perhaps you have two, or you are kicking off another STP at the same time?
This would explain the behaviour. I had a spreadsheet that called STPs and it would always fail on the fourth call because the first three were running. You can get around this by either a) increasing the number of multibridge connections or b) chaining your processes so they run sequentially.
I don't know if this'll be useful, but I had a similar problem running a DLL written in C++ through VBA. The problem occurred because the function in the DLL returned a double value, which I didn't need, so the VBA code did
Call SomeProcessFromDLL()
But the process was returning a double floating-point value which was 'filling up' some buffer memory in VBA, and VBA has a limited buffer (I think it gave up after 8 tries). So the solution for me was
Dim TempToDiscard as Double
TempToDiscard = SomeProcessFromDLL()
Maybe looking at the documentation of the process being called would help here, especially if it's returning some value to be discarded anyway, like a
Return 0;
I never liked using the IOM in VBA, mainly due to issues with references and having to do client installs when rolling out applications. Last year I found a MUCH better way to connect Excel and SAS: using the Stored Process web application. Simply set up your server-side SAS process with streaming output, and pass your inputs via an Excel web query. No client installs, no worries about SAS version upgrades, hardly any code - I am surprised it's not used more often!
See: http://rawsas.blogspot.co.uk/2016/11/sas-as-service-easy-way-to-get-sas-into.html

program logic to speed up processing

I have set up an application to parse about 3000 files a day, where each contains around 4000-5000 posts in XML format with around 100 fields.
It involves a lot of cleanup and parsing, but on average it takes around 6 seconds per post. I tried threading, but because of the way I have everything set up, with variables being overwritten etc., I have instead divided the files into different folders and created copies of the program, each running against its assigned folder. It is running on a Windows 2008 server with 16 GB of memory, and I am told I need to reprogram it to speed up the process and also not use so much memory.
Does anyone have any suggestions, or does this process I have set up seem fine? I am the new guy and literally everyone thinks I am an idiot.
For i As Integer = 0 To fileLists.Count - 1
Do
Try
If Not completeList.Contains(fileLists(i).ToString) AndAlso fileLists(i).EndsWith("xml") Then
If fileLists(i).Contains("Fa") Then
inputFile = New StreamReader(fileLists(i))
data = String.Empty
infile = fileLists(i).ToString
swriter.WriteLine(infile.ToString)
swriter.Flush()
Dim objFileInfo As New FileInfo(fileLists(i))
fileDate = objFileInfo.CreationTime
Dim length As Long = objFileInfo.Length
data = inputFile.ReadToEnd
If data IsNot Nothing Then
parsingTools.xmlLoad(data)
tempList.Add(fileLists(i))
completeList.Add(fileLists(i))
End If
inputFile.DiscardBufferedData()
End If
End If
OK, I am not sure what code to post because there is literally a lot of code. The above is the main module, and once it reads in data it tries to load it into an XML document; if that fails, it parses it using ordinary text parsing. It navigates to each field I need to extract and also connects to a couple of web services to get more content, before all this is added together to create a new XML file.
manager.AddNamespace("x", "http://www.w3.org/2005/Atom")
manager.AddNamespace("a", "http://activitystrea.ms/spec/1.0/")
Dim nodecount As Integer = xmlParser.getNodesCount(navigator, "x:entry", manager)
For i As Integer = 1 To nodecount
statid = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:id", manager)
contentDate = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:published", manager)
template = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:title", manager)
title = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:source/x:title", manager)
ctext = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:summary", manager)
htext = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/a:object/x:content", manager)
author = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:author/x:name", manager)
authorUri = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/x:author/x:uri", manager)
avatarUrl = xmlParser.XPathFind(navigator, "x:entry[" & i & "]/a:author/x:link[@rel='avatar']/@href", manager)
Next
The problem with something like this is the hard drive itself - depending upon many factors it can act as a funnel and essentially constrict the number of files you are able to interact with on the drive concurrently.
With that said, I'd highly recommend you take a look at the TPL (Task Parallel Library) in .NET 4.0. It's a framework that vastly simplifies the act of spreading work across all the available cores of your processors. My computer has dual processors, each with 4 native cores (Intel Xeons @ 3 GHz), which gives me 8 cores. I have an application that downloads ~7,800 different URLs off the net and analyzes their content. Depending on the values it finds, it will do some additional processing and then store the results. This is somewhat similar to your situation in that we both share a restricting resource (for me it's the network) and we have to manually parse and evaluate the contents of the files we're working with.
My program used to take between 26 to 30 minutes (on average) to process all those files. This was using a properly implemented multithreaded application. By switching the code over to the TPL it now only takes 5 minutes. A HUGE improvement.
Take a look at the TPL and plan on having to make some changes to your code in order to maximize the potential improvements. But, the payoff can be fantastic if done correctly.
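A minimal sketch of what that could look like for the file loop above (assuming the per-file work is factored into a self-contained ProcessFile routine that shares no mutable state between files - which is what broke the manual threading attempt):

```vbnet
Imports System.IO
Imports System.Threading.Tasks

Module ParallelParsing
    Sub ProcessAll(folder As String)
        ' Let the TPL spread the per-file work across the available cores.
        ' Each iteration must be independent: no module-level variables
        ' being overwritten by concurrent files.
        Parallel.ForEach(
            Directory.EnumerateFiles(folder, "*.xml"),
            Sub(path)
                Dim data As String = File.ReadAllText(path)
                ProcessFile(path, data) ' hypothetical per-file parser
            End Sub)
    End Sub

    Sub ProcessFile(path As String, data As String)
        ' Cleanup / XML parsing for one file goes here.
    End Sub
End Module
```

Parallel.ForEach also accepts a ParallelOptions argument with MaxDegreeOfParallelism, which is worth capping if the hard drive turns out to be the bottleneck.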
Does the application run continuously as a service, or is it something you run once or a few times a day? If it doesn't run continuously, you could try that and limit the processing to a few concurrent threads.

protecting software to run only on one computer in vb.net

I have developed a small application for myself, and now I want to protect it so that it runs only on my own computer.
How can I do that?
A. Don't publish it.
B. Hard-code your computer name in the code, and make the first thing the program does to be verifying that System.Environment.MachineName matches it.
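A minimal sketch of option B (the machine name here is a placeholder for your own):

```vbnet
' Bail out at startup unless we're on the one allowed machine.
' "MY-PC" is a placeholder; substitute your actual computer name.
If Not String.Equals(Environment.MachineName, "MY-PC",
                     StringComparison.OrdinalIgnoreCase) Then
    MsgBox("This program only runs on its author's computer.")
    Environment.Exit(1)
End If
```

Note this only deters casual copying; anyone with a debugger can patch the check out.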
You could always check the processor ID or motherboard serial number.
Private Function SystemSerialNumber() As String
' Get the Windows Management Instrumentation object.
Dim wmi As Object = GetObject("WinMgmts:")
' Get the "base boards" (mother boards).
Dim serial_numbers As String = ""
Dim mother_boards As Object = _
wmi.InstancesOf("Win32_BaseBoard")
For Each board As Object In mother_boards
serial_numbers &= ", " & board.SerialNumber
Next board
If serial_numbers.Length > 0 Then serial_numbers = _
serial_numbers.Substring(2)
Return serial_numbers
End Function
Private Function CpuId() As String
Dim computer As String = "."
Dim wmi As Object = GetObject("winmgmts:" & _
"{impersonationLevel=impersonate}!\\" & _
computer & "\root\cimv2")
Dim processors As Object = wmi.ExecQuery("Select * from " & _
"Win32_Processor")
Dim cpu_ids As String = ""
For Each cpu As Object In processors
cpu_ids = cpu_ids & ", " & cpu.ProcessorId
Next cpu
If cpu_ids.Length > 0 Then cpu_ids = _
cpu_ids.Substring(2)
Return cpu_ids
End Function
Taken from: http://www.vb-helper.com/howto_net_get_cpu_serial_number_id.html
There's a related question by Jim about converting this code to work under Option Strict.
It really depends on who is the "enemy".
If you wish to protect it from your greedy, non-cracker, friends, then you can simply have the application run only if a certain password is found in the registry (using a cryptographically secure hash function), or use the MachineName as Jay suggested.
But if you're thinking of protecting it from serious "enemies", note: it has been mathematically proven that as long as the hardware is insecure, any software running on it is inherently insecure. That means that every piece of software is crackable and any protection mechanism is bypassable (even secured-hardware devices such as Alladin's Finjan USB product key, since the rest of the hardware is insecure).
Since most (if not all) of today's hardware is insecure, you simply cannot get 100% security in a software.
In between, there are lots of security solutions for licensing and copy-protection. It all comes down to who is the enemy and what is the threat.
No matter how hard you try, if someone really wants to run it on another computer, they will.
All they need to do is reverse engineer your protection and either remove it or play with it.
Another option might be to have your program ask the USER a question that has a derived answer. Here's a brain-dead example...
Your Program: "What time is it now?"
You Enter: (TheYear + 10 - theDay + 11) Mod 13
In this way it's actually ONLY YOU that can run the program, instead of it being MACHINE dependent.
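A sketch of that challenge-response idea in VB.NET, using the example formula above (the prompt text and exit behavior are illustrative):

```vbnet
' Ask a question only the author knows how to answer: the expected
' response is derived from today's date using a private formula.
Dim today As Date = Date.Today
Dim expected As Integer = (today.Year + 10 - today.Day + 11) Mod 13

' InputBox comes from the Microsoft.VisualBasic runtime.
Dim answer As String = InputBox("What time is it now?")
If answer.Trim() <> CStr(expected) Then
    Environment.Exit(1) ' Wrong answer: refuse to run.
End If
```

The formula itself is the secret, so it should be something you can compute in your head but others can't guess.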
I have made things like this in VB DOS.
I either made a non-deletable file, keyed to a specific machine, with a code inside, and/or read the .pwl files and made several checks that exist only on your machine. The non-editable file is made with extended-character-set characters like char 233, so when a person tries to look at it, the editor opens a blank copy; the data cannot be read and the file cannot be edited, moved or deleted.
It needs to be certain characters; I am not sure if every character between 128 and 255 will work. Some extended characters work for this and some will not. It can also be defeated, but it will keep some people out.
However, the file can still be read and checked from within a program. Nothing is totally secure; this is just some of the things I mess with.
Note: the file will be very hard to delete, so maybe make a test directory to try this out.
I hope this is OK; I am not very good at conveying info to people. I have programmed since 1982.
Another idea: I wrote a program that cannot be run directly; it can only be launched by an external runner file. You could add a password-entry section to it and encrypt the password so it cannot be read very easily. I made an executable version of a VB program to test this. It writes a character into slack space, and if the program sees that value it will not run. The runner program holds a different character and changes the stored one to the character the program is designed to accept, so the program only starts when the character is the proper one, made only by the runner. Once it starts, it changes the character back so it is not left open. I have made this sort of thing and it does work. There is always a way to defeat any protection; the goal is to slow people down or discourage them from running or using your program if you do not want them to. I may include examples at a later date.