Application layer references data layer directly - vb.net

In my experience the application layer should reference the business layer and the business layer should reference the data layer. I want to make a change to an app so that the application layer references the data layer directly as shown below:
Imports System.Data.SqlClient

Public Class ApplicationLayerClass
    Public Sub ProcessAllPersons()
        Dim data As New DataLayerClass
        Dim list As List(Of Person) = data.getAllPersons()
        For Each person In list
            'Call this from the application client. Do some complex work on the person here.
        Next
    End Sub
End Class
Public Class DataLayerClass
    Public Function getAllPersons() As List(Of Person)
        Dim list As New List(Of Person)
        Dim strConString As String = "Data Source=IANSCOMPUTER;Initial Catalog=Test;Integrated Security=True"
        Using objCon As New SqlConnection(strConString)
            Using objCommand As New SqlCommand("select * from person", objCon)
                objCon.Open()
                Using objDR As SqlDataReader = objCommand.ExecuteReader()
                    Do While objDR.Read()
                        'Create a new Person per row; reusing one instance would add the same object repeatedly.
                        Dim p As New Person
                        p.Name = objDR("Name").ToString()
                        p.age = objDR("Age").ToString()
                        list.Add(p)
                    Loop
                End Using
            End Using
        End Using
        Return list
    End Function
End Class
Public Class Person
    Public Name As String
    Public age As String
End Class
Alternatively I could create a class in the business layer that uses the adapter pattern (http://en.wikipedia.org/wiki/Adapter_pattern), i.e. a function called BusinessLayerClass.ProcessAllPersons, which is called by ApplicationLayerClass.ProcessAllPersons and in turn calls DataLayerClass.getAllPersons. Which option is more appropriate? I suppose it depends to some extent on the problem domain.

If you have a valid reason for calling the data layer directly, then do it. If you add a pass-through function in the business layer, then all you've done is added more code for no apparent benefit.
Now, if your business layer is exposed via an interface, IBusinessLayer for example, then adding a ProcessAllPersons() function to it and having it pass the call straight through to the data layer makes more sense and keeps the API consistent. This is what I would recommend.
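For illustration, a minimal sketch of that interface-based approach, reusing the question's DataLayerClass and Person types (the IBusinessLayer name comes from this answer; the rest of the shape is assumed):

Public Interface IBusinessLayer
    Sub ProcessAllPersons()
End Interface

Public Class BusinessLayer
    Implements IBusinessLayer

    Private ReadOnly _data As New DataLayerClass

    Public Sub ProcessAllPersons() Implements IBusinessLayer.ProcessAllPersons
        'Essentially a pass-through today; if business rules appear later,
        'they live here and the application layer's call site never changes.
        For Each person In _data.getAllPersons()
            'Do the complex per-person work here.
        Next
    End Sub
End Class

The application layer would then depend on IBusinessLayer rather than on DataLayerClass directly.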

the business layer should reference the data layer
This is one approach, but arguably not the best one if you're aiming for maximum decoupling between your modules and want to be able to swap your data access layer without touching your business layer (this is called Persistence Ignorance).
Have a look at Onion Architecture and/or Hexagonal Architecture. These architectural styles emphasize keeping your business layer at the core of the system, independent of every other layer.
Concretely, you define abstractions for the data access objects in your business core; these are implemented in peripheral modules and consumed by the application layer (see the sketch after the links below).
http://jeffreypalermo.com/blog/the-onion-architecture-part-1/
http://alistair.cockburn.us/Hexagonal+architecture
http://blog.kamil.dworakowski.name/2010/08/from-layers-to-hexagon-architecture.html
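A rough sketch of what that looks like in VB.NET, reusing the question's Person and DataLayerClass (the IPersonRepository, PersonService and SqlPersonRepository names are made up for illustration):

'Business core: declares the abstraction it needs (a "port") and knows nothing about SQL.
Public Interface IPersonRepository
    Function GetAllPersons() As List(Of Person)
End Interface

Public Class PersonService
    Private ReadOnly _repository As IPersonRepository

    Public Sub New(repository As IPersonRepository)
        _repository = repository
    End Sub

    Public Sub ProcessAllPersons()
        For Each person In _repository.GetAllPersons()
            'Business rules for each person go here.
        Next
    End Sub
End Class

'Peripheral data module: implements the port (an "adapter") on top of ADO.NET.
Public Class SqlPersonRepository
    Implements IPersonRepository

    Public Function GetAllPersons() As List(Of Person) Implements IPersonRepository.GetAllPersons
        'Delegates to the question's DataLayerClass; any other storage could be swapped in here.
        Dim dal As New DataLayerClass
        Return dal.getAllPersons()
    End Function
End Class

The application layer composes the two (New PersonService(New SqlPersonRepository())), so the business core never references the data module.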

If you have no intention of changing your database provider, and your developers already have the ADO.NET skills that a business layer would otherwise shield them from, then you may not need a business layer.
However, where are you going to put your business logic?


VB store list of records in a text file and read back

I have a VB program in VS2017. It reads many Excel files and stores relevant data in
Dim total_sessions As New List(Of Session_CSV_file)
The list is big (~10k items), and each item has about 20 fields.
Is there a way in VB to store it in a file in one command and read it back later easily, without reading and writing each field individually?
Often I need to debug the later part, and I prefer doing that without rerunning the whole first part.
Or is there a way in Visual Studio to stop the run at a certain point and store that point in a dump file, like simulators often do? (so they can run the simulation from this point later on)
Perhaps if you load your csv data into a DataTable instead:
Dim dt As New DataTable
dt.Columns.Add("Name")
dt.Columns.Add("Age", GetType(Int32))

For Each filepath In tenKFilePaths
    For Each line In File.ReadAllLines(filepath) 'needs Imports System.IO
        Dim bits = line.Split(","c)
        dt.Rows.Add(bits(0), Convert.ToInt32(bits(1)))
    Next line
Next filepath
Then you can save the whole lot to xml:
dt.WriteXml("c:\temp\my.xml")
Have a coffee, reboot, etc., then read them again and carry on where you left off:
Dim dt2 As New DataTable
dt2.ReadXml("c:\temp\my.xml")
Raw DataTables are a bit awkward to work with because you end up accessing their rows and column data by string indexing and casting it a lot. There is a better way to use them: Visual Studio can create custom classes that inherit from the base DataTable and do away with all the awkwardness of treating a DataTable as a very generic thing. Just like the forms designer creates a UI form that inherits from Form, the DataSet designer creates custom DataTables that inherit from the base one.
If you add a new DataSet item to your project and double-click it, it opens something like a database designer. Right-click that surface and choose Add DataTable, then add your columns with their respective data types - this is the equivalent of your Session_CSV_file class, with its properties and fields. Double-click the table and you can add code - if Session_CSV_file has custom methods, you can transfer their code to the custom data row.
Imagine I created a new datatable in a DataSet and called it SessionCsvFiles. Imagine I added a Name and Age column, and I added a custom method called BlahBlah
I'd then be able to say in code things like:
Dim dt As New MyCustomDataSet.SessionCsvFilesDataTable
'it's like making a new SessionCsvFile
Dim r = dt.NewSessionCsvFilesRow()
'It's like setting your properties
r.Name = csvBits(0)
r.Age = Convert.ToInt32(csvBits(1))
'It's like adding your file to the List
dt.AddSessionCsvFilesRow(r)
You now have virtually the same thing as you had before - a collection of objects that represents things about your csv files. The difference is that with this route it has an already-built-in way of saving all the properties and the collection to XML and reading it back again.
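If you go the typed-table route, saving and loading is the same one-call affair shown earlier; a small sketch (the MyCustomDataSet names follow the example above, and the file path is just an example):

'Save the whole typed table in one call.
dt.WriteXml("c:\temp\sessions.xml", XmlWriteMode.WriteSchema)

'Later - possibly in a separate run - load it back and carry on where you left off.
Dim dt2 As New MyCustomDataSet.SessionCsvFilesDataTable
dt2.ReadXml("c:\temp\sessions.xml")
For Each row As MyCustomDataSet.SessionCsvFilesRow In dt2
    Debug.Print(row.Name & " is " & row.Age.ToString())
Next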
Add the Serializable attribute to the Class. I added a parameterized constructor for ease in building my list but you must provide a constructor without parameters for this to work. Thus, the empty Sub New.
<Serializable()>
Public Class Session_CSV_file
    Public Property ID As Integer
    Public Property Name As String
    Public Property Type As String

    Public Sub New()
    End Sub

    Public Sub New(SessID As Integer, SessName As String, SessType As String)
        ID = SessID
        Name = SessName
        Type = SessType
    End Sub
End Class
Then add Imports System.Xml.Serialization (and Imports System.IO for the stream classes) to the top of the file.
Private Sub SaveList()
    Dim serializer = New XmlSerializer(GetType(List(Of Session_CSV_file)))
    Using writer As New StreamWriter("C:\Users\xxx\Documents\XMLtest.xml")
        serializer.Serialize(writer, lst)
    End Using
End Sub
I filled my list as follows, but you can fill your list any way you choose.
Private lst As New List(Of Session_CSV_file)

Private Sub FillList()
    Using cn As New SqlConnection(My.Settings.CoffeeConnection),
          cmd As New SqlCommand("Select Top 10 * From Coffees;", cn)
        cn.Open()
        Dim reader = cmd.ExecuteReader
        While reader.Read
            Dim sess As New Session_CSV_file(reader.GetInt32(0), reader.GetString(1), reader.GetString(3))
            lst.Add(sess)
        End While
    End Using
End Sub
To recreate the list... (Oops! I forgot this part earlier.)
Private RehydratedList As New List(Of Session_CSV_file)

Private Sub CreateListFromXML()
    Dim serial As New XmlSerializer(GetType(List(Of Session_CSV_file)))
    Using fs As New FileStream("C:\Users\xxx\Documents\XMLtest.xml", FileMode.Open)
        RehydratedList = DirectCast(serial.Deserialize(fs), List(Of Session_CSV_file))
    End Using
    For Each item In RehydratedList
        'Print the fields (the class doesn't override ToString).
        Debug.Print(item.ID & " " & item.Name & " " & item.Type)
    Next
End Sub

VBA Factory with VB_PredeclaredId = True vs "anonymous" instance pros and cons

Curious what the advantages/disadvantages are between a Factory with VB_PredeclaredId = True and an "anonymous" instance in VBA. Is each for its own scenarios, or is it better to stick to one of them, and why? Any feedback or links where I can read more about it would be appreciated! Thank you!
Worksheet data starting at A1
Name Manager Target
Person1 Manager1 97
Person2 Manager2 92
Person3 Manager3 95
Person4 Manager4 105
Person5 Manager5 108
Person6 Manager6 88
Factory - Class
'@Folder("VBAProject")
'@PredeclaredId
Option Explicit

Public Function getPerson(ByVal name As String, ByVal manager As String, ByVal target As Double) As Person
    With New Person
        .name = name
        .manager = manager
        .Targer = target
        Set getPerson = .self
    End With
End Function
Person - Class
Private pName As String
Private pManager As String
Private pTarger As Double

Public Property Get Targer() As Double
    Targer = pTarger
End Property

Public Property Let Targer(ByVal value As Double)
    pTarger = value
End Property

Public Property Get manager() As String
    manager = pManager
End Property

Public Property Let manager(ByVal value As String)
    pManager = value
End Property

Public Property Get name() As String
    name = pName
End Property

Public Property Let name(ByVal value As String)
    pName = value
End Property

Public Function toString() As String
    toString = "Name: " & Me.name & ", Manager: " & Me.manager & ", Targer: " & Me.Targer
End Function

Public Function self() As Person
    Set self = Me
End Function
Test - Module
Sub test()
    Dim i As Long
    For i = 2 To 6
        With New Person
            .name = Range("A" & i)
            .manager = Range("b" & i)
            .Targer = Range("c" & i)
            Debug.Print .toString
        End With
        Debug.Print Factory.getPerson(name:=Range("A" & i), _
                                      manager:=Range("B" & i), target:=Range("C" & i)).toString
        'or shorter, without field names
        Debug.Print Factory.getPerson(Range("A" & i), Range("B" & i), Range("C" & i)).toString
    Next i
End Sub
TL;DR: It's apples and bananas, really. Anonymous objects are a language construct; factories aren't defined in the language spec, they're more of a design pattern, and yes, there are different reasons to use each approach - although IMO a "factory bag module" is a bad idea.
Anonymous Objects
Anonymous objects are awesome - you get to invoke members on an object that's held by a With block, without needing to add a local variable. That's With New, but it's also particularly useful with With CreateObject:
With CreateObject("Scripting.FileSystemObject")
    '...
End With
They create objects, but such objects are (normally - a .Self getter can thwart that) confined to the containing With block; anonymous objects are useful for objects you need right here & now, and no longer need beyond that point. They are a language feature, basically.
Factory Class (or module)
We're leaving the realm of language features, and entering that of design patterns. Now we're talking about encapsulating the relatively complex creation of a given object, inside a class dedicated to that purpose.
The VB_PredeclaredId = True attribute (whether set through a '@PredeclaredId Rubberduck annotation or by editing the exported class module by hand) gives the class a default instance, so it can be used much like a standard module.
The problem with a Factory class that exposes a getPerson (or CreatePerson) method, is that you now have an excuse to extend this with some getAnotherThing (or CreateAnotherThing), and before you know it you're looking at a class/module that's able to create just about anything, whether these things are even remotely related or not.
Surely there's a better way.
Factory Method
What if a Person object knew how to create a new instance of the Person class?
Set p = Person.Create(1188513, "Mathieu")
This requires having the VB_PredeclaredId = True attribute on the Person class, and Person's default interface exposing a Create method that is meant to be invoked from that default instance.
The problem is that anything that consumes a Person object now sees a confusing API that exposes Property Let members for the ID and Name properties, and a Create function that's only meant to be used from the default instance.
This problem has a solution though: the only code that's "allowed" to work with Person should be the code that's responsible for invoking the Create factory method on that type. Everything else only ever sees an IPerson - an explicit interface that Person implements to define how the rest of the code shall interact with an object of that type... and IPerson does not need to expose any Property Let members, or a Create method.
There's another problem that the pattern doesn't solve though - imagine IPerson is implemented by two or more classes (say, Employee and Manager): even if the entire project only ever sees IPerson, there's still some code in there that's calling Employee.Create or Manager.Create, and that code is inherently coupled with these specific implementations of the IPerson interface. It's a problem, because such coupling essentially negates the benefits of coding against interfaces in the first place.
Abstract Factory
In order to decouple the code that's creating IPerson objects from the Employee and Manager classes, we could create an IPersonFactory interface that abstracts the factory method itself: now the code that consumes the factory to create IPerson objects doesn't even know what concrete type of objects it's creating, and the decoupling is complete.
You probably don't need that level of decoupling unless you have everything covered with a thorough suite of unit tests, but it's useful to know it exists regardless.
So you would have EmployeeFactory and ManagerFactory classes, both implementing IPersonFactory and supplying an implementation for some Create function that returns an IPerson object.
...to be honest this is where the person/employee example kind of falls apart, because such a class is more of a simple DTO (Data Transfer Object - i.e. a bunch of public fields, or read/write properties) than anything that actually has responsibilities - so let's drop it.
We'll have a SqlConnectionFactory, an OracleConnectionFactory, and a MySqlConnectionFactory, all implementing some IDbConnectionFactory to yield some IDbConnection object that encapsulates, you'll have guessed, a database connection.
So we could have code that looks like this:
Public Sub DoSomething(ByVal factory As IDbConnectionFactory)
    Dim pwd As String
    pwd = InputBox("SA Password?") 'bad example: now we're coupled with the InputBox API!

    Dim conn As IDbConnection
    Set conn = factory.Create("server name", "sa", pwd)
    '...
End Sub
That DoSomething method would be able to do its thing against an SQL Server, Oracle, or MySQL database server, and never needs to care which one it's actually working with.
Abstract Factory is useful when you're injecting dependencies (c.f. "dependency injection") that themselves have dependencies that you cannot resolve until last-minute, for example because you need some parameter that the user needs to provide through some UI (which is itself ideally also abstracted behind an interface, so that you can unit-test the method without popping that UI - although, Rubberduck's Fakes API does allow you to hijack an InputBox... imagine we're popping a UserForm instead).

How can I improve the performance of this ADO.NET connection object

I have written a simple connection class, to read from an ODBC-compatible database.
The SQLplusConnection class is provided by the DB vendor, and inherits Data.Common.DbConnection
Imports System.Data.Entity
Imports AspenTech.SQLplus
Public Class IP21Connection
    Private connection As SQLplusConnection

    Public Const SQLPlusISO8601Format = "yyyy-MM-ddTHH:mm:ssZ"

    Public Sub New(hostName As String)
        Dim connectionStringBuilder = New SQLplusConnectionStringBuilder With {
            .Host = hostName
        }
        connection = New SQLplusConnection(connectionStringBuilder.ConnectionString)
    End Sub

    Public Function ExecuteSQLCommand(SQLCommand As SQLplusCommand) As DataSet
        Dim resultDataSet = New DataSet()
#If DEBUG Then
        Debug.WriteLine(SQLCommand.CommandText)
#End If
        Using connection
            connection.Open()
            Dim adapter = New SQLplusDataAdapter(SQLCommand.CommandText, connection)
            adapter.Fill(resultDataSet)
        End Using
        Return resultDataSet
    End Function
End Class
However, the performance is awful.
I have run a performance analysis to understand why, and I found out that:
15% of the runtime is DbDataAdapter.Fill()
65% of the runtime is SQLplusConnection.Open()
It is my understanding that ADO.NET does not really open a new connection every time I call connection.Open(), but rather uses a pool and hands back an existing connection from it.
But given that the program spends four times as much time opening connections to the DB as it does executing the actual query, I am starting to wonder whether I am missing an important point.
Could you give me any pointers to improve the performance?
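For comparison, the pooling behaviour described above is what the stock providers give you when every call builds a short-lived connection from the same connection string; whether the vendor's SQLplus provider pools at all is something to check with AspenTech. A minimal sketch of that pattern (System.Data.SqlClient used purely as an illustration, not the vendor API):

Imports System.Data
Imports System.Data.SqlClient

Public Module PooledQueryExample
    'One connection string = one pool; reusing it makes Open() cheap after the first call.
    Private Const ConnString As String = "Data Source=.;Initial Catalog=Test;Integrated Security=True"

    Public Function RunQuery(commandText As String) As DataSet
        Dim result As New DataSet()
        'Create, open, and dispose per call; the physical connection returns to the pool on Dispose.
        Using conn As New SqlConnection(ConnString)
            Using adapter As New SqlDataAdapter(commandText, conn)
                adapter.Fill(result) 'Fill opens and closes the connection if it isn't already open
            End Using
        End Using
        Return result
    End Function
End Module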

Thread safe variable in VB.NET

Is this the right way of declaring dbReaders when multiple users access the same page?
Public dbReader As System.Data.IDataReader at class level, or
Dim dbReader As System.Data.IDataReader in each function inside a class?
What would be the best practice to make the dbReader thread safe in VB.Net?
Does declaring them as Static make them thread safe?
Thanks in Advance,
If you'd like each thread to modify the variable without 'fear' that another thread will change it somewhere along the line, it is best to adorn the variable with the ThreadStatic attribute.
The ThreadStatic attribute creates a separate instance of the variable for each thread, so you can be confident there won't be any race conditions over it.
Example (from MSDN)
Imports System
<ThreadStatic> Shared value As Integer
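To make that concrete, here is a small self-contained sketch (the module and field names are made up) showing that each thread sees its own copy of a ThreadStatic field:

Imports System.Threading

Public Module ThreadStaticDemo
    '<ThreadStatic> gives every thread its own independent copy of this field.
    <ThreadStatic> Private perThreadCounter As Integer

    Public Sub Run()
        Dim worker As New Thread(
            Sub()
                perThreadCounter += 1
                Console.WriteLine("Worker thread sees {0}", perThreadCounter) 'prints 1
            End Sub)
        worker.Start()
        worker.Join()

        'The worker's increment never touched the main thread's copy.
        Console.WriteLine("Main thread sees {0}", perThreadCounter) 'prints 0
    End Sub
End Module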
I would recommend using reentrant functions where possible, which are thread safe by definition, instead of class fields:
Function GetIds() As IEnumerable(Of Integer)
    Dim result = New List(Of Integer)()
    Using conn = New SqlConnection("SomeConnectionString")
        Using cmd = conn.CreateCommand()
            conn.Open()
            cmd.CommandText = "SELECT id FROM foo"
            Using reader = cmd.ExecuteReader()
                While reader.Read()
                    result.Add(reader.GetInt32(0))
                End While
            End Using
        End Using
    End Using
    Return result
End Function
If you are declaring the variable with Dim inside a function, no other thread can access that variable, which makes it thread safe by definition.
However, if you are declaring it at class level, you might want to use SyncLock, which will prevent other threads from accessing it while it is being used by another one.
Example:
Private ReadOnly dbReaderLock As New Object()

Public Sub AccessVariable()
    SyncLock dbReaderLock 'lock on a dedicated object rather than on the reader itself
        'Work with dbReader
    End SyncLock
End Sub

Disposing of custom Data Access Layer references

Our application uses a custom DataAccessLayer class almost exclusively, and within that we use the Data Access Application Block (DAAB, currently version 2). We are getting the infamous "GetOrdinal" error sporadically. We are not using out-of-method connections. Below is a typical example of our DAL methods:
Public Function MyDALMethod(ByVal Param1 As Integer, ByVal Param2 As Integer) As System.Data.IDataReader
    Dim db As Database = DatabaseFactory.CreateDatabase()
    Dim sqlCommand As String = "usp_MyProcedure"
    Dim dbCommand As DbCommand = db.GetStoredProcCommand(sqlCommand)
    db.AddInParameter(dbCommand, "Param1", DbType.Int32, Param1)
    db.AddInParameter(dbCommand, "Param2", DbType.Int32, Param2)
    Return db.ExecuteReader(dbCommand)
End Function
In our code we just instantiate a DAL var and call the desired method. After using the DataReader, the referencing code will close, dispose, and set the reader to nothing. However, nothing is done with the reference to the DAL. I've wondered if this is part of our problem. A typical method would use our DAL like this:
Dim DAL As New DataAccessLayer
Dim dbReader As System.Data.IDataReader

dbReader = DAL.MyDALMethod(parm1, parm2)
While dbReader.Read
    i = dbReader("column1")
    j = dbReader("column2")
End While
dbReader.Close()
dbReader.Dispose()
dbReader = Nothing
My main question is: should these DAL references be disposed somehow? It's a custom class written in VB.NET and it doesn't implement IDisposable, so I'm not sure whether there's anything to be done, but we do have errors and issues (like the GetOrdinal problem) which seem to be load-related, and I'm wondering if this is part of the problem.
If the DAL holds at least one member variable that is disposable (implements IDisposable), then it too should implement IDisposable. This would then indicate to the DAL's clients that Dispose() should be called. The DAL's Dispose() would then call the member variables' Dispose().
That is the general guidance for implementing IDisposable.
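As a rough sketch of that guidance (the SqlConnection member here is purely hypothetical - the real DataAccessLayer may hold a DAAB Database or DbConnection instead):

Imports System.Data.SqlClient

Public Class DataAccessLayer
    Implements IDisposable

    'Hypothetical disposable member, standing in for whatever the real class caches.
    Private connection As SqlConnection
    Private disposed As Boolean

    Public Sub Dispose() Implements IDisposable.Dispose
        If Not disposed Then
            If connection IsNot Nothing Then connection.Dispose()
            disposed = True
        End If
        GC.SuppressFinalize(Me)
    End Sub
End Class

Callers can then wrap the DAL in a Using block (Using DAL As New DataAccessLayer ... End Using) so it is cleaned up deterministically along with the reader.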
BTW - there is no need to set dbReader = Nothing - it achieves nothing. See Eric Lippert's post on the subject.