I am creating a website with two web pages, and I am trying to pass a value between the two pages using WCF.
The WCF service has two methods:
static int a;

void send(int b)
{
    a = b;
}

int get()
{
    return a;
}
class1 creates an object for the WCF service (say w) and calls w.send(5).
class2 creates an object for the WCF service (say w1) and calls a = w1.get().
But the value set by class1 is not reflected in class2.
How are you hosting your service?
If your service is hosted in IIS, it is possible that the application was recycled between the two calls. In that case the app domain is recreated and the static members lose their values.
By its very nature, WCF is stateless. What is done by w.send(5) is not known to w1.get(), because the two calls are treated as separate requests.
Either save the data in some stateful mechanism (such as a database table or a file), or accept this behavior as expected.
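For completeness, WCF can also be configured to keep a single service instance alive across calls, which makes in-memory state work within one host process (an IIS recycle will still wipe it). A minimal sketch, with the contract and service names invented for illustration:

using System.ServiceModel;

[ServiceContract]
public interface IValueService
{
    [OperationContract]
    void send(int b);

    [OperationContract]
    int get();
}

// One instance serves every proxy, so the value survives between calls
// (within a single host process; an application recycle still resets it).
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class ValueService : IValueService
{
    private int a; // no longer needs to be static: there is only one instance

    public void send(int b) { a = b; }
    public int get() { return a; }
}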
Even though the variable is static, the two proxies are not guaranteed to hit the same host process or app domain. So when you set the value through one object and read it through another, you won't get the value you expect.
So
w.send(5)
a= w1.get()
won't work.
What is the difference between these two properties?
I can use HttpContext.Items instead of HttpContext.Features to share data between middleware components. The only difference I see is that with Items I supply a key and get back an object that I have to cast myself, while with Features the cast is done for me.
Is there something else behind them?
The biggest difference is that HttpContext.Items is designed to store key-value pairs, while HttpContext.Features is designed to store type-instance pairs.
To be more precise, HttpContext.Items is designed to share items within the scope of the current request, while HttpContext.Features, which is an instance of IFeatureCollection, is not meant to be used that way.
The IFeatureCollection interface represents a collection of HTTP features, such as:
IAuthenticationFeature, which stores the original PathBase and original Path.
ISessionFeature, which stores the current Session.
IHttpConnectionFeature, which stores the underlying connection.
and so on.
To help store and retrieve a Type-Instance-Pair, the interface has three important methods:
public interface IFeatureCollection : IEnumerable<KeyValuePair<Type, object>>
{
    // ...
    object this[Type key] { get; set; }
    TFeature Get<TFeature>();
    void Set<TFeature>(TFeature instance);
}
and the implementation (FeatureCollection) simply casts the value to the required type:
public class FeatureCollection : IFeatureCollection
{
    // ... get the required type of feature
    public TFeature Get<TFeature>()
    {
        return (TFeature)this[typeof(TFeature)]; // note: cast here!
    }

    public void Set<TFeature>(TFeature instance)
    {
        this[typeof(TFeature)] = instance; // note!
    }
}
This is by design, because there's no need to store two IHttpConnectionFeature instances or two ISession instances.
While you can store some type-value pairs with FeatureCollection, you'd better not. As you can see, Set<TFeature>(TFeature instance) simply replaces the old value if the same type already exists in the collection, which also means you cannot keep two entries of the same type without introducing a bug.
HttpContext.Items is designed to share short-lived per-request data, as you mentioned.
HttpContext.Features is designed to share various HTTP features that allow middleware to create or modify the application's hosting pipeline. It's already filled with several features from .NET, such as IHttpSendFileFeature.
You should use HttpContext.Items to store data, and HttpContext.Features to add any new HTTP features that another middleware class might need.
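To make the distinction concrete, here is a hypothetical middleware sketch (the item key and the timing idea are invented for illustration; IHttpConnectionFeature is a real feature):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Http.Features;

public class TimingMiddleware
{
    private readonly RequestDelegate _next;

    public TimingMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        // Items: arbitrary per-request key-value data for later middleware.
        context.Items["RequestStartedUtc"] = DateTime.UtcNow;

        // Features: type-keyed HTTP capabilities, e.g. the underlying connection.
        var connection = context.Features.Get<IHttpConnectionFeature>();
        Console.WriteLine($"Remote IP: {connection?.RemoteIpAddress}");

        await _next(context);
    }
}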
I have been tasked with implementing a WCF service in VB.NET. This WCF service will be consumed by our own .NET Windows Forms application and will provide data from a SQL Server database. The data will be displayed in a third-party grid (Infragistics).
Originally I was going to use ADO.NET and return a DataTable or DataSet from the WCF service, but I have read too many articles encouraging me to stay away from returning DataTables over the internet. I am not worried about interoperability, but I am worried about the size/speed of the process. This has led me to start using LINQ to SQL (.dbml) to return entities which I can pass over the internet.
The problem:
A lot of our data comes from stored procedures. These stored procedures return result sets that are a mix of columns from existing tables and therefore do not match any existing return type. I have imported a stored procedure that returns an auto-generated type; however, when I try to return this type over the WCF service, I get this error in the trace log:
Type 'System.Data.Linq.SqlClient.SqlProvider+SingleResult`1[benchmark_prd_colour_findResult]' cannot be serialized. Consider marking it with the DataContractAttribute attribute, and marking all of its members you want serialized with the DataMemberAttribute attribute. If the type is a collection, consider marking it with the CollectionDataContractAttribute. See the Microsoft .NET Framework documentation for other supported types.
Please let me know what I need to do to be able to return the results of a stored procedure when the results don't match any particular existing table.
Translate it to VB! (Sorry, didn't see the tag... I assumed C#.)
Build your own object and mark it up like this:
[DataContract]
public class MyObject
{
    // here go all the fields which match what's returned by your stored proc
    [DataMember]
    public String MyField;

    [DataMember]
    public int MySecondField;
}
Then populate this when getting the result of your stored proc
public MyObject GetResult()
{
    // obviously replace this with whatever you use to call your stored proc!
    stored_proc_result result = factory.callProc();
    MyObject obj = new MyObject
    {
        MyField = result.FieldOne,
        MySecondField = result.FieldTwo
    };
    return obj;
}
Then if you pass the MyObject instance in your WCF call it should work.
The scenario is as follows: I implemented a WCF service (let's call it X) which has its own data objects.
Service X uses another WCF service (Y) which has its own set of data objects. Service X needs to pass some of the data it receives from service Y on to its own clients (service X clients).
As far as I know, it is considered a best practice to translate the objects received from service Y into data objects of service X.
What is the best practice when it comes to enum values? Do I need to map each enum value, or is there another way?
Generally the idea is to isolate users of your service from changes in your implementation; therefore, you do not expose your implementation types on the wire. Imagine the situation where you decide to rename an enum value. If the service consumer does not update their implementation, you will have introduced a breaking change: the service user will be sending the old enum value to you, which will not deserialize correctly.
In addition, you may find that not all of the enum values are applicable to users of your service (maybe some are used internally).
So, yes, you should translate enum values just like other types.
If you give your enums explicit numeric values, you can translate between them using casts:
class Program
{
    static void Main(string[] args)
    {
        Internal i = Internal.Too;
        External e = (External)i;
        Console.WriteLine(e);
    }
}

enum Internal
{
    One = 1,
    Too = 2
}

[DataContract]
enum External
{
    [EnumMember]
    One = 1,
    [EnumMember]
    Two = 2
}
However, you would have to be careful that the two enums do not get out of sync.
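One way to guard against that is to map explicitly instead of casting, so an unmapped value fails fast. A possible sketch based on the enums above:

using System;

static class EnumMapper
{
    public static External ToExternal(Internal value)
    {
        switch (value)
        {
            case Internal.One: return External.One;
            case Internal.Too: return External.Two;
            default:
                // fails fast if an Internal value is added but never mapped
                throw new ArgumentOutOfRangeException(nameof(value), value, "Unmapped internal value");
        }
    }
}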
I have an ASP application, which calls an HTTP WCF service, which calls a TCP WCF service (all on different servers). I'm ultimately trying to pass one class object between the three.
I've discovered that I can't do this directly in the HTTP WCF service, even though my class is defined identically in both WCF services. Like this:
Public Function CallOtherFunction(ByVal ThisClass As MyClass) As Boolean
    Dim RetVal As Boolean
    RetVal = CallMyOtherWCFFunction(ThisClass)
    Return RetVal
End Function
Instead I have to:
Public Function CallOtherFunction(ByVal ThisClass As MyClass) As Boolean
    Dim RetVal As Boolean
    Dim MyOutgoingClass As New MyOtherWCF.MyClass()
    MyOutgoingClass.MyString = ThisClass.MyString
    RetVal = CallMyOtherWCFFunction(MyOutgoingClass)
    Return RetVal
End Function
My objects are rather large, which is to say they have a lot of properties. Is there any way to avoid declaring a new variable in my calling function, so my code can be a little simpler (like the first example)?
Thanks,
Jason
You can't pass it directly because those are two different types. You can, however, declare your data contracts in a shared assembly (used by the three projects, or at least by the HTTP and TCP services), and when adding the service reference to create the proxy in the HTTP service, specify that you want to "reuse types in referenced assemblies". This way the same type is used in all projects.
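For illustration, a minimal sketch of such a shared contracts assembly (the namespace and property names are hypothetical; shown in C#, but the VB equivalent is analogous):

using System.Runtime.Serialization;

namespace MyCompany.SharedContracts
{
    // Defined once and referenced by the ASP application and both WCF services,
    // so "reuse types in referenced assemblies" resolves to this single type.
    [DataContract]
    public class MyClass
    {
        [DataMember]
        public string MyString { get; set; }

        // ...the rest of the many properties live here, in one place only
    }
}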
I have an object called Parameters that gets tossed from method to method down and up the call tree, across package boundaries. It has about fifty state variables. Each method might use one or two variables to control its output.
I think this is a bad idea, because I can't easily see what a method needs in order to function, or what might happen with a certain combination of parameters set for module Y, which is totally unrelated to my current module.
What are some good techniques for decreasing coupling to this god object, or ideally eliminating it?
public void ExporterExcelParFonds(ParametresExecution parametres)
{
    ApplicationExcel appExcel = null;
    LogTool.Instance.ExceptionSoulevee = false;
    bool inclureReferences = parametres.inclureReferences;
    bool inclureBornes = parametres.inclureBornes;
    DateTime dateDebut = parametres.date;
    DateTime dateFin = parametres.dateFin;
    try
    {
        LogTool.Instance.AfficherMessage(Variables.msg_GenerationRapportPortefeuilleReference);
        bool fichiersPreparesAvecSucces = PreparerFichiers(parametres, Sections.exportExcelParFonds);
        if (!fichiersPreparesAvecSucces)
        {
            parametres.afficherRapportApresGeneration = false;
            LogTool.Instance.ExceptionSoulevee = true;
        }
        else
        {
            // ...
The caller would do:
PortefeuillesReference pr = new PortefeuillesReference();
pr.ExporterExcelParFonds(parametres);
First, at the risk of stating the obvious: pass the parameters which are used by the methods, rather than the god object.
This, however, might lead to some methods needing huge amounts of parameters because they call other methods, which call other methods in turn, etcetera. That was probably the inspiration for putting everything in a god object. I'll give a simplified example of such a method with too many parameters; you'll have to imagine that "too many" == 3 here :-)
public void PrintFilteredReport(
    Data data, FilterCriteria criteria, ReportFormat format)
{
    var filteredData = Filter(data, criteria);
    PrintReport(filteredData, format);
}
So the question is, how can we reduce the number of parameters without resorting to a god object? The answer is to get rid of procedural programming and make good use of object-oriented design. Objects can use each other without needing to know the parameters that were used to initialize their collaborators:
// dataFilter service object only needs to know the criteria
var dataFilter = new DataFilter(criteria);
// report printer service object only needs to know the format
var reportPrinter = new ReportPrinter(format);
// filteredReportPrinter service object is initialized with a
// dataFilter and a reportPrinter service, but it doesn't need
// to know which parameters those are using to do their job
var filteredReportPrinter = new FilteredReportPrinter(dataFilter, reportPrinter);
Now the FilteredReportPrinter.Print method can be implemented with only one parameter:
public void Print(Data data)
{
    var filteredData = this.dataFilter.Filter(data);
    this.reportPrinter.Print(filteredData);
}
Incidentally, this sort of separation of concerns and dependency injection is good for more than just eliminating parameters. If you access collaborator objects through interfaces, then that makes your class
very flexible: you can set up FilteredReportPrinter with any filter/printer implementation you can imagine
very testable: you can pass in mock collaborators with canned responses and verify that they were used correctly in a unit test
If all your methods use the same Parameters class, then maybe it should be a member variable of a class containing the relevant methods. You can then pass Parameters into the constructor of that class, assign it to a member variable, and all your methods can use it without having to take it as a parameter.
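A hypothetical sketch of that suggestion, reusing names from the question (the FundExporter class is invented):

public class FundExporter
{
    private readonly ParametresExecution parametres;

    public FundExporter(ParametresExecution parametres)
    {
        this.parametres = parametres;
    }

    // Methods no longer need the parameter object in their signatures.
    public void ExporterExcelParFonds()
    {
        bool inclureReferences = this.parametres.inclureReferences;
        // ...
    }
}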
A good way to start refactoring this god class is by splitting it up into smaller pieces. Find groups of properties that are related and break them out into their own class.
You can then revisit the methods that depend on Parameters and see if you can replace it with one of the smaller classes you created.
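As a hypothetical illustration using the fields visible in the question's snippet, the grouping might start like this (class names invented):

using System;

public class OptionsInclusion
{
    public bool InclureReferences { get; set; }
    public bool InclureBornes { get; set; }
}

public class PeriodeRapport
{
    public DateTime DateDebut { get; set; }
    public DateTime DateFin { get; set; }
}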
Hard to give a good solution without some code samples and real world situations.
It sounds like you are not applying object-oriented (OO) principles in your design. Since you mention the word "object" I presume you are working within some sort of OO paradigm. I recommend you convert your "call tree" into objects that model the problem you are solving. A "god object" is definitely something to avoid. I think you may be missing something fundamental... If you post some code examples I may be able to answer in more detail.
Query each client for their required parameters and inject them?
Example: each "object" that requires "parameters" is a "Client". Each "Client" exposes an interface through which a "Configuration Agent" queries the Client for its required parameters. The Configuration Agent then "injects" the parameters (and only those required by a Client).
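A sketch of that query-and-inject idea (all names below are invented for illustration):

using System.Collections.Generic;
using System.Linq;

public interface IConfigurableClient
{
    // Each client declares which parameter keys it needs.
    IEnumerable<string> RequiredParameters { get; }
    void Configure(IDictionary<string, object> values);
}

public class ConfigurationAgent
{
    private readonly IDictionary<string, object> allParameters;

    public ConfigurationAgent(IDictionary<string, object> allParameters)
    {
        this.allParameters = allParameters;
    }

    public void Configure(IConfigurableClient client)
    {
        // Inject only the parameters the client asked for.
        var subset = client.RequiredParameters
            .Where(allParameters.ContainsKey)
            .ToDictionary(key => key, key => allParameters[key]);
        client.Configure(subset);
    }
}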
For the parameters that dictate behavior, you can instantiate an object that exhibits the configured behavior; client classes then simply use the instantiated object, and neither the client nor the service has to know what the value of the parameter is. For instance, for a parameter that tells where to read data from, have FlatFileReader, XMLFileReader and DatabaseReader all inherit the same base class (or implement the same interface). Instantiate one of them based on the value of the parameter, and clients of the reader class just ask the instantiated reader object for data without knowing whether the data comes from a file or from the DB.
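A sketch of that idea with the reader types named above (the factory and the parameter values are assumptions):

public interface IDataReader
{
    string ReadData();
}

public class FlatFileReader : IDataReader
{
    public string ReadData() { return "data from a flat file"; }
}

public class XMLFileReader : IDataReader
{
    public string ReadData() { return "data from an XML file"; }
}

public class DatabaseReader : IDataReader
{
    public string ReadData() { return "data from the database"; }
}

public static class ReaderFactory
{
    // The parameter is consulted once, here; clients only ever see IDataReader.
    public static IDataReader Create(string source)
    {
        switch (source)
        {
            case "flatfile": return new FlatFileReader();
            case "xml": return new XMLFileReader();
            default: return new DatabaseReader();
        }
    }
}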
To start, you can break your big ParametresExecution class into several classes, one per package, each holding only the parameters for that package.
Another direction could be to pass the ParametresExecution object at construction time, so you won't have to pass it around at every function call.
(I am assuming this is within a Java or .NET environment.) Convert the class into a singleton. Add a static method called getInstance() or something similar to get the name-value bundle (and stop "tramping" it around; see Ch. 10 of the book Code Complete).
Now the hard part. Presumably this is within a web app or some other non-batch, multi-threaded environment. So, to get access to the right instance when the object is not really a true singleton, you have to hide the selection logic inside the static accessor.
In Java, you can set up a thread-local reference and initialize it when each request or sub-task starts, then code the accessor in terms of that thread-local. I don't know if something analogous exists in .NET, but you can always fake it with a Dictionary (Hash, Map) which uses the current thread instance as the key.
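For reference, .NET does provide a direct analogue, System.Threading.ThreadLocal<T>. A minimal sketch, with the accessor names invented:

using System.Threading;

public static class RequestParameters
{
    private static readonly ThreadLocal<ParametresExecution> current =
        new ThreadLocal<ParametresExecution>();

    // Initialize at the start of each request or sub-task.
    public static void Set(ParametresExecution parametres)
    {
        current.Value = parametres;
    }

    // Static accessor hiding the per-thread selection logic.
    public static ParametresExecution Get()
    {
        return current.Value;
    }
}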
It's a start... (there's always decomposition of the blob itself, but I built a framework that has a very similar semi-global value-store within it)