(SonarQube) (Custom Plugin) (Get SonarQube Session) - SonarQube 5.1

Dear SonarQube developers,
I'd like to get the SonarQube session in the "Initializer" phase and store some objects in it, so that I can retrieve them when analyzing my custom rules instead of recalculating them several times in the same analysis session.
Is it possible?
How can I do it?
Thanks a lot for your advice.
Wognin

I've solved the problem by using Java system properties in the ProjectBuilder phase, this way:
ParametresProjet paramProj = parseFichierPom(projDef.getKeyWithBranch());
if (paramProj != null) {
    Properties properties = System.getProperties();
    // store the objects for the rest of the analysis
    properties.put(PROPRIETES_DU_POM, paramProj);
    Boolean isException = ProjetControleur.isProjetEnException(paramProj.getGroupId(), paramProj.getArtifactId());
    properties.put(PROJET_EN_EXCEPTION, isException);
    System.setProperties(properties);
}
Later, I can retrieve these properties during the Java analysis, like this:
private static Boolean isException;

/** Static initializer */
static {
    isException = (Boolean) System.getProperties().get(PROJET_EN_EXCEPTION);
}
That solved my problem, but I don't know whether storing and retrieving objects this way is a good approach for a SonarQube analysis session.
What do you think about it?
WOGNIN

Related

"The entity was never added to this ScoreDirector" exception during custom cloning

I'm trying to implement custom cloning in my solution. I followed the instructions in the documentation, but I ran into a roadblock in the form of this exception: "The entity was never added to this ScoreDirector. Maybe that specific instance is not in the return values of the PlanningSolution's entity members." I know that this is not true, because before the custom cloning this exception wasn't thrown.
My planningClone method is set up like this:
@Override
public Solution planningClone() {
    Solution clonedSolution = new Solution();
    clonedSolution.id = id;
    clonedSolution.code = code;
    clonedSolution.score = score;
    clonedSolution.field1 = field1;
    clonedSolution.field2 = field2;
    ...............
    clonedSolution.fieldN = fieldN;
    List<PlanningEntity1> clonedPlanningEntity1List = new ArrayList<PlanningEntity1>(planningEntity1List.size());
    List<PlanningEntity2> clonedPlanningEntity2List = new ArrayList<PlanningEntity2>(planningEntity2List.size());
    for (PlanningEntity1 planningEntity : planningEntity1List) {
        clonedPlanningEntity1List.add(planningEntity.clone());
    }
    for (PlanningEntity2 planningEntity : planningEntity2List) {
        clonedPlanningEntity2List.add(planningEntity.clone());
    }
    clonedSolution.planningEntity1List = clonedPlanningEntity1List;
    clonedSolution.planningEntity2List = clonedPlanningEntity2List;
    return clonedSolution;
}
The clone method for my planning entities is implemented through the Java interface Cloneable:
protected PlanningEntity clone() {
    try {
        return (PlanningEntity) super.clone();
    } catch (CloneNotSupportedException e) {
        e.printStackTrace();
    }
    return null;
}
Just to be sure, I checked every entity instance and their collections to make sure my cloning was working correctly, and it in fact is.
What step am I missing here?
If one planning entity points to another planning entity of a different class, or to a list of them, then the cloning process needs to fix up those references so they point to the cloned objects. This is something the default cloning process does without a problem, leaving the solution in a consistent state. It even correctly updates the lists of planning entity instances held by the parent planning entities (covered by the method "cloneCollectionsElementIfNeeded" in the class "FieldAccessingSolutionCloner" in the OptaPlanner core).
So, for example, if we have the following two planning entity classes:
@PlanningEntity
public class ParentPlanningEntityClass {
    List<ChildPlanningEntityClass> childPlanningEntityClassList;
}

@PlanningEntity
public class ChildPlanningEntityClass {
    ParentPlanningEntityClass parentPlanningEntityClass;
}
The "parentPlanningEntityClass" variable needs to be set to point to the cloned object. When it comes to the list "childPlanningEntityClassList" it first needs to be created from scratch with "new ArrayList();" so that both the working and new best solution (the one that is currently getting cloned) don't point to the same list. At the end the newly created list needs to be filled with the cloned objects.

CRM 2011 - ITracingService getting access to the traceInfo at runtime

I have some custom logging in my plugin and want to include the contents of my tracingService in my custom logging (which is called within a catch block, before the plugin finishes).
I can't seem to access the content of the tracingService. I wonder if it is accessible at all?
I tried tracingService.ToString() just in case the devs had provided a useful override; alas, as expected, I just get the name of the class, "Microsoft.Crm.Sandbox.SandboxTracingService".
Obviously Dynamics CRM makes use of the tracingService content towards the end of the pipeline if it needs to.
Anybody have any ideas on this?
Kind Regards,
Gary
The tracing service does not provide access to the trace text during execution, but that can be overcome by creating your own implementation of ITracingService. Note that you cannot get any text that was written to the trace log before your plugin's Execute method was called - meaning that if you have multiple plugins firing, you won't get their trace output in the plugin that throws the exception.
public class CrmTracing : ITracingService
{
    ITracingService _tracingService;
    StringBuilder _internalTrace;

    public CrmTracing(ITracingService tracingService)
    {
        _tracingService = tracingService;
        _internalTrace = new StringBuilder();
    }

    public void Trace(string format, params object[] args)
    {
        if (_tracingService != null) _tracingService.Trace(format, args);
        _internalTrace.AppendFormat(format, args).AppendLine();
    }

    public string GetTraceBuffer()
    {
        return _internalTrace.ToString();
    }
}
Just instantiate it in your plugin passing in the CRM provided ITracingService. Since it is the same interface it works the same if you pass it to other classes and methods.
public class MyPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var tracingService = new CrmTracing((ITracingService)serviceProvider.GetService(typeof(ITracingService)));
        tracingService.Trace("Works same as always.");
        var trace = tracingService.GetTraceBuffer();
    }
}
To get the TraceInfo string from the tracing service at runtime, I used the debugger to interrogate the tracingService contents.
The trace string is accessible from these expressions...
for Plugins
((Microsoft.Crm.Extensibility.PipelineTracingService)(tracingService)).TraceInfo
for CWA
((Microsoft.Crm.Workflow.WorkflowTracingService)(tracingService)).TraceInfo
You can drill into the tracing service by debugging and extract the expression.
However, at design time neither of these expressions seems to be accessible from any of the standard CRM 2011 SDK DLLs... so I'm not sure if it's possible as yet.

How do I get Xtext's model from a different plugin?

I've written an Xtext-based plugin for some language. I'm now interested in creating a new independent view (as a separate plugin, though it requires my first plugin), which will interact with the currently-active DSL document - and specifically, interact with the model Xtext parsed (I think it's called the Ecore model?). How do I approach this?
I saw I can get an instance of XtextEditor if I do something like this when initializing my view:
getSite().getPage().addPartListener(new MyListener());
And then, in MyListener, override partActivated and partInputChanged to get an IWorkbenchPartReference, which is a reference to the XtextEditor. But what do I do from here? Is this even the right approach to this problem? Should I instead use some notification functionality from the Xtext side?
Found it out! First, you need an actual document:
IXtextDocument doc = editor.getDocument();
Then, if you want to access the model:
doc.modify(new IUnitOfWork.Void<XtextResource>() { // Can also use just IUnitOfWork
    @Override
    public void process(XtextResource state) throws Exception {
        ...
    }
});
And if you want to get live updates whenever it changes:
doc.addModelListener(new IXtextModelListener() {
    @Override
    public void modelChanged(XtextResource resource) {
        for (EObject model : resource.getContents()) {
            ...
        }
    }
});
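If you only need to read the model rather than modify it, IXtextDocument also offers a read-only unit of work. A minimal sketch, assuming you just want the root EObject back:
// Read access without taking a write lock on the document.
EObject root = doc.readOnly(new IUnitOfWork<EObject, XtextResource>() {
    @Override
    public EObject exec(XtextResource state) throws Exception {
        return state.getContents().isEmpty() ? null : state.getContents().get(0);
    }
});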

Very slow performance deserializing using DataContractSerializer in a Silverlight application

Here is the situation:
A Silverlight 3 application hits an ASP.NET-hosted WCF service to get a list of items to display in a grid. Once the list is brought down to the client it is cached in IsolatedStorage. This is done by using the DataContractSerializer to serialize all of these objects to a stream, which is then zipped and then encrypted. When the application is relaunched, it first loads from the cache (reversing the process above) and then deserializes the objects using the DataContractSerializer.ReadObject() method. All of this was working wonderfully under all scenarios until recently, with the entire "load from cache" path (decrypt/unzip/deserialize) taking hundreds of milliseconds at most.
On some development machines, but not all (all machines are Windows 7), the deserialize process - that is, the call to ReadObject(stream) - takes several minutes and seems to lock up the entire machine, BUT ONLY WHEN RUNNING IN THE DEBUGGER in VS2008. Running the Debug configuration code outside the debugger has no problem.
One thing that looks suspicious: when you turn on stopping on exceptions, you can see that ReadObject() throws many, many System.FormatExceptions indicating that a number was not in the correct format. When I turn off "Just My Code", thousands of these get dumped to the screen; none go unhandled. These occur both on the read back from the cache AND on the deserialization at the conclusion of a web service call to get the data from the WCF service. HOWEVER, these same exceptions occur on my laptop development machine, which does not experience the slowness at all. And FWIW, my laptop is really old and my desktop is a 4-core, 6 GB RAM beast.
Again, no problems unless running under the debugger in VS2008. Anyone else seen this? Any thoughts?
Here is the bug report link: https://connect.microsoft.com/VisualStudio/feedback/details/539609/very-slow-performance-deserializing-using-datacontractserializer-in-a-silverlight-application-only-in-debugger
EDIT: I now know where the FormatExceptions are coming from. It seems that they are "by design" - they occur when I have doubles being serialized that are double.NaN, so that the XML looks like NaN... It seems that the DCS tries to parse the value as a number; that fails with an exception, and then it looks for "NaN" et al. and handles them. My problem is not that this does not work... it does... it is just that it completely cripples the debugger. Does anyone know how to configure the debugger/VS2008 SP1 to handle this more efficiently?
cartden,
You may want to consider switching over to XMLSerializer instead. Here is what I have determined over time:
The XmlSerializer and DataContractSerializer classes provide a simple means of serializing and deserializing object graphs to and from XML.
The key differences are:
1. XmlSerializer has a much smaller payload than DCS if you use [XmlAttribute] instead of [XmlElement]; DCS always stores values as elements.
2. DCS is "opt-in" rather than "opt-out":
With DCS you explicitly mark what you want to serialize with [DataMember].
With DCS you can serialize any field or property, even if it is marked protected or private.
With DCS you can use [IgnoreDataMember] to have the serializer ignore certain properties.
With XmlSerializer, public properties are serialized, and they need setters to be deserialized.
With XmlSerializer you can use [XmlIgnore] to have the serializer ignore public properties.
3. BE AWARE! DCS.ReadObject DOES NOT call constructors during deserialization. If you need to perform initialization, DCS supports the following callback hooks: [OnDeserializing], [OnDeserialized], [OnSerializing], [OnSerialized] (also useful for handling versioning issues).
If you want the ability to switch between the two serializers, you can use both sets of attributes simultaneously, as in:
[DataContract]
[XmlRoot]
public class ProfilePerson : NotifyPropertyChanges
{
    [XmlAttribute]
    [DataMember]
    public string FirstName { get { return m_FirstName; } set { SetProperty(ref m_FirstName, value); } }
    private string m_FirstName;

    [XmlElement]
    [DataMember]
    public PersonLocation Location { get { return m_Location; } set { SetProperty(ref m_Location, value); } }
    private PersonLocation m_Location = new PersonLocation(); // Should change over time

    [XmlIgnore]
    [IgnoreDataMember]
    public Profile ParentProfile { get { return m_ParentProfile; } set { SetProperty(ref m_ParentProfile, value); } }
    private Profile m_ParentProfile = null;

    public ProfilePerson()
    {
    }
}
Also, check out my Serializer class that can switch between the two:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Text;
using System.Xml;
using System.Xml.Serialization;

namespace ClassLibrary
{
    // Instantiate this class to serialize objects using either XmlSerializer or DataContractSerializer
    internal class Serializer
    {
        private readonly bool m_bDCS;

        internal Serializer(bool bDCS)
        {
            m_bDCS = bDCS;
        }

        internal TT Deserialize<TT>(string input)
        {
            // ToByteArray is the author's own extension method (not shown here)
            MemoryStream stream = new MemoryStream(input.ToByteArray());
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                return (TT)dc.ReadObject(stream);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                return (TT)xs.Deserialize(stream);
            }
        }

        internal string Serialize<TT>(object obj)
        {
            MemoryStream stream = new MemoryStream();
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(stream, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(stream, obj);
            }
            // be aware that the Unicode Byte-Order Mark will be at the front of the string
            // (ToUtfString is the author's own extension method, not shown here)
            return stream.ToArray().ToUtfString();
        }

        internal string SerializeToString<TT>(object obj)
        {
            StringBuilder builder = new StringBuilder();
            XmlWriter xmlWriter = XmlWriter.Create(builder);
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(xmlWriter, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(xmlWriter, obj);
            }
            // flush the writer so the StringBuilder contains the complete document
            xmlWriter.Flush();
            string xml = builder.ToString();
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern("<?xml*>", WildcardSearch.Anywhere), string.Empty);
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern(" xmlns:*\"*\"", WildcardSearch.Anywhere), string.Empty);
            xml = xml.Replace(Environment.NewLine + " ", string.Empty);
            xml = xml.Replace(Environment.NewLine, string.Empty);
            return xml;
        }
    }
}
This is a guess, but I think it is running slow in debug mode because, for every exception, it performs some extra work to show the exception in the debug window, etc. If you are running in release mode, these extra steps are not taken.
I've never done this, so I really don't know if it would work, but have you tried setting just that one assembly to build in release mode while all the others are set to debug? If I'm right, it may solve your problem. If I'm wrong, you only waste a minute or two.
About your debugging problem, have you tried disabling the exception assistant? (Tools > Options > Debugging > "Enable the exception assistant").
Another place to look is the exception handling in Debug > Exceptions: you can disable the user-unhandled option for the CLR, or only uncheck the System.FormatException exception.
OK - I figured out the root issue. It was what I alluded to in the EDIT to the main question. The problem was that the XML correctly serialized doubles that had a value of double.NaN. I was using these values to indicate "na" when the denominator was 0D. Example: ROE (Return on Equity = Net Income / Average Equity) when Average Equity is 0D would be serialized as:
<ROE>NaN</ROE>
When the DCS tried to deserialize it, evidently it first tries to read the number, catches the exception when that fails, and then handles the NaN. The problem is that this seems to generate a lot of overhead when in DEBUG mode.
Solution: I changed the property to double? and set it to null instead of NaN. Everything now happens instantly in DEBUG mode. Thanks to all for your help.
Try disabling some IE addons. In my case, the LastPass toolbar killed my Silverlight debugging. My computer would freeze for minutes each time I interacted with Visual Studio after a breakpoint.

Passing IList<T> vs. IEnumerable<T> with protobuf-net

I noticed in the protobuf-net changelog that IList<> was supported but I'm getting the "Cannot create an instance of an interface" exception. If I change to IEnumerable<> then life is good. Does this sound correct?
// Client call
public override IList<IApplicationWorkflow> Execute(IRoleManagement service)
{
    IList<ApplicationWorkflowMessagePart> list = service.RetrieveWorkflows(roleNames);
    IList<IApplicationWorkflow> workflows = new List<IApplicationWorkflow>(list.Count);
    foreach (ApplicationWorkflowMessagePart a in list)
    {
        workflows.Add(new ApplicationWorkflowImpl(a));
    }
    return workflows;
}
// Service contract
[OperationContract, ProtoBehavior]
[ServiceKnownType(typeof (ServiceFault))]
[FaultContract(typeof (ServiceFault))]
IList<ApplicationWorkflowMessagePart> RetrieveWorkflows(string[] roleNames);
// Service implementation
public IList<ApplicationWorkflowMessagePart> RetrieveWorkflows(string[] roleNames)
{
    IList<IApplicationWorkflow> workflows = manager.RetrieveApplicationWorkflows(roleNames);
    IList<ApplicationWorkflowMessagePart> workflowParts = new List<ApplicationWorkflowMessagePart>();
    if (workflows != null)
    {
        foreach (IApplicationWorkflow workflow in workflows)
        {
            workflowParts.Add(
                ModelMediator.GetMessagePart<ApplicationWorkflowMessagePart, IApplicationWorkflow>(workflow));
        }
    }
    return workflowParts;
}
Thanks,
Mike
Also, is there a documentation site that has this and other answers? I hate to be asking newb questions. :)
Currently it will support IList<T> as a property, as long as it doesn't have to create it - i.e. allowing things like (attributes not shown for brevity):
class Order {
    private IList<OrderLine> lines = new List<OrderLine>();
    public IList<OrderLine> Lines { get { return lines; } }
}
I would have to check, but for similar reasons, I expect it would work with Merge, but not Deserialize (which is what the WCF hooks use). However, I can't think of a reason it couldn't default to List<T>... it just doesn't at the moment.
The simplest option is probably to stick with List<T> / T[] - but I can have a look if you want... but I'm in a "crunch" on a (work) project at the moment, so I can't lift the bonnet today.
Re "this and other answers"... there is a google group, but that is not just protobuf-net (protobuf-net is simply one of many "protocol buffers" implementations).
You are also free to log an issue on the project site. I do mean to collate the FAQs and add them to the wiki on the site - but time is not always my friend...
But hey! I'm here... ;-p