I need to find all the issues discovered in a snapshot/scan in SonarQube. I can't use the web API, since the volume can be excessive for new projects on their first scan. I have a query that can find the latest snapshot with the project information, and I can query issues by project, but I can't figure out how to relate issues to a snapshot. There has to be a way, since SonarQube does it - "New issues" on the Project page.
Has anyone done this, or does anyone have enough experience with the crazy schema to be able to figure it out? Can't wait for the schema rationalization...
SonarQube 5.6.3 on Windows 2012 R2 with SQL Server 2012.
There is currently no association between snapshot and issue. Nor has there ever been one. The closest you can come is to use date parameters to narrow the set of issues created right around the time of your analysis. Note that this could be difficult if you run analyses close together.
The "new issues" metrics shown on the project homepage are just that - metrics. However, if you click through on one, you'll find yourself in a date-based Issues search.
You can do the same sort of thing using the web service, again, via date-based criteria. Or you could use the sinceLeakPeriod parameter.
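If you stay on the database route instead, a sketch of that date-based narrowing is below. The table and column names (snapshots.project_id, snapshots.islast, issues.project_uuid, issues.created_at) are my recollection of the 5.6 schema, and the dates are stored as milliseconds since the epoch, so treat all of them as assumptions and verify against your instance; the project key is just a placeholder.

    -- Sketch only: column names assumed from the 5.6 schema, verify before use
    select i.kee, i.severity, i.status
    from projects p
    join snapshots s on s.project_id = p.id
                    and s.islast = 1                 -- latest analysis of the project
    join issues i on i.project_uuid = p.uuid
    where p.kee = 'my:project'                       -- placeholder project key
      and i.created_at >= s.created_at               -- rows inserted by (or after) that analysis

Since islast marks the most recent analysis, issues whose rows were inserted at or after that snapshot's timestamp are essentially the ones first raised by it; widen the window slightly if your analyses run very close together.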
Our team has migrated TFS projects to a new TFS server, but a few members of the team are unknowingly still committing code to the old TFS server. How do we migrate the code (Pull Requests, Branches) from the old TFS server and replace just the code part on the target server? Other items, such as user stories and TFS tickets, are already valid on the new server because users have been creating them there.
Do we need to take down the TFS server, and will it affect other projects in the collection?
I cannot find a straightforward way.
You can't find a straightforward way because there isn't one. The amount of effort that would be involved in re-synchronizing the two instances would be absolutely massive and would require the setup and usage of various third-party utilities.
Source code is easy enough to address on a person-by-person basis (have them update their remotes to point to the new server and push there instead). Pull requests? No way to migrate. Work items? Possible but a huge pain. Other stuff? Who knows.
Your best bet here is to cut your losses.
We have some old reports which are not being used anymore by the business and wish to remove/archive these.
I have 2 queries related to this:
a) What is the best way to find if a report has not been used for the past 12+ months?
b) Is there any simple way of moving the reports no longer being used (not used for > 12 months) to a different location (i.e. new folder), while keeping the folder structure intact?
We have searched for solutions on the web but have not been able to find an automated solution for this. As the number of reports we have identified is in the range of ~5,000, we are looking for an automated way to work this out.
Would running an SQL query on the server (physical machine) be advisable? If we run this query on the content store, we would need to figure out the column/field in which the actual report path lives (in the query below I have used <report_path> as a placeholder, but I am not sure whether such a field, or such a query, can be used):
    update <table>
    set <report_path> = '/content/folder[@name=''Home'']/folder[@name=''Report'']/report[@name=''ABC012 - My Report'']'
    where <report_path> = '/content/folder[@name=''Home'']/folder[@name=''Archive_test'']/report[@name=''ABC012 - My Report'']'
Would this kind of query work?
If not, can anyone suggest a way in which we can move reports to a single folder on the same Cognos box? (We are using Cognos 10, with DB2 and Netezza.)
I can help you with a)
http://pic.dhe.ibm.com/infocenter/cfpm/v10r1m0/index.jsp?topic=%2Fcom.ibm.swg.im.cognos.ug_cra.10.1.0.doc%2Fug_cra_id4425SampleAuditReports.html
You can use the provided package and sample reports to find the information you need.
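If you would rather query the audit database directly than build on the sample audit reports, something along the lines of the query below lists each report's last recorded run and keeps only those idle for 12+ months. The table and column names (COGIPF_RUNREPORT, COGIPF_REPORTPATH, COGIPF_LOCALTIMESTAMP) are what the standard audit schema usually uses, but verify them against your own audit store; the date arithmetic below is DB2-flavoured.

    -- Assumed audit-schema names; check your audit database before relying on this
    select COGIPF_REPORTPATH,
           max(COGIPF_LOCALTIMESTAMP) as last_run
    from COGIPF_RUNREPORT
    group by COGIPF_REPORTPATH
    having max(COGIPF_LOCALTIMESTAMP) < (current timestamp - 12 months)

Keep in mind that reports which have never been executed at all will not appear in the audit table, so you would still have to compare this list against the full report list from the content store (or an SDK listing) to catch those.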
And for b), you should look at the Cognos SDK. I don't think updating the tables directly is a good idea.
I have searched high and low for a solution to this problem to no avail.
Basically, the situation is as follows:
We are currently migrating our existing TFS server to another machine, which has been going well up until now.
Unfortunately I'm unable to complete the configuration of the Report server and the like, as I get the following error:
"Failed to add SWSERVER\susan account to the TFSEXECROLE role on the Tfs_Warehouse relational database"
SWSERVER is the name of the previous machine that hosted the TFS server.
The thing is that SWSERVER\susan is an obsolete account and was actually removed as a user account on the previous machine, which I think is a major part of the problem.
From what I can gather, TFS can still see it in the restored databases and thinks it's a viable account, but since the account technically doesn't exist, it can't actually do anything with it.
Another part of the question is: if I go to the original machine (SWSERVER) and remove the SWSERVER\susan user, will that have an effect on how TFS or SQL Server operates, especially if that account (or any other similar account) is linked to anything in either program?
I'd much appreciate any help anyone can provide.
I hope I've explained my situation well enough, but if anybody needs any more information, please don't hesitate to let me know.
You can't remove users; they will fall out of scope anyway. However, that is not your problem: your TFS instance has been moved from one server to another without following the documented procedure.
You need to follow the instructions to move Team Foundation Server from one environment to another. Although they are based on the more common domain-to-domain move, you can think of a non-domain-joined server as having a domain with the same name as the local computer.
This documentation also assumes you are using the same hardware, so you will need to mix and match between "Move Team Foundation Server from one environment to another" and "Move Team Foundation Server from one hardware configuration to another".
While not really that hard, you do need to follow all of the steps...
Just want to thank you for your reply and help. As it turns out I was flogging a dead horse with the TFS Reporting setups when I found out that the reports aren't even used currently on the existing setup.
I did, however, manage to figure out that if I added every user that previously existed as a Windows user on the new machine and then used the TFSConfig Identities /change command to change the domain name (machine name, in this case) to that of the new server, the error messages stopped, and after 3-4 reinstall attempts everything seems to be working the way it should.
This link was incredibly helpful:
http://msdn.microsoft.com/en-us/library/ms404883.aspx
Thanks again!
I am trying to provide a consolidated method of retrieving the results of the last scan for each project in HP Fortify Security Center.
I have gone the route of querying the fortifySSC database and am falling a little short. Has anyone ever attempted to do this by way of a SQL query?
I would be happy to provide the query that I have thus far, if needed.
Or if anyone has an idea on how to accomplish this via the command line, I'd be interested in that, too.
Thanks in advance for your help!
I've never done this in the CLI via SQL commands, but there is a utility called fortifyclient that is able to download the latest FPRs from a project. If you want to automate things, fortifyclient can handle token-based authentication. It works perfectly for us in our CI environment.
I did get the query I needed via the SQL Profiler, so that's how I resolved this.
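For anyone attempting the same, the general shape is a latest-record-per-group join between project versions and their uploaded scan artifacts. The table and column names below are placeholders rather than the real SSC schema, so use a Profiler trace (or just browse the schema) to substitute the actual names:

    -- Placeholder names only: project_version / artifact are not the real SSC tables
    select pv.project_name,
           pv.version_name,
           a.upload_date
    from project_version pv
    join artifact a on a.project_version_id = pv.id
    where a.upload_date = (select max(a2.upload_date)
                           from artifact a2
                           where a2.project_version_id = pv.id)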
We have the standard 3 environment setup of development, testing and production. Each environment has their own report server, web server, database server, etc.
Part of our migration is to move our business objects (xi r2) reports between the servers but as of right now we need to manually update the connection settings for each report. This is mildly painful now at 40+ reports and will become a nightmare as we continue.
Due to how we generate reports we cannot dynamically change the connection string when we generate the report. We are using stored procs instead of Universes because that is what the team is most familiar with.
Any suggestions would be greatly appreciated.
There is an API that you can use to programmatically update this sort of thing, although I can't remember how to do it. Check out the docs provided by Business Objects - IIRC they are not publicly available (at least they weren't in 2006 when I last worked with it), so you may have to get them from the vendor.
Take a look at the Report class's ReportLogon class in the CrystalDecisions.Enterprise.Desktop.Report assembly of the BusinessObjects SDK. There are quite a few options for changing the database connection.
I wrote something similar for a client to make bulk changes to Universes and WebI reports. I would imagine it is very similar for Crystal Reports.
Are you changing the Universe Connection or the Universes themselves?
In our environment, we worked around this by naming the Universes the same between environments, with each environment having its own Connection. This avoids needing to change each report.
I searched far and wide, and it seems like this is an unusual circumstance. My final solution, which seems to be okay, is to have a consistent DSN connection string in each environment. This means that each connection string is effectively the same.
It still feels wrong and if anyone has other ideas that would be great.
EDIT:
This failed miserably. After a little bit of testing, I found out that many of our stored procedures would not run using the DSN. After that I gave up completely.