CxOSA scan saying "0 libraries were analyzed" - Bamboo

I have a Bamboo plan with the Checkmarx plugin. The CxSAST scan is working fine, scanning the code and producing the scan report, but the CxOSA scan does not seem to take place. The CxOSA scan reports that 0 libraries were analyzed, even though I am using a lot of open source JS libraries like lodash, jQuery, etc. I went through the docs as well, but with little luck. I am pretty new to Checkmarx, so any help is appreciated. Here is the CxOSA-related config that I am using in my Bamboo PlanSpec.java file:
.put("cxOsaArchiveIncludePatterns", "*.zip, *.war, *.ear, *.tgz")
.put("osaEnabled", "true")

I have some experience with CxOSA. When I got 0 results, it was because I wasn't scanning the correct files (the binaries instead of the code) or because I hadn't enabled dependency resolution in the OSA scan. I think that is the issue for you; you should add a parameter like ("executepackagedependency", "true"). I'm using the plugin rather than PlanSpec.java, so I'm not sure of the exact parameter name.
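If that guess carries over to PlanSpec.java, it would sit next to the other OSA keys, roughly like this (the parameter name below is the answerer's guess, not a verified plugin key):

.put("osaEnabled", "true")
// "executepackagedependency" is unverified; check an exported task configuration for the real key
.put("executepackagedependency", "true")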

So it seems I have the answer. There are several key-value pairs that need to be set as part of the Checkmarx configuration. Initially, I had deleted some keys whose values were empty strings, and one of those keys was cxOsaFilterPatterns. When I added this key back with an empty string value, Checkmarx started scanning the CxOSA part.
For reference, you can use this configuration:
("serverCredentialsSection", "globalConfigurationServer")
("projectName", "Your project name")
("teamPathName", "Your team name")
("teamPathId", "Your team id")
("serverUrl", "Checkmarx server URL")
("username", "Checkmarx username")
("password", "Checkmarx password")
("presetName", "Checkmarx Default")
("cxSastSection", "customConfigurationCxSAST")
("folderExclusions", "node_modules")
("filterPatterns","!**/_cvs/**/*, !**/.svn/**/*, !**/.hg/**/*, !**/.git/**/*, !**/.bzr/**/*, !**/bin/**/*,!**/obj/**/*, !**/backup/**/*, !**/.idea/**/*, !**/*.DS_Store)
("isIncremental", "true")
("generatePDFReport", "true")
("intervalBegins", "01:00")
("intervalEnds", "04:00")
("osaEnabled", "true")
("cxOsaFilterPatterns", "")
("cxOsaArchiveIncludePatterns", "*.zip, *.war, *.ear, *.tgz")
("scanControlSection", "globalConfigurationControl")
("isSynchronous", "true")
("presetId", "36")

Related

Error with both blogdown:new_site() and New Project > website using blogdown

I am following "blogdown: Creating Websites with R Markdown" and unfortunately am stuck in Section 1.2. I created a new project in an empty folder, but get this error:
blogdown::new_site()
'C:\Users\rose89\AppData\Roaming\Hugo\hugo.exe" new site ".' is not recognized as an internal or external command, operable program or batch file.
Error in shell(cmd, mustWork = TRUE, intern = intern) :
'"C:\Users\rose89\AppData\Roaming\Hugo\hugo.exe" new site "." --force -f toml' execution failed with error code 1
I have tried removing and re-installing blogdown. When I try File > New Project > New Directory > Website using blogdown, I get a popup error: "R code execution error".
I am using Windows, RStudio 1.3.959, and R version 4.0.2. Here is some other info:
getwd()
1 "C:/Users/rose89/Documents/anothernewproject"
list.files('content', '.md$', full.names = TRUE, recursive = TRUE)
character(0)
I prefer to use the first approach in the console, but I feel hopeless as I can't even get the "point and click" approach to work. If anyone has suggestions, I would greatly appreciate them! Thank you. Also, this is my first post on StackOverflow and my first time trying blogdown, so apologies if my question is not phrased clearly!
Update: I believe the error is due to my domain's group policy blocking hugo.exe (and other zipped .exe programs) on my Windows machine. I am working with my IT department to find a workaround.
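If the policy only blocks executables under the default install location, a possible workaround, assuming there is a directory your policy permits, is to point blogdown at a custom Hugo directory via its blogdown.hugo.dir option:

# Assumption: C:/Tools/Hugo is a directory your group policy does not block
options(blogdown.hugo.dir = "C:/Tools/Hugo")  # set before installing; add to .Rprofile to persist
blogdown::install_hugo()                      # installs Hugo into the directory above
blogdown::new_site()                          # should now find the permitted hugo.exe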

Terraform/GCP: ssh-keys not being added to metadata

I'm trying to add ssh-keys to my Google Cloud project at the project level with terraform:
resource "google_compute_project_metadata_item" "oslogin" {
project = "${google_project_services.myproject.project}"
key = "enable-oslogin"
value = "false"
}
resource "google_compute_project_metadata_item" "block-project-ssh-keys" {
project = "${google_project_services.myproject.project}"
key = "block-project-ssh-keys"
value = "false"
}
resource "google_compute_project_metadata_item" "ssh-keys" {
key = "ssh-keys"
value = "user:ssh-rsa myverylongpublickeythatireplacewithtexthereforobviousreasons user#computer.local"
depends_on = [
"google_project_services.myproject",
]
}
I tried all combinations of the two metadata flags oslogin and block-project-ssh-keys, which always get set without issues. But the ssh keys never appear in GCP's web GUI, let alone in the authorized_keys file. I even tried adding the depends_on to make sure the project exists before adding the keys, but that didn't help either.
Yet, Terraform says:
google_compute_project_metadata_item.ssh-keys: Creation complete after 8s (ID: ssh-keys)
Adding the exact same key manually in the web GUI works fine. At this point I believe I have tried everything and read all the first-page Google results for 'terraform gcp add ssh key' and similar queries... I'm at my wits' end.
The issue was that the ssh key was being added to a different project.
I started with Google's tutorial on GCP/Terraform. It first creates a generic project with the gcloud tool and then creates the accounts in that generic project; this is necessary because you need a user to run terraform against the API. From there, each time you apply, Terraform creates a new project for those users. The generic project created with gcloud is not touched after the initial creation.
If you omit the "project" parameter from the google_compute_project_metadata_item.ssh-keys resource, Terraform uses the generic project and adds the ssh keys there - at least in my case.
Solution: explicitly add the project parameter to the metadata resource item to make sure the key is added to the right project.
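A sketch of the fixed resource, reusing the same project reference the other metadata items already use:

resource "google_compute_project_metadata_item" "ssh-keys" {
  # Explicit project, so the key lands in the intended project rather than the generic one
  project = "${google_project_services.myproject.project}"
  key     = "ssh-keys"
  value   = "user:ssh-rsa myverylongpublickeythatireplacewithtexthereforobviousreasons user@computer.local"
}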

Qlik Sense: how to specify path in Google Drive?

I have a Google Drive account divided into some folders (say, Folder1, Folder2, etc.), with some subfolders in them.
I successfully managed to connect my Qlik Sense app to it.
I need to make it look for files only in a given subfolder.
At the moment, I read it as follows ([...] is the location):
(URL IS [[...]connectorID=GoogleDriveConnector&table=ListSpreadsheets&appID=], qvx);
It works and reloads successfully, but I need it to filter the Spreadsheets properly. How could I get what I need?
To connect to Google Drive you in fact use the Qlik Web Connectors. Once the web connector package is installed, it can be started as a service or run manually from its folder.
Once it is installed (a recent version can be downloaded from https://qliksupport.force.com/apex/QS_Home_Page, but it seems you already have it, since Google Drive is part of it), it is much nicer to configure connections to online drives there.
You just go to http://localhost:5555/web and generate ready-made code.
In my implementation I used the following options step by step to get the data I wanted (a generated example follows the list):
1) CanAuthenticate to generate permanent token
2) ListSpreadsheets
3) ListWorksheets
4) GetWorksheet
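The generated code follows the same shape for each table; for example, a ListSpreadsheets call typically looks like this (the connection name is a placeholder):

LOAD * FROM [$(vQwcConnectionName)]
(URL IS [http://localhost:5555/data?connectorID=GoogleDriveConnector&table=ListSpreadsheets&appID=], qvx);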
You can't just specify a path, but you can retrieve the path from the QWC services. Use an algorithm like this:
Load lookup tables like ListFiles/ListWorksheets.
Iterate through every row with a 'for' loop and, inside it, use an 'if' statement to find the desired folder id/worksheet key by its name (stored in the vTitle variable):
FOR i = 0 to (NoOfRows('Google_ListWorksheets') - 1)
  LET vWorksheetKey = Peek('worksheetKey', $(i), 'Google_ListWorksheets');
  LET vTitle = Left(Peek('title', $(i), 'Google_ListWorksheets'), 3);
  // 'XYZ' is a placeholder for the first three characters of the worksheet you want
  IF '$(vTitle)' = 'XYZ' THEN
    LOAD * FROM [$(vQwcConnectionName)]
    (URL IS [http://localhost:5555/data?connectorID=GoogleDriveConnector&table=GetWorksheet&worksheetKey=$(vWorksheetKey)&appID=], qvx);
  END IF
NEXT
At the end you will get your files by their location.

Using the TFS API, filetypes with extensions like .svnExe get ignored

I'm working on a tool which migrates from SVN to TFS using the TFS API.
workspace.CheckIn(
    pendingChanges,
    currentUser.TfsUser,
    set.LogMessage + " on " + String.Format("{0:d/M/yyyy HH:mm:ss}", set.TimeStamp) + " by " + currentUser.SvnUser,
    (CheckinNote)null,
    (WorkItemCheckinInfo[])null,
    (PolicyOverrideInfo)null
);
This is the way I check a revision in, but sometimes it ignores files like .svnExe or other "unknown" file types.
Is there a way to check in ALL file types in TFS?
There are two possibilities that I can think of:
Possibility 1: Something is causing the PendAdd() to fail.
For example, if the path already exists in Version Control, you have to use a PendEdit() instead.
To diagnose this possibility, you should subscribe to the VersionControlServer.NonFatalError event (see the sketch below).
Possibility 2: You could have a corrupt workspace cache
You can refresh the cache by calling Workstation.Current.EnsureUpdateWorkspaceInfoCache(), by running tf workspaces /collection:http://yourserver:8080/tfs/DefaultCollection, or by deleting the cache directories manually.
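A minimal sketch of both checks, assuming an authenticated VersionControlServer instance named versionControlServer (the names are illustrative):

// Possibility 1: surface suppressed failures such as a skipped PendAdd
versionControlServer.NonFatalError += (sender, e) =>
{
    if (e.Exception != null)
        Console.WriteLine("Non-fatal exception: " + e.Exception.Message);
    if (e.Failure != null)
        Console.WriteLine("Non-fatal failure: " + e.Failure.Message);
};

// Possibility 2: rule out a stale workspace cache before pending changes
Workstation.Current.EnsureUpdateWorkspaceInfoCache(
    versionControlServer, versionControlServer.AuthorizedUser);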

DB Query no longer recognizes SQL parameters in existing application when debugging in VS2010

I just started working with an application that I inherited from someone else and I'm having some issues. The application is written in C# and runs in VS2010 against the 3.5 framework. I can't run the application on my machine to debug because it will not recognize the way they referenced their parameters when writing their DB queries.
For instance wherever they have a SQL or DB2 query it is written like this:
using (SqlCommand command = new SqlCommand(
    "SELECT Field1 FROM Table1 WHERE FieldID=@FieldID", SQLconnection))
{
    command.Parameters.AddWithValue("FieldID", 10000);
    SqlDataReader reader = command.ExecuteReader();
    ...
If you will notice, the "Parameters.AddWithValue("FieldID", 10000);" statement does not include the "@" symbol from the original command text. When I run it on my machine I get an error message stating that the parameter "FieldID" could not be found.
I change this line:
command.Parameters.AddWithValue("FieldID", 10000);
To this:
command.Parameters.AddWithValue("@FieldID", 10000);
And all is well... until it hits the next SQL call and bombs out with the same error. Obviously this must be a setting within Visual Studio, but I can't find anything about it on the internet. Half the examples for SQL parameter addition are written including the "@" and the other half do not include it. Most likely I just don't know what to search for.
My last resort is to change every query over to use the "@" at the front of the parameter name, but this is the transportation and operations application used to manage the corporation's shipments, and it literally has thousands of parameters. It's hard to explain the ROI on your project when the answer to the director's question "How's progress?" happens to be "I've been hard at it for a week and I've almost started."
Has anyone run into this problem, or do you know how to turn this setting off so it can resolve the parameter names without the "@"?
Success! System.Data is automatically imported whenever you create a .NET solution. I removed this reference and added it back to make sure I had the latest version of the library, and that fixed the issue. I must have had an old version that was originally pulled in; that's the only thing I can figure.
It's handled by the .NET Framework data providers, not Visual Studio.
It depends on the data source. Look here: Working with Parameter Placeholders
You can try working with the System.Data.Odbc provider and using the question mark (?) placeholder. In this case, don't forget to add the parameters in the same order they appear in the query.
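A short sketch of that ODBC variant, assuming an ODBC connection string for the same database (the names are illustrative):

using (OdbcConnection connection = new OdbcConnection(odbcConnectionString))
using (OdbcCommand command = new OdbcCommand(
    "SELECT Field1 FROM Table1 WHERE FieldID = ? AND RegionID = ?", connection))
{
    // ODBC ignores parameter names; only the order matters,
    // so add parameters in the order the ? placeholders appear
    command.Parameters.AddWithValue("@FieldID", 10000);
    command.Parameters.AddWithValue("@RegionID", 5);

    connection.Open();
    using (OdbcDataReader reader = command.ExecuteReader())
    {
        // read results here
    }
}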