ZAP API scan context file format

I'm running the ZAP API scan script on a REST API, but I have to host the OpenAPI spec file on my own web server. When I run the scan, it logs alerts against the URL where the spec is hosted, and I would like to exclude that URL from the context. I saw that you can provide a context file using the following command-line flag:
-n context_file   context file which will be loaded prior to scanning the target
Where can I find the format of the context file?

Launch ZAP desktop, create a context with the details you want (including an exclude regex for the URL where your spec is hosted), export it, and pass the exported file to your API scan.
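The exported context is an XML file. A rough sketch of the part that matters for your case, assuming a ZAP 2.x export (the context name and URLs are placeholders, and a real export contains many more elements):
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<configuration>
    <context>
        <name>api-scan</name>
        <inscope>true</inscope>
        <!-- URLs matching these regexes are in scope -->
        <incregexes>https://api.example.com.*</incregexes>
        <!-- URLs matching these regexes are excluded, e.g. where the spec is hosted -->
        <excregexes>https://specs.example.com/openapi.json</excregexes>
    </context>
</configuration>
You would then pass the exported file to the scan with the -n flag, e.g. ./zap-api-scan.py -t https://api.example.com/openapi.json -f openapi -n api-scan.context (paths are placeholders).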

Related

File upload request failed in Jmeter even after following correct steps

I have followed the steps below to upload a file in JMeter, but it didn't work. It throws "Sorry, an error occurred while trying to execute your request. Please try again". I have attached screenshots for more details.
Enabled Use multipart/form-data
Copied the file to be uploaded into the /bin directory
Tried with Use multipart/form-data both checked and unchecked, but no luck
In my HTTP request I pass action_id=1203 as a query parameter, and as form parameters I pass the others, like msgId, fieldId, etc. But as you can see from the screenshot, when I execute it, all of my form parameters are sent in the single "msgId" key, and I don't know why.
Screenshots (not reproduced here) show: the headers I pass; my request with its query and form parameters; the File Upload tab of the HTTP request; the failed response after execution, where all form params end up in the single "msgId" key; and the F12 network view of the web page's form parameters (checked manually on the web it works fine, so the problem is in my JMeter request).
Just record the file upload using JMeter's HTTP(S) Test Script Recorder; it will generate the relevant HTTP Request sampler and HTTP Header Manager configuration, which can later be correlated/parameterized.
The only thing you need to do is copy the file you're uploading into the "bin" folder of your JMeter installation before recording. The file path can be changed to whatever you want afterwards.
Also, according to JMeter Best Practices, you should always use the latest version of JMeter, so consider upgrading to JMeter 5.5 (or whatever the latest stable version available at the JMeter Downloads page is) as soon as possible.
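For comparison, here is a sketch of what a correct multipart/form-data body looks like: each form parameter travels in its own part with its own name (the boundary, values, and file name below are illustrative). If everything lands in a single msgId part, as in the failing request, the sampler's parameters are being serialized into one field instead of separate parts.
POST /app/action?action_id=1203 HTTP/1.1
Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryXYZ

------WebKitFormBoundaryXYZ
Content-Disposition: form-data; name="msgId"

12345
------WebKitFormBoundaryXYZ
Content-Disposition: form-data; name="fieldId"

67
------WebKitFormBoundaryXYZ
Content-Disposition: form-data; name="file"; filename="report.pdf"
Content-Type: application/pdf

<binary file content>
------WebKitFormBoundaryXYZ--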

Using Workbox via a local copy, without a CDN

I have a small device that serves a webpage using Nginx on a local network. I'm developing the page with Vue, and I need it to keep working as normal once a person has connected to the server, visited the page, and then disconnected.
I'm currently using the Workbox plugin, and this is the code I get:
importScripts("https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js");
importScripts("/precache-manifest.b62cf508e2c3da8c27f2635f7aab384a.js");
The problem is that it goes out to the internet to download that file, and I will not have an internet connection.
I tried downloading the file, but internally it goes out to the internet again.
Is there a way to get this to work in an offline environment?
You can follow the guidance in the workbox-sw docs to download a local copy of the bundled Workbox runtime libraries, and modify your service worker script to use those.
Running:
$ npx workbox-cli@4.3.1 copyLibraries /path/to/dir
from the command line will download a local copy of the runtime to the specified directory (replace /path/to/dir with the desired location).
You can then modify your service worker script so that it reads:
importScripts("/path/to/dir/workbox-v4.3.1/workbox-sw.js");
workbox.setConfig({
modulePathPrefix: '/path/to/dir/workbox-v4.3.1/'
});
importScripts(
"/precache-manifest.b62cf508e2c3da8c27f2635f7aab384a.js"
);
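Alternatively, since the precache-manifest file name suggests the service worker is generated by workbox-webpack-plugin in GenerateSW mode, Workbox v4 can do the copying for you at build time via the importWorkboxFrom: 'local' option (this option was removed in Workbox v5+). A sketch, assuming the Vue CLI PWA plugin, which passes workboxOptions through to the webpack plugin:
// vue.config.js -- sketch; assumes @vue/cli-plugin-pwa with Workbox 4.x
module.exports = {
  pwa: {
    workboxPluginMode: 'GenerateSW',
    workboxOptions: {
      // copy the Workbox runtime into the build output and reference
      // it with relative URLs instead of the Google CDN
      importWorkboxFrom: 'local'
    }
  }
};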

Azure DevOps Testing

The basic purpose is to test a profile-image upload API. I have an API which takes an image file as input and updates the profile picture according to the given auth. I ran this API in Postman and it worked fine. Now what I want to achieve is to run this collection, which has just one API for now, on Azure DevOps using npm and newman, and publish the test results. The issue that I am facing is that I cannot find a way to upload that file. In a Postman collection, the file path is the path where your file is placed on your PC. In order to run that API on DevOps, what path should I give? Also, is there a way to upload an image, or any sort of file?
In postman collection, file path is the path in which your file is placed on your pc.
According to this description, you want to upload the picture from a local file path.
The first thing that is clear is that you need to use a self-hosted agent to run the pipeline, because only an agent installed on your local machine can reach your local file path from Azure DevOps. You can refer to this document on how to install a self-hosted agent.
Usually, the Postman collection is exported as a .json file and then pushed to the repo. If you set the file path in the collection, is it included in the collection.json?
In addition, it is difficult to reproduce your issue based on the information available; more details are needed for further investigation:
1. Which API are you using?
2. The Postman collection settings.
3. The specific operation process.
4. The pipeline definition.
It will be easier to understand in the form of screenshots.
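For what it's worth, a minimal sketch of such a pipeline, assuming the collection and the test image it references by relative path are both committed to the repo, so the paths resolve on whatever agent checks them out (file names are illustrative; newman resolves file paths against its working directory):
# azure-pipelines.yml
steps:
  - script: npm install -g newman
    displayName: Install newman
  - script: newman run tests/profile-upload.postman_collection.json --reporters cli,junit --reporter-junit-export results.xml
    workingDirectory: $(Build.SourcesDirectory)
    displayName: Run Postman collection
  - task: PublishTestResults@2
    inputs:
      testResultsFiles: results.xml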

Authentication using Spinnaker expression helper function

I have built a pipeline that is triggered by a Git push to a specific file, which contains additional meta-information like the target namespace and the version of the Kubernetes manifest to be deployed.
Within an expression I would like to read the artifact using
${ #fromUrl( execution['trigger']['resolvedExpectedArtifacts'][0]['boundArtifact']['reference'] ) }
What I try to achieve is a GitOps approach with a set of config files in Git which trigger a pipeline for a parameterized Kubernetes manifest to deploy multiple resources.
When I execute that expression, either by starting the pipeline or via curl, I get a 401 (in the Orca logs). The Git credentials are configured using username/password as well as a token, both in the config and in orca-local.yml.
But it seems they are not used.
Am I on the wrong path? Is there an easier way to access a file's content in a pipeline?
That helper won't go through any sort of authentication; it expects the endpoint to be open to your Spinnaker instance.
Spinnaker normally treats artifacts as pass-through, so in order to get the contents of the file inside the pipeline you'll have to go through an intermediate stage, such as writing out a property file in a Jenkins stage (https://www.spinnaker.io/guides/user/pipeline/expressions/#property-files) or via a webhook with custom auth headers.
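A sketch of the property-file route, with illustrative stage and key names: have the Jenkins job fetch the file and write the values you need into a properties file, configure that file as the stage's Property File, and read it from the stage context downstream.
# build.properties -- written and archived by the (hypothetical) Jenkins job
namespace=staging
version=1.4.2
A later stage can then reference the values with an expression like ${ #stage('Read config')['context']['version'] }.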

Export file from NetSuite's FileCabinet to FTP

A file resides in the NetSuite File Cabinet and needs to be placed on an FTP server each day.
I'm not sure how to handle this via a Suitelet/RESTlet, or whether it's even possible, but I would prefer not to use an external source/application.
My current, and hopefully temporary, workaround is a local scheduled task that runs a script to pull files from NetSuite and upload them to the FTP server.
In SuiteScript 2.0, unsecured FTP is still not supported, but SS2.0 does have the capability to do SFTP. See http://www.upilioconsulting.com/blog/netsuite-2016-2-sftp-suitescript-2-0/
In SuiteScript 1.0 it's not supported; the workaround is to write middleware code (e.g. in PHP) and let the middleware do the FTP transfer.
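A minimal sketch of the SS2.0 route, using the N/sftp module from a scheduled script; all paths and credentials are placeholders, and the passwordGuid and hostKey must be provisioned in your account first per the N/sftp documentation:
/**
 * @NApiVersion 2.x
 * @NScriptType ScheduledScript
 */
define(['N/sftp', 'N/file'], function (sftp, file) {
    function execute(context) {
        // Placeholder credentials -- create the password GUID and capture
        // the server's host key before this will connect.
        var connection = sftp.createConnection({
            username: 'ftpuser',
            passwordGuid: 'YOUR_PASSWORD_GUID',
            url: 'sftp.example.com',
            directory: '/incoming',
            hostKey: 'YOUR_HOST_KEY'
        });
        // Load the file from the File Cabinet and push it to the server
        var exportFile = file.load({ id: '/SuiteScripts/export/daily.csv' });
        connection.upload({
            file: exportFile,
            replaceExisting: true
        });
    }
    return { execute: execute };
});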
NetSuite doesn't interact with FTP.
You need a bridge server of some sort that runs a web app (full-blown Apache or Nginx running PHP, or just a simple Node service).
Just get a server, install some web server/web service, and POST your files to it (nlapiRequestURL from a scheduled script). Have the web app on the bridge server send the files to the FTP server. If you are using NetSuite, you can afford the cost of the bridge server.
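A sketch of the SS1.0 posting side, with a placeholder bridge URL and file path; getValue() returns plain text for text file types and base64 for binary ones:
// SuiteScript 1.0 scheduled script -- POSTs a File Cabinet file to the bridge
function sendFileToBridge() {
    var f = nlapiLoadFile('/SuiteScripts/export/daily.csv');
    // Illustrative list of text types; everything else arrives base64-encoded
    var textTypes = ['PLAINTEXT', 'CSV', 'HTMLDOC', 'JAVASCRIPT', 'XMLDOC'];
    var payload = JSON.stringify({
        name: f.getName(),
        content: f.getValue(),
        encoding: textTypes.indexOf(f.getType()) >= 0 ? 'text' : 'base64'
    });
    nlapiRequestURL('https://bridge.example.com/upload', payload, {
        'Content-Type': 'application/json'
    });
}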
One possible solution is to create a saved search on Documents that lists all the files in NetSuite, filtered by createdate or lastmodifieddate. Create a scheduler to fetch only the new files and save them locally wherever you want.
Note that the files will be in base64-encoded string format; you need to decode them again to obtain the original files.
As bknights said, NetSuite doesn't support FTP. You need a web server (any server-side language will do; I have written one in Node.js) to receive the files.
The content of a text file will be plain text, so no decode logic is required for text files. However, binary/pdf/image and other types will be in base64 format, as NetSuite's JS has no way of handling binary data. So make sure you decode it before you create the file on your FTP server.
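To complete the picture, a minimal sketch of the receiving end described above, assuming Express; the field names and drop directory mirror the hypothetical payload in the SS1.0 sketch:
// bridge server -- receives the POSTed file and writes it where the
// FTP server (or a follow-up transfer job) can pick it up
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
app.use(express.json({ limit: '20mb' }));

app.post('/upload', (req, res) => {
    const { name, content, encoding } = req.body;
    // decode base64 for binary files; text files arrive as plain text
    const data = encoding === 'base64' ? Buffer.from(content, 'base64') : content;
    fs.writeFileSync(path.join('/ftp-drop', path.basename(name)), data);
    res.sendStatus(200);
});

app.listen(3000);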