Should I set a limit for the upload file size in JFrog through the UI? - file-upload

As a DevOps admin, I want to know whether we should set a limit for uploading files to Artifactory through the UI.
If yes, what is the recommended size?

The default UI upload limit in Artifactory is 100MB. This default is intended to prevent browser session timeouts.
If you want to change the limit to match your requirements, go to Admin -> Artifactory -> General -> Settings and change "File Upload In UI Max Size".
In general, the recommended way to deploy large files is via the REST API, where a deploy request might look something like this:
curl -X PUT -u myUser:myPassword -T test.txt "http://localhost:8081/artifactory/libs-release-local/test/test.txt"
More Info: In case of UI failures, you can use this Knowledge Base article to find and resolve the issue.
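For very large files, it can also help to send the file's checksum with the deploy request so Artifactory can verify the transfer. A rough sketch only; the repository path, file name, and SHA-1 value below are placeholders, adjust them to your setup:
sha1sum large-bundle.tgz   # note the printed SHA-1
curl -X PUT -u myUser:myPassword \
  -H "X-Checksum-Sha1: <paste-sha1-here>" \
  -T large-bundle.tgz \
  "http://localhost:8081/artifactory/libs-release-local/bundles/large-bundle.tgz"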

Related

Can't upload or see the DEPLOY button highlighted - can't upload artifact .tar .tgz

Artifactory Version: EnterpriseX license 7.15.3
I have valid access to deploy artifacts in all the repositories where I'm trying to upload an artifact using the Artifactory UI.
I go to the repository and see, under effective permissions, that my user ID is listed with all permissions checked, up to and including the Manage level.
When I try to upload a small .docx/.tar/.tgz file, it works fine.
When I try to upload a large .tgz file of 10+GB (a tarball to be used in an air-gapped environment), the DEPLOY button is not highlighted so that I can click on it. It stays greyed out (greenish); see the picture. Normally, after you select a file to upload, the UI animates from 1% up to 100% and, after some time, shows two checkboxes: one for deploying as a deploy bundle and one for using a user-defined layout format. With the 10+GB tar file, when it reaches 100%, these two checkboxes never appear, the green DEPLOY button is not highlighted, and I see a red circle with a cross (which indicates something is still not ready). Why does it even show 100% for the pre-upload/processing then?
Am I missing anything? I have already raised the default 100MB file size limit in Artifactory that I was hitting earlier.
PS: A related post for when I try the same thing at the command line and get 502 and 403 errors: Artifactory Curl -X PUT large file - 502 Bad Gateway The proxy server received an invalid response from an upstream server / 403 Bad props auth token

ZAP API scan context file format

I'm running the ZAP API scan script on a REST API, but I have to host the OpenAPI spec file on my own web server. When I run the scan, it logs alerts against the URL where the spec is hosted, and I would like to exclude that URL from the context. I saw that you can provide a context file using the following command-line flag:
-n context_file context file which will be loaded prior to scanning the target
I was wondering where I could find the format of the context file?
Launch the ZAP desktop application, create the context with the details you want, export it, and use it in your API scan.
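For illustration, a minimal sketch of how the exported context could be passed to the API scan script; the target URL and file names below are placeholders:
./zap-api-scan.py -t https://api.example.com/openapi.json -f openapi -n my_api.context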

Can I transfer images between Shopify sites?

I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally, I would keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text area of pages/blogs, so it's not going to be practical to do this manually in any way; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
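For example, a rough shell sketch, assuming the image URL sits in a single CSV column with no embedded commas; the file name and column number below are placeholders, check your own export:
tail -n +2 products_export.csv | cut -d',' -f25 | while read -r url; do
  # skip empty cells; strip any query string from the URL before using it as a file name
  [ -n "$url" ] && curl -sS "$url" -o "$(basename "${url%%\?*}")"
done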
Have you tried transferring between the two sites using FTP? If you have SSH access:
Log in to the server via SSH.
Change to the directory containing the file (or to your desired location).
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details.
Use cd to move to your desired destination.
Use the binary command.
Use hash if you want a progress indicator.
If you are sending the file from the server you SSH'd into, issue the put <filename> command; if you want to pull the file from the other server to the one you are logged into, use get <filename> instead.
Wait for the transfer to complete - it might take a while. A rough sketch of such a session is shown below.
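A sketch of such a session; the host names, path, and file name are placeholders:
ssh user@source-server.example.com
cd /var/www/uploads
ftp destination-server.example.com
# the commands below run inside the ftp prompt
binary
hash
put bigfile.tgz
bye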

How to upload large files to MediaWiki in an efficient way

We have to upload a lot of VirtualBox images which are between 1GB and 6GB.
So I would prefer to use FTP for the upload and then include the files in MediaWiki.
Is there a way to do this?
Currently I use a jailed FTP user who can upload to a folder, and I then use the UploadLocal extension to include the files.
But this works only for files smaller than around 1GB. If we upload bigger files we get a timeout, and even after setting PHP's execution time to 3000s, the import stops after about 60s with a 504 gateway timeout (which is also the only thing appearing in the logs).
So is there a better way of doing this?
You can import files from the shell using maintenance/importImages.php. Alternatively, enable upload by URL by flipping $wgAllowCopyUploads, $wgAllowAsyncCopyUploads and friends (this requires that the job queue be run using cron jobs). Or decide whether you need to upload these files into MediaWiki at all, because just linking to them might suffice.
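As a sketch of the shell import route, assuming the images were uploaded by FTP to /data/vm-images (the path, extensions, and comment are placeholders, and option names may vary by MediaWiki version, so check importImages.php --help first):
cd /var/www/mediawiki
php maintenance/importImages.php --extensions=ova,vdi,vmdk \
  --comment="VirtualBox image imported from FTP drop" /data/vm-images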

CORS Amazon S3 benchmarking with JMeter

Can you please suggest how we can use JMeter to benchmark the performance of uploading content to CORS Amazon S3?
We are using the Plupload module in Drupal. We need to click on "Add Files" and select the file.
Then the request goes to the CORS Amazon S3 server.
We want to use JMeter for benchmarking with a set of users.
This should be your starting point:
https://github.com/bigstepinc/jmeter_s3_custom_sampler
First of all, I would recommend recording your file upload action via the HTTP(S) Test Script Recorder.
Once you have the script skeleton, you need to take care of authentication and correlation. See, for example, A JMeter test plan for Drupal for details.
As soon as you are able to replay your test script up to the "Add files" page, check out Performance testing: Upload and Download Scenarios with Apache JMeter for a set of instructions on properly simulating file upload.
You may wish to parametrize the files along with their MIME types so that more than one file can be uploaded during the performance test. Take a look at the CSV Data Set Config element.
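As a rough sketch, the CSV referenced by a CSV Data Set Config could look like the file below, and the finished test plan can then be run in non-GUI mode; the file names and the test plan name are placeholders:
cat > upload_files.csv <<'EOF'
filename,mimetype
image1.jpg,image/jpeg
archive.tgz,application/gzip
EOF
jmeter -n -t s3_upload_test.jmx -l results.jtl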