Dotnet Core publish: push to S3 as zip from GitLab CI/CD - amazon-s3

How can I zip the artifacts before copying them to the S3 bucket? This is needed because Elastic Beanstalk requires a zip file to update the application version.
I want to deploy the dotnet publish output to Elastic Beanstalk. I am using GitLab CI/CD to trigger the build when new changes are pushed to the GitLab repo.
In my .gitlab-ci.yml file, what I am doing is:
1. build and publish the code using dotnet publish
2. copy the published folder artifact to the S3 bucket as a zip
3. create a new Beanstalk application version
4. update the Beanstalk environment to reflect the new changes
I was able to perform all of these except the step that zips the published folder and copies it to S3. Can anyone please help me with how to zip the published folder and copy that zip to the S3 bucket? Please find my relevant code below:
build:
  image: mcr.microsoft.com/dotnet/core/sdk:3.1
  script:
    - dotnet publish -c Release -o /builds/maskkk/samplewebapplication/publish/
  stage: build
  artifacts:
    paths:
      - /builds/maskkk/samplewebapplication/publish/

deployFile:
  image: python:latest
  stage: deploy
  script:
    - pip install awscli
    - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
    - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
    - aws configure set region us-east-2
    - aws s3 cp --recursive /builds/maskkk/samplewebapplication/publish/ s3://elasticbeanstalk-us-east-2-654654456/JBGood-$CI_PIPELINE_ID
    - aws elasticbeanstalk create-application-version --application-name Test5 --version-label JBGood-$CI_PIPELINE_ID --source-bundle S3Bucket=elasticbeanstalk-us-east-2-654654456,S3Key=JBGood-$CI_PIPELINE_ID
    - aws elasticbeanstalk update-environment --application-name Test5 --environment-name Test5-env --version-label JBGood-$CI_PIPELINE_ID

I found the answer to this issue: we can simply run
zip -r ../published.zip *
This will create a zip file, which can then be uploaded to S3.
Please let me know if there is any better solution to this.
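For reference, a minimal sketch of how the zip step could slot into the deployFile job above; installing the zip package and cd'ing into the publish folder are assumptions on my part, and the published.zip object key is only illustrative:

deployFile:
  image: python:latest
  stage: deploy
  script:
    - pip install awscli
    # zip is usually not preinstalled in the python image (assumption)
    - apt-get update && apt-get install -y zip
    - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID
    - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY
    - aws configure set region us-east-2
    # bundle the publish output into a single archive, as Beanstalk expects
    - cd /builds/maskkk/samplewebapplication/publish/
    - zip -r ../published.zip *
    # upload the zip instead of the folder contents
    - aws s3 cp ../published.zip s3://elasticbeanstalk-us-east-2-654654456/JBGood-$CI_PIPELINE_ID
    - aws elasticbeanstalk create-application-version --application-name Test5 --version-label JBGood-$CI_PIPELINE_ID --source-bundle S3Bucket=elasticbeanstalk-us-east-2-654654456,S3Key=JBGood-$CI_PIPELINE_ID
    - aws elasticbeanstalk update-environment --application-name Test5 --environment-name Test5-env --version-label JBGood-$CI_PIPELINE_ID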

Related

Persistent Bitbucket pipeline build artifacts greater than 14 days

I have a pipeline which loses build artifacts after 14 days. I.e., after 14 days, without S3 or Artifactory integration, the pipeline of course loses the "Deploy" button functionality; it becomes greyed out since the build artifact has been removed. I understand this is intentional on BB/Atlassian's part to reduce costs etc. (details in the link below).
Please check the last section, "Artifact downloads and Expiry", of this page: https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/
If you need artifact storage for longer than 14 days (or more than 1 GB), we recommend using your own storage solution, like Amazon S3 or a hosted artifact repository like JFrog Artifactory.
Question:
Is anyone able to provide advice or sample code on how to approach BB Pipeline integration with Artifactory (or S3) in order to retain artifacts? Is the Artifactory generic upload/download pipe approach the only way, or is the quote above hinting at a more native BB "repository setting" to provide integration with S3 or Artifactory? https://www.jfrog.com/confluence/display/RTF6X/Bitbucket+Pipelines+Artifactory+Pipes
Bitbucket gives an example of linking to an S3 bucket on their site:
https://support.atlassian.com/bitbucket-cloud/docs/publish-and-link-your-build-artifacts/
The key is Step 4, where you link the artefact to the build.
However, the example doesn't actually create an artefact that is linked to S3, but rather adds a build status with a description that links to the uploaded items in S3. To use these in further steps you would then have to download the artefacts.
This can be done using the AWS CLI and an image that has it installed, for example amazon/aws-sam-cli-build-image-nodejs14.x (SAM was required in my case).
The following is an example that:
Creates an artefact (a txt file) and uploads it to an AWS S3 bucket
Creates a "link" as a build status against the commit that triggered the pipeline, as per Amazon's suggestion ( this is just added for reference after the 14 days... meh)
Carrys out a "deployment", where by the artefact is downloaded from AWS S3, in this stage I also then set the downloaded S3 artefact as a BitBucket artefact, I mean why not... it may expire after 14 days but at if I've just re-deployed then I may want this available for another 14 days....
image: amazon/aws-sam-cli-build-image-nodejs14.x

pipelines:
  branches:
    main:
      - step:
          name: Create artefact
          script:
            - mkdir -p artefacts
            - echo "This is an artefact file..." > artefacts/buildinfo.txt
            - echo "Generating Build Number:\ ${BITBUCKET_BUILD_NUMBER}" >> artefacts/buildinfo.txt
            - echo "Git Commit Hash:\ ${BITBUCKET_COMMIT}" >> artefacts/buildinfo.txt
            - aws s3api put-object --bucket bitbucket-artefact-test --key ${BITBUCKET_BUILD_NUMBER}/buildinfo.txt --body artefacts/buildinfo.txt
      - step:
          name: Link artefact to AWS S3
          script:
            - export S3_URL="https://bitbucket-artefact-test.s3.eu-west-2.amazonaws.com/${BITBUCKET_BUILD_NUMBER}/buildinfo.txt"
            - export BUILD_STATUS="{\"key\":\"doc\", \"state\":\"SUCCESSFUL\", \"name\":\"DeployArtefact\", \"url\":\"${S3_URL}\"}"
            - curl -H "Content-Type:application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"
      - step:
          name: Test - Deployment
          deployment: Test
          script:
            - mkdir artifacts
            - aws s3api get-object --bucket bitbucket-artefact-test --key ${BITBUCKET_BUILD_NUMBER}/buildinfo.txt artifacts/buildinfo.txt
            - cat artifacts/buildinfo.txt
          artifacts:
            - artifacts/**
Note:
I've got the following secrets/variables against the repository:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
BB_AUTH_STRING

How to copy data from private S3 bucket to Azure Blob storage via Azure yaml pipeline

I have one S3 bucket storing CSV files in it. New CSV files get added to this bucket at the beginning of each month. I want these new files to be uploaded automatically to the Azure blob storage at the beginning of each month.
The way I was thinking of doing this is to create a script (bash/PowerShell) that pulls data from the AWS S3 bucket to Azure Blob Storage via the AzCopy command, and then plug this script into an Azure YAML pipeline that runs at the start of each month. But I can't find a way to integrate this script into an Azure YAML pipeline. Is this command feasible in a YAML pipeline, or is there a simpler way to do this?
We can copy data from an S3 bucket to Azure Blob Storage using azcopy:
azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive
We can integrate azcopy with a YAML pipeline.
First, install azcopy on the pipeline agent as below:
- task: Bash@3
  displayName: Install azcopy
  inputs:
    targetType: 'inline'
    script: |
      curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
      mkdir $(Agent.ToolsDirectory)/azcopy
      wget -O $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz https://aka.ms/downloadazcopy-v10-linux
      tar -xf $(Agent.ToolsDirectory)/azcopy/azcopy_v10.tar.gz -C $(Agent.ToolsDirectory)/azcopy --strip-components=1
Next, create an Azure CLI task in the pipeline to copy data from the S3 bucket to Azure Blob Storage using azcopy.
Reference code:
- task: AzureCLI@2
  displayName: Download using azcopy
  inputs:
    azureSubscription: 'Service-Connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      end=`date -u -d "180 minutes" '+%Y-%m-%dT%H:%M:00Z'`
      $(Agent.ToolsDirectory)/azcopy/azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive --check-md5=FailIfDifferent
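As a side note, azcopy reads the AWS credentials for an S3 source from environment variables, so the copy task also needs them exposed. A minimal sketch, assuming the secrets are defined as pipeline variables named AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (the unused end variable above is presumably meant to compute a SAS expiry):

- task: AzureCLI@2
  displayName: Copy from S3 to Blob Storage
  env:
    # azcopy picks these up automatically when the source is an S3 URI
    AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
    AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
  inputs:
    azureSubscription: 'Service-Connection'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $(Agent.ToolsDirectory)/azcopy/azcopy copy "<s3-bucket-uri>" "https://StorageAccountName.blob.core.windows.net/container-name/?sas-token" --recursive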
Reference SO thread: Azure Pipelines - Download files with azcopy - Stack Overflow
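Since the question also asks about running at the start of each month, a minimal sketch of a scheduled trigger added at the root of the same pipeline YAML (the branch name main is an assumption):

schedules:
  - cron: "0 0 1 * *"           # 00:00 UTC on the 1st of every month
    displayName: Monthly S3-to-Blob copy
    branches:
      include:
        - main
    always: true                # run even if there are no new commits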

Why is my static site broken using github action and azure cli to deploy?

I'm trying to deploy my static site to Azure Storage but have been having issues getting the site to open correctly, even though the GitHub action executes without errors and the files seem to be in place. In the browser, index.html seems to load along with the CSS and JS... but the site does not run properly. The console shows a failure in the JS.
The odd thing is that I don't have any issues when using the Azure Storage extension in VS Code or the Azure CLI:
az storage blob upload-batch --account-name <ACCOUNT_NAME> -d '$web' -s ./dist --connection-string '<CONNECTION_STRING>'
when I deploy from my laptop.
My github action looks like this:
name: Blob storage website CI

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      - name: npm install
        run: |
          npm install

      - name: npm build
        run: |
          npm run build

      - name: Azure Login
        uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      - name: Azure CLI script
        uses: azure/CLI@v1
        with:
          azcliversion: latest
          inlineScript: |
            az storage blob upload-batch --account-name <ACCOUNT_NAME> -d '$web' -s ./dist --connection-string '${{ secrets.BLOB_STORAGE_CONNECTION_STRING }}'

      # Azure logout
      - name: logout
        run: |
          az logout
based on this article here.
I thought that it might be due to the Azure CLI version, but none of the versions I've tried have made a difference.
Any ideas why my site is broken when deploying with a GitHub action and the Azure CLI?
For anyone interested - I was missing environment variables during the build process in the GitHub Action. I was able to pass these, without checking in the .env files, by using GitHub secrets.
There's now a step in the action to create a .env file,
- name: Set Environment Variables
  run: |
    touch .env
    echo ENVIRONMENT_VARIABLE=${{secrets.ENVIRONMENT_VARIABLE}} >> .env
and another to remove it:
- name: Remove Environment Variables
  run: |
    rm .env
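A minimal alternative sketch, assuming the build only needs ENVIRONMENT_VARIABLE at build time and reads it from the process environment rather than from a .env file, would be to pass the secret directly to the build step via env:, avoiding the file entirely:

- name: npm build
  env:
    # hypothetical: the build tool must read this from the environment
    ENVIRONMENT_VARIABLE: ${{ secrets.ENVIRONMENT_VARIABLE }}
  run: |
    npm run build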

CircleCI Deploy to AWS S3: What is the path to my files?

My Deployment fails in CircleCI
In my config file, I have the following:
deploy:
  docker:
    - image: circleci/python:2.7-jessie
  working_directory: ~/circleci-docs
  steps:
    - run:
        name: Install awscli
        command: sudo pip install awscli
    - run:
        name: Deploy to s3
        command: aws s3 sync <filepath> s3://BUCKET-NAME/ --delete
It fails on the deploy step and I get the error:
The user provided path does not exist
I have tried a few different file paths:
/
~/applicationname
~/working-directoryname
~/
But they all give the same error.
Then I tried using the working_directory name and also /home/circleci/working_directory_name.
Both seem to succeed, yet no files appear in the bucket.
What is the path that I should be using for the filepath?
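One thing worth checking, as a hedged sketch rather than a definitive fix: if the files being synced were produced in a separate build job, they won't exist inside the deploy job's container unless a workspace is attached, and listing the directory before the sync shows what is actually on disk. The workspace path and the assumption that an earlier job called persist_to_workspace are hypothetical here.

deploy:
  docker:
    - image: circleci/python:2.7-jessie
  working_directory: ~/circleci-docs
  steps:
    # hypothetical: pull in files persisted by an earlier job via persist_to_workspace
    - attach_workspace:
        at: ~/circleci-docs
    - run:
        name: Install awscli
        command: sudo pip install awscli
    - run:
        name: List files before syncing
        command: ls -la ~/circleci-docs
    - run:
        name: Deploy to s3
        command: aws s3 sync ~/circleci-docs s3://BUCKET-NAME/ --delete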

Travis CI: Uploading artifacts to S3 results in "The bucket you are attempting to access must be addressed using the specified endpoint"

I have a Travis CI build that is configured to upload the build artifacts to S3. I've followed the Travis artifacts documentation, but when the build completes I get the following error (and the S3 bucket is empty).
ERROR: failed to upload: /home/travis/build/jonburney/KingsgateMediaPlayer-Android/
app/build/outputs/apk/app-release-unsigned.apk
err: The bucket you are attempting to access must be addressed using the specified
endpoint. Please send all future requests to this endpoint.
I have tried to specify the "endpoint" option in the configuration but it was ignored. It appears to be attempting to upload the file to
https://s3.amazonaws.com/kmp-build-output/jonburney/KingsgateMediaPlayer-Android/30/30.1/app/build/outputs/apk/app-release-unsigned.apk.
Here is a copy of the relevant section from my .travis.yml file
addons:
  artifacts: true
  s3_region: "us-west-2"
artifacts:
  paths:
    - $(git ls-files -o app/build/outputs | tr "\n" ":")
Have I missed a configuration option for this scenario? Any help is appreciated!
This was fixed after an email to the Travis-CI support team and some investigation. The code in my .travis.yml file was modified to ensure that "artifacts" was only present once, like so:
addons:
  artifacts:
    s3_region: "us-west-2"
    paths:
      - $(git ls-files -o app/build/outputs | tr "\n" ":")