How to build a docker image from a github repository - redis

In the official docs we can see:
# docker build github.com/creack/docker-firefox
It works fine for me. docker-firefox is a repository with a Dockerfile in its root directory.
Then I want to build a redis image at an exact version, 2.8.10:
# docker build github.com/docker-library/redis/tree/99c172e82ed81af441e13dd48dda2729e19493bc/2.8.10
2014/11/05 16:20:32 Error trying to use git: exit status 128 (Initialized empty Git repository in /tmp/docker-build-git067001920/.git/
error: The requested URL returned error: 403 while accessing https://github.com/docker-library/redis/tree/99c172e82ed81af441e13dd48dda2729e19493bc/2.8.10/info/refs
fatal: HTTP request failed
)
I got the error above. What's the right format for building a docker image from a github repo?

docker build url#ref:dir
Git URLs accept context configuration in their fragment section,
separated by a colon :. The first part represents the reference that
Git will check out, this can be either a branch, a tag, or a commit
SHA. The second part represents a subdirectory inside the repository
that will be used as a build context.
For example, run this command to use a directory called docker in the
branch container:
docker build https://github.com/docker/rootfs.git#container:docker
https://docs.docker.com/engine/reference/commandline/build/
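Applying that syntax to the repository from the question, something like the following should work (the commit SHA and 2.8.10 subdirectory are taken from the tree URL in the question; this assumes the Dockerfile lives in that subdirectory at that commit):
docker build https://github.com/docker-library/redis.git#99c172e82ed81af441e13dd48dda2729e19493bc:2.8.10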

The thing you specified as the repo URL is not a valid git repository. You will get an error if you try
git clone github.com/docker-library/redis/tree/99c172e82ed81af441e13dd48dda2729e19493bc/2.8.10
The valid URL for this repo is github.com/docker-library/redis. So you may want to try the following:
docker build github.com/docker-library/redis
But this will not work either. To build from github, docker requires a Dockerfile in the repository root; however, this repo doesn't provide one. So I suggest you simply clone the repo and build the image using the local Dockerfile.
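A sketch of that local workflow, assuming the 2.8.10 directory and its Dockerfile exist at the commit referenced in the question:
git clone https://github.com/docker-library/redis.git
cd redis
git checkout 99c172e82ed81af441e13dd48dda2729e19493bc
cd 2.8.10
docker build -t redis:2.8.10 .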

One can use the following example, which sets up a CentOS 7 container for testing the ORC file format. Make sure to escape the # sign:
$ docker build https://github.com/apache/orc.git\#:docker/centos7 -t orc-centos7
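Once built, the image can be run like any other local image, e.g.:
$ docker run -it orc-centos7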

Related

How to build container serving Vue SPA using Cloud Native Buildpacks

Currently I'm trying to build a container serving a VueJS application via Cloud Native Buildpacks.
I already have a working Dockerfile that builds VueJS in production mode and then copies the results into an nginx image, but I would like to try to use CNB.
So I've created an empty VueJS project for testing via vue create vue-tutorial and am trying to do something like what's described at https://cli.vuejs.org/guide/deployment.html#heroku, but using CNB.
Does anyone know a working recipe for doing that with CNB?
P.S. Currently I'm trying to build that with
pack build spa --path . \
--buildpack gcr.io/paketo-buildpacks/nodejs \
--buildpack gcr.io/paketo-buildpacks/nginx
but I'm getting the following error (and I'm not sure that I'm on the right track):
===> DETECTING
ERROR: No buildpack groups passed detection.
ERROR: Please check that you are running against the correct path.
ERROR: failed to detect: no buildpacks participating
ERROR: failed to build: executing lifecycle: failed with status code: 100
UPD
My current Dockerfile:
# build stage
FROM node:lts-alpine as build-stage
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
RUN npm run build
# production stage
FROM nginx:1.19-alpine as production-stage
COPY --from=build-stage /app/dist /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
We chatted about this in Slack, but I wanted to capture it here too:
pack build --buildpack heroku/nodejs --buildpack https://cnb-shim.herokuapp.com/v1/heroku-community/static yourimage
This command may do what you want. The static buildpack used in that example is not yet converted to a cloud native buildpack, but the shim may allow you to build a workable artifact. Then run your image with something like:
docker run -it -e PORT=5000 -p 5000:5000 yourimagename
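One piece that's likely missing: the shimmed static buildpack looks for a static.json in the project root to know what to serve (the same file the linked Vue deployment guide describes; dist is Vue CLI's default build output directory, as in the Dockerfile above):
echo '{ "root": "dist" }' > static.json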

Gitlab-CI cannot clone

I have a very basic integration configured for Gitlab-CI but it fails almost at the beginning when it has to clone the code.
My integration is this:
image: node:latest
stages:
  - build
  - test
cache:
  paths:
    - node_modules/
    - dist/
build-prod:
  stage: build
  script:
    - npm install
    - npm run build-prod
  artifacts:
    paths:
      - node_modules/
      - dist/
test_with_karma:
  stage: test
  script: ng test
And the error that I get is this:
Running with gitlab-runner 11.7.0 (8bb608ff)
on fakehost 2eaf11ea
Using Docker executor with image node:latest ...
Pulling docker image node:latest ...
Using docker image sha256:8c67bfd7b95bdc535edc4a4144f5392b0f73efd6385fbcb47747d028d7059359 for node:latest ...
Running on runner-2eaf11ea-project-56-concurrent-0 via fakehost...
Cloning repository...
Cloning into '/builds/redacted/frontend'...
remote: You are not allowed to download code from this project.
fatal: unable to access 'https://gitlab-ci-token:xxxxxxxxxxxxxxxxxxxx#working-domain.com/redacted/frontend.git/': The requested URL returned error: 403
/bin/bash: line 65: cd: /builds/redacted/frontend: No such file or directory
ERROR: Job failed: exit code 1
What is the problem here?
Check if this is covered by gitlab-org/gitlab-ce issue 39469
YAY - it works for me. This problem seems to have multiple solutions.
The one that worked for me is #44855.
To summarize: being an Administrator on Gitlab does not mean you have the "access" to do whatever you want to do in Gitlab.
The "unable to access" permission applies to the person who is logged into Gitlab and running the job.
To fix the problem, the person / account running the job must be a member (master) of the project.
This applies to private projects.
It is not necessary to make a private project public, even though that appears to fix the problem. Gitlab suggests you must have https for the project to work; in practice you can use http.
SOLUTION: add your account to the project, even if you are the Administrator.
And:
Conrad has described it correctly.
You need to have rights to the project to run a pipeline; however, as administrator, you can start any pipeline.
I've seen the case where a user who was an Admin in Gitlab could push his commit from the command line despite theoretically having no rights to the project - and the pipeline failed.
This inconsistency needs to be fixed: either the Admin user should not be able to push/start pipelines he has no rights for, or he should automatically be granted all rights to all projects. I'd prefer the first, because it separates gitlab administration from project rights. Sometimes I prefer not having full rights, just like working as non-root under Linux.

When I try to add local repository as per the tutorial, it gives me error as 'this directory does not appear to be a Git repository'

I want to upload my project to my GitHub account. When I try to add the local repository as per the tutorial, it gives me the error 'this directory does not appear to be a Git repository'.
You need to initialize the repository first. You can do it by running the command below in your terminal.
git init
Basically you need to follow these steps.
First, initialize git in the specific folder:
git init
Then take the https or ssh link of the github repository and add it as a remote:
git remote add origin [url of repository]
Then add all files and folders:
git add -A
It will add all the files and folders of the project.
If you want to check which files and folders are going to be uploaded:
git status
Then commit and write a message:
git commit -m "first commit"
Then push all of this:
git push origin master
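Putting it all together, the full sequence looks like this (the remote URL is a placeholder; substitute your own repository's https or ssh link):
git init
git remote add origin https://github.com/<username>/<repository>.git
git add -A
git status
git commit -m "first commit"
git push origin master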

Make.git open in ssh

I have downloaded a git with wget on a VPS through PuTTY.
I see the file is listed on the vps like so:
bitcoin-sniffer.git .lastlogin .python_history
Now how can I execute the .git, or actually use the files that are within it? I have tried
git clone bitcoin-sniffer.git
The error:
fatal: destination path 'bitcoin-sniffer.git' already exists and is not an empty directory.
Generally, the git clone command is followed by an address - an ssh or HTTPS path - to download a repo. The git command is not normally run against a *.git "package".
An example would be:
git clone https://github.com/sebicas/bitcoin-sniffer.git
This would download the repo and create a folder named bitcoin-sniffer. Within this folder, git commands can be run, like git status.
The "git" you acquired is a full git repository, with the entire history of the protect and all the information you need to get the current state of the files. Judging by the .git extension, I would assume that the repository is "bare", meaning that it only contains the compressed history but not a working copy of the current state of the project. Conventionally, bare repos have a .git extension, while a full working copy would have a .git folder in the project root.
Your intuition to clone the repository to get a working copy is correct. It's not working because by default, git clone running locally will try to make a folder with the same name as the repo. Give it a different folder name as an additional parameter instead:
git clone bitcoin-sniffer.git bitcoin-sniffer
In all probability, though, this is an extra step. You can clone directly from a remote location using either SSH or HTTPS. If your project comes from GitHub, for example, you can get a read-only copy (that you can modify locally but not push back) anonymously over HTTPS:
git clone https://github.com/sebicas/bitcoin-sniffer.git
You really shouldn't be getting "gits" with wget under normal circumstances.

How to publish docker images to docker hub from gitlab-ci

Gitlab provides a .gitlab-ci.yml template for building and publishing images to its own registry (click "new file" in one of your projects, select .gitlab-ci.yml and docker). The file looks like this, and it works out of the box :)
# This file is a template, and might need editing before it works on your project.
# Official docker image.
image: docker:latest
services:
  - docker:dind
before_script:
  - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" $CI_REGISTRY
build-master:
  stage: build
  script:
    - docker build --pull -t "$CI_REGISTRY_IMAGE" .
    - docker push "$CI_REGISTRY_IMAGE"
  only:
    - master
build:
  stage: build
  script:
    - docker build --pull -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG"
  except:
    - master
But by default, this will publish to gitlab's registry. How can we publish to docker hub instead?
No need to change that .gitlab-ci.yml at all; we only need to add/replace the environment variables in the project's pipeline settings.
1. Find the desired registry url
Using hub.docker.com won't work; you'll get the following error:
Error response from daemon: login attempt to https://hub.docker.com/v2/ failed with status: 404 Not Found
Default docker hub registry url can be found like this:
docker info | grep Registry
Registry: https://index.docker.io/v1/
index.docker.io is what I was looking for.
2. Set the environment variables in gitlab settings
I wanted to publish gableroux/unity3d images using gitlab-ci; here's what I used in Gitlab's project > Settings > CI/CD > Variables:
CI_REGISTRY_USER=gableroux
CI_REGISTRY_PASSWORD=********
CI_REGISTRY=docker.io
CI_REGISTRY_IMAGE=index.docker.io/gableroux/unity3d
CI_REGISTRY_IMAGE is important to set.
It defaults to registry.gitlab.com/<username>/<project>.
The registry url needs to be updated, so use index.docker.io/<username>/<project>.
Since docker hub is the default registry when using docker, you can also use <username>/<project> instead. I personally prefer it verbose, so I kept the full registry url.
This answer should also cover other registries, just update environment variables accordingly. 🙌
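To sanity-check these values outside of CI, the same registry and credentials can be exercised locally (a sketch using the example names above, assuming a Dockerfile in the current directory):
docker login -u gableroux docker.io
docker build -t index.docker.io/gableroux/unity3d .
docker push index.docker.io/gableroux/unity3d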
To expand on GabLeRoux's answer:
I had issues on the pushing stage of the GitLab CI build:
denied: requested access to the resource is denied
By changing my CI_REGISTRY to docker.io (removing the index.), I was able to push successfully.
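With that change, a working variable set looks like this (usernames and image names are placeholders):
CI_REGISTRY_USER=<username>
CI_REGISTRY_PASSWORD=<password>
CI_REGISTRY=docker.io
CI_REGISTRY_IMAGE=index.docker.io/<username>/<project>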