How do I get notified when someone creates a new repository in GitLab?

I need to be notified when someone creates a new repository in a group (or in the instance) of my GitLab. I only want to be notified about new repositories, not all the pull requests, merges, etc.
I have tried:
Creating a project template that e-mails me when it is used. This should have worked, but projects do not run a pipeline when created from a template.
Checking webhooks. Webhooks provide the option to be notified about a number of things, but not repository creation specifically.
Checking notifications. Similar to webhooks.
Extensively searching Google for anything related to creating new repositories and notifications or triggering.
There has to be an easier way to do this than setting up a service that constantly polls the list of projects and alerts when it changes. GitLab must have an integration for notifications about new repositories.

System hooks can be used to hook into a number of system events, including project_create events for the instance.
This feature is only available on self-managed GitLab instances and can only be configured by administrators.
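For illustration, a minimal receiver for such a hook might look like this. This is a hedged sketch: Flask, the endpoint path, the addresses and the local SMTP relay are all assumptions; only the event_name, path_with_namespace and owner_name fields come from the documented project_create payload.

from flask import Flask, request
import smtplib
from email.message import EmailMessage

app = Flask(__name__)

@app.route("/gitlab-system-hook", methods=["POST"])
def on_system_hook():
    event = request.get_json(silent=True) or {}
    # System hooks deliver many event types; react to project creation only.
    if event.get("event_name") == "project_create":
        msg = EmailMessage()
        msg["Subject"] = "New GitLab project: " + str(event.get("path_with_namespace"))
        msg["From"] = "gitlab@example.com"  # placeholder sender address
        msg["To"] = "me@example.com"        # placeholder recipient address
        msg.set_content("Created by " + str(event.get("owner_name")))
        with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
            smtp.send_message(msg)
    return "", 204

if __name__ == "__main__":
    app.run(port=8000)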

Related

Alternate lambda functions deployment for production

I'm developing a web extension for the Chrome store that calls a backend deployed on AWS Lambda using the Serverless Framework.
When developing the REST API, I may introduce breaking changes. As publishing an update on the Chrome store can take a lot of time and is unpredictable (1 day to 3 weeks), the solution I'm thinking about to keep the API compatible with the extension is to deploy 2 different Lambda functions in production.
The idea is that when I push new changes to the master branch, the Lambda function with the oldest version is updated and ready to receive calls as soon as the update is approved on the Chrome store, without erasing the API currently in use.
First, is this a good pattern for handling updates to a client consuming an API when you don't have full control over it?
Second, is this something doable with the Serverless Framework, and how? I couldn't find any resources on the subject.
Thanks
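The underlying AWS primitive for this kind of side-by-side deployment is Lambda versions and aliases: each alias has its own invocable ARN, so two API entry points can coexist. As a hedged illustration using boto3 directly rather than the Serverless Framework (the function and alias names are placeholders, and the "v2" alias is assumed to exist already):

import boto3

client = boto3.client("lambda")
FUNCTION = "my-backend"  # placeholder function name

# Freeze the currently deployed code as an immutable numbered version.
new_version = client.publish_version(FunctionName=FUNCTION)["Version"]

# Point the "v2" alias at the new code; the "v1" alias, which the old
# extension still calls, keeps serving the previous version untouched.
client.update_alias(FunctionName=FUNCTION, Name="v2", FunctionVersion=new_version)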

Create releases from within a GitLab runner/pipeline

With the release of GitLab 11.7 in January 2019, we get the new key feature Publish releases for your projects. I want precisely what the screenshot on that page shows, and I want to be able to download compiled binaries using the Releases API.
I can do it manually. Of course, instructions for the manual approach can be found here on Stack Overflow. The problem I need help with is doing it as part of a CI/CD pipeline, which is not covered by the answers one can find easily.
The release notes contain a link to the documentation, which states:
we recommend doing this as one of the last steps in your CI/CD release pipeline.
From this I gather it's possible. However, the only approach I can imagine is using the GitLab API, just as I do when I create releases manually. When one wants to access the GitLab API, one has essentially three options for authentication, according to the fine manual: OAuth2 tokens, personal access tokens, and session cookies. Consequently I would need a method for having one of these available in my CI/CD pipeline, with sufficient privileges. Solutions for this problem are an ongoing discussion with lots of contributions, but virtually no tangible progress in recent years.
So, how does one create releases as one of the last steps in one's CI/CD release pipeline?
Storing my personal access token with API access in a CI/CD variable, or even in a file in the repo, is not an option for obvious reasons.
They've put up a blog post explaining how to do this:
https://about.gitlab.com/blog/2020/05/07/how-gitlab-automates-releases/
They've created a tool (release-cli) to help with this task. Basically you create a new job where you use the Docker image that provides this tool, and then call the tool with the proper parameters.
release_upload:
  image: registry.gitlab.com/gitlab-org/release-cli:v0.1.0
  script:
    # release-cli also needs a tag to attach the release to; this assumes
    # the job runs in a tag pipeline, so $CI_COMMIT_TAG is set.
    - release-cli create --name="My Release" --description="My Release description" --tag-name="$CI_COMMIT_TAG"
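For comparison, the raw-API route the question describes boils down to one POST to the Releases endpoint. A hedged sketch, where the instance URL, project ID, tag and token variable are all placeholders; the token would have to come from a masked CI/CD variable, which is exactly the trade-off the question worries about:

import os
import requests

GITLAB = "https://gitlab.example.com/api/v4"  # placeholder instance URL
PROJECT_ID = 1234                             # placeholder project ID
TOKEN = os.environ["RELEASE_TOKEN"]           # assumed masked CI/CD variable

resp = requests.post(
    f"{GITLAB}/projects/{PROJECT_ID}/releases",
    headers={"PRIVATE-TOKEN": TOKEN},
    json={
        "name": "My Release",
        "tag_name": "v1.0.0",
        "description": "My Release description",
    },
)
resp.raise_for_status()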

Is it possible to remote access and parse git revision history?

I have a use case where I need to be able to inspect Git repositories as part of a web service, and the average repo size will be very large (1 GB+) due to being used for video game projects. I need to do simple actions such as listing the revision history, etc.
Right now I'm implementing it via API calls to the remote Git host services (GitHub, Bitbucket, etc.). This works okay; however, there are some great Git projects like GitVersion that only work with real Git repos because they use libGit2sharp, and I cannot easily write a workaround for that.
I feel like this is a long shot, but I was wondering if anyone has discussed or begun work on an implementation of libGit2sharp that works with the major Git hosts via their APIs. Obviously not all actions available in libGit2 will work over an API interface, but at least most read-only actions should.
If this is an entirely new feature request, I'd like to get the opinion of someone with knowledge of the libGit2sharp codebase about how difficult such a feature would be to implement.
Git only specifies the network protocol for fetching, pushing and creating an archive. Nothing else can be done via the Git protocol (and providers will likely disable the archive so they can leverage their existing caching solutions).
If this is an entirely new feature request, I'd like to get the opinion of someone with knowledge of the libGit2sharp codebase about how difficult such a feature would be to implement.
This feature would be out of scope and impossible, as Git does not provide a way to perform these tasks.
Once you're trying to do more than Git itself offers, you're out of the Git world and into each provider's API. Trying to replicate Git operations and git commands on top of each provider's API is a whole project unto itself, and one which is likely to make you hit these providers' API limits, as in-depth analysis of repositories is not generally why they provide these services.
Not to mention that looking up each necessary object over HTTP would be extremely slow, and you'd likely not gain anything over grabbing a gigabyte or two from the network.
But if all you need is answers to a few questions that can be easily obtained from the APIs themselves (say, the latest commit and its relationship to different branches), and you do need the logic in GitVersion, then you're probably better off making its history analysis pluggable so you can feed in the data from your API lookups.
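As a hedged sketch of that kind of API-level lookup, using GitHub's REST API with a placeholder public repository and unauthenticated requests (so rate limits apply):

import requests

OWNER, REPO = "octocat", "Hello-World"  # placeholder public repository
BASE = f"https://api.github.com/repos/{OWNER}/{REPO}"

default = requests.get(BASE).json()["default_branch"]
for branch in requests.get(f"{BASE}/branches").json():
    # The compare endpoint reports how far a branch has diverged
    # from the default branch, with no clone required.
    cmp = requests.get(f"{BASE}/compare/{default}...{branch['name']}").json()
    print(branch["name"], "ahead:", cmp.get("ahead_by"), "behind:", cmp.get("behind_by"))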
I'm not familiar with how GitVersion makes its decisions, but if it doesn't just want references and their relationships to each other and to the tags, but rather wants to look at the repositories themselves, and you do need it rather than just replicating some of its logic, I would recommend downloading the repositories and performing all the analysis there. It's a much more efficient use of time to rent a bit of disk space from some provider than to try to fit each individual provider's API into some idealised version of a git command, where you then still need to figure out the edge cases of both the command and the API you're using.

OWASP ZAP: share a Context between environments and change base URI addresses

I'm new to the ZAP tool, so sorry in advance if the question is stupid, but I cannot find an answer to it so far...
I have to fix all the vulnerabilities in some application, so I installed the ZAP proxy tool locally, then explored the application manually, collected all the requests and ran the 'Active scanner' against them. So far everything is good, but the problem is that the application is quite big and it's very difficult and time-consuming to cover everything manually. Fortunately we have a dedicated automation environment where I can set up the ZAP proxy and let the tests run and populate the context (the set of URLs to test) for me.
So now my task is to somehow share contexts between different environments, with the ability to change base addresses.
E.g. I populated a context on somedomain/myapp and want to run ZAP against the same application deployed locally or on a different server (e.g. localhost/myapp).
It would be very helpful if someone could share any info how to achieve that.
Thank you in advance,
Eugene
It seems that you can create a new context and then add the existing links to it:
Create a new context
Add the existing links to the selected context (right click)
Check this link.
https://chrisdecairos.ca/intercepting-traffic-with-zaproxy/
Tiago
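If this needs to happen unattended across environments, the same steps can be scripted through the ZAP API. A hedged sketch using the python-owasp-zap-v2.4 client, where the API key, context name, file path and URL regexes are all placeholders:

from zapv2 import ZAPv2

zap = ZAPv2(apikey="changeme")  # assumes ZAP running locally with this API key

# On the environment where the context was populated:
zap.context.export_context("myapp", "/tmp/myapp.context")

# On the target environment:
zap.context.import_context("/tmp/myapp.context")
# Swap the base address so the same URL set targets the new host.
zap.context.exclude_from_context("myapp", "https://somedomain/myapp.*")
zap.context.include_in_context("myapp", "http://localhost/myapp.*")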

How to push/sync tickets between Trac instances/projects?

Does anyone know how to push tickets from one Trac instance to another?
The problem that I'm trying to solve is the following:
Our company is doing some development for a big international firm (let's call it CompanyX) that has everything behind VPNs. We have our own Trac, hosted at our firm, which we use for managing all our projects. CompanyX also uses Trac, and since the developers at CompanyX cannot use our Trac for tracking bugs, requests and issues, they use their own. The reason is that their security policy is very restrictive, with no Internet access to our server, and nothing can be done about that.
The problem is that we are also forced to use THEIR Trac, because they prefer to communicate everything through it internally, and they expect us to conform to their workflow as well. And for that purpose we have to connect to their VPN via some IE Java plugin client from Juniper (which does not remember passwords) and every time configure whatnot, just to see a ticket or two on a weekly basis, which is really tiresome.
Since the communication is mainly one-directional, from the client to our firm, with no real interaction, I was wondering: is there an EASY way to just push (or even sync) the tickets and their updates from the client's Trac to our Trac server that would satisfy their outsourced security provider?
(It is not possible for us to touch the Trac source on their server, so by EASY I mean some plugin or script or something similar which would be easily accepted by their admin.)
There is TicketImportPlugin, which can import tickets from CSV or Excel files. The opposite direction is to export tickets as CSV or TSV files via the link at the bottom of a ticket page.
TicketMoverPlugin is able to move tickets from one Trac instance to another.
You should be able to do something using the XML-RPC plugin. You can script up an application that queries your client's Trac for tickets (using ticket.query()), then grabs the ticket details (ticket.get()) and posts them to your Trac as new tickets (ticket.create()).
This would require both Trac instances to have that plugin installed and a Trac account created with the XML_RPC permission. You will have to make sure that your client finds this acceptable. Since it's a pre-packaged plugin that you can enable on a per-account basis, it shouldn't disrupt their normal workflow very much.
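A hedged sketch of such a script using Python's xmlrpc.client, where the hostnames, credentials and the query string are placeholders and the ticket.query/get/create calls are the ones the plugin documents:

import xmlrpc.client

# Both endpoints assume the XML-RPC plugin is enabled and the accounts
# have the XML_RPC permission; URLs and credentials are placeholders.
theirs = xmlrpc.client.ServerProxy("https://user:secret@trac.companyx.example/login/xmlrpc")
ours = xmlrpc.client.ServerProxy("https://user:secret@trac.ourfirm.example/login/xmlrpc")

for ticket_id in theirs.ticket.query("status!=closed"):
    _, _created, _changed, attrs = theirs.ticket.get(ticket_id)
    # ticket.create(summary, description, attributes, notify)
    ours.ticket.create(
        attrs["summary"],
        attrs["description"],
        {"type": attrs.get("type", "defect")},  # copy more fields as needed
        True,
    )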