How to find the GitLab project size by using the API?

I am trying to find the size of a project by using the GitLab API. I got some idea about this from the GitLab documentation, but that endpoint seems to return the size of a file on a particular branch. I tried it, but I got the exception below in my browser.
{"message":"400 (Bad request) \"file_path\" not given"}
I do not know how to use the API below to get the project size; calling it is what produced the error above.
https://gitlab.company.com/api/v3/projects/<project_ID>/repository/files?private_token=GMecwr8umMN4wx5L

You got this error because this endpoint has two required parameters:
file_path (required) - Full path to new file. Ex. lib/class.rb
ref (required) - The name of branch, tag or commit
Anyway, getting a file count with this endpoint is not possible, because it is meant for
CRUD for repository files
Create, read, update and delete repository files using this API
So you can only perform CRUD operations on one specified file.
Listing files may be done with https://docs.gitlab.com/ee/api/repositories.html#list-repository-tree
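For illustration, here is a minimal sketch of calling that list-repository-tree endpoint from Node; it is shown against the current v4 API (the question's URL is v3), the host, project ID and token are placeholders, and recursive/per_page are standard parameters of the endpoint. Note the results are paginated.

// list-tree.ts - sketch of listing repository files via the list-repository-tree endpoint.
// Host, project ID and token are placeholders; never hard-code a real token.
const GITLAB = "https://gitlab.company.com/api/v4";
const PROJECT_ID = 123;           // <project_ID> from the question
const TOKEN = "<private_token>";

async function listTree(): Promise<void> {
  const url = `${GITLAB}/projects/${PROJECT_ID}/repository/tree?recursive=true&per_page=100`;
  const res = await fetch(url, { headers: { "PRIVATE-TOKEN": TOKEN } });
  if (!res.ok) throw new Error(`GitLab API returned ${res.status}`);

  const entries: Array<{ path: string; type: string }> = await res.json();
  for (const entry of entries) {
    console.log(entry.type, entry.path);   // "blob" for files, "tree" for directories
  }
}

listTree().catch(console.error);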

Related

How to bundle next.js app as a javascript widget or npm package?

I have a survey built with Next.js using Incremental Static Regeneration (ISR). I would like to bundle it so I can either publish it to npm or host a single entry file, so that I can use the survey in other applications.
It's currently hosted on Vercel and uses getStaticProps and getStaticPaths to call my API of 'surveys' and 'survey questions'. ISR is great because it allows me to dynamically load each step of the survey based on the API structure, and if I modify it, the revalidate property will regenerate the new order of questions for the survey. It also lets me get away with having only one page for all surveys/question types.
My App structure is like this:
- src
  - pages
    - [surveyid]
      - [...question].tsx
Based on the request (and the response received during build time/revalidation), the static files for the survey ID are created, and the Next.js router routes to each survey question based on the next step in the JSON object from the API, e.g. /surveyid/question-1, /surveyid/question-2, etc.
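For context, this is roughly what such a catch-all ISR page looks like; the survey API URL, the response shape and the 60-second revalidate value below are assumptions for illustration, not the asker's actual code.

// pages/[surveyid]/[...question].tsx - minimal ISR sketch (illustrative only).
import type { GetStaticPaths, GetStaticProps } from "next";

type Question = { id: string; title: string };
type Props = { surveyId: string; question: Question };

export const getStaticPaths: GetStaticPaths = async () => {
  // No paths pre-built here; "blocking" renders new survey steps on first request.
  return { paths: [], fallback: "blocking" };
};

export const getStaticProps: GetStaticProps<Props> = async ({ params }) => {
  const surveyId = params?.surveyid as string;
  const steps = params?.question as string[];   // e.g. ["question-1"]

  // Hypothetical survey API; replace with the real endpoint.
  const res = await fetch(`https://api.example.com/surveys/${surveyId}/${steps.join("/")}`);
  if (!res.ok) return { notFound: true, revalidate: 60 };

  const question: Question = await res.json();
  // revalidate regenerates the page, so a reordered survey is picked up without a rebuild.
  return { props: { surveyId, question }, revalidate: 60 };
};

export default function QuestionPage({ surveyId, question }: Props) {
  return (
    <main>
      <h1>{question.title}</h1>
      <p>Survey: {surveyId}</p>
    </main>
  );
}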
This is all working well in production when deployed to Vercel.
When it comes to bundling this so a survey can be loaded into other sites I have been quite lost.
When I run next build, it builds the production files that are served to Vercel, but there are many entry points and not a single .js file.
I tried running next export and serving the out folder locally; the pages and links are accurate, but this breaks ISR, and the Next.js documentation states that next export doesn't work with ISR.
Ideally I would like to build the application to a single entry file, e.g. index.js, and then either publish it as a package to npm or host it on my server. Then I would load the survey by installing the package, or by adding the direct URL to the src of a script tag in my other projects, e.g. <script src='https://survey.com/widget.js'></script>, and provide some settings/options with the request so I can tell it which survey to return.
Is there a way for this to be done while still maintaining ISR?
Would I have to create some sort of entry file to dispatch the request and return the static files from my vercel server instead as a workaround?
I am currently trying to see if I can use Rollup to build it out to a single file, but I am unsure whether this will break the Next.js router when it comes to the dynamic rendering (or revalidation) of pages.
In a perfect world I would also like to leverage some of the cool features of Next.js, like its middleware, to determine the geolocation from the request header. But I'm happy if I can just get the survey to render in another project at this point.
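For what it's worth, the kind of embed described above usually boils down to a small loader exposed on window. This is purely a hypothetical sketch of that consumer-side usage, not an answer; SurveyWidget, mount and the option names are invented for illustration.

// Hypothetical usage in a host page after <script src="https://survey.com/widget.js"></script>
// has defined window.SurveyWidget (all names invented for illustration).
declare global {
  interface Window {
    SurveyWidget?: {
      mount: (options: { el: HTMLElement; surveyId: string }) => void;
    };
  }
}

window.SurveyWidget?.mount({
  el: document.getElementById("survey-root")!,   // container element in the host page
  surveyId: "customer-satisfaction",             // tells the widget which survey to load
});

export {}; // keep this file a module so the global augmentation applies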

Azure DevOps Testing

The basic purpose is to test a profile image upload API. I have an API which takes an image file as input and updates the profile picture according to the given auth. I ran this API in Postman and it worked fine. What I want to achieve now is to run this collection, which has just one API for now, on Azure DevOps using npm and newman, and to publish the test results. The issue I am facing is that I cannot find a way to upload that file. In a Postman collection, the file path is the path where your file is placed on your PC. In order to run that API on Azure DevOps, what path should I give? Also, is there a way to upload an image or any other sort of file?
"In postman collection, file path is the path in which your file is placed on your pc."
According to this description, you want to upload a picture from a local file path.
The first thing that is clear is that you need to use a self-hosted agent to run the pipeline, because only an agent installed on your local machine can reach your local file path from Azure DevOps. You can refer to this document on how to install a self-hosted agent.
Usually, the Postman collection is exported as a .json file and then pushed into the repo. If you set the file path in the collection, is it included in the collection.json?
In addition, it is difficult to reproduce your issue based on the information available; more detailed information is needed for further investigation:
1. Which API are you using?
2. Postman collection settings
3. Specific operation process
4. Pipeline definition
It will be easier to understand in the form of screenshots.
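To make the file-path part concrete, here is a sketch of running the collection with newman's Node API on the agent. The collection, environment and image file names are placeholders; the point is that the form-data src recorded in the collection has to resolve relative to newman's working directory (workingDir mirrors the --working-dir CLI option), and the image has to be committed to the repo so it exists on the self-hosted agent.

// run-collection.ts - sketch of running the Postman collection with newman on the agent.
// All file names below are placeholders.
import * as path from "path";
import * as newman from "newman";

newman.run(
  {
    collection: path.join(__dirname, "profile-upload.postman_collection.json"),
    environment: path.join(__dirname, "devops.postman_environment.json"),
    // Relative form-data paths in the collection (e.g. "avatar.png") resolve against this.
    workingDir: path.join(__dirname, "fixtures"),
    reporters: ["cli", "junit"],
    reporter: { junit: { export: path.join(__dirname, "results", "junit-report.xml") } },
  },
  (err) => {
    if (err) {
      console.error("Collection run failed:", err);
      process.exit(1);
    }
    console.log("Collection run complete; JUnit results written for the publish step.");
  }
);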

How to get a file path in ASP.NET Core to work in Azure DevOps

I am trying to read the content of a text file, so I am using the line of code below; it works locally.
string src = @"Templates\UserMailTemplate.txt";
string[] content = System.IO.File.ReadAllLines(src);
After hosting it via Azure DevOps, I am getting the following issue:
Could not find file 'D:\home\site\wwwroot\Templates\UserMailTemplate.txt'.
Server.MapPath does not work in ASP.NET Core.
Please let me know how I can resolve this in DevOps.
The cause of your issue is in the error message: Could not find file. This probably means the file is either not published, or not published at the location you're expecting it to be.
Please validate that you're looking in the right location.
To make the files available after publish, do one of the following:
1. Have a look at IHostingEnvironment and its properties
2. Set the Build Action to Content and access it using Application.GetContentStream
3. Set the Build Action to Embedded Resource and call Assembly.GetManifestResourceStream to read the file from the assembly
4. Set Copy to Output Directory to Copy always or Copy if newer and access the files in the output directory
'Slightly' off-topic
Please have a look at a completely different way to manage mail templates in Azure. The easiest change from what you're doing now is probably to store them in Azure Storage.
If you persist in using files, please refer to this Azure Tip: Working with Files in Azure App Service

How can I list all uploads for a project?

I would like to access the list of all uploads that have been added to a given project on my company GitLab server.
I don't mean versioned files; I mean attached files: binaries and other types of files that have been attached to issues, merge requests, etc.
It's OK if I have to use the API for that.
What I've tried
My first approach was through GET /projects/:id/repository/files/:file_path, but that's for the versioned files.
Then, I found out about POST /projects/:id/uploads, but that's only for uploading and not for listing already uploaded files.
Is there a way to list all those uploaded files?
I believe this is not possible.
There is an open issue for retrieving specific files which has not received much attention:
https://gitlab.com/gitlab-org/gitlab-ce/issues/55520
Hopefully, there will eventually be an endpoint
GET /projects/:id/uploads
I had the same question, and after getting in touch with GitLab support they confirmed that this is not currently implemented (as of November 2021) and forwarded me the following three feature requests:
API list all files on a project: https://gitlab.com/gitlab-org/gitlab/-/issues/197361
Attachment Manager: https://gitlab.com/gitlab-org/gitlab/-/issues/16229
Retrieve uploaded files using API: https://gitlab.com/gitlab-org/gitlab/-/issues/25838
A workaround seems to be to export the whole project; you'll find the uploads in that archive and will be able to list them.
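As a rough sketch of that workaround through the project export API (host, project ID and token are placeholders; in the exported archive the attachments should sit under an uploads/ directory):

// export-project.ts - sketch: schedule a project export, wait for it, download the archive.
import { writeFileSync } from "fs";

const GITLAB = "https://gitlab.example.com/api/v4";
const PROJECT_ID = 123;
const HEADERS = { "PRIVATE-TOKEN": "<private_token>" };

async function exportProject(): Promise<void> {
  // 1. Schedule the export.
  await fetch(`${GITLAB}/projects/${PROJECT_ID}/export`, { method: "POST", headers: HEADERS });

  // 2. Poll until the export is finished.
  let status = "";
  while (status !== "finished") {
    await new Promise((resolve) => setTimeout(resolve, 10_000));
    const res = await fetch(`${GITLAB}/projects/${PROJECT_ID}/export`, { headers: HEADERS });
    status = (await res.json()).export_status;
  }

  // 3. Download the archive; uploaded attachments are inside it.
  const download = await fetch(`${GITLAB}/projects/${PROJECT_ID}/export/download`, { headers: HEADERS });
  writeFileSync("project-export.tar.gz", Buffer.from(await download.arrayBuffer()));
}

exportProject().catch(console.error);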

Authentication using Spinnaker expression helper function

I have built a pipeline that is triggered by a Git push to a specific file which contains additional meta information, like the target namespace and version of the Kubernetes manifest to be deployed.
Within an expression I would like to read the artifact using
${ #fromUrl( execution['trigger']['resolvedExpectedArtifacts'][0]['boundArtifact']['reference'] ) }
What I'm trying to achieve is a GitOps approach with a set of config files in Git which trigger a pipeline for a parameterized Kubernetes manifest to deploy multiple resources.
When I execute that expression, either by starting the pipeline or by using curl, I get a 401 (in the Orca logs). The Git credentials are configured using username/password as well as a token, both in config and in orca-local.yml.
But it seems they are not used.
Am I on the wrong path? Is there an easier way to access a file's content in a pipeline?
That helper won't go through any sort of authentication; it expects the endpoint to be open to your Spinnaker instance.
Spinnaker normally treats artifacts as pass-through, so in order to get the contents of the file inside the pipeline you'll have to go through an intermediate stage, such as writing out a property file in a Jenkins stage (https://www.spinnaker.io/guides/user/pipeline/expressions/#property-files) or using a webhook with custom auth headers.