iTunes Enterprise Partner Feed (EPF) directory structure

I'm trying to create a script to download the most recent feed information file (e.g. video.tbz), but there does not seem to be a pattern for the directory structure. Basically, how can I get the most recent incremental feed file?

As you said, there is no structure. You have to iterate through the possible dates.
You can do it with wget or curl and just check whether you get an HTTP 200.
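A minimal sketch of that approach in bash, assuming basic-auth EPF credentials in EPF_USER/EPF_PASS; the base URL, directory pattern and file name below are placeholders you would adjust for your feed:

# Walk back day by day until a date answers with HTTP 200, then download it.
BASE="https://feeds.itunes.apple.com/feeds/epf/v3/full"   # placeholder base URL
for i in $(seq 0 30); do
  d=$(date -d "-$i days" +%Y%m%d)                         # GNU date syntax
  url="$BASE/itunes$d/incremental/itunes$d/video.tbz"     # guessed directory pattern
  status=$(curl -s -o /dev/null -w "%{http_code}" -u "$EPF_USER:$EPF_PASS" "$url")
  if [ "$status" = "200" ]; then
    echo "Most recent incremental feed: $url"
    curl -u "$EPF_USER:$EPF_PASS" -O "$url"
    break
  fi
done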

Related

How can I list all uploads for a project?

I would like to access the list of all uploads that have been added to a given project on my company's GitLab server.
I don't mean versioned files; I mean attached files: binaries and other types of files that have been attached to issues, merge requests, etc.
It's OK if I have to use the API for that.
What I've tried
My first approach was through GET /projects/:id/repository/files/:file_path, but that's for the versioned files.
Then, I found out about POST /projects/:id/uploads, but that's only for uploading and not for listing already uploaded files.
Is there a way to list all those uploaded files?
I believe this is not possible.
There is an open issue for retrieving specific files, which has not received much attention:
https://gitlab.com/gitlab-org/gitlab-ce/issues/55520
Hopefully, there will eventually be an endpoint
GET /projects/:id/uploads
I had the same question, and after getting in touch with GitLab support they confirmed that this is not currently implemented (as of November 2021) and forwarded me the following three feature requests:
API list all files on a project: https://gitlab.com/gitlab-org/gitlab/-/issues/197361
Attachment Manager: https://gitlab.com/gitlab-org/gitlab/-/issues/16229
Retrieve uploaded files using API: https://gitlab.com/gitlab-org/gitlab/-/issues/25838
A workaround is to export the whole project; the uploads are included in the export archive, where you can list them.
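A rough sketch of that workaround using the project export API (v4). The host, token and project ID below are placeholders, and the exact path of the uploads inside the archive may vary between GitLab versions:

GITLAB="https://gitlab.example.com/api/v4"
TOKEN="YOUR_PRIVATE_TOKEN"
ID=123

# schedule the export, wait until it is finished, then download the archive
curl -s -X POST -H "PRIVATE-TOKEN: $TOKEN" "$GITLAB/projects/$ID/export"
until curl -s -H "PRIVATE-TOKEN: $TOKEN" "$GITLAB/projects/$ID/export" | grep -q '"export_status":"finished"'; do
  sleep 10
done
curl -s -H "PRIVATE-TOKEN: $TOKEN" -o export.tar.gz "$GITLAB/projects/$ID/export/download"

# the attachments sit in an uploads/ directory inside the archive (path may differ)
tar -tzf export.tar.gz | grep 'uploads/'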

Can I transfer images between shopify sites?

I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site, and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally I'd keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text areas of pages/blogs, so it's not going to be practical to do this manually; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
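For example, a minimal sketch that pulls every CDN link out of the exported CSV and downloads each file. It assumes the images are hosted on cdn.shopify.com and that the export is named products_export.csv; adjust both if your store differs:

grep -oE 'https://cdn\.shopify\.com/[^",]+' products_export.csv \
  | sort -u \
  | while read -r url; do
      # strip any ?v=... query string so the saved filename stays clean
      curl -s -o "$(basename "${url%%\?*}")" "$url"
    done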
Have you tried transferring between the two sites using FTP? If you have SSH access:
Log in to the server via SSH.
Change to the directory containing the file (or to your desired destination).
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details.
Use cd to move to the source / destination directory on the remote side.
Issue the binary command so files are transferred in binary mode.
Issue hash if you want a progress indicator.
If sending a file from the server you SSHed into, issue the put <filename> command; if you want to pull a file from the other server to the one you are logged into, use get <filename> instead.
Wait for the transfer to complete; it might take a while.
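As an illustration, a sample session under those assumptions (the hostnames, path and filename are placeholders, and the # comments are annotations for the reader, not commands to type; everything after the ftp line runs inside the FTP client):

ssh user@source-site.example.com
cd /path/to/images                 # directory holding the files to send
ftp destination-site.example.com   # log in with your FTP credentials when prompted
binary                             # binary mode so images are not corrupted
hash                               # print '#' marks as a progress indicator
put image1.jpg                     # send a file (use get to pull one instead)
bye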

How to find the GitLab project size by using API?

I am trying to find the size of a project by using the GitLab API. I got some idea about this from the GitLab documentation, but it seems to give only the file size for a particular branch. Also, I tried the API below, but I got the following error in my browser.
{"message":"400 (Bad request) \"file_path\" not given"}
I do not know how to use the API below to get the project size; using this same API gives the above error.
https://gitlab.company.com/api/v3/projects/<project_ID>/repository/files?private_token=GMecwr8umMN4wx5L
You got this error because this endpoint has two required parameters:
file_path (required) - Full path to new file. Ex. lib/class.rb
ref (required) - The name of branch, tag or commit
In any case, getting a file count with this endpoint is impossible, because it is for
CRUD for repository files
Create, read, update and delete repository files using this API
So you can only perform CRUD operations on a single specified file.
Listing files may be done with https://docs.gitlab.com/ee/api/repositories.html#list-repository-tree
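For example (API v4 syntax, whereas the URL in the question uses the older v3 path; the host, token and project ID are placeholders):

curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.company.com/api/v4/projects/<project_ID>/repository/tree?recursive=true&per_page=100"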

Make Indexed File Downloadable In Apache Solr

I am trying to index a PDF file into Solr, which I have done successfully using this command:
curl "http://localhost:8983/solr/update/extract?literal.id=id&commit=true" -F "myfile=@filename.pdf"
I am able to see the file contents and search, but when I try to click on the file name it shows
HTTP ERROR 404
Problem accessing /solr/collection1/id. Reason:
not found
What I want is a link that allows downloading the file; I know Solr merely indexes the file's contents and stores them. I was wondering if there is a way I can add an attribute such as location, like you have done, and proceed from there. Can you please share what you have done? If you need any more clarity regarding my problem, do ask.
We have the actual files hosted through a separate web application, which they are downloaded from with auditing and additional security.
You can always host these files directly through an HTTP server.
If the file names match the id, it is as easy as appending id.extension to the fixed hosted URL.
Otherwise, index the path of the file with an additional parameter, e.g. literal.url.
The url will then be a Solr field that is available in the Solr response.
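A sketch of that literal.url approach, assuming a url field exists (or is covered by a dynamic field) in your schema; the hosting address below is a placeholder:

curl "http://localhost:8983/solr/update/extract?literal.id=id&literal.url=http://files.example.com/filename.pdf&commit=true" \
  -F "myfile=@filename.pdf"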

Get metadata of a file present on Dropbox before downloading on iPhone

I am using the Dropbox API to display the list of files and folders present on it. The user can download a file and view it.
I am downloading the file every time, but now I want to download the file only if it has been modified.
I can get the date when the file was updated from its metadata LastModifiedDate.
I display all the files in a table view, and while displaying them I get the metadata of each file in the method mentioned below. But if any file changes after the list has been displayed, I don't get the latest file; I only get the latest LastModifiedDate after the file has been downloaded.
The following method gets called when the file download is complete:
- restClient:loadedMetadata:
Is there any way I can get it before the download starts? Is there any method in the Dropbox API which gives the metadata of a file before downloading it?
If anybody wants further explanation, please let me know.
Thanks :)
Caching policies should be your friend, as long as the server is configured to support ETags.
wiki
short guide
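As a generic illustration of the ETag idea (plain HTTP with curl rather than the Dropbox SDK; the URL and filenames are placeholders), a conditional request downloads the file only when the server reports that it has changed:

url="https://example.com/path/file.pdf"
etag_file="file.pdf.etag"

# send the stored ETag; a 304 reply means our local copy is still current
status=$(curl -s -D headers.txt -o file.pdf.new -w "%{http_code}" \
  -H "If-None-Match: $(cat "$etag_file" 2>/dev/null || echo '"none"')" "$url")

if [ "$status" = "200" ]; then
  mv file.pdf.new file.pdf
  # remember the new ETag for the next check
  awk 'tolower($1)=="etag:" {print $2}' headers.txt | tr -d '\r' > "$etag_file"
else
  rm -f file.pdf.new            # 304 Not Modified: keep the existing file
fi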