NextCloud: Append only?

I know how to make an upload dir and how to make a read-only dir. Can I combine those?
I want people to be able to upload but not change existing files uploaded by others. They should, however, be able to see the files uploaded by others. The uploaders will only use this once and will therefore not have an account on my NextCloud installation.

No, right now you can't do this in Nextcloud. A workaround would be to create a file-drop link AND a read-only link, so people can see the existing files through the read-only link and upload through the file-drop link ;-)
You can also create multiple upload links by entering an email address; the recipient gets a link by mail which you can configure separately. So, for example, you create a read-only public link and share that with people. Then you send a file-drop link via mail to everyone you want to give upload rights ;-)
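If you would rather script this than click through the web UI, both links can in principle be created through Nextcloud's OCS sharing API. A minimal Python sketch, assuming the server URL, credentials and folder path shown are placeholders and that the permission values (1 = read-only, 4 = upload-only / file drop) match your Nextcloud version:

```python
import requests

NEXTCLOUD = "https://cloud.example.com"   # placeholder server URL
AUTH = ("admin", "app-password")          # placeholder credentials
HEADERS = {"OCS-APIRequest": "true"}

def create_public_link(path, permissions):
    """Create a public link share (shareType=3) for `path` with the given permissions."""
    resp = requests.post(
        f"{NEXTCLOUD}/ocs/v2.php/apps/files_sharing/api/v1/shares",
        auth=AUTH,
        headers=HEADERS,
        data={"path": path, "shareType": 3, "permissions": permissions},
        params={"format": "json"},
    )
    resp.raise_for_status()
    return resp.json()["ocs"]["data"]["url"]

# Read-only link (permissions=1) for viewing, file-drop link (permissions=4) for uploading.
read_link = create_public_link("/Incoming", 1)
drop_link = create_public_link("/Incoming", 4)
print("view:", read_link)
print("upload:", drop_link)
```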

Related

Can I transfer images between shopify sites?

I'm doing some work for a client who has an existing Shopify website. They want to make some big changes to the site, so I have set up a new development site in Shopify, exported all of the products/pages/blog posts to it, and am now working on getting all the new functionality/design working on the dev site.
Once the new build is finished, though, I want to transfer everything back over to their current site. Products/pages/blog posts will be fine (I've written a custom export/import tool using their API), but what about images?
I am uploading lots of images to the dev site and I am worried they will be deleted when development is finished and I shut down the dev site. Is it possible to transfer images from one site to another?
Ideally I'd keep the same URLs on Shopify's CDN when doing so, although if I have to change the URLs, I can probably do an automated replace on the CSV files that will get uploaded.
There are going to be hundreds of images involved, and they will be used in various places throughout the site, including in the rich text areas of pages/blogs, so it's not practical to do this manually in any way; it must be something I can automate.
Thanks for any help.
When you export products as a CSV, you get links to your images. You could write a script to download each of the images in the CSV. Just redirect the output of curl to save the image.
curl link_url > imagename
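A hedged Python sketch of the same idea, reading the exported products CSV and saving every referenced image locally; the column name "Image Src" is what Shopify's product export CSV normally uses, but check it against your own export:

```python
import csv
import os
import urllib.request

# Download every image referenced in the Shopify product export CSV.
# "Image Src" is the usual column name; adjust if your export differs.
with open("products_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = (row.get("Image Src") or "").strip()
        if not url:
            continue
        filename = os.path.basename(url.split("?")[0])
        if not os.path.exists(filename):
            urllib.request.urlretrieve(url, filename)
            print("saved", filename)
```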
Have you tried transferring between the two servers using FTP? If you have SSH access:
Log in to the server via SSH.
Change to the directory containing the file (or your desired location).
FTP into the other server using ftp <name_or_IP_address_of_other_server> and your login details.
Use cd to move to the desired destination.
Use the binary command.
Use hash if you want a progress indicator.
If sending the file from the server you SSHed into, issue the put <filename> command; if you want to pull the file from the other server to the one you are logged into, use get <filename> instead.
Wait for the transfer to complete; it might take a while. A scripted version of the same steps is sketched below.
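If you would rather script the transfer than drive the interactive ftp client, the same put/get steps look roughly like this with Python's ftplib (host, credentials and paths are placeholders):

```python
from ftplib import FTP

# Connect to the other server (placeholder host and login details).
ftp = FTP("other.server.example.com")
ftp.login(user="username", passwd="password")

# Equivalent of `cd` to the desired destination on the remote side.
ftp.cwd("/path/to/destination")

# storbinary/retrbinary always transfer in binary mode, matching the `binary` command.
with open("image.png", "rb") as f:
    ftp.storbinary("STOR image.png", f)          # like `put image.png`

with open("downloaded.png", "wb") as f:
    ftp.retrbinary("RETR remote.png", f.write)   # like `get remote.png`

ftp.quit()
```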

Upload file with tags through OwnCloud or NextCloud API

I have a database of files that are already tagged. Now I would like to upload these files to an OwnCloud or NextCloud server and pass along my existing tags so that they show up as tags in the respective system. I wasn't yet able to find a way to do that in the documentation; does anyone have an idea how I could do it?
Thanks!
I just made available the source code of the (remote) file tagging micro-service for Nextcloud on GitHub (https://github.com/julianthome/taggy). The implementation consists of two parts: 1) the taggy client, for uploading files to the Nextcloud server and for invoking the taggy server; 2) the taggy server, for adding the specified tags to uploaded files.
I will polish the code further over the next few days. I am also planning to add SSL support, which is important because the username and password are currently transmitted unencrypted to the taggy server. The server uses these credentials to check whether the user can be properly authenticated before tagging any files.
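For reference, here is roughly what a do-it-yourself version of that flow can look like against Nextcloud's WebDAV endpoints (file upload, then the systemtags and systemtags-relations APIs). This is only a sketch based on my reading of those endpoints; the server URL and credentials are placeholders, and details such as where the created tag's id is returned should be verified against your server version:

```python
import requests

NEXTCLOUD = "https://cloud.example.com"   # placeholder
AUTH = ("alice", "app-password")          # placeholder

def upload(local_path, remote_path):
    """PUT the file via WebDAV."""
    with open(local_path, "rb") as f:
        r = requests.put(f"{NEXTCLOUD}/remote.php/dav/files/alice/{remote_path}",
                         data=f, auth=AUTH)
    r.raise_for_status()

def file_id(remote_path):
    """PROPFIND the oc:fileid of the uploaded file."""
    body = ('<?xml version="1.0"?>'
            '<d:propfind xmlns:d="DAV:" xmlns:oc="http://owncloud.org/ns">'
            '<d:prop><oc:fileid/></d:prop></d:propfind>')
    r = requests.request("PROPFIND",
                         f"{NEXTCLOUD}/remote.php/dav/files/alice/{remote_path}",
                         data=body, auth=AUTH)
    r.raise_for_status()
    # crude extraction; a real client should parse the XML properly
    return r.text.split("<oc:fileid>")[1].split("</oc:fileid>")[0]

def create_tag(name):
    """Create a system tag; the new tag's URL is usually returned in Content-Location."""
    r = requests.post(f"{NEXTCLOUD}/remote.php/dav/systemtags",
                      json={"name": name, "userVisible": True, "userAssignable": True},
                      auth=AUTH)
    r.raise_for_status()
    return r.headers["Content-Location"].rstrip("/").split("/")[-1]

def assign_tag(fid, tag_id):
    """Link the tag to the file."""
    r = requests.put(f"{NEXTCLOUD}/remote.php/dav/systemtags-relations/files/{fid}/{tag_id}",
                     auth=AUTH)
    r.raise_for_status()

upload("report.pdf", "report.pdf")
assign_tag(file_id("report.pdf"), create_tag("invoices"))
```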
Please let me know if you have other ideas, suggestions or feedback ;)
Kind regards

Get public url from file path in dropbox

Suppose I have a Dropbox account with a shared folder there; for instance, the name of the shared folder is "SampleFolder". Inside that folder I have a folder and file hierarchy, which is also shared by virtue of being inside a shared folder. Knowing the URL of SampleFolder and the path of the file I want to download, how can I easily get that file's URL, either through the Dropbox Core API or simply by knowing how the URLs are constructed and building the URL by hand? For instance, I want to download the file at the path SampleFolder/Folder1/Folder2/image.png; how can I get the URL of that file knowing only the URL of SampleFolder? Note that I don't want to log in to Dropbox; there is a GET method for retrieving a file by its path, but it requires authorisation. Basically, I want a public place to store files and to download them in my code by their URLs.
Thanks for the answers.
[Addition Aug2017: This method has been disabled by Dropbox for all users. See https://www.dropbox.com/en/help/files-folders/public-folder]
See https://www.dropbox.com/help/16/en towards the bottom under "Creating a Public folder"
While newer accounts do not have the Public Folder enabled, it is possible to enable it by going to this link when logged into that account: https://www.dropbox.com/enable_public_folder
Then you can follow the path to the file after https://dl.dropbox.com/u/<user id>/
This used to be possible with Dropbox, at least in the spring of 2012, but it appears no longer to work. Before, if you had a shared folder, you could browse the subcontents of that folder relative to the shared URL, but now, all the subcontents have distinct absolute URLs.
Breaking basic UNIX file path conventions like this is a huge loss in functionality, in my opinion.
Set your link permission to "anyone with the link".
Try the following link to access it as a public static file:
https://dl.dropboxusercontent.com/s/<docId>/yourfile.ext
The docId can be found by clicking the Share button of the doc in the Dropbox UI.
It is the alphanumeric string found after /s/.
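If that link form works for your file, fetching it from code is then just a plain HTTP GET; a minimal sketch with the docId and filename left as placeholders:

```python
import requests

# Placeholders: take the docId from the share link shown in the Dropbox UI.
doc_id = "<docId>"
url = f"https://dl.dropboxusercontent.com/s/{doc_id}/yourfile.ext"

resp = requests.get(url)
resp.raise_for_status()
with open("yourfile.ext", "wb") as f:
    f.write(resp.content)
```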
Unfortunately, no, this isn't currently possible in an official or supported way. These shared links don't offer any metadata or API for access like this.
For free users, the Public Folder method for relative paths no longer works. See here.
As of March 15, 2017 the Public folder in your Dropbox account has been converted into a standard folder. By default this folder is private to your account. This transition will occur automatically.

Display list files and folder using Mediafire API

I tried to use the MediaFire API, but when I use folder/get_info, it doesn't return the file & folder array like the example.
Full URL I used: http://www.mediafire.com/api/folder/get_info.php?folder_key=l461cm2d8hfxd
What's wrong with my attempt? Thank you.
You can try the MediaFire API PHP Library. This class currently implements all the functions in the Mediafire API.
OK, I just took a look at their API documentation. They've updated the get_info function for folders and have taken out the file tree.
So if you are uploading via the dropbox feature (which doesn't return the quickkey associated with the file), you can never do a dropbox upload and then use the API to find the file and download it. This renders their API as useless as tits on a boar hog.
The point of a dropbox is to allow remote uploads to a folder; you then know the folder key, so you can query the API and return the document quickkeys in the folder, which then allows you to manage those files remotely: move, delete, download, etc. Now you cannot do this.
Despite the functionality of get_info not working, folder search can resolve at least some issues with retrieving quickkeys. In my case I searched for ".mp3" and was handed all the mp3s in my folder.
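For illustration, a folder search call can look roughly like this. The endpoint, parameter names (folder_key, search_text, response_format) and the JSON shape are my assumptions from the public MediaFire API docs, so double-check them:

```python
import requests

# Hedged sketch: endpoint, parameters and response structure are assumptions
# based on the public MediaFire API documentation.
resp = requests.get(
    "https://www.mediafire.com/api/folder/search.php",
    params={
        "folder_key": "l461cm2d8hfxd",   # folder key from the question
        "search_text": ".mp3",
        "response_format": "json",
    },
)
resp.raise_for_status()
data = resp.json().get("response", {})
for item in data.get("results", []):
    print(item.get("name"), item.get("quickkey"))
```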

Directory Listing in S3 Static Website

I have set up an S3 bucket to host static files.
When using the website endpoint (http://<bucket-name>.s3-website-us-east-1.amazonaws.com/), it forces me to set an index file. When the file isn't found, it throws an error instead of listing the directory contents.
When using the S3 endpoint (<bucket-name>.s3.amazonaws.com), I get an XML listing of the files, but I need an HTML listing so users can click a link to the file.
I have tried setting the permissions of all files and the bucket itself to "List" for "Everyone" in the AWS Console, but still no luck.
I have also tried some of the javascript alternatives, but they either don't work under the website url (that redirects to the index file) or just don't work at all. As a last resort, a collapsible javascript listing would be better than nothing, but I haven't found a good one.
Is this possible? If so, do I need to change permissions, ACL or something else?
I've created a simple bit of JS that creates a directory index in HTML style that you are looking for: https://github.com/rgrp/s3-bucket-listing
The README has specific instructions for handling Amazon S3 "website" buckets: https://github.com/rgrp/s3-bucket-listing#website-buckets
You can see a live example of the script in action on this s3 bucket (in website mode): http://data.openspending.org/
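If you would rather pre-generate a plain HTML index offline instead of building it client-side, here is a rough boto3 sketch of the same idea (the bucket name is a placeholder and your AWS credentials are assumed to be configured):

```python
import boto3

BUCKET = "my-static-site"   # placeholder bucket name
s3 = boto3.client("s3")

# Collect every object key in the bucket (handles pagination).
keys = []
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    keys.extend(obj["Key"] for obj in page.get("Contents", []))

# Build a very small HTML listing and upload it as the index document.
links = "\n".join(f'<li><a href="{k}">{k}</a></li>' for k in keys)
html = f"<html><body><h1>{BUCKET}</h1><ul>\n{links}\n</ul></body></html>"
s3.put_object(Bucket=BUCKET, Key="index.html",
              Body=html.encode("utf-8"), ContentType="text/html")
```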
There is also this solution: https://github.com/caussourd/aws-s3-bucket-listing
It is similar to https://github.com/rgrp/s3-bucket-listing, but I couldn't make that one work with Internet Explorer. https://github.com/caussourd/aws-s3-bucket-listing works with IE and also adds the possibility to order the files by name, size and date. On the downside, it doesn't follow folders: only the files at one level are displayed.
This might solve your problem. Security settings for Everyone group:
(you need the bucketexplorer.com software for this)
If you are sharing files over HTTP, you may or may not want people to be able to list the contents of a bucket (folder). If you want the bucket contents to be listed when someone enters the bucket name (http://s3.amazonaws.com/bucket_name/), then edit the Access Control List and give the Everyone group the access level of Read (and do likewise with the contents of the bucket). If you don't want the bucket contents listable but do want to share the files within it, disable Read access for the Everyone group on the bucket itself, and then enable Read access for the individual files within the bucket.
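For reference, that Read access for the Everyone group corresponds to the public-read canned ACL, which can also be applied without Bucket Explorer; a minimal boto3 sketch with the bucket and key names as placeholders:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "bucket_name"   # placeholder

# Make the bucket listing itself readable by everyone...
s3.put_bucket_acl(Bucket=BUCKET, ACL="public-read")

# ...or make an individual object readable while keeping the listing private.
s3.put_object_acl(Bucket=BUCKET, Key="shared/file.txt", ACL="public-read")
```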
I created a much simpler solution. Just place the index.html file in the root of your folder and it will do the job. No configuration required. https://github.com/prabhatsharma/s3-directorylisting
I had a similar problem and created a JavaScript-and-iframe solution that works pretty well for listing directories in S3 website files. You just have to drop a couple of .html files into the directory you want to list. You can find it here:
https://github.com/adam-p/s3-file-list-page
I found s3browser, which allowed me to set up a directory on the main web site that allowed browsing of the s3 bucket. It worked very well and was very easy to set up.
Here is another approach based on pure JavaScript and the AWS SDK for JavaScript. It doesn't need PHP or any other engine, just a pure static web site (Apache or even IIS).
https://github.com/juvs/s3-bucket-browser
It's not intended to be deployed on your own bucket (for me, that makes no sense).
Using the new IAM users from AWS, you can provide more specific and secure access to your buckets. There's no need to publish your bucket as a website and make everything public.
If you want to secure the access, you can use conventional methods to authenticate users for your current web site.
Hope this helps too!