Display list of files and folders using the Mediafire API

I tried to use the Mediafire API, but when I call folder/get_info, it doesn't return the file & folder array like in the example.
Full URL I used: http://www.mediafire.com/api/folder/get_info.php?folder_key=l461cm2d8hfxd
What's wrong with my attempt? Thank you.

You can try the MediaFire API PHP Library. This class currently implements all the functions in the Mediafire API.

OK, I just took a look at their API documentation. They've updated the get_info function for folders and taken out the file tree.
So if you are uploading via the dropbox (which doesn't return the quickkey associated with the file), you can never upload through the dropbox and then use the API to find the file and download it. This renders their API essentially useless.
The point of a dropbox is to allow remote uploads to a folder; since you know the folder key, you can query the API, get back the quickkeys of the documents in that folder, and then manage those files remotely: move, delete, download, etc. Now you cannot do this.

Despite the get_info functionality not working, folder search can resolve at least some of the issues with retrieving quickkeys. In my case I searched for ".mp3" and was handed all the MP3s in my folder.
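For example, a quick sketch of that search call in Python; the folder/search.php endpoint and its parameter names are assumptions patterned on the get_info URL above, so check the current MediaFire docs before relying on them:

```python
import requests

# Hypothetical sketch: search a folder for ".mp3" to recover quickkeys, since
# folder/get_info no longer returns the file tree. Endpoint and parameter
# names are assumed from the get_info URL pattern in the question.
FOLDER_KEY = "l461cm2d8hfxd"  # folder key from the question

resp = requests.get(
    "http://www.mediafire.com/api/folder/search.php",
    params={
        "folder_key": FOLDER_KEY,
        "search_text": ".mp3",
        "response_format": "json",  # assumed; the default is XML
    },
)
resp.raise_for_status()

# The response structure is also an assumption -- inspect the JSON to confirm.
for item in resp.json().get("response", {}).get("results", []):
    print(item.get("quickkey"), item.get("filename"))
```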

Related

How can I list all uploads for a project?

I would like to access the list of all uploads that have been added to a given project on my company GitLab server.
I don't mean versioned files; I mean attached files: binaries and other types of files that have been attached to issues, merge requests, etc.
It's OK if I have to use the API for that.
What I've tried
My first approach was through GET /projects/:id/repository/files/:file_path, but that's for versioned files.
Then I found out about POST /projects/:id/uploads, but that's only for uploading, not for listing already uploaded files.
Is there a way to list all those uploaded files?
I believe this is not possible.
There is an open issue for retrieving specific files which has not received much attention:
https://gitlab.com/gitlab-org/gitlab-ce/issues/55520
Hopefully, there will eventually be an endpoint
GET /projects/:id/uploads
I had the same question, and after getting in touch with GitLab support they confirmed that this is not currently implemented (as of November 2021). They forwarded me the following three feature requests:
API list all files on a project: https://gitlab.com/gitlab-org/gitlab/-/issues/197361
Attachment Manager: https://gitlab.com/gitlab-org/gitlab/-/issues/16229
Retrieve uploaded files using API: https://gitlab.com/gitlab-org/gitlab/-/issues/25838
A workaround seems to be to export the whole project; you'll find the uploads in that archive and be able to list them.
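As a rough sketch of that export workaround in Python (the endpoints are GitLab's project export API; the exact location of the uploads inside the archive is an assumption, so inspect the tarball):

```python
import time
import tarfile
import requests

GITLAB = "https://gitlab.example.com"   # your GitLab server (placeholder)
PROJECT_ID = 123                        # placeholder project id
HEADERS = {"PRIVATE-TOKEN": "<your-token>"}

# 1. Schedule the export.
requests.post(f"{GITLAB}/api/v4/projects/{PROJECT_ID}/export", headers=HEADERS).raise_for_status()

# 2. Poll until the export is finished.
while True:
    status = requests.get(f"{GITLAB}/api/v4/projects/{PROJECT_ID}/export", headers=HEADERS).json()
    if status.get("export_status") == "finished":
        break
    time.sleep(5)

# 3. Download the archive.
archive = requests.get(f"{GITLAB}/api/v4/projects/{PROJECT_ID}/export/download", headers=HEADERS)
with open("project_export.tar.gz", "wb") as f:
    f.write(archive.content)

# 4. List entries that look like uploads (path inside the archive is an assumption).
with tarfile.open("project_export.tar.gz", "r:gz") as tar:
    for member in tar.getnames():
        if member.startswith("uploads/") or "/uploads/" in member:
            print(member)
```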

Get public URL from file path in Dropbox

Suppose I have a Dropbox account with a shared folder in it; for instance, the shared folder is named "SampleFolder". Inside that folder I have a folder and file hierarchy, which is also shared by virtue of being inside a shared folder. Given the SampleFolder URL and the path of a file I want to download, how can I easily get that file's URL, either via the Dropbox Core API or by knowing how the URLs are constructed so I can build the URL by hand? For instance, I want to download the file at path SampleFolder/Folder1/Folder2/image.png; how can I get the URL of that file knowing only the URL of SampleFolder? Note that I don't want to log in to Dropbox; there is a GET method for retrieving a file by its path, but it requires authorisation. Basically, I want a public place to store files and to download them in my code by their URLs.
Thanks for the answers.
[Addition Aug2017: This method has been disabled by Dropbox for all users. See https://www.dropbox.com/en/help/files-folders/public-folder]
See https://www.dropbox.com/help/16/en towards the bottom under "Creating a Public folder"
While newer accounts do not have the Public Folder enabled, it is possible to enable it by going to this link when logged into that account: https://www.dropbox.com/enable_public_folder
Then you can follow the path to the file after https://dl.dropbox.com/u/<user id>/
This used to be possible with Dropbox, at least in the spring of 2012, but it appears no longer to work. Before, if you had a shared folder, you could browse the subcontents of that folder relative to the shared URL, but now, all the subcontents have distinct absolute URLs.
Breaking basic UNIX file path conventions like this is a huge loss in functionality, in my opinion.
Set your link permission to "anyone with a link".
Try the following link to access it as a public static file:
https://dl.dropboxusercontent.com/s/<docId>/yourfile.ext
The docId can be found by clicking the Share button of the doc in the Dropbox UI; it is the alphanumeric string found after /s/.
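As a small sketch of that URL construction in Python (the share link here is made up, and per the other answers this direct-link behaviour may no longer work):

```python
from urllib.parse import urlparse

def direct_link(shared_url: str) -> str:
    """Turn a dropbox.com/s/<docId>/<file> share link into a direct
    dl.dropboxusercontent.com link (historical behaviour; may no longer work)."""
    path = urlparse(shared_url).path  # e.g. /s/abc123xyz/yourfile.ext
    return "https://dl.dropboxusercontent.com" + path

print(direct_link("https://www.dropbox.com/s/abc123xyz/yourfile.ext"))
# -> https://dl.dropboxusercontent.com/s/abc123xyz/yourfile.ext
```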
Unfortunately, no, this isn't currently possible in an official or supported way. These shared links don't offer any metadata or API for access like this.
For free users, the Public Folder method for relative paths won't work anymore. See here.
As of March 15, 2017 the Public folder in your Dropbox account has been converted into a standard folder. By default this folder is private to your account. This transition will occur automatically.

red5 with S3 (I want to customize the path for streaming videos)

I am using red5 for streaming videos in my project, and I am able to play videos from the local system that are saved in the default folder "streams".
Now I want to customize the path and get the videos from S3. How do I configure red5 to work with S3? Is this a good practice?
I've got code using IStreamFilenameGenerator that works with S3; I'll warn you now that it may not work with the latest jets3 library, but you'll get the point of how it works by looking through the source. One problem / issue that you must understand when using S3 is that you cannot "record" to the bucket on the fly; your FLV files can only be transferred to S3 once the file is finalized (there is an example upload call in the Application.class). "Play" from S3, on the other hand, will work as expected.
I added the S3 code to the red5-examples repo: https://github.com/Red5/red5-examples
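To illustrate that constraint in isolation (this is not the red5 Java code from the repo, just a Python/boto3 sketch of pushing a finalized recording to a hypothetical bucket):

```python
import boto3

# After red5 has finalized the recorded FLV on local disk, push it to S3.
# Bucket name and paths here are hypothetical.
s3 = boto3.client("s3")
s3.upload_file(
    Filename="/opt/red5/webapps/myapp/streams/recording1.flv",
    Bucket="my-red5-streams",
    Key="streams/recording1.flv",
)

# Playback can then be resolved from S3 (e.g. via IStreamFilenameGenerator
# pointing at the bucket), while recording always happens locally first.
```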
Search for:
https://stackoverflow.com/search?q=IStreamFilenameGenerator
Or https://www.google.com.au/search?q=IStreamFilenameGenerator+example&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:de:official&client=firefox-a
and you will find some examples of how to modify the path(s).
Alternatively, you could of course simply mount some drive into the streams folder, or I guess a symbolic link would even work. But that is not as flexible as doing it with IStreamFilenameGenerator, where you can generate exactly the path string you want.
Sebastian

Upload entire directory via FTP in VB.NET

I have successfully been able to upload single files to an FTP with this:
My.Computer.Network.UploadFile
Is there a way to upload an entire directory, such as this:
My.Computer.Network.UploadFolder
I've never worked with FTP in VB.NET, but it seems there is no direct way to do it.
Here is what people say about how it can be done (a sketch follows below):
Traverse the local directory.
Create the same directory structure on the FTP server.
Upload the files in each local directory.
Also see here for the same question answered in C#. It confirms there is no built-in way; you would have to write some code or use third-party libraries.
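Here is a rough sketch of those three steps; it uses Python's ftplib rather than VB.NET purely to illustrate the traversal/mkdir/upload pattern, and the host, credentials and paths are placeholders:

```python
import os
from ftplib import FTP, error_perm

def upload_folder(ftp: FTP, local_dir: str, remote_dir: str) -> None:
    """Recursively mirror local_dir onto the FTP server under remote_dir."""
    try:
        ftp.mkd(remote_dir)
    except error_perm:
        pass  # directory probably exists already
    for name in os.listdir(local_dir):
        local_path = os.path.join(local_dir, name)
        remote_path = f"{remote_dir}/{name}"
        if os.path.isdir(local_path):
            upload_folder(ftp, local_path, remote_path)   # recurse into subfolder
        else:
            with open(local_path, "rb") as f:
                ftp.storbinary(f"STOR {remote_path}", f)  # upload a single file

ftp = FTP("ftp.example.com")          # placeholder host
ftp.login("user", "password")         # placeholder credentials
upload_folder(ftp, "C:/local/folder", "/remote/folder")
ftp.quit()
```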

Directory Listing in S3 Static Website

I have set up an S3 bucket to host static files.
When using the website endpoint (http://<bucket>.s3-website-us-east-1.amazonaws.com/), it forces me to set an index file, and when the file isn't found it throws an error instead of listing the directory contents.
When using the S3 endpoint (<bucket>.s3.amazonaws.com), I get an XML listing of the files, but I need an HTML listing so users can click a link to each file.
I have tried setting the permissions of all files and the bucket itself to "List" for "Everyone" in the AWS Console, but still no luck.
I have also tried some of the JavaScript alternatives, but they either don't work under the website URL (which redirects to the index file) or just don't work at all. As a last resort, a collapsible JavaScript listing would be better than nothing, but I haven't found a good one.
Is this possible? If so, do I need to change permissions, ACL or something else?
I've created a simple bit of JS that creates a directory index in HTML style that you are looking for: https://github.com/rgrp/s3-bucket-listing
The README has specific instructions for handling Amazon S3 "website" buckets: https://github.com/rgrp/s3-bucket-listing#website-buckets
You can see a live example of the script in action on this s3 bucket (in website mode): http://data.openspending.org/
There is also this solution: https://github.com/caussourd/aws-s3-bucket-listing
It is similar to https://github.com/rgrp/s3-bucket-listing, but I couldn't make that one work with Internet Explorer. https://github.com/caussourd/aws-s3-bucket-listing works with IE and also adds the possibility to order the files by name, size and date. On the downside, it doesn't follow folders: only the files at one level are displayed.
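The core idea behind these listing scripts is to fetch the bucket's XML listing (the same one the question mentions getting from the S3 endpoint) and turn the keys into HTML links. A minimal offline sketch in Python; the bucket URL is a placeholder, the bucket must allow public listing, and pagination beyond 1,000 keys is ignored:

```python
import urllib.request
import xml.etree.ElementTree as ET

BUCKET_URL = "http://your-bucket.s3.amazonaws.com"  # placeholder bucket endpoint

# Fetch the XML listing that S3 returns for a publicly listable bucket.
xml_data = urllib.request.urlopen(f"{BUCKET_URL}/").read()
ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
root = ET.fromstring(xml_data)

# Turn every object key into a clickable link.
links = [
    f'<li><a href="{BUCKET_URL}/{key.text}">{key.text}</a></li>'
    for key in root.findall(".//s3:Contents/s3:Key", ns)
]
html = "<html><body><ul>\n" + "\n".join(links) + "\n</ul></body></html>"

# Write the result; uploading it as index.html lets the website endpoint serve it.
with open("index.html", "w") as f:
    f.write(html)
```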
This might solve your problem. Security settings for Everyone group:
(you need the bucketexplorer.com software for this)
If you are sharing files over HTTP, you may or may not want people to be able to list the contents of a bucket (folder). If you want the bucket contents to be listed when someone enters the bucket name (http://s3.amazonaws.com/bucket_name/), then edit the Access Control List and give the Everyone group the access level of Read (and do likewise with the contents of the bucket). If you don't want the bucket contents listable but do want to share the files within it, disable Read access for the Everyone group on the bucket itself, and then enable Read access on the individual files within the bucket.
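The same Everyone/Read permissions can also be set programmatically instead of through the BucketExplorer UI; a sketch with boto3, with placeholder bucket and key names:

```python
import boto3

s3 = boto3.client("s3")

# Allow everyone to list the bucket contents (GET on the bucket URL).
s3.put_bucket_acl(Bucket="my-bucket", ACL="public-read")

# Allow everyone to read an individual object without listing the bucket.
s3.put_object_acl(Bucket="my-bucket", Key="docs/report.pdf", ACL="public-read")

# Note: newer AWS accounts may also need the bucket's Block Public Access
# settings relaxed before these ACLs take effect.
```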
I created a much simpler solution. Just place the index.html file in the root of your folder and it will do the job. No configuration required. https://github.com/prabhatsharma/s3-directorylisting
I had a similar problem and created a JavaScript-and-iframe solution that works pretty well for listing directories in S3 website files. You just have to drop a couple of .html files into the directory you want to list. You can find it here:
https://github.com/adam-p/s3-file-list-page
I found s3browser, which allowed me to set up a directory on the main web site that allowed browsing of the s3 bucket. It worked very well and was very easy to set up.
Another approach uses pure JavaScript and the AWS SDK for JavaScript. It needs no PHP or other engine, just a plain web site (Apache or even IIS).
https://github.com/juvs/s3-bucket-browser
It is not intended to be deployed on your own bucket (for me, that makes no sense).
Using the new IAM users from AWS, you can provide more specific and secure access to your buckets. There is no need to publish your bucket as a website and make everything public.
If you want to secure access, you can use conventional methods to authenticate users for your existing web site.
Hope this helps too!