What is a good AWS client?

The web-based AWS console seems so limited in what it can do. For example, to create a private streaming distribution, you have to create a CloudFront Origin Access Identity, create the private content distribution, and modify the ACL on the private objects, all through XML calls (WTF?). I really expect something so common to be integrated into their console.
Is there a client smart enough to let me do simple tasks in simple ways?
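For reference, the three steps just described look roughly like this in boto3, the Python AWS SDK (a sketch only; the bucket, key, and caller reference are placeholders, and the full DistributionConfig for step 2 is omitted):

import boto3

cf = boto3.client("cloudfront")
s3 = boto3.client("s3")

# Step 1: create the CloudFront Origin Access Identity (OAI).
oai = cf.create_cloud_front_origin_access_identity(
    CloudFrontOriginAccessIdentityConfig={
        "CallerReference": "private-stream-001",  # any unique string
        "Comment": "OAI for a private streaming distribution",
    }
)
canonical_user = oai["CloudFrontOriginAccessIdentity"]["S3CanonicalUserId"]

# Step 2: create the private distribution, referencing the OAI in its
# origin configuration. cf.create_distribution(DistributionConfig=...)
# takes a large dict; omitted here for brevity.

# Step 3: grant the OAI read access on each private object.
s3.put_object_acl(
    Bucket="my-private-bucket",  # placeholder
    Key="videos/movie.mp4",      # placeholder
    AccessControlPolicy={
        "Grants": [{
            "Grantee": {"Type": "CanonicalUser", "ID": canonical_user},
            "Permission": "READ",
        }],
        "Owner": {"ID": "BUCKET-OWNER-CANONICAL-ID"},  # placeholder
    },
)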

Some options are:
Ylastic - a web-based tool that automates many multi-step operations.
CloudBerry Explorer - a Windows-only client application.
Bucket Explorer - a cross-platform client application.
I'm not sure if these perform the tasks you need, but they are worth a look.

You can configure CloudFront private content with the CloudBerry freeware: http://blog.cloudberrylab.com/2010/03/how-to-configure-private-content-for.html

More generally, you should probably look at cloud management software. These tools are an additional layer on top of Amazon Web Services and leverage the AWS API to offer automation such as automated backups, auto-scaling, and failover by default.
Ylastic was mentioned by Geoff, but you can also try Scalr (disclaimer: I work there), RightScale, or enStratus.

I have felt this way too, but it seems the AWS command line tools are the only option I know of.

Basic answer: no! It is even hard to find a GUI that manages more than one AWS product, and no one will put in the effort to develop one as AWS keeps changing its API.
For me:
For a stable AWS service like S3, I use Cyberduck.
For the rest, I write my own programs; they are more customized to my needs, make mistakes harder to make, and help me get familiar with the API.


Have you found a way of persisting the Odoo core modules in v14 other than with a volume? And is it possible to deploy Odoo on Cloud Run?

I want to deploy Odoo as cheaply as possible. I tried Cloud SQL (15-30 €/month) + Cloud Run, but after a few minutes the Odoo interface shows me a white screen, with many logs in the console similar to this:
GET 404 1.04 KB 24 ms Chrome 91 https://bf-dev3-u7raxlu3nq-ew.a.run.app/web/content/290-f328144/1/website.assets_editor.css
My interpretation is that, as Cloud Run is stateless and the static web files seem to be stored in the core module, this information is lost after the container is killed. As I've spent one month looking for a solution, before trying yet another way of deploying I ask the community: have you found a way of persisting the Odoo core modules in v14 other than with a volume? And is it possible to deploy Odoo on Cloud Run?
Here are all the ideas I tried:
First, I thought these CSS files were stored in the werkzeug session, so I tried two addons that store this session somewhere other than the filestore: camptocamp odoo-cloud-platform-14.0/session-redis and misc-addons-13.0/base_session_store_psql. But the problem persisted.
Then I read that the static CSS and JS files generated in the web editor are stored in Odoo as attachments, and that the addon misc-addons-13.0/ir_attachment_s3 could store these files in S3. But although I configured this addon, the problem persisted.
Next, I found this link describing the need to regenerate assets so that they are stored in the db. But although I did that, the problem persisted.
Finally, I thought about deploying Odoo in other ways. Deploying directly on a VM seems the most minimalistic and standard, and so seems to have the best chance of working, although it would make GitOps difficult to implement. Containers could be deployed on the VM through Docker Compose, which would help with deploying updates. GKE Anthos seems to support GitOps too and seems to persist volumes, but its description says it is stateless. Finally, there is deploying on a K8s cluster, which uses containers and allows autoscaling versus the Docker Compose-on-a-VM approach, but it does seem more expensive and harder to set up. Regarding the cost, small worker node machines could keep it low during the night. Regarding the difficulty of deploying, GitOps is desired, so Argo or something similar should be added. Also, I heard GKE Autopilot has a good free tier and is easier to deploy.
Thanks in advance :)
Cloud Run isn't a good solution for this. Indeed, if the werkzeug session is persisted in memory, the same client isn't guaranteed to reach the same instance each time, and can thus lose the file even in the middle of a session.
The best solution is to use a VM with a sticky session configuration. You can use an old-school deployment on Compute Engine, or a cloud-native solution with GKE/K8s. It's more or less the same cost if you have only 1 cluster (the first one is free).
Just a correction about GKE Anthos: I think you mean Cloud Run on Anthos, and yes, it's like Cloud Run but uses Knative on GKE to manage the containers, and it's also serverless. But GKE itself can handle stateful deployments, as you need with Odoo.

Solution for storing custom files by clients in the cloud

We have multiple clients using our service.
Each client may create multiple projects.
Each client may upload multiple files to any of his projects.
Each file may have custom metadata associated with it.
Each client may "share" any of his projects with another client.
Each client may comment on any of his own or shared projects/files.
My question is about storing the files in the cloud. What would be the best solution? I thought about Amazon S3, but maybe there are better alternatives?
You can explore Box.com. They are an advanced cloud file management solution and support fine-grained permission management as you described above. Dropbox for Teams is another option; the permission model is not as extensive as Box's, but the sync client is very stable. In one of my recent projects I used box.com, mainly due to their fine-grained permission controls.
You can also build this on S3 (Dropbox, and I guess Box too, is built on S3 behind the scenes). To achieve all the functionality you mentioned, it is quite some programming work!
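To give a sense of the S3 building blocks involved, here is a minimal boto3 (Python) sketch of the upload-with-custom-metadata part; the bucket name and key layout are made-up conventions, and sharing and comments would still need your own database on top:

import boto3

s3 = boto3.client("s3")

# One possible layout: key = "<client_id>/<project_id>/<filename>".
with open("design.pdf", "rb") as f:
    s3.put_object(
        Bucket="my-projects-bucket",          # placeholder
        Key="client-42/project-7/design.pdf",
        Body=f,
        Metadata={                            # custom metadata, stored
            "uploaded-by": "client-42",       # as x-amz-meta-* headers
            "label": "first draft",
        },
    )

# The custom metadata comes back with head_object / get_object.
head = s3.head_object(Bucket="my-projects-bucket",
                      Key="client-42/project-7/design.pdf")
print(head["Metadata"])  # {'uploaded-by': 'client-42', 'label': 'first draft'}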

Can I create new S3 users and add IAM policies from the Linux command line?

Is there any good way of creating and managing S3 policies and users from the command line of my Raspberry Pi?
The AWS Universal Command Line Tools are newer and better supported. They rely on Python, so if you can get Python for Raspberry Pi, you should be set.
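Since those tools are Python-based, you can also script the same operations directly with boto3, the AWS Python SDK. A minimal sketch, assuming a placeholder user name, bucket, and inline policy:

import json
import boto3

iam = boto3.client("iam")

# Create the user and an access key pair for it.
iam.create_user(UserName="s3-uploader")           # placeholder name
key = iam.create_access_key(UserName="s3-uploader")["AccessKey"]
print("key id:", key["AccessKeyId"])
print("secret:", key["SecretAccessKey"])

# Attach an inline policy restricting the user to one bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-bucket",    # placeholder bucket
            "arn:aws:s3:::example-bucket/*",
        ],
    }],
}
iam.put_user_policy(
    UserName="s3-uploader",
    PolicyName="s3-bucket-access",
    PolicyDocument=json.dumps(policy),
)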
I have no experience using it myself, but I found a tool for interacting with Amazon IAM, the access control service for AWS, that might work for you:
IAM Command Line Toolkit (note: last updated September 2010)
There may be more usable stuff under the IAM Resources section.
If you are unfamiliar with IAM, the documentation is one place to start. Although, knowing the general style of AWS documentation, there may be better resources and tutorials to be found elsewhere.

online backup solution with api for desktop

I made a small backup application that simply creates an archive out of specified files and folders. Now I need an online service to back that up. Which service can I use that can be integrated into my app?
Options I considered:
Dropbox is ideal, but they have all but abandoned the desktop.
SkyDrive has no API.
I couldn't find any free, reliable backup service that uses FTP.
Anything else? It should provide 1-2 GB of free space and be reasonably reliable.
Thanks
My app is in C#, but it can be ported to any other language as well.
In your case, Amazon's S3 seems more fitting, but that's not free.
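If you do go the S3 route, the upload itself is only a few lines; here is a sketch in Python with boto3 (your app is in C#, but the AWS SDK for .NET exposes equivalent calls; the bucket and key are placeholders):

import boto3

s3 = boto3.client("s3")

# Upload the archive your app produced.
s3.upload_file(
    Filename="backup-2013-01-15.zip",
    Bucket="my-backup-bucket",
    Key="backups/backup-2013-01-15.zip",
)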
Depending on your target audience, you can create a local archive and have that picked up by your regular backup solution. You might try Wuala or SpiderOak. You can expand Wuala by adding your own space. SpiderOak is free up to 2 GB (more if you invite friends) and also provides a good alternative to Dropbox (if you want to see how to migrate from Dropbox to SpiderOak, see my blog post about that).
Try box.net, now known as box.com or simply Box.
Reference: http://developers.box.com/docs

Updating permissions on Amazon S3 files that were uploaded via JungleDisk

I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a Cloudfront distribution. i.e. I can access it via an http:// URL and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL then that URL now returns a 200.
I really, really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to have to go to a different tool (especially if I have to buy it) just to change the permissions, and this seems really slow anyway because such tools generally traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down which is a real pain.
I'm considering writing a manual tool to traverse my tree and set everything to 'read', but I'd rather not do this if it's a problem someone else has already solved.
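Should it come to writing that tool, here is a minimal sketch in Python with boto3, the AWS Python SDK (the bucket name is a placeholder):

import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Walk every object in the bucket (scope with Prefix=... if needed)
# and apply the canned 'public-read' ACL, which is what lets the
# CloudFront URL return a 200 instead of AccessDenied.
for page in paginator.paginate(Bucket="my-cdn-bucket"):
    for obj in page.get("Contents", []):
        s3.put_object_acl(Bucket="my-cdn-bucket",
                          Key=obj["Key"],
                          ACL="public-read")
        print("made readable:", obj["Key"])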
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows you can use the CloudBerry Explorer Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web-based tool, you can use S3fm, a free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a 3rd party web site.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.