How to implement Lucene.NET using Amazon S3 - lucene

I'm trying to implement Lucene in my app using Amazon S3 to store the indexes that I generate, but I can't find any code examples or a clear article. If anyone has experience with this, please share a guide or anything that can help me get started.

There's a similar question here.
Here's an interesting article about how the biggest Solr service provider, Lucid Imagination, proposes to deploy their Solr implementation on EC2.
And here's their Search-as-a-Service solution.
If you're not bound to S3, you can use a dedicated Solr cloud service called WebSolr.
Also, if you need a complete ALM/CI solution for your development project, there's a WebSolr module included in CloudBees.
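On the storage side, note that Lucene's Directory abstraction expects fast random access to index files, which S3 doesn't provide, so a common workaround is to build and search the index on local disk and use S3 only as the persistent copy. Here's a minimal sketch of that sync step, using Python's boto3 for illustration (the bucket name and local path are made up; the equivalent put/get operations also exist in the AWS SDK for .NET):

import os
import boto3  # AWS SDK for Python: pip install boto3

# Hypothetical names -- adjust to your environment.
BUCKET = "my-lucene-indexes"
LOCAL_INDEX_DIR = "/var/lucene/index"

s3 = boto3.client("s3")

def sync_index_to_s3():
    """Upload every file of a locally built Lucene index to S3."""
    for name in os.listdir(LOCAL_INDEX_DIR):
        path = os.path.join(LOCAL_INDEX_DIR, name)
        if os.path.isfile(path):
            s3.upload_file(path, BUCKET, "index/" + name)

def restore_index_from_s3():
    """Download the index files back to local disk before opening them."""
    os.makedirs(LOCAL_INDEX_DIR, exist_ok=True)
    for obj in s3.list_objects_v2(Bucket=BUCKET, Prefix="index/").get("Contents", []):
        name = obj["Key"].rsplit("/", 1)[-1]
        s3.download_file(BUCKET, obj["Key"], os.path.join(LOCAL_INDEX_DIR, name))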

Related

Have you found a way of persisting the odoo core modules in v14 different from a volume? And so, is it possible to deploy odoo in gcloud run?

I want to deploy odoo as cheaply as possible. I tried gcloud sql (15-30€/month) + cloud run, but after a few minutes the odoo interface shows me a white screen, with many logs in the console similar to this:
GET 404 1.04 KB24 ms Chrome 91 https://bf-dev3-u7raxlu3nq-ew.a.run.app/web/content/290-f328144/1/website.assets_editor.css
My interpretation is that, as cloud run is stateless and the static web files seem to be stored in the core module, this information is lost after the container is killed. As I've spent one month looking for a solution, before trying yet another way of deploying I ask the community: Have you found a way of persisting the odoo core modules in v14 different from a volume? And so, is it possible to deploy odoo in gcloud run?
Here are all the ideas that I tried:
First, I thought that these css files were stored in the werkzeug session, so I tried two addons that store this session somewhere other than the filestore: camptocamp odoo-cloud-platform-14.0/session-redis and misc-addons-13.0/base_session_store_psql. But the problem persisted.
Then I read that the static css and js files generated in the web editor are stored in odoo as attachments, and that the addon misc-addons-13.0/ir_attachment_s3 can store these files in s3. But although I configured this addon, the problem persisted.
Next, I found this link describing the need to regenerate assets so that they are stored in the db. But although I did that, the problem persisted.
Finally, I thought about deploying odoo in other ways. Deploying directly in a vm seems to be the most minimalistic and standard approach, and so seems to have the best chance of working, although it would make gitops difficult to implement; containers could still be deployed in the vm through docker compose, which would help with rolling out updates. Gke anthos seems to support gitops too and seems to persist volumes, but its description says it is stateless. Finally, there's the option of deploying in a k8s cluster, which uses containers and allows autoscaling, unlike the docker compose way in a vm, but it seems to be more expensive and harder to set up. Regarding the cost, the idea is to try small worker node machines so the cost stays low during the night. Regarding the difficulty of deploying, gitops is desired, so argo or something similar would probably have to be added. Also, I heard gke autopilot has a good free tier and is easier to deploy.
Thanks in advance :)
Cloud Run isn't a good solution for that. Indeed, if the werkzeug session is persisted in memory, the same client isn't guaranteed to reach the same instance on each request, and can thus lose the file even in the middle of a session.
The best solution is to use a VM with a sticky session configuration. You can use old school deployment on Compute Engine, or a cloud native solution with GKE/K8S. It's more or less the same cost if you have only 1 cluster (the first one is free).
Just a correction about GKE Anthos: I think you mean Cloud Run on Anthos, and yes, it's like Cloud Run but it uses Knative on GKE to manage the containers, and it's also serverless. But GKE itself can handle stateful deployments, as you need with odoo.
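To make that point concrete, the session-in-Redis idea the question already tried is the right shape for stateless platforms: session state moves out of the container so any instance can serve the client's next request. A minimal sketch of the pattern in plain Python with the redis client (the key naming and TTL are my assumptions, not what the camptocamp addon actually does):

import json
import redis  # pip install redis

r = redis.Redis(host="localhost", port=6379)

def save_session(sid, data, ttl_seconds=3600):
    # Keep session state out of the container's filesystem so any
    # instance can serve the client's next request.
    r.setex("session:" + sid, ttl_seconds, json.dumps(data))

def load_session(sid):
    raw = r.get("session:" + sid)
    return json.loads(raw) if raw else {}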

Efficient Nuxt-generated static site hosting: better on Amazon AWS or a cloud droplet?

Not sure if this belongs here or is well titled, but I'm about to finish my first Nuxt project and I'm not sure where to host it.
Usually I would use an Ionos or DigitalOcean droplet, but I was told that AWS Amplify or S3 (I have no idea about either solution) might be cheaper or maybe cost nothing, since it is a small project and the cost depends on how intensive the processes are...
If true, would that still apply when I need to run git pull and then the build/generate process once a day to get new content (via nuxt/content)?
Sorry if this is expressed poorly, and thanks in advance for any helpful suggestion.
This question doesn't really belong on Stack Overflow because it's essentially opinion based.
By order of preference, I do personally recommend those:
Netlify
Vercel
Digitalocean
Github pages
Surge
More on the official documentation of Nuxt: https://nuxtjs.org/docs/2.x/deployment/netlify-deployment
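On the daily-rebuild part of the question: if you do end up self-hosting on a droplet, that git pull + generate step can be a small scheduled job. A minimal sketch in Python, run from cron (the project path is hypothetical, and npx nuxt generate assumes a Nuxt 2 project):

#!/usr/bin/env python3
# Daily rebuild job: pull new content, reinstall locked dependencies,
# and regenerate the static site. Run it from cron, e.g.:
#   0 4 * * * /usr/bin/python3 /home/deploy/rebuild.py
import subprocess

PROJECT_DIR = "/home/deploy/my-nuxt-site"  # hypothetical path

def run(*cmd):
    # check=True makes the job fail loudly if any step breaks.
    subprocess.run(cmd, cwd=PROJECT_DIR, check=True)

run("git", "pull", "--ff-only")
run("npm", "ci")
run("npx", "nuxt", "generate")  # writes the static site to dist/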

Can I create new S3 users and add IAM policies from the Linux command line?

Is there any good way of creating and managing S3 policies and users from the command line of my Raspberry Pi?
The AWS Universal Command Line Tools are newer and better supported. They rely on Python, so if you can get Python for Raspberry Pi, you should be set.
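Since the CLI tools are themselves Python wrappers over the same API, a short script can also do the job directly. A minimal boto3 sketch that creates a user, an access key, and an inline S3 policy (the user, policy, and bucket names here are made up):

import json
import boto3  # pip install boto3

iam = boto3.client("iam")

# Hypothetical user and bucket names.
USER = "backup-agent"
BUCKET = "my-backup-bucket"

# Create the user plus an access key it can authenticate with.
iam.create_user(UserName=USER)
key = iam.create_access_key(UserName=USER)["AccessKey"]
print("Access key:", key["AccessKeyId"], key["SecretAccessKey"])

# Inline policy granting read/write on a single bucket only.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::" + BUCKET, "arn:aws:s3:::" + BUCKET + "/*"],
    }],
}
iam.put_user_policy(UserName=USER, PolicyName="s3-backup-access",
                    PolicyDocument=json.dumps(policy))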
I have no experience using it myself, but I found a tool for interacting with Amazon IAM, the access control service for AWS, that might work for you:
IAM Command Line Toolkit (note: last updated September 2010)
There may be more usable stuff under the IAM Resources section.
If you are unfamiliar with IAM, the documentation is one place to start, although, knowing the general style of AWS documentation, there may be better resources and tutorials to be found elsewhere.

What is a good AWS client?

The web based AWS console seems so limited in what it can do. For example, to create a private stream distribution, you have to create a CloudFront Origin Access Identity, create the private content distribution, and modify the ACL on the private objects, all through XML calls (WTF?). I really expect something so common to be integrated into their Console.
Is there a client smart enough that allows me to do simple tasks in simple ways?
Some options are:
ylastic - A web based tool that does automate many multi-step operations.
Cloudberry Explorer - A Windows only client application
Bucket Explorer - A cross platform client application
I'm not sure if these perform the tasks you need, but they are worth a look.
You can configure CloudFront Private Content with cloudberry freeware. http://blog.cloudberrylab.com/2010/03/how-to-configure-private-content-for.html
More generally, you should probably look at cloud management software. These services are an additional layer on top of Amazon Web Services and leverage the AWS API to offer automation tools like automated backups, auto-scaling, and failover by default...
Ylastic was mentioned by Geoff, but you can also try Scalr (disclaimer: I work there), RightScale or enStratus.
I have felt this way too... but it seems the aws command line tools are the only option that I know of.
Basic answer: No! It is even hard to find a GUI that manages more than one aws product.
And no one will put in the effort to develop one, as aws keeps changing the API interface.
For me:
For a stable aws service like S3, I use Cyberduck.
For the rest, I write my own programs; they're more customized to my needs and less error-prone (and writing them helps me get familiar with the api).
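As an illustration of the write-your-own-program route, the private-distribution setup from the question can be scripted against the same API the console hides. A minimal boto3 sketch of the first step, creating the CloudFront Origin Access Identity (the comment text is arbitrary; the remaining steps, the distribution and the bucket ACL, would follow the same pattern):

import time
import boto3

cf = boto3.client("cloudfront")

# First step of the private-content setup: create an Origin Access
# Identity. CallerReference just has to be unique per request.
resp = cf.create_cloud_front_origin_access_identity(
    CloudFrontOriginAccessIdentityConfig={
        "CallerReference": str(time.time()),
        "Comment": "OAI for a private stream distribution",
    }
)
oai = resp["CloudFrontOriginAccessIdentity"]
print("OAI id:", oai["Id"])
print("S3 canonical user:", oai["S3CanonicalUserId"])  # used in the bucket ACL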

Anyone actually using Mosso Files (Amazon S3 competitor)?

We have a bunch of data on S3 (images) but just started reading about Mosso Files (Rackspace). Sometime this month they are going to add CDN capabilities, so any file you upload becomes part of the Limelight CDN.
Is anyone using this service? It's not as well documented or publicized as S3.
Yes, it's not as well documented or publicized as S3. But dude, it has CDN support, which S3 lacks (unless you're willing to pay extra, of course). The bad thing is you can't FTP into Mosso CloudFiles; you have to upload either through the web-based control panel or the API. Yet it's still cheap and worth it, especially with the CDN.
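For the API upload route, Rackspace shipped a python-cloudfiles library at the time. A rough sketch of an upload based on how that library was documented (the container and file names are made up; treat the exact calls as an assumption and check the library's README):

import cloudfiles  # Rackspace's python-cloudfiles library

# Credentials come from the Mosso/Rackspace control panel.
conn = cloudfiles.get_connection("username", "api_key")

# Containers are the CloudFiles equivalent of S3 buckets.
container = conn.create_container("images")

# Upload a local file as an object in the container.
obj = container.create_object("photo.jpg")
obj.load_from_filename("/path/to/photo.jpg")

# Once CDN support arrives, publishing the container should put its
# objects on the Limelight CDN.
container.make_public()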
I am using the service and it's pretty good and cost effective compared to S3.
We use it for all our client sites, from images to podcasts, and it's hands down the best way to distribute content and make it highly available - especially at this price!
cheers