I am looking for a simple HTTP server that accepts commands (GET only), queries the Redis DB with a key, and sends the reply (value) back in text format. My only requirement is that the server is very lightweight and can access a backend DB. Thanks in advance.
Try Webdis.
Webdis is a simple HTTP server which forwards commands to Redis and sends the reply back using a format of your choice.
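To give a feel for it, here is a minimal sketch of querying Webdis from Python with nothing but the standard library (the default port 7379 and the key name "mykey" are assumptions):

```python
# Query Webdis over plain HTTP: the URL path is the Redis command.
# Assumes Webdis is running locally on its default port 7379.
import urllib.request

# The default reply is JSON, e.g. {"GET":"some value"}.
print(urllib.request.urlopen("http://127.0.0.1:7379/GET/mykey").read().decode())

# Appending a file extension such as .txt asks Webdis to return the raw
# value with a matching Content-Type (text/plain here).
print(urllib.request.urlopen("http://127.0.0.1:7379/GET/mykey.txt").read().decode())
```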
I recently looked for the same kind of HTTP server.
As mentioned by Evandro, you can try Webdis, or you can go for Nginx with some modules.
In your case, for GET requests only, you can install Nginx with the HttpRedis module.
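A rough sketch of the corresponding nginx.conf is below; the /redis location and the ?key= query argument are my own choices, while $redis_key and redis_pass come from the module:

```nginx
# Minimal GET-only endpoint backed by Redis via the HttpRedis module.
# /redis?key=foo returns the value of "foo" as the response body.
location /redis {
    set $redis_key $arg_key;      # key taken from the ?key= argument
    redis_pass 127.0.0.1:6379;    # backend Redis server
    default_type text/plain;      # reply as plain text
    # a missing key produces a 404 from the module
}
```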
If later your requirements evolve, you can always go for HttpRedis2Module, which supports all the Redis commands.
I personally use the HttpLuaModule with the lua-resty-redis module and lua-cjson.
Once you have the HttpLuaModule running, it's really easy to add new Lua modules and extend the capabilities of Nginx. The resty-redis module lets you add some logic between the HTTP request handling and your Redis queries using Lua. You also have a large number of examples of the module setup and usage on GitHub.
Adding cjson lets you return JSON instead of raw text.
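To give an idea, a GET-only lookup location using lua-resty-redis plus cjson might look roughly like this (a sketch; it assumes a reasonably recent HttpLuaModule and that the key arrives as a ?key= query argument):

```nginx
# /get?key=foo returns {"value": ...} from Redis.
location /get {
    default_type application/json;
    content_by_lua_block {
        local redis = require "resty.redis"
        local cjson = require "cjson"

        local red = redis:new()
        red:set_timeout(1000)  -- 1 s timeout for connect/read
        local ok, err = red:connect("127.0.0.1", 6379)
        if not ok then
            ngx.status = 500
            ngx.say(cjson.encode({ error = err }))
            return
        end

        local value, err = red:get(ngx.var.arg_key)
        red:set_keepalive(10000, 100)  -- hand the connection back to the pool

        if value == ngx.null then
            ngx.status = 404
            ngx.say(cjson.encode({ error = "key not found" }))
        else
            ngx.say(cjson.encode({ value = value }))
        end
    }
}
```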
Use Webdis as suggested, or mod_redis (a Redis module) with an nginx or Apache2 server, as per your requirements.
In my project, I am planning to use multiple backends to store different data in my Spring Cloud Config Server setup: use the git backend to store non-sensitive data, and use Vault to store sensitive data like passwords/tokens. This is similar to what https://content.pivotal.io/blog/spring-cloud-services-supports-vault-multiple-backends-use-the-right-config-repo-for-the-job suggests.
My question is: since the returned decrypted value from Vault is passed back to the client application through the config server, will the config server cache/store/log the response from Vault in any way? If so, the config server would be a big target for hackers, and we might have to protect it with extra configuration.
I suppose the true answer to your concern would be to secure each and every layer of your stack to prevent intrusion at any single point.
The Spring documentation makes no explicit reference to caching data - so you should be safe in that regard. It would also not make a lot of sense for Config Server to cache the configuration from external data stores as it is not the source-of-truth for that data. We want it to always fetch the data from source to ensure we get the latest version of the data. I'm supposing that there might be a case of caching if Config Server stored the configuration locally and was able to watch the files for changes and refresh its cache accordingly. But having said that, I'm still not sold on the benefit of caching at this layer.
From personal use of Spring Cloud Config Server, I've yet to see it log out the entire configuration; in fact, it logs very little to begin with. I'm sure you can suppress logging even further by setting the appropriate levels.
What you should also look at doing is securing the connections between Vault and Config Server, and between Config Server and each application, using SSL/TLS. That will prevent you from transmitting data in clear text and will provide you with an additional layer of security.
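For reference, a composite git-plus-Vault server configuration looks roughly like this (a sketch; the hostnames and keystore details are placeholders, and the server.ssl block is the usual Spring Boot way to serve HTTPS):

```yaml
spring:
  profiles:
    active: git, vault         # composite backend: git for plain config, Vault for secrets
  cloud:
    config:
      server:
        git:
          uri: https://git.example.com/config-repo.git
        vault:
          host: vault.example.com
          port: 8200
          scheme: https        # talk to Vault over TLS
server:
  ssl:                         # serve the config API itself over HTTPS
    key-store: classpath:keystore.p12
    key-store-password: ${KEYSTORE_PASSWORD}
    key-store-type: PKCS12
```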
I'm trying to run a SPARQL query on Wikidata, but it times out. I'd like to download a dump and index it in some database, so I can run local SPARQL queries using HTTP requests. I also need to support Wikidata-specific extensions like SERVICE wikibase:label. I've downloaded an RDF dump. What are the next steps?
Wikimedia has documentation on how to run your own SPARQL endpoint from one of their dumps. They also have an updater that streams updates from their servers, to keep your endpoint up-to-date.
You won't need to do anything special to support their extensions; they are included by default.
In production, I recommend putting a reverse proxy (like nginx or Apache) with HTTP auth in front of it, since the admin dashboard is accessible by default.
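As a sketch, the nginx side could look like this (it assumes the query service listens on localhost:9999, Blazegraph's default in the Wikidata Query Service setup, and that an htpasswd file already exists):

```nginx
server {
    listen 443 ssl;
    server_name sparql.example.org;              # placeholder hostname
    ssl_certificate     /etc/nginx/tls/fullchain.pem;
    ssl_certificate_key /etc/nginx/tls/privkey.pem;

    location / {
        auth_basic "SPARQL endpoint";
        auth_basic_user_file /etc/nginx/.htpasswd;  # created with htpasswd
        proxy_pass http://127.0.0.1:9999;           # Blazegraph / WDQS default port
    }
}
```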
I am building a project which requires constant connection with the server.
There are two major ways to achieve this:
Ajax pull
Ajax push
I have to decide between pinging a server (expensive) and maintaining keep-alive connections (firewalls block those).
I was thinking about live video streams. They are neither keep-alive connections nor frequent pings.
Is it possible to send data, like JSON strings, through RTMP?
It would be theoretically possible to use RTMP's AMF3 and AMF0 message types to carry the data (see RTMP on Wikipedia).
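To illustrate just the data-carrying part: an AMF0 string value is a type marker plus a length-prefixed UTF-8 payload, so a JSON document can be tunnelled through it. A sketch follows; the surrounding RTMP chunk and message framing is deliberately left out:

```python
import json
import struct

def amf0_string(payload: str) -> bytes:
    """Encode text as an AMF0 string value: marker 0x02 + u16 length + UTF-8."""
    data = payload.encode("utf-8")
    if len(data) > 0xFFFF:
        # Strings over 65535 bytes use the Long String marker 0x0C with a u32 length.
        return b"\x0c" + struct.pack(">I", len(data)) + data
    return b"\x02" + struct.pack(">H", len(data)) + data

# JSON carried as an AMF0 string; this would still need to be wrapped in an
# actual RTMP message (e.g. an AMF0 Data message) and chunked on the wire.
blob = amf0_string(json.dumps({"event": "ping", "seq": 1}))
```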
The problem is that using a protocol typically used for streaming video might get your connection blocked or throttled by some service providers that limit such protocols to conserve bandwidth (and prevent employees from watching internet videos at work).
This article may be of some use to you. It explains how to set up an RTMP server with nginx.
From the article:
nginx is an extremely lightweight web server, but someone wrote an RTMP module for it, so it can host RTMP streams too. However, to add the RTMP module, we have to compile nginx from source rather than use the apt package. Don't worry, it's really easy. Just follow these instructions. :)
One comment on this article, by a user named 'stefaniuk', linked to a GitHub repository for this that I think you should look into. Check it out here.
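Once nginx is compiled with the RTMP module, the server side boils down to a small config block. A sketch using the module's standard directives (the application name is arbitrary):

```nginx
rtmp {
    server {
        listen 1935;            # standard RTMP port
        application live {      # clients connect to rtmp://host/live
            live on;            # enable live streaming for this application
        }
    }
}
```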
I am working on a project which has following requirements:
Perform sticky load balancing (based on SOAP session ID) onto multiple backend servers.
Possibility to plug in my own custom load balancer.
Easy to write and deploy.
A central configuration file (possibly XML) to take care of all the backend servers.
Easy extraction of a node from this configuration file (possibly with XPath).
I tried working with Camel for a while but wasn't able to perform certain tasks with it.
So I thought of giving Akka a try.
Will Akka be able to satisfy the above requirements?
If so, is there a load balancing or proxy example in Akka?
Would really appreciate some feedback.
You can do everything you've described with Akka.
You don't mention what language you're working with, Scala or Java. I've included links to the Scala documentation.
Before you do anything with Akka, you HAVE TO read the documentation and understand how Akka works.
http://doc.akka.io/docs/akka/2.0.3/
Doing so, you'll find Akka is perfect for the project you've described with some minor caveats.
Once you read the documentation the following answers should make a lot of sense.
Perform sticky load balancing (based on SOAP session ID) onto multiple backend servers.
Load balancing is already part of the framework (it's called Routing in Akka http://doc.akka.io/docs/akka/2.0.3/scala/routing.html) and Remoting (http://doc.akka.io/docs/akka/2.0.3/scala/remoting.html) will take care of the backend servers. You can easily combine the two.
To my knowledge, sticky load balancing is not part of Akka itself, but I can envision it being accomplished with a Map using the session ID as the key and the actor name (or path) as the value. A quick actorFor will take care of the rest. This isn't well thought out, but it should give you a good idea of where to start.
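Here is that bookkeeping sketched in Python purely to show the idea; treat it as pseudocode for the Map-plus-round-robin approach (Akka itself would be Scala or Java, and the actor paths are illustrative):

```python
class StickyRouter:
    """Pin each session ID to one backend; new sessions get round-robin."""

    def __init__(self, worker_paths):
        self.worker_paths = list(worker_paths)  # e.g. remote actor paths
        self.assignments = {}                   # session ID -> worker path
        self.counter = 0

    def route(self, session_id):
        if session_id not in self.assignments:
            # First request for this session: pick the next worker.
            pick = self.worker_paths[self.counter % len(self.worker_paths)]
            self.assignments[session_id] = pick
            self.counter += 1
        return self.assignments[session_id]

router = StickyRouter(["akka://backend@host1:2552/user/worker",
                       "akka://backend@host2:2552/user/worker"])
# Requests with the same SOAP session ID always land on the same backend.
assert router.route("soap-session-42") == router.route("soap-session-42")
```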
Possibility to plug in my own custom load balancer.
Refer to the Routing documentation.
Easy to write and deploy.
This depends on your aptitude and effort, but after you read certain parts of the documentation you should be able to build a proof of concept in a couple of hours.
Deployment can be a bit frustrating, mostly because the documentation isn't really great with respect to deploying Akka networks with remote components. However, there are enough examples on the web that you can figure out how to get it done...eventually. Once you've done it once, it's no big deal.
A central configuration file (possibly XML) to take care of all the backend servers.
Akka uses Typesafe Config (https://github.com/typesafehub/config) which is a lot easier to work with than XML (but I hate XML so take that with a grain of salt). As far as a central configuration, I'm not sure what you're trying to accomplish but it sounds like something that can be solved using remote actor creation. Again, see the Remoting documentation.
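For illustration, the central list of backend servers could live in an application.conf like this (a HOCON sketch; the key names are my own):

```hocon
# application.conf -- central configuration of the backend servers
backend {
  workers = [
    "akka://backend@host1:2552/user/worker",
    "akka://backend@host2:2552/user/worker"
  ]
}
```

At startup you can read backend.workers with ConfigFactory.load() and resolve each path with actorFor, instead of parsing XML with XPath.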
Easy extraction of a node from this configuration file (possibly with XPath).
Akka provides a lookup method .actorFor. There's no need to go to the configuration file once the system is up and running.
If so, is there a load balancing or proxy example in Akka?
Google is your friend.
I'm building a queue-based system to scale user-uploaded images.
Users will upload images which will get transferred to a storage server. The web server will then add a message to a queue which will be listened to by image scaling workers that will retrieve the image files, scale them and add them to the storage server.
I was planning on using Celery over RabbitMQ for this, but my web tier will be running PHP, so for convenience I'd rather find a PHP way of doing this.
What suggestions do people have?
If it came to it (although I don't want to complicate the web tier by having both Python and PHP), how easy would it be to control Celery from PHP, and how would I do that? Some kind of RPC protocol (like Thrift?), or something simpler, since Celery needn't be on a different server?
I made the Celery-PHP library, and it has been working smoothly for a few months now.
I'll just use Thrift to let me invoke Python from PHP, and use Python with Celery.
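For the Python side, the Celery worker can stay very small. A sketch (the broker URL, the Pillow-based resize, and the path arguments are all assumptions):

```python
# tasks.py -- run the worker with: celery -A tasks worker
from celery import Celery
from PIL import Image  # Pillow

app = Celery("images", broker="amqp://guest@localhost//")

@app.task
def scale_image(src_path, dst_path, max_width, max_height):
    """Fetch an uploaded image, scale it down, write the result."""
    img = Image.open(src_path)
    img.thumbnail((max_width, max_height))  # preserves aspect ratio
    img.save(dst_path)
```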