I have read tutorials on Laravel queues using Beanstalkd and similar services, and the idea of using a queue is appealing: in my current project, sending a welcome mail to a registered user takes up to 10 seconds to process because of an attached logo. I can imagine what would happen if several users registered at the same time, so using a queue for this would speed things up.
On the shared server I am working on, I have no SSH access, so setting up a queue the way the tutorials describe is far-fetched.
I want to know if there is a way to set up a Laravel queue without SSH access; if there is, I need a guide.
You can't use Beanstalkd on a shared server because you can't install the service, and I don't know of any hosting company that offers it for shared hosting. However, you can use IronMQ, which is a remotely hosted service, so you don't need to install anything on the server. The Laravel queue API is the same for any queue service, so you can just use Queue::push as you would with Beanstalkd.
Here's a great video on setting this up by Taylor Otwell, the creator of Laravel:
http://vimeo.com/64703617. You can also read this tutorial, which explains how to use IronMQ with Laravel in more detail.
IronMQ is a paid service, but it has a free plan for developers that offers 1 million API requests per month.
Instead of running artisan queue:listen as you would for Beanstalkd, you just define a route for IronMQ to call as it processes each job on the queue:
Route::post('queue/receive', function()
{
    return Queue::marshal();
});
I intend to build a set of skills for Amazon Alexa that will integrate with a custom software suite running on a Raspberry Pi in my home.
I am struggling to figure out how to make the Echo/Dot itself call the Raspberry Pi's API directly, without going through the internet. The target device will have nothing more than an intranet connection: it will be able to receive commands from devices on the local network, but it is not accessible from the outside world.
From what I have read, the typical workflow is as follows:
Echo -> Alexa Service -> Lambda
where a Lambda function returns a blob of data to the Smart Home device. Using this return value, is it possible (and if so, how) to make the Alexa device itself send an API request to a device on the local network after receiving the response from Lambda?
I have the same problem, and my solution is to use SQS as the message bus so that my RaspberryPi doesn't need to be accessible from the internet.
Echo <-> Alexa Service <-> Lambda -> SQS -> RaspberryPi
                             ^                    |
                             +------ SQS <-------+
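The Lambda leg of the diagram above might look like the following sketch. The queue URL, the intent/slot names, and the JSON message shape are illustrative assumptions, not part of the original setup:

```python
import json

# Placeholder queue URL; the real one comes from your SQS console.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/alexa-commands"

def make_message(intent_name, slots):
    """Build the JSON body the RaspberryPi-side consumer will read."""
    return json.dumps({"command": intent_name, "args": slots})

def lambda_handler(event, context):
    # boto3 is available in the Lambda Python runtime; imported here so
    # the helper above can be exercised without boto3 installed locally.
    import boto3
    intent = event["request"]["intent"]
    body = make_message(intent["name"], intent.get("slots", {}))
    boto3.client("sqs").send_message(QueueUrl=QUEUE_URL, MessageBody=body)
    # Minimal Alexa response confirming the command was queued.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": "Okay, done."},
            "shouldEndSession": True,
        },
    }
```

The handler returns the spoken confirmation immediately; the actual work happens later on the Pi, which is what keeps Alexa responsive even though the Pi has no inbound connectivity.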
This works fine as long as:
- you enable long polling (20 seconds) of SQS on the RaspberryPi and set the maximum number of messages per request to 1
- you don't have concurrent messages going back and forth between Alexa and the RaspberryPi
This gives the following benefits:
- with at most 1 message per request, the SQS call returns as soon as one message is available in the queue, even before the long-poll timeout is reached
- with only one long poll in flight at a time, a full month of polling fits under the SQS free tier of 1 million requests
- no special firewall permissions are needed to reach your RaspberryPi from the internet, so the RaspberryPi's connection from the Lambda always "just works"
- it is more secure than exposing your RaspberryPi to the internet, since there are no open ports for malicious programs to attack
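The RaspberryPi side of the loop might look like the sketch below; the queue URL and the JSON message shape are assumptions, and it presumes boto3 plus AWS credentials are configured on the Pi. As a sanity check on the free-tier claim: one 20-second long poll at a time is 31 * 24 * 60 * 60 / 20 = 133,920 requests in a 31-day month, well under 1 million.

```python
import json

# Placeholder queue URL; use your own queue's URL here.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/alexa-commands"

def parse_command(body):
    """Decode the JSON payload the Lambda is assumed to put on the queue."""
    msg = json.loads(body)
    return msg["command"], msg.get("args", {})

def poll_forever():
    # Imported here so parse_command stays usable without boto3 installed.
    import boto3
    sqs = boto3.client("sqs", region_name="us-east-1")
    while True:
        # Long poll: the call blocks for up to 20 seconds but returns as
        # soon as one message is available, so latency stays low.
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=1,
            WaitTimeSeconds=20,
        )
        for msg in resp.get("Messages", []):
            command, args = parse_command(msg["Body"])
            print("received:", command, args)
            # Delete after handling so the message is not redelivered.
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])
```

Deleting the message only after handling it means a crash mid-command leaves the message on the queue for the next poll, which is usually what you want for home-automation commands.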
You could try using AWS IoT:
Echo <-> Alexa Service <-> Lambda <-> IoT <-> RaspberryPi
I thought about using this for my Alexa RaspberryPi project but abandoned the idea since AWS IoT doesn't offer a permanent free tier. The free tier is no longer a concern, though, since Amazon now offers Alexa AWS promotional credits.
https://developer.amazon.com/alexa-skills-kit/alexa-aws-credits
One possibility is to install Node-RED on your rPi. Node-RED has plugins (https://flows.nodered.org/node/node-red-contrib-alexa-local) that simulate a Philips Hue bridge and make Alexa talk to it directly, with an instant response. The downside is that it only works for three commands: on, off, and set to x%. It works great for software/devices that control lights, shades, and air-con.
This was answered in this forum a while ago, and I'm afraid the situation hasn't changed since:
Alexa is cloud-based and requires access to the internet / Amazon servers to function, so you cannot use it within the intranet alone, without external access.
There are a couple of workaround methods I've seen used.
The first is one I've used myself:
I set up If This Then That (IFTTT) to listen for a specific phrase from Alexa, then transmit commands through the Telegram secure chat/messaging service, where a "chat bot" running on my Raspberry Pi reads and acts on those messages.
The second method, which I saw most recently, uses IFTTT to add rows to a Google spreadsheet that the Raspberry Pi monitors and acts on.
I wasn't particularly happy with the performance/latency of either method, but if I wrote a custom Alexa service using a similar methodology it might at least eliminate the IFTTT delay.
Just open a tunnel into your rPi with a service like https://ngrok.com/ and then communicate with that, either as your endpoint or from the Lambda.
You can achieve this by using a proxy. BST has a tool for that; I currently use this one: http://docs.bespoken.tools/en/latest/commands/proxy/
So rather than using a Lambda, you can use your local machine.
Essentially it becomes Echo -> Alexa Service -> Local Machine
Install the bespoken-tools npm package on your local machine (https://www.npmjs.com/package/bespoken-tools):
npm install bespoken-tools --save
Go to the folder containing your project's index.js and run the proxy command:
bst proxy lambda index.js
This will give you a URL like the following:
https://proxy.bespoken.tools?node-id=xxx-xxx-xxx-xxx-xxxxxxxx
Now go to your Alexa skill on developer.amazon.com and click to configure your skill.
Choose HTTPS as your service endpoint type and enter the URL printed out by BST.
Then click save, and boom: your local machine becomes the final endpoint.
I asked earlier about notifications on the client here.
Now I am interested in notifications on the server side. In particular, I am interested in the fact that a notification informs all servers.
My problem is a cluster of servers. I have some database elements cached on all servers. If a user on any server updates a database element, the caches need to be refreshed. Notifications could do the job.
Or is there another way to deal with a cluster of servers?
Marko
There is no complete tutorial on this topic, I'm afraid.
Scout, however, does have the functionality you are looking for in the form of the IClusterSynchronizationService.
You can use it to register listeners and send messages between Eclipse Scout servers.
For it to work, you'll need a message-passing system (a message queue) such as Apache ActiveMQ or RabbitMQ. You simply have to install the necessary connector from the Eclipse Marketplace and register it in your application. A detailed explanation is in this tutorial. (You need to add the new connector as a dependency to your product files, register the cluster synchronization service, and configure it with properties for host, port, and so on.)
The "BahBah" demo chat application on GitHub has an implementation of these listeners and shows how to register them.
The (unofficial) fork of the BahBah chat demo has some of these changes already built in.
I have a web server implemented with .NET MVC 4. Clients connected to this web server perform some operations and upload live logs to it using the WebClient.UploadString method. The logs are sent from client to server in chunks of 2,500 characters at a time.
Things work fine while 2-3 clients upload logs. However, when more than 3 clients try to upload logs simultaneously, they start receiving "HTTP 500 Internal Server Error".
I might have to scale up and add more slaves, but that would make the situation worse.
I want to implement Jenkins-like live logging, where logs from the slaves are updated live.
Please suggest a better, more scalable solution to this problem.
Have you considered looking into SignalR?
It can be used for anything from instant messaging to stocks! I have implemented both a chatbox and a custom system that sends off messages, does calculations, and then passes the results back down to the client. It is very reliable, there are some nice tutorials, and I think it's awesome.
I have a Rails app running on AWS Elastic Beanstalk on a web tier. I want to send email notifications to users, so I'm using SQS to send messages to a queue:
sqs = AWS::SQS.new
sqs.queues.named("messaging_queue").send_message("HELLO")
and then I would like to take these messages off the queue using a worker tier instance.
My issue is that when I create the worker tier instance from the console, it asks for the application version, which defaults to the latest version deployed to my web tier. I don't want to upload my entire web application to the worker, just the code responsible for sending the emails.
What's the best way to do this? I could upload a zip, but I would like to just use git.
Can you refactor the code that is responsible for sending emails into a separate library? That way you can create a new web app which just wraps around the email functionality in your library and runs on a worker tier environment. The worker daemon will post messages to your new worker tier app which will then send the email. That way you do not have to deploy your entire code base to your worker tier environment.
You can use git and eb to achieve this. Your worker tier application version and web app application version can be managed in different branches, or, as seems better in your case, in different git repositories. If you wish to use branches, you can read about the "eb branch" command; it may be useful.
Read more about eb here.
What is the best way to run a single-instance WCF service that uses ActiveMQ within IIS/AppFabric?
Our services need to support both HTTP transports and ActiveMQ (listening for and sending messages). We've elected not to use MSMQ and will use Spring.Net.NMS. The fundamental issue I have now is that ActiveMQ needs to connect to the queue(s) at startup and remain connected, but WAS gets in the way with its message-activation feature. If the service is not activated until a message arrives (HTTP/MSMQ, etc.), then there is no trigger for the connection to AMQ to be made.
I know I can disable the recycling behavior, and I know I can self-host with a Windows service. But I want to take advantage of the monitoring and other features in AppFabric. I've already been down the road with IServiceBehavior and will use that for other nice things, but that interface is not called until a (non-AMQ) message arrives, so it won't work here. What I was hoping for was something along the lines of how ServletContextListeners work in Java, where you get both the startup and shutdown events. But it seems no such thing exists in WAS... it is driven only by arriving messages.
I've scoured every inch of the web for 3 days, and the only thing I came across was a trick that uses a static constructor (C#) as the trigger. That's a hack, but I can live with it. It still leaves the issue of shutting down cleanly, which I can figure out later.
Anyone have a solid solution to this?
The direct WCF support for ActiveMQ that Ladislav mentions is still being supported. There just hasn't been an official release for the module in a while. However, you can still get the latest version of it from the 1.5.x branch or trunk and compile it yourself.
1.5.x branch for use with Apache.NMS 1.5.0:
https://svn.apache.org/repos/asf/activemq/activemq-dotnet/Apache.NMS.WCF/branches/1.5.x/
Check out instructions:
http://activemq.apache.org/nms/source.html
There was direct WCF support for ActiveMQ, but I guess it is no longer being developed. Your problem is actually the hosting architecture of IIS / WAS (which provides hosting for non-HTTP protocols). Services in WAS are always activated when a message arrives; there is no global startup. The reason is that WAS hosting expects a separate process (a Windows service) to run the listener all the time, and this process has an adapter that calls into WAS and uses message-level activation. I guess you don't have such a process for ActiveMQ, and because of that you will have trouble using an ActiveMQ endpoint hosted in WAS. Developing such a listener can be a challenging task (example for UDP).
Creating a custom listener can probably be avoided by using the IIS 7.5 / AppFabric auto-start feature. There is also a not-very-well-documented way to run code when the application starts.