How to share a variable between two router modules in Cro? - raku

I am trying to use Cro to create a REST API that will publish messages to RabbitMQ. I would like to split my routes into different modules and compose them with an "include", but I would also like to share the same RabbitMQ connection in each of those modules. I tried with "our", but it does not work:
File 1:
unit module XXX::YYY;
use Cro::HTTP::Router;
use Cro::HTTP::Server;
use Cro::HTTP::Log::File;
use XXX::YYY::Route1;
use Net::AMQP;
our $rabbitConnection is export = Net::AMQP.new;
await $rabbitConnection.connect;
my $application = route {
    include <api v1 run> => run-routes;
}
...
File 2:
unit module XXX::YYY::Route1;
use UUID;
use Cro::HTTP::Router;
use JSON::Fast;
use Net::AMQP;
my $channel = $XXX::YYY::rabbitConnection.open-channel().result;
$channel.declare-queue("test_task", durable=> True );
sub run-routes() is export { ... }
Error message:
===SORRY!===
No such method 'open-channel' for invocant of type 'Any'
Thanks!

When you define your exportable route function, you can give it arguments; then, in your composing module, you can create the shared objects and pass them to the routes. For example, in your router module:
sub run-routes($rmq) is export {
    route {
        # ... $rmq is available in here
    }
}
Then, in your main router, you can create your queue and pass it in when including:
my $rmq = # Insert queue creation code here
include product => run-routes( $rmq );
I've not tried this but I can't see any reason why it shouldn't work.
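For concreteness, here is a sketch of what the composing module might look like with this approach. It reuses the Net::AMQP calls from the question (new, connect, open-channel, declare-queue) and the run-routes($rmq) signature above; I haven't run it either, so treat it as a starting point:

# Composing module: create the connection and channel once,
# then hand the same channel to every included route set.
use Cro::HTTP::Router;
use Net::AMQP;
use XXX::YYY::Route1;   # exports run-routes($rmq)

my $rabbitConnection = Net::AMQP.new;
await $rabbitConnection.connect;

my $rmq = $rabbitConnection.open-channel().result;
$rmq.declare-queue("test_task", durable => True);

my $application = route {
    include <api v1 run> => run-routes($rmq);
}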

The answer by @Scimon is certainly correct, but it does not address the OP's actual problem. On the other hand, the two comments by @ugexe and @raiph are spot on, so I'll try to summarize them here and explain what's going on.
The error itself
This is the error:
Error message:
===SORRY!=== No such method 'open-channel' for invocant of type 'Any'
It indicates that the invocant ($XXX::YYY::rabbitConnection) is of type Any, which is the type variables have when they hold no defined value; in other words, $XXX::YYY::rabbitConnection is not defined. And indeed it cannot be defined here, since XXX::YYY is not among the modules imported by File 2, as @ugexe pointed out.
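A minimal illustration of that failure mode (nothing to do with Net::AMQP itself): without a use XXX::YYY in scope, the fully qualified variable simply resolves to Any, and any method call on it fails exactly as reported:

# No 'use XXX::YYY' here, so the package variable was never set in this scope
say $XXX::YYY::rabbitConnection.defined;    # False: it is just Any
$XXX::YYY::rabbitConnection.open-channel;   # No such method 'open-channel' for invocant of type 'Any'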
The additional problem indicated by the OP
That module was removed from the import list because, as the OP indicates:
I certainly code it the wrong way because if I try to add use
XXX::YYY;, I get a Circular module loading detected error
But of course: use XXX::YYY::Route1; (that is, File 2) is already loaded from File 1, so File 2 cannot in turn use XXX::YYY without creating a cycle.
The final solution is to reorganize files
That circular dependency probably points to the fact that the two files should be merged, or else the common code should be factored out into a third file which is eventually included by both. So you should have something like:
unit module XXX::YYY::Common;
use Net::AMQP;
our $rabbitConnection is export = Net::AMQP.new;
await $rabbitConnection.connect;
And then
use XXX::YYY::Common;
in both modules.
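With that split, File 2 can reach the shared connection through the common module. A sketch of what that might look like (untested, and using only the calls already shown in the question):

unit module XXX::YYY::Route1;
use Cro::HTTP::Router;
use XXX::YYY::Common;    # makes our $rabbitConnection from the common module available

my $channel = $XXX::YYY::Common::rabbitConnection.open-channel().result;
$channel.declare-queue("test_task", durable => True);

sub run-routes() is export {
    route {
        # routes that publish via $channel go here
    }
}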

Related

Raku Cro service subscribing to data "in the background" general guidance

I am attempting to put together a Cro service that has a react/whenever block consuming data "in the background". So, unlike many examples of WebSocket usage with Cro, this has nothing to do with routes that may be accessed via the browser.
My use case is to consume messages received via an MQTT topic and do some processing with them. At a later stage in development I might create a supply out of this data, but for now, when data is received it will be stored in a variable and, depending on certain conditions, sent to another service via an HTTP POST.
My thought was to include a provider() in the Cro::HTTP::Server setup like so:
use Cro::HTTP::Log::File;
use Cro::HTTP::Server;
use Routes;
use DataProvider; # Here
my Cro::Service $http = Cro::HTTP::Server.new(
    http => <1.1>,
    host => ...,
    port => ...,
    application => [routes(), provider()], # Made this into an array of subs?
    after => [
        Cro::HTTP::Log::File.new(logs => $*OUT, errors => $*ERR)
    ]
);
And in the DataProvider.pm6:
use MQTT::Client;
sub provider() is export {
    my $mqtt = MQTT::Client.new: server => 'localhost';
    react {
        whenever $mqtt.subscribe('some/mqtt/topic') {
            say "+ topic: { .<topic> } => { .<message>.decode("utf8-c8") }";
        }
    }
}
This throws a bunch of errors:
A react block:
in sub provider at DataProvider.pm6 (DataProvider) line 5
in block <unit> at service.p6 line 26
Died because of the exception:
Invocant of method 'write' must be an object instance of type
'IO::Socket::Async', not a type object of type 'IO::Socket::Async'. Did
you forget a '.new'?
in method subscribe at /home/cam/raku/share/perl6/site/sources/42C762836A951A1C11586214B78AD34262EC465F (MQTT::Client) line 133
in sub provider at DataProvider.pm6 (DataProvider) line 6
in block <unit> at service.p6 line 26
To be perfectly honest, I am totally guessing that this is how I would approach the need to subscribe to data in the background of a Cro service, but I was not able to find any information on what might be considered the recommended approach.
Initially I had my react/whenever block in the main service.pm6 file, but that did not seem right, and it needed to be wrapped in a start {} block because, as I have just learned, react is blocking :) and Cro was not able to actually start.
Following the pattern of how Routes are implemented seemed logical, but I am missing something. The error speaks about setting up a new method, but I'm not convinced that is the root cause. Routes.pm6 does not have a constructor.
Can anyone point me in the right direction please?
Thanks to all who have provided information, this has been a very valuable learning exercise.
The approach of passing additional subroutines alongside routes() in the application parameter to Cro::HTTP::Server.new gave further trouble (an array is not allowed, and it broke routing).
Instead, I have moved the background work into a class of its own, and given it start and stop methods more akin to Cro::HTTP::Server.
My new approach:
service.pm6
use Cro::HTTP::Log::File;
use Cro::HTTP::Server;
use Routes;
use KlineDataSubscriber; # Moved mqtt functionality here
use Database;
my $dsn = "host=localhost port=5432 dbname=act user=.. password=..";
my $dbh = Database.new :$dsn;
my $mqtt-host = 'localhost';
my $subscriber = KlineDataSubscriber.new :$mqtt-host;
$subscriber.start; # Inspired by $http.start below
my Cro::Service $http = Cro::HTTP::Server.new(
    http => <1.1>,
    host => ...,
    port => ...,
    application => routes($dbh), # Basically back the way it was originally
    after => [
        Cro::HTTP::Log::File.new(logs => $*OUT, errors => $*ERR)
    ]
);
$http.start;
say "Listening at...";
react {
    whenever signal(SIGINT) {
        say "Shutting down...";
        $subscriber.stop;
        $http.stop;
        done;
    }
}
And in KlineDataSubscriber.pm6
use MQTT::Client;
class KlineDataSubscriber {
    has Str $.mqtt-host is required;
    has MQTT::Client $.mqtt = Nil;

    submethod TWEAK() {
        $!mqtt = MQTT::Client.new: server => $!mqtt-host;
        await $!mqtt.connect;
    }

    method start(Str $topic = 'act/feed/exchange/binance/kline-closed/+/json') {
        start {
            react {
                whenever $!mqtt.subscribe($topic) {
                    say "+ topic: { .<topic> } => { .<message>.decode("utf8-c8") }";
                }
            }
        }
    }

    method stop() {
        # TODO Figure out how to unsubscribe and clean up nicely
    }
}
This feels much more "Cro idiomatic" to me, but I would be happy to be corrected.
More importantly, it works as expected and I feel it is somewhat future-proof. I should be able to create a supply to make real-time data available to the router, and push data to any connected web clients.
I also intend to have an HTTP GET endpoint /status with various checks to ensure everything is healthy.
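Regarding the supply idea above, here is a hypothetical extension of the subscriber class (the kline-supply method name is invented, it only reuses the MQTT::Client calls already shown, and it is untested):

use MQTT::Client;

class KlineDataSubscriber {
    has Str $.mqtt-host is required;
    has MQTT::Client $.mqtt;
    has Supplier $!supplier = Supplier.new;

    submethod TWEAK() {
        $!mqtt = MQTT::Client.new: server => $!mqtt-host;
        await $!mqtt.connect;
    }

    # Anything else (a route block, a web socket handler) can tap this Supply
    method kline-supply(--> Supply) { $!supplier.Supply }

    method start(Str $topic = 'act/feed/exchange/binance/kline-closed/+/json') {
        start {
            react {
                whenever $!mqtt.subscribe($topic) {
                    # re-emit each decoded message to all taps
                    $!supplier.emit(.<message>.decode('utf8-c8'));
                }
            }
        }
    }
}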
The root cause
The error speaks about setting up a new method, but I'm not convinced that is the root cause.
It's not about setting up a new method. It's about a value that should be defined instead being undefined. That typically means a failure to attempt to initialize it, which typically means a failure to call .new.
Can anyone point me in the right direction please?
Hopefully this question helps.
Finding information on a recommended approach
I am totally guessing that this is how I would approach the need to subscribe to data in the background of a Cro service, but I was not able to find any information on what might be considered the recommended approach.
It might be helpful for you to list which of the get-up-to-speed steps you've followed from Getting started with Cro, including the basics but also the "Learn about" steps at the end.
The error message
A react block:
in sub provider ...
Died because of the exception:
...
in method subscribe ...
The error message begins with the built in react construct reporting that it caught an exception (and handled it by throwing its own exception in response). A "backtrace" corresponding to where the react appeared in your code is provided indented from the initial "A react block:".
The error message continues with the react construct summarizing its own exception (Died because ...) and explains itself by reporting the original exception, further indented, in subsequent lines. This includes another backtrace, this time one corresponding to the original exception, which will likely have occurred on a different thread with a different callstack.
(All of Raku's structured multithreading constructs[1] use this two part error reporting approach for exceptions they catch and handle by throwing another exception.)
The first backtrace indicates the react line:
in sub provider at DataProvider.pm6 (DataProvider) line 5
use MQTT::Client;
sub provider() is export {
    my $mqtt = MQTT::Client.new: server => 'localhost';
    react {
The second backtrace is about the original exception:
Invocant of method 'write' must be an object instance of type
'IO::Socket::Async', not a type object of type 'IO::Socket::Async'. ...
in method subscribe at ... (MQTT::Client) line 133
This reports that the write method called on line 133 of MQTT::Client requires that its invocant be an instance of type 'IO::Socket::Async'. The value it got was of that type, but it was not an instance; it was instead a "type object". (All values of non-native types are either type objects or instances of their type.)
The error message concludes with:
Did you forget a '.new'?
This is a succinct hint based on the reality that 99 times out of a hundred the reason a type object is encountered when an instance is required is that code has failed to initialize a variable. (One of the things type objects are used for is to serve the role of "undefined" in languages like Perl.)
So, can you see why something that should have been an initialized instance of 'IO::Socket::Async' is instead an uninitialized one?
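To see the same distinction outside MQTT::Client, here is a tiny self-contained example (the class and method names are invented for illustration):

class Connection {
    # The :D in the invocant marker requires a *defined* instance
    method write(Connection:D: $data) { say "writing $data" }
}

Connection.new.write("hi");   # fine: the invocant is an instance

my Connection $conn;          # declared but never initialized, so $conn holds the type object
$conn.write("hi");            # Invocant of method 'write' must be an object instance of type
                              # 'Connection', not a type object of type 'Connection'.
                              # Did you forget a '.new'?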
Footnotes
[1] Raku's constructs for parallelism, concurrency, and asynchrony follow the structured programming paradigm. See Parallelism, Concurrency, and Asynchrony in Raku for Jonathan Worthington's video presentation of this overall approach. Structured constructs like react can cleanly observe, contain, and manage events that occur anywhere within their execution scope, including errors such as error exceptions, even if they happen on other threads.
You seem to be fine now, but when I first saw this I made https://github.com/jonathanstowe/Cro-MQTT, which turns the MQTT client into a first-class Cro service.
I haven't released it yet but it may be instructive.

How to use module.exports and requireJS?

I'm quite a noob in HTML and JS, so forgive me if this is a dumb question, but I'm trying to use RequireJS to export modules in Node and I can't get the function to work right.
Here is the code, extracted from the example.
First I have this main.js, following the note in the documentation at http://requirejs.org/docs/node.html#2:
var sayHi = require(['./greetings.js'], function(){});
console.log(sayHi);
and a greetings.js which exports the greeting:
module.exports= 'Hello';
});
and get nothing as a result, so I define the exports and module:
define(function (exports, module) {
    module.exports = 'Hello';
});
and get this as a result:
function localRequire()
What am I doing wrong? I read the documentation and examples, but somehow I can't make this work.
I'm assuming the require call you are using is RequireJS's require call, not Node's require. (Otherwise, you'd get a very different result.)
You are using the asynchronous form of the require call. With the asynchronous form, there is no return value for you to use, you have to use the callback to get module values, like this:
require(['./greetings.js'], function (sayHi) {
    console.log(sayHi);
});
However, because you are running in Node, you can do this:
var sayHi = require('./greetings.js');
Note how the first argument is a string, not an array of dependencies. This is the synchronous form of the require call. The returned value is the module you required. When you are in Node, RequireJS allows you to call this synchronous form anywhere. When you are running in the browser, it is only available inside a define call.

Undefined global variable when using QuestaSim

I have a variable in foo_const.v which is defined like this:
localparam NUM_BITS = 32;
Then I have another file foo_const_slice.v which does this:
localparam SLICE_ADDR_BITS = NUM_BITS;
This compiles fine with the vcs command:
vcs -sverilog foo_const.v foo_const_slice.v
But when I try to use QuestaSim:
vlog -work work -sv foo_const.v foo_const_slice.v
I get the following error message:
** Error: foo_const_slice.v(46): (vlog-2730) Undefined variable: 'NUM_BITS'.
The problem is that by default, each file that vlog compiles goes into a separate compilation unit, just like C/C++ and many other languages. By default, vcs concatenates all files together into a single compilation unit.
Although there is a way to change the default (you can look it up in the user manual), the proper way to code this is to put your parameters in a package, and import the package where needed.
Dave

Is it possible to use dojo/text! in an Intern functional test?

Is it possible to use "dojo/text!" in an Intern functional test?
I am able to set up my test page as a JSON string, but ideally I'd like to externalise the string in a file for ease of editing. I'm just getting started with Intern at the moment, so I'm just experimenting with what's possible, but here is the start of my test code.
This works when the commented-out "testData" variable is used, but it currently fails when I try to provide the same string via the dojo/text! statement.
Code:
define([
    'intern!object',
    'intern/chai!assert',
    'dojo/text!./firstTestPageConfig.json',
    'require'
], function (registerSuite, assert, PageConfig, require) {
    registerSuite({
        name: 'firstTest',
        'greeting form': function () {
            var testData = PageConfig;
            // var testData = '{"widgets":[{"name":"alfresco/menus/AlfMenuBar","config":{"widgets":[{"name":"alfresco/menus/AlfMenuBarPopup","config":{"id":"DD1","label":"Drop-Down","iconClass":"alf-configure-icon","widgets":[{"name":"alfresco/menus/AlfMenuGroup","config":{"label":"Group 1","widgets":[{"name":"alfresco/menus/AlfMenuItem","config":{"label":"Item 1","iconClass":"alf-user-icon"}},{"name":"alfresco/menus/AlfMenuItem","config":{"label":"Item 2","iconClass":"alf-password-icon"}}]}},{"name":"alfresco/menus/AlfMenuGroup","config":{"label":"Group 2","widgets":[{"name":"alfresco/menus/AlfMenuItem","config":{"label":"Item 3","iconClass":"alf-help-icon"}}]}}]}}]}}]}';
            var testPage = 'http://localhost:8081/share/page/tp/ws/unittest?testdata=';
            return this.remote
                .get(testPage + testData)
                .waitForElementByCssSelector('.alfresco-core-Page.allWidgetsProcessed', 5000)
                .elementById('DD1')
                .clickElement()
                .end()
        }
    });
});
The error I'm getting is this:
/home/dave/ScratchPad/ShareInternTests/node_modules/intern/node_modules/dojo/dojo.js:742
throw new Error('Failed to load module ' + module.mid + ' from ' + url +
^
Error: Failed to load module dojo/text from /home/dave/ScratchPad/ShareInternTests/dojo/text.js (parent: dojo/text!17!*)
at /home/dave/ScratchPad/ShareInternTests/node_modules/intern/node_modules/dojo/dojo.js:742:12
at fs.js:207:20
at Object.oncomplete (fs.js:107:15)
I've tried playing around with the loader/package/map configuration, but without any success. It's not clear (to me at least) from the error message whether it can't find the file I'm passing to dojo/text! (I've tried full as well as relative paths) or the Dojo module itself.
I'd just like to confirm that what I'm attempting is possible, before I spend any more time with this... but obviously any solution or example would be greatly appreciated!!
Many thanks,
Dave
To your specific error: You need to install Dojo for your own project if you want to use it. You are trying to load a module that does not exist. You may also try using the copy that comes with Intern, by loading modules from intern/dojo, but this isn’t recommended if you don’t understand the potential caveats of loading this internal library.
To using dojo/text in a functional test, generally: This is not currently possible unless you use the Geezer branch or explicitly use the Dojo 1 loader, because that module relies on functionality that is only exposed by the Dojo 1 loader when running in Node.js. A different text loader module that is fully generic would work, or you could load intern/dojo/node!fs and load the text yourself. This will be addressed in the future.
I just came across the same issue and for me this worked:
define([
    "dojo/_base/declare",
    "intern/dojo/text!/[PathToText]"
], function (declare, base) {
Seems as if SitePen has included this in the meantime...

How does R3 use the Needs field of a script header? Is there any impact on namespaces?

I'd like to know the behaviour of R3 when processing the Needs field of a script header and what implications for word binding it has.
Background. I'm currently trying to port some R2 scripts to R3 in order to learn R3. In R2 the Needs field of a script header was essentially just documentation, though I made use of it with a custom function to reference scripts that are required to make my script run.
R3 appears to load the Needs-referenced scripts itself, but the binding seems different from DOing the other scripts.
For example when %test-parent.r is:
REBOL [
    title: {test parent}
    needs: [%test-child.r]
]
parent: now
?? parent
?? child
and %test-child is:
REBOL [
    title: {test child}
]
child: now
?? child
R3 Alpha (Saphiron build 22-Feb-2013/11:09:25) returns:
>> do %test-parent.r
Script: "test parent" Version: none Date: none
child: 9-May-2013/22:51:52+10:00
parent: 9-May-2013/22:51:52+10:00
** Script error: child has no value
** Where: get ajoin case ?? catch either either -apply- do
** Near: get :name
I don't understand why test-parent cannot access Child set by %test-child.r
If I remove the Needs field from test-parent.r header and instead insert a line to just DO %test-child.r then there is no error and the script performs as expected.
Ah, you've run into Rebol 3's policy to "do what you say, it can't read your mind". R3's Needs header is part of its module system, so anything you load with Needs is actually imported as a module, even if it isn't declared as such.
Loading scripts with Needs is a quick way to get them treated as modules even if the original author didn't declare them as such. Modules get their own contexts where their words are defined. Loading a script as a module is a great way to use a script that isn't that tidy, one that leaks words into the shared script context. Like your %test-child.r script: it leaks the word child into the script context. What if you didn't want that to happen? Load it with Needs or import and that will clean that right up.
If you want a script treated as a script, use do to run it. Regular scripts use a (mostly) shared context, so when you do a script it has effect on the same context as the script you called it from. That is why the child: now statement affected child in the parent script. Sometimes that's what you want to do, which is why we worked so hard to make scripts work that way in R3.
If you are going to use Needs or import to load your own scripts, you might as well make them modules and export what you want, like this:
REBOL [
    type: module
    title: {test child}
    exports: [child]
]
child: now
?? child
As before, you don't even have to include the type: module if you are going to be using Needs or import anyway, but it would help just in case you run your module with do. R3 assumes that if you declare your module to be a module, that you wrote it to be a module and depend on it working that way even if it's called with do. At the least, declaring a type header is a stronger statement than not declaring a type header at all, so it takes precedence in the conflicting "do what you say" situation.
Look here for more details about how the module system works: How are words bound within a Rebol module?