Let menhir use external modules via dune

In this project, I would like to add a folder extra containing try.ml:
(* try.ml *)
let x = 5
Then, in parser.mly, I want to use the module Try. So I need to open it in the header:
%{
open AST
open Try
%}
At the moment, my dune file is as follows:
(ocamllex lexer)
(menhir
 (modules parser)
 (flags --explain --inspection --table --dump))
(executable
 (name parse)
 (libraries menhirLib))
Does anyone know how to modify the dune file such that it takes Try into account?
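One way to do this (a sketch, assuming extra/ is a subdirectory of the directory that holds this dune file) is to let dune pick up sources from subdirectories, so try.ml is compiled as part of the same executable and Try is visible to the generated parser:

(include_subdirs unqualified) ; compile extra/try.ml as part of this directory's stanzas

(ocamllex lexer)

(menhir
 (modules parser)
 (flags --explain --inspection --table --dump))

(executable
 (name parse)
 (libraries menhirLib))

Alternatively, extra/ can get a dune file of its own with a (library (name extra) (wrapped false)) stanza, and that library can then be added to the executable's libraries field.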

Using extended classes in gst (GNU smalltalk)?

This is a bit of a follow-up question to this one.
Say I've managed to extend the Integer class with a new method 'square'. Now I want to use it.
Calling the new method from within the file is easy:
Integer extend [
    square [
        | r |
        r := self * self.
        ^r
    ]
]
x := 5 square.
x printNl.
Here, I can just run $ gst myprogram.st in bash and it'll print 25. But what if I want to use the method from inside the GNU Smalltalk application? Like this:
$ gst
st> 5 square
25
st>
This may have to do with images, I'm not sure. This tutorial says I can edit the ~/.st/kernel/Builtins.st file to change which files are loaded into the kernel, but I have no such file.
I would not edit what's loaded into the kernel. To elaborate on my comment, one way of loading previously created files into the environment for GNU Smalltalk, outside of using image files, is to use packages.
A sample package.xml file, which defines the package per the documentation, would look like:
<package>
    <name>MyPackage</name>
    <!-- Include any prerequisite packages here, if you need them -->
    <prereq>PrerequisitePackageName</prereq>
    <filein>Foo.st</filein>
    <filein>Bar.st</filein>
</package>
A sample Makefile for building the package might look like:
# MyPackage makefile
#
PACKAGE_DIR = ~/.st
PACKAGE_SPEC = package.xml
PACKAGE_FILE = $(PACKAGE_DIR)/MyPackage.star
PACKAGE_SRC = \
Foo.st \
Bar.st
$(PACKAGE_FILE): $(PACKAGE_SRC) $(PACKAGE_SPEC)
	gst-package -t ~/.st $(PACKAGE_SPEC)
With the above files in your working directory, alongside Foo.st and Bar.st, you can run make and it will build the .star package file and put it in ~/.st (the first place gst looks for packages). When you run your environment, you can then use PackageLoader to load it in:
$ gst
GNU Smalltalk ready
st> PackageLoader fileInPackage: 'MyPackage'
Loading package PrerequisitePackage
Loading package MyPackage
PackageLoader
st>
Then you're ready to rock and roll... :)

Could not import module frege.system.Directory (java.lang.ClassNotFoundException: frege.system.Directory)

I tried to import System.Directory in my Frege program (in Eclipse) in order to use functions such as getDirectoryContent, etc., and it gives me this error:
Could not import module frege.system.Directory (java.lang.ClassNotFoundException: frege.system.Directory)
What do I have to do?
That is because the module frege.system.Directory doesn't exist in Frege. A good way to find out about a module is to use Hoogle for Frege at http://hoogle.haskell.org:8081. If we search for that module there, no module is listed, whereas a search for, say, frege.data.List does show the module in the results.
As for the functions you need, like getDirectoryContent: if you look at the search results for frege.system.Directory, the first result is about processes and the third and fourth are about jars and zip files. If you click on the second result, it opens the module frege.java.IO, where you can see some relevant functions that might be useful to you (list, for example). The Haskell module you are looking for has not yet been ported to Frege, but it should, of course, be possible to port it backed by native Java implementations.
Update for OP's comment
Here is a simple snippet to return the files under a given directory:
ls :: String -> IO [String]
ls dir = do
    contents <- File.new dir >>= _.list
    maybe (return []) (JArray.fold (flip (:)) []) contents
Regarding createTempFile, the following works for me:
frege> File.createTempFile "test.txt"
String -> STMutable RealWorld File

Use collections with cargo without stdlib

I am currently trying to set up an embedded Rust project. For that, it would be nice if I could use the collections crate (and, by extension, the alloc crate, since it is required by collections). Is there an easy way to achieve this? I currently have the following dependencies in Cargo.toml:
[build-dependencies]
gcc = "0.3"
[dependencies]
rust-libcore = "*"
[dependencies.rlibc]
git = "https://github.com/hackndev/rlibc"
branch = "zinc"
And use them as follows:
#![no_std]
#![crate_type="staticlib"]
#![feature(lang_items)]
#![feature(start)]
// This is not found when building with Cargo
extern crate collections;
//#[cfg(target_os = "none")]
extern crate rlibc;
#[start]
pub fn main(_argc: isize, _argv: *const *const u8) -> isize {
    // or some call like this
    core::collections::Vec::new();
    0
}
Is there an easy way to include the collections crate?
One possible solution is to compile it yourself. This requires having a checkout of the Rust source. I don't have a working environment to test this in, so take this suggestion with a pinch of salt. Conceptually, you would do something like this:
cd $RUST_SRC_DIR
rustc --version --verbose | grep commit-hash # Grab the hash
git checkout $RUSTC_HASH
mkdir cross-compiled-libraries
rustc --target=arm-whatever-whatever -O src/libcollections/lib.rs \
--out-dir cross-compiled-libraries
Repeat the last step for whatever libraries you need. A lot of this is taken from the ideas in Embedded Rust Right Now!.
A big concern with this solution is that libcollections requires an allocator. Generally, there is jemalloc or the system allocator. I don't know if either is available on the target you are compiling for...
This doesn't quite get you all the way to something easy to use with Cargo, either; the libraries inside the Rust source tree aren't actually Cargo-ified yet. You could create a new Cargo project and add something like this to its Cargo.toml:
[lib]
path = "/path/to/rust/src/libcollections/lib.rs"
Which would then allow you to rely on Cargo more.
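If that works, the crate root from the question could then pull the cross-compiled crate in explicitly. A rough sketch for the nightly toolchains of that era; the exact feature gates are an assumption and varied between releases:

#![no_std]
#![feature(collections)]   // gate for the unstable collections crate at the time (assumption)

extern crate collections;  // resolved from the cross-compiled --out-dir or the Cargo [lib] path above

use collections::vec::Vec;

// Building a Vec still needs an allocator and the usual no_std lang
// items at link time, as noted above.
fn collect_bytes() -> Vec<u8> {
    let mut v = Vec::new();
    v.push(1u8);
    v
}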

Can Adobe .jsx scripts include other script files?

We're writing a bunch of .jsx scripts and in each I have to mock out some functions so I can use things like Array.map() and String.trim(), but I don't want to have to include that code at the top of every script.
Is there a way to "include" other .jsx scripts inside of a .jsx script file?
You can simply use the #include and #includepath preprocessor directives at the top of your script.
You can find a detailed description in Adobe's "JavaScript Tools Guide".
For example, if you want to include scripts/helpers.jsx in a .jsx file:
#include "scripts/helpers.jsx"
// the rest of your code below ...
Just leaving this here for anyone like me who is looking for this. In Adobe Illustrator CC 2015, I was unable to get #include to work, but //@include did. So for example:
External File: foo.jsx
function alertTheWordNo(){
    alert('NO');
}
The Script File: bar.jsx
//@include 'foo.jsx';
alertTheWordNo();
Disclaimer: I cannot find any documentation of this but have tested it with Adobe Illustrator CC 2015 and it works.
Hope this helps someone. If anyone has any questions just ask!
We're now using the $ helper available in Illustrator, and the $.evalFile() method. Pass it a path and it will evaluate the file and return the result.
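In its simplest form (a sketch; the helpers.jsx file name is just an illustration), that looks like:

// evaluate another script that sits next to the currently running script
var scriptFolder = new File($.fileName).parent.fsName; // folder containing the running .jsx
var result = $.evalFile(scriptFolder + "/helpers.jsx"); // returns the last value the file evaluates to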
I created a little helper that I can include (minified, of course) at the top of my .jsx scripts, so I can call Libraries.include("my-other-script"); in my case that includes a file from a lib directory inside my adobe_scripts root folder.
// indexOf polyfill from https://gist.github.com/atk/1034425
[].indexOf||(Array.prototype.indexOf=function(a,b,c){for(c=this.length,b=(c+~~b)%c;b<c&&(!(b in this)||this[b]!==a);b++);return b^c?b:-1;});
var Libraries = (function (libPath) {
    return {
        include: function (path) {
            if (!path.match(/\.jsx$/i)) {
                path = path + ".jsx";
            }
            return $.evalFile(libPath + path);
        }
    };
})($.fileName.split("/").splice(0, $.fileName.split("/").indexOf("adobe_scripts") + 1).join("/") + "/lib/");
Minified version that I include:
/**
* Libraries.include(path) -- path must be relative to adobe_scripts/lib/
* See: https://gist.github.com/jasonrhodes/5286526
*/
[].indexOf||(Array.prototype.indexOf=function(a,b,c){for(c=this.length,b=(c+~~b)%c;b<c&&(!(b in this)||this[b]!==a);b++);return b^c?b:-1;});var Libraries=function(a){return{include:function(b){return b.match(/\.jsx$/i)||(b+=".jsx"),$.evalFile(a+b)}}}($.fileName.split("/").splice(0,$.fileName.split("/").indexOf("adobe_scripts")+1).join("/")+"/lib/");
See gist here: https://gist.github.com/jasonrhodes/5286526
Just wanted to add a note to Ike10's answer. Undocumented is generous - this is the worst "documentation" I've ever come across in 20+ years of writing code. It seems to me that you must also add the CEFCommandLine argument to your manifest.xml file before the primary JSX file will load/eval external files:
<Resources>
    <MainPath>./whatever.html</MainPath>
    <ScriptPath>./jsx/whatever.jsx</ScriptPath>
    <CEFCommandLine>
        <Parameter>--allow-file-access</Parameter>
        <Parameter>--allow-file-access-from-files</Parameter>
    </CEFCommandLine>
</Resources>

C, Objective-C preprocessor output

Is there a way to get pre-processed C/Objective-C code? I have some files I acquired and would like to see the code produced by some #defines.
From within Xcode:
Xcode 3: Select the file, then Build → Preprocess.
Xcode 4: Select the file, then Product → Generate Output → Generate Preprocessed File.
On the command line, gcc -E foo.m will show you the preprocessed output (just as it does for normal C/C++ files). Of course, this will also expand any #include or #import statements you may have in your code.
Use the -E command-line argument to gcc or clang. This is documented as: “Preprocess only; do not compile, assemble or link” and indeed it outputs the preprocessed version to stdout.
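For example, to keep the result in a file instead of letting it scroll past on stdout (the file names are just placeholders):

# write the fully preprocessed translation unit to foo.i
clang -E foo.m -o foo.i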
In Xcode 5: Select the .m file, then Product → Perform Action → Preprocess ".m"