Is @uifabric/build a private package? - office-ui-fabric

I would like to investigate the possibility of using just-scripts in my project, so for inspiration I took a look at how it's used in office-ui-fabric-react. However, there is not much I can read from its just.config file:
const { preset, just } = require('@uifabric/build');
const { chain, task } = just;
preset();
chain('verify-api-extractor').after('build');
My question is: is all of its build, test, etc. configuration in the @uifabric/build preset() function? And if so, is it a public package?

Author of just-scripts here. I realize that documentation may be a bit lacking at the moment - definitely can use help here!
just-scripts github: https://github.com/microsoft/just
documentation: https://microsoft.github.io/just/
I don't recommend taking @uifabric/build as a dependency in another project. I recommend folks build on top of the just-scripts presets instead:
https://github.com/microsoft/just/tree/master/packages/just-scripts/src/task-presets
These presets are exported by just-scripts, so you can use them as you wish. task() can override already-defined tasks, so feel free to override build, test, etc.
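For illustration, a standalone just.config.js built directly on just-scripts might look something like this. This is only a minimal sketch; it assumes the task, series, tscTask and jestTask helpers exported by just-scripts, so double-check the docs linked above for the exact API:
// just.config.js
const { task, series, tscTask, jestTask } = require('just-scripts');

// Define (or override) the usual tasks for this package.
task('build', tscTask());
task('test', jestTask());

// Compose them: 'ci' runs build and then test.
task('ci', series('build', 'test'));
Because task() can redefine an existing task, a shared preset can supply build/test defaults while an individual package still overrides them in its own just.config.js.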

Related

Dynamically register a language

I'm writing a vscode extension and I'd like to register languages dynamically, based on user configuration. The extension would then instantiate LSP clients to talk to servers derived from user configuration as well.
This would allow for people writing custom and toy languages to get an extension "for free" and experiment with editor features without necessarily having to implement and publish the vscode part of it.
I've dug a bit into the vscode sources and found an interface that seems like it could help, "ILanguageService", but I'm unsure whether this is something that's accessible from the extension API.
Any idea how I could go about it? Is it even possible?
Alright, so my question stemmed from a misunderstanding of how LSP clients work. They don't necessarily need to be tied to a language, and can work on a glob-pattern basis, something like:
const filter: DocumentFilter = {
  scheme: 'file',
  pattern: `**/*.${myLanguage.extension}`
};
const clientOptions: LanguageClientOptions = {
  documentSelector: [filter]
};
This seems to be sufficient for vscode to understand which LSP server it should be calling.
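For completeness, here is a rough sketch of wiring such a filter into a client with the vscode-languageclient package (the import path can vary by package version, and the startClientFor helper and the configuration-derived extension/serverCommand values are made up for illustration):
const { LanguageClient } = require('vscode-languageclient');

// extension and serverCommand would come from the user's configuration.
function startClientFor(extension, serverCommand) {
  const clientOptions = {
    documentSelector: [{ scheme: 'file', pattern: `**/*.${extension}` }]
  };
  const serverOptions = { command: serverCommand };
  const client = new LanguageClient(
    'userDefinedLsp',            // client id
    `LSP for .${extension}`,     // human-readable name
    serverOptions,
    clientOptions
  );
  client.start();
  return client;
}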

Creating Cytoscape.js extensions

Max, I want to update my extension to the new format, but I am running into issues with placement of custom code. It seems that the extension framework has been updated a lot since I added an extension 4 years ago. Is there a way to get better documentation on getting started with adding an extension? I am happy to help write up the documentation if you can help answer some questions that I think would help get people started. Let me know.
The only thing that really changed is that the scaffolder creates a webpack project for you. The extension registering procedure is the same: http://js.cytoscape.org/#extensions/api
For example, cytoscape( 'collection', 'fooBar', function(){ return 'baz'; } ) registers eles.fooBar().
I guess the main thing is that there are a lot more files than what the previous scaffolder generated, so it might be harder to find things. The layout output has lots of files, because it creates a skeleton implementation for both the continuous case and the discrete case.
The scaffolder isn't strictly necessary. You could use another build system (or none at all) as long as you call cytoscape(). For example, if you only care about publishing to npm for people who use webpack/browserify/rollup, then you could just use a CommonJS require('cytoscape') to pull in the peer dependency. Exporting a register function is nice if you want to allow the client to decide the order of extension registrations with cytoscape.use(extension) (or extension(cytoscape)).
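As a rough sketch of that register-function pattern, reusing the fooBar example above (all names are illustrative):
function register(cytoscape) {
  if (!cytoscape) { return; } // nothing to register against

  // Registers eles.fooBar() on collections.
  cytoscape('collection', 'fooBar', function () {
    return 'baz';
  });
}

// Auto-register when cytoscape is available as a global (plain <script> usage).
if (typeof cytoscape !== 'undefined') {
  register(cytoscape);
}

// Expose for CommonJS consumers, who can then call
// cytoscape.use(require('my-extension')) or require('my-extension')(cytoscape)
// to control registration order themselves.
if (typeof module !== 'undefined' && module.exports) {
  module.exports = register;
}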
You're right that there should be some more docs on the output of the scaffolder. Maybe a summary of the files would suffice. We could add a tutorial in the blog later if need be. Both the docs and the blog just use markdown, so the content could go in either place.

Can the Properties (JavaBeans) already be used with Codename One?

I am interested in using Properties in my Codename One project, namely because properties can be observed. I searched and found this blog post, which starts by announcing:
We [Codename One committers] committed properties as a deprecated API
but then the blog post seems to say it could still be in active development; see:
The code below is preliminary and the syntax/classes might change without warning
The code presented in the blog post is not testable in my project. Indeed, the following code does not work:
public class User implements PropertyBusinessObject {
    // Do stuff
}
because the PropertyBusinessObject interface does not exist in my project. How could the PropertyBusinessObject interface be defined, and what should it extend? And by the way, are Properties already available?
Thank you very much for helping me sort this out in my mind!
Sure, they've been available for a while. Just use Update Client Libs in the Codename One settings under the basics section. Once you do that you will have the latest libraries.
When you create a new project in the IDE it uses the libraries it has locally not necessarily the latest.

Go: test internal functions

Suppose I have a type MyType with a private method (mt *MyType) private() in a package mypackage.
I also have a directory tests, where I want to store tests for my package. This is what tests/mypackage_test.go looks like:
package mypackage_test

import (
	"testing"

	"myproj/mypackage"
)

func TestPrivate(t *testing.T) {
	// Some test code
}
However, when I run go test I get the error cannot refer to unexported field or method mypackage.(*MyType)."".private. I've googled a bit and found out that functions starting with a lower-case letter cannot be seen outside their own package (and this seems to be true, because upper-case functions are freely callable from the tests).
I also read somewhere that naming the test file <...>_internal_test.go could solve my problem, like this (tests/mypackage_internal_test.go):
package mypackage

import (
	"testing"
)

func TestPrivate(t *testing.T) {
	mt := &MyType{}
	// Some test code
}
But with this I only get undefined: MyType. So, my question: how can I test internal/private methods?
Why do you place your tests in a different package? The Go testing mechanism uses a _test suffix for test files, so you can place tests in the same package as the actual code, avoiding the problem you describe. Placing tests in a separate package is not idiomatic Go. Don't try to fight the Go conventions; it's not worth the effort and you are mostly going to lose the fight.
Go insists that files in the same folder belong to the same package, that is except for _test.go files. Moving your test code out of the package allows you to write tests as though you were a real user of the package. You cannot fiddle around with the internals, instead you focus on the exposed interface and are always thinking about any noise that you might be adding to your API.
And:
If you do need to unit test some internals, create another file with _internal_test.go as the suffix. Internal tests will necessarily be more brittle than your interface tests — but they’re a great way to ensure internal components are behaving, and are especially useful if you do test-driven development.
Source: https://medium.com/@matryer/5-simple-tips-and-tricks-for-writing-unit-tests-in-golang-619653f90742
There are different opinions on how you should structure your tests within a Go project, and I suggest you read the blog post above.

Documenting Node.js projects [closed]

Closed. This question is opinion-based and is not currently accepting answers. (Closed 5 years ago.)
I'm currently using JSDoc Toolkit to document my code, but it doesn't quite fit; namely, it seems to struggle with describing namespaces properly. Say you have two simple classes, each in its own file:
lib/database/foo.js:
/** @class */
function Foo(...) {...}

/** @function ... */
Foo.prototype.init = function (..., cb) { return cb(null, ...); };

module.exports = Foo;
And then something that inherits from it, lib/database/bar.js:
var util = require('util');
var Foo = require('./foo');

/**
 * @class
 * @augments Foo
 */
function Bar(...) {...}
util.inherits(Bar, Foo);

Bar.prototype.moreInit = function (..., cb) { return cb(null, ...); };
In the generated documentation, this is output simply as Foo and Bar, without the leading database (or lib.database), which are quite necessary when you don't have everything in a global scope.
I've tried throwing @namespace database and @name database.Foo at it, but it doesn't turn out nice.
Any ideas for making JSDoc output something more suitable, or some entirely different tool that works better with Node.js? (I looked briefly at Natural Docs, JSDuck and breezed over quite a few others that looked quite obsolete...)
JSDoc is a port of JavaDoc, so the documentation basically assumes classical OOP, and that's not well suited to JavaScript.
Personally I would recommend using docco to annotate your source code. Examples of it can be found for underscore, backbone, docco.
A good alternative to docco is groc
As for actual API documentation, I personally find that auto-generated documentation from comments just does not work for JavaScript, and I recommend hand-writing your API documentation.
Examples would be the underscore API, the Express API, the Node.js API, and the socket.io docs.
Similar Stack Overflow questions
Generating Javascript documentation
YUIDoc is a Node.js application that generates API documentation from comments in source, using a syntax similar to tools like Javadoc and Doxygen. YUIDoc provides:
Live previews. YUIDoc includes a standalone doc server, making it trivial to preview your docs as you write.
Modern markup. YUIDoc's generated documentation is an attractive, functional web application with real URLs and graceful fallbacks for spiders and other agents that can't run JavaScript.
Wide language support. YUIDoc was originally designed for the YUI project, but it is not tied to any particular library or programming language. You can use it with any language that supports /* */ comment blocks.
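As a quick illustration, YUIDoc comments for something like the Foo class from the question might look roughly like this (just a sketch; the names mirror the question, and the exact tag usage should be checked against the YUIDoc docs):
/**
 * @module database
 */

/**
 * Data-access wrapper.
 *
 * @class Foo
 * @constructor
 */
function Foo() {}

/**
 * Initialises the object and calls back when done.
 *
 * @method init
 * @param {Function} cb Node-style callback (err, result)
 */
Foo.prototype.init = function (cb) {
  return cb(null, this);
};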
NOTE: Dox no longer outputs HTML, but a blob of JSON describing the parsed code. This means the code below doesn't work terribly well any more...
We ended up using Dox for now. It is a lot like docco, which Raynos mentions, but it throws all of it into one big HTML file for output.
We hacked this into our makefiles:
JS_FILES := $(shell find lib/ -type f -name \*.js | grep -v 3rdparty)

# Add node_modules/*/bin/ to PATH:
# Ugly 'subst' hack: check the Make manual, section 8.1 - Function Call Syntax
NPM_BINS := $(subst bin node,bin:node,$(shell find node_modules/ -name bin -type d))
ifneq ($(NPM_BINS),)
PATH := ${NPM_BINS}:${PATH}
endif

.PHONY: doc lint test

doc: doc/index.html

doc/index.html: $(JS_FILES)
	@mkdir -p doc
	dox --title "Project Name" $^ > $@
It is not the prettiest or most efficient documentation ever made (and dox has quite a few minor bugs), but I find it works rather well, at least for minor projects.
Sorry, I was not on StackExchange a year ago, but I believe the answer to your original question is to use the @memberOf tag:
/** @namespace */
database = {};

/**
 * @class
 * @memberOf database
 */
function Foo() { ... };
http://code.google.com/p/jsdoc-toolkit/wiki/TagMemberOf
This tag may or may not have existed when you asked your question.
Found a really nice solution for the problem: doxx.
It uses dox, as mentioned above, and converts the output to nice HTML afterwards. It's easy to use and worked great for me.
https://github.com/FGRibreau/doxx
I work with JSDoc and it is very efficient as well as easy to use, but development gets quite complicated when projects have many alternate libraries. I found Groc, a very good tool based on Docco that works with other languages like Python, Ruby, C++, among others...
Furthermore, Groc works with Markdown on GitHub, which can be much more efficient when working with git as version control. It also helps assemble pages for publishing on GitHub.
You can also use the GruntJS task runner through grunt-groc. For example:
Install the package:
npm install grunt-groc --save-dev
Configure it in your task file:
grunt.loadNpmTasks('grunt-groc');
And the task config:
// Project configuration.
grunt.initConfig({
  groc: {
    coffeescript: [
      "coffee/*.coffee", "README.md"
    ],
    options: {
      "out": "doc/"
    }
  }
});
To run the task:
grunt.registerTask('doc', ['groc']);
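With that registered, running grunt doc from the project root should regenerate the documentation into doc/.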