What's a good pattern for writing a Module and Class that augments Middleman? - middleman

I'm using Middleman 3.4.0, and in customizing my project I have some methods that are becoming too extensive to keep as helpers in config.rb. I'd like to write either a Module or a Class and mix these back in as helpers, with the ability to model the objects I need to work with. Is there a pattern for writing this as a Middleman extension somewhere that I can start from?

Use the middleman extension <extension name> command, which gives you a good boilerplate to start from. The generated skeleton subclasses Middleman::Extension and registers itself with Middleman::Extensions.register; a helpers block in that class is the natural place to move the methods that have outgrown config.rb. See the Middleman documentation on custom extensions for the details of the pattern.

Related

ArchUnit to test actual layered architecture

Currently in our project we have a layered architecture implemented in the following way, where the Controller, Service and Repository classes are placed in the same package for each feature, for instance:
feature1:
Feature1Controller
Feature1Service
Feature1Repository
feature2:
Feature2Controller
Feature2Service
Feature2Repository
I've found the following example of an ArchUnit test where such classes are placed in dedicated packages: https://github.com/TNG/ArchUnit-Examples/blob/master/example-junit5/src/test/java/com/tngtech/archunit/exampletest/junit5/LayeredArchitectureTest.java
Is it possible to test a layered architecture when all layers live in a single package?
If the naming conventions are followed consistently across your project, how about writing custom rules instead of using layeredArchitecture()? For example:
classes().that().haveSimpleNameEndingWith("Service")
.should().onlyBeAccessed().byClassesThat().haveSimpleNameEndingWith("Controller")
noClasses().that().haveSimpleNameEndingWith("Service")
.should().accessClassesThat().haveSimpleNameEndingWith("Controller")
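If you want to run these as-is, a minimal sketch of a complete test using the ArchUnit JUnit 5 support could look like this (the package com.myapp and the class/field names are placeholders for your own project):

import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.classes;
import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

import com.tngtech.archunit.junit.AnalyzeClasses;
import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.ArchRule;

// Placeholder root package; adjust to your project.
@AnalyzeClasses(packages = "com.myapp")
class NamingConventionLayerTest {

    @ArchTest
    static final ArchRule services_are_only_accessed_by_controllers =
            classes().that().haveSimpleNameEndingWith("Service")
                    .should().onlyBeAccessed().byClassesThat().haveSimpleNameEndingWith("Controller");

    @ArchTest
    static final ArchRule services_do_not_access_controllers =
            noClasses().that().haveSimpleNameEndingWith("Service")
                    .should().accessClassesThat().haveSimpleNameEndingWith("Controller");
}

Note that, as written, the first rule also flags a service calling another service; whether that is acceptable depends on your conventions.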
I know this question is rather old, but for the record: this has been possible for a while using predicates to define the layers, e.g.
layeredArchitecture().consideringAllDependencies()
.layer("Controllers").definedBy(HasName.Predicates.nameEndingWith("Controller"))
.layer("Services").definedBy(HasName.Predicates.nameEndingWith("Service"))
.layer("Repository").definedBy(HasName.Predicates.nameEndingWith("Repository"))
.whereLayer("Controllers").mayNotBeAccessedByAnyLayer()
.whereLayer("Services").mayOnlyBeAccessedByLayers("Controllers")
.whereLayer("Repository").mayOnlyBeAccessedByLayers("Services")
However, I'm not sure how well this works in practice, because usually you don't just have classes that follow this naming pattern and nothing else. A service might also take some POJO as a method parameter type (e.g. MyInput), and that POJO should perhaps not be used by repositories either. Also, with forward dependency rules (mayOnlyAccessLayers(..)) this can then cause unwanted violations.
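For completeness, here is roughly what actually evaluating such a rule can look like (a sketch; com.myapp is again a placeholder for your root package, and it assumes the ArchUnit and JUnit 5 dependencies are on the test classpath):

import static com.tngtech.archunit.library.Architectures.layeredArchitecture;

import com.tngtech.archunit.core.domain.properties.HasName;
import com.tngtech.archunit.core.importer.ClassFileImporter;
import org.junit.jupiter.api.Test;

class PredicateLayeredArchitectureTest {

    @Test
    void layers_follow_naming_conventions() {
        layeredArchitecture().consideringAllDependencies()
                .layer("Controllers").definedBy(HasName.Predicates.nameEndingWith("Controller"))
                .layer("Services").definedBy(HasName.Predicates.nameEndingWith("Service"))
                .layer("Repository").definedBy(HasName.Predicates.nameEndingWith("Repository"))
                .whereLayer("Controllers").mayNotBeAccessedByAnyLayer()
                .whereLayer("Services").mayOnlyBeAccessedByLayers("Controllers")
                .whereLayer("Repository").mayOnlyBeAccessedByLayers("Services")
                .check(new ClassFileImporter().importPackages("com.myapp"));
    }
}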

How to set custom name, suffix for mapper files and interfaces in mybatis generator?

Can you set a custom suffix and naming rule for the mapper XML files and interfaces in MyBatis Generator (MBG)?
For example, when generating mapper files for the class Book, MBG generates the mapper file BookMapper.xml and the interface BookMapper.java. However, I wish to change the suffix to something else, like BookMapperBase.xml or BookDaoBase.xml for the XML file, and BookMapperBase.java or BookDaoBase.java for the interface.
The reason is that former colleagues were using BookMapper.xml for their hand-written SQL statements, and using the same name would cause confusion. Moreover, I do not wish to use the generated mappers directly, but to use custom mapper files that extend BookMapperBase.xml.
I have searched online and found some GitHub projects and the HotRod ORM, but is it really not supported by the official MyBatis Generator? If not, what is your recommended alternative?
There are a couple of options.
You could use a domain object renaming rule as documented here: http://www.mybatis.org/generator/configreference/domainObjectRenamingRule.html
If that doesn't work the way you want it to, you could write a MyBatis Generator plugin to change the names of the generated artifacts. There is an example here: https://github.com/mybatis/generator/blob/master/core/mybatis-generator-core/src/main/java/org/mybatis/generator/plugins/RenameExampleClassPlugin.java
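If you go the plugin route, a rough sketch could look like the one below. Treat it as an assumption-laden example: the class name and the "Base" suffix are invented, and the IntrospectedTable getters/setters used here (getMyBatis3JavaMapperType, getMyBatis3XmlMapperFileName and their setters) should be double-checked against the MBG version you are using.

import java.util.List;

import org.mybatis.generator.api.IntrospectedTable;
import org.mybatis.generator.api.PluginAdapter;

// Hypothetical plugin: appends "Base" to the generated client interface and XML file,
// e.g. BookMapper.java -> BookMapperBase.java and BookMapper.xml -> BookMapperBase.xml.
public class RenameMapperSuffixPlugin extends PluginAdapter {

    @Override
    public boolean validate(List<String> warnings) {
        return true;
    }

    @Override
    public void initialized(IntrospectedTable introspectedTable) {
        // Fully qualified interface name, e.g. com.example.mapper.BookMapper
        String mapperType = introspectedTable.getMyBatis3JavaMapperType();
        if (mapperType != null) {
            introspectedTable.setMyBatis3JavaMapperType(mapperType + "Base");
        }

        // XML mapper file name, e.g. BookMapper.xml
        String xmlName = introspectedTable.getMyBatis3XmlMapperFileName();
        if (xmlName != null && xmlName.endsWith(".xml")) {
            introspectedTable.setMyBatis3XmlMapperFileName(
                    xmlName.substring(0, xmlName.length() - 4) + "Base.xml");
        }
    }
}

The plugin would then be registered in generatorConfig.xml via a <plugin> element whose type attribute points at this class.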

Make IntelliJ aware of links to Java elements in XML files

I have a custom XML format that links to Java resources. For the sake of simplicity let's assume my XML file would look like this:
<root>
<java-class>my.fully.qualified.class.name</java-class>
</root>
Eventually my references will be somewhat more complicated: they will not contain the fully qualified class name directly, and I will need some logic to resolve the correct class, but I want to keep the example as simple as possible here.
Now I want to be able to Ctrl+Click on the element's text and have IntelliJ carry me to the .java file, just like it is possible in Spring XML files. The IDEA Plugin Development FAQ has a link called "How do I add custom references to Java elements in XML files?", which sounds like exactly what I need. Unfortunately it links to a discussion where someone is more or less done implementing something like this and is having some minor problems. Nevertheless, I understood that I probably need to write an implementation of the interface com.intellij.psi.PsiReference. Googling for "PsiReference" together with "IntelliJ" or "IDEA" unfortunately did not bring up any tutorials on how to use it, but I found the class XmlValueReference, which sounds useful. Yet again, googling for "XmlValueReference" did not turn up anything useful on how to use the class. At least the PSI Cookbook tells me that I can find the Java class by using JavaPsiFacade.findClass(). I'd be thankful for any tutorials, hints and the like that show the correct usage.
The discussion linked above mentions that I need to call registry.registerReferenceProvider(XmlTag.class, provider) in order to register my provider once I eventually manage to implement it, but what type is "registry", and where do I get it from?
First of all, here's a nice tutorial that came up a few days ago, which explains the basics of IntelliJ plugin development (you should take a look at the section Reference Contributor).
You will likely have to define your own PsiReferenceContributor, which will be referenced in your plugin.xml like this:
<psi.referenceContributor implementation="com.yourplugin.YourReferenceContributor"/>
In your reference contributor there is a method registerReferenceProviders(PsiReferenceRegistrar); that registrar is the "registry" you were looking for, and you register your provider on it, e.g. registrar.registerReferenceProvider(XmlPatterns.xmlTag(), provider) (the current API takes an element pattern rather than XmlTag.class).
Finally, in your instance of PsiReferenceProvider, you will have to test the tag name to filter out tags which don't contain class references, then find the right Java class using JavaPsiFacade.findClass().
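Putting those pieces together, a rough sketch of such a contributor might look like the following (assumptions: the tag is literally named java-class as in the example above, the class name is invented, and the exact pattern helpers can differ between IntelliJ Platform versions):

import com.intellij.patterns.XmlPatterns;
import com.intellij.psi.JavaPsiFacade;
import com.intellij.psi.PsiElement;
import com.intellij.psi.PsiReference;
import com.intellij.psi.PsiReferenceBase;
import com.intellij.psi.PsiReferenceContributor;
import com.intellij.psi.PsiReferenceProvider;
import com.intellij.psi.PsiReferenceRegistrar;
import com.intellij.psi.search.GlobalSearchScope;
import com.intellij.psi.xml.XmlTag;
import com.intellij.util.ProcessingContext;
import org.jetbrains.annotations.NotNull;

public class JavaClassTagReferenceContributor extends PsiReferenceContributor {

    @Override
    public void registerReferenceProviders(@NotNull PsiReferenceRegistrar registrar) {
        // Only contribute references for <java-class> tags.
        registrar.registerReferenceProvider(
                XmlPatterns.xmlTag().withLocalName("java-class"),
                new PsiReferenceProvider() {
                    @Override
                    public PsiReference[] getReferencesByElement(@NotNull PsiElement element,
                                                                 @NotNull ProcessingContext context) {
                        final XmlTag tag = (XmlTag) element;
                        final String fqn = tag.getValue().getTrimmedText();
                        if (fqn.isEmpty()) {
                            return PsiReference.EMPTY_ARRAY;
                        }
                        return new PsiReference[]{new PsiReferenceBase<XmlTag>(tag) {
                            @Override
                            public PsiElement resolve() {
                                // Resolve the tag's text content to a Java class, if one exists.
                                return JavaPsiFacade.getInstance(tag.getProject())
                                        .findClass(fqn, GlobalSearchScope.allScope(tag.getProject()));
                            }

                            @Override
                            public Object[] getVariants() {
                                // No completion variants in this sketch.
                                return new Object[0];
                            }
                        }};
                    }
                });
    }
}

For nicer highlighting you would probably pass an explicit TextRange (covering just the class name inside the tag) to the PsiReferenceBase constructor, but the sketch keeps the default.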
From my experience, the best place to get help regarding IntelliJ plugin development is JetBrains' forums.

Generating "user" and "developer" documentation from the same codebase using Doxygen

I'm new to Doxygen and I'm trying to document an API I am planning to open source. I'd really like to build two sets of documentation, one for end users of the API and one for those who intend to modify it. Is there a way to tag Doxygen comment blocks in a way such that I can generate "user" and "dev" documentation trees? Is there a better solution to my problem? Thanks!
Depending on how your code is structured, you might be able to get away with using two Doxygen config files each including separate source files. The "user" config file would only list the source files containing the public interface to the API, while the "dev" config file would list all source files for the whole project.
This does mean that all your interfaces (e.g. abstract base classes) will need to be documented with the user in mind, but that is usually not a problem as by definition there is unlikely to be any implementation details in an abstract base class.
All your "dev" documentation then sits in the actual classes implementing the interfaces, which are never seen by the API and can be safely omitted by the "user" Doxygen config file.
Of course if your code isn't structured this way it's not going to work, so the only solution I can think of is to fill your comments with a bunch of conditional statements.
In addition to what Malvineous already said, there is the \internal doxygen command.
\internal lets you hide or show part of the documentation by changing INTERNAL_DOCS in the Doxyfile.
More information here: http://www.doxygen.nl/manual/commands.html#cmdinternal
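For example, a minimal sketch (written here as a Javadoc-style comment in Java purely for illustration; the same command works in the other languages Doxygen supports, and the class name is made up):

/** Example class; the name is invented for illustration. */
public class InputParser {

    /**
     * Parses the given input. This sentence appears in both documentation sets.
     *
     * \internal
     * Maintainer-only notes go here. Everything from \internal to the end of
     * the comment (or to a matching \endinternal) is only emitted when the
     * Doxyfile sets INTERNAL_DOCS = YES, so the "dev" config enables it and
     * the "user" config leaves it at NO.
     */
    public String parse(String input) {
        return input == null ? "" : input.trim();
    }
}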

How to provide specific GWT implementations

Suppose I am working on exposing some of my server-side classes to a GWT application, but certain parts could be done much better using GWT-specific components (like JSNI, for instance).
What are some techniques for doing so without being too hacky?
For instance, I am aware of using a subpackage and the <super-source/> tag, but this requires the package names to be different, which causes Eclipse to complain. The general solution in the community is to tell Eclipse to use that directory as a source folder, but then Eclipse complains about there being two classes with the same name.
Ideally, there would just be a way to keep everything in a single source tree, and actually have different classes which apply the alternate implementations. This would feel like a more OO approach.
I would like to be able to add a suffix like _gwt to a class to accomplish this automatically, and I know I could write a script to do this kind of transformation, but that is a kludge for sure.
I've been considering using Google's GIN/GUICE libraries for my projects in general, and I think there might be some kind of a solution there, but I am not sure as I have not thoroughly investigated it.
What are some solutions you have tried in the past on GWT projects?
The easiest way to have split implementations is to use super-source code, but only enough to instantiate a uniquely-named instance or dispatch to a different method. Ideally, the super-source implementation is just a few lines long, and not so bad that you can't roll it by hand.
To work around the Eclipse / javac double-mapping and package name issues, the GWT source uses two top-level roots for user code: user/src and user/super. For example, the AutoBeans package has a split-implementation of JSON quoting and evaluation, one for the JVM and one for the browser.
There's really no non-kludgy way to implement super-source, as this is a feature way outside what you can specify in the language. There's nothing that lets you say "use this implementation in this environment" without the use of some external tool.
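As a sketch of that dispatch pattern (all names and paths below are invented): keep one small class with the identical package and class name in both trees, the JVM version under the normal source root and the browser version under the folder referenced by <super-source/> in the module XML, so the GWT compiler shadows the JVM copy when compiling to JavaScript.

// JVM version, under the normal source root, e.g. src/com/example/shared/JsonQuoter.java
package com.example.shared;

public final class JsonQuoter {
    private JsonQuoter() {}

    // Simplified server-side implementation (real code would escape control characters too).
    public static String quote(String value) {
        StringBuilder sb = new StringBuilder("\"");
        for (int i = 0; i < value.length(); i++) {
            char c = value.charAt(i);
            if (c == '"' || c == '\\') {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.append('"').toString();
    }
}

// Browser version with the same package and class name, placed under the folder named
// by <super-source path="super"/> in the .gwt.xml; the full package path is repeated
// beneath it, e.g. super/com/example/shared/JsonQuoter.java. Only the GWT compiler
// sees this copy; Eclipse/javac keep compiling the JVM one above.
package com.example.shared;

public final class JsonQuoter {
    private JsonQuoter() {}

    public static native String quote(String value) /*-{
        return $wnd.JSON.stringify(value);
    }-*/;
}

Calling code just uses JsonQuoter.quote(...) and never needs to know which implementation it got.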