StringTemplate conditional statements can't support expressions like id>0?

I have recently been using StringTemplate, and I noticed that it doesn't support complex conditionals such as value>1 or value="menu". Can anyone advise me on how to work around this? Thanks.

StringTemplate deliberately doesn't support complex conditionals. The thinking is that if you need that much logic, you're breaking its strict enforcement of model-view separation. Instead, pass in a template attribute that represents the complex condition, or expose a method on your model that computes the correct result.
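For example, here is a minimal ST4/Java sketch of that idea; the attribute name isMenu and the template text are made up for illustration:
import org.stringtemplate.v4.ST;

public class MenuDemo {
    public static void main(String[] args) {
        // The template only tests a boolean attribute; the value="menu"
        // comparison happens in the controller, not in the template.
        ST st = new ST("<if(isMenu)>render the menu<else>render something else<endif>");
        String value = "menu";                  // hypothetical model value
        st.add("isMenu", "menu".equals(value)); // pass the precomputed condition in
        System.out.println(st.render());
    }
}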

Related

When we use LINQ, do we ever need to use From?

I like to use the Where keyword; for example, I simply do it like this:
Dim orderFromThisGridTrading = _orders1.Where(Function(x) x.customIdentifier = ASimpleGridTradeSettings.name).ToArray
However, most samples say I have to use From:
Dim customersForRegion = From cust In customers Where cust.Region = region
That seems weird and doesn't follow the normal VB.NET format. It looks a lot like SQL and not like a real programming language.
I wonder if I can always avoid using From and just write queries the way I do above. Are there any cases where that is not possible?
Is this even a good idea?
There is nothing wrong with using query syntax in general. In VB.NET especially it is very powerful and supports more LINQ methods than C#'s query syntax does. Many people prefer it because it can make code more readable, but the opposite seems to be true for you: you don't like it. Then use method syntax, which supports all methods. Personally, I dislike method syntax in VB.NET because of the ugly, verbose Function keyword, so with GroupBy or Join in particular I prefer query syntax.
I wonder if I can always avoid using From and just write queries the way I do above. Are there any cases where that is not possible?
No, method syntax is always possible. In fact, query syntax gets compiled to method syntax.
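For instance, these two lines end up as the same call chain after compilation (reusing the customers and region names from the question):
' Query comprehension syntax
Dim viaQuery = From cust In customers Where cust.Region = region
' The equivalent method syntax the compiler turns it into
Dim viaMethods = customers.Where(Function(cust) cust.Region = region)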
The second example is called "query comprehension syntax". It's optional. A lot of people find it makes their code more readable, but not everyone likes it. Personally, I find it adds an extra layer of indirection to what you need to know to get the computer to do what you want.
In the first example, the From is implied by the initial IEnumerable or IQueryable item in the expression.
But I do have one issue:
SQL is definitely a real programming language, and you'll likely get better results from learning it well than from relying on an ORM to construct queries. Eventually, usually sooner rather than later, you'll find you want to do something that requires advanced SQL anyway. Then you have two problems, because you need to know the SQL and you need to know how to construct the ORM syntax that produces that SQL.

ANTLR4: Extracting expressions from languages

I have a programming language that has many constructs in it; however, I am only interested in extracting expressions from the language.
Is that possible to do without having to write the entire grammar?
Yes, it's possible. You want what is called an "island parser": https://en.wikipedia.org/wiki/Island_grammar. You might not actually decide to do this, though; more on that below.
The essential idea is to provide detailed grammar rules for the part of the language ("islands") you care about, and sloppy rules for the rest ("water").
The detailed grammar rules you write as you would normally write them. This includes building a lexer and parser for the part you want.
The "water" part is implemented as much as you can by defining sloppy lexemes. You may need more than one, and you will likely have to handle nested structures e.g., things involving "("...")" "["..."] and "{" ... "}" and you will end up doing with explicit tokens for the boundaries of these structures, and recursive grammar rules that keep track of the nesting (because lexers being FSAs typically can't track this).
Not obvious when you start, but painfully obvious once you are deep into this mess, is the need to skip over long comment bodies, and especially string literals with the various quotes allowed by the language (consider Python for an over-the-top set) and the escape sequences inside them. You'll get burned by languages that allow interpolated strings when you figure out that you have to lex the raw string content separately from the interpolated expressions, because these are also typically nested structures. PHP and C# allow arbitrary expressions in their interpolated strings... including expressions which themselves can contain... more interpolated strings!
The upside is that none of this is really hard technically, if you ignore the sweat labor of dreaming up and handling all the funny cases.
... but ... considering typical parsing goals, island grammars tend to fall apart when used for this purpose.
To process expressions, you usually need the language declarations that provide types for the identifiers. If you leave them in the "ocean" part, you don't get type declarations, and now it is hard to reason about your expressions. If you are processing Java and you encounter (a+b), is that addition or string concatenation? Without type information you just don't know.
If you decide you need the type information, now you need the detailed grammar for the variable and type declarations, and suddenly you're a lot closer to a full parser. At some point you bail and just build a full parser; then you don't have to think about whether you've cheated properly.
You don't mention your language, but there's a good chance there's already an ANTLR grammar for it in the ANTLR grammars repository (grammars-v4).
These grammars will parse the entire contents of the source. (By doing this, you can avoid some of the "messiness" that comes with trying to decide when to pop into, and out of, island grammars, which could be particularly messy for expressions since they can occur in so many places within a typical source file.)
Once you have the resulting ParseTree, ANTLR provides a Listener capability: you walk the tree and get called back for only those parts you are interested in, which in your case would be expressions.
A quick search on ANTLR Listeners should turn up several resources on how to write a Listener for your needs; there are short articles covering the basics (often written for when you're only interested in methods, but expressions would be similar), and there are certainly others.
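As a rough sketch of that pattern, assuming an ANTLR-generated parser for your language whose expression rule is named expression (all the MyLang* names below are hypothetical placeholders for your generated classes):
import org.antlr.v4.runtime.*;
import org.antlr.v4.runtime.tree.*;
import java.util.*;

// MyLangLexer/MyLangParser/MyLangBaseListener would be generated by ANTLR
// from your grammar; the rule names here are assumptions.
public class ExpressionCollector extends MyLangBaseListener {
    final List<String> expressions = new ArrayList<>();

    @Override
    public void enterExpression(MyLangParser.ExpressionContext ctx) {
        expressions.add(ctx.getText());   // record the matched expression text
    }

    public static void main(String[] args) throws Exception {
        CharStream input = CharStreams.fromFileName(args[0]);
        MyLangLexer lexer = new MyLangLexer(input);
        MyLangParser parser = new MyLangParser(new CommonTokenStream(lexer));
        ParseTree tree = parser.compilationUnit();      // assumed start rule
        ExpressionCollector collector = new ExpressionCollector();
        ParseTreeWalker.DEFAULT.walk(collector, tree);  // fires enterExpression callbacks
        collector.expressions.forEach(System.out::println);
    }
}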

Creating generic code using database/sql package?

I've recently implemented a package that uses the database/sql package. By limiting the SQL to very simple select/update/insert statements, I assumed the package would work with all the DBMSs supported by database/sql.
However, it turns out that some databases use ? as placeholder value while others use $1, $2, etc., which means the prepared statements will work with some DBMS but not with others.
So I'm wondering: is there any technique to make this work in a generic way with all the supported drivers, or is it necessary to have DBMS-specific code everywhere (which I think would make the abstraction provided by database/sql a bit pointless)? I guess using non-prepared statements is not an option either, since different DBMSs have different ways to escape parameters.
Any suggestion?
I assume this behaviour was left out deliberately because SQL dialects vary significantly between databases, and the Go team wanted to avoid writing a preprocessor for each driver to translate 'GoSQL' into native SQL. The database/sql package mostly provides connection wrangling, an abstraction that falls under 'pretty much necessary', rather than statement translation, which is more of a 'nice to have'.
That said, I agree that rewriting every statement is a major nuisance. It shouldn't be too hard to wrap the database/sql/driver.Prepare() method with a regex that substitutes the standard placeholder with the native one, though, or to provide a new interface with an additional PrepareGeneric method that guesses the wrapped sql.DB flavour and does a similar translation.
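A minimal sketch of the regex idea follows; the rebind helper and its numbered flag are made up for illustration, and a production version would also have to skip ? characters inside string literals:
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// placeholder matches the "standard" ? markers the queries are written with.
var placeholder = regexp.MustCompile(`\?`)

// rebind rewrites ?-style placeholders into $1, $2, ... for drivers
// (e.g. PostgreSQL) that expect numbered placeholders. Hypothetical helper.
func rebind(query string, numbered bool) string {
	if !numbered {
		return query // driver already understands ?
	}
	n := 0
	return placeholder.ReplaceAllStringFunc(query, func(string) string {
		n++
		return "$" + strconv.Itoa(n)
	})
}

func main() {
	q := "SELECT id FROM users WHERE name = ? AND age > ?"
	fmt.Println(rebind(q, true)) // SELECT id FROM users WHERE name = $1 AND age > $2
}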
Gorp uses a dialect type for this, which may be worth a look.
Just throwing out ideas.

Removing dead variables using ANTLR

I am currently servicing an old VBA (Visual Basic for Applications) application. I've got a legacy tool which analyzes that application and prints out dead variables. As there are more than 2000 of them, I do not want to remove them by hand.
Therefore I had the idea to transform the separate code files which contain the dead variables (according to the aforementioned tool) into ASTs and remove them that way.
My question: is there a recommended way to do this?
I do not want to use StringTemplate, as I would need to create templates for all rules, and if I had a comment on the hidden channel, it would be lost, right?
All I need is to remove parts of the code and print out the rest as it was read in.
Does anyone have any recommendations?
Some theory
I assume that regular expressions are not enough to solve your task; that is, you can't capture the notion of a dead-code section in a regular language, so you need the context-free language described by an ANTLR grammar.
The algorithm
The following algorithm can be suggested:
Tokenize source code with a lexer.
Since you want to preserve all the correct code, don't skip or hide its tokens. Make sure to define separate tokens for the parts which may be removed or which will be used to determine the dead code; all other characters can be collected under a single token type. Here you can use the output of your auxiliary tool in predicates to reduce the number of tokens generated. I guess ANTLR's tokenization (like any other tokenization) is expressed in a regular language, so you can't remove all the dead code in this step.
Construct AST with a parser.
Here all the power of a context-free language can be applied: define dead-code sections in the parser's rules and remove them from the AST being constructed.
Convert the AST back to source code. You can use a tree parser here, but I guess there is an easier way, which can be found by looking at toString and similar methods of the tree type returned by the parser.
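If you are on ANTLR4, a related alternative to rebuilding source from the AST is a TokenStreamRewriter over the original token stream, which leaves untouched tokens (including hidden-channel comments) exactly as they were read in. A hypothetical sketch, where VBALexer/VBAParser and the rule names are assumptions rather than an existing grammar's API:
import org.antlr.v4.runtime.*;
import org.antlr.v4.runtime.tree.ParseTreeWalker;
import java.util.Set;

public class DeadVarRemover {
    // VBALexer/VBAParser/VBABaseListener and the rule names below are
    // hypothetical, standing in for whatever grammar you generate.
    public static String strip(String source, Set<String> deadNames) {
        VBALexer lexer = new VBALexer(CharStreams.fromString(source));
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        VBAParser parser = new VBAParser(tokens);
        ParserRuleContext tree = parser.module();   // assumed start rule

        // Tokens that are never touched, including comments on the hidden
        // channel, come back out exactly as they appeared in the input.
        TokenStreamRewriter rewriter = new TokenStreamRewriter(tokens);
        ParseTreeWalker.DEFAULT.walk(new VBABaseListener() {
            @Override
            public void enterVariableDecl(VBAParser.VariableDeclContext ctx) {
                if (deadNames.contains(ctx.IDENTIFIER().getText())) {
                    rewriter.delete(ctx.getStart(), ctx.getStop()); // drop the dead declaration
                }
            }
        }, tree);
        return rewriter.getText();   // original text minus the deleted declarations
    }
}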

What, if any, are the disadvantages of SQL::Interp over SQL::Abstract?

I'm currently looking at some lightweight SQL abstraction modules. My workflow is such that I usually write SELECT queries manually, and INSERT/UPDATE queries via subs which take hashes.
Both of these modules seem perfect for my needs, and I have a hard time deciding. SQL::Interp claims SQL::Abstract cannot provide full expressivity in SQL, but discusses no other differences.
Does it have any disadvantages? If so, which?
I can't speak to SQL::Interp, but I use SQL::Abstract and it's pretty good. In conjunction with DBIx::Connector and plain old DBI, I was able to totally eliminate the use of an ORM in my system with very little downside.
The only limitation I have run into is that it's not possible to write GROUP BY queries directly (although that is easy to work around by simply appending to the generated query), and LIMIT queries are handled by the extension SQL::Abstract::Limit.
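A sketch of that append-to-the-generated-query workaround (the table and column names are invented):
use SQL::Abstract;

my $sqla = SQL::Abstract->new;
# SQL::Abstract builds the SELECT ... WHERE part and the bind list...
my ($stmt, @bind) = $sqla->select('orders', ['region', 'SUM(total)'], { status => 'paid' });
# ...and the GROUP BY clause is simply appended to the generated SQL.
$stmt .= ' GROUP BY region';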
I used SQL::Abstract for over a year, and then switched to SQL::Interp, which I've stuck with since.
SQL::Abstract had trouble with complex clauses. For the ones it could support, you would end up with a nest of "(", "[", and "{" characters, which you then had to mentally translate back into "AND", "OR", or actual parentheses.
SQL::Interp has no such limitations and uses no intermediate representation. Your SQL looks like SQL, with bind variables where you want them. It works for complex queries as well as simple ones. I find SQL::Interp especially pleasant to use in combination with DBIx::Simple's built-in support for it. DBIx::Simple plus SQL::Interp is a friendly and intuitive replacement for raw DBI. I use the combination in a 100,000+ LoC mod_perl web app.
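For a flavor of the difference, here is a hedged sketch of the same condition written with both modules (table and column names are invented; see each module's docs for the full picture):
use SQL::Abstract;
use SQL::Interp qw(sql_interp);

# SQL::Abstract: the condition is described as a nested Perl data structure.
my $sqla = SQL::Abstract->new;
my ($stmt, @bind) = $sqla->select(
    'users', ['id', 'name'],
    { status => 'active', -or => [ age => { '>' => 21 }, vip => 1 ] },
);

# SQL::Interp: you write the SQL itself; references become bind parameters.
my ($sql, @bind2) = sql_interp(
    'SELECT id, name FROM users WHERE status =', \'active',
    'AND (age >', \21, 'OR vip =', \1, ')',
);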