How do I get rid of deprecated code in this class file for Java ANTLR?

import org.antlr.v4.runtime.*;
import org.antlr.v4.runtime.tree.ParseTree;
import java.io.FileInputStream;
import java.io.InputStream;

public class Calc {
    public static void main(String[] args) throws Exception {
        String inputFile = null;
        if (args.length > 0) inputFile = args[0];
        InputStream is = System.in;
        if (inputFile != null) is = new FileInputStream(inputFile);
        ANTLRInputStream input = new ANTLRInputStream(is);
        CalculatorLexer lexer = new CalculatorLexer(input);
        CommonTokenStream tokens = new CommonTokenStream(lexer);
        CalculatorParser parser = new CalculatorParser(tokens);
        ParseTree tree = parser.program(); // parse
        CalcVisitor calcc = new CalcVisitor();
        calcc.visit(tree);
    }
}
As far as I know, ANTLRInputStream is the deprecated class here. I have tried replacing it with CharStreams, but the code keeps resulting in an error when I run it. How can I fix this?

Try something like this:
public static void main(String[] args) throws Exception {
    CharStream charStream = args.length > 0
            ? CharStreams.fromFileName(args[0])
            : CharStreams.fromStream(System.in);
    CalculatorLexer lexer = new CalculatorLexer(charStream);
    CommonTokenStream tokens = new CommonTokenStream(lexer);
    CalculatorParser parser = new CalculatorParser(tokens);
    ParseTree tree = parser.program(); // parse
    CalcVisitor calcc = new CalcVisitor();
    calcc.visit(tree);
}
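Note that CharStreams was only added in ANTLR 4.7 (the same release that deprecated ANTLRInputStream and ANTLRFileStream), so an older runtime jar on the classpath is a common cause of the error described above. For a quick test without any file, CharStreams also has a fromString factory; a minimal sketch, where the input expression is only a placeholder:

// In-memory input for a quick test; "1 + 2 * 3" is just an example expression.
CharStream charStream = CharStreams.fromString("1 + 2 * 3");
CalculatorLexer lexer = new CalculatorLexer(charStream);
CommonTokenStream tokens = new CommonTokenStream(lexer);
CalculatorParser parser = new CalculatorParser(tokens);
ParseTree tree = parser.program(); // parse
new CalcVisitor().visit(tree);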

Related

ANTLRFileStream is deprecated, what can I use instead?

I am trying to compile a test class to test a simple grammar.
import org.antlr.v4.runtime.*;

public class Test {
    public static void main(String[] args) throws Exception {
        CharStream input = null;
        // pick an input stream (filename from command line or stdin)
        if (args.length > 0) input = new ANTLRFileStream(args[0]);
        else input = new ANTLRInputStream(System.in);
        // create the lexer
        DrinkLexer lex = new DrinkLexer(input);
        // create a buffer of tokens between the lexer and parser
        CommonTokenStream tokens = new CommonTokenStream(lex);
        // create the parser, attaching it to the token buffer
        DrinkParser p = new DrinkParser(tokens);
        p.drinkSentence(); // launch the parser at the drinkSentence rule
    }
}
How would I go about replacing the deprecated class?
Use the various static methods from CharStreams:
CharStream input = null;
// pick an input stream (filename from command line or stdin)
if (args.length > 0) input = CharStreams.fromFileName(args[0]);
else input = CharStreams.fromStream(System.in);
// create the lexer
DrinkLexer lex = new DrinkLexer(input);
// create a buffer of tokens between the lexer and parser
CommonTokenStream tokens = new CommonTokenStream(lex);
// create the parser, attaching it to the token buffer
DrinkParser p = new DrinkParser(tokens);
p.drinkSentence(); // launch the parser at the drinkSentence rule
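Both factory methods also have overloads that take an explicit Charset, which is worth knowing if the input is not UTF-8. A small sketch; the ISO-8859-1 charset below is just an example choice:

// Same pipeline, but with an explicit input encoding (example charset).
// Requires: import java.nio.charset.StandardCharsets;
CharStream input = args.length > 0
        ? CharStreams.fromFileName(args[0], StandardCharsets.ISO_8859_1)
        : CharStreams.fromStream(System.in, StandardCharsets.ISO_8859_1);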

Lucene - apply TokenFilter to a String and get result

I want to debug Lucene token filters and see their results. How can I apply a token filter to a token stream and see the result?
(using Lucene 4.10.3)
import java.io.IOException;
import java.io.StringReader;
import java.util.Iterator;

import org.apache.lucene.analysis.core.LowerCaseFilter;
import org.apache.lucene.analysis.standard.StandardTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class TokenFilterExample {
    public static void main(String[] args) throws IOException {
        // 1] Create token stream
        StringReader r = new StringReader("Hello World");
        StandardTokenizer s = new StandardTokenizer(r);
        // Create lower-case token filter
        LowerCaseFilter f = new LowerCaseFilter(s);
        // Print result
        System.out.println(??????);
        // close
        f.close();
        s.close();
    }
}
The solution is this:
// Print result
f.reset();
while (f.incrementToken()) {
    System.out.println(f.getAttribute(CharTermAttribute.class));
}
f.end();
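Put together with the question's code, the whole example would look roughly like this (a sketch based on the Lucene 4.10.3 constructors used above):

import java.io.IOException;
import java.io.StringReader;

import org.apache.lucene.analysis.core.LowerCaseFilter;
import org.apache.lucene.analysis.standard.StandardTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class TokenFilterExample {
    public static void main(String[] args) throws IOException {
        // Tokenize the input and wrap the tokenizer in a lower-case filter.
        StringReader r = new StringReader("Hello World");
        StandardTokenizer s = new StandardTokenizer(r);
        LowerCaseFilter f = new LowerCaseFilter(s);

        // Consume the stream: reset, iterate over the tokens, then end and close.
        f.reset();
        while (f.incrementToken()) {
            System.out.println(f.getAttribute(CharTermAttribute.class));
        }
        f.end();
        f.close();
        s.close();
    }
}

This should print "hello" and "world" on separate lines.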

How to solve a FolderClosedIOException?

So I am new to Apache Camel. I know that most of this code is probably not the most efficient way to do this, but I have written code that uses Apache Camel to access my Gmail, grab the new messages and, if they have attachments, save the attachments in a specified directory. My route saves the body data as a file in that directory. Every time the DataHandler tries to use the getContent() method, whether it's saving a file or trying to print the body to System.out, I get either a FolderClosedIOException or a FolderClosedException. I have no clue how to fix it. The catch reopens the folder, but it just closes again after getting another message.
import org.apache.camel.*;
import java.io.*;
import java.util.*;
import javax.activation.DataHandler;
import javax.mail.Folder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import com.sun.mail.util.FolderClosedIOException;

public class Imap {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            public void configure() {
                from("imaps://imap.gmail.com?username=********@gmail.com&password=******"
                        + "&debugMode=false&closeFolder=false&mapMailMessage=false"
                        + "&connectionTimeout=0").to("file:\\EMAIL");
            }
        });
        Map<String, String> props = new HashMap<String, String>();
        props.put("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
        props.put("mail.imap.auth", "true");
        props.put("mail.imap.host", "imap.gmail.com");
        props.put("mail.store.protocol", "imaps");
        context.setProperties(props);
        Folder inbox = null;
        ConsumerTemplate template = context.createConsumerTemplate();
        context.start();
        while (true) {
            try {
                Exchange e = template.receive("imaps://imap.gmail.com?username=*********@gmail.com&password=***********", 60000);
                if (e == null) break;
                Message m = e.getIn();
                Map<String, Object> s = m.getHeaders();
                Iterator it = s.entrySet().iterator();
                while (it.hasNext()) {
                    Map.Entry pairs = (Map.Entry) it.next();
                    System.out.println(pairs.getKey() + " === " + pairs.getValue() + "\n\n");
                    it.remove();
                }
                if (m.hasAttachments()) {
                    Map<String, DataHandler> att = m.getAttachments();
                    for (String s1 : att.keySet()) {
                        DataHandler dh = att.get(s1);
                        String filename = dh.getName();
                        ByteArrayOutputStream o = new ByteArrayOutputStream();
                        dh.writeTo(o);
                        byte[] by = o.toByteArray();
                        FileOutputStream out = new FileOutputStream("C:/EMAIL/" + filename);
                        out.write(by);
                        out.flush();
                        out.close();
                    }
                }
            } catch (FolderClosedIOException ex) {
                inbox = ex.getFolder();
                inbox.open(Folder.READ_ONLY);
            }
        }
        context.stop();
    }
}
Please, somebody tell me what's wrong!
The error occurs here:
dh.writeTo(o);
We were solving a similar problem in akka-camel.
The solution, I believe, was to use manual acknowledgement and to send the acknowledgement only after we were done with the message.
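For reference, this is roughly what that pattern looks like with akka-camel's Java API, as far as I remember it; treat it as a sketch rather than a drop-in fix, and note that the endpoint URI is a placeholder:

import akka.camel.Ack;
import akka.camel.CamelMessage;
import akka.camel.javaapi.UntypedConsumerActor;

// Sketch of a consumer with automatic acknowledgement turned off.
public class MailConsumer extends UntypedConsumerActor {
    @Override
    public boolean autoAck() {
        return false; // we will acknowledge manually
    }

    public String getEndpointUri() {
        // placeholder endpoint; real options/credentials as in the question
        return "imaps://imap.gmail.com?username=...&password=...";
    }

    public void onReceive(Object message) {
        if (message instanceof CamelMessage) {
            // ... read the body and attachments here ...
            // acknowledge only once we are completely done with the message
            getSender().tell(Ack.getInstance(), getSelf());
        } else {
            unhandled(message);
        }
    }
}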

Distributed Cache in Pig UDF

Here is my code to implement a UDF using the distributed cache in Pig.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

import org.apache.hadoop.fs.Path;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class Regex extends EvalFunc<Integer> {
    static HashMap<String, String> map = new HashMap<String, String>();

    public List<String> getCacheFiles() {
        Path lookup_file = new Path(
                "hdfs://localhost.localdomain:8020/user/cloudera/top");
        List<String> list = new ArrayList<String>(1);
        list.add(lookup_file + "#id_lookup");
        return list;
    }

    public void VectorizeData() throws IOException {
        FileReader fr = new FileReader("./id_lookup");
        BufferedReader brd = new BufferedReader(fr);
        String line;
        while ((line = brd.readLine()) != null) {
            String str[] = line.split("#");
            map.put(str[0], str[1]);
        }
        fr.close();
    }

    @Override
    public Integer exec(Tuple input) throws IOException {
        // TODO Auto-generated method stub
        return map.size();
    }
}
Given below is my distributed cache input file (hdfs://localhost.localdomain:8020/user/cloudera/top):
Impetigo|Streptococcus pyogenes#Impetigo
indeterminate leprosy|Uncharacteristic leprosy#indeterminate leprosy
Output I get is
(0)
(0)
(0)
(0)
(0)
This means that my HashMap is empty.
How do I fill my HashMap using the distributed cache?
This was because VectorizeData() was never called in exec(), so the map was never populated.
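One way to apply that fix, sketched against the code above, is to load the lookup file lazily the first time exec() runs:

@Override
public Integer exec(Tuple input) throws IOException {
    // Populate the map from the cached lookup file on first use.
    if (map.isEmpty()) {
        VectorizeData();
    }
    return map.size();
}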

Eclipse Antlr Interpreter failure

The following grammar successfully generates a parser that handles the string 'ccdunion'. But when I try it with the interpreter, I get a NoViableAltException. Why?
grammar SimpleCalc;

options {
    k = 2;
    output = AST;
    backtrack = true;
}

tokens {
    UNION = 'union';
}

@header {
    package myPack;
    import java.io.File;
    import java.io.FileWriter;
    import java.io.PrintWriter;
}

@lexer::header {
    package myPack;
}

@members {
    public static void main(String[] args) throws Exception {
        File fileOut = new File("src/ARFF.arff");
        FileWriter fw = new FileWriter(fileOut);
        PrintWriter pw = new PrintWriter(fw);
        pw.print("a ");
        pw.println("b");
        pw.println(12);
        fw.close();
        SimpleCalcLexer lex = new SimpleCalcLexer(new ANTLRFileStream(args[0]));
        CommonTokenStream tokens = new CommonTokenStream(lex);
        SimpleCalcParser parser = new SimpleCalcParser(tokens);
        try {
            parser.expr();
        } catch (RecognitionException e) {
            e.printStackTrace();
        }
    }
}

expr
    : filsPiquets EOF
      {
          System.out.println("base mono: ");
      }
    ;

filsPiquets
    : chars UNION
      {
          System.out.println("TXTunionII" + $chars.text);
      }
    ;

chars
    : CHAR
    | . chars
    ;

CHAR
    : .
    ;
... when I try it with the interpreter, I get a NoViableAltException. Why?
Because the interpreter is buggy: don't use it, use the debugger instead. The same goes for ANTLRWorks' interpreter.