First of all, I don't know why "g++ -std=c++0x -Wall" gives me the warning "invalid suffix on literal; C++11 requires a space between literal and string macro [-Wliteral-suffix]" on the following program:
#include <iostream>
#define BEGIN "<b>"
#define END "</b>"
#pragma GCC diagnostic ignored "-Wliteral-suffix"
int main()
{
std::cout << "hello " BEGIN"world"END "\n";
}
Second, I followed the gcc documentation to ignore "-Wliteral-suffix" but still got the warning. How do I suppress the warning? And why does the compiler warn in the first place?
Ok, to summarize: the failure to suppress the warning is a known gcc bug (gcc.gnu.org/bugzilla/show_bug.cgi?id=61653). Since you cannot (and really, should not) suppress the warning, the easiest fix is to put a space between the literal and the #define string. You can safely do this; it won't change the output text.
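With the spaces added, the same line compiles without the warning and prints exactly the same text:
std::cout << "hello " BEGIN "world" END "\n";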
The reason this is no longer allowed is that an identifier directly following a string literal is treated as a user-defined literal suffix, a new feature in C++11. The suffix is considered to be part of the same single token as the literal it modifies, so END will not be subject to replacement by the earlier #define.
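For illustration only (none of this is in your program), this is what an actual user-defined literal suffix looks like in C++11, which is why the position right after the closing quote is now reserved for a suffix:
#include <cstddef>
#include <string>

// Hypothetical suffix _bold: "text"_bold produces "<b>text</b>".
std::string operator"" _bold(const char* s, std::size_t)
{
    return std::string("<b>") + s + "</b>";
}

// Since the suffix belongs to the literal's token, "world"END is scanned as a
// string literal with the unknown suffix END, and the preprocessor never sees
// END as a separate macro name.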
I'm having some trouble with my analyzer. I'm trying to use yytext inside my yyerror but it shows me this error; can you help me?
You can't use yytext in your parser because it is defined by the lexer.
Indeed, you normally shouldn't use yytext in your parser because its value is not meaningful to the parse. Your attempt to use it to provide context in error messages is just about the only reasonable use, and even then there is a certain ambiguity because you can't tell whether the erroneous token is the one currently in yytext or the previous token, which was overwritten when the parser obtained its lookahead token.
In any case, if you want to refer to yytext inside your parser, you'll need to declare it, which will normally require putting
extern char* yytext;
into your bison grammar file. Since the only place you can reasonably use yytext is yyerror, you might change the definition of that function to:
void yyerror(const char* msg) {
    extern char* yytext;
    fprintf(stderr, "%s at line %d near '%s'\n", msg, nLineas, yytext);
}
Note that you can get flex to track line numbers automatically, so you don't need to track your own nLineas variable. Just add
%option yylineno
at the top of your flex file, and the global variable yylineno will automatically be maintained during lexical analysis. If you want to use yylineno in your parser, you'll need to add an extern declaration for it as well:
extern int yylineno;
Again, using yylineno in the parser may be imprecise because it might refer to the line number of the token following the error, which might be on a different line from the error (and might even be separated from the error by many lines of comments).
As an alternative to using external declarations of yytext and yylineno, you are free to put the implementation of yyerror inside the scanner definition instead of the grammar definition. Your grammar file should already have a forward declaration of yyerror, so it doesn't matter which file it's placed in. If you put it into the scanner file, global scanner variables will already be declared.
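As a rough sketch (assuming <stdio.h> is included at the top of your scanner and you have enabled %option yylineno), the end of the .l file could look like this:
%%
/* User-code section of the scanner: yytext and yylineno are defined in this
   file, so no extern declarations are needed here. */
void yyerror(const char* msg) {
    fprintf(stderr, "%s at line %d near '%s'\n", msg, yylineno, yytext);
}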
I created a special console log function macro. It works fine except when there's a comma in the parameter, even if the comma is part of another expression, i.e. not an argument separator. I think it's due to the fact that macros are expanded at the preprocessing stage, so no semantic analysis has occurred yet to understand that the comma is not another argument. Here is what I mean:
#define FANCY_LOG(message) [MyLogger logDebug:message withClassAndMethodName: __PRETTY_FUNCTION__ lineNumber: __LINE__];
+(BOOL)logDebug:(NSString *)message withClassAndMethodName:(const char *)name lineNumber:(int)lineNumber;
These work:
FANCY_LOG(#"Hello world");
FANCY_LOG([NSString stringWithFormat:#"Hello!"]);
This does not work:
FANCY_LOG([NSString stringWithFormat:#"Hello %#!", planet]);
Although the comma is obviously part of the NSString expression, the macro interprets it as an argument separator, and I get the following error:
Too many arguments provided to function-like macro invocation
Here's what I have tried unsuccessfully (and variants of these):
#define FANCY_LOG(...) [MyLogger logDebug:##__VA_ARGS___ withClassAndMethodName: __PRETTY_FUNCTION__ lineNumber: __LINE__];
#define FANCY_LOG(message) [MyLogger logDebug:#message withClassAndMethodName: __PRETTY_FUNCTION__ lineNumber: __LINE__];
You are doing that wrong. First of all, there are lots of great ready-made solutions, so you do not have to reinvent the wheel (I don't remember for sure, but I think CocoaLumberjack is the best).
And your logger can look like this (I've gotten rusty with Objective-C):
+(void) function:(const char *)methodName
          inLine:(int)line
            logs:(NSString *)format, ...;
...
#define FANCY_LOG(...) [MyLogger function: __PRETTY_FUNCTION__ \
inLine: __LINE__ \
logs: __VA_ARGS__]
// then usage:
FANCY_LOG(#"Hello %#!", planet);
I'm trying to run this .y file
%{
#include <stdlib.h>
#include <stdio.h>
int yylex();
int yyerror();
%}
%start BEGIN
%%
BEGIN: 'a' | BEGIN 'a'
%%
int yylex(){
    return getchar();
}
int yyerror(char* s){
    fprintf(stderr, "*** ERROR: %s\n", s);
    return 0;
}
int main(int argn, char **argv){
    yyparse();
    return 0;
}
It's a simple program in bison; the syntax seems correct to me, but I always get a syntax error ...
Thanks for your help.
The lexer function yylex needs to return 0 to indicate the end of the input. However, your implementation simply passes through the value returned by getchar, which at end of input will be EOF (normally -1).
Also, your input is almost certain to include a newline character, which will also be passed through to the parser.
Since the parser recognizes neither \n nor EOF, it produces an error when it receives one of them.
At a minimum, you would need to modify yylex to correctly respond to end of input:
int yylex(void) {
    int ch = getchar();
    return (ch == EOF) ? 0 : ch;
}
But you will still have to deal with newline characters, either by handling them in your lexer (possibly ignoring them or possibly returning an end-of-input indication), or by handling them in your grammar.
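For example, here is a rough sketch of a lexer that simply skips newlines; alternatively, you could return 0 at the first newline if you want each line treated as a complete input:
int yylex(void) {
    int ch = getchar();
    while (ch == '\n')       /* ignore newlines entirely */
        ch = getchar();
    return (ch == EOF) ? 0 : ch;
}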
Note that bison/yacc-generated parsers always parse the entire input stream, not just the longest sequence satisfying the grammar. That can be adjusted with some work -- see the documentation for the YYACCEPT special action -- but the standard behaviour is usually what is desired when parsing.
By the way, please use standard style conventions in your bison/yacc grammars, in order to avoid problems and in order to avoid confusing readers. Normally we reserve UPPER_CASE for terminal symbols, since those are also used as compile-time constants in the lexer. Non-terminals are usually written in lower_case although some prefer to use CamelCase. For the terminals, you need to avoid the use of names reserved by the standard library (such as EOF) or by (f)lex (BEGIN) or bison/yacc (END). There are lists of reserved names in the manuals.
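Applied to your grammar, that convention might look like the following; the nonterminal name seq is just an example, and the rule itself is unchanged:
%start seq
%%
seq: 'a'
   | seq 'a'
   ;
%%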
I use OpenCL on an ATI card with the printf extension enabled. I've written a function to print out variables:
void printVar(constant char* name, float var)
{
    printf("%s: %f\r\n", name, var);
}
This code works as expected when compiled as plain C, but if I invoke it in OpenCL with
printVar("foo", 0.123);
the result is always some random character followed by 0.123 instead of "foo: 0.123". I guess the compiler has problems recognizing the char* string; is there a workaround or a fix so I can get the function working?
As I mentioned in my comment, I also get the same behavior; however, I can suggest a simple workaround for the use case you showed, i.e. when the string is known at compile time we can just use a #define instead:
#define PRINTVAR(N,X) (printf(N ": %f\r\n", X))
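For example (the kernel and its argument are made up just to show the call site), the name is pasted into the format string at compile time, so no char* argument is ever passed to printf:
__kernel void demo(__global const float* data)
{
    float foo = data[get_global_id(0)];
    /* expands to printf("foo" ": %f\r\n", foo) */
    PRINTVAR("foo", foo);
}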
Can this statement ever fail?
if (#"Hello") foo(); ?
In other words, can there be a situation in which the compiler fails to allocate enough storage space for literals? I know this sounds ridiculous for short literals, but what about really long ones?
No.
NSString literals are "allocated" at compile time and form part of the text segment of your program.
Edit
To answer the other part of the question, if the compiler fails to allocate enough memory for the literal, the if statement won't fail, but the compilation will.
I don't know what the effective upper limit to the length of a string literal is, but it's bound to be less than NSIntegerMax unichars because NSNotFound is defined as NSIntegerMax. According to the clang docs, the length in bytes of a string literal is an unsigned int and NSString string literals are sequences of unichars.
I'm pretty sure if you try to compile a file with the literal
#" ... 1TB worth of characters ... "
the compiler will fail. The C standard says that any conforming compiler needs to support at least 4095 characters per string literal (see Section 5.2.4.1). I'm sure GCC and clang allow much bigger literals.