I want my program to ask for an expression, assign the input string to a variable 'exp', and then print the expression.
However, I am having some trouble. I first tried using (read):
(princ "Enter a expression to be evaluated.")
(setf exp (read))
(princ exp)
However, when I use this code, this happens:
Hello this is an expression ;This is what I input
Enter a expression to be evaluated.HELLO
T
I then tried to use (read-line), but when I do this, I don't seem to be asked for an input at all.
(princ "Enter a expression to be evaluated.")
(setf exp (read-line))
(princ exp)
I get:
Enter a expression to be evaluated.
T
The program just ends.
After some answers, I have come up with this:
(defun get-input (prompt)
  (clear-input)
  (write-string prompt)
  (finish-output)
  (setf exp (read-line)))
(get-input "Enter an expression: ")
(princ exp)
However, when I run this, the following happens:
My first sentence ;My first input
Enter an expression: My second sentence ;it then asks for input, i do so
My second sentence ;my second input is printed back at me
T
This is kind of a FAQ.
Output can be buffered. Use FINISH-OUTPUT to make sure that the output has actually reached its destination.
READ reads Lisp s-expressions. It returns the corresponding data-structure. It's only useful when you enter a valid s-expression.
READ-LINE reads a line and returns a string.
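For example, feeding both the same line gives very different results (a rough REPL sketch; prompts and echoes depend on your implementation):
(read)          ; you type: Hello this is an expression
;; => HELLO     ; READ stops after the first complete s-expression (a symbol);
;;              ; the rest of the line stays pending in the input buffer
(read-line)     ; you type: Hello this is an expression
;; => "Hello this is an expression", NIL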
Example:
* (defun ask (&optional (message "Input: "))
    (clear-input)            ; get rid of pending input
    (write-string message)   ; print the prompt
    (finish-output)          ; make sure output gets visible
    (read-line))             ; read a line as a string
ASK
* (ask "Name: ")
Name: Rainer
"Rainer"
NIL
File p.lisp:
(defun get-input (prompt)
  (clear-input)
  (write-string prompt)
  (finish-output)
  (read-line))
(write-string (get-input "Enter a sentence: "))
(finish-output)
Output
* (load "/tmp/p.lisp")
Enter a sentence: foo is not a bar
foo is not a bar
T
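If what you actually need is an expression (not just the text), one option is to parse the returned line afterwards with READ-FROM-STRING. A small sketch building on GET-INPUT above; the GET-EXPRESSION name and the *READ-EVAL* binding are my own additions, not part of the original answer:
(defun get-expression (prompt)
  (let ((line (get-input prompt))     ; GET-INPUT from p.lisp above
        (*read-eval* nil))            ; precaution: don't evaluate #. forms typed by the user
    (read-from-string line)))

;; (get-expression "Enter an expression: ")
;; Enter an expression: (+ 1 2)
;; => (+ 1 2), 7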
Why does the first expression interpret but not the second?
I understand them to be identical, that is, a string invoking an anonymous regex.
("foo").first: /foo/ andthen say "$_ bar";
> foo bar
"foo": /foo/ andthen say "$_ bar";
> Confused
> at /home/dmc7z/raku/foo.raku:2
> ------> "foo":⏏ /foo/ andthen say "$_ bar";
> expecting any of:
> colon pair
This is a method call:
("foo").first: /foo/
It's the same as if you had written:
("foo").first( /foo/ )
Or just:
"foo".first( /foo/ )
(Note that I used : at the end of the three descriptions above in English. That's where the idea comes from of using : to mean that whatever follows it is part of the same expression.)
In this case it doesn't make a whole lot of sense to use first. Instead I would use ~~.
"foo" ~~ /foo/ andthen say "$_ bar";
first is used to find the first item in a list that matches some description. If you use it on a single item it will either return the item, or return Nil. It always returns one value, and Nil is the most sensible single undefined value in this case.
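For illustration (my own examples, not from the answer):
say (1, 4, 9, 12).first(* > 10);   # 12 (the first element matching the description)
say "foo".first(/foo/);            # foo (a single item that matches is returned)
say "bar".first(/foo/);            # Nil (no match)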
The reason it says it's expecting a colon pair is that a colon pair is the only use of : that could be valid in that location. Honestly I halfway expected it to complain that it was an invalid label.
Using sql-send-buffer, I can send a SQL query from a file to an open SQL REPL. Many of my queries have parameter placeholders (in Postgres syntax: $1, $2, &c.). Does anyone have code analogous to sql-send-buffer that will prompt for values to fill in for these parameters? Ideally, I'd like it to store the values I provide, and not prompt again unless I add parameters or close the file.
Currently I either:
replace the parameters in the file, try to remember not to commit or deploy these test values (error prone)
copy the query into the REPL, replace parameters there (tedious)
Something like this, perhaps:
(defvar my-sql-replacements nil
  "Alist of (placeholder . replacement) pairs, local to each SQL buffer.")
(make-variable-buffer-local 'my-sql-replacements)

(defun my-sql-send-buffer-replace ()
  "Send the whole buffer to the SQL process, filling in $N placeholders.
Prompts once per placeholder and remembers the answer in `my-sql-replacements'."
  (interactive)
  (let ((string (buffer-substring-no-properties (point-min) (point-max))))
    ;; "+" so that multi-digit placeholders such as $10 are matched whole.
    (while (string-match "[$][0-9]+" string)
      (let* ((placeholder (match-string 0 string))
             (replacement (or (cdr (assoc placeholder my-sql-replacements))
                              (read-string (format "Replacement for %s: " placeholder)))))
        (unless (assoc placeholder my-sql-replacements)
          (push (cons placeholder replacement) my-sql-replacements))
        ;; LITERAL arg: don't treat backslashes in the replacement specially.
        (setq string (replace-regexp-in-string (regexp-quote placeholder)
                                               replacement string t t))))
    (sql-send-string string)))
I haven't tested it with an actual SQL server, but from tracing sql-send-string it looks like it should work. It stores the replacements in a buffer-local variable.
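If it works for you, you could bind it in SQL buffers and clear the cache when your parameters change; a small sketch (the key choice is arbitrary, just an illustration):
;; Hypothetical binding; pick whatever key you prefer in sql-mode buffers.
(with-eval-after-load 'sql
  (define-key sql-mode-map (kbd "C-c C-y") #'my-sql-send-buffer-replace))

;; To be prompted again (e.g. after adding or renaming parameters),
;; clear the buffer-local cache:  M-: (setq my-sql-replacements nil)
Since the cache is buffer-local, it also goes away when you kill the buffer, which matches the "don't prompt again unless I close the file" requirement.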
I sometimes put examples of function calls and their output in the documentation string of a function definition.
(defun js-[] (&rest args)
  "Javascript array literal statement.
(js-[] 1 2 3)
> \"[1, 2, 3]\"
"
  (format nil "[~{~A~^, ~}]" (mapcar #'js-expr args)))
But sometimes the output of the function is a string. So I have to escape the double quotes in the example output. This becomes tedious very quickly.
Is there a way to change the docstring delimiter from double quotes to something else so I don't have to keep escaping them?
Please note that sometimes it's worse than just escaping once:
(defun js-~ (str)
  "Javascript string statement. This is needed so that double quotes are inserted.
(js-~ \"string\")
> \"\\\"string\\\"\"
"
  (format nil "\"~A\"" str))
Here there is an additional problem. Reading the docstring is difficult.
TL;DR
Yes, you can, no, you do not want to do it.
No, CL has just one syntax for strings
The only way to represent a string in Common Lisp is to use
Double-Quote ".
Yes, you can modify the reader so that something else denotes a string
E.g., suppose you want a string to be started and stopped by, say, #.
(This is an ordinary character rarely used in symbol names,
in contrast to % and $ which are often used in implementation-internal symbols.)
Set the properties of # from those of ":
(multiple-value-bind (function non-terminating-p)
    (get-macro-character #\")
  (set-macro-character #\# function non-terminating-p))
Now:
(read-from-string "#123#")
==> "123" ; 5
(read-from-string #"123"#)
==> "123" ; 5
Do not forget to restore the input syntax to standard Common Lisp syntax:
(setq *readtable* (copy-readtable nil))
See Reader.
You might be able to modify the printer
The standard does not require the printing of standard objects (such as a string) to be user-modifiable.
You can try defining a print-object method:
(defmethod print-object ((o string) (d stream))
  ...)
however:
implementing this correctly is not easy;
this is non-conforming code (defining a method for a standardized generic function which is applicable when all of the arguments are direct instances of standardized classes), thus many implementations will signal errors on this code;
and even if you disable package locks &c., the implementation is free to ignore your method.
No, you do not want to do that
The code exists for people to read it.
Changing Lisp syntax will make it harder for others to read your code.
It will also confuse various tools you use (editor &c).
CL has many warts, but this is not one of them ;-)
PS. See also documentation and describe, as well as comment syntax Sharpsign Vertical-Bar and Semicolon.
You could make a reader macro that slurps in a multi-line string like this:
(defun hash-slash-reader (stream slash arg)
  (declare (ignore slash arg))
  (loop :with s := (make-string-output-stream)
        :for c := (read-char stream)
        :if (and (eql #\/ c) (eql #\# (peek-char nil stream)))
          :do (read-char stream) (return (get-output-stream-string s))
        :if (eql #\Newline c)
          :do (peek-char t stream)
        :do (princ c s)))

(set-dispatch-macro-character #\# #\/ #'hash-slash-reader)
Now you can do:
(defun js-~ (str)
  #/ --------------------------
  Javascript string statement.
  This is needed so that double quotes are inserted.
  (js-~ "string")
  > "\"string\""
  -------------------------- /#
  (format nil "\"~A\"" str))
The documentation string will be added just as if you'd written it with double quotes. This is effectively the same as changing the delimiter for strings! In fact, it is an additional way to delimit strings, which is why you can use it (not recommended, though) in regular Lisp code, and not just for documentation purposes.
Using / as the sub-character of the dispatch macro helps keep it conceptually close to the multi-line comment #|...|#, while avoiding being ignored by the reader altogether.
Another idea: write your docstrings as usual, without examples.
(defun js-~ (str)
  "Javascript string statement."
  ...)
Define tests. That can be as simple as:
(defparameter *tests*
  '(((js-~ "string") . "\"string\"")
    ...))
Use that list to perform tests:
(loop for (form . expected) in *tests*
      for (fn . args) = form
      for actual = (apply fn args)
      do (assert (equalp actual expected)))
... and to update the documentation. Be careful, this appends to the existing documentation string, so don't run it twice.
(loop for (form . expected) in *tests*
      for (fn . args) = form
      do (setf (documentation fn 'function)
               (format nil
                       "~a~%~% ~S~% => ~S"
                       (documentation fn 'function)
                       form
                       expected)))
You can (ab)use cl-interpol. Although the purpose of the library is to enable string interpolation, it also allows custom string delimiters, if you don't mind prepending the string with #?. E.g.:
CL-USER> (cl-interpol:enable-interpol-syntax)
; No values
CL-USER> #?'foo'
"foo"
CL-USER> #?/foo/
"foo"
CL-USER> #?{foo}
"foo"
CL-USER>
So, after enabling the interpol reader macro, you could write:
(defun js-[] (&rest args)
  #?'Javascript array literal statement.
(js-[] 1 2 3)
> "[1, 2, 3]"
'
  (format nil "[~{~A~^, ~}]" (mapcar #'js-expr args)))
I'm using an AutoIt script to automate interaction with a GUI, and part of the process involves using the ControlSend() function to place a file path into a combo box. The majority of the time the process works properly, but occasionally (roughly 1 in 50 calls to the function?) a single hyphen in the file path is replaced with an underscore. The script is to be run unsupervised for bulk data processing, and such an error typically results in a forced-focus popup that screams "The file could not be found!" and halts further processing.
Unfortunately, due to the character limit of the combo box, I cannot supply all 16 arguments with a single call, and I am forced to load each of the images individually using the following for-loop:
;Iterate through each command line argument (file path)
For $i = 1 To $CmdLine[0]
    ;Click the "Disk" button to load an image from disk
    ControlClick("Assemble HDR Image", "", "[CLASS:Button; TEXT:Disk; Instance:1]")
    ;Give the dialogue time to open before entering text
    Sleep(1000)
    ;Send a single file path to the combo box
    ControlSend("Open", "", "Edit1", $CmdLine[$i])
    ;"Press Enter" to load the image
    Send("{ENTER}")
Next
In an errant run, the file path
C:\my\file\path\hdr_2016-04-22T080033_00_rgb
                        ^ hyphen
is converted to
C:\my\file\path\hdr_2016_04-22T080033_00_rgb
                        ^ underscore
Due to the existence of both hyphens and underscores in the file name, it is difficult to perform a programmatic correction (e.g. replace all underscores with hyphens).
What can be done to correct or prevent such an error?
This is both my first attempt at GUI automation and my first question on SO, and I apologize for my lack of experience, poor wording, or deviations from StackOverflow convention.
Just use ControlSetText() instead of ControlSend(), as it will set the complete text at once and won't allow other keystrokes (like Shift) to interfere with the many virtual keystrokes that the Send function fires.
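Applied to the loop from the question, that could look like the following (untested sketch; window titles and control IDs are taken from the question, while replacing Sleep() with WinWait() and sending {ENTER} via ControlSend() are my own tweaks, not part of the answer):
For $i = 1 To $CmdLine[0]
    ControlClick("Assemble HDR Image", "", "[CLASS:Button; TEXT:Disk; Instance:1]")
    ; Wait for the "Open" dialogue instead of sleeping a fixed time
    WinWait("Open", "", 10)
    ; Set the whole path in one operation; no virtual keystrokes involved
    ControlSetText("Open", "", "Edit1", $CmdLine[$i])
    ; Confirm the dialogue to load the image
    ControlSend("Open", "", "Edit1", "{ENTER}")
Next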
If the hyphen is the problem and you need to replace it, you can do so:
#include <File.au3>

; your path
$sPath = 'C:\my\file\path'

; get all files from this path
$aFiles = _FileListToArray($sPath, '*', 1)

; if all your files look like that (with or without hyphen), you can work with "StringRegExpReplace"
; 'hdr_2016-04-22T080033_00_rgb'
$sPattern = '(\D+\d{4})(.)(.+)'
; it means:
; 1st group: (\D+\d{4})
;   \D+   one or more non-digits, i.e. "hdr_"
;   \d{4} a digit, 4 times, i.e. "2016"
; 2nd group: (.)
;   .     any single character: hyphen, underscore or other, i.e. "-"
; 3rd group: (.+)
;   .+    any character, one or more times, i.e. "04-22T080033_00_rgb"

; now you change the filename for all cases where this pattern matches
Local $sTmpName
For $i = 1 To $aFiles[0]
    ; check for a pattern match (the pattern argument is required here)
    If StringRegExp($aFiles[$i], $sPattern) Then
        ; replace the 2nd group with an underscore
        $sTmpName = StringRegExpReplace($aFiles[$i], $sPattern, '\1_\3')
        FileMove($sPath & '\' & $aFiles[$i], $sPath & '\' & $sTmpName)
    EndIf
Next
I am parsing SQL in Haskell using Parsec. How can I ensure that a statement with a where clause will not treat the WHERE as a table name? Below is part of my code. p_Combination works, but it sees the WHERE as part of the list of tables instead of the where clause.
--- from clause
data Table_clause = Table {table_name :: String, alias :: Maybe String} deriving Show

p_Table_clause :: Parser Table_clause
p_Table_clause = do
  t <- word
  skipMany (space <?> "require space at the Table clause")
  a <- optionMaybe (many1 alphaNum) <?> "aliase for table or nothing"
  return $ Table t a

newtype From_clause = From [Table_clause] deriving Show

p_From_clause :: Parser From_clause
p_From_clause = do
  string "FROM" <?> "From"
  skipMany1 (space <?> "space in the from clause ")
  x <- sepBy p_Table_clause (many1 (char ',' <|> space))
  return $ From x

-- where clause conditions elements
data WhereClause = WhereFCondi String deriving Show

p_WhereClause :: Parser WhereClause
p_WhereClause = do
  string "WHERE"
  skipMany1 space
  x <- word
  return $ WhereFCondi x

data Combination = FromWhere From_clause (Maybe WhereClause) deriving Show

p_Combination :: Parser Combination
p_Combination = do
  x <- p_From_clause
  skipMany1 space
  y <- optionMaybe p_WhereClause
  return $ FromWhere x y
Normal SQL parsers have a number of reserved words, and they’re often not context-sensitive. That is, even where a where might be unambiguous, it is not allowed simply because it is reserved. I’d guess most implementations do this by first lexing the source in a conceptually separate stage from parsing the lexed tokens, but we do not need to do that with Parsec.
Usually the way we do this with Parsec is by using Text.Parsec.Token. To use it, you first create a LanguageDef defining some basic characteristics about the language you intend to parse: how comments work, the reserved words, whether it's case-sensitive or not, etc. Then you use makeTokenParser to get a record full of functions tailored to that language. For example, identifier will not match any reserved word; the generated parsers are all careful to require whitespace where necessary, and when they skip whitespace, comments are skipped as well.
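A minimal sketch of that approach (my own illustration, not from the original post; the module, field, and function names come from parsec's Text.Parsec.Token and Text.Parsec.Language):
import Text.Parsec
import Text.Parsec.String (Parser)
import qualified Text.Parsec.Token as Tok
import Text.Parsec.Language (emptyDef)

sqlDef :: Tok.LanguageDef ()
sqlDef = emptyDef
  { Tok.reservedNames = ["SELECT", "FROM", "WHERE"]
  , Tok.identStart    = letter
  , Tok.identLetter   = alphaNum <|> char '_'
  , Tok.caseSensitive = False   -- SQL keywords are case-insensitive
  }

lexer :: Tok.TokenParser ()
lexer = Tok.makeTokenParser sqlDef

identifier :: Parser String     -- rejects reserved words, skips trailing whitespace
identifier = Tok.identifier lexer

reserved :: String -> Parser () -- matches a reserved word, skips trailing whitespace
reserved = Tok.reserved lexer
With that in place, p_Table_clause would use identifier where it currently uses word, and the WHERE keyword is only ever consumed by reserved "WHERE", so it can no longer be mistaken for a table name.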
If you want to stay with your current approach, using only Parsec primitives, you’ll probably want to look into notFollowedBy. This doesn’t do exactly what your code does, but it should provide some inspiration about how to use it:
string "FROM" >> many1 space
tableName <- many1 alphaNum <* many1 space
aliasName <- optionMaybe $ notFollowedBy (string "WHERE" >> many1 space)
                             >> many1 alphaNum <* many1 space
Essentially:
Parse a FROM, then whitespace.
Parse a table name, then whitespace.
If WHERE followed by whitespace is not next, parse an alias name then whitespace.
I guess the problem is that p_Table_clause accepts "WHERE". To fix this, check for "WHERE" and fail the parser:
p_Table_clause = do
  t <- try (do w <- word
               if w == "WHERE"
                 then unexpected "keyword WHERE"
                 else return w)
  ...
I guess there might be a try missing in sepBy p_Table_clause (many1 (char ',' <|> space)). I would try sepBy p_Table_clause (try (many1 (char ',' <|> space))).
(Or actually, I would follow the advice from the Parsec documentation and define a lexeme combinator to handle whitespace).
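For reference, a lexeme combinator in that style is only a few lines (my own sketch, not part of the answer; the names lexeme and symbol are just the conventional ones):
-- Run a parser, then skip any trailing whitespace, so every parser
-- "owns" the whitespace that follows it.
lexeme :: Parser a -> Parser a
lexeme p = p <* spaces

symbol :: String -> Parser String
symbol = lexeme . string
-- e.g. between table clauses: sepBy p_Table_clause (symbol ",")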
I don't really see the combinator you need right away, if there is one. Basically, you need p_Combination to try (string "WHERE" >> skipMany1 space); if that succeeds, parse a WHERE "body" and be done. If it fails, try p_Table_clause; if that fails, be done. If p_Table_clause succeeds, read the separator and loop. After the loop is done, prepend the Table_clause to the results.
There are some other problems with your parser, too. many1 (char ',' <|> space) matches " ,,, , ,, ", for example, which is not a valid separator between tables in a from clause. Also, SQL keywords are case-insensitive, IIRC.
In general, you want to exclude keywords from matching identifiers, with something like:
keyword :: Parser Keyword
keyword = (string "WHERE"  >> return KW_Where)
      <|> (string "FROM"   >> return KW_From)
      <|> (string "SELECT" >> return KW_Select)

-- identifierStart / identifierPart are whatever character parsers you use,
-- e.g. letter and alphaNum. Note the try wraps only the keyword check, so
-- that a consumed failure stops <|> from falling through to the identifier.
identifier :: Parser String
identifier = (try keyword >>= \kw -> fail $ "Expected identifier; got: " ++ show kw)
         <|> liftA2 (:) identifierStart (many identifierPart)
If two (or more) of your keywords have common prefixes, you'll want to combine them for more efficiency (less backtracking), like:
keyword :: Parser Keyword
keyword = (char 'D' >> (   (string "ROP"   >> return KW_Drop)
                       <|> (string "ELETE" >> return KW_Delete)))
      <|> (string "INSERT" >> return KW_Insert)