Opening and closing AutoCAD files using SCR scripting - scripting

I have recently started writing scripts for AutoCAD.
I want to do the following:
Suppose I place my script in the current location.
In the current location, I have several folders. Each folder in turn contains more folders, which again contain some drawing files (of type .DWG and .DWT).
I want to loop through each of the folders and get a list containing only the .DWT files.
Now I want to loop through each .DWT file, open it in AutoCAD, change the value of the system variable "DELOBJ" to 1 (say), and finally save and close the document.
Can we do it with normal SCR scripting, or can we do it using a LISP command?
I would be really glad if someone could help me in this context.
Thanks in advance.

Today I don't have enough time to prepare a full sample (sorry), but let's start with:
Get the list of all *.DWT files. You may do it like this:
(vl-load-com)
(defun CMD::Dir ( pattern / cmd Shell Dirinf Outbuf CmdVal )
  (setq cmd (strcat "%comspec% /C dir /S /B " pattern))
  (print cmd)
  (setq Shell  (vlax-get-or-create-object "Wscript.Shell"))
  (setq Dirinf (vlax-invoke-method Shell 'Exec cmd))
  (setq Outbuf (vlax-get-property Dirinf 'StdOut))
  (while (= :vlax-false (vlax-get-property Outbuf 'AtEndOfStream))
    (setq CmdVal (append CmdVal (list (vlax-invoke-method Outbuf 'ReadLine))))
  )
  (vlax-release-object Shell)
  CmdVal
)
(setq files (CMD::Dir "**YourPath**\\*.dwt" ) )
Then, using (foreach file files ...), open each drawing and set the value of DELOBJ. But remember that the LISP context lives only in the active drawing, so you can't just call (setvar 'DELOBJ 1) for the other files.
You can probably do it via vlax, but I don't have time to help with that right now. When I have a sample I will update.

The 'DELOBJ' system variable is saved in the registry, so it has nothing to do with any document (indeed, some sysvars are saved in the drawing, but this one isn't, so you only need to focus on the registry).
refer:
ADSK Knowledge Network
So you set it once per profile (a simple .reg file would be enough)
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\SOFTWARE\Autodesk\AutoCAD\[Release]\[Product]\Profiles\[Profile]]
"Delobj"=dword:00000001

You can pull the value from the registry with this code:
(vl-registry-read
  (strcat "HKEY_CURRENT_USER\\"
          (vlax-product-key)
          "\\Profiles\\"
          (vla-get-ActiveProfile
            (vla-get-profiles
              (vla-get-preferences (vlax-get-Acad-Object))))
          "\\General")
  "Delobj")
Check whether it is 1; if not, set it with vl-registry-write, as in the sketch below.
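A minimal, untested sketch of that check. It assumes the same "General" subkey and "Delobj" value name shown above, and the function name is just an example:
(vl-load-com)
(defun FIX-DELOBJ ( / key val )
  ;; Build the registry path of the active profile's General key,
  ;; the same key used in the snippets above.
  (setq key (strcat "HKEY_CURRENT_USER\\"
                    (vlax-product-key)
                    "\\Profiles\\"
                    (vla-get-ActiveProfile
                      (vla-get-profiles
                        (vla-get-preferences (vlax-get-Acad-Object))))
                    "\\General"))
  ;; Read the stored value and rewrite it only if it is not already 1.
  (setq val (vl-registry-read key "Delobj"))
  (if (not (equal val 1))
    (vl-registry-write key "Delobj" 1)
  )
  (princ)
)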

Related

Emacs call-process stumbles upon supplied args content

I'm puzzled by unexpected behavior of a call-process call.
Here's a little function:
(defun work-in-progress ()
  "run ledger report and output result in current buffer.
The intention is to run following command (works perfectly from shell):
`ledger -U csv --sort date --file ledger-file child`
where ledger-file is a known and defined variable.
The problem is that the command fails with:
`Error: Illegal option -`
"
  (interactive)
  (cd ledger-dir)
  (setq child "peter")
  (setq parms "-U csv --sort date --file")
  (setq prog "ledger") ; should be found via path variable or so
  (call-process prog nil t t parms ledger-file child))
I've played around with the sequence of the ledger command options, but emacs always seems to complain about the first option or all options in the parms variable:
e.g.
(setq parms "--sort date -U csv --file")
results in
Error: Illegal option --sort date -U csv --file
instead of
Error: Illegal option -
The ledger cli program isn't fussy about arguments sequence, both described option sequences work perfectly well when calling ledger at the command line.
This truly puzzles me. The documentation reads
call-process program &optional infile destination display &rest args
and infile is set to nil, destination as well as display are t, so why doesn't it grok the content of the args variable?
Any help, correction and/or suggestion would be sincerely appreciated!
The tail of the list should be a sequence of strings, each corresponding to one argument.
(setq parms '("-U" "csv" "--sort" "date" "--file"))
(setq prog "ledger")
(apply #'call-process prog nil t t (append parms (list ledger-file child)))
You need apply to make the result of append into a continuation of the static list of arguments to call-process.
I had to guess what ledger-file and child are; if they are not strings, you need to convert them to strings.
To briefly recapitulate how the shell parses arguments, the command line
echo foo "bar $baz" 'quux $baz'
gets turned into the string array
'("echo" "foo" "bar <<value of baz>>" "quux $baz")
(to use a Lispy notation) and passed as arguments to execlp.
The solution is to pass each parameter as a separate string:
(defun work-in-progress ()
  "run ledger report and output result in current buffer. "
  (interactive)
  (cd ledger-dir)
  (setq child "peter")
  (setq prog "ledger") ; should be found via path variable or so
  (call-process prog nil t t "--sort" "date" "-U" "csv" "--file" ledger-file child))

IDL batch processing: fully automatic input selection

I need to process MODIS ocean level 2 data and I obtained an external plugin for ENVI: https://github.com/dawhite/EPOC/releases. Now I want to batch process hundreds of images, for which I modified the code as shown below. The code runs fine, but I have to select the input file every time. Can anyone please help me make the program fully automatic? I really appreciate it, and thanks a lot for your help!
Pro OCL2convert
  dir = 'C:\MODIS\'
  CD, dir
  ; batch processing of level 2 ocean chlorophyll data
  files = file_search('*.L2_LAC_OC.x.hdf', count=numfiles)
  ; this command will search for all files in the directory which end with
  ; the specified one
  counter = 0
  ; this is a counter that tells IDL which file is being read - starts at 0
  While (counter LT numfiles) Do begin
    ; this command tells IDL to start a loop and to only finish when the counter
    ; is equal to the number of files with the name specified
    name = files(counter)
    openr, 1, name
    proj = envi_proj_create(/utm, zone=40, datum='WGS-84')
    ps = [1000.0d, 1000.0d]
    no_bowtie = 0 ;same as not setting the keyword
    no_msg = 1 ;same as setting the keyword
    ;OUTPUT CHOICES
    ;0 -> standard product only
    ;1 -> georeferenced product only
    ;2 -> standard and georeferenced products
    output_choice = 2
    ;RETURNED VALUES
    ;r_fid -> ENVI FID for the standard product, if requested
    ;georef_fid -> ENVI FID for the georeferenced product, if requested
    convert_oc_l2_data, fname=fname, output_path=output_path, $
      proj=proj, ps=ps, output_choice=output_choice, r_fid=r_fid, $
      georef_fid=georef_fid, no_bowtie=no_bowtie, no_msg=no_msg
    print, 'done!'
    close, 1
    counter = counter + 1
  Endwhile
End
Not knowing exactly what convert_oc_l2_data does (it appears to come from the EPOC plugin you linked; I could not find documentation for its keywords), I would say that the problem might be that the output_path variable is never defined anywhere in your program. The same goes for fname: you pass fname=fname, but fname is never assigned, which is most likely why you are prompted for an input file on every pass. A sketch of the corrected call follows.
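A minimal, untested sketch of how the call inside the loop could look; the output directory here is only an assumed example, so point output_path wherever you want the results:
name = files(counter)              ; already set at the top of your loop
output_path = 'C:\MODIS\output\'   ; assumed output folder - adjust to taste
convert_oc_l2_data, fname=name, output_path=output_path, $
  proj=proj, ps=ps, output_choice=output_choice, r_fid=r_fid, $
  georef_fid=georef_fid, no_bowtie=no_bowtie, no_msg=no_msg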

dcl assignment from a command

I am new to DCL.
I want to get the output of a command into a variable and iterate over the result one item at a time.
filePath=dir /since="time_now" [.SUBDIR]*.PNG/noheader/notrail
That's just not how we roll with DCL.
We don't do pipes. Well, we do, but not really.
DIR/SINCE=NOW ... will not give anything by definition, since nothing exists since now.
Use /OUT to stick the directory output into a file, and then read and parse it (F$PARSE and/or F$ELEMENT and/or F$LOCATE); see the sketch below.
Check out HELP OPEN; HELP READ [/END]; HELP LEXICAL
Google for examples.
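For example, a rough, untested sketch of that approach (the temp file name, the LIST logical name, and the labels are all made up):
$! Capture the directory listing in a temp file, then read it back
$! one record at a time and pull the pieces apart with F$PARSE.
$ DIRECTORY /SINCE=YESTERDAY /NOHEADING /NOTRAILING /OUTPUT=PNGLIST.TMP [.SUBDIR]*.PNG
$ OPEN /READ LIST PNGLIST.TMP
$ LOOP:
$   READ /END_OF_FILE=DONE LIST FILE
$   NAME = F$PARSE(FILE,,,"NAME")
$   WRITE SYS$OUTPUT "Found: ''NAME'"
$   GOTO LOOP
$ DONE:
$ CLOSE LIST
$ DELETE PNGLIST.TMP;*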
More advanced DCL scripts use F$PARSE, F$SEARCH and F$FILE(file,CDT) to avoid activating images and creating temp files: $ HELP LEXICAL
Google for examples.
Check out yesterday's Stack Exchange entry: OpenVMS - DELETE Line if TEXT like x
But if you are just starting... IMHO just skip DCL and stick to Perl:
$ perl -e "for (<[.SUBDIR]*.PNG>) { next unless -M > 0.123; print; ... }"
Good luck!
Hein
$ top:
$   file = f$search("[.subdir]*.PNG")
$   if (file .eqs. "") then goto cont
$   mtime = f$file_attributes(file,"RDT")
$   if mtime .ges. build_start_time
$   then
$       name = f$parse(file,,,"NAME")
$       call CHECK "''name'"
$   endif
$   goto top
$ cont:
$! Hein, please review this code and suggest changes

How to pipe visually selected text to a UNIX command and append output to current buffer in Vim

Using Vim, I'm trying to pipe text selected in visual mode to a UNIX command and have the output appended to the end of the current file. For example, say we have a SQL command such as:
SELECT * FROM mytable;
I want to do something like the following:
<ESC>
V " select text
:'<,'>!mysql -uuser -ppass mydb
But instead of having the output overwrite the currently selected text, I would like to have the output appended to the end of the file. You probably see where this is going. I'm working on using Vim as a simple SQL editor. That way, I don't have to leave Vim to edit, tweak, test SQL code.
How about copying the selected text to the end of the file, selecting the copy, and running the command on it? If you do not want to repeat the same commands over and over again, you can record the sequence with q or add a new command. I have tried the latter as follows:
:com -range C <line1>,<line2>yank | $ | put | .,$ !rev
With it you can select some lines and then type :C. This will first yank the selection, then go to the end of the file, paste the yanked text and run the command (rev in this case) over the new text.
If you prefer more programmatic approach, you can have
:call append(line("$"), system("command", GetSelectedText()))
where GetSelectedText is the reusable function:
func! GetSelectedText()
  normal gv"xy
  let result = getreg("x")
  normal gv
  return result
endfunc
Try
:r !YourCommand
For example:
:r ! echo foo
adds foo to your buffer.
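If you want the output to land at the end of the buffer no matter where the cursor is, anchor the range to the last line (same toy example as above):
:$r ! echo foo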

Nano hacks: most useful tiny programs you've coded or come across

It's the first great virtue of programmers. All of us have, at one time or another, automated a task with a bit of throw-away code. Sometimes it takes a couple of seconds tapping out a one-liner; sometimes we spend an exorbitant amount of time automating away a two-second task and then never use it again.
What tiny hack have you found useful enough to reuse? To go so far as to make an alias for?
Note: before answering, please check to make sure it's not already on favourite command-line tricks using BASH or perl/ruby one-liner questions.
I found this on dotfiles.org just today. It's very simple, but clever. I felt stupid for not having thought of it myself.
###
### Handy Extract Program
###
extract () {
  if [ -f "$1" ] ; then
    case "$1" in
      *.tar.bz2) tar xvjf "$1" ;;
      *.tar.gz)  tar xvzf "$1" ;;
      *.bz2)     bunzip2 "$1" ;;
      *.rar)     unrar x "$1" ;;
      *.gz)      gunzip "$1" ;;
      *.tar)     tar xvf "$1" ;;
      *.tbz2)    tar xvjf "$1" ;;
      *.tgz)     tar xvzf "$1" ;;
      *.zip)     unzip "$1" ;;
      *.Z)       uncompress "$1" ;;
      *.7z)      7z x "$1" ;;
      *)         echo "'$1' cannot be extracted via >extract<" ;;
    esac
  else
    echo "'$1' is not a valid file"
  fi
}
Here's a filter that puts commas in the middle of any large numbers in standard input.
$ cat ~/bin/comma
#!/usr/bin/perl -p
s/(\d{4,})/commify($1)/ge;

sub commify {
    local $_ = shift;
    1 while s/^([ -+]?\d+)(\d{3})/$1,$2/;
    return $_;
}
I usually wind up using it for long output lists of big numbers, and I tire of counting decimal places. Now instead of seeing
-rw-r--r-- 1 alester alester 2244487404 Oct 6 15:38 listdetail.sql
I can run that as ls -l | comma and see
-rw-r--r-- 1 alester alester 2,244,487,404 Oct 6 15:38 listdetail.sql
This script saved my career!
Quite a few years ago, I was working remotely on a client database. I updated a shipment to change its status. But I forgot the WHERE clause.
I'll never forget the feeling in the pit of my stomach when I saw (6834 rows affected). I basically spent the entire night going through event logs and figuring out the proper status on all those shipments. Crap!
So I wrote a script (originally in awk) that would start a transaction for any updates, and check the rows affected before committing. This prevented any surprises.
So now I never do updates from command line without going through a script like this. Here it is (now in Python):
import sys
import subprocess as sp

pgm = "isql"
if len(sys.argv) == 1:
    print "Usage: \nsql sql-string [rows-affected]"
    sys.exit()
sql_str = sys.argv[1].upper()
max_rows_affected = 3
if len(sys.argv) > 2:
    max_rows_affected = int(sys.argv[2])
if sql_str.startswith("UPDATE"):
    sql_str = "BEGIN TRANSACTION\\n" + sql_str
    p1 = sp.Popen([pgm, sql_str], stdout=sp.PIPE, shell=True)
    (stdout, stderr) = p1.communicate()
    print stdout
    # example -> (33 rows affected)
    affected = stdout.splitlines()[-1]
    affected = affected.split()[0].lstrip('(')
    num_affected = int(affected)
    if num_affected > max_rows_affected:
        print "WARNING! ", num_affected, "rows were affected, rolling back..."
        sql_str = "ROLLBACK TRANSACTION"
        ret_code = sp.call([pgm, sql_str], shell=True)
    else:
        sql_str = "COMMIT TRANSACTION"
        ret_code = sp.call([pgm, sql_str], shell=True)
else:
    ret_code = sp.call([pgm, sql_str], shell=True)
I use this script under assorted linuxes to check whether a directory copy between machines (or to CD/DVD) worked or whether copying (e.g. ext3 utf8 filenames -> fuseblk) has mangled special characters in the filenames.
#!/bin/bash
## dsum Do checksums recursively over a directory.
## Typical usage: dsum <directory> > outfile
export LC_ALL=C # Optional - use sort order across different locales
if [ $# != 1 ]; then echo "Usage: ${0/*\//} <directory>" 1>&2; exit; fi
cd $1 1>&2 || exit
#findargs=-follow # Uncomment to follow symbolic links
find . $findargs -type f | sort | xargs -d'\n' cksum
Sorry, don't have the exact code handy, but I coded a regular expression for searching source code in VS.Net that allowed me to search anything not in comments. It came in very useful in a particular project I was working on, where people insisted that commenting out code was good practice, in case you wanted to go back and see what the code used to do.
I have two Ruby scripts that I modify regularly to download all of various webcomics. Extremely handy! Note: they require wget, so probably Linux. Note 2: read these before you try them; they need a little bit of modification for each site.
Date based downloader:
#!/usr/bin/ruby -w
Day = 60 * 60 * 24
Fromat = "hjlsdahjsd/comics/st%Y%m%d.gif"
t = Time.local(2005, 2, 5)
MWF = [1,3,5]
until t == Time.local(2007, 7, 9)
  if MWF.include? t.wday
    `wget #{t.strftime(Fromat)}`
    sleep 3
  end
  t += Day
end
Or you can use the number based one:
#!/usr/bin/ruby -w
Fromat = "http://fdsafdsa/comics/%08d.gif"
1.upto(986) do |i|
  `wget #{sprintf(Fromat, i)}`
  sleep 1
end
Instead of having to repeatedly open files in SQL Query Analyser and run them, I found the syntax needed to make a batch file, and could then run 100 at once. Oh the sweet sweet joy! I've used this ever since.
isqlw -S servername -d dbname -E -i F:\blah\whatever.sql -o F:\results.txt
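The batch file itself was essentially that one-liner wrapped in a FOR loop. A rough, untested sketch, reusing the placeholder server, database, and paths from the line above:
@echo off
rem Run every .sql file in the folder through isqlw, one results file per script.
for %%f in (F:\blah\*.sql) do (
    isqlw -S servername -d dbname -E -i "%%f" -o "F:\results\%%~nf.txt"
)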
This goes back to my COBOL days, but I had two generic COBOL programs, one batch and one online (mainframe folks will know what these are). They were shells of a program that could take any set of parameters and/or files and be run in batch or executed in an IMS test region. I had them set up so that, depending on the parameters, I could access files or databases (DB2 or IMS DB), or just manipulate working storage or whatever.
It was great because I could test that date function without guessing, or test why there was truncation, or why there was a database ABEND. The programs grew in size as time went on to include all sorts of tests and became a staple of the development group. Everyone knew where the code resided and included the programs in their unit testing as well. Those programs got so large (most of the code was commented-out tests), and it was all contributed by people through the years. They saved so much time and settled so many disagreements!
I coded a Perl script to map dependencies, without going into an endless loop, for a legacy C program I inherited... one that also had a diamond dependency problem.
I wrote a small program that e-mailed me when I received e-mails from friends on a rarely used e-mail account.
I wrote another small program that sent me text messages if my home IP changed.
To name a few.
Years ago I built a suite of applications on a custom web application platform in Perl.
One cool feature was to convert SQL query strings into human readable sentences that described what the results were.
The code was relatively short but the end effect was nice.
I've got a little app that you run and it dumps a GUID into the clipboard. You can run it with /noui or not. With the UI, it's a single button that drops a new GUID every time you click it. Without it, it drops a new one and then exits.
I mostly use it from within VS. I have it set up as an external tool and mapped to a shortcut. I'm writing an app that relies heavily on XAML and GUIDs, so I always find I need to paste a new GUID into the XAML...
Any time I write a clever list comprehension or use of map/reduce in Python. There was one like this:
if reduce(lambda x, c: x and locks[c], locknames, True):
    print "Sub-threads terminated!"
The reason I remember that is that I came up with it myself, then saw the exact same code on somebody else's website. Nowadays it'd probably be done like:
if all(map(lambda z: locks[z], locknames)):
    print "ya trik"
I've got 20 or 30 of these things lying around, because once I coded up the framework for my standard console app in Windows, I can pretty much drop in any logic I want, so I've got a lot of these little things that solve specific problems.
I guess the one I'm using a lot right now is a console app that takes stdin and colorizes the output based on XML profiles that map regular expressions to colors. I use it for watching my log files from builds. The other one is a command-line launcher, so I don't pollute my PATH env var (it would exceed the limit on some systems anyway, namely Win2k).
I'm constantly connecting to various Linux servers from my own desktop throughout my workday, so I created a few aliases that will launch an xterm on those machines and set the title, background color, and other tweaks:
alias x="xterm" # local
alias xd="ssh -Xf me@development_host xterm -bg aliceblue -ls -sb -bc -geometry 100x30 -title Development"
alias xp="ssh -Xf me@production_host xterm -bg thistle1 ..."
I have a bunch of servers I frequently connect to, as well, but they're all on my local network. This Ruby script prints out the command to create aliases for any machine with ssh open:
#!/usr/bin/env ruby
require 'rubygems'
require 'dnssd'
handle = DNSSD.browse('_ssh._tcp') do |reply|
  print "alias #{reply.name}='ssh #{reply.name}.#{reply.domain}';"
end
sleep 1
handle.stop
Use it like this in your .bash_profile:
eval `ruby ~/.alias_shares`