lappend doesn't work inside a proc - Tcl scripting

I want to get the list of included files in HDL Designer with a Tcl script. I found an example that printed the list of files into a log and changed it: I added a global variable filenames and append all filenames to it, but when the proc executes my variable is empty.
proc walkDependencies {decl} {
    global alreadyDone filenames
    # Only look at each declaration once.
    if {[info exists alreadyDone($decl)]} {
        return
    }
    set alreadyDone($decl) 1
    # Only report each file once.
    set declFile [$decl file]
    if {[info exists alreadyDone($declFile)]} {
        set reportFile 0
    } else {
        set reportFile 1
        set alreadyDone($declFile) 1
    }
    foreach pkg [$decl packages] {
        walkDependencies $pkg
    }
    if {[$decl configure class] eq "architecture"} {
        walkDependencies [$decl entity]
        foreach inst [$decl instances] {
            if {![catch {$inst child} child]} {
                walkDependencies $child
            }
        }
    }
    set file [$decl file]
    set fileType [$file configure type]
    if {![regexp {Text$} $fileType]} {
        if {[lsearch {symbol blockInterface} $fileType] != -1} {
            # This assumes ent+arch are generated to a single file.
            set reportFile 0
        } else {
            set file [$file generated]
        }
    }
    if {$reportFile} {
        set lib [$file library]
        # Exclude standard and downstreamOnly libraries.
        if {[$lib configure type] eq "regular"} {
            # puts "[$lib configure hardHdlDir]/[$file configure relativePathname]"
            set tmp "[$lib configure hardHdlDir]/[$file configure relativePathname]"
            lappend $filenames $tmp
        }
    }
    if {[$decl configure class] eq "packageHeader"} {
        walkDependencies [$decl body]
    }
}
set filenames {}
catch {unset alreadyDone}
set lib [library open Travers_lib]
walkDependencies [$lib declaration Travers_top struct]
foreach i $filenames {
    puts $i
}
If I uncomment the line # puts "[$lib configure hardHdlDir]/[$file configure relativePathname]", all included files are printed into the log.

Your problem is most likely that you are using
lappend $filenames $tmp
which means that you are appending the value of tmp to a local variable whose name is equal to the value of filenames. Try
lappend filenames $tmp
instead.
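To illustrate the difference, here is a minimal sketch, independent of HDL Designer: lappend takes the name of a list variable, whereas lappend $filenames substitutes the value of filenames first and appends to whatever variable that value names.

```tcl
set filenames {}

# Right: lappend takes the NAME of the list variable.
lappend filenames "a.vhd"
lappend filenames "b.vhd"
puts $filenames             ;# prints: a.vhd b.vhd

# Wrong pattern: $indirect is substituted first, so lappend appends to the
# variable whose NAME equals the current value of $indirect.
set indirect "filenames"
lappend $indirect "c.vhd"   ;# reaches filenames only because $indirect is "filenames"
puts $filenames             ;# prints: a.vhd b.vhd c.vhd
```

In the original script filenames is the empty string, so lappend $filenames appends to a variable with an empty name, and the real filenames list stays empty.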
Documentation:
lappend

Related

How should I bundle a library of text files with my module?

I have the following structure in the resources directory in a module I'm building:
resources
|-- examples
    |-- Arrays
    |   |-- file
    |-- Lists
        |-- file1
        |-- file2
I have the following code to collect and process these files:
use v6.d;

unit module Doc::Examples::Resources;

class Resource {
    has Str $.name;
    has Resource @.resources;
    has Resource %.resource-index;

    method resource-names() {
        @.resources>>.name.sort
    }

    method list-resources() {
        self.resource-names>>.say;
    }

    method is-resource(Str:D $lesson) {
        $lesson ~~ any self.resource-names;
    }

    method get-resource(Str:D $lesson) {
        if !self.is-resource($lesson) {
            say "Sorry, that lesson does not exist.";
            return;
        }
        return %.resource-index{$lesson};
    }
}

class Lesson is Resource {
    use Doc::Parser;
    use Doc::Subroutines;

    has IO $.file;

    method new(IO:D :$file) {
        my $name = $file.basename;
        self.bless(:$name, :$file)
    }

    method parse() {
        my @parsed = parse-file $.file.path;
        die "Failed parse examples from $.file" if @parsed.^name eq 'Any';
        for @parsed -> $section {
            my $heading = $section<meta>[0] || '';
            my $intro = $section<meta>[1] || '';
            say $heading.uc ~ "\n" if $heading && !$intro;
            say $heading.uc if $heading && $intro;
            say $intro ~ "\n" if $intro;
            for $section<code>.Array {
                die "Failed parse examples from $.file, check it's syntax." if $_.^name eq 'Any';
                das |$_>>.trim;
            }
        }
    }
}

class Topic is Resource {
    method new(IO:D :$dir) {
        my $files = dir $?DISTRIBUTION.content("$dir");
        my @lessons;
        my $name = $dir.basename;
        my %lesson-index;
        for $files.Array -> $file {
            my $lesson = Lesson.new(:$file);
            push @lessons, $lesson;
            %lesson-index{$lesson.name} = $lesson;
        }
        self.bless(:$name, resources => @lessons, resource-index => %lesson-index);
    }
}

class LocalResources is Resource is export {
    method new() {
        my $dirs = dir $?DISTRIBUTION.content('resources/examples');
        my @resources;
        my %resource-index;
        for $dirs.Array -> $dir {
            my $t = Topic.new(:$dir);
            push @resources, $t;
            %resource-index{$t.name} = $t;
        }
        self.bless(:@resources, :%resource-index)
    }

    method list-lessons(Str:D $topic) {
        self.get-resource($topic).list-lessons;
    }

    method parse-lesson(Str:D $topic, Str:D $lesson) {
        self.get-resource($topic).get-resource($lesson).parse;
    }
}
It works. However, I'm told that this is not reliable and that there is no guarantee that lines like my $files = dir $?DISTRIBUTION.content("$dir"); will work after the module is installed or will continue to work in the future.
So what are better options for bundling a library of text files with my module that can be accessed and found by the module?
Files under the resources directory will always be available as keys to the %?RESOURCES compile-time variable if you declare them in the META6.json file this way:
"resources": [
"examples/Array/file",
]
and so on.
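A minimal sketch of reading such a declared resource through %?RESOURCES (the key examples/Arrays/file mirrors the tree above; adjust it to your actual META6.json entries):

```raku
unit module Doc::Examples::Resources;

# %?RESOURCES maps each key declared under "resources" in META6.json to a
# distribution-aware handle that keeps working after installation.
sub example-text(--> Str) is export {
    %?RESOURCES<examples/Arrays/file>.slurp
}
```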
I've settled on a solution. As pointed out by jjmerelo, the META6.json file contains a list of resources and, if you use the Comma IDE, the list of resources is automatically generated for you.
From within the module's code, the list of resources can be accessed via the $?DISTRIBUTION variable like so:
my @resources = $?DISTRIBUTION.meta<resources>
From here, I can build up my list of resources.
One note on something I discovered: the $?DISTRIBUTION variable is not accessible from a test script. It has to be placed inside a module in the lib directory of the distribution and exported.

Perl6: How to find all installed modules whose filename matches a pattern?

Is it possible in Perl6 to find all installed modules whose file-name matches a pattern?
In Perl5 I would write it like this:
use File::Spec::Functions qw( catfile );

my %installed;
for my $dir ( @INC ) {
    my $glob_pattern = catfile $dir, 'App', 'DBBrowser', 'DB', '*.pm';
    map { $installed{$_}++ } glob $glob_pattern;
}
There is currently no way to get the original file name of an installed module. However, it is possible to get the module names:
sub list-installed {
    my @curs       = $*REPO.repo-chain.grep(*.?prefix.?e);
    my @repo-dirs  = @curs>>.prefix;
    my @dist-dirs  = |@repo-dirs.map(*.child('dist')).grep(*.e);
    my @dist-files = |@dist-dirs.map(*.IO.dir.grep(*.IO.f).Slip);
    my $dists := gather for @dist-files -> $file {
        if try { Distribution.new( |%(from-json($file.IO.slurp)) ) } -> $dist {
            my $cur = @curs.first: {.prefix eq $file.parent.parent}
            take $_ for $dist.hash<provides>.keys;
        }
    }
}
.say for list-installed();
see: Zef::Client.list-installed()

Losing time with displays and printed content when cleaning XML files

I'm trying to create a PowerShell routine to clean XML files automatically. I have successfully created my routine, and I'm able to clean a file with different functions and scripts. But I want to launch my PowerShell routine every time I have a new XML file, so I've decided to add a system that deals with every file in a directory.
Now that I'm calling my routine to clean my XML files, even though I don't use Write-Host, it displays rows when I launch the routine, and I'm losing a lot of time clearing the XML files.
Here is my code:
param ([string] $sourceDirectory, [string] $targetDirectory, [string] $XSDFileName, [string] $dataSourceName, [string] $databaseName)

clear

function clearLocalVariables {
    # This function clears my local variables
}

function createSQLNodesList {
    param ([string] $dataSourceName, [string] $databaseName)
    # This function creates a list of available and allowed nodes in my XML files from SQL databases.
}
The following functions are used to check my nodes, and this is where the prints and Write-Host output appear when the routine is launched more than once:
function isNodeNameValid {
    param ([string] $testedNodeName)
    # This function is used to return the value of the nodeAnalysis function.
    # It selects which list the node will be analysed against, depending on
    # whether it is a node for the aspect of the XML or for data.
    # - $testedNodeName is a string representing the XML node analysed.
    # If the node name is a 5-character string, begins with an A, and is
    # followed by 4 digits ('AXXXX'), then it is data.
    if (($testedNodeName.Length -eq 5) -and ($testedNodeName.Substring(0,1) -eq "A") -and ($testedNodeName.Substring(1,4) -match "^[-]?[0-9.]+$")) {
        return nodeAnalysis -nodesList $nodesSQL -testedNodeName $testedNodeName
    # Else, it is in the list for the aspect of the XML.
    } else {
        return nodeAnalysis -nodesList $nodesXML -testedNodeName $testedNodeName
    }
}

function nodeAnalysis {
    param ($nodesList, [string] $testedNodeName)
    # This function is used to analyse each node name given.
    # It compares the name analysed to each node in the array given as parameter.
    # - $nodesList is the corresponding array depending on the isNodeNameValid() method.
    # - $testedNodeName is a string representing the XML node analysed.
    # We compare each node of the node array to the testedNodeName. If the
    # testedNodeName is in this array, the method returns 1.
    foreach ($nodeName in $nodesList) {
        if ($testedNodeName -eq $nodeName) {
            return 1
        }
    }
    # If the node does not correspond to any node of the list, the method returns 0.
    return 0
}
# -- XML Nodes recursive cleaning method -- #
function cleanXMLContent {
    param ($XMLDoc, [int] $endOfLeaf, [int] $boucle)
    # This is the function I have trouble with (displays and efficiency):
    while ($endOfFile -ne 1) {
        if ($endOfLeaf -eq 1) {
            if ($XMLDoc.Name -eq "#document") {
                $endOfFile = 1
            }
            if ($XMLDoc.NextSibling) {
                $XMLDoc = $XMLDoc.NextSibling
                $endOfLeaf = 0
            } else {
                $XMLDoc = $XMLDoc.ParentNode
                $endOfLeaf = 1
            }
        } else {
            if (!(isNodeNameValid -testedNodeName $XMLDoc.Name)) {
                if ($XMLDoc.PreviousSibling) {
                    $nodeNameToDelete = $XMLDoc.Name
                    $siblingNodeName = $XMLDoc.PreviousSibling.Name
                    $XMLDoc = $XMLDoc.ParentNode
                    $XMLDoc.RemoveChild($XMLDoc.SelectSingleNode($nodeNameToDelete))
                    $XMLDoc = $XMLDoc.SelectSingleNode($siblingNodeName)
                } else {
                    $nodeNameToDelete = $XMLDoc.Name
                    $XMLDoc = $XMLDoc.ParentNode
                    $XMLDoc.RemoveChild($XMLDoc.SelectSingleNode($nodeNameToDelete))
                }
            } else {
                if ($XMLDoc.HasChildNodes) {
                    $XMLDoc = $XMLDoc.FirstChild
                    $endOfLeaf = 0
                } else {
                    if ($XMLDoc.NextSibling) {
                        $XMLDoc = $XMLDoc.NextSibling
                        $endOfLeaf = 0
                    } else {
                        if ($XMLDoc.ParentNode) {
                            $XMLDoc = $XMLDoc.ParentNode
                            if ($XMLDoc.NextSibling) {
                                $endOfLeaf = 1
                            } else {
                                $XMLDoc = $XMLDoc.ParentNode
                                $endOfLeaf = 1
                            }
                        }
                    }
                }
            }
        }
    }
    Write-Host "- Cleaning XML Nodes OK" -ForegroundColor Green
}
function createXSDSchema {
    param ([string] $XSDFileName)
    # This function is used to create the corresponding XSD file
}

function cleanFile {
    param ([string] $fileName, [string] $source, [string] $target, [string] $XSDFileName, [string] $dataSourceName, [string] $databaseName)
    # -- Opening XML File -- #
    # Creation of the XML Document iteration path
    $date = Get-Date
    [string] $stringDate = ($date.Year*10000 + $date.Month*100 + $date.Day) * 1000000 + ($date.Hour * 10000 + $date.Minute * 100 + $date.Second)
    $date = $stringDate.substring(0,8) + "_" + $stringDate.substring(8,6)
    # Determining the path of the source and the target files.
    $XMLDocPath = $source + $fileName
    $XMLFutureFileNamePreWork = $fileName.Substring(0, $fileName.Length - 4)
    $XMLFuturePath = $target + $XMLFutureFileNamePreWork + "cleaned" #_"+$date
    # Creation of the XML Document
    $XMLDoc = New-Object System.Xml.XmlDocument
    $XMLFile = Resolve-Path($XMLDocPath)
    # Loading of the XML File
    $XMLDoc.Load($XMLFile)
    [XML] $XMLDoc = Get-Content -Path $XMLDocPath
    # If the XML Document exists, then we clean it.
    if ($XMLDoc.HasChildNodes) {
        # The XML Document is cleaned.
        cleanXMLContent $XMLDoc.FirstChild -endOfLeaf 0
        Write-Host "- XML Cleaned" -ForegroundColor Green
        # If it is a success, then we save it in a new file.
        #if($AnalysisFinished -eq 1) {
        # Modifying the XSD Attribute
        # Setting the XSD name into the XML file
        createXSDSchema -XSDFileName $XSDFileName
        # Creation of the XML Document
        $XMLDoc.Save($XMLFuturePath + ".xml")
        Write-Host "- Creation of the new XML File Successfull at "$XMLFuturePath -ForegroundColor Green
        # Creation of the XSD Corresponding Document
        #createXSDSchema -XMLPath $XMLFuturePath
        #}
    } else {
        Write-Host "Impossible"
    }
}
Here I'm executing the whole process with the different functions. When I launch each function separately it works, but with many files it displays content and I lose a lot of time:
cd $sourceDirectory
$files = Get-ChildItem $sourceDirectory

# -- Local Variables Cleaning -- #
clearLocalVariables
Write-Host "- Variable cleaning successfull" -ForegroundColor Green

# -- SQL Connection -- #
$nodesSQL = createSQLNodesList -dataSourceName $dataSourceName -databaseName $databaseName

foreach ($file in $files) {
    cleanFile -fileName $file -source $sourceDirectory -target $targetDirectory -XSDFileName $XSDFileName -dataSourceName $dataSourceName -databaseName $databaseName
}
Do you have any idea how to avoid these displays of the content?
I get a lot of blank rows, which multiplies the cleaning time by 10 or 15.
First, refrain from loading XML files twice. Use either
$XMLDoc = New-Object System.Xml.XmlDocument
$XMLDoc.Load($XMLFile)
or
[xml]$XMLDoc = Get-Content -Path $XMLFile
They both do the same thing.
Next, replace the iterative XML traversal with a recursive one:
function Clean-XmlContent {
    Param(
        [Parameter(Mandatory=$true)]
        [Xml.XmlElement]$Node
    )
    if ($Node.HasChildNodes) {
        foreach ($child in $Node.ChildNodes) {
            if ($child -is [Xml.XmlElement]) { Clean-XmlContent $child }
        }
    }
    if (-not (Test-NodeName -NodeName $Node.LocalName)) {
        $Node.ParentNode.RemoveChild($Node)
    }
}
and call it with the XML root node:
Clean-XmlContent $XMLDoc.DocumentElement
Also, simplify the node name validation:
function Test-NodeName {
    Param(
        [string]$NodeName
    )
    if ($NodeName -match '^A\d{4}$') {
        return ($nodesSQL -contains $NodeName)
    } else {
        return ($nodesXML -contains $NodeName)
    }
}
That should speed up things considerably.
Thanks to Ansgar Wiechers, I have found a way to accelerate my code: I rewrote it recursively, and this way it is much faster. But the content of the deleted rows was still printed.
To avoid having the content of the deleted nodes printed on screen, I had to use:
[void]$RemovedNode.ParentNode.RemoveChild($RemovedNode)
Instead of :
$RemovedNode.ParentNode.RemoveChild($RemovedNode)
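The reason: RemoveChild returns the removed XmlNode, and any unconsumed value at the end of a PowerShell statement is written to the output stream. Any of these sketches suppresses it:

```powershell
# Three equivalent ways to discard the node returned by RemoveChild:
[void]$RemovedNode.ParentNode.RemoveChild($RemovedNode)
$null = $RemovedNode.ParentNode.RemoveChild($RemovedNode)
$RemovedNode.ParentNode.RemoveChild($RemovedNode) | Out-Null
```

Assigning to $null is usually the cheapest of the three; piping to Out-Null is the slowest.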

issue accessing lexical scope using B

For debugging purposes I'd like to access the lexical scope of different subroutines that have a specific attribute set. That works fine. The problem is when a variable stores a string: then I get an empty string. I do something like this:
$pad = $cv->PADLIST;                  # $cv is the coderef to the sub
@scratchpad = $pad->ARRAY;            # getting the scratchpad
@varnames = $scratchpad[0]->ARRAY;    # getting the variable names
@varcontents = $scratchpad[1]->ARRAY; # getting the content of the vars
for (0 .. $#varnames) {
    eval {
        my $name = $varnames[$_]->PV;
        my $content;
        # The following line matches numbers; works so far.
        $content = $varcontents[$_]->IVX if (scalar($varcontents[$_]) =~ /PVIV=/);
        # Should match strings, but gives me undef.
        $content = B::perlstring($varcontents[$_]->PV) if (scalar($varcontents[$_]) =~ /PV=/);
        print "DEBUGGER> Local variable: ", $name, " = ", $content, "\n";
    }; # there are special vars that throw an error, but I don't care about them
}
As I said in the comment, the eval is there to swallow the errors from the B::Special objects in the scratchpad.
Output:
Local variable: $test = 42
Local variable: $text = 0
The first output is okay; the second should output "TEXT" instead of 0.
What am I doing wrong?
EDIT: With a little bit of coding I got all the values of the variables, but they are not stored at the same indexes of @varnames and @varcontents. So now the question is how (in which order) the values are stored in @varcontents.
use strict;
use warnings;
use B;

sub testsub {
    my $testvar1 = 42;
    my $testvar2 = 21;
    my $testvar3 = "testval3";
    print "printtest1";
    my $testvar4 = "testval4";
    print "printtest2";
    return "returnval";
}

no warnings "uninitialized";

my $coderef = \&testsub;
my $cv = B::svref_2object ( $coderef );
my $pad = $cv->PADLIST;                  # get scratchpad object
my @scratchpad = $pad->ARRAY;
my @varnames = $scratchpad[0]->ARRAY;    # get varnames out of scratchpad
my @varcontents = $scratchpad[1]->ARRAY; # get content array out of scratchpad
my @vars; # array to store variable names and "undef" for special objects (print-values, return-values, etc.)
for (0 .. $#varnames) {
    eval { push @vars, $varnames[$_]->PV; };
    if ($@) { push @vars, "undef"; }
}
my @cont; # array to store the content of the variables and special objects
for (0 .. $#varcontents) {
    eval { push @cont, $varcontents[$_]->IV; };
    eval { push @cont, $varcontents[$_]->PV; };
}
print $vars[$_], "\t\t\t", $cont[$_], "\n" for (0 .. $#cont);
EDIT2: Added a runnable script to demonstrate the issue: variable names and variable values are not stored at the same index of the two arrays (@varnames and @varcontents).

How to create a dynamic variable in Powershell, sucha as date/time etc

Hi, I'm not exactly sure if my wording is right, but I need a variable which contains the current date/time whenever I write data to a log; how can I do that without initializing it every time? Currently, every time I need an update I use both of these statements together. Is there another way of doing this?
$DateTime = get-date | select datetime
Add-Content $LogFile -Value "$DateTime.DateTime: XXXXX"
Please let me know if you have any questions or need clarification regarding my question.
This script makes a real dynamic variable in PowerShell (thanks to Lee Holmes and his Windows PowerShell Cookbook: The Complete Guide to Scripting Microsoft's Command Shell, 3rd Edition):
##############################################################################
##
## New-DynamicVariable
##
## From Windows PowerShell Cookbook (O'Reilly)
## by Lee Holmes (http://www.leeholmes.com/guide)
##
##############################################################################
<#
.SYNOPSIS
Creates a variable that supports scripted actions for its getter and setter
.EXAMPLE
PS > .\New-DynamicVariable GLOBAL:WindowTitle `
-Getter { $host.UI.RawUI.WindowTitle } `
-Setter { $host.UI.RawUI.WindowTitle = $args[0] }
PS > $windowTitle
Administrator: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
PS > $windowTitle = "Test"
PS > $windowTitle
Test
#>
param(
## The name for the dynamic variable
[Parameter(Mandatory = $true)]
$Name,
## The scriptblock to invoke when getting the value of the variable
[Parameter(Mandatory = $true)]
[ScriptBlock] $Getter,
## The scriptblock to invoke when setting the value of the variable
[ScriptBlock] $Setter
)
Set-StrictMode -Version 3
Add-Type @"
using System;
using System.Collections.ObjectModel;
using System.Management.Automation;
namespace Lee.Holmes
{
public class DynamicVariable : PSVariable
{
public DynamicVariable(
string name,
ScriptBlock scriptGetter,
ScriptBlock scriptSetter)
: base(name, null, ScopedItemOptions.AllScope)
{
getter = scriptGetter;
setter = scriptSetter;
}
private ScriptBlock getter;
private ScriptBlock setter;
public override object Value
{
get
{
if(getter != null)
{
Collection<PSObject> results = getter.Invoke();
if(results.Count == 1)
{
return results[0];
}
else
{
PSObject[] returnResults =
new PSObject[results.Count];
results.CopyTo(returnResults, 0);
return returnResults;
}
}
else { return null; }
}
set
{
if(setter != null) { setter.Invoke(value); }
}
}
}
}
"@
## If we've already defined the variable, remove it.
if(Test-Path variable:\$name)
{
Remove-Item variable:\$name -Force
}
## Set the new variable, along with its getter and setter.
$executioncontext.SessionState.PSVariable.Set(
(New-Object Lee.Holmes.DynamicVariable $name,$getter,$setter))
There's a Set-StrictMode -Version 3, but you can set it to -Version 2 if you can load .NET Framework 4.0 in your PowerShell v2.0 session, as explained here.
The use for the OP is:
New-DynamicVariable -Name GLOBAL:now -Getter { (get-date).datetime }
Here is Lee Holmes's evaluation (which makes the real flaw clear) of the method I used in my other answer:
Note
There are innovative solutions on the Internet that use PowerShell's debugging facilities to create a breakpoint that changes a variable's value whenever you attempt to read from it. While unique, this solution causes PowerShell to think that any scripts that rely on the variable are in debugging mode. This, unfortunately, prevents PowerShell from enabling some important performance optimizations in those scripts.
Why not use:
Add-Content $LogFile -Value "$((Get-Date).DateTime): XXXXX"
This gets the current datetime every time. Notice that it's inside $( ) which makes powershell run the expression(get the datetime) before inserting it into the string.
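This is also why the original "$DateTime.DateTime: XXXXX" misbehaves: inside a double-quoted string only the bare variable $DateTime is expanded, and .DateTime remains literal text. Property access needs the $( ) subexpression:

```powershell
$d = Get-Date
"Plain:  $d.DateTime"      # expands $d, then appends the literal text ".DateTime"
"Subexp: $($d.DateTime)"   # evaluates the property inside the subexpression
```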
Wrap your two commands in a function so you will have just one call:
function add-log {
    param([string] $txt)
    $DateTime = get-date | select -expand datetime
    Add-Content $LogFile -Value "${DateTime}: $txt"
}
Besides these other ways (which frankly I would probably use instead - except the breakpoint approach), you can create a custom object with a ScriptProperty that you can provide the implementation for:
$obj = new-object pscustomobject
$obj | Add-Member ScriptProperty Now -Value { Get-Date }
$obj.now
Using PsBreakPoint:
$act = @'
$global:now = (get-date).datetime
'@
$global:sb = [scriptblock]::Create($act)
$now = Set-PSBreakpoint -Variable now -Mode Read -Action $global:sb
Reading $now then returns the current, updated datetime value.
One liner:
$now = Set-PSBreakpoint -Variable now -Mode Read -Action { $global:now = (get-date).datetime }