Creating a DLL using MASM32

I have a problem creating a DLL in MASM32. No matter what I do, the output is always an *.exe file, not a *.dll. The following code is in fact what MASM32 generates via Code -> Create New DLL. I just use Project -> Build All to build it. What am I doing wrong?
mydll.def
LIBRARY mydll
EXPORTS my_proc
; EXPORTS [your_exported_proc_name]
makeit.bat
@echo off
if exist mydll.obj del mydll.obj
if exist mydll.dll del mydll.dll
\masm32\bin\ml /c /coff mydll.asm
\masm32\bin\Link /SUBSYSTEM:WINDOWS /DLL /DEF:mydll.def mydll.obj
del mydll.obj
del mydll.exp
dir mydll.*
pause
mydll.asm
; ¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤
include \masm32\include\masm32rt.inc
include \masm32\include\windows.inc
; ¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤¤
; -------------------------------------------
; Build this DLL with the provided MAKEIT.BAT
; -------------------------------------------
.data?
hInstance dd ?
.code
; «««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««
LibMain proc instance:DWORD,reason:DWORD,unused:DWORD
    .if reason == DLL_PROCESS_ATTACH
        mrm hInstance, instance     ; copy local to global
        mov eax, TRUE               ; return TRUE so DLL will start
    .elseif reason == DLL_PROCESS_DETACH
    .elseif reason == DLL_THREAD_ATTACH
    .elseif reason == DLL_THREAD_DETACH
    .endif
    ret
LibMain endp

My_proc proc
    ret
My_proc endp
; «««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««
comment * -----------------------------------------------------
You should add the procedures your DLL requires AFTER
the LibMain procedure. For each procedure that you
wish to EXPORT you must place its name in the "mydll.def"
file so that the linker will know which procedures to
put in the EXPORT table in the DLL. Use the following
syntax AFTER the LIBRARY name on the 1st line.
LIBRARY mydll
EXPORTS YourProcName
EXPORTS AnotherProcName
------------------------------------------------------- *
; «««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««««
end LibMain

If you have a license for Visual Studio, you can use Visual Studio for your MASM32 programming; the details of how to set up Visual Studio can be found here. After following the steps provided, go to Project Properties and, under Configuration Properties, change Configuration Type to Dynamic Library.
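As a quick sanity check (this is not from the original answers, just a minimal C sketch that assumes the mydll.dll name and the my_proc export from the template above), you can load the freshly built file dynamically; if LoadLibrary succeeds and GetProcAddress finds my_proc, the linker really did produce a DLL with the expected export table:
#include <windows.h>
#include <stdio.h>

/* Hypothetical verification program: assumes mydll.dll exports my_proc,
   a stdcall procedure with no parameters (as in the template above). */
typedef void (__stdcall *MyProcFn)(void);

int main(void)
{
    HMODULE dll = LoadLibraryA("mydll.dll");
    if (!dll) {
        printf("mydll.dll could not be loaded (error %lu)\n", GetLastError());
        return 1;
    }
    MyProcFn my_proc = (MyProcFn)GetProcAddress(dll, "my_proc");
    if (!my_proc)
        printf("export my_proc not found - check the EXPORTS line in mydll.def\n");
    else
        my_proc();                  /* call the exported procedure */
    FreeLibrary(dll);
    return 0;
}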

Related

Project user.lua require statement only for project-specific modules

I keep my user.lua specific to the project folder. Is there anything in place that lets me exclude ZeroBrane's own environment paths when I check a module require statement with "Evaluate in Console"?
The reason for this is that I want to ensure everything works within the plugin engine itself.
This is what a check for a missing module produces.
lualibs and bin are ZeroBrane-specific, if I see it right.
Output
local toast = require("toast")
[string " local toast = require("toast")"]:1: module 'toast' not found:
no field package.preload['toast']
no file 'lualibs/toast.lua'
no file 'lualibs/toast/toast.lua'
no file 'lualibs/toast/init.lua'
no file './toast.lua'
no file '/usr/local/share/luajit-2.0.4/toast.lua'
no file '/usr/local/share/lua/5.1/toast.lua'
no file '/usr/local/share/lua/5.1/toast/init.lua'
no file '/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua/Internals/toast.lua'
no file '/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua/Internals/toast/init.lua'
no file '/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua/Modules/toast.lua'
no file '/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua/Modules/toast/init.lua'
no file 'bin/clibs/libtoast.dylib'
no file 'bin/clibs/toast.dylib'
no file './toast.so'
no file '/usr/local/lib/lua/5.1/toast.so'
no file '/usr/local/lib/lua/5.1/loadall.so'
no file '/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua/Internals/libtoast_64.so'
Here is my user.lua file at this time
--[[--
Use this file to specify **User** preferences.
Review [examples](+/Applications/ZeroBraneStudio.app/Contents/ZeroBraneStudio/cfg/user-sample.lua) or check [online documentation](http://studio.zerobrane.com/documentation.html) for details.
--]]--
--https://studio.zerobrane.com/doc-general-preferences#debugger
-- to automatically open files requested during debugging
editor.autoactivate = true
--enable verbose output
--debugger.verbose=true
--[[--
specify how print results should be redirected in the application being debugged (v0.39+). Use 'c' for ‘copying’ (appears in the application output and the Output panel), 'r' for ‘redirecting’ (only appears in the Output panel), or 'd' for ‘default’ (only appears in the application output). This is mostly useful for remote debugging to specify how the output should be redirected.
--]]--
debugger.redirect="c"
-- to force execution to continue immediately after starting debugging;
-- set to `false` to disable (the interpreter will stop on the first line or
-- when debugging starts); some interpreters may use `true` or `false`
-- by default, but can be still reconfigured with this setting.
debugger.runonstart = true
-- FlyWithLua.ini version 2.7.6 build 2018-10-24
-- Where to search for modules.
-- use this to find your project folder: select the print statement below, right mouse button --> Evaluate in Console
--print(ide.filetree.projdir)
ZBSProjDir = "/Volumes/SSD2go PKT/X-Plane 11 stable/Resources/plugins/FlyWithLua"
INTERNALS_DIRECTORY = ZBSProjDir .. "/Internals/"
MODULES_DIRECTORY = ZBSProjDir .. "/Modules/"
package.path = table.concat({
package.path,
INTERNALS_DIRECTORY .. "?.lua",
INTERNALS_DIRECTORY .. "?/init.lua",
MODULES_DIRECTORY .. "?.lua",
MODULES_DIRECTORY .. "?/init.lua",
}, ";")
package.cpath = table.concat({
package.cpath,
INTERNALS_DIRECTORY .. "?.ext",
MODULES_DIRECTORY .. "?.ext",
}, ";")
-- Produce a correct name pattern for binary modules for OS and architecture.
-- This resolves clash between OS X and Linux binary modules by requiring "lib"
-- prefix for Linux ones.
local library_pattern = "?_64."
if SYSTEM == "IBM" then
library_pattern = library_pattern .. "dll"
elseif SYSTEM == "APL" then
library_pattern = library_pattern .. "so"
else
library_pattern = "lib" .. library_pattern .. "so"
end
package.cpath = package.cpath:gsub("?.ext", library_pattern)
Version --> ZeroBrane Studio (1.90; MobDebug 0.706)
Greetings Lars
You should get the desired effect if the toast module file is located in your project directory. When a command is executed in the Console, the current directory is set to the project directory, so even though the lualibs folder from the IDE may be in the path, it should make no difference (unless you copied the module to lualibs).
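If in doubt, a quick check from the Console shows which paths are searched and whether the module resolves; this is just an illustrative snippet using the toast name from the question:
-- print the Lua and C search paths that are in effect, then try the require
print(package.path)
print(package.cpath)
-- pcall keeps the Console alive even if the module is missing
local ok, result = pcall(require, "toast")
print(ok, result)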

Can I get a correct DLL module list from tasklist /m when checking a 32-bit program on a 64-bit machine?

The tasklist command on Windows has a really useful feature: it can list all DLL modules for a process, or for all processes. The command below will list all DLL files used by explorer.exe:
tasklist /fi "ImageName eq explorer.exe" /m
Looks like this (shortened, translated to English):
Process name PID Modules
========================= ======== ============================================
explorer.exe 1104 ntdll.dll, kernel32.dll, KERNELBASE.dll,
ADVAPI32.dll, msvcrt.dll, sechost.dll,
RPCRT4.dll, GDI32.dll, USER32.dll, LPK.dll,
USP10.dll, SHLWAPI.dll, SHELL32.dll,
ole32.dll, OLEAUT32.dll, EXPLORERFRAME.dll,
DUser.dll, DUI70.dll, IMM32.dll, MSCTF.dll,
The problem is that this doesn't work so well for 32-bit processes on a 64-bit machine:
C:\>tasklist /fi "ImageName eq firefox.exe" /m
Process name PID Modules
========================= ======== ============================================
firefox.exe 4980 ntdll.dll, wow64.dll, wow64win.dll,
wow64cpu.dll
What you see is incomplete: only the WOW64 loader modules are listed, while the process actually has many more DLLs loaded.
My question then is: can I start tasklist as a 32-bit program, or otherwise ensure it will return correct values? I need to call tasklist from another program (Java) and get the list of loaded DLL files. I need this to ensure I do not attempt to load a DLL twice.
The EnumProcessModulesEx() function can be used to enumerate 32-bit and/or 64-bit modules in the target process.
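The answer above only names the API, so here is a minimal C sketch of how it might be used (it is not from the original post). Built as a 64-bit executable it can enumerate both the 32-bit and the 64-bit modules of a target process; depending on the SDK you may need to link against psapi.lib:
#include <windows.h>
#include <psapi.h>
#include <stdio.h>
#include <stdlib.h>

/* Sketch: list all modules (32-bit and 64-bit) of the process whose PID is
   given on the command line. Build as a 64-bit executable so that WOW64
   (32-bit) targets can be inspected as well. */
int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s <pid>\n", argv[0]); return 1; }

    DWORD pid = (DWORD)strtoul(argv[1], NULL, 10);
    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (!proc) { fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError()); return 1; }

    HMODULE mods[1024];
    DWORD needed = 0;
    if (EnumProcessModulesEx(proc, mods, sizeof(mods), &needed, LIST_MODULES_ALL)) {
        DWORD count = needed / sizeof(HMODULE);
        for (DWORD i = 0; i < count; i++) {
            char path[MAX_PATH];
            if (GetModuleFileNameExA(proc, mods[i], path, MAX_PATH))
                printf("%s\n", path);
        }
    } else {
        fprintf(stderr, "EnumProcessModulesEx failed: %lu\n", GetLastError());
    }

    CloseHandle(proc);
    return 0;
}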
You can use ListDLLs from Windows Sysinternals
Example output:
F:\>c:\apps\NirSoft\SysinternalsSuite\listdlls firefox
ListDLLs v3.1 - List loaded DLLs
Copyright (C) 1997-2011 Mark Russinovich
Sysinternals - www.sysinternals.com
--------------------------------------------------------------------
firefox.exe pid: 2000
Command line: C:\apps\Firefox\firefox.exe
Base Size Path
0x00000000012c0000 0x5f000 C:\apps\Firefox\firefox.exe
0x0000000076d80000 0x1a9000 C:\Windows\SYSTEM32\ntdll.dll
0x00000000748a0000 0x3f000 C:\Windows\SYSTEM32\wow64.dll
0x0000000074840000 0x5c000 C:\Windows\SYSTEM32\wow64win.dll
0x0000000074830000 0x8000 C:\Windows\SYSTEM32\wow64cpu.dll
0x00000000012c0000 0x5f000 C:\apps\Firefox\firefox.exe
0x0000000076f60000 0x180000 C:\Windows\SysWOW64\ntdll.dll
0x000000006d100000 0x47000 C:\apps\Avast\snxhk.dll
0x0000000074960000 0x110000 C:\Windows\syswow64\KERNEL32.dll
0x00000000766d0000 0x47000 C:\Windows\syswow64\KERNELBASE.dll
0x00000000741d0000 0x62000 C:\Windows\SysWOW64\guard32.dll
0x0000000076880000 0x100000 C:\Windows\syswow64\USER32.dll
0x00000000753d0000 0x90000 C:\Windows\syswow64\GDI32.dll
0x0000000075110000 0xa000 C:\Windows\syswow64\LPK.dll
0x0000000074b00000 0x9d000 C:\Windows\syswow64\USP10.dll
0x00000000757c0000 0xac000 C:\Windows\syswow64\msvcrt.dll
0x0000000075720000 0xa0000 C:\Windows\syswow64\ADVAPI32.dll
0x00000000753b0000 0x19000 C:\Windows\SysWOW64\sechost.dll
0x0000000075490000 0xf0000 C:\Windows\syswow64\RPCRT4.dll
0x0000000074900000 0x60000 C:\Windows\syswow64\SspiCli.dll
0x00000000748f0000 0xc000 C:\Windows\syswow64\CRYPTBASE.dll
0x00000000746a0000 0x9000 C:\Windows\SysWOW64\VERSION.dll
0x0000000074c50000 0x5000 C:\Windows\syswow64\PSAPI.DLL
0x0000000074bb0000 0x60000 C:\Windows\SysWOW64\IMM32.DLL
0x0000000074d50000 0xcc000 C:\Windows\syswow64\MSCTF.dll
0x0000000074600000 0x7000 C:\Windows\SysWOW64\fltlib.dll
0x000000006d0c0000 0x1b000 C:\apps\Firefox\mozglue.dll
0x000000006d040000 0x71000 C:\apps\Firefox\MSVCP120.dll
0x0000000067c20000 0xee000 C:\apps\Firefox\MSVCR120.dll
0x0000000060700000 0x1a0000 C:\apps\Firefox\nss3.dll
...
Source ListDLLs v3.1
ListDLLs is a utility that reports the DLLs loaded into processes. You can use it to list all DLLs loaded into all processes, into a specific process, or to list the processes that have a particular DLL loaded. ListDLLs can also display full version information for DLLs, including their digital signature, and can be used to scan processes for unsigned DLLs.
Disclaimer: I am not affiliated with Windows Sysinternals in any way, I am just an end user of the software.
You can use PowerShell:
Get-Process winword | select -ExpandProperty Modules | ft -AutoSize
Get-Process winword (32-bit) on a 64-bit OS
Extracted from: https://www.sysadmit.com/2019/07/windows-saber-dll-utiliza-programa.html

SAS: Using the Put statement to create dynamic code

I'd like to create dynamic code using the PUT statement. According to this document from SUGI 29 (http://www2.sas.com/proceedings/sugi29/175-29.pdf),
put
"data XXXXX; "
/ 'infile "&datadir/&compid&filetype" missover ls=' tbla_fle
';' / 'input'
;
is equivalent to running
data onecomp ;
infile
"&datadir/&compid&filetype"
missover ls = 268 ;
input
However, when I try something similar to their example, the code enclosed in the PUT statement doesn't run and is instead written to the SAS Output Log:
data _NULL_;
put // "data put_test;" / "b=2;" / "run;";
run;
In Output Log:
data put_test;
b=2;
run;
I've checked the SAS documentation, and it seems that PUT is only used to "Write lines to the SAS log, to the SAS output window, or to an external location that is specified in the most recent FILE statement." Nowhere does it say that it can be used to create dynamically generated code.
I know that I must be missing something, but I'm not sure what. I'm using SAS Enterprise Guide 4.1.
Thank you!
The idea is to use put to write your generated code to a file. You then %include the file into your SAS session to run it. What you're missing is a file statement and the %include directive.
data _null_;
file 'temp.sas'; /* redirects put to a file instead of the SAS log */
put
"data XXXXX; "
/ 'infile "&datadir/&compid&filetype" missover ls=' tbla_fle
';' / 'input'
;
run;
%include 'temp.sas';

MS SQL Server use old log file location after detach/copy/attach

I create a database "Test" in the folder "d:\test". The database files are "d:\test\Test.mdf" and "d:\Test\Test_log.ldf". I detach the database from MS SQL Server 2008 R2, copy all files to a new folder ("d:\test_new"), delete the log file ("d:\test_new\Test_log.ldf"), and try to attach the database again from the new location. When I use SQL Server Management Studio and choose the "d:\test_new\Test.mdf" file, it determines that the log file is located at "d:\test\Test_log.ldf" (the old location). How can I attach this database and rebuild the log in the new location? Imagine that I cannot copy the ldf file to the new location again, and that it is still available at the old path, so SQL Server finds it anyway. I want to tell SQL Server: "please forget that log file, and create a new log file here". A T-SQL script would be best, but if it is steps in Management Studio I will convert them to a script myself.
What I have tried already:
1.
CREATE DATABASE [test]
ON ( FILENAME = N'D:\test_new\test.mdf' )
FOR ATTACH_REBUILD_LOG
attaches the log file from the old location (FOR ATTACH behaves the same)
2.
CREATE DATABASE [test]
ON ( FILENAME = N'D:\test_new\test.mdf' )
LOG ON ( FILENAME = N'D:\test_new\test_log.ldf' )
FOR ATTACH_REBUILD_LOG
returns an error: Unable to open the physical file "D:\test_new\test_log.ldf". Operating system error 2: "2(File not found.)".
3.
sp_attach_db and sp_attach_single_file_db
were tried too. I even checked their source code: they just build dynamic SQL and call the CREATE DATABASE ... FOR ATTACH statement.
The question is slowly changing to: "Is it possible?"
UPDATE
Well, it looks like it's not possible with current versions of SQL Server. If anybody knows a way to do it - please, I will be very pleased to know it too!
Edit2: To my knowledge, it is not possible for SQL Server to recreate a log file. It can shrink the ldf, but not create it when only the mdf exists.
When you copy your files from d:\test\ to d:\test_new\, do not delete the d:\test_new\Test_log.ldf.
Leave the log file there, because you cannot reattach the new DB without that log file. Afterwards, you can shrink that log to a minimum size.
So, to synthesize:
1. Copy your files from d:\test\ to d:\test_new\ and leave the log file there.
2. Run the create database script that you posted in your question (point 2).
3. Run the following script to shrink the log to a minimum size:
USE test
GO
DBCC SHRINKFILE(logicalFileName, 1)
GO
To find out what logicalFileName is, run sp_helpfile, that will give you the logical file name for your log file:
USE test
GO
EXEC sp_helpfile
GO
more info here
Edit:
I think you first need to detach the test database from the old location:
(You might create a script that does it all, from the following commands)
C:\> osql -E
1> sp_detach_db 'test'
2> go
3> quit
C:\>
Then copy the files to the new location.
C:\> copy d:\test\* d:\test_new\*
Next, attach the test DB to the new path location:
C:\> osql -E
1> sp_attach_db @dbname = N'test', @filename1 = N'd:\test_new\Test.mdf', @filename2 = N'd:\test_new\Test_log.ldf'
2> go
3> quit
C:\>
to test if the new database was successfully attached:
C:\> osql -E
1> use test
2> go
3> quit
C:\>
If there are no errors after the go command, then all is ok
Hope this helps
Microsoft article on how to move files
The users must copy BOTH the .mdf and .ldf files. They then have to use the following command (one or the other).
sp_attach_db (deprecated; use CREATE DATABASE ... FOR ATTACH in the future)
EXEC sp_attach_db @dbname = 'dbname', @filename1 = 'd:\test_new\test.mdf', @filename2 = 'd:\test_new\test.ldf'
This will result in the database using the data file (mdf) and transaction log (ldf) from the \test_new directory.
CREATE DATABASE FOR ATTACH
CREATE DATABASE dbname ON (FILENAME='d:\test_new\test.mdf'), (FILENAME='d:\test_new\test.ldf') FOR ATTACH

Version configuration from SQL

How would you create an installation setup that runs against multiple schemas taking into consideration the latest version of the database updates? Ideally: update a single file with a new version number, then send the DBAs an archive containing everything needed to perform the database update.
Here is the directory structure:
| install.sql
| install.bat
|
\---DATABASE_1.3.4.0
| README.txt
|
\---SCHEMA_01
| install.sql
| SCM1_VIEW_NAME_01_VW.vw
| SCM1_VIEW_NAME_02_VW.vw
| SCM1_PACKAGE_01_PKG.pkb
| SCM1_PACKAGE_01_PKG.pks
|
\---SCHEMA_02
install.sql
SCM2_VIEW_NAME_01_VW.vw
SCM2_VIEW_NAME_02_VW.vw
SCM2_PACKAGE_01_PKG.pkb
SCM2_PACKAGE_01_PKG.pks
The following code (sanitized and trimmed for brevity and security) is in install.sql:
ACCEPT tns
ACCEPT schemaUsername
ACCEPT schemaPassword
CONNECT &&schemaUsername/&&schemaPassword@&&tns
@@install.sql
/
The following code is in install.bat:
@echo off
sqlplus /nolog @install.sql
pause
There are several schemas, not all of which need updates each time. Those that do not need updates will not have directories created.
What I would like to do is create two files:
version.txt
schemas.txt
These two (hand-crafted) files would be used by install.sql to determine which version of scripts to run.
For example:
version.txt
1.3.4.0
schemas.txt
SCHEMA_01
SCHEMA_02
What I would really like to know is: how would you read those text files from install.sql to run the corresponding install scripts? (Without PL/SQL; other Oracle-specific conventions are acceptable.)
All ideas welcome; many thanks in advance.
Here is a solution.
install.bat
@echo off
REM *************************************************************************
REM
REM This script performs a database upgrade for the application suite.
REM
REM *************************************************************************
setLocal EnableDelayedExpansion
REM *************************************************************************
REM
REM Read the version from the file.
REM
REM *************************************************************************
set /p VERSION=<version.txt
set DB=DB_%VERSION%
set SCHEMAS=%DB%\schema-order.txt
REM *************************************************************************
REM
REM Each line in the schema-order.txt file contains the name of a schema.
REM Blank lines are ignored.
REM
REM *************************************************************************
for /f "tokens=* delims= " %%a in (%SCHEMAS%) do (
if not "%%a" == "" sqlplus /nolog #install.sql %VERSION% %%a
)
Primary install.sql
ACCEPT schemaUsername CHAR DEFAULT &2 PROMPT 'Schema Owner [&2]: '
ACCEPT schemaPassword CHAR PROMPT 'Password: ' HIDE
PROMPT Verifying Database Connection
CONNECT &&schemaUsername/&&schemaPassword@&&tns
DEFINE INSTALL_PATH = DB_&1&&ds^&2&&ds
@@&&INSTALL_PATH^install.sql
This uses a batch file to parse the files, then passes the parameters to the SQL script on the command-line.
Secondary install.sql
Each line in the file executed by the first installation script can then use the INSTALL_PATH variable to reference a file containing the actual SQL to run. This secondary script is responsible for running the individual SQL files that actually effect a change in the database.
@@&&INSTALL_PATH^DIR&&ds^SCM1_VIEW_OBJECT_VW.vw
This solution could be modified to automatically run all files in a specific order through clever use of sorting and naming of directories (i.e., the SQL files listed in a table directory run before the SQL files in a view directory).
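As an illustration of that idea (not part of the original answer, and every directory and file name below is an assumption), install.bat could generate a master script that lists the *.sql files in sorted sub-directory order, so that a 01_tables directory runs before a 02_views directory:
@echo off
REM Hypothetical: build run_all.sql with one @@ line per SQL file, ordered by
REM directory name and then by file name.
setlocal
set SCHEMA_DIR=DB_1.3.4.0\SCHEMA_01
set MASTER=run_all.sql
if exist %MASTER% del %MASTER%
for /f "delims=" %%d in ('dir /b /ad /on "%SCHEMA_DIR%"') do (
    for /f "delims=" %%f in ('dir /b /on "%SCHEMA_DIR%\%%d\*.sql" 2^>nul') do (
        echo @@%SCHEMA_DIR%\%%d\%%f>> %MASTER%
    )
)
REM run_all.sql can then be run from the primary install.sql with @@run_all.sql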