Lua: ProteaAudio API confusion -- How to use it?

Hello everyone.
Sorry for the noob question; I'm just a non-programmer trying to learn to program with Lua.
I'm drawn to Lua because it's very simple, both in size and in syntax.
So I decided to experiment further with this Brazilian-born language, for example by playing with sound, as I did in Python and Ruby.
I found ProteaAudio and tried to run the sample scripts that come with the package I downloaded from here.
The package comes with two sample scripts:
the first, example.lua, plays an ogg sample file (also included in the package),
and the second, scale.lua, plays sound generated by a function.
The first script runs just fine on both my Windows 7 and Ubuntu 12.04 x86 machines.
But the second script only runs on Windows; when I try it on Ubuntu I get an error with this message:
../lua52: scale.lua:13: bad argument #1 to 'soundLoop' (number expected, got nil)
stack traceback:
[C]: in function 'soundLoop'
scale.lua:13: in function 'playNote'
scale.lua:29: in main chunk
[C]: in ?
The full original source code of scale.lua is:
-- function creating a sine wave sample:
function sampleSine(freq, duration, sampleRate)
  local data = { }
  for i = 1, duration*sampleRate do
    data[i] = math.sin( (i*freq/sampleRate)*math.pi*2 )
  end
  return proAudio.sampleFromMemory(data, sampleRate)
end

-- plays a sample shifted by a number of halftones for a definable period of time
function playNote(sample, pitch, duration, volumeL, volumeR, disparity)
  local scale = 2^(pitch/12)
  local sound = proAudio.soundLoop(sample, volumeL, volumeR, disparity, scale)
  proAudio.sleep(duration)
  proAudio.soundStop(sound)
end

-- create an audio device using default parameters and exit in case of errors
require("proAudioRt")
if not proAudio.create() then os.exit(1) end

-- generate a sample:
local sample = sampleSine(440, 0.5, 88200)

-- play scale (a major):
local duration = 0.5
for i,note in ipairs({ 0, 2, 4, 5, 7, 9, 11, 12 }) do
  playNote(sample, note, duration)
end

-- cleanup
proAudio.destroy()
Since I'm confused by this ProteaAudio Lua API, I really can't work out why this error occurs.
Please help.

This is actually just a guess, but...
To play a "major" scale upwards (8 notes, jumping: full full half, full full full half) the original code does:
local duration = 0.5
for i,note in ipairs({ 0, 2, 4, 5, 7, 9, 11, 12 }) do
  playNote(sample, note, duration)
end
where sample is a handle to a pre-generated sample created by proAudio.sampleFromMemory and returned by the function sampleSine, which passed it a computed table representing a 440 Hz sine wave (the concert-pitch frequency of the note 'A4', the first A above middle 'C').
An 'A major scale' is then played by changing (increasing) the pitch (frequency) of that sample in 8 steps (notes). That pitch calculation is done by the function playNote.
The function playNote accepts the following arguments:
sample, pitch, duration, volumeL, volumeR, disparity,
but it is currently called without the arguments
volumeL, volumeR, disparity (which will therefore be nil).
So when playNote tries to call:
proAudio.soundLoop(sample, volumeL, volumeR, disparity, scale),
the call effectively becomes:
proAudio.soundLoop(sample, nil, nil, nil, scale),
where sample is passed on and scale is the playback pitch of that sample, as just calculated (according to the specified note) by playNote.
Your error message states: bad argument #1 to 'soundLoop' (number expected, got nil).
Hmm, that seems consistent with what is happening (assuming that 'bad argument #1' here refers to the second argument, in this case volumeL).
So you might want to try specifying some values for volumeL, volumeR and disparity, like:
local duration = 0.5
local volumeL = 1.0
local volumeR = 1.0
local disparity = 0.0
for i,note in ipairs({ 0, 2, 4, 5, 7, 9, 11, 12 }) do
  playNote(sample, note, duration, volumeL, volumeR, disparity)
end
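Alternatively (still just a sketch, on the same assumption that the nil optionals are the problem), you could give playNote defaults for its optional parameters, so the call sites stay unchanged:
-- plays a sample shifted by a number of halftones for a definable period of time;
-- volumeL, volumeR and disparity fall back to defaults when omitted
function playNote(sample, pitch, duration, volumeL, volumeR, disparity)
  volumeL   = volumeL   or 1.0   -- full volume, left channel
  volumeR   = volumeR   or 1.0   -- full volume, right channel
  disparity = disparity or 0.0   -- no delay between channels
  local scale = 2^(pitch/12)
  local sound = proAudio.soundLoop(sample, volumeL, volumeR, disparity, scale)
  proAudio.sleep(duration)
  proAudio.soundStop(sound)
end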
From the proteaAudio documentation one can read about soundLoop's arguments:
sample - A sample handle returned by a previous load() call
volumeL - (optional) Left volume
volumeR - (optional) Right volume
disparity - (optional) Time difference between left and right channel in seconds. Use negative values to specify a delay for the left channel, positive for the right.
pitch - (optional) Pitch factor for playback. 0.5 corresponds to one octave below, 2.0 to one above the original sample.
If that does the trick, then those arguments might not be so optional on Ubuntu after all.
Hope this helps!

Related

AviSynth - turn sound off

I am using AviSynth+ and I play an .avs script in VLC (I've installed the AviSynth plugin for VLC).
My script is very basic and it looks like this:
DirectShowSource("D:\MyVideo.asf", fps=25, convertfps=true)
How can I turn off the sound of the video, but only for its first two minutes?
I am using Windows 8 - 64 bit
I know that this is a very late answer, but an easy way to do it would be to split the clip into two segments, silence the first segment with Amplify, and then rejoin them:
source = DirectShowSource("D:\MyVideo.asf", fps=25, convertfps=true)
numSeconds = 60 * 2
numFrames = Round(source.FrameRate() * numSeconds)  # frames in the first two minutes
beginning = source.Trim(0, numFrames - 1, False)    # first two minutes
rest = source.Trim(numFrames, 0, False)             # everything after that
beginning.Amplify(0.0) + rest                       # mute the beginning, then splice

Write bytes to a PLC device

I'm working on a connection between a PLC device and my company's PC. The PLC is the well-known Siemens S7-200 and I'm using VB.NET. Probably I should use another language, but VB.NET is the one I'm most comfortable with. To do so, I'm using the PPI protocol through COM1 and the LibNoDave library to establish the connection.
The program I'm testing needs Input 0.0 to be on, so I attached a switch to make that happen. I also made a VB console to read (and write) the state of the inputs and outputs (like the physical LED indicators on the device) and also the state of the bit memories:
The console reader (the LEITOR section - sorry) works as I intended, and all the Q's, I's and M's light up correctly when they should.
The problem is that, to run the PLC program, I also have to light up Q 1.1.
The Ladder Network that describes this has the following logical map:
I know I have to use the code:
Public FDS As libnodave.daveOSserialType 'Serial type
Public DI As libnodave.daveInterface 'Interface
Public DC As libnodave.daveConnection 'Connection
Public lPPI As Integer = 0 'Local
Public pPPI As Integer = 2 'PLC
Public RES As Integer = 0 'Response
Public REP As Integer = 0 'Response
Public buf(100) As Byte

Sub Code()
    FDS.rfd = libnodave.setPort("COM1", "9600", AscW("E"))
    DI = New libnodave.daveInterface(FDS, "IF1", lPPI, libnodave.daveProtoPPI, libnodave.daveSpeed93k)
    DI.setTimeout(1000000)
    DC = New libnodave.daveConnection(DI, pPPI, 0, 0)
    RES = DC.connectPLC

    'Write on PLC:
    RES = DC.writeBytes(...
End Sub
The code works fine, with no errors and an established connection (up until that last RES).
Here's the problem:
I can light up output Q 1.1 (on the device and on the console) by doing the following:
RES = DC.writeBytes(libnodave.daveDB, 1, 1500, 16, buf)
where
buf = BitConverter.GetBytes(libnodave.daveSwapIed_16(30))
and then repeating these two steps five more times (once more with 30, twice with 50 and, finally, another two times with 50).
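In code, that repetition looks roughly like the sketch below (the values 30/50, DB 1, offset 1500 and length 16 are simply the ones from my experiments above):
' Sketch: the six write cycles described above, one writeBytes call per value
Dim values() As Integer = {30, 30, 50, 50, 50, 50}
For Each v As Integer In values
    buf = BitConverter.GetBytes(libnodave.daveSwapIed_16(v))
    RES = DC.writeBytes(libnodave.daveDB, 1, 1500, 16, buf)
Next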
I'm pretty sure I'm doing something wrong, but there isn't much documentation of these commands available online for a guy like me (who's just getting started).
Can anyone explain what's going on? And also, how can I light Q 1.1 in just one step?

R2WinBugs data entry incompatible copy error for Conditional Binomial likelihood, probit link, Random Effects (Psoriasis example)

I was working through the guide for calculating the effect size of different treatments using a network meta-analysis, as done in example 6.a.
Here
It works fine in WinBUGS, but I want to do the analysis in R using R2WinBUGS so that I can automate the data input.
It reads the model just fine according to the log, but it gets hung up when reading the data.
E.g.:
display(log)
check(C:/Users/Temp User/.../Plaque_Psoriasis_Project/RE_Psoriasis.bug.txt)
model is syntactically correct
data(C:/Users/Temp User/.../Plaque_Psoriasis_Project/data.txt)
This is where the program just hangs.
The trap screen reads
incompatible copy
BugsCmds.TextError [000003A1H]
.beg INTEGER 1699018032
.end INTEGER 34825508
.......
I've tried reading the data as:
data <- list(t=t,C=C,r=r,n=n,na=na,nc=nc, ns=ns, nt=nt, Cmax=Cmax, meanA=meanA, precA=precA)
and as
data <- list("t","C","r","n","na","nc", "ns", "nt", "Cmax", "meanA", "precA")
Neither works.
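For context, the call I'm making looks roughly like this (a sketch only: the parameter names, iteration counts and paths are placeholders for my actual ones, using the usual R2WinBUGS bugs() arguments):
library(R2WinBUGS)

# 'data' is the named list shown above; everything else is a placeholder
fit <- bugs(data = data,
            inits = inits,
            parameters.to.save = c("d", "T"),
            model.file = "RE_Psoriasis.bug.txt",
            n.chains = 3, n.iter = 20000,
            bugs.directory = "C:/Program Files/WinBUGS14/",
            debug = TRUE)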
I got R2WinBUGS to run the toy schools example from the documentation, so the setup itself works.
Any thoughts?

Dymola Results of checkModel()

checkModel([Some Model]) opens the "Dymola Messages" GUI on the "Translation" tab and displays errors, warnings, and messages.
Does anyone know how to write this information to a log file, or how to get it as some kind of return value of checkModel()? All I've found in the documentation is that checkModel() only returns a success boolean. Is this information saved temporarily somewhere?
Note that I only want to apply checkModel(), not actually translate the code.
I finally found a solution at least for Dymola 2016 and newer, so if someone is interested - here it is (it is not very user-friendly, but it works):
The key command is getLastError(), which returns not only the last error (as one might think), but all errors detected by checkModel(), plus the overall statistics.
All of this information is collected in one string, whose last lines look like:
"[...]
Local classes checked, checking <[Some Path]>
ERROR: 2 errors were found
WARNING: 13 warnings were issued
= false
"
The following operations return the actual number of errors (for warnings it is more or less the same):
b = checkModel([Some Model])
s = getLastError()
ind1 = Modelica.Utilities.Strings.findLast(s, "ERROR:")
ind2 = Modelica.Utilities.Strings.findLast(s, " errors were found")
nErrors = Modelica.Utilities.Strings.substring(s, ind1+6, ind2) // 6 = length of "ERROR:"
nErrors = Modelica.Utilities.Strings.replace(nErrors, " ", "")
nErrors
= "2"
Note:
I used findLast because I know the lines of interest are at the very end of the string, so this is significantly faster than using find.
This only works if the line "ERROR: ..." actually exists; otherwise the substring call will throw an error (see the guarded sketch after these notes).
Of course this could be done in fewer lines, but maybe this version is easier to read.
NOTE: This only works with Dymola 2016 and newer. The string returned by getLastError has a different structure in Dymola 2015 and older.
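A guarded variant as a sketch (assuming the same string layout as above; Modelica.Utilities.Strings.findLast returns 0 when the text is not found):
function countCheckErrors "Extract the error count from a getLastError() string; '0' if no ERROR line is present"
  input String s "String returned by getLastError() after checkModel()";
  output String nErrors;
protected
  Integer ind1;
  Integer ind2;
algorithm
  ind1 := Modelica.Utilities.Strings.findLast(s, "ERROR:");
  if ind1 > 0 then
    ind2 := Modelica.Utilities.Strings.findLast(s, " errors were found");
    nErrors := Modelica.Utilities.Strings.substring(s, ind1 + 6, ind2);
    nErrors := Modelica.Utilities.Strings.replace(nErrors, " ", "");
  else
    nErrors := "0";
  end if;
end countCheckErrors;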
Another option - the following should handle it:
clearlog(); // To start fresh
Advanced.TranslationInCommandLog=true;
checkModel(...);
savelog(...);
This is mentioned in the Dymola User Manual Volume 1, section "Parameter studies by running Dymola a number of times in “batch mode”", on page 630 or so.

Error returned by Nuance DragonMobile text-to-speech when maximum number of transactions is reached

I'm about to release my iOS app that uses the Nuance Dragon Mobile SDK. I'm signed up for the "Silver" plan, which allows me 20 transactions per day.
My question is: does anyone know what error Nuance returns when the limit is exceeded? I'm concerned because I am filtering out:
error.code == 5 // because this fires whenever I interrupt running speech
error.code == 1 // because after interrupting speech, the first restart cuts off before it finishes,
                // so I automatically start again so as not to trouble the user
I figure if Nuance returns an error different from these, I'll allow it to pass through, and be able to alert the user that they've reached their daily limit.
I think the following gives the possible errors:
extern NSString * const SKSpeechErrorDomain;

enum {
    SKServerConnectionError = 1,
    SKServerRetryError = 2,
    SKRecognizerError = 3,
    SKVocalizerError = 4,
    SKCancelledError = 5,
};
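For reference, the filtering I described above amounts to something like this (just a sketch; the surrounding delegate callback is omitted, only the error codes matter):
// Sketch: swallow the two codes I already handle, surface everything else
if (error.code == SKCancelledError || error.code == SKServerConnectionError) {
    // cancelled (5): fires when I interrupt running speech
    // server connection (1): the false-start case where I silently restart
} else {
    // anything else (possibly the daily-limit error?) gets shown to the user
}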
It seems likely to me that it's SKServerConnectionError that would be fired. In that case, I need to come up with a different strategy. If I could figure out what's going on with the restart issue, I wouldn't have to filter out that error. Plus, when I automatically restart after these false starts, I'm probably racking up my transaction count, which is unfortunate.
Does anybody have experience with this aspect of the Nuance SDK for iOS?