I am just beginning to learn audio programming using SuperCollider.
When I play a sound, I can hear it on the speakers but not on the headphones.
I get the following message on starting the server:
booting 57110
localhost
JackDriver: client name is 'SuperCollider'
SC_AudioDriver: sample rate = 48000.000000, driver's block size = 1024
JackDriver: connected system:capture_1 to SuperCollider:in_1
JackDriver: connected system:capture_2 to SuperCollider:in_2
JackDriver: connected SuperCollider:out_1 to system:playback_1
JackDriver: connected SuperCollider:out_2 to system:playback_2
SuperCollider 3 server ready.
JackDriver: max output latency 42.7 ms
Receiving notification messages from server localhost
Shared memory server interface initialized
I went through some forums, and they suggested looking for the output device options and setting them, so I ran:
ServerOptions.devices;
to look at the device list, but I got the following error in the post window:
ERROR: A primitive was not bound. 0 676
Instance of Method { (0x21199c0, gc=01, fmt=00, flg=11, set=04)
instance variables [15]
raw1 : Float 0.000000 00000000 0080000C
raw2 : Float 0.000000 00000300 03020003
code : instance of Int8Array (0x2119cc0, size=4, set=2)
selectors : nil
constants : nil
prototypeFrame : instance of Array (0x2119c00, size=3, set=2)
context : nil
argNames : instance of SymbolArray (0x2119b40, size=3, set=2)
varNames : nil
sourceCode : nil
ownerClass : class Meta_ServerOptions (0x21113c0)
name : Symbol 'prListDevices'
primitiveName : Symbol '_ListAudioDevices'
filenameSymbol : Symbol '/usr/share/SuperCollider/SCClassLibrary/Common/Control/Server.sc'
charPos : Integer 4025
}
ERROR: Primitive 'none' failed.
Failed.
RECEIVER:
nil
CALL STACK:
MethodError:reportError 0x3601498
arg this =
Nil:handleError 0x1f730f8
arg this = nil
arg error =
Thread:handleError 0x35fcfd8
arg this =
arg error =
Object:throw 0x3980c58
arg this =
Object:primitiveFailed 0x33395a8
arg this = nil
Interpreter:interpretPrintCmdLine 0x3d061e8
arg this =
var res = nil
var func =
var code = "ServerOptions.devices;"
var doc = nil
var ideClass =
Process:interpretPrintCmdLine 0x3443c08
arg this =
^^ The preceding error dump is for ERROR: Primitive 'none' failed.
Failed.
RECEIVER: nil
(The same boot messages and error dump appear again when I reboot the server and re-run the command.)
I am new to SuperCollider and am having a hard time figuring out the reason for the error. Please suggest how I can resolve this.
Thanks in Advance.
I was having a similar problem (no output from SuperCollider at all, just complete and total silence), and this post ultimately led me to the right solution. I think it will be helpful to you and others.
From the ServerOptions documentation, I found that I could configure how SC talks to JACK with environment variables.
In my case, I start scsynth with the relevant environment variables like so:
SC_JACK_DEFAULT_INPUTS="system:capture_1" SC_JACK_DEFAULT_OUTPUTS="system" scsynth -u 57110 &
It seems this can also be done from within sclang like so:
"SC_JACK_DEFAULT_INPUTS".setenv("system:capture_1");
"SC_JACK_DEFAULT_OUTPUTS".setenv("system");
In your case, where you are connecting to the wrong outputs, you might want to start scsynth like this:
SC_JACK_DEFAULT_OUTPUTS="system:playback_3,system:playback_4" scsynth -u 57110 &
Another alternative that will let you play with these connections and find what works for you is to use the jack_lsp, jack_connect, and jack_disconnect commands.
To see all of the ins/outs of your jack server as well as the current connections, run
jack_lsp -c
From your post, I think you will see something like
system:capture_1
   SuperCollider:in_1
system:capture_2
   SuperCollider:in_2
system:playback_1
   SuperCollider:out_1
system:playback_2
   SuperCollider:out_2
system:playback_3
system:playback_4
SuperCollider:out_1
   system:playback_1
SuperCollider:out_2
   system:playback_2
To make SuperCollider output to your headphones and speakers, you could connect out_1 and out_2 to playback_3 and playback_4 (assuming those are your headphones) like so:
jack_connect SuperCollider:out_1 system:playback_3
jack_connect SuperCollider:out_2 system:playback_4
To disconnect from the speakers, you could do
jack_disconnect SuperCollider:out_1 system:playback_1
jack_disconnect SuperCollider:out_2 system:playback_2
Run jack_lsp -c again to see if your system is set up how you want!
I had the same problem. I discovered the solution by using Catia from KXStudio. See Catia
Catia is a JACK patchbay. (Other patchbays are available; QjackCtl and Patchage are examples.) On my system (Ubuntu 14.04 on a Dell Studio laptop), SuperCollider maps its first 4 outputs to the 4 system playback ports. The first 2 system playback ports are the speakers; playback ports 3 and 4 are the headphones. By remapping out_1 and out_2 from SC to playback_3 and playback_4, I hear it through the headphones. So, get hold of a patchbay for JACK and see what you see.
Hope this helps.
After struggling with this issue countless times, I managed to get it working by doing the following:
Add your user to the audio group on Linux (see the command below).
Use Cadence to start JACK.
An additional resource that could be helpful: https://wiki.archlinux.org/index.php/JACK_Audio_Connection_Kit
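For the first step, on most distributions adding yourself to the audio group looks like this (you will need to log out and back in for it to take effect):
sudo usermod -a -G audio $USER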
Firstly: the big, long error message saying "A primitive was not bound" is unpleasant, but in this case it just means you typed the wrong command. I don't know where you got the command ServerOptions.devices from, but it's just wrong. Maybe the suggestion was intended to tell you to type s.options.device, which is more sensible, but it's NOT what you need to do. Forget that command and forget that long error message.
Secondly: the message you see when you boot the server is good; it tells you that the server has booted and connected to JACK. SuperCollider is happy. If you hear sound out of the speakers but not from the headphones (I take it you mean when you plug the headphones in!), this is NOT a SuperCollider problem but just a standard operating-system issue of setting the volume on your headphone output.
It appears that you're using Linux, so run the command alsamixer in a terminal; that's a good way to check whether the headphone output is muted. Use man alsamixer to understand how to use it, if it's not familiar to you.
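For example, something like this will show and adjust the headphone control from a terminal (the control name varies by sound card, so 'Headphone' here is only a guess; check what alsamixer actually lists):
amixer sget 'Headphone'
amixer sset 'Headphone' 80% unmute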
Related
I am building a Flutter app that needs to record audio and predict a label using a TFLite model I built. To link the audio recording and TFLite, I use the flutter_tflite_audio plugin (https://github.com/Caldarie/flutter_tflite_audio).
The TensorFlow model works on Colab, but when I launch the app and inference runs, i.e. when it calls interpreter.invoke(), the following error occurs:
TensorFlow Lite Error: tensorflow/lite/kernels/reshape.cc:58 stretch_dim != -1 (0 != -1)
TensorFlow Lite Error: Node number 26 (RESHAPE) failed to prepare.
Failed to invoke the interpreter with error: Must call allocateTensors().
2
Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value: file tflite_audio/SwiftTfliteAudioPlugin.swift, line 290
* thread #2, queue = 'conversionQueue', stop reason = Fatal error: Unexpectedly found nil while implicitly unwrapping an Optional value
frame #0: 0x00000001a672ee08 libswiftCore.dylib`_swift_runtime_on_report
libswiftCore.dylib`_swift_runtime_on_report:
-> 0x1a672ee08 <+0>: ret
libswiftCore.dylib`_swift_reportToDebugger:
0x1a672ee0c <+0>: b 0x1a672ee08 ; _swift_runtime_on_report
libswiftCore.dylib`_swift_shouldReportFatalErrorsToDebugger:
0x1a672ee10 <+0>: adrp x8, 341475
0x1a672ee14 <+4>: ldrb w0, [x8, #0x7c8]
Target 0: (Runner) stopped.
Lost connection to device.
This error message appears even though I added a call to allocateTensors() in SwiftTfliteAudioPlugin.swift, here:
var interval: TimeInterval!
var outputTensor: Tensor!
do {
// Copy the `[Int16]` buffer data as an array of Floats to the audio buffer input Tensor.
let audioBufferData = Data(copyingBufferOf: buffer.map { Float($0) / maxInt16AsFloat32 })
try interpreter.copy(audioBufferData, toInputAt: 0)
// I added this line
try interpreter.allocateTensors()
// Calculate inference time
let startDate = Date()
try interpreter.invoke() //required!!! Do not touch
interval = Date().timeIntervalSince(startDate) * 1000
// Get the output `Tensor` to process the inference results.
outputTensor = try interpreter.output(at: 0)
print(outputTensor as Any)
} catch let error {
print("Failed to invoke the interpreter with error: \(error.localizedDescription)")
}
Looking at the TFLite model in Netron, the problematic node is the reshape.
It looks like it is only squeezing the first dimension, and maybe it cannot do so because, as the model summary shows, the first dimension is None. I tried some tricks to avoid having this None, but I am not familiar enough with TensorFlow to be sure the operations I am doing are valid.
I have boiled my model down to a minimal size, and this node sits between these two lines of code, so I suspect tf.signal.stft is doing the reshaping, but I have no idea:
spectrograms = tf.signal.stft(waveforms,
frame_length=self.fft_size,
frame_step=self.hop_size,
pad_end=False)
magnitude_spectrograms = tf.abs(spectrograms)
Can anyone help on this issue?
Thanks!
As stated in the error message, you need to call allocateTensors() first.
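In other words, allocate the tensors before copying the input data. A minimal reordering of the snippet above (just a sketch; copyingBufferOf and maxInt16AsFloat32 are taken from that snippet) would be:
do {
    // Allocate input/output tensors before touching them
    try interpreter.allocateTensors()
    // Copy the audio buffer into input 0, then run inference
    let audioBufferData = Data(copyingBufferOf: buffer.map { Float($0) / maxInt16AsFloat32 })
    try interpreter.copy(audioBufferData, toInputAt: 0)
    try interpreter.invoke()
    let outputTensor = try interpreter.output(at: 0)
    print(outputTensor)
} catch {
    print("Failed to invoke the interpreter with error: \(error.localizedDescription)")
}
Note that this only addresses the "Must call allocateTensors()" part; the RESHAPE failure itself usually points back at the dynamic (None) first dimension mentioned in the question.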
I am trying to read the ActiveState property of a systemd unit with gdbus/glib-2.0. For sd-bus there is the convenient function sd_bus_get_property_string. What would the equivalent call be if gdbus is used? I am aware of the gdbus introspect command, but I need to implement this in C/C++.
I have already managed to start and stop units. Now I need to verify that a unit has been successfully started or stopped. I am new to D-Bus and have been searching the internet for some hours for an example, without finding anything helpful.
I also implemented some systemd handling in C++. Here is my solution:
std::string Unit::GetPropertyString(const std::string& property) const
{
sd_bus_error err = SD_BUS_ERROR_NULL;
char* msg = nullptr;
int r;
r = sd_bus_get_property_string(m_bus,
"org.freedesktop.systemd1",
("/org/freedesktop/systemd1/unit/" + m_unit).c_str(),
"org.freedesktop.systemd1.Unit",
property.c_str(),
&err,
&msg);
if (r < 0)
{
std::string err_msg(err.message);
sd_bus_error_free(&err);
std::string err_str("Failed to get " + property + " for service "
+ m_name + ". Error: " + err_msg);
throw slib_exception(err_str);
}
sd_bus_error_free(&err);
// Free memory (avoid leaking)
std::string ret(msg);
free (msg);
return ret;
}
From this, you can call
activestate = GetPropertyString("ActiveState");
substate = GetPropertyString("SubState");
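If it helps, this code links against libsystemd; a typical compile line (unit.cpp is just a placeholder file name) would be something like:
g++ -std=c++11 unit.cpp $(pkg-config --cflags --libs libsystemd)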
I found that a lot of <systemd/sd-bus.h> isn't well documented. There is a fantastic explanation by the author here:
http://0pointer.net/blog/the-new-sd-bus-api-of-systemd.html
But outside of the few examples he gives, I found it easier to inspect the source code. Specifically, I found it helpful to look at the source code of the systemctl and journalctl applications to see how sd-bus is used in those contexts.
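Since the question asked about GDBus: as far as I know there is no single-call helper like sd_bus_get_property_string, but you can call the standard org.freedesktop.DBus.Properties.Get method yourself. A rough, untested sketch (the unit object path would normally come from the manager's GetUnit/LoadUnit call, and the connection from g_bus_get_sync(G_BUS_TYPE_SYSTEM, ...)):
#include <gio/gio.h>

/* Sketch: read a unit's ActiveState via org.freedesktop.DBus.Properties.Get */
static gchar *get_active_state(GDBusConnection *conn, const gchar *unit_path)
{
    GError *error = NULL;
    GVariant *reply = g_dbus_connection_call_sync(
        conn,
        "org.freedesktop.systemd1",         /* destination */
        unit_path,                          /* e.g. object path from GetUnit/LoadUnit */
        "org.freedesktop.DBus.Properties",  /* interface */
        "Get",                              /* method */
        g_variant_new("(ss)", "org.freedesktop.systemd1.Unit", "ActiveState"),
        G_VARIANT_TYPE("(v)"),
        G_DBUS_CALL_FLAGS_NONE, -1, NULL, &error);
    if (reply == NULL) {
        g_printerr("Properties.Get failed: %s\n", error->message);
        g_error_free(error);
        return NULL;
    }

    GVariant *value = NULL;
    g_variant_get(reply, "(v)", &value);       /* unwrap the variant */
    gchar *state = g_variant_dup_string(value, NULL);
    g_variant_unref(value);
    g_variant_unref(reply);
    return state;                              /* caller frees with g_free() */
}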
I'm having trouble calculating the MAC of the Finished message. The RFC gives the formula:
HMAC_hash(MAC_write_secret, seq_num + TLSCompressed.type +
TLSCompressed.version + TLSCompressed.length +
TLSCompressed.fragment));
But the TLSCompressed (TLSPlaintext in this case, because no compression is used) does not contain version information (hex dump):
14 00 00 0c 2c 93 e6 c5 d1 cb 44 12 bd a0 f9 2d
the first byte is the tlsplaintext.type, followed by uint24 length.
The full message, with the MAC and padding appended and before encryption is
1400000c2c93e6c5d1cb4412bda0f92dbc175a02daab04c6096da8d4736e7c3d251381b10b
I have tried to calculate the HMAC with the following parameters (complying with the RFC), but it does not work:
uint64 seq_num
uint8 tlsplaintext.type
uint8 tlsplaintext.version_major
uint8 tlsplaintext.version_minor
uint16 tlsplaintext.length
opaque tlsplaintext.fragment
I have also tried omitting the version and using a uint24 length instead; no luck.
My hmac_hash() function cannot be the problem because it has worked thus far. I am also able to compute the verify_data and verify it.
Because this is the first message sent under the new connection state, the sequence number is 0.
So, what exactly are the parameters for the calculation of the MAC for the finished message?
Here's the relevant source from Forge (JS implementation of TLS 1.0):
The HMAC function:
var hmac_sha1 = function(key, seqNum, record) {
/* MAC is computed like so:
HMAC_hash(
key, seqNum +
TLSCompressed.type +
TLSCompressed.version +
TLSCompressed.length +
TLSCompressed.fragment)
*/
var hmac = forge.hmac.create();
hmac.start('SHA1', key);
var b = forge.util.createBuffer();
b.putInt32(seqNum[0]);
b.putInt32(seqNum[1]);
b.putByte(record.type);
b.putByte(record.version.major);
b.putByte(record.version.minor);
b.putInt16(record.length);
b.putBytes(record.fragment.bytes());
hmac.update(b.getBytes());
return hmac.digest().getBytes();
};
The function that creates the Finished record:
tls.createFinished = function(c) {
// generate verify_data
var b = forge.util.createBuffer();
b.putBuffer(c.session.md5.digest());
b.putBuffer(c.session.sha1.digest());
// TODO: determine prf function and verify length for TLS 1.2
var client = (c.entity === tls.ConnectionEnd.client);
var sp = c.session.sp;
var vdl = 12;
var prf = prf_TLS1;
var label = client ? 'client finished' : 'server finished';
b = prf(sp.master_secret, label, b.getBytes(), vdl);
// build record fragment
var rval = forge.util.createBuffer();
rval.putByte(tls.HandshakeType.finished);
rval.putInt24(b.length());
rval.putBuffer(b);
return rval;
};
The code to handle a Finished message is a bit lengthier and can be found here. I see that I have a comment in that code that sounds like it might be relevant to your problem:
// rewind to get full bytes for message so it can be manually
// digested below (special case for Finished messages because they
// must be digested *after* handling as opposed to all others)
Does this help you spot anything in your implementation?
Update 1
Per your comments, I wanted to clarify how TLSPlainText works. TLSPlainText is the main "record" for the TLS protocol. It is the "wrapper" or "envelope" for content-specific types of messages. It always looks like this:
struct {
ContentType type;
ProtocolVersion version;
uint16 length;
opaque fragment[TLSPlaintext.length];
} TLSPlaintext;
So it always has a version. A Finished message is a type of handshake message. All handshake messages have a content type of 22. A handshake message looks like this:
struct {
HandshakeType msg_type;
uint24 length;
body
} Handshake;
A Handshake message is yet another envelope/wrapper for other messages, like the Finished message. In this case, the body will be a Finished message (HandshakeType 20), which looks like this:
struct {
opaque verify_data[12];
} Finished;
To actually send a Finished message, you have to wrap it up in a Handshake message envelope, and then like any other message, you have to wrap it up in a TLS record (TLSPlainText). The ultimate result looks/represents something like this:
struct {
ContentType type=22;
ProtocolVersion version=<major, minor>;
uint16 length=<length of fragment>;
opaque fragment=<struct {
HandshakeType msg_type=20;
uint24 length=<length of finished message>;
body=<struct {
opaque verify_data[12];
} Finished>;
} Handshake>;
} TLSPlainText;
Then, before transport, the record may be altered. You can think of these alterations as operations that take a record and transform its fragment (and fragment length). The first operation compresses the fragment. After compression you compute the MAC, as described above and then append that to the fragment. Then you encrypt the fragment (adding the appropriate padding if using a block cipher) and replace it with the ciphered result. So, when you're finished, you've still got a record with a type, version, length, and fragment, but the fragment is encrypted.
So, just so we're clear, when you're computing the MAC for the Finished message, imagine passing in the above TLSPlainText (assuming there's no compression as you indicated) to a function. This function takes this TLSPlainText record, which has properties for type, version, length, and fragment. The HMAC function above is run on the record. The HMAC key and sequence number (which is 0 here) are provided via the session state. Therefore, you can see that everything the HMAC function needs is available.
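To make that concrete for your capture: assuming TLS 1.0 (version bytes 03 01) and the 16-byte handshake fragment from your hex dump, the data fed to HMAC_hash (keyed with the client write MAC secret) for this first Finished record would be:
00 00 00 00 00 00 00 00                           seq_num (uint64, zero for the first record)
16                                                TLSCompressed.type (22 = handshake)
03 01                                             TLSCompressed.version
00 10                                             TLSCompressed.length (16 bytes)
14 00 00 0c 2c 93 e6 c5 d1 cb 44 12 bd a0 f9 2d   TLSCompressed.fragment
That is 29 bytes in total; the resulting MAC (20 bytes in your dump, so HMAC-SHA1) is what gets appended before padding and encryption.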
In any case, hopefully this better explains how the protocol works and that will maybe reveal what's going wrong with your implementation.
I'm trying to use a RemoteIO connection to route the audio input through the built-in filter effect (iOS 5 only) and then back out of the hardware. I can make it route straight from the input to the output, but I can't get the filter to work. I'm not sure whether it's the filter Audio Unit or the routing that I've got wrong.
This bit is just my attempt at setting up the filter and changing the routing so that the data is processed by it.
Any help is appreciated.
// ******* BEGIN FILTER ********
NSLog(@"Begin filter");
// Creates Audio Component Description - Output Filter
AudioComponentDescription filterCompDesc;
filterCompDesc .componentType = kAudioUnitType_Effect;
filterCompDesc.componentSubType = kAudioUnitSubType_LowPassFilter;
filterCompDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
filterCompDesc.componentFlags = 1;
filterCompDesc.componentFlagsMask = 1;
// Create Filter Unit
AudioUnit lpFilterUnit;
AudioComponent filterComponent = AudioComponentFindNext(NULL, &filterCompDesc);
setupErr = AudioComponentInstanceNew(filterComponent, &lpFilterUnit);
NSAssert(setupErr == noErr, @"No instance of filter");
AudioUnitElement bus2 = 2;
setupErr = AudioUnitSetProperty(lpFilterUnit, kAudioUnitSubType_LowPassFilter, kAudioUnitScope_Output, bus2, &oneFlag, sizeof(oneFlag));
AudioUnitElement bus3 = 3;
setupErr = AudioUnitSetProperty(lpFilterUnit, kAudioUnitSubType_LowPassFilter, kAudioUnitScope_Input, bus3, &oneFlag, sizeof(oneFlag));
// ******** END FILTER ******** //
AudioUnitConnection hardInToLP;
hardInToLP.sourceAudioUnit = remoteIOunit;
hardInToLP.sourceOutputNumber = 1;
hardInToLP.destInputNumber = 3;
setupErr = AudioUnitSetProperty (
remoteIOunit, // connection destination
kAudioUnitProperty_MakeConnection, // property key
kAudioUnitScope_Input, // destination scope
bus3, // destination element
&hardInToLP, // connection definition
sizeof (hardInToLP)
);
AudioUnitConnection LPToHardOut;
LPToHardOut.sourceAudioUnit = lpFilterUnit;
LPToHardOut.sourceOutputNumber = 1;
LPToHardOut.destInputNumber = 3;
setupErr = AudioUnitSetProperty (
remoteIOunit, // connection destination
kAudioUnitProperty_MakeConnection, // property key
kAudioUnitScope_Input, // destination scope
bus3, // destination element
&hardInToLP, // connection definition
sizeof (hardInToLP)
);
/*
// Sets up the Audio Units Connection - new instance called connection
AudioUnitConnection connection;
// Connect Audio Input's out to Audio Out's in
connection.sourceAudioUnit = remoteIOunit;
connection.sourceOutputNumber = bus1;
connection.destInputNumber = bus0;
setupErr = AudioUnitSetProperty(remoteIOunit, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, bus0, &connection, sizeof(connection));
*/
NSAssert(setupErr == noErr, @"No RIO connection");
A couple things going on here:
You're gonna help yourself a lot if you do an assert (or some sort of check-error-and-log-it) after every call that can return an OSStatus. That way you'll figure out how far you're getting. Probably also want to log the actual OSStatus value when it's != noErr, and then look it up (start in "Audio Unit Component Services Reference" in Xcode documentation viewer).
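For example, a tiny helper macro along these lines (just a sketch) makes it obvious which call is failing and with what status:
#define CheckStatus(err, msg) \
    do { if ((err) != noErr) NSLog(@"%s failed, OSStatus = %d", (msg), (int)(err)); } while (0)

setupErr = AudioComponentInstanceNew(filterComponent, &lpFilterUnit);
CheckStatus(setupErr, "AudioComponentInstanceNew");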
After you create the filter AudioUnit, I don't get what you're doing with the AudioUnitSetProperty() calls. The second parameter should be the name of a property (something that starts with kAudioUnitProperty...). That's almost certainly returning an error right there.
remoteIOunit only has two buses, and they have special meanings: bus 1 is input from the mic, bus 0 is output to hardware. Trying to connect to RemoteIO input scope bus 3 is probably going to be another error.
Suggest you roll back to when you had audio pass-through working. That would mean you had just remoteIO, and a connection from output scope / bus 1 to input scope / bus 0.
Then create the filter unit. Change your connections so you connect:
remoteIO output scope bus 1 to filter input scope bus 0
filter output scope bus 0 to remoteIO input scope bus 0
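In code, using the variable names from your snippet, those two connections would look roughly like this (a sketch; error checking omitted):
// mic: remoteIO output scope, bus 1  ->  filter input scope, bus 0
AudioUnitConnection micToFilter = { remoteIOunit, 1, 0 };
setupErr = AudioUnitSetProperty(lpFilterUnit, kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input, 0, &micToFilter, sizeof(micToFilter));

// filter output scope, bus 0  ->  remoteIO input scope, bus 0 (hardware output)
AudioUnitConnection filterToOut = { lpFilterUnit, 0, 0 };
setupErr = AudioUnitSetProperty(remoteIOunit, kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input, 0, &filterToOut, sizeof(filterToOut));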
The other thing that's going to be a problem is that all these iOS 5 filters seem to want to use floating-point LPCM formats, which is not the canonical format your other units will default to. You may have to get the stream format from the filter unit (input or output scope are probably the same?) and then set that as the format that remoteIO output scope / bus 1 produces and remoteIO input scope / bus 0 accepts. Another option would be to introduce AUConverter units before and after the filter unit.
The first answer given here just saved me a lot more frustration. Nowhere does the Apple documentation tell you that the stream formats for the Effect units require floating point. I couldn't figure out why it kept failing to play my audio properly until I read this post. I followed the advice above: I retrieved the stream format from the low-pass filter unit and used it to set up the two converter units that I created (i.e. I set the output format of the pre-filter converter and the input format of the post-filter converter). Once I did that and connected all the nodes together, it started working as expected.
I'm trying to use a low-pass filter, and when I try to do as suggested (i.e. set the format), I keep getting the error "the operation could not be completed". What in this code is faulty?
After retrieving the lowpassUnit I also check for errors, but there are none.
result = AudioUnitSetProperty(lowpassUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &stereoStreamFormat, sizeof (stereoStreamFormat));
if (noErr != result)
{
NSLog(@"%@", [NSError errorWithDomain:NSOSStatusErrorDomain code:result userInfo:nil]);
return;
}
PS: If anyone knows of proper Audio Unit documentation, please share, as the official documentation is really lacking.
I'm trying to experiment with using splice (man 2 splice) to copy data from a UDP socket directly to a file. Unfortunately the first call to splice() returns EINVAL.
The man page states:
EINVAL Target file system doesn't support splicing; target file is opened in
append mode; neither of the descriptors refers to a pipe; or offset
given for nonseekable device.
However, I believe none of those conditions apply. I'm using Fedora 15 (kernel 2.6.40-4) so I believe splice() is supported on all filesystems. The target file should be irrelevant in the first call to splice, but for completeness I'm opening it via open(path, O_CREAT | O_WRONLY | O_TRUNC, S_IRUSR | S_IWUSR). Both calls use a pipe and neither call uses an offset besides NULL.
Here's my sample code:
int sz = splice(sock_fd, 0, mPipeFds[1], 0, 8192, SPLICE_F_MORE);
if (-1 == sz)
{
int err = errno;
LOG4CXX_ERROR(spLogger, "splice from: " << strerror(err));
return 0;
}
sz = splice(mPipeFds[0], 0, file_fd, 0, sz, SPLICE_F_MORE);
if (-1 == sz)
{
int err = errno;
LOG4CXX_ERROR(spLogger, "splice to: " << strerror(err));
}
return 0;
sock_fd is initialized by the following pseudocode:
int sock_fd = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
setsockopt(sock_fd, SOL_SOCKET, SO_REUSEADDR, &one, sizeof(one));
fcntl(sock_fd, F_SETFL, flags | O_NONBLOCK);
bind(sock_fd, ...);
Possibly related is that this code snippet is running inside a libevent loop. libevent is using epoll() to determine if the UDP socket is hot.
Found my answer. tl;dr - UDP isn't supported on the inbound side.
After enough Googling, I stumbled upon a forum discussion and some test code that prints out a table of in/out fd types and whether splice supports them:
$ ./a.out
in\out pipe reg chr unix tcp udp
pipe yes yes yes yes yes yes
reg yes no no no no no
chr yes no no no no no
unix no no no no no no
tcp yes no no no no no
udp no no no no no no
Yeah, it is definitely not supported for reading from a UDP socket, even in the latest kernels. References to the kernel source follow.
splice invokes do_splice in the kernel, which calls do_splice_to, which calls the splice_read member in the file_operations structure for the file.
For sockets, that structure is defined as socket_file_ops in net/socket.c, which initializes the splice_read field to sock_splice_read.
That function, in turn, contains this line of code:
if (unlikely(!sock->ops->splice_read))
return -EINVAL;
The ops field of the socket is a struct proto_ops. For an IPv4 UDP socket, it is initialized to inet_dgram_ops in net/ipv4/af_inet.c. Finally, that structure does not explicitly initialize the splice_read field of struct proto_ops; i.e., it initializes it to zero.
So sock_splice_read returns -EINVAL, and that propagates up.
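Given that, if the source has to be a UDP socket, the practical fallback is a plain read/write loop instead of splice (a sketch, ignoring the non-blocking/libevent details of the original code):
char buf[8192];
ssize_t n;
while ((n = recv(sock_fd, buf, sizeof(buf), 0)) > 0)
{
    if (write(file_fd, buf, (size_t)n) != n)
        break; /* short write or error */
}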