Vulkan load vkCreateDebugReportCallbackEXT

I am getting into Vulkan and stumbled on my first problem. When trying to create a debug report callback (validation layers and debug extensions are reported as available by my Intel HD Vulkan driver), linking fails telling me vkCreateDebugReportCallbackEXT is an unresolved symbol. When I instead try to load the function pointer myself under that name, it fails telling me vkCreateDebugReportCallbackEXT is already defined.
Which it is, in the Vulkan header. I could set VK_NO_PROTOTYPES, but then I would have to load everything by hand. Is there a way around this? Just using a different name for the function pointer won't work, since I am using Vulkan-Hpp and it uses vkCreateDebugReportCallbackEXT as it is.
Is this a driver bug, telling me the debug extensions are available when they are not?
Btw, I am using VS2015.
Thanks for any help

That's normal. vulkan.h declares them as global functions, but the loader commands obviously return function pointers.
Normally you would just use whatever different name you like. But I like to have the canonical names too...
I solve it by defining the function myself (using the declaration from vulkan.h), which in turn calls the loaded pointer:
VKAPI_ATTR VkResult VKAPI_CALL vkCommandEXT( /*...*/ ){
return fpCommandEXT( /*...*/ );
}
(Shameless self-promotion) Like so:
https://github.com/krOoze/Hello_Triangle/blob/8227220/ErrorHandling.h#L181
I make the command self-load on its first use; if you don't like that, an older commit has a more conventional loader:
https://github.com/krOoze/Hello_Triangle/blob/699ab57/HelloTriangle.cpp#L731
PS:
Khronos themselves just added loader code that illustrates that nicely:
https://github.com/KhronosGroup/Vulkan-Docs/blob/1.0/src/ext_loader/vulkan_ext.c
If you handle multiple VkInstances or VkDevices, the loaded functions have to be dispatched to the correct instance or device. For example, I do that (likely inefficiently) here:
https://github.com/krOoze/Hello_Triangle/blob/a691de5/ExtensionLoader.h
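To make the wrapper trick above concrete, here is a minimal sketch of my own (not taken from the linked repo) for vkCreateDebugReportCallbackEXT; the fp pointer is a hypothetical global that has to be filled once after instance creation:
#include <vulkan/vulkan.h>
// Hypothetical global holding the loaded entry point.
static PFN_vkCreateDebugReportCallbackEXT fpCreateDebugReportCallbackEXT = nullptr;
// Call once after vkCreateInstance().
void loadDebugReportCommands( VkInstance instance ){
    fpCreateDebugReportCallbackEXT = (PFN_vkCreateDebugReportCallbackEXT)
        vkGetInstanceProcAddr( instance, "vkCreateDebugReportCallbackEXT" );
}
// Definition with the canonical name, matching the prototype in vulkan.h, so code
// (including Vulkan-Hpp) that links against vkCreateDebugReportCallbackEXT resolves to it.
VKAPI_ATTR VkResult VKAPI_CALL vkCreateDebugReportCallbackEXT(
    VkInstance instance,
    const VkDebugReportCallbackCreateInfoEXT* pCreateInfo,
    const VkAllocationCallbacks* pAllocator,
    VkDebugReportCallbackEXT* pCallback ){
    return fpCreateDebugReportCallbackEXT( instance, pCreateInfo, pAllocator, pCallback );
}
The same pattern applies to vkDestroyDebugReportCallbackEXT and vkDebugReportMessageEXT.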

I had the same issue and couldn't find a solution, so I solved it this way
(it might be wrong, but I just want to share it in case it helps somebody):
struct DebugDispatch {
//VK_EXT_debug_report
PFN_vkCreateDebugReportCallbackEXT vkCreateDebugReportCallbackEXT = 0;
PFN_vkDestroyDebugReportCallbackEXT vkDestroyDebugReportCallbackEXT = 0;
//VK_EXT_debug_utils
PFN_vkCreateDebugUtilsMessengerEXT vkCreateDebugUtilsMessengerEXT = 0;
PFN_vkDestroyDebugUtilsMessengerEXT vkDestroyDebugUtilsMessengerEXT = 0;
};
VKAPI_ATTR vk::Bool32 VKAPI_CALL debugReportCallback(...){...}
VKAPI_ATTR vk::Bool32 VKAPI_CALL debugUtilsMessengerCallback(...){...}
enum class ValidationFlagsBits : unsigned int {
NONE = 0,
KHRONOS = 1,
LUNARG = 1 << 1
};
typedef vk::Flags<ValidationFlagsBits> ValidationFlags;
void Example(){
...
vk::Instance instance;
instance = vk::createInstance(...);
DebugDispatch debug_dispatch;
vk::DebugReportCallbackEXT debug_report_callback;
vk::DebugUtilsMessengerEXT debug_utils_messenger;
if(validation_flags & ValidationFlagsBits::KHRONOS){
//load the VK_EXT_debug_utils entry points used by this branch
debug_dispatch.vkCreateDebugUtilsMessengerEXT =
(PFN_vkCreateDebugUtilsMessengerEXT)instance.getProcAddr("vkCreateDebugUtilsMessengerEXT");
debug_dispatch.vkDestroyDebugUtilsMessengerEXT =
(PFN_vkDestroyDebugUtilsMessengerEXT)instance.getProcAddr("vkDestroyDebugUtilsMessengerEXT");
vk::DebugUtilsMessengerCreateInfoEXT create_info{};
create_info.messageSeverity = ...;
create_info.messageType = ...;
create_info.pfnUserCallback = reinterpret_cast<PFN_vkDebugUtilsMessengerCallbackEXT>(&debugUtilsMessengerCallback);
debug_utils_messenger = instance.createDebugUtilsMessengerEXT(create_info, nullptr, debug_dispatch);
}
if(validation_flags & ValidationFlagsBits::LUNARG){
//load the VK_EXT_debug_report entry points used by this branch
debug_dispatch.vkCreateDebugReportCallbackEXT =
(PFN_vkCreateDebugReportCallbackEXT)instance.getProcAddr("vkCreateDebugReportCallbackEXT");
debug_dispatch.vkDestroyDebugReportCallbackEXT =
(PFN_vkDestroyDebugReportCallbackEXT)instance.getProcAddr("vkDestroyDebugReportCallbackEXT");
vk::DebugReportCallbackCreateInfoEXT create_info{};
create_info.flags = ...;
create_info.pfnCallback = reinterpret_cast<PFN_vkDebugReportCallbackEXT>(&debugReportCallback);
debug_report_callback = instance.createDebugReportCallbackEXT(create_info, nullptr, debug_dispatch);
}
...
if(validation_flags & ValidationFlagsBits::KHRONOS){
instance.destroyDebugUtilsMessengerEXT(debug_utils_messenger, nullptr, debug_dispatch);
}
if(validation_flags & ValidationFlagsBits::LUNARG){
instance.destroyDebugReportCallbackEXT(debug_report_callback, nullptr, debug_dispatch);
}
instance.destroy();
}
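As a side note (not part of the original answer): instead of hand-rolling a dispatch struct, Vulkan-Hpp also ships vk::DispatchLoaderDynamic, which loads the extension entry points itself and can be passed in the same trailing-parameter position. A rough sketch, assuming the instance from the example above:
// Built-in Vulkan-Hpp dynamic dispatcher, filled from the instance.
vk::DispatchLoaderDynamic dldi(instance, vkGetInstanceProcAddr);
vk::DebugUtilsMessengerCreateInfoEXT create_info{};
// ... fill messageSeverity / messageType / pfnUserCallback as above ...
auto debug_utils_messenger = instance.createDebugUtilsMessengerEXT(create_info, nullptr, dldi);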


Unable to compile tensorflow lite examples on adafruit circuitplayground bluefruit due to missing files in Adafruit_Tensorflow_Lite library

I am unable to compile the examples hello_world_arcada and micro_speech_arcada (shown below) from the Adafruit website found here, on my Circuit Playground Bluefruit microcontroller.
I installed the Adafruit_Tensorflow_Lite library as mentioned on the site; however, it turns out the examples cannot compile because they have numerous missing files. So I downloaded this TensorFlow GitHub repo and then transferred the missing files into the Adafruit_Tensorflow_Lite library.
I am now facing errors for the missing files am_bsp.h, am_mcu_apollo.h and am_util.h; I cannot locate these files in the repo or on Google. [Note: I have found the am_bsp.h file in this repo,
but it still doesn't compile.]
Can anyone help me locate these files, or suggest another way to compile the example code mentioned on the Adafruit website?
The compile error from the Arduino IDE complains about the missing file am_bsp.h.
My code is shown below:
#include <TensorFlowLite.h>
#include "Adafruit_TFLite.h"
#include "Adafruit_Arcada.h"
#include "output_handler.h"
#include "sine_model_data.h"
// Create an area of memory to use for input, output, and intermediate arrays.
// Finding the minimum value for your model may require some trial and error.
const int kTensorAreaSize (2 * 1024);
// This constant represents the range of x values our model was trained on,
// which is from 0 to (2 * Pi). We approximate Pi to avoid requiring additional
// libraries.
const float kXrange = 2.f * 3.14159265359f;
// Will need tuning for your chipset
const int kInferencesPerCycle = 200;
int inference_count = 0;
Adafruit_Arcada arcada;
Adafruit_TFLite ada_tflite(kTensorAreaSize);
// The name of this function is important for Arduino compatibility.
void setup() {
Serial.begin(115200);
//while (!Serial) yield();
arcada.arcadaBegin();
// If we are using TinyUSB we will have the filesystem show up!
arcada.filesysBeginMSD();
arcada.filesysListFiles();
// Set the display to be on!
arcada.displayBegin();
arcada.setBacklight(255);
arcada.display->fillScreen(ARCADA_BLUE);
if (! ada_tflite.begin()) {
arcada.haltBox("Failed to initialize TFLite");
while (1) yield();
}
if (arcada.exists("model.tflite")) {
arcada.infoBox("Loading model.tflite from disk!");
if (! ada_tflite.loadModel(arcada.open("model.tflite"))) {
arcada.haltBox("Failed to load model file");
}
} else if (! ada_tflite.loadModel(g_sine_model_data)) {
arcada.haltBox("Failed to load default model");
}
Serial.println("\nOK");
// Keep track of how many inferences we have performed.
inference_count = 0;
}
// The name of this function is important for Arduino compatibility.
void loop() {
// Calculate an x value to feed into the model. We compare the current
// inference_count to the number of inferences per cycle to determine
// our position within the range of possible x values the model was
// trained on, and use this to calculate a value.
float position = static_cast<float>(inference_count) /
static_cast<float>(kInferencesPerCycle);
float x_val = position * kXrange;
// Place our calculated x value in the model's input tensor
ada_tflite.input->data.f[0] = x_val;
// Run inference, and report any error
TfLiteStatus invoke_status = ada_tflite.interpreter->Invoke();
if (invoke_status != kTfLiteOk) {
ada_tflite.error_reporter->Report("Invoke failed on x_val: %f\n",
static_cast<double>(x_val));
return;
}
// Read the predicted y value from the model's output tensor
float y_val = ada_tflite.output->data.f[0];
// Output the results. A custom HandleOutput function can be implemented
// for each supported hardware target.
HandleOutput(ada_tflite.error_reporter, x_val, y_val);
// Increment the inference_counter, and reset it if we have reached
// the total number per cycle
inference_count += 1;
if (inference_count >= kInferencesPerCycle) inference_count = 0;
}
Try installing the library from the link below; it should solve your problem:
https://github.com/tensorflow/tflite-micro-arduino-examples#how-to-install

sd_bus_get_property_string equivalent for gdbus?

I am trying to read out the ActiveState property of a systemd unit with gdbus/glib-2.0. For sd-bus there is the convenient function sd_bus_get_property_string. What would be the equivalent call if gdbus is used? I am aware of the gdbus introspect command, but I need to implement this in C/C++.
I have already managed to start and stop units. Now I need to verify that a unit has been successfully started or stopped. I am new to D-Bus and have been searching the internet for some hours for an example, without finding anything helpful.
I have also implemented some systemd handling in C++. Here is my solution:
std::string Unit::GetPropertyString(const std::string& property) const
{
sd_bus_error err = SD_BUS_ERROR_NULL;
char* msg = nullptr;
int r;
r = sd_bus_get_property_string(m_bus,
"org.freedesktop.systemd1",
("/org/freedesktop/systemd1/unit/" + m_unit).c_str(),
"org.freedesktop.systemd1.Unit",
property.c_str(),
&err,
&msg);
if (r < 0)
{
std::string err_msg(err.message);
sd_bus_error_free(&err);
std::string err_str("Failed to get " + property + " for service "
+ m_name + ". Error: " + err_msg);
throw slib_exception(err_str);
}
sd_bus_error_free(&err);
// Free memory (avoid leaking)
std::string ret(msg);
free (msg);
return ret;
}
From this, you can call
activestate = GetPropertyString("ActiveState");
substate = GetPropertyString("SubState");
I found that a lot of <systemd/sd-bus.h> isn't well documented. There is a fantastic explanation by the author here:
http://0pointer.net/blog/the-new-sd-bus-api-of-systemd.html
But outside of the few examples he gives, I found it easier to inspect the source code. Specifically, I found it helpful to look into the source code of the systemctl and journalctl applications to see how sd-bus is used in those contexts.
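For the actual GDBus side of the question: I'm not aware of a single-call helper like sd_bus_get_property_string, but the usual pattern is to invoke the standard org.freedesktop.DBus.Properties.Get method via g_dbus_connection_call_sync. A rough, untested sketch (the connection would come from e.g. g_bus_get_sync(G_BUS_TYPE_SYSTEM, ...), and unit_path must already be the escaped object path, as in the sd-bus version above):
#include <gio/gio.h>
#include <string>
#include <stdexcept>

std::string GetUnitPropertyString(GDBusConnection* bus,
                                  const std::string& unit_path,  // e.g. "/org/freedesktop/systemd1/unit/..."
                                  const std::string& property)   // e.g. "ActiveState"
{
    GError* error = nullptr;
    GVariant* reply = g_dbus_connection_call_sync(
        bus,
        "org.freedesktop.systemd1",
        unit_path.c_str(),
        "org.freedesktop.DBus.Properties",   // the standard properties interface
        "Get",
        g_variant_new("(ss)", "org.freedesktop.systemd1.Unit", property.c_str()),
        G_VARIANT_TYPE("(v)"),
        G_DBUS_CALL_FLAGS_NONE,
        -1,                                  // default timeout
        nullptr,
        &error);
    if (!reply)
    {
        std::string err_msg = error ? error->message : "unknown error";
        g_clear_error(&error);
        throw std::runtime_error("Failed to get " + property + ": " + err_msg);
    }
    GVariant* value = nullptr;
    g_variant_get(reply, "(v)", &value);     // unpack the boxed variant
    std::string result = g_variant_get_string(value, nullptr);
    g_variant_unref(value);
    g_variant_unref(reply);
    return result;
}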

Parallel Dynamic Programming with CUDA

This is my first attempt to implement recursion with CUDA. The goal is to extract all the combinations from the set of chars "12345", using CUDA dynamic parallelism to parallelize the task. Here is my kernel:
__device__ char route[31] = { "_________________________"};
__device__ char init[6] = { "12345" };
__global__ void Recursive(int depth) {
// up to depth 6
if (depth == 5) return;
// newroute = route - idx
int x = depth * 6;
printf("%s\n", route);
int o = 0;
int newlen = 0;
for (int i = 0; i<6; ++i)
{
if (i != threadIdx.x)
{
route[i+x-o] = init[i];
newlen++;
}
else
{
o = 1;
}
}
Recursive<<<1,newlen>>>(depth + 1);
}
__global__ void RecursiveCount() {
Recursive <<<1,5>>>(0);
}
The idea is to exclude one item (the item corresponding to threadIdx) in each thread. In each recursive call, using the variable depth, the kernel works over a different offset (variable x) into the route device variable.
I expect the kernel to print something like:
2345_____________________
1345_____________________
1245_____________________
1234_____________________
2345_345_________________
2345_245_________________
2345_234_________________
2345_345__45_____________
2345_345__35_____________
2345_345__34_____________
..
2345_245__45_____________
..
But it prints ...
_____________
_____________
_____________
_____________
_____________
2345
2345
2345
2345
...
What am I doing wrong?
What am I doing wrong?
I may not articulate every problem with your code, but these items should get you a lot closer.
I recommend providing a complete example. In my view it is basically required by Stack Overflow; see item 1 here, and note the use of the word "must". Your example is missing any host code, including the original kernel call. It's only a few extra lines of code, so why not include it? Sure, in this case I can deduce what the call must have been, but why not just include it? Anyway, based on the output you indicated, it seems fairly evident the launch configuration of the host launch would have to be <<<1,1>>>.
This doesn't seem to be logical to me:
I expect the kernel prompts something like:
2345_____________________
The very first thing your kernel does is print out the route variable, before making any changes to it, so I would expect _____________________. However we can "fix" this by moving the printout to the end of the kernel.
You may be confused about what a __device__ variable is. It is a global variable, and there is only one copy of it. Therefore, when you modify it in your kernel code, every thread, in every kernel, is attempting to modify the same global variable, at the same time. That cannot possibly have orderly results, in any thread-parallel environment. I chose to "fix" this by making a local copy for each thread to work on.
You have an off-by-1 error, as well as an extent error in this loop:
for (int i = 0; i<6; ++i)
The off-by-1 error is due to the fact that you are iterating over 6 possible items (that is, i can reach a value of 5), but there are only 5 items in your init variable (the 6th item being a null terminator). The correct indexing starts out over 0-4 (with one of those being skipped). On subsequent iteration depths, it's necessary to reduce this indexing extent by 1. Note that I've chosen to fix the first error here by increasing the length of init. There are other ways to fix it, of course. My method inserts an extra _ between depths in the result.
You assume that at each iteration depth, the correct choice of items is the same, and in the same order, i.e. init. However this is not the case. At each depth, the choices of items must be selected not from the unchanging init variable, but from the choices passed from previous depth. Therefore we need a local, per-thread copy of init also.
A few other comments about CUDA Dynamic Parallelism (CDP). When passing pointers to data from one kernel scope to a child scope, local space pointers cannot be used. Therefore I allocate for the local copy of route from the heap, so it can be passed to child kernels. init can be deduced from route, so we can use an ordinary local variable for myinit.
You're going to quickly hit some dynamic parallelism (and perhaps memory) limits here if you continue this. I believe the total number of kernel launches for this is 5^5, which is 3125 (I'm doing this quickly, I may be mistaken). CDP has a pending launch limit of 2000 kernels by default. We're not hitting this here according to what I see, but you'll run into it sooner or later if you increase the depth or width of this operation. Furthermore, in-kernel allocations from the device heap are by default limited to 8MB. I don't seem to be hitting that limit, but perhaps I am, so my design should probably be modified to account for that.
Finally, in-kernel printf output is limited to the size of a particular buffer. If this technique is not already hitting that limit, it will soon if you increase the width or depth.
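If you do hit those limits, they can be raised from host code before the first kernel launch. A quick sketch with placeholder sizes (these are standard CUDA runtime calls; the worked example below does not depend on them):
#include <cuda_runtime.h>

int main(){
    // Enlarge the device-runtime pending-launch pool, the device malloc heap,
    // and the in-kernel printf buffer before launching the recursive kernel.
    cudaDeviceSetLimit(cudaLimitDevRuntimePendingLaunchCount, 8192);
    cudaDeviceSetLimit(cudaLimitMallocHeapSize, 64*1048576);
    cudaDeviceSetLimit(cudaLimitPrintfFifoSize, 16*1048576);
    // ... then launch RecursiveCount<<<1,1>>>() as in the worked example below ...
    return 0;
}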
Here is a worked example, attempting to address the various items above. I'm not claiming it is defect free, but I think the output is closer to your expectations. Note that due to character limits on SO answers, I've truncated/excerpted some of the output.
$ cat t1639.cu
#include <stdio.h>
__device__ char route[31] = { "_________________________"};
__device__ char init[7] = { "12345_" };
__global__ void Recursive(int depth, const char *oroute) {
char *nroute = (char *)malloc(31);
char myinit[7];
if (depth == 0) memcpy(myinit, init, 6);
else memcpy(myinit, oroute+(depth-1)*6, 6);
myinit[6] = 0;
if (nroute == NULL) {printf("oops\n"); return;}
memcpy(nroute, oroute, 30);
nroute[30] = 0;
// up to depth 6
if (depth == 5) return;
// newroute = route - idx
int x = depth * 6;
//printf("%s\n", nroute);
int o = 0;
int newlen = 0;
for (int i = 0; i<(6-depth); ++i)
{
if (i != threadIdx.x)
{
nroute[i+x-o] = myinit[i];
newlen++;
}
else
{
o = 1;
}
}
printf("%s\n", nroute);
Recursive<<<1,newlen>>>(depth + 1, nroute);
}
__global__ void RecursiveCount() {
Recursive <<<1,5>>>(0, route);
}
int main(){
RecursiveCount<<<1,1>>>();
cudaDeviceSynchronize();
}
$ nvcc -o t1639 t1639.cu -rdc=true -lcudadevrt -arch=sm_70
$ cuda-memcheck ./t1639
========= CUDA-MEMCHECK
2345_____________________
1345_____________________
1245_____________________
1235_____________________
1234_____________________
2345__345________________
2345__245________________
2345__235________________
2345__234________________
2345__2345_______________
2345__345___45___________
2345__345___35___________
2345__345___34___________
2345__345___345__________
2345__345___45____5______
2345__345___45____4______
2345__345___45____45_____
2345__345___45____5______
2345__345___45____5_____5
2345__345___45____4______
2345__345___45____4_____4
2345__345___45____45____5
2345__345___45____45____4
2345__345___35____5______
2345__345___35____3______
2345__345___35____35_____
2345__345___35____5______
2345__345___35____5_____5
2345__345___35____3______
2345__345___35____3_____3
2345__345___35____35____5
2345__345___35____35____3
2345__345___34____4______
2345__345___34____3______
2345__345___34____34_____
2345__345___34____4______
2345__345___34____4_____4
2345__345___34____3______
2345__345___34____3_____3
2345__345___34____34____4
2345__345___34____34____3
2345__345___345___45_____
2345__345___345___35_____
2345__345___345___34_____
2345__345___345___45____5
2345__345___345___45____4
2345__345___345___35____5
2345__345___345___35____3
2345__345___345___34____4
2345__345___345___34____3
2345__245___45___________
2345__245___25___________
2345__245___24___________
2345__245___245__________
2345__245___45____5______
2345__245___45____4______
2345__245___45____45_____
2345__245___45____5______
2345__245___45____5_____5
2345__245___45____4______
2345__245___45____4_____4
2345__245___45____45____5
2345__245___45____45____4
2345__245___25____5______
2345__245___25____2______
2345__245___25____25_____
2345__245___25____5______
2345__245___25____5_____5
2345__245___25____2______
2345__245___25____2_____2
2345__245___25____25____5
2345__245___25____25____2
2345__245___24____4______
2345__245___24____2______
2345__245___24____24_____
2345__245___24____4______
2345__245___24____4_____4
2345__245___24____2______
2345__245___24____2_____2
2345__245___24____24____4
2345__245___24____24____2
2345__245___245___45_____
2345__245___245___25_____
2345__245___245___24_____
2345__245___245___45____5
2345__245___245___45____4
2345__245___245___25____5
2345__245___245___25____2
2345__245___245___24____4
2345__245___245___24____2
2345__235___35___________
2345__235___25___________
2345__235___23___________
2345__235___235__________
2345__235___35____5______
2345__235___35____3______
2345__235___35____35_____
2345__235___35____5______
2345__235___35____5_____5
2345__235___35____3______
2345__235___35____3_____3
2345__235___35____35____5
2345__235___35____35____3
2345__235___25____5______
2345__235___25____2______
2345__235___25____25_____
2345__235___25____5______
2345__235___25____5_____5
2345__235___25____2______
2345__235___25____2_____2
2345__235___25____25____5
2345__235___25____25____2
2345__235___23____3______
2345__235___23____2______
2345__235___23____23_____
2345__235___23____3______
2345__235___23____3_____3
2345__235___23____2______
2345__235___23____2_____2
2345__235___23____23____3
2345__235___23____23____2
2345__235___235___35_____
2345__235___235___25_____
2345__235___235___23_____
2345__235___235___35____5
2345__235___235___35____3
2345__235___235___25____5
2345__235___235___25____2
2345__235___235___23____3
2345__235___235___23____2
2345__234___34___________
2345__234___24___________
2345__234___23___________
2345__234___234__________
2345__234___34____4______
2345__234___34____3______
2345__234___34____34_____
2345__234___34____4______
2345__234___34____4_____4
2345__234___34____3______
2345__234___34____3_____3
2345__234___34____34____4
2345__234___34____34____3
2345__234___24____4______
2345__234___24____2______
2345__234___24____24_____
2345__234___24____4______
2345__234___24____4_____4
2345__234___24____2______
2345__234___24____2_____2
2345__234___24____24____4
2345__234___24____24____2
2345__234___23____3______
2345__234___23____2______
2345__234___23____23_____
2345__234___23____3______
2345__234___23____3_____3
2345__234___23____2______
2345__234___23____2_____2
2345__234___23____23____3
2345__234___23____23____2
2345__234___234___34_____
2345__234___234___24_____
2345__234___234___23_____
2345__234___234___34____4
2345__234___234___34____3
2345__234___234___24____4
2345__234___234___24____2
2345__234___234___23____3
2345__234___234___23____2
2345__2345__345__________
2345__2345__245__________
2345__2345__235__________
2345__2345__234__________
2345__2345__345___45_____
2345__2345__345___35_____
2345__2345__345___34_____
2345__2345__345___45____5
2345__2345__345___45____4
2345__2345__345___35____5
2345__2345__345___35____3
2345__2345__345___34____4
2345__2345__345___34____3
2345__2345__245___45_____
2345__2345__245___25_____
2345__2345__245___24_____
2345__2345__245___45____5
2345__2345__245___45____4
2345__2345__245___25____5
2345__2345__245___25____2
2345__2345__245___24____4
2345__2345__245___24____2
2345__2345__235___35_____
2345__2345__235___25_____
2345__2345__235___23_____
2345__2345__235___35____5
2345__2345__235___35____3
2345__2345__235___25____5
2345__2345__235___25____2
2345__2345__235___23____3
2345__2345__235___23____2
2345__2345__234___34_____
2345__2345__234___24_____
2345__2345__234___23_____
2345__2345__234___34____4
2345__2345__234___34____3
2345__2345__234___24____4
2345__2345__234___24____2
2345__2345__234___23____3
2345__2345__234___23____2
1345__345________________
1345__145________________
1345__135________________
1345__134________________
1345__1345_______________
1345__345___45___________
1345__345___35___________
1345__345___34___________
1345__345___345__________
1345__345___45____5______
1345__345___45____4______
1345__345___45____45_____
1345__345___45____5______
1345__345___45____5_____5
1345__345___45____4______
1345__345___45____4_____4
1345__345___45____45____5
1345__345___45____45____4
1345__345___35____5______
1345__345___35____3______
1345__345___35____35_____
1345__345___35____5______
1345__345___35____5_____5
1345__345___35____3______
1345__345___35____3_____3
1345__345___35____35____5
1345__345___35____35____3
1345__345___34____4______
1345__345___34____3______
1345__345___34____34_____
1345__345___34____4______
1345__345___34____4_____4
1345__345___34____3______
1345__345___34____3_____3
1345__345___34____34____4
1345__345___34____34____3
1345__345___345___45_____
1345__345___345___35_____
1345__345___345___34_____
1345__345___345___45____5
1345__345___345___45____4
1345__345___345___35____5
1345__345___345___35____3
1345__345___345___34____4
1345__345___345___34____3
1345__145___45___________
1345__145___15___________
1345__145___14___________
1345__145___145__________
1345__145___45____5______
1345__145___45____4______
1345__145___45____45_____
1345__145___45____5______
1345__145___45____5_____5
1345__145___45____4______
1345__145___45____4_____4
1345__145___45____45____5
1345__145___45____45____4
1345__145___15____5______
1345__145___15____1______
1345__145___15____15_____
1345__145___15____5______
1345__145___15____5_____5
1345__145___15____1______
1345__145___15____1_____1
1345__145___15____15____5
1345__145___15____15____1
1345__145___14____4______
1345__145___14____1______
1345__145___14____14_____
1345__145___14____4______
1345__145___14____4_____4
1345__145___14____1______
1345__145___14____1_____1
1345__145___14____14____4
1345__145___14____14____1
1345__145___145___45_____
1345__145___145___15_____
1345__145___145___14_____
1345__145___145___45____5
1345__145___145___45____4
1345__145___145___15____5
1345__145___145___15____1
1345__145___145___14____4
1345__145___145___14____1
1345__135___35___________
1345__135___15___________
1345__135___13___________
1345__135___135__________
1345__135___35____5______
1345__135___35____3______
1345__135___35____35_____
1345__135___35____5______
1345__135___35____5_____5
1345__135___35____3______
1345__135___35____3_____3
1345__135___35____35____5
1345__135___35____35____3
1345__135___15____5______
1345__135___15____1______
1345__135___15____15_____
1345__135___15____5______
1345__135___15____5_____5
1345__135___15____1______
1345__135___15____1_____1
1345__135___15____15____5
1345__135___15____15____1
1345__135___13____3______
1345__135___13____1______
1345__135___13____13_____
1345__135___13____3______
1345__135___13____3_____3
1345__135___13____1______
1345__135___13____1_____1
1345__135___13____13____3
1345__135___13____13____1
1345__135___135___35_____
1345__135___135___15_____
1345__135___135___13_____
1345__135___135___35____5
1345__135___135___35____3
1345__135___135___15____5
1345__135___135___15____1
1345__135___135___13____3
1345__135___135___13____1
1345__134___34___________
1345__134___14___________
1345__134___13___________
1345__134___134__________
1345__134___34____4______
1345__134___34____3______
1345__134___34____34_____
1345__134___34____4______
1345__134___34____4_____4
1345__134___34____3______
1345__134___34____3_____3
1345__134___34____34____4
1345__134___34____34____3
1345__134___14____4______
1345__134___14____1______
1345__134___14____14_____
1345__134___14____4______
1345__134___14____4_____4
1345__134___14____1______
1345__134___14____1_____1
1345__134___14____14____4
1345__134___14____14____1
1345__134___13____3______
1345__134___13____1______
1345__134___13____13_____
1345__134___13____3______
1345__134___13____3_____3
1345__134___13____1______
1345__134___13____1_____1
1345__134___13____13____3
1345__134___13____13____1
1345__134___134___34_____
1345__134___134___14_____
1345__134___134___13_____
1345__134___134___34____4
1345__134___134___34____3
1345__134___134___14____4
1345__134___134___14____1
1345__134___134___13____3
1345__134___134___13____1
1345__1345__345__________
1345__1345__145__________
1345__1345__135__________
1345__1345__134__________
1345__1345__345___45_____
1345__1345__345___35_____
1345__1345__345___34_____
1345__1345__345___45____5
1345__1345__345___45____4
1345__1345__345___35____5
1345__1345__345___35____3
1345__1345__345___34____4
1345__1345__345___34____3
1345__1345__145___45_____
1345__1345__145___15_____
1345__1345__145___14_____
1345__1345__145___45____5
1345__1345__145___45____4
1345__1345__145___15____5
1345__1345__145___15____1
1345__1345__145___14____4
1345__1345__145___14____1
1345__1345__135___35_____
1345__1345__135___15_____
1345__1345__135___13_____
1345__1345__135___35____5
1345__1345__135___35____3
1345__1345__135___15____5
1345__1345__135___15____1
1345__1345__135___13____3
1345__1345__135___13____1
1345__1345__134___34_____
1345__1345__134___14_____
1345__1345__134___13_____
1345__1345__134___34____4
1345__1345__134___34____3
1345__1345__134___14____4
1345__1345__134___14____1
1345__1345__134___13____3
1345__1345__134___13____1
1245__245________________
1245__145________________
1245__125________________
1245__124________________
1245__1245_______________
1245__245___45___________
1245__245___25___________
1245__245___24___________
1245__245___245__________
1245__245___45____5______
1245__245___45____4______
1245__245___45____45_____
1245__245___45____5______
1245__245___45____5_____5
1245__245___45____4______
1245__245___45____4_____4
1245__245___45____45____5
1245__245___45____45____4
1245__245___25____5______
1245__245___25____2______
1245__245___25____25_____
1245__245___25____5______
1245__245___25____5_____5
1245__245___25____2______
1245__245___25____2_____2
1245__245___25____25____5
1245__245___25____25____2
1245__245___24____4______
1245__245___24____2______
1245__245___24____24_____
1245__245___24____4______
1245__245___24____4_____4
1245__245___24____2______
1245__245___24____2_____2
1245__245___24____24____4
1245__245___24____24____2
1245__245___245___45_____
1245__245___245___25_____
1245__245___245___24_____
1245__245___245___45____5
1245__245___245___45____4
1245__245___245___25____5
1245__245___245___25____2
1245__245___245___24____4
1245__245___245___24____2
1245__145___45___________
1245__145___15___________
1245__145___14___________
1245__145___145__________
1245__145___45____5______
1245__145___45____4______
1245__145___45____45_____
1245__145___45____5______
1245__145___45____5_____5
1245__145___45____4______
...
1235__1235__235___25_____
1235__1235__235___23_____
1235__1235__235___35____5
1235__1235__235___35____3
1235__1235__235___25____5
1235__1235__235___25____2
1235__1235__235___23____3
1235__1235__235___23____2
1235__1235__135___35_____
1235__1235__135___15_____
1235__1235__135___13_____
1235__1235__135___35____5
1235__1235__135___35____3
1235__1235__135___15____5
1235__1235__135___15____1
1235__1235__135___13____3
1235__1235__135___13____1
1235__1235__125___25_____
1235__1235__125___15_____
1235__1235__125___12_____
1235__1235__125___25____5
1235__1235__125___25____2
1235__1235__125___15____5
1235__1235__125___15____1
1235__1235__125___12____2
1235__1235__125___12____1
1235__1235__123___23_____
1235__1235__123___13_____
1235__1235__123___12_____
1235__1235__123___23____3
1235__1235__123___23____2
1235__1235__123___13____3
1235__1235__123___13____1
1235__1235__123___12____2
1235__1235__123___12____1
1234__234________________
1234__134________________
1234__124________________
1234__123________________
1234__1234_______________
1234__234___34___________
1234__234___24___________
1234__234___23___________
1234__234___234__________
1234__234___34____4______
1234__234___34____3______
1234__234___34____34_____
1234__234___34____4______
1234__234___34____4_____4
1234__234___34____3______
1234__234___34____3_____3
1234__234___34____34____4
1234__234___34____34____3
1234__234___24____4______
1234__234___24____2______
1234__234___24____24_____
1234__234___24____4______
1234__234___24____4_____4
1234__234___24____2______
1234__234___24____2_____2
1234__234___24____24____4
1234__234___24____24____2
1234__234___23____3______
1234__234___23____2______
1234__234___23____23_____
1234__234___23____3______
1234__234___23____3_____3
1234__234___23____2______
1234__234___23____2_____2
1234__234___23____23____3
1234__234___23____23____2
1234__234___234___34_____
1234__234___234___24_____
1234__234___234___23_____
1234__234___234___34____4
1234__234___234___34____3
1234__234___234___24____4
1234__234___234___24____2
1234__234___234___23____3
1234__234___234___23____2
1234__134___34___________
1234__134___14___________
1234__134___13___________
1234__134___134__________
1234__134___34____4______
1234__134___34____3______
1234__134___34____34_____
1234__134___34____4______
1234__134___34____4_____4
1234__134___34____3______
1234__134___34____3_____3
1234__134___34____34____4
1234__134___34____34____3
1234__134___14____4______
1234__134___14____1______
1234__134___14____14_____
1234__134___14____4______
1234__134___14____4_____4
1234__134___14____1______
1234__134___14____1_____1
1234__134___14____14____4
1234__134___14____14____1
1234__134___13____3______
1234__134___13____1______
1234__134___13____13_____
1234__134___13____3______
1234__134___13____3_____3
1234__134___13____1______
1234__134___13____1_____1
1234__134___13____13____3
1234__134___13____13____1
1234__134___134___34_____
1234__134___134___14_____
1234__134___134___13_____
1234__134___134___34____4
1234__134___134___34____3
1234__134___134___14____4
1234__134___134___14____1
1234__134___134___13____3
1234__134___134___13____1
1234__124___24___________
1234__124___14___________
1234__124___12___________
1234__124___124__________
1234__124___24____4______
1234__124___24____2______
1234__124___24____24_____
1234__124___24____4______
1234__124___24____4_____4
1234__124___24____2______
1234__124___24____2_____2
1234__124___24____24____4
1234__124___24____24____2
1234__124___14____4______
1234__124___14____1______
1234__124___14____14_____
1234__124___14____4______
1234__124___14____4_____4
1234__124___14____1______
1234__124___14____1_____1
1234__124___14____14____4
1234__124___14____14____1
1234__124___12____2______
1234__124___12____1______
1234__124___12____12_____
1234__124___12____2______
1234__124___12____2_____2
1234__124___12____1______
1234__124___12____1_____1
1234__124___12____12____2
1234__124___12____12____1
1234__124___124___24_____
1234__124___124___14_____
1234__124___124___12_____
1234__124___124___24____4
1234__124___124___24____2
1234__124___124___14____4
1234__124___124___14____1
1234__124___124___12____2
1234__124___124___12____1
1234__123___23___________
1234__123___13___________
1234__123___12___________
1234__123___123__________
1234__123___23____3______
1234__123___23____2______
1234__123___23____23_____
1234__123___23____3______
1234__123___23____3_____3
1234__123___23____2______
1234__123___23____2_____2
1234__123___23____23____3
1234__123___23____23____2
1234__123___13____3______
1234__123___13____1______
1234__123___13____13_____
1234__123___13____3______
1234__123___13____3_____3
1234__123___13____1______
1234__123___13____1_____1
1234__123___13____13____3
1234__123___13____13____1
1234__123___12____2______
1234__123___12____1______
1234__123___12____12_____
1234__123___12____2______
1234__123___12____2_____2
1234__123___12____1______
1234__123___12____1_____1
1234__123___12____12____2
1234__123___12____12____1
1234__123___123___23_____
1234__123___123___13_____
1234__123___123___12_____
1234__123___123___23____3
1234__123___123___23____2
1234__123___123___13____3
1234__123___123___13____1
1234__123___123___12____2
1234__123___123___12____1
1234__1234__234__________
1234__1234__134__________
1234__1234__124__________
1234__1234__123__________
1234__1234__234___34_____
1234__1234__234___24_____
1234__1234__234___23_____
1234__1234__234___34____4
1234__1234__234___34____3
1234__1234__234___24____4
1234__1234__234___24____2
1234__1234__234___23____3
1234__1234__234___23____2
1234__1234__134___34_____
1234__1234__134___14_____
1234__1234__134___13_____
1234__1234__134___34____4
1234__1234__134___34____3
1234__1234__134___14____4
1234__1234__134___14____1
1234__1234__134___13____3
1234__1234__134___13____1
1234__1234__124___24_____
1234__1234__124___14_____
1234__1234__124___12_____
1234__1234__124___24____4
1234__1234__124___24____2
1234__1234__124___14____4
1234__1234__124___14____1
1234__1234__124___12____2
1234__1234__124___12____1
1234__1234__123___23_____
1234__1234__123___13_____
1234__1234__123___12_____
1234__1234__123___23____3
1234__1234__123___23____2
1234__1234__123___13____3
1234__1234__123___13____1
1234__1234__123___12____2
1234__1234__123___12____1
========= ERROR SUMMARY: 0 errors
$
The answer given by Robert Crovella is correct; the key mistake (his 5th point) was using init in every recursive call. But I want to clarify something that can be useful for other beginners with CUDA.
I used this variable because when I tried to launch a child kernel passing a local variable, I always got the exception: Error: a pointer to local memory cannot be passed to a launch as an argument.
As I'm a C# developer I'm not used to working with pointers (ref does the low-level work for that), so I thought there was no way to do it in CUDA/C programming.
As Robert shows in his code, it is possible to copy the data with malloc so that the pointer can be passed as an argument to the child kernel.
Here is a kernel simplified as an example of deep recursion.
__device__ char init[6] = { "12345" };
__global__ void Recursive(int depth, const char* route) {
// up to depth 6
if (depth == 5) return;
//declaration for a referable argument (point 6)
char* newroute = (char*)malloc(6);
memcpy(newroute, route, 5);
int o = 0;
int newlen = 0;
for (int i = 0; i < (6 - depth); ++i)
{
if (i != threadIdx.x)
{
newroute[i - o] = route[i];
newlen++;
}
else
{
o = 1;
}
}
printf("%s\n", newroute);
Recursive <<<1, newlen>>>(depth + 1, newroute);
}
__global__ void RecursiveCount() {
Recursive <<<1, 5>>>(0, init);
}
I don't add the main call because I'm using ManagedCUDA for C#, but as Robert shows, the host-side call to RecursiveCount can easily be figured out (a minimal sketch follows).
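For reference, a minimal host-side call in plain CUDA C++ would mirror Robert's (build with -rdc=true for dynamic parallelism):
int main(){
    RecursiveCount<<<1, 1>>>();
    cudaDeviceSynchronize();  // wait for all child kernels and flush the printf output
    return 0;
}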
About ending char arrays with '\0': sorry, but I don't know exactly what the benefit is; this code works fine without them.

Did Vulkan-HPP developers change anything in vk::DebugUtilsMessengerEXT creation?

Recently I updated my system and tried to recompile my Vulkan app (which uses the Vulkan C++ bindings), and now I get almost no output from vk::DebugUtilsMessengerEXT (except the string "Added messenger"). I set it up to std::cout every kind of callback, and before the update it printed lots of information strings. Does anyone know what to do to bring back the debug output?
Here is my debug messenger code:
// ...
vk::DebugUtilsMessengerCreateInfoEXT messengerInfo;
messengerInfo.setMessageSeverity(
vk::DebugUtilsMessageSeverityFlagBitsEXT::eError |
vk::DebugUtilsMessageSeverityFlagBitsEXT::eWarning |
vk::DebugUtilsMessageSeverityFlagBitsEXT::eInfo |
vk::DebugUtilsMessageSeverityFlagBitsEXT::eVerbose);
messengerInfo.setMessageType(
vk::DebugUtilsMessageTypeFlagBitsEXT::eGeneral |
vk::DebugUtilsMessageTypeFlagBitsEXT::eValidation |
vk::DebugUtilsMessageTypeFlagBitsEXT::ePerformance);
messengerInfo.setPfnUserCallback(callback);
messengerInfo.setPUserData(nullptr);
if(instance.createDebugUtilsMessengerEXT(&messengerInfo, nullptr, &debugMessenger, loader) != vk::Result::eSuccess) throw std::runtime_error("Failed to create debug messenger!\n");
}
VKAPI_ATTR VkBool32 VKAPI_CALL System::callback(VkDebugUtilsMessageSeverityFlagBitsEXT messageSeverity, VkDebugUtilsMessageTypeFlagsEXT messageType, const VkDebugUtilsMessengerCallbackDataEXT* pCallbackData, void* pUserData)
{
std::cout << pCallbackData->pMessage << '\n';
return false;
}
"loader" is vk::DispatchLoaderDynamic
Seems like the trouble is not only with Vulkan-Hpp, but also with the plain C Vulkan API.
Did some testing and the callbacks seem to be working correctly. I'm thinking the issue here might be the removal of some old INFO messages for performance reasons -- see Vulkan-ValidationLayers commits 18BF5C637 and 523D9C775. If INFORMATION_BIT messages were enabled in SDK releases prior to 1.1.108, you'd have seen a ton of spew. If expected validation errors are not making it to your callback, please create a GitHub issue in the VVL repo and we'll address it immediately.
How I'm doing it and it works, albeit not the latest SDK:
void VulkanContext::createInstance()
{
// create the list of required extensions
uint32_t glfwExtensionCount = 0;
const char **glfwExtensions;
glfwExtensions = glfwGetRequiredInstanceExtensions(&glfwExtensionCount);
std::vector<const char *> extensions(glfwExtensions, glfwExtensions + glfwExtensionCount);
extensions.push_back(VK_KHR_GET_PHYSICAL_DEVICE_PROPERTIES_2_EXTENSION_NAME);
auto layers = std::vector<const char *>();
if ( enableValidationLayers )
{
extensions.push_back(VK_EXT_DEBUG_REPORT_EXTENSION_NAME);
extensions.push_back(VK_EXT_DEBUG_UTILS_EXTENSION_NAME);
layers.push_back("VK_LAYER_LUNARG_standard_validation");
}
vk::ApplicationInfo appInfo("Vulkan test", 1, "test", 1, VK_API_VERSION_1_1);
auto createInfo = vk::InstanceCreateInfo(
vk::InstanceCreateFlags(),
&appInfo,
static_cast<uint32_t>(layers.size()),
layers.empty() ? nullptr : layers.data(),
static_cast<uint32_t>(extensions.size()),
extensions.empty() ? nullptr : extensions.data()
);
instance = vk::createInstanceUnique(createInfo);
dispatcher = vk::DispatchLoaderDynamic(instance.get(), vkGetInstanceProcAddr);
if ( enableValidationLayers )
{
auto severityFlags = vk::DebugUtilsMessageSeverityFlagBitsEXT::eError
| vk::DebugUtilsMessageSeverityFlagBitsEXT::eWarning
| vk::DebugUtilsMessageSeverityFlagBitsEXT::eVerbose
| vk::DebugUtilsMessageSeverityFlagBitsEXT::eInfo;
auto typeFlags = vk::DebugUtilsMessageTypeFlagBitsEXT::eGeneral
| vk::DebugUtilsMessageTypeFlagBitsEXT::eValidation
| vk::DebugUtilsMessageTypeFlagBitsEXT::ePerformance;
messenger = instance->createDebugUtilsMessengerEXTUnique(
{{}, severityFlags, typeFlags, debugCallback},
nullptr,
dispatcher
);
}
}
Make sure you're enabling the debug extensions and validation layers.
Check that your loader/dispatcher is initialized correctly.
Try some of the other overloads for creating the messenger; I'm not sure, but maybe the API changed and the severity flags end up in the wrong place.
Make sure the validation layers are installed correctly; I don't recall dealing with that myself, but I've seen mentions that it can be an issue (a quick enumeration sketch follows below).
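For the first and last points, a quick way to see what the loader actually reports is to enumerate the instance layers and extensions. A small sketch using the plain C entry points:
#include <vulkan/vulkan.h>
#include <iostream>
#include <vector>

// Print every instance layer and instance extension the Vulkan loader reports,
// so you can confirm the validation layer and the debug extensions are really there.
void printInstanceSupport()
{
    uint32_t count = 0;
    vkEnumerateInstanceLayerProperties(&count, nullptr);
    std::vector<VkLayerProperties> layers(count);
    vkEnumerateInstanceLayerProperties(&count, layers.data());
    for (const auto& l : layers) std::cout << "layer: " << l.layerName << '\n';

    vkEnumerateInstanceExtensionProperties(nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateInstanceExtensionProperties(nullptr, &count, extensions.data());
    for (const auto& e : extensions) std::cout << "extension: " << e.extensionName << '\n';
}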

Vulkan error vkCreateDevice : VK_ERROR_EXTENSION_NOT_PRESENT

I am starting with Vulkan and following Niko Kauppi's tutorial on YouTube.
I get an error when creating a device with vkCreateDevice: it returns VK_ERROR_EXTENSION_NOT_PRESENT.
Here is part of my code:
The call to vkCreateDevice
_gpu_count = 0;
vkEnumeratePhysicalDevices(instance, &_gpu_count, nullptr);
std::vector<VkPhysicalDevice> gpu_list(_gpu_count);
vkEnumeratePhysicalDevices(instance, &_gpu_count, gpu_list.data());
_gpu = gpu_list[0];
vkGetPhysicalDeviceProperties(_gpu, &_gpu_properties);
VkDeviceCreateInfo device_create_info = _CreateDeviceInfo();
vulkanCheckError(vkCreateDevice(_gpu, &device_create_info, nullptr, &_device));
_gpu_count = 1, and _gpu_properties seems to correctly recognize my NVIDIA GPU (whose driver is not up to date).
device_create_info
VkDeviceCreateInfo _createDeviceInfo;
_createDeviceInfo.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
_createDeviceInfo.queueCreateInfoCount = 1;
VkDeviceQueueCreateInfo _queueInfo = _CreateDeviceQueueInfo();
_createDeviceInfo.pQueueCreateInfos = &_queueInfo;
I don't understand the meaning of the error; according to the Khronos docs it means "A requested extension is not supported".
Thanks for your help
VK_ERROR_EXTENSION_NOT_PRESENT is returned when one of the extensions in the [enabledExtensionCount, ppEnabledExtensionNames] array you provided is not supported by the driver (as queried by vkEnumerateDeviceExtensionProperties()).
Extensions can also have dependencies, so VK_ERROR_EXTENSION_NOT_PRESENT is also returned when a dependency of an extension in the list is missing from the list too.
If you want no device extensions, make sure enabledExtensionCount of VkDeviceCreateInfo is 0 (and not e.g. some uninitialized value).
I assume the second snippet is the whole body of _CreateDeviceInfo(), which would confirm the "uninitialized value" suspicion.
Usually though you would want a swapchain extension there to be able to render to screen directly.
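To illustrate the zero-extensions case from the answer above, here is a rough sketch of a fully zero-initialized VkDeviceCreateInfo (the queue-info helper is the one from the question):
VkDeviceQueueCreateInfo _queueInfo = _CreateDeviceQueueInfo(); // from the question

VkDeviceCreateInfo device_create_info = {}; // value-initialization zeroes every member
device_create_info.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
device_create_info.queueCreateInfoCount = 1;
device_create_info.pQueueCreateInfos = &_queueInfo;
// enabledExtensionCount stays 0 and ppEnabledExtensionNames stays nullptr,
// so vkCreateDevice cannot pick up garbage extension requests.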
First of all, make sure your VkDeviceCreateInfo is zero-filled; otherwise it may carry garbage into your vkCreateDevice() call.
Add the following line just after declaring your VkDeviceCreateInfo:
memset ( &_createDeviceInfo, 0, sizeof(VkDeviceCreateInfo) );
Some extensions are absolutely necessary, such as the swapchain one.
To retrieve available extensions do this:
// Names of the available device extensions
const char* const* _ppExtensionNames = NULL;
// get extension names
uint32_t _extensionCount = 0;
vkEnumerateDeviceExtensionProperties( _gpu, NULL, &_extensionCount, NULL);
std::vector<const char *> extNames;
std::vector<VkExtensionProperties> extProps(_extensionCount);
vkEnumerateDeviceExtensionProperties(_gpu, NULL, &_extensionCount, extProps.data());
for (uint32_t i = 0; i < _extensionCount; i++) {
extNames.push_back(extProps[i].extensionName);
}
_ppExtensionNames = extNames.data();
Once you have all the extension names in _ppExtensionNames, pass them to your device_create_info struct:
VkDeviceCreateInfo device_create_info ...
[...]
device_create_info.enabledExtensionCount = _extensionCount;
device_create_info.ppEnabledExtensionNames = _ppExtensionNames;
[...]
vulkanCheckError(vkCreateDevice(_gpu, &device_create_info, nullptr, &_device));
I hope this helps.
Please double-check the above code, as I'm writing it from memory.