Sample cairo applications to test cairo-gles backend in 1.12.14 - opengl-es-2.0

I was able to successfully port, cross-compile, and run the cairo gears
application with the GLES backend on my embedded target.
http://people.linaro.org/~afrantzis/cairogears-0~git20100719.2b01100+gles2.tar.gz
The ported samples trap, comp, text, and shadow run well on cairo 1.12.3
and 1.12.4.
But I have problems running the same samples on 1.12.14.
I could not run the texture-related samples (comp, text, shadow) at all,
and while trap plays well, the gradient is not displayed in the gradient sample.
I use the GLES backend and convert all image surfaces I load from PNG
files to GL surfaces.
Let me know if there is something that should be done for the
texture and gradient samples to work in 1.12.14.
thanks
Sundara raghavan

The problem was caused by the need to convert GL_BGRA, the internal image format of cairo, to GL_RGBA for loading into GL textures (which were GL_RGBA by default). I solved it by applying an existing patch that uses BGRA GL textures and hence avoids the conversion. This was possible because my hardware is capable of both reading and creating BGRA textures.
The patch can be found here:
http://lists.freedesktop.org/archives/cairo/2013-February/024038.html
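For context, the conversion that the patch avoids is a per-pixel channel swap: cairo's ARGB32 data is laid out as BGRA in memory on little-endian machines, while a default GL_RGBA texture expects red and blue in the opposite positions. A minimal sketch of that CPU-side fallback (a hypothetical helper, not cairo's actual code):

```python
def bgra_to_rgba(data: bytes) -> bytes:
    """Swap the B and R channels of a tightly packed BGRA pixel buffer.

    This is the CPU-side conversion a loader needs when the GL stack
    cannot sample BGRA textures directly; G and A stay in place.
    """
    if len(data) % 4 != 0:
        raise ValueError("pixel buffer length must be a multiple of 4")
    out = bytearray(data)
    out[0::4] = data[2::4]  # R takes the old B position
    out[2::4] = data[0::4]  # B takes the old R position
    return bytes(out)
```

On hardware whose GL driver exposes BGRA textures (e.g. via an EXT_texture_format_BGRA8888-style extension), the buffer can be uploaded as-is and this pass disappears, which is what the linked patch relies on.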

Related

Convert Vulkan nvpro app to vulkan headless

I am trying to convert vk_raytrace to headless so that I can run it from the command line and dump the rendered image. I am new to Vulkan and saw that it supports headless surfaces. My first approach was to replace the surface created from the GLFW window with a headless surface. However, I get VK_ERROR_EXTENSION_NOT_PRESENT for VK_EXT_headless_surface. Next I tried removing the surface- and swapchain-related logic and creating framebuffers with an image view as the attachment. I haven't had any luck with that either.
Any pointers on this would be very helpful.

React native video as GL texture

It seems it’s not possible to use a video as a texture with expo-gl (texImage2D is not able to take any video params, cf. context definition and API documentation). This feature is currently requested on the canny.
I'm looking to convert an expo-av video to an ArrayBuffer with pixel data or any way to pass a video as a texture to a shader.
I did some research but haven't found a solution so far:
gl-react: video processing: https://github.com/gre/gl-react/issues/215
react-native-gpuimage (react-native-video is not working inside GL.Node): https://github.com/CubeSugar/react-native-GPUImage/issues/1
Actually, this could be interesting: https://github.com/shahen94/react-native-video-processing. This library's purpose is to edit, trim, and compress videos, but there's currently no way to pass custom shaders.
A lot of issues have been opened on GitHub (even on the expo org's expo-three package), but there's no answer yet. I'm looking for resources or any advice to accomplish this. For the moment, the best solution I can see is to use a sprite sheet.

Using iOS, Tiled, Box2D and XCode. Problems importing my own .tmx files

So I'm getting the error "TMX: unsupported compression method" using this tutorial's code and any of my own created TMX tiled map editor files.
I was just trying to figure out how to use tiled to make interesting maps, since I am creating a side scrolling game I was using this as a reference.
Any help would be appreciated :)
Tiled has multiple compression modes for saving, under File -> Preferences -> Store tile layer data as. For iOS and Box2D, this should be set to Base64 (gzip compressed).
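For reference, a loader that supports this setting decodes the layer's text as base64, gunzips it, and reads the tile GIDs as little-endian 32-bit integers, per the TMX map format. A minimal sketch of that decoding step:

```python
import base64
import gzip
import struct


def decode_tmx_layer(layer_text: str) -> list[int]:
    """Decode a TMX <data encoding="base64" compression="gzip"> payload
    into a flat list of global tile IDs (one little-endian uint32 each,
    row by row; 0 means "no tile")."""
    raw = gzip.decompress(base64.b64decode(layer_text.strip()))
    count = len(raw) // 4
    return list(struct.unpack("<%dI" % count, raw))
```

If your own loader reports "unsupported compression method", it most likely only implements this gzip path (or only zlib), so the map must be saved with the matching mode.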

Webp very low quality when converting transparent PNG image files

I want to use WebP to make my images much smaller, at least for serving them to Chrome browsers. I've downloaded many conversion tools, including the official one. I tried to convert a PNG with alpha transparency and got awful results. See image below..
According to what I see on many websites, it should handle transparency incredibly well. I wanted to know why I get these kinds of results, and what I need to do or which tool I need to produce high-quality transparent WebP images that can replace my PNG ones.
Second, I wanted to ask about compatibility: should I serve these images only to Chrome users? Of course, the most important issue is the image quality output.
Thanks
You are probably using an older version of the WebP library/binary, which didn't have alpha support, so the images you see have the alpha channel stripped.
Try again with the latest release v0.2.0:
http://code.google.com/p/webp/downloads/list

3D animation programmatically rendered in Blender

I have a project in which I would like to programmatically create and render a 3D animation based upon input. I originally asked here on Stack Overflow whether Blender was right for the job, and the response was yes, but upon looking at the API, it says this:
Python was embedded in Blender, so to access BPython modules you need to run scripts from the program itself: you can't import the Blender module into an external Python interpreter.
I want to be able to create and render this scene without having to ever open another program like Blender. Is this possible, and is Blender still the right choice?
Thanks in advance!
At work, a colleague and I worked on a project that rendered 3D scenes altered externally. We used Python to modify/create scenes, and did the rendering on a server through the command-line interface (no GUI).
You can pass a Python script as an argument to Blender on the command line to generate your scene objects and do the rendering.
I don't see how you can render in Blender without using Blender.
You can use Blender if you want, but obviously it is not your only option.
If you need to
create and render a 3d animation based upon input.
you can go as simple or as complex as you'd like.
You can use OpenGL in your language of choice (C++, Java, Python, etc.)
and display the animation (with or without fancy rendering).
It's up to what 'render' means in your context.
If you need nice shading (light, soft shadows, reflections, etc. - basically, ray tracers), you can still show an interactive preview to your users and generate the scene
for a 3rd-party renderer (like Yafaray, Sunflow, LuxRender, etc. - I've put together a short list of free renderers), and show the progress to the users after they've chosen the external render option.
On a similar note, have a look at joons.
HTH
Cart by Suomi - Yafaray Gallery image
Julia quaternion fractal - Sunflow Gallery image
Klein Bottle - LuxRender Gallery image