Why do the usable camera sizes differ?

Again, I don't really know how to phrase the question, so I'll explain.
I have a video recorder application. I open my camera with
cameraRecorder = Camera.open(1); //(this is the front facing camera)
And get the camera parameters and all supported preview sizes
Camera.Parameters tmpParams = cameraRecorder.getParameters();
List<Camera.Size> tmpList = tmpParams.getSupportedPreviewSizes();
one of the preview sizes on the Galaxy Tab 10.1 running ICS (4.0.4) is 800x600
but when I try to set the video size on my MediaRecorder
mediaRecorder.setVideoSize(800,600);
I get this error:
12-19 17:27:55.035: E/CameraSource(110): Video dimension (800x600) is unsupported
12-19 17:27:55.035: E/StagefrightRecorder(110): cameraSource do not init
12-19 17:27:55.035: E/StagefrightRecorder(110): setupCameraSource failed. (-19)
12-19 17:27:55.035: E/StagefrightRecorder(110): setupMediaSource is failed. (-19)
12-19 17:27:55.035: E/StagefrightRecorder(110): setupMPEG4Recording is failed. (-19)
12-19 17:27:55.035: E/MediaRecorder(30119): start failed: -19
Does anyone know why this discrepancy exists? (I know one of the supported record sizes is 1280x720, but that is too big for me.)

That's because preview sizes and recording sizes are different things. You should use the sizes reported by Camera.Parameters.getSupportedVideoSizes(). Note that this method returns null when the camera does not support video sizes separate from the preview sizes; in that case the preview sizes can be used for recording as well.
public List<Camera.Size> getSupportedVideoSizes ()
Gets the supported video frame sizes that can be used by MediaRecorder.
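As a sketch of how you might pick a recording size from that list (plain Java, with {width, height} int pairs standing in for Camera.Size; the helper name and the sample sizes are made up for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class VideoSizePicker {
    /**
     * Picks the largest size from the supported list that does not exceed
     * the given target in either dimension. Sizes are {width, height} pairs
     * standing in for Camera.Size. Returns null if nothing fits.
     */
    static int[] pickSize(List<int[]> supported, int maxWidth, int maxHeight) {
        int[] best = null;
        for (int[] s : supported) {
            if (s[0] <= maxWidth && s[1] <= maxHeight) {
                if (best == null || (long) s[0] * s[1] > (long) best[0] * best[1]) {
                    best = s; // keep the largest size that still fits
                }
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Hypothetical supported video sizes, as getSupportedVideoSizes()
        // might report them on some device.
        List<int[]> supported = Arrays.asList(
                new int[]{176, 144}, new int[]{640, 480}, new int[]{1280, 720});
        int[] chosen = pickSize(supported, 800, 600);
        System.out.println(chosen[0] + "x" + chosen[1]); // 640x480
    }
}
```

On a real device you would iterate the Camera.Size objects the same way, falling back to getSupportedPreviewSizes() when getSupportedVideoSizes() returns null.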

Related

What is correct "Path" usage for startRecordingVideo in distriqt camera ANE?

I'm trying to record a short video using the camera ANE from Distriqt and one of the params for startRecordingVideo() is "path".
I can't figure out what a valid path would be when saving on iOS (and probably Android, when I get to it).
Unfortunately the documentation is somewhat lacking...
http://docs.airnativeextensions.com/camera/docs/com/distriqt/extension/camera/Camera.html#startRecordingVideo%28%29
An invalid path stops the device from even starting to record.
OK, that was just me having a stupid five minutes...
var file:File = File.applicationStorageDirectory.resolvePath("sample.mp4");
Camera.instance.startRecordingVideo(file.nativePath);

MediaEncodingProfile.CreateWmv gives "No suitable transform was found to encode or decode the content." error

I am creating a Windows Phone app (XAML/C#) that uploads audio and video to a server. Using VideoCaptureDevice on Windows Phone 8.0 works fine, but it only allows resolutions supported by the device (on a Nokia 625 the smallest is 640 x 480). To get the size down I have upgraded the app to Windows Phone 8.1 Silverlight (Developer Preview) to use the Windows.Media.Capture.MediaCapture libraries. This works and the generic Qvga format:
MediaEncodingProfile profile = MediaEncodingProfile.CreateMp4(Windows.Media.MediaProperties.VideoEncodingQuality.Qvga);
Works on both the Nokia 625 and the 520 and gets the resolution down to 320x240, but the file size is still ~24 MB for 4 minutes of video. If I set a custom resolution like this:
MediaEncodingProfile profile = MediaEncodingProfile.CreateMp4(Windows.Media.MediaProperties.VideoEncodingQuality.Auto);
profile.Video.Width = 480;
profile.Video.Height = 320;
I get a much smaller file size (4 min == ~6MB, which is odd) but it is corrupted on the 625.
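(For what it's worth, both file sizes are consistent with plain bitrate arithmetic, size ≈ bitrate × seconds / 8. Sketched in Java just for the arithmetic; the bitrates below are inferred from the observed sizes, not read from the profile.)

```java
public class BitrateMath {
    // Approximate recording size in megabytes for a given average bitrate
    // (bits per second) and duration (seconds): size = bitrate * seconds / 8.
    static double sizeMB(long bitsPerSecond, int seconds) {
        return bitsPerSecond * (double) seconds / 8.0 / 1_000_000.0;
    }

    public static void main(String[] args) {
        int fourMinutes = 240;
        // ~24 MB over 4 minutes implies an average of roughly 800 kbps...
        System.out.printf("800 kbps for 4 min: %.1f MB%n", sizeMB(800_000, fourMinutes));
        // ...while ~6 MB implies roughly 200 kbps.
        System.out.printf("200 kbps for 4 min: %.1f MB%n", sizeMB(200_000, fourMinutes));
    }
}
```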
I would like to try it out with other file types, eg .wmv, but:
MediaEncodingProfile profile = MediaEncodingProfile.CreateWmv(Windows.Media.MediaProperties.VideoEncodingQuality.Auto);
throws a System.Exception: "No suitable transform was found to encode or decode the content."
I'm also going to need to do this for audio, i.e.:
MediaEncodingProfile profile = MediaEncodingProfile.CreateMp3(Windows.Media.MediaProperties.AudioEncodingQuality.Auto);
But I get the same error.
I suppose I'm asking lots of questions here, but I'm really asking:
What's the best way to reduce video size in Windows Phone 8.1 Silverlight?
Can anyone help me apply a suitable transform for .wmv and/or .mp3 recording?
Does anyone know why setting the video resolution manually causes instability?
I have also tried setting the audio properties manually to see if that will get the size down:
MediaEncodingProfile profile = MediaEncodingProfile.CreateMp4(Windows.Media.MediaProperties.VideoEncodingQuality.Qvga);
profile.Audio.Subtype = "PCM";
profile.Audio.ChannelCount = 2;
profile.Audio.BitsPerSample = 8;
profile.Audio.SampleRate = 22050;
But this also leads to a corrupted file.
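For reference, those PCM settings imply an uncompressed audio data rate of sampleRate × channels × bytesPerSample (sketched in Java just for the arithmetic; whether the MP4 muxer accepts a raw PCM subtype at all is a separate question):

```java
public class PcmRate {
    // Raw PCM data rate in bytes per second:
    // sampleRate * channels * bytesPerSample.
    static int bytesPerSecond(int sampleRate, int channels, int bitsPerSample) {
        return sampleRate * channels * (bitsPerSample / 8);
    }

    public static void main(String[] args) {
        // The settings from the question: 22050 Hz, 2 channels, 8 bits/sample.
        int rate = bytesPerSecond(22050, 2, 8);
        System.out.println(rate + " bytes/s"); // 44100 bytes/s, ~352.8 kbps
    }
}
```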
Any help greatly appreciated - have hunted around but found very little on the subject...
Maybe it is related to this known Windows 10 IoT issue:
https://ms-iot.github.io/content/en-US/win10/ReleaseNotesRTM.htm
Release Notes for Windows 10 IoT Core Build Number 10586. December 2015
Known Issues
A MediaEncodingProfile.CreateWma( Windows.Media.MediaProperties. AudioEncodingQuality.Auto) method call may fail on the Raspberry Pi 2 with the error message No suitable transform was found to encode or decode the content. (Exception from HRESULT: 0xC00D5212). (4510128) WORKAROUND: None.

Load a touch-responsive 3D model

I'm programming a museum app and I'd like to display a 3D model that responds to the user's touches, like pinch-to-zoom or orbiting around the model. I've searched a lot, but all I found is game engines that seem very complicated for this. Is there any way to import models (in whatever format), display them, and make them touch-responsive? If the code (or the engine) is open source, even better; I'd prefer a free option over a paid one. Many thanks!
Update: right now I'm able to load the 3D model using Cocos3D, but as I've said in an answer, the model I can load is very low-poly. It's an app for a museum, and it would have to be a much more detailed model. I'm using the standard Cocos3D template project that shows the animated "hello world"; I just changed the .pod file to load the one I want and started adding a few modifications to support touch interaction. I have to reduce the original polygon count by about 80% to load it (screenshot of a small part of the model omitted). If I try to load the model with only a 50% reduction (which looks great in the screenshots), the app crashes and gives me this crash log:
** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'OpenGL ES 1.1 supports only GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE types for vertex indices'
* First throw call stack:
(0x22cc012 0x1ca9e7e 0x22cbe78 0x173ff35 0x1b550f 0x186751 0x180a81 0x17b750 0x11de32 0x1270d4 0x1263ac 0x14f1a2 0x13ca01 0x14ee02 0x14d45e 0x14d3c2 0x14bb22 0x14a452 0x14efcc 0x14d493 0x14d3c2 0x1643e3 0x162a41 0x10c197 0x10c11d 0x10c098 0x3d79c 0x3d76f 0x85282 0x16e9884 0x16e9737 0x8b56f 0xc4192d 0x1cbd6b0 0x505fc0 0x4fa33c 0x4fa150 0x4780bc 0x479227 0x51bb50 0xbef9ff 0xbf04e1 0xc01315 0xc0224b 0xbf3cf8 0x2fd4df9 0x2fd4ad0 0x2241bf5 0x2241962 0x2272bb6 0x2271f44 0x2271e1b 0xbef7da 0xbf165c 0x1ca506 0x2a55)
libc++abi.dylib: terminate called throwing an exception
(lldb)
It can't load all the polygons and crashes. Is there any solution for that, or must I start looking for another way to load the model? If you want more information, just ask. Thanks.
I used Cocos3D to import an Earth model and rotate it according to the gestures made by the user. You can give it a look; it's not a complex thing to do.
Have a look at this post for some sample code about loading the model. For handling rotation, I found this post very useful.
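On the crash itself: OpenGL ES 1.1 indices must fit in GL_UNSIGNED_SHORT (or GL_UNSIGNED_BYTE), so a single indexed mesh can address at most 65536 vertices; a more detailed model has to be split into submeshes, e.g. when exporting the .pod. A quick estimate of how many parts are needed (generic arithmetic, not Cocos3D API):

```java
public class MeshSplitEstimate {
    // GL_UNSIGNED_SHORT indices can address vertices 0..65535.
    static final int MAX_VERTICES_PER_MESH = 65536;

    // Minimum number of submeshes needed so that each one stays within
    // the 16-bit index limit (ceiling division).
    static int submeshesNeeded(int vertexCount) {
        return (vertexCount + MAX_VERTICES_PER_MESH - 1) / MAX_VERTICES_PER_MESH;
    }

    public static void main(String[] args) {
        System.out.println(submeshesNeeded(50_000));  // 1 -> fits as one mesh
        System.out.println(submeshesNeeded(200_000)); // 4 -> must be split
    }
}
```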

Setting sound output/input

I've been looking all over the web, but I don't know if it is possible: can a Cocoa Mac OS X app change the sound input/output device? If so, how come?
can a Cocoa Mac OS X app change the sound input/output device?
Yes, by setting the relevant Audio System Object property.
If so, how come?
Probably because the user might want to change the default input or output device from within an application, rather than having to jump over to the Sound prefpane before and after or use the Sound menu extra.
I know this is an old post but I've been struggling these days trying to find a way of changing the sound input/output device using code and I finally found how to do it. In case someone else runs into the same problem, here's the answer!
There's a command line utility called SwitchAudio-OSX (https://code.google.com/p/switchaudio-osx/) that allows you to switch the audio source from the terminal. It is open-source and you can find the latest version here: https://github.com/deweller/switchaudio-osx.
Anyway, you can use these lines to change the sound input/output device:
UInt32 propertySize = sizeof(AudioDeviceID);
// Note: AudioHardwareSetProperty is deprecated in newer macOS SDKs in favor of
// AudioObjectSetPropertyData, but it still works for switching default devices.
AudioHardwareSetProperty(kAudioHardwarePropertyDefaultInputDevice, propertySize, &newDeviceID); // change the default input device
AudioHardwareSetProperty(kAudioHardwarePropertyDefaultOutputDevice, propertySize, &newDeviceID); // change the default output device
AudioHardwareSetProperty(kAudioHardwarePropertyDefaultSystemOutputDevice, propertySize, &newDeviceID); // change the system output device (alerts, sound effects)
Where newDeviceID is an AudioDeviceID holding the id of the device you want to select. A list of all available devices can be obtained like this:
AudioDeviceID dev_array[64];
UInt32 propertySize = sizeof(dev_array); // tell the call how much buffer space is available
AudioHardwareGetProperty(kAudioHardwarePropertyDevices, &propertySize, dev_array);
// On return, propertySize holds the number of bytes actually written.
int numberOfDevices = (propertySize / sizeof(AudioDeviceID));

Error messages when using OpenGL ES template

I made a new OpenGL ES application, and without modifying anything, I ran the program. It runs, but I see these error messages:
Detected an attempt to call a symbol in system libraries that is not present on the iPhone:
open$UNIX2003 called from function _ZN4llvm12MemoryBuffer7getFileEPKcPSsx in image libLLVMContainer.dylib.
Detected an attempt to call a symbol in system libraries that is not present on the iPhone:
fstat$INODE64 called from function _ZN4llvm12MemoryBuffer7getFileEPKcPSsx in image libLLVMContainer.dylib.
Detected an attempt to call a symbol in system libraries that is not present on the iPhone:
mmap$UNIX2003 called from function _ZN4llvm3sys4Path14MapInFilePagesEiy in image libLLVMContainer.dylib.
Detected an attempt to call a symbol in system libraries that is not present on the iPhone:
close$UNIX2003 called from function _ZN4llvm12MemoryBuffer7getFileEPKcPSsx in image libLLVMContainer.dylib.
Detected an attempt to call a symbol in system libraries that is not present on the iPhone:
pthread_mutexattr_destroy$UNIX2003 called from function _ZN4llvm3sys5MutexC2Eb in image libLLVMContainer.dylib.
The program displays a colored box that moves up and down. Is this what it's supposed to do? What do these error messages mean?
I've been noticing the same thing. I'm running iOS SDK 4.1, but it only happens in the simulator.
From what I found on the Apple forums, it seems to be a simulator bug. An Apple representative claimed that it was a bug on "their" end.
Here's the quote: "This is a bug on our end, but as long as things are otherwise working for you, the logging can safely be ignored."