php-ffmpeg can't open stats file; the command will run in Terminal if I remove the temp file name

I have a simple app that is trying to:
1) rotate videos that have a rotation value
2) re-encode HEVC videos so they play in Chrome (until Chrome supports HEVC)
3) stitch several of the results together.
I'm currently developing locally, and in testing I scan the current directory for some example MP4s I have in there. It chokes on the very first one. Here is the code:
<?php
$files1 = scandir('./');
foreach ($files1 as $file) {
    $fileinfo = new SplFileInfo($file);
    $extn = $fileinfo->getExtension();
    if ($extn == 'mp4') {
        fixVideo($file);
    }
}

// FUNCTIONS
function fixVideo($file) {
    $config = array(
        'ffmpeg.binaries'  => '/usr/local/bin/ffmpeg',
        'ffprobe.binaries' => '/usr/local/bin/ffprobe',
        'timeout'          => 3600, // The timeout for the underlying process
        'ffmpeg.threads'   => 12,   // The number of threads that FFMpeg should use
    );
    $resultFile = 'FIXED_' . $file;

    // create the ffmpeg object
    $ffmpeg = FFMpeg\FFMpeg::create($config, null);

    // open video file
    $video = $ffmpeg->open($file);

    // get the first video stream
    $videostream = $ffmpeg->getFFProbe()
        ->streams($file)
        ->videos()
        ->first();
    //echo "<pre>"; print_r($videostream);

    if (!$videostream instanceof FFMpeg\FFProbe\DataMapping\Stream) {
        throw new \Exception('No stream given');
    } else {
        echo "<video src='$file' width='100'></video><br>";
        echo "$file<br>";
        if ($videostream->has('tags')) {
            //echo "has tags<BR>";
            // MUST WE ROTATE?
            $tags = $videostream->get('tags');
            if (isset($tags['rotate'])) {
                echo "has rotate" . $tags['rotate'] . "<BR>";
                if ($tags['rotate'] != 0) {
                    echo "rotate not 0<BR>";
                    switch ($tags['rotate']) {
                        case 270:
                            $angle = FFMpeg\Filters\Video\RotateFilter::ROTATE_270;
                            break;
                        case 180:
                            $angle = FFMpeg\Filters\Video\RotateFilter::ROTATE_180;
                            break;
                        case 90:
                            $angle = FFMpeg\Filters\Video\RotateFilter::ROTATE_90;
                            break;
                    }
                    $video->filters()->rotate($angle);
                    echo "rotating<br>";
                } // if ($tags['rotate'] != 0)

                // MUST WE REENCODE TO H.264?
                if (isset($tags['encoder'])) {
                    echo "encoding: " . $tags['encoder'];
                }
                $format = new FFMpeg\Format\Video\X264();
                $format->setAudioCodec("aac");
                $video->save($format, $resultFile);
            } // if (isset($tags['rotate']))
        } // if ($videostream->has('tags'))
        echo "<BR><BR>";
    } // if $videostream instanceof
} // function
?>
When I run that in the browser, I get:
appL.mp4
has rotate0
encoding: HEVC
Fatal error: Uncaught Alchemy\BinaryDriver\Exception\ExecutionFailureException: ffmpeg failed to execute command '/usr/local/bin/ffmpeg' '-y' '-i' 'appL.mp4' '-threads' '12' '-vcodec' 'libx264' '-acodec' 'libmp3lame' '-b:v' '1000k' '-refs' '6' '-coder' '1' '-sc_threshold' '40' '-flags' '+loop' '-me_range' '16' '-subq' '7' '-i_qfactor' '0.71' '-qcomp' '0.6' '-qdiff' '4' '-trellis' '1' '-b:a' '128k' '-pass' '1' '-passlogfile' '/var/tmp/ffmpeg-passes5ba267bb96d0abvdgk/pass-5ba267bb96e01' 'FIXED_appL.mp4' in /Users/[redacted]/test3/vendor/alchemy/binary-driver/src/Alchemy/BinaryDriver/ProcessRunner.php:100 Stack trace: #0 /Users/[redacted]/test3/vendor/alchemy/binary-driver/src/Alchemy/BinaryDriver/ProcessRunner.php(72): Alchemy\BinaryDriver\ProcessRunner->doExecutionFailure(''/usr/local/bin...') #1 /Users/[redacted]/test3/vendor/alchemy/binary-driver/src/Alchemy/BinaryDriver/AbstractBinary.php(209): Alchemy\Bi in /Users/[redacted]/test3/vendor/php-ffmpeg/php-ffmpeg/src/FFMpeg/Media/AbstractVideo.php on line 106
When I run it in the Terminal, I get:
MacBook-Pro:videos [redacted]$ '/usr/local/bin/ffmpeg' '-y' '-i' 'appL.mp4' '-threads' '12' '-vcodec' 'libx264' '-acodec' 'libmp3lame' '-b:v' '1000k' '-refs' '6' '-coder' '1' '-sc_threshold' '40' '-flags' '+loop' '-me_range' '16' '-subq' '7' '-i_qfactor' '0.71' '-qcomp' '0.6' '-qdiff' '4' '-trellis' '1' '-b:a' '128k' '-pass' '1' '-passlogfile' '/var/tmp/ffmpeg-passes5ba267bb96d0abvdgk/pass-5ba267bb96e01' 'FIXED_appL.mp4'
ffmpeg version 4.0.2 Copyright (c) 2000-2018 the FFmpeg developers
built with Apple LLVM version 9.1.0 (clang-902.0.39.2)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.0.2 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags=-I/System/Library/Frameworks/JavaVM.framework/Versions/Current/Headers/ --host-ldflags= --enable-gpl --enable-chromaprint --enable-ffplay --enable-frei0r --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfdk-aac --enable-libfontconfig --enable-libfreetype --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopus --enable-librsvg --enable-librtmp --enable-librubberband --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtesseract --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libzimg --enable-libzmq --enable-opencl --enable-videotoolbox --enable-openssl --enable-libsrt --enable-lzma --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags=-I/usr/local/Cellar/openjpeg/2.3.0/include/openjpeg-2.3 --enable-nonfree
libavutil 56. 14.100 / 56. 14.100
libavcodec 58. 18.100 / 58. 18.100
libavformat 58. 12.100 / 58. 12.100
libavdevice 58. 3.100 / 58. 3.100
libavfilter 7. 16.100 / 7. 16.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 1.100 / 5. 1.100
libswresample 3. 1.100 / 3. 1.100
libpostproc 55. 1.100 / 55. 1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'appL.mp4':
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2018-09-17T16:01:54.000000Z
Duration: 00:00:03.30, start: 0.000000, bitrate: 4574 kb/s
Stream #0:0(und): Video: hevc (Main) (hvc1 / 0x31637668), yuv420p(tv, bt709), 960x540, 4459 kb/s, 29.97 fps, 29.97 tbr, 600 tbn, 600 tbc (default)
Metadata:
rotate : 0
creation_time : 2018-09-17T16:01:54.000000Z
handler_name : Core Media Data Handler
encoder : HEVC
Side data:
displaymatrix: rotation of -0.00 degrees
Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 98 kb/s (default)
Metadata:
creation_time : 2018-09-17T16:01:54.000000Z
handler_name : Core Media Data Handler
Stream mapping:
Stream #0:0 -> #0:0 (hevc (native) -> h264 (libx264))
Stream #0:1 -> #0:1 (aac (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
[libx264 @ 0x7f89e0018600] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x7f89e0018600] ratecontrol_init: can't open stats file
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
[libmp3lame @ 0x7f89e0019e00] 4 frames left in the queue on closing
Conversion failed!

Try this: change
$format = new FFMpeg\Format\Video\X264();
to:
$format = new FFMpeg\Format\Video\X264('libmp3lame');
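If that works for your build, the change goes inside fixVideo() in place of the two format lines from the question; a minimal sketch (the setKiloBitrate() call is optional and only mirrors the 1000k bitrate visible in the failing command):
// inside fixVideo(), replacing the format setup shown above
$format = new FFMpeg\Format\Video\X264('libmp3lame'); // audio codec passed via the constructor
$format->setKiloBitrate(1000);                        // optional; matches '-b:v 1000k' from the failing command
$video->save($format, $resultFile);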

Detect if MediaStreamTrack is black/blank

I'm creating a video chat with peerjs.
I'm toggling the camera (on/off) with the following function:
function toggleCamera() {
    localStream.getVideoTracks()[0].enabled = !localStream.getVideoTracks()[0].enabled;
}
After calling this function, the video goes black and the receiver gets just a black screen (which works as intended).
Now I want to detect the black/blank screen so I can show the user a message or an icon that the camera is disabled and there is no stream.
How do I detect that?
The common approach is to send a signaling message (either via the normal path or a datachannel). Polling getStats to detect the black frames is a valid approach but more expensive in terms of computation.
After some time I've managed to get a solution:
var previousBytes = 0;
var previousTS = 0;
var currentBytes = 0;
var currentTS = 0;

// peer - new Peer()
// stream - local camera stream (received from navigator.mediaDevices.getUserMedia(constraints))
let connection = peer.call(peerID, stream);

// peerConnection - reference to RTCPeerConnection (https://peerjs.com/docs.html#dataconnection-peerconnection)
connection.peerConnection.getStats(null).then(stats => {
    stats.forEach(report => {
        if (report.type === "inbound-rtp") {
            currentBytes = report.bytesReceived;
            currentTS = report.timestamp;
            if (previousBytes == 0) {
                previousBytes = currentBytes;
                previousTS = currentTS;
                return;
            }
            console.log({ previousBytes });
            console.log({ currentBytes });
            var deltaBytes = currentBytes - previousBytes;
            var deltaTS = currentTS - previousTS;
            console.log("Delta: " + (deltaBytes / deltaTS) + " kB/s");
            previousBytes = currentBytes;
            previousTS = currentTS;
        }
    });
});
This code is actually in a function which gets called every second. When the camera is turned on and not covered, deltaBytes is between 100 and 250 kB/s; when the camera is turned off (programmatically) or covered (with a napkin or something), so the camera stream is black/blank, deltaBytes is around 1.5-3 kB/s. After you turn the camera back on, there is a spike in deltaBytes, which reaches around 500 kB/s.
Here is a short console log:
124.52747252747253 kB/s
202.213 kB/s
194.64764764764766 kB/s
15.313 kB/s (this is where camera is covered)
11.823823823823824 kB/s
11.862137862137862 kB/s
2.164 kB/s
2.005 kB/s
2.078078078078078 kB/s
1.99 kB/s
2.059 kB/s
1.992992992992993 kB/s
159.89810189810188 kB/s (uncovered camera)
502.669 kB/s
314.7927927927928 kB/s
255.0909090909091 kB/s
220.042 kB/s
213.46353646353646 kB/s
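To make the periodic polling concrete, here is a minimal sketch of the loop described above (assumptions: connection is the peerjs MediaConnection from the snippet above, and the 10 kB/s cut-off is only illustrative, sitting between the ~2 kB/s "blank" rate and the ~100+ kB/s normal rate):
let prevBytes = 0;
let prevTS = 0;

setInterval(() => {
  connection.peerConnection.getStats(null).then(stats => {
    stats.forEach(report => {
      if (report.type !== "inbound-rtp") return;
      if (prevBytes !== 0) {
        // bytes per millisecond is roughly kB per second, matching the log above
        const kBps = (report.bytesReceived - prevBytes) / (report.timestamp - prevTS);
        const looksBlank = kBps < 10; // tune this threshold against your own measurements
        console.log(kBps.toFixed(1) + " kB/s" + (looksBlank ? " (blank?)" : ""));
      }
      prevBytes = report.bytesReceived;
      prevTS = report.timestamp;
    });
  });
}, 1000);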
EDIT:
So in the end I did as @Philipp Hancke said. I created a master connection which is open from when the page loads until the user closes it. Over this connection I send commands for initiating a video call, canceling a video session, turning the camera on/off, and so on. On the other side I parse these commands and execute the corresponding functions.
function sendMutedMicCommand() { masterConnection.send(`${commands.MutedMic}`); }
function sendUnmutedMicCommand() { masterConnection.send(`${commands.UnmutedMic}`); }
function sendPromptVideoCallCommand() { masterConnection.send(`${commands.PromptVideoCall}`); }
function sendAcceptVideoCallCommand() { masterConnection.send(`${commands.AcceptVideoCall}`); }
function sendDeclineVideoCallCommand() { masterConnection.send(`${commands.DeclineVideoCall}`); }
Function which handles data:
function handleData(data) {
    let actionType = data;
    switch (actionType) {
        case commands.MutedMic: ShowMuteIconOnReceivingVideo(true); break;
        case commands.UnmutedMic: ShowMuteIconOnReceivingVideo(false); break;
        case commands.PromptVideoCall: showVideoCallModal(); break;
        case commands.AcceptVideoCall: startVideoConference(); break;
        case commands.DeclineVideoCall: showDeclinedCallAlert(); break;
        default: break;
    }
}

const commands = {
    MutedMic: "mutedMic",
    UnmutedMic: "unmutedMic",
    PromptVideoCall: "promptVideoCall",
    AcceptVideoCall: "acceptVideoCall",
    DeclineVideoCall: "declineVideoCall",
}
And then when I receive the mutedMic command, I show an icon with a crossed-out mic. When I receive the AcceptVideoCall command, I create another peer, videoCallPeer, with a random ID, which is then sent to the other side. The other side then creates another peer with a random ID and initiates a video session with the received ID. A rough sketch of that flow is below.
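Rough sketch of the videoCallPeer flow (assumptions: masterConnection is the long-lived DataConnection, localStream is the local camera stream, and showRemoteVideo plus the "videoPeerId:" message format are hypothetical names used only for illustration):
function startVideoConference() {
    const videoCallPeer = new Peer(); // peerjs assigns a random ID
    videoCallPeer.on('open', id => {
        masterConnection.send(`videoPeerId:${id}`); // share the ID over the master connection
    });
    videoCallPeer.on('call', call => {
        call.answer(localStream); // answer with the local camera stream
        call.on('stream', remoteStream => showRemoteVideo(remoteStream));
    });
}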

Golang web server leaking memory at crypto/tls.(*block).reserve

I've got a web server written in Go.
tlsConfig := &tls.Config{
    PreferServerCipherSuites: true,
    MinVersion:               tls.VersionTLS12,
    CurvePreferences: []tls.CurveID{
        tls.CurveP256,
        tls.X25519,
    },
    CipherSuites: []uint16{
        tls.TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
        tls.TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,
        tls.TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305,
        tls.TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305,
        tls.TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
        tls.TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
    },
}

s := &http.Server{
    ReadTimeout:  5 * time.Second,
    WriteTimeout: 10 * time.Second,
    IdleTimeout:  120 * time.Second,
    Handler:      r, // where r is my router
    TLSConfig:    tlsConfig,
}

// redirect http to https
redirect := &http.Server{
    ReadTimeout:  5 * time.Second,
    WriteTimeout: 10 * time.Second,
    IdleTimeout:  120 * time.Second,
    Handler: http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        w.Header().Set("Connection", "close")
        url := "https://" + r.Host + r.URL.String()
        http.Redirect(w, r, url, http.StatusMovedPermanently)
    }),
}

go func() {
    log.Fatal(redirect.ListenAndServe())
}()

log.Fatal(s.ListenAndServeTLS(certFile, keyFile))
Here is a screenshot from my DigitalOcean dashboard.
As you can see, memory keeps growing and growing. So I started looking at https://github.com/google/pprof. Here is the output of top5.
Type: inuse_space
Time: Nov 7, 2018 at 10:31am (CET)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top5
Showing nodes accounting for 289.50MB, 79.70% of 363.24MB total
Dropped 90 nodes (cum <= 1.82MB)
Showing top 5 nodes out of 88
      flat  flat%   sum%        cum   cum%
  238.98MB 65.79% 65.79%   238.98MB 65.79%  crypto/tls.(*block).reserve
   20.02MB  5.51% 71.30%    20.02MB  5.51%  crypto/tls.Server
   11.50MB  3.17% 74.47%    11.50MB  3.17%  crypto/aes.newCipher
   10.50MB  2.89% 77.36%    10.50MB  2.89%  crypto/aes.(*aesCipherGCM).NewGCM
The SVG shows the same huge amount of memory allocated by crypto/tls.(*block).reserve.
Here is the exact code.
I spent the last few days reading every article, document, blog post, source file, and help page I could find, but nothing helps. The code is running on an Ubuntu 17.10 x64 machine using Go 1.11 inside a Docker container.
It looks like the server doesn't close the connections to the clients. I thought setting all the xyzTimeout values would help, but it didn't.
Any ideas?
Edit 12/20/2018:
Fixed now: https://github.com/golang/go/issues/28654#issuecomment-448477056
Adding an answer so this doesn't keep showing up in the list of upvoted and unanswered questions.
It appears that the memory leak was related to the gorilla/context bug (https://github.com/gorilla/sessions/commit/12bd4761fc66ac946e16fcc2a32b1e0b066f6177) and had nothing to do with TLS in the stdlib.
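For anyone hitting the same pattern, the usual mitigation before upgrading gorilla/sessions is to wrap the top-level handler in context.ClearHandler so per-request gorilla/context data is released (a minimal sketch, assuming the app uses gorilla/sessions with a plain router rather than gorilla/mux; newServer is just an illustrative helper):
package main

import (
    "crypto/tls"
    "net/http"
    "time"

    "github.com/gorilla/context"
)

// newServer mirrors the server from the question, but wraps the router in
// context.ClearHandler so request-scoped gorilla/context data is cleared
// after every request instead of accumulating.
func newServer(r http.Handler, tlsConfig *tls.Config) *http.Server {
    return &http.Server{
        ReadTimeout:  5 * time.Second,
        WriteTimeout: 10 * time.Second,
        IdleTimeout:  120 * time.Second,
        Handler:      context.ClearHandler(r),
        TLSConfig:    tlsConfig,
    }
}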

MoveIt octomap not displayed in rviz

I am working with ROS Indigo, a Sawyer robot (Rethink Robotics), a Kinect v1, and MoveIt.
I am working on a human-robot collaboration project.
After installing MoveIt, I edited "camera_link_pose" in sawyer_moveit.launch to suit my base-to-camera transform.
When I run:
roslaunch sawyer_moveit_config sawyer_moveit.launch electric_gripper:=true kinect:=true
Rviz appears and everything works, except that the octomap isn't displayed:
[screenshot: rviz_capture.png]
Here is what I have in the terminal:
~/ros_ws$ roslaunch sawyer_moveit_config sawyer_moveit.launch electric_gripper:=true kinect:=true
... logging to /home/sawyer/.ros/log/33db5810-3ace-11e7-860e-1866da494b18/roslaunch-sawyer-HP-Compaq-Pro-6300-SFF-20862.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
started roslaunch server http://169.254.5.155:38982/
SUMMARY
========
PARAMETERS
* /move_group/allow_trajectory_execution: True
* /move_group/allowed_execution_duration_scaling: 1.2
* /move_group/allowed_goal_duration_margin: 0.5
* /move_group/camera/camera_nodelet_manager/num_worker_threads: 4
* /move_group/camera/depth_rectify_depth/interpolation: 0
* /move_group/camera/depth_registered_rectify_depth/interpolation: 0
* /move_group/camera/disparity_depth/max_range: 4.0
* /move_group/camera/disparity_depth/min_range: 0.5
* /move_group/camera/disparity_registered_hw/max_range: 4.0
* /move_group/camera/disparity_registered_hw/min_range: 0.5
* /move_group/camera/disparity_registered_sw/max_range: 4.0
* /move_group/camera/disparity_registered_sw/min_range: 0.5
* /move_group/camera/driver/data_skip: 0
* /move_group/camera/driver/debug: False
* /move_group/camera/driver/depth_camera_info_url:
* /move_group/camera/driver/depth_frame_id: camera_depth_opti...
* /move_group/camera/driver/depth_registration: False
* /move_group/camera/driver/device_id: #1
* /move_group/camera/driver/diagnostics_max_frequency: 30.0
* /move_group/camera/driver/diagnostics_min_frequency: 30.0
* /move_group/camera/driver/diagnostics_tolerance: 0.05
* /move_group/camera/driver/diagnostics_window_time: 5.0
* /move_group/camera/driver/enable_depth_diagnostics: False
* /move_group/camera/driver/enable_ir_diagnostics: False
* /move_group/camera/driver/enable_rgb_diagnostics: False
* /move_group/camera/driver/rgb_camera_info_url:
* /move_group/camera/driver/rgb_frame_id: camera_rgb_optica...
* /move_group/capabilities: move_group/MoveGr...
* /move_group/controller_list: [{'default': True...
* /move_group/controller_manager_name: simple_controller...
* /move_group/head/planner_configs: ['SBLkConfigDefau...
* /move_group/jiggle_fraction: 0.05
* /move_group/max_range: 5.0
* /move_group/max_safe_path_cost: 1
* /move_group/move_group/octomap_frame: camera_link
* /move_group/move_group/octomap_resolution: 0.02
* /move_group/move_group/point_subsample: 1
* /move_group/move_group/sensors: [{'max_range': 5....
* /move_group/moveit_controller_manager: moveit_simple_con...
* /move_group/moveit_manage_controllers: True
* /move_group/octomap_resolution: 0.025
* /move_group/planner_configs/BKPIECEkConfigDefault/border_fraction: 0.9
* /move_group/planner_configs/BKPIECEkConfigDefault/failed_expansion_score_factor: 0.5
* /move_group/planner_configs/BKPIECEkConfigDefault/min_valid_path_fraction: 0.5
* /move_group/planner_configs/BKPIECEkConfigDefault/range: 0.0
* /move_group/planner_configs/BKPIECEkConfigDefault/type: geometric::BKPIECE
* /move_group/planner_configs/ESTkConfigDefault/goal_bias: 0.05
* /move_group/planner_configs/ESTkConfigDefault/range: 0.0
* /move_group/planner_configs/ESTkConfigDefault/type: geometric::EST
* /move_group/planner_configs/KPIECEkConfigDefault/border_fraction: 0.9
* /move_group/planner_configs/KPIECEkConfigDefault/failed_expansion_score_factor: 0.5
* /move_group/planner_configs/KPIECEkConfigDefault/goal_bias: 0.05
* /move_group/planner_configs/KPIECEkConfigDefault/min_valid_path_fraction: 0.5
* /move_group/planner_configs/KPIECEkConfigDefault/range: 0.0
* /move_group/planner_configs/KPIECEkConfigDefault/type: geometric::KPIECE
* /move_group/planner_configs/LBKPIECEkConfigDefault/border_fraction: 0.9
* /move_group/planner_configs/LBKPIECEkConfigDefault/min_valid_path_fraction: 0.5
* /move_group/planner_configs/LBKPIECEkConfigDefault/range: 0.0
* /move_group/planner_configs/LBKPIECEkConfigDefault/type: geometric::LBKPIECE
* /move_group/planner_configs/PRMkConfigDefault/max_nearest_neighbors: 10
* /move_group/planner_configs/PRMkConfigDefault/type: geometric::PRM
* /move_group/planner_configs/PRMstarkConfigDefault/type: geometric::PRMstar
* /move_group/planner_configs/RRTConnectkConfigDefault/range: 0.0
* /move_group/planner_configs/RRTConnectkConfigDefault/type: geometric::RRTCon...
* /move_group/planner_configs/RRTkConfigDefault/goal_bias: 0.05
* /move_group/planner_configs/RRTkConfigDefault/range: 0.0
* /move_group/planner_configs/RRTkConfigDefault/type: geometric::RRT
* /move_group/planner_configs/RRTstarkConfigDefault/delay_collision_checking: 1
* /move_group/planner_configs/RRTstarkConfigDefault/goal_bias: 0.05
* /move_group/planner_configs/RRTstarkConfigDefault/range: 0.0
* /move_group/planner_configs/RRTstarkConfigDefault/type: geometric::RRTstar
* /move_group/planner_configs/SBLkConfigDefault/range: 0.0
* /move_group/planner_configs/SBLkConfigDefault/type: geometric::SBL
* /move_group/planner_configs/TRRTkConfigDefault/frountierNodeRatio: 0.1
* /move_group/planner_configs/TRRTkConfigDefault/frountier_threshold: 0.0
* /move_group/planner_configs/TRRTkConfigDefault/goal_bias: 0.05
* /move_group/planner_configs/TRRTkConfigDefault/init_temperature: 10e-6
* /move_group/planner_configs/TRRTkConfigDefault/k_constant: 0.0
* /move_group/planner_configs/TRRTkConfigDefault/max_states_failed: 10
* /move_group/planner_configs/TRRTkConfigDefault/min_temperature: 10e-10
* /move_group/planner_configs/TRRTkConfigDefault/range: 0.0
* /move_group/planner_configs/TRRTkConfigDefault/temp_change_factor: 2.0
* /move_group/planner_configs/TRRTkConfigDefault/type: geometric::TRRT
* /move_group/planning_plugin: ompl_interface/OM...
* /move_group/planning_scene_monitor/publish_geometry_updates: True
* /move_group/planning_scene_monitor/publish_planning_scene: True
* /move_group/planning_scene_monitor/publish_state_updates: True
* /move_group/planning_scene_monitor/publish_transforms_updates: True
* /move_group/request_adapters: default_planner_r...
* /move_group/right_arm/planner_configs: ['SBLkConfigDefau...
* /move_group/start_state_max_bounds_error: 0.1
* /move_group/use_controller_manager: True
* /robot_description_kinematics/head/kinematics_solver: kdl_kinematics_pl...
* /robot_description_kinematics/head/kinematics_solver_attempts: 3
* /robot_description_kinematics/head/kinematics_solver_search_resolution: 0.005
* /robot_description_kinematics/head/kinematics_solver_timeout: 0.005
* /robot_description_kinematics/right_arm/kinematics_solver: kdl_kinematics_pl...
* /robot_description_kinematics/right_arm/kinematics_solver_attempts: 10
* /robot_description_kinematics/right_arm/kinematics_solver_search_resolution: 0.005
* /robot_description_kinematics/right_arm/kinematics_solver_timeout: 0.005
* /robot_description_planning/joint_limits/right_j0/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j0/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j0/max_acceleration: 3.5
* /robot_description_planning/joint_limits/right_j0/max_velocity: 0.88
* /robot_description_planning/joint_limits/right_j1/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j1/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j1/max_acceleration: 2.5
* /robot_description_planning/joint_limits/right_j1/max_velocity: 0.678
* /robot_description_planning/joint_limits/right_j2/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j2/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j2/max_acceleration: 5.0
* /robot_description_planning/joint_limits/right_j2/max_velocity: 0.996
* /robot_description_planning/joint_limits/right_j3/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j3/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j3/max_acceleration: 5.0
* /robot_description_planning/joint_limits/right_j3/max_velocity: 0.996
* /robot_description_planning/joint_limits/right_j4/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j4/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j4/max_acceleration: 5.0
* /robot_description_planning/joint_limits/right_j4/max_velocity: 1.776
* /robot_description_planning/joint_limits/right_j5/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j5/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j5/max_acceleration: 5.0
* /robot_description_planning/joint_limits/right_j5/max_velocity: 1.776
* /robot_description_planning/joint_limits/right_j6/has_acceleration_limits: True
* /robot_description_planning/joint_limits/right_j6/has_velocity_limits: True
* /robot_description_planning/joint_limits/right_j6/max_acceleration: 5.0
* /robot_description_planning/joint_limits/right_j6/max_velocity: 2.316
* /robot_description_semantic: <?xml version="1....
* /rosdistro: indigo
* /rosversion: 1.11.21
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/head/kinematics_solver: kdl_kinematics_pl...
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/head/kinematics_solver_attempts: 3
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/head/kinematics_solver_search_resolution: 0.005
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/head/kinematics_solver_timeout: 0.005
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/right_arm/kinematics_solver: kdl_kinematics_pl...
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/right_arm/kinematics_solver_attempts: 10
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/right_arm/kinematics_solver_search_resolution: 0.005
* /rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334/right_arm/kinematics_solver_timeout: 0.005
NODES
/move_group/camera/
camera_nodelet_manager (nodelet/nodelet)
depth_metric (nodelet/nodelet)
depth_metric_rect (nodelet/nodelet)
depth_points (nodelet/nodelet)
depth_rectify_depth (nodelet/nodelet)
depth_registered_hw_metric_rect (nodelet/nodelet)
depth_registered_metric (nodelet/nodelet)
depth_registered_rectify_depth (nodelet/nodelet)
depth_registered_sw_metric_rect (nodelet/nodelet)
disparity_depth (nodelet/nodelet)
disparity_registered_hw (nodelet/nodelet)
disparity_registered_sw (nodelet/nodelet)
driver (nodelet/nodelet)
ir_rectify_ir (nodelet/nodelet)
points_xyzrgb_hw_registered (nodelet/nodelet)
points_xyzrgb_sw_registered (nodelet/nodelet)
register_depth_rgb (nodelet/nodelet)
rgb_debayer (nodelet/nodelet)
rgb_rectify_color (nodelet/nodelet)
rgb_rectify_mono (nodelet/nodelet)
/move_group/
camera_base_link (tf/static_transform_publisher)
camera_base_link1 (tf/static_transform_publisher)
camera_base_link2 (tf/static_transform_publisher)
camera_base_link3 (tf/static_transform_publisher)
camera_link_broadcaster (tf/static_transform_publisher)
/
move_group (moveit_ros_move_group/move_group)
rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334 (rviz/rviz)
ROS_MASTER_URI=http://021611CP00073.local:11311
core service [/rosout] found
process[move_group/camera/camera_nodelet_manager-1]: started with pid [20874]
process[move_group/camera/driver-2]: started with pid [20875]
process[move_group/camera/rgb_debayer-3]: started with pid [20876]
process[move_group/camera/rgb_rectify_mono-4]: started with pid [20877]
process[move_group/camera/rgb_rectify_color-5]: started with pid [20878]
process[move_group/camera/ir_rectify_ir-6]: started with pid [20879]
process[move_group/camera/depth_rectify_depth-7]: started with pid [20883]
process[move_group/camera/depth_metric_rect-8]: started with pid [20890]
process[move_group/camera/depth_metric-9]: started with pid [20891]
process[move_group/camera/depth_points-10]: started with pid [20902]
process[move_group/camera/register_depth_rgb-11]: started with pid [20906]
process[move_group/camera/points_xyzrgb_sw_registered-12]: started with pid [20907]
process[move_group/camera/depth_registered_sw_metric_rect-13]: started with pid [20919]
process[move_group/camera/depth_registered_rectify_depth-14]: started with pid [20923]
process[move_group/camera/points_xyzrgb_hw_registered-15]: started with pid [20926]
[ INFO] [1495016462.598547792]: Initializing nodelet with 4 worker threads.
process[move_group/camera/depth_registered_hw_metric_rect-16]: started with pid [20927]
process[move_group/camera/depth_registered_metric-17]: started with pid [20941]
process[move_group/camera/disparity_depth-18]: started with pid [20946]
process[move_group/camera/disparity_registered_sw-19]: started with pid [20948]
process[move_group/camera/disparity_registered_hw-20]: started with pid [20952]
process[move_group/camera_base_link-21]: started with pid [20953]
process[move_group/camera_base_link1-22]: started with pid [20965]
process[move_group/camera_base_link2-23]: started with pid [20969]
process[move_group/camera_base_link3-24]: started with pid [20980]
process[move_group/camera_link_broadcaster-25]: started with pid [20984]
process[move_group-26]: started with pid [20986]
process[rviz_sawyer_HP_Compaq_Pro_6300_SFF_20862_7403073962417655334-27]: started with pid [20993]
[ INFO] [1495016462.693509168]: Number devices connected: 1
[ INFO] [1495016462.693602953]: 1. device on bus 000:00 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'A00366907653103A'
[ INFO] [1495016462.694673734]: Searching for device with index = 1
(rviz:20993): Gtk-WARNING **: Unable to locate theme engine in module_path: "adwaita",
[ INFO] [1495016462.750378327]: rviz version 1.11.15
[ INFO] [1495016462.750424452]: compiled against Qt version 4.8.6
[ INFO] [1495016462.750436423]: compiled against OGRE version 1.8.1 (Byatis)
[ INFO] [1495016462.778280205]: Starting a 3s RGB and Depth stream flush.
[ INFO] [1495016462.778359537]: Opened 'Xbox NUI Camera' on bus 0:0 with serial number 'A00366907653103A'
[ INFO] [1495016462.779418408]: Loading robot model 'sawyer'...
[ INFO] [1495016462.859296395]: Stereo is NOT SUPPORTED
[ INFO] [1495016462.859370647]: OpenGl version: 3 (GLSL 1.3).
[ WARN] [1495016462.895187551]: Could not find any compatible depth output mode for 1. Falling back to default depth output mode 1.
[ INFO] [1495016462.938451761]: Loading robot model 'sawyer'...
[ INFO] [1495016462.996939730]: Publishing maintained planning scene on 'monitored_planning_scene'
[ INFO] [1495016463.005977717]: MoveGroup debug mode is ON
Starting context monitors...
[ INFO] [1495016463.006036440]: Starting scene monitor
[ INFO] [1495016463.016688848]: Listening to '/planning_scene'
[ INFO] [1495016463.016734632]: Starting world geometry monitor
[ INFO] [1495016463.025046719]: Listening to '/collision_object' using message notifier with target frame '/base '
[ INFO] [1495016463.033351917]: Listening to '/planning_scene_world' for planning scene world geometry
[ INFO] [1495016463.062230866]: Listening to '/attached_collision_object' for attached collision objects
Context monitors started.
[ INFO] [1495016463.110704157]: Initializing OMPL interface using ROS parameters
[ INFO] [1495016463.159454318]: Using planning interface 'OMPL'
[ INFO] [1495016463.189799983]: Param 'default_workspace_bounds' was not set. Using default value: 10
[ INFO] [1495016463.192609516]: Param 'start_state_max_bounds_error' was set to 0.1
[ INFO] [1495016463.195020096]: Param 'start_state_max_dt' was not set. Using default value: 0.5
[ INFO] [1495016463.197465266]: Param 'start_state_max_dt' was not set. Using default value: 0.5
[ INFO] [1495016463.201089427]: Param 'jiggle_fraction' was set to 0.05
[ INFO] [1495016463.203597709]: Param 'max_sampling_attempts' was not set. Using default value: 100
[ INFO] [1495016463.203654817]: Using planning request adapter 'Add Time Parameterization'
[ INFO] [1495016463.203682529]: Using planning request adapter 'Fix Workspace Bounds'
[ INFO] [1495016463.203711417]: Using planning request adapter 'Fix Start State Bounds'
[ INFO] [1495016463.203723132]: Using planning request adapter 'Fix Start State In Collision'
[ INFO] [1495016463.203761613]: Using planning request adapter 'Fix Start State Path Constraints'
[ WARN] [1495016463.215644770]:
Deprecation warning: parameter 'allowed_execution_duration_scaling' moved into namespace 'trajectory_execution'.
Please, adjust file trajectory_execution.launch.xml!
[ WARN] [1495016463.217830741]:
Deprecation warning: parameter 'allowed_goal_duration_margin' moved into namespace 'trajectory_execution'.
Please, adjust file trajectory_execution.launch.xml!
[ INFO] [1495016463.481330910]: Added FollowJointTrajectory controller for /robot/limb/right
[ INFO] [1495016463.481450028]: Returned 1 controllers in list
[ INFO] [1495016463.516304840]: Trajectory execution is managing controllers
Loading 'move_group/ApplyPlanningSceneService'...
Loading 'move_group/ClearOctomapService'...
Loading 'move_group/MoveGroupCartesianPathService'...
Loading 'move_group/MoveGroupExecuteService'...
Loading 'move_group/MoveGroupExecuteTrajectoryAction'...
Loading 'move_group/MoveGroupGetPlanningSceneService'...
Loading 'move_group/MoveGroupKinematicsService'...
Loading 'move_group/MoveGroupMoveAction'...
Loading 'move_group/MoveGroupPickPlaceAction'...
Loading 'move_group/MoveGroupPlanService'...
Loading 'move_group/MoveGroupQueryPlannersService'...
Loading 'move_group/MoveGroupStateValidationService'...
[ INFO] [1495016463.767432287]:
********************************************************
* MoveGroup using:
* - ApplyPlanningSceneService
* - ClearOctomapService
* - CartesianPathService
* - ExecuteTrajectoryService
* - ExecuteTrajectoryAction
* - GetPlanningSceneService
* - KinematicsService
* - MoveAction
* - PickPlaceAction
* - MotionPlanService
* - QueryPlannersService
* - StateValidationService
********************************************************
[ INFO] [1495016463.767507058]: MoveGroup context using planning plugin ompl_interface/OMPLPlanner
[ INFO] [1495016463.767531704]: MoveGroup context initialization complete
All is well! Everyone is happy! You can start planning now!
[ INFO] [1495016463.848688873]: rgb_frame_id = 'camera_rgb_optical_frame'
[ INFO] [1495016463.848798099]: depth_frame_id = 'camera_depth_optical_frame'
[ WARN] [1495016463.876687681]: Camera calibration file /home/sawyer/.ros/camera_info/rgb_A00366907653103A.yaml not found.
[ WARN] [1495016463.876777590]: Using default parameters for RGB camera calibration.
[ WARN] [1495016463.876814517]: Camera calibration file /home/sawyer/.ros/camera_info/depth_A00366907653103A.yaml not found.
[ WARN] [1495016463.876862956]: Using default parameters for IR camera calibration.
[ INFO] [1495016465.779364068]: Stopping device RGB and Depth stream flush.
[ INFO] [1495016466.428636329]: Loading robot model 'sawyer'...
[ INFO] [1495016466.532779604]: Loading robot model 'sawyer'...
[ INFO] [1495016466.585212741]: Starting scene monitor
[ INFO] [1495016466.592453780]: Listening to '/move_group/monitored_planning_scene'
[ INFO] [1495016467.461837577]: No active joints or end effectors found for group ''. Make sure you have defined an end effector in your SRDF file and that kinematics.yaml is loaded in this node's namespace.
[ INFO] [1495016467.469390902]: Constructing new MoveGroup connection for group 'right_arm' in namespace ''
[ INFO] [1495016468.301998541]: TrajectoryExecution will use old service capability.
[ INFO] [1495016468.302066666]: Ready to take MoveGroup commands for group right_arm.
[ INFO] [1495016468.302105415]: Looking around: no
[ INFO] [1495016468.302133111]: Replanning: no
I think I have tried almost everything to make it work. I followed several tutorials, but my configuration seems to be OK. Here is some documentation I have been through:
MoveIt tutorials:
http://docs.ros.org/kinetic/api/moveit_tutorials/html/doc/pr2_tutorials/planning/src/doc/perception_configuration.html
MoveIt Google group:
https://groups.google.com/forum/#!forum/moveit-users
Baxter Kinect integration for MoveIt:
http://sdk.rethinkrobotics.com/wiki/Kinect_basics
MoveIt installation and tutorial for Sawyer:
http://sdk.rethinkrobotics.com/intera/MoveIt_Tutorial
It seems that I am not the only one unable to see the octomap in rviz. Maybe someone here has already been confronted with this problem?
Thanks in advance!
I finally found a solution; check here if you are interested: https://groups.google.com/forum/#!topic/moveit-users/Fz-Hxhjr5zI
There is a useless "move_group" namespace block in sawyer_moveit_sensor_manager.launch.xml.

Rails 5 jquery file upload image orientation rotated 90 degrees for portraits

I'm having trouble with the image orientation after uploading an image to my S3 account: portrait images are rotated 90 degrees when displayed. Here is the CoffeeScript I am using, from the Heroku "Direct to S3 Image Uploads in Rails" tutorial.
# Use this instead of jQuery -> with Turbolinks. Turbolinks will trigger the ready page:load.
document.addEventListener 'turbolinks:load', ->
  $('.directUpload').find('input:file').each (i, elem) ->
    fileInput    = $(elem)
    form         = $(fileInput.parents('form:first'))
    submitButton = form.find('input[type="submit"]')
    progressBar  = $('<div class=\'bar\'></div>')
    barContainer = $('<div class=\'progress\'></div>').append(progressBar)
    fileInput.after barContainer
    fileInput.fileupload
      fileInput: fileInput
      url: form.data('url')
      type: 'POST'
      autoUpload: true
      formData: form.data('form-data')
      paramName: 'file'
      dataType: 'XML'
      replaceFileInput: false
      progressall: (e, data) ->
        progress = parseInt(data.loaded / data.total * 100, 10)
        progressBar.css 'width', progress + '%'
        return
      start: (e) ->
        submitButton.prop 'disabled', true
        progressBar.css('background', 'green').css('display', 'block').css('width', '0%').text 'Loading...'
        return
      done: (e, data) ->
        submitButton.prop 'disabled', false
        progressBar.text 'Uploading done'
        # extract key and generate URL from response
        key = $(data.jqXHR.responseXML).find('Key').text()
        url = '//' + form.data('host') + '/' + key
        # create hidden field
        input = $('<input />',
          type: 'hidden'
          name: fileInput.attr('name')
          value: url)
        form.append input
        return
      fail: (e, data) ->
        submitButton.prop 'disabled', false
        progressBar.css('background', 'red').text 'Failed'
        return
    return
  return
I'm using an image_tag to display the image from my S3 account.
= image_tag current_company.logo.image_url
I'm not sure what I need to do to fix the orientation. I did find the imageOrientation option in the blueimp jQuery-File-Upload wiki (Options#imageorientation), but I have no clue whether this is what I'm looking for, or how to use it if it is. Any help would be appreciated.
This is my first time posting on here so please be gentle.

Display a getUserMedia stream as live video with Media Source Extensions (MSE)

I am trying to display a MediaStream taken from a webcam using getUserMedia, and to relay it to a remote peer using whatever mechanism possible for it to be played (as an experiment). I am not using WebRTC directly, as I want control over the raw data.
The issue I encounter is that my video element displays nothing, and I don't get any errors back. I am using Chrome Version 51.0.2704.103 (64-bit) on elementary OS (an Ubuntu 14.04 based Linux OS).
As a side note, if I record all the blobs into an array, then create a new blob and set the video's src attribute to URL.createObjectURL(blob), it displays the video correctly.
Here is the code I tried to accomplish this (minus the relaying, I'm just trying to play it locally):
var ms = new MediaSource();
var video = document.querySelector("video");
video.src = window.URL.createObjectURL(ms);

ms.addEventListener("sourceopen", function() {
    var sourceBuffer = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"');

    navigator.getUserMedia({video: {width: 320, height: 240, framerate: 30}, audio: true}, function(stream) {
        var recorder = new MediaRecorder(stream);
        recorder.ondataavailable = function(event) {
            var reader = new FileReader();
            reader.addEventListener("loadend", function () {
                var uint8Chunk = new Uint8Array(reader.result);
                if (!sourceBuffer.updating) {
                    sourceBuffer.appendBuffer(uint8Chunk);
                }
                if (video.paused) video.play();
            });
            reader.readAsArrayBuffer(event.data);
        };
        recorder.start(10);
    }, function(error) {
        console.error(error);
    });
}, false);
Here is the info I get in chrome://media-internals:
render_id: 147
player_id: 0
pipeline_state: kPlaying
event: WEBMEDIAPLAYER_CREATED
url: blob:http%3A//localhost%3A8080/e5c51dd8-5709-4e6f-9457-49ac8c34756b
found_audio_stream: true
audio_codec_name: opus
found_video_stream: true
video_codec_name: vp8
duration: unknown
audio_dds: false
audio_decoder: OpusAudioDecoder
video_dds: false
video_decoder: FFmpegVideoDecoder
Also the log:
00:00:00 00 pipeline_state kCreated
00:00:00 00 event WEBMEDIAPLAYER_CREATED
00:00:00 00 url blob:http%3A//localhost%3A8080/e5c51dd8-5709-4e6f-9457-49ac8c34756b
00:00:00 00 pipeline_state kInitDemuxer
00:00:01 603 found_audio_stream true
00:00:01 603 audio_codec_name opus
00:00:01 603 found_video_stream true
00:00:01 603 video_codec_name vp8
00:00:01 604 duration unknown
00:00:01 604 pipeline_state kInitRenderer
00:00:01 604 audio_dds false
00:00:01 604 audio_decoder OpusAudioDecoder
00:00:01 604 video_dds false
00:00:01 604 video_decoder FFmpegVideoDecoder
00:00:01 604 pipeline_state kPlaying
Update: I've tried sending the data to node and saving it to a webm file with ffmpeg (fluent-ffmpeg), and I can view the file in VLC correctly.
Update 2: After streaming it back from node, I get the following error: "Media segment did not contain any video coded frames, mismatching initialization segment. Therefore, MSE coded frame processing may not interoperably detect discontinuities in appended media." After doing some research, it appears that WebM files must be segmented to work; however, I have not come across a way to do this (with ffmpeg or other tools) for live streams. Any ideas here?
A little late, but you can try it like this (in chrome):
<html>
<body>
    <video class="real1" autoplay controls></video>
    <video class="real2" controls></video>

    <script>
        const constraints = {video: {width: 320, height: 240, framerate: 30}, audio: true};
        const video1 = document.querySelector('.real1');
        const video2 = document.querySelector('.real2');

        var mediaSource = new MediaSource();
        video2.src = window.URL.createObjectURL(mediaSource);

        var sourceBuffer;
        mediaSource.addEventListener('sourceopen', function () {
            sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs=opus,vp8');
            console.log(sourceBuffer);
        })

        var isFirst = true;
        var mediaRecorder;
        var i = 0;

        function handleSuccess(stream) {
            video1.srcObject = stream;
            mediaRecorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=opus,vp8' });
            console.log(mediaRecorder.mimeType)

            mediaRecorder.ondataavailable = function (e) {
                var reader = new FileReader();
                reader.onload = function (e) {
                    sourceBuffer.appendBuffer(new Uint8Array(e.target.result));
                }
                reader.readAsArrayBuffer(e.data);
                if (video2.paused) {
                    video2.play(0);
                }
            }
            mediaRecorder.start(20);
        }

        function handleError(error) {
            console.error('Reeeejected!', error);
        }

        navigator.mediaDevices.getUserMedia(constraints).
            then(handleSuccess).catch(handleError);
    </script>
</body>
</html>
I think you missed setting the same (supported) codec on both the recorder and the sourceBuffer.
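As a quick sanity check, both sides can be asked up front whether they support the codec string before wiring anything together (these are standard browser APIs; adjust the mime type to whatever you intend to use):
const mime = 'video/webm; codecs=opus,vp8';
console.log('MediaRecorder supports it:', MediaRecorder.isTypeSupported(mime));
console.log('MediaSource supports it:', MediaSource.isTypeSupported(mime));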