raspicam_node's Issues

Code Quality

Hi!

I don't want to come across as rude, but as someone who cares deeply about code quality, I do read the code of the software I want to use.

I deeply appreciate the effort that went into making this work; the Raspberry Pi camera node is of great use to all of us. However...

Over the past few years, the C++ community has developed many tools to avoid common coding errors.

After reading the code of this node I realised it is not using most of them, and so I find it hard to believe that it will keep working correctly in the long term. I think this can be damaging to the software, and it can easily be avoided.

I would like to help with that, but I can't rewrite everything to better coding standards myself; I simply don't have that much time on my hands.

So I would like to describe some of the problems here, along with possible solutions, using https://github.com/isocpp/CppCoreGuidelines as a reference.

Things that are problematic (a small sketch illustrating these points follows the list):

  1. Global variables - https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#Ri-global

The problem with them is that they can be modified by anything, and it is hard to track down what modifies them. That can lead to bugs which are hard to debug.

Even moving everything that is currently global into a struct owned by the main() function helps a lot: it forces you to pass the struct around only where it is needed, which makes the code much clearer.

  2. Pointers -
    C++ provides many mechanisms to avoid raw pointers, because pointers have several problems:
    -> you do not know whether it is valid - it can be nullptr
    -> ownership of the data is not clear - who should delete it?

If you use a pointer argument just to modify the pointed-to object, use a reference instead.

Avoid C-style arrays (a pointer to the beginning of the array). C++ provides several container types that make it easy to avoid problems like indexing past the end of the array or having to remember the array size separately. They work, they are reliable, and they let you avoid one of the most common pitfalls.

And they are used by all of the ROS APIs I've seen.

  3. Wrong data types

Things like "int abort;" should be declared with a data type that reflects what they are: abort in PORT_USERDATA is clearly a boolean, so it is better to use bool instead.

covered by: https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md#Ri-typed
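
To make this concrete, here is the small sketch mentioned above, with hypothetical names (this is not the node's actual code): the formerly global state lives in a struct owned by main(), abort becomes a bool, the state is passed by reference rather than by pointer, and a container replaces a C-style array.

    // Minimal sketch with hypothetical names -- not raspicam_node's actual code.
    #include <cstdint>
    #include <vector>

    struct NodeState {             // was: a handful of file-scope globals
      bool abort = false;          // was: "int abort;"
      int framerate = 30;
      std::vector<uint8_t> frame;  // was: C-style array + separate size variable
    };

    // Reference parameter: cannot be nullptr, and ownership stays with the caller.
    void capture_one_frame(NodeState& state) {
      state.frame.assign(640 * 480 * 3, 0);  // the container knows its own size
    }

    int main() {
      NodeState state;             // single owner with a clear lifetime
      while (!state.abort) {
        capture_one_frame(state);
        state.abort = true;        // in the real node a signal handler would set this
      }
      return 0;
    }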

...and there are potentially more problems, or at least signs of parts that could be improved to make the whole thing more reliable.


I would like to help make this more reliable with the tools the C++ language provides and which, in my opinion, should be used.

I am more than willing to explain anything, or to try to improve some parts of the code, but it is better to discuss this before any work begins, and I don't have enough time to work on everything.

Raspbian installation: "No definition of [dynamic_reconfigure] for OS [debian]"

I followed the installation guide but when I run:

rosdep install --from-paths src --ignore-src --rosdistro=indigo -y

I get this error:

ERROR: the following packages/stacks could not have their rosdep keys resolved to system dependencies: raspicam_node: No definition of [dynamic_reconfigure] for OS [debian]

I'm using a Raspberry Pi 3 B+ with Raspbian 9 Stretch and the Indigo ROS distro.

Support request for usb webcams

Needing a high update rate, I am looking for ways to quickly transfer images over the RPi's wifi.
The package "video_stream_opencv" doesn't offer compression, and "usb_cam" is outdated and throws many errors.
Would it be possible to use the Pi's hardware encoder for arbitrary webcams?

EDIT: Sorry, "video_stream_opencv" does indeed offer compressed images
through the topic "camera/image_raw/compressed".

Turtlebot3 RaspiCam/Android-Teleop No Image

Issue (as originally submitted Oct 4/17 on discourse.ros.org): Trying to view the Raspicam publication “/raspicam_node/image/compressed” on a connected Galaxy Tab (4.0.4) with Android Teleop, but no image is displayed (though Teleop works).

Action: 1) Running “rosnode info /android/camera_view” shows that the node subscribes to “/compressed_image [unknown type]”.
2) I tried a suggested workaround: "rosrun image_transport republish compressed in:=/raspicam_node/image raw out:=/raspicam_node/image". The attached "rqt_graph" report shows the new image nodes and Android connections, but still no image on the Android device. TB3-Wii-Android-rosgraph_201771009.png.pdf. I'd appreciate any troubleshooting suggestions - Thanks.

Use full resolution of cam v2

Hi, I'm trying to use the Pi cam v2 at full resolution (3280x2464), but there is an issue when starting in that mode. Is this a known issue? And can I fix it somehow?
(screenshot attached: Selection_164)

Missing Launch File

magni_demos fiducial_follow.launch
... logging to /home/ubuntu/.ros/log/04b680ea-ce4a-11e7-941f-b92814654e31/roslaunch-ubiquityrobotB5C4-4408.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

while processing /opt/ros/kinetic/share/raspicam_node/launch/camerav2_410x308_30fps.launch:
Invalid roslaunch XML syntax: [Errno 2] No such file or directory: u'/opt/ros/kinetic/share/raspicam_node/launch/camerav2_410x308_30fps.launch'
The traceback for the exception was written to the log file

Features

Hey,

is it possible to choose where the taken pictures are saved?
And is it possible to take a time lapse?

BR

low framerate

The camerav1_1280x720.launch only gives me 51 Hz on /raspicam_node/image/compressed. Is this expected?

With roslaunch raspicam_node camerav1_1280x720.launch enable_raw:=true this drops to 21 Hz on /raspicam_node/image/compressed and 9 Hz on /raspicam_node/image.
It appears that the framerate drops in general when enable_raw is enabled. Is it possible to select a raw-only mode?

How can I achieve a framerate of 90 Hz for uncompressed raw images?

How to use the node for camera calibration

Hi, I'm Leon, an engineer from ROBOTIS.
I noticed that this package came from https://github.com/fpasteau/raspicam_node, which is no longer maintained, and I really appreciate that you are continuing to maintain this very important package.

By the way, I'm currently working on an official TurtleBot3 example that uses this raspicam package,
but to do that I needed to do camera calibration for the Raspberry Pi camera. The thing is, the node doesn't publish the raw image that was available previously.

As far as I know, the camera calibration node needs the raw image, and as you commented in another issue, yes, I need to make an extra decoder node. Plus, I need to modify the code a little so it responds to the service calls from the camera calibration node. However, to put this into the official TurtleBot3 package, I should minimize the number of packages used.

So here is what I need.

  1. Could you bring raspicam_raw_node here, or modify raspicam_node so it can publish the raw image?

  2. Could you add the service response code for the camera calibration node?

If you are very busy but would still like these features in the package, I can implement them and send a PR.

Cheers.
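
A note on request 2: in other ROS camera drivers, the set_camera_info service that the camera_calibration node calls is usually provided by camera_info_manager rather than written by hand. A rough sketch of that wiring, assuming roscpp and camera_info_manager are available (this is not raspicam_node's actual code):

    // Hedged sketch: camera_info_manager advertises ~set_camera_info itself,
    // which is the service the camera_calibration node needs.
    #include <string>

    #include <camera_info_manager/camera_info_manager.h>
    #include <ros/ros.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "raspicam_node");
      ros::NodeHandle nh("~");

      std::string url;
      nh.param<std::string>("camera_info_url", url, "");

      // Loads the calibration YAML, advertises ~set_camera_info, and saves
      // whatever calibration the calibrator commits back.
      camera_info_manager::CameraInfoManager cinfo(nh, "raspicam", url);

      ros::spin();
      return 0;
    }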

Horizontal/Vertical flip being ignored

Hi, like you, I've also been maintaining my own fork of the raspicam_node project. To simplify maintenance, I'd like to drop my project and switch to yours. However, I'm having trouble getting all the same parameters working with your node.

I noticed that you've also implemented the vFlip and hFlip flags to enable the camera's hardware support for image flipping, but these don't seem to be accessible via launch parameters.

I have a Pi camera mounted so that it needs to be flipped both vertically and horizontally in order to appear "upright", so I'm using the launch file:

<launch>
    <node type="raspicam_node" pkg="raspicam_node" name="raspicam_node" output="screen">
        
        <param name="framerate" type="int" value="30" />
        <param name="exposure_mode" value="antishake" />
        <param name="shutter_speed" value="0" />
        <param name="quality" type="int" value="10" />
        <param name="width" type="int" value="320" />
        <param name="height" type="int" value="240" />
        
        <param name="hFlip" type="int" value="1" />
        <param name="vFlip" type="int" value="1" />
        
        <param name="camera_frame_id" value="raspicam" />
        
    </node>
</launch>

and when I run it with:

roslaunch myrobot myraspicam.launch

I see the output:

[ INFO] [1519266611.729751741]: Loading CameraInfo from package://raspicam_node/camera_info/camerav2_410x308.yaml
[ INFO] [1519266611.869569770]: Camera component done

[ INFO] [1519266611.876421849]: Encoder component done

[ INFO] [1519266611.891155693]: camera calibration URL: package://raspicam_node/camera_info/camerav2_410x308.yaml
[ INFO] [1519266611.950335288]: Camera successfully calibrated
[ INFO] [1519266612.084681134]: Reconfigure Request: contrast 4, sharpness 0, brightness 55, saturation 0, ISO 460, exposureCompensation 0, videoStabilisation 0, vFlip 0, hFlip 0, zoom 1.00, exposure_mode antishake, awb_mode auto
[ INFO] [1519266612.089540454]: Reconfigure done
[ INFO] [1519266612.144103542]: Starting video capture (320, 240, 10, 30)

[ INFO] [1519266612.145129791]: Video capture started

showing that even though I'm passing in vFlip=1 and hFlip=1, they're still both set to 0. Viewing the image feed from rqt_image_view confirms the image renders in the original unflipped layout.

Why is this? Am I not using the param tags correctly, or is this not yet fully supported? If it's not yet supported, I've fully implemented this in my own node, so I could submit a patch if you're accepting contributions.
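
For what it's worth, the camera's hardware flip is normally applied through the MMAL mirror parameter on the camera's output ports; a rough sketch of that call, based on the standard Raspberry Pi userland API (not copied from this node), looks like:

    // Hedged sketch using the Raspberry Pi userland MMAL API.
    #include "interface/mmal/mmal.h"

    static MMAL_STATUS_T set_flips(MMAL_COMPONENT_T* camera, bool hflip, bool vflip) {
      MMAL_PARAMETER_MIRROR_T mirror = {{MMAL_PARAMETER_MIRROR, sizeof(MMAL_PARAMETER_MIRROR_T)},
                                        MMAL_PARAM_MIRROR_NONE};
      if (hflip && vflip)
        mirror.value = MMAL_PARAM_MIRROR_BOTH;
      else if (hflip)
        mirror.value = MMAL_PARAM_MIRROR_HORIZONTAL;
      else if (vflip)
        mirror.value = MMAL_PARAM_MIRROR_VERTICAL;

      MMAL_STATUS_T status = MMAL_SUCCESS;
      // The mirror setting has to be applied to each camera output port.
      for (unsigned i = 0; i < camera->output_num && status == MMAL_SUCCESS; ++i)
        status = mmal_port_parameter_set(camera->output[i], &mirror.hdr);
      return status;
    }

Whatever the eventual fix is, the vFlip/hFlip values from the parameters (or from dynamic_reconfigure) need to end up in a call like this.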

raspicam_node/image_raw

Hello,

Why does my raspicam_node/image_raw topic have no publisher?


pi@pi-desktop:~$ rostopic info /raspicam_node/image_raw
Type: sensor_msgs/Image

Publishers: None

Subscribers:

Grayscale images

Hi!

Would it be possible with raspicam_node to get a grayscale image from the camera and publish it instead of the color one?

Thank you!

stereo mode

Hi, how do I use this with the Compute Module to publish nodes for 2 cameras?

Cannot find references for awb_gains

Hello, I'm planning to use the raspicam in an application where I have to use a custom white balance. According to the API reference, we can change it with awb_gains,

but when I try to read the values (setting awb_mode to "off" does not change that), it says:

 ERROR: Parameter [/raspicam_node/awb_gains] is not set

And searching the repo, I cannot find any reference to it...

So, is there a way to use this setting? (I'm trying, but it seems very buggy.)

Support ROS Melodic + Ubuntu Mate Bionic 18.04

tldr:
Could you provide instructions or a binary for that environment?

Full story:
I have a Raspi3 B+ with Ubuntu MATE 18.04 and ROS Melodic installed, with a working V2 cam. Part of my motivation for this environment is that I want this Raspberry Pi ROS setup to interact with an Nvidia Jetson Nano ROS setup, and this is the current matching environment for the Nano.

Following the build instructions until this step (note the substitution of 'melodic' for 'kinetic'):
$ rosdep install --from-paths src --ignore-src --rosdistro=melodic -y

which results in

ERROR: the following packages/stacks could not have their rosdep keys resolved
to system dependencies:
raspicam_node: Cannot locate rosdep definition for [libraspberrypi0]

But I think that dependency is/should be met:

$ apt policy libraspberrypi0
libraspberrypi0:
  Installed: 1.20190215-1~bionic1.3
  Candidate: 1.20190215-1~bionic1.3
  Version table:
 *** 1.20190215-1~bionic1.3 990
        990 http://ppa.launchpad.net/ubuntu-pi-flavour-makers/ppa/ubuntu bionic/main armhf Packages
        100 /var/lib/dpkg/status

$ apt policy libraspberrypi-dev
libraspberrypi-dev:
  Installed: 1.20190215-1~bionic1.3
  Candidate: 1.20190215-1~bionic1.3
  Version table:
 *** 1.20190215-1~bionic1.3 990
        990 http://ppa.launchpad.net/ubuntu-pi-flavour-makers/ppa/ubuntu bionic/main armhf Packages
        100 /var/lib/dpkg/status

I suspect it is a simple problem involving how the dependency is expressed and resolved, but I'm not familiar with those mechanisms.

Any guidance would be welcome!
-Chris

set video_device

Where can I change the video port (such as /dev/video0, /dev/video1)?
If that is not possible, how can I use raspicam_node with a USB cam?

thanks,
Gil

camera_component not initialized

If I type:

roslaunch raspicam_node camerav1_1280x720.launch

I get the following error:

camera_component not initialzed

The camera is not working and I don't know how to fix this. Any help would be appreciated :-)

Error when installing "Some packages could not be installed".

By using the command from the manpage, I get:
pi@raspberrypi:~/rosbots_catkin_ws $ sudo apt install ros-kinetic-raspicam-node

Reading package lists... Done
Building dependency tree
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
ros-kinetic-raspicam-node : Depends: ros-kinetic-camera-info-manager but it is not installable
Depends: ros-kinetic-compressed-image-transport but it is not installable
Depends: ros-kinetic-dynamic-reconfigure but it is not installable
Depends: ros-kinetic-roscpp but it is not installable
Depends: ros-kinetic-sensor-msgs but it is not installable
Depends: ros-kinetic-std-msgs but it is not installable
Depends: ros-kinetic-std-srvs but it is not installable
E: Unable to correct problems, you have held broken packages.

I am using Kinetic on a Raspberry Pi 3B+. The camera is connected and works with raspistill.
I tried installing ROS from both:

  1. This image here: https://medium.com/@rosbots/ready-to-use-image-raspbian-stretch-ros-opencv-324d6f8dcd96
  2. Manually: http://wiki.ros.org/ROSberryPi/Installing%20ROS%20Kinetic%20on%20the%20Raspberry%20Pi

Both approaches yield the same error.
Any idea what to do?

cannot locate rosdep definition for [libraspberrypi0]

I followed the instructions and got the error when running "rosdep install --from-paths src --ignore-src --rosdistro=kinetic -y"

Here is what I did, any clues?

  1. I am using Indigo rather than Kinetic
  2. I created a new folder "/etc/ros/rosdep/sources.list.d/", set up a file "30-ubiquity.list", and added this line to it: yaml https://raw.githubusercontent.com/UbiquityRobotics/rosdep/master/raspberry-pi.yaml
  3. installed the dependencies using
    cd ~/catkin_ws
    rosdep install --from-paths src --ignore-src --rosdistro=kinetic -y

then got the error above.

standalone library for raspicam

Would you mind splitting this project into a dedicated library and a thin ROS wrapper? This would make most of the code reusable by other projects.

The ROS node now has quite a lot of functionality that goes beyond what can be achieved with most C/C++ libraries for the Raspberry Pi camera. It would be useful for other projects to have this functionality available without ROS dependencies, and it would also allow most of the code to be reused for a ROS2 node. The ROS guidelines generally recommend exporting core functionality to a library and providing only a thin wrapper for the ROS node, so that most of the code is reusable.
A stand-alone library could simply interface via OpenCV images (or plain arrays), and the ROS node would provide the configuration and communication.
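
To illustrate, a ROS-free interface could be as small as something like this (hypothetical names; just a sketch of the split, not an existing API):

    // Hedged sketch of a ROS-free camera library interface (hypothetical names).
    #include <cstdint>
    #include <functional>
    #include <vector>

    namespace raspicam {

    struct Frame {
      int width = 0;
      int height = 0;
      std::vector<uint8_t> data;  // packed BGR8 or JPEG bytes
    };

    struct Settings {
      int width = 640;
      int height = 480;
      int framerate = 30;
      bool raw = false;           // raw frames vs hardware JPEG
    };

    class Camera {
     public:
      explicit Camera(const Settings& settings);
      // The library hands frames to a callback; the ROS wrapper turns them into
      // sensor_msgs and publishes them, while other users can do whatever they like.
      void start(std::function<void(const Frame&)> on_frame);
      void stop();
    };

    }  // namespace raspicam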

Compiling Error from Source.

Hi,
I am trying to compile the latest branch of the package from source under my catkin_ws, but I'm getting the following errors:

/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp:273:11: error: ‘PORT_USERDATA {aka struct MMAL_PORT_USERDATA_T}’ has no member named ‘pState’
    pData->pState.updater.update();
           ^
/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp: In function ‘int main(int, char**)’:
/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp:1350:12: error: base operand of ‘->’ has non-pointer type ‘RASPIVID_STATE’
   state_srv->updater.setHardwareID("raspicam");
            ^
/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp:1358:29: error: base operand of ‘->’ has non-pointer type ‘RASPIVID_STATE’
         image_pub, state_srv->updater, FrequencyStatusParam(&min_freq, &max_freq, 0.1, 10), TimeStampStatusParam(0, 0.2)));
                             ^
/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp:1363:27: error: base operand of ‘->’ has non-pointer type ‘RASPIVID_STATE’
         imv_pub, state_srv->updater, FrequencyStatusParam(&min_freq, &max_freq, 0.1, 10), TimeStampStatusParam(0, 0.2)));
                           ^
/home/pi/catkin_ws/src/raspicam_node/src/raspicam_node.cpp:1367:28: error: base operand of ‘->’ has non-pointer type ‘RASPIVID_STATE’
       cimage_pub, state_srv->updater, FrequencyStatusParam(&min_freq, &max_freq, 0.1, 10), TimeStampStatusParam(0, 0.2)));
                            ^
raspicam_node/CMakeFiles/raspicam_node.dir/build.make:62: recipe for target 'raspicam_node/CMakeFiles/raspicam_node.dir/src/raspicam_node.cpp.o' failed
make[2]: *** [raspicam_node/CMakeFiles/raspicam_node.dir/src/raspicam_node.cpp.o] Error 1
CMakeFiles/Makefile2:2163: recipe for target 'raspicam_node/CMakeFiles/raspicam_node.dir/all' failed
make[1]: *** [raspicam_node/CMakeFiles/raspicam_node.dir/all] Error 2
Makefile:138: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j4 -l4" failed

I would appreciate any hint on how to solve this. I have Ubuntu MATE and ROS Kinetic on my Raspberry Pi 3 B.

EDIT (@rohbotics) Formatting

Figure out Packaging

Issues with packaging:

  • Build Infrastructure, Who builds it? How often?
  • Proper dependency management (PR #19)
  • ABI stability of VideoCore library
  • Keeping fake x86/x64 support? Just make a fake Debian package?

Ubuntu-Mate (16.04.3) Source Compile error

Raspberry Pi 3 (Turtlebot3) w/V2 camera, Linux cassini 4.9.53-v7+ #1040 SMP Fri Oct 6 14:19:18 BST 2017 armv7l armv7l armv7l GNU/Linux,
following https://github.com/UbiquityRobotics/raspicam_node/issues/new instructions.

Having to reinstall Ubuntu-Mate/ROS and raspicam_node (due to an SD card crash) on my RasPi3 resulted in many compile errors in "RaspiCamControl.h", so the "make -j4 -l4" failed. I'd appreciate your advice. Ross

[ 84%] Built target turtlebot3_msgs_generate_messages
[ 86%] Generating dynamic reconfigure files from cfg/Camera.cfg: /home/turtlebot/catkin_ws/devel/include/raspicam_node/CameraConfig.h /home/turtlebot/catkin_ws/devel/lib/python2.7/dist-packages/raspicam_node/cfg/CameraConfig.py
Scanning dependencies of target raspicli
Scanning dependencies of target raspicamcontrol
[ 88%] Building CXX object raspicam_node/CMakeFiles/raspicli.dir/src/RaspiCLI.cpp.o
[ 90%] Building CXX object raspicam_node/CMakeFiles/raspicamcontrol.dir/src/RaspiCamControl.cpp.o
Generating reconfiguration files for Camera in raspicam_node
Wrote header file in /home/turtlebot/catkin_ws/devel/include/raspicam_node/CameraConfig.h
[ 90%] Built target raspicam_node_gencfg
In file included from /home/turtlebot/catkin_ws/src/raspicam_node/src/RaspiCamControl.cpp:37:0:
/home/turtlebot/catkin_ws/src/raspicam_node/include/RaspiCamControl.h:116:5: error: ‘MMAL_PARAM_EXPOSUREMODE_T’ does not name a type
MMAL_PARAM_EXPOSUREMODE_T exposureMode;
^
/home/turtlebot/catkin_ws/src/raspicam_node/include/RaspiCamControl.h:117:5: error: ‘MMAL_PARAM_EXPOSUREMETERINGMODE_T’ does not name a type
MMAL_PARAM_EXPOSUREMETERINGMODE_T exposureMeterMode;.

and many more similar errors. The RaspiCamControl.h file is in the include directory.

Raw output errors

Decoding the CompressedImage stream is adding a prohibitive amount of latency to my application, so I'm trying to investigate using raw images instead.

I've seen in your code you can enable raw Image output on raspicam_node/image by adding <param name="enable_raw" value="true"/> to the launch file.

This seems to work, but rviz and my node(s) are struggling to process the topic. I get the following error from rviz for example:

[ERROR] [1539965934.276631376]: Error loading image: OGRE EXCEPTION(2:InvalidParametersException): Stream size does not match calculated image size in Image::loadRawData at /build/ogre-1.9-mqY1wq/ogre-1.9-1.9.0+dfsg1/OgreMain/src/OgreImage.cpp (line 283)

From the CV bridge I get:

mg = bridge.imgmsg_to_cv2(msg)
File "/opt/ros/kinetic/lib/python2.7/dist-packages/cv_bridge/core.py", line 171, in imgmsg_to_cv2
    dtype=dtype, buffer=img_msg.data)
TypeError: buffer is too small for requested array

Is this a bug or do I need to enable something else to work with raw images?

Raspicam_node generates blackened images whereas uvc_camera doesn't

I was trying to run magni_nav aruco.launch.
It calls raspicam_node to publish the images, and the node generates highly blackened images.

I tried all the different launch files in raspicam_node. Same result every time (blackened images).

Also, can I make fiducial_slam work with the uvc_camera node?
I tried publishing the topics under the correct names, but it didn't work.

PS: I am using a 5MP raspi_cam (probably a Chinese clone).

Raw Image does not Load into Rviz.

Hi there,
When I enable the enable_raw flag to get the raw image, unfortunately the image cannot be loaded into Rviz. The reason is that the data being published on the /raspicam_node/image topic is somehow corrupted. Here is an instance of the published raw image:

header: seq: 1109 stamp: secs: 1555185561 nsecs: 712540212 frame_id: raspicam height: 308 width: 410 encoding: bgr8 is_bigendian: 0 step: 1230 data: <array type: uint8, length: 399360>

Looking at the published raw image above, we see that the stream size (data length: 399360) does not match the calculated image size (308 × 410 × 3 = 378840), and that is why Rviz cannot visualise it.
If I use the image_transport package to uncompress the compressed image published on /raspicam_node/image/compressed, I get the following data, and I can visualise it in Rviz.

header: seq: 1246 stamp: secs: 1555185575 nsecs: 412069710 frame_id: raspicam height: 308 width: 410 encoding: bgr8 is_bigendian: 0 step: 1230 data: <array type: uint8, length: 378840>

I would appreciate any hint on how to fix this bug. Thank you
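
For what it's worth, 399360 bytes is exactly what a VideoCore-padded buffer would be for this mode: MMAL usually pads the width up to a multiple of 32 and the height up to a multiple of 16, and 416 × 320 × 3 = 399360, while the ROS message expects 410 × 308 × 3 = 378840. If that is indeed the cause, one possible fix is to copy the frame row by row and drop the padding before publishing. A rough sketch under those assumptions (not the node's actual code):

    // Hedged sketch: copy a VideoCore-padded BGR8 buffer into a tightly packed one.
    // Alignment values are the usual MMAL ones (width to 32); illustrative only.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    static size_t align_up(size_t x, size_t a) { return (x + a - 1) / a * a; }

    std::vector<uint8_t> strip_padding(const uint8_t* padded, int width, int height) {
      const size_t padded_stride = align_up(width, 32) * 3;       // bytes per padded row
      const size_t packed_stride = static_cast<size_t>(width) * 3;
      std::vector<uint8_t> packed(packed_stride * height);
      for (int row = 0; row < height; ++row) {
        std::memcpy(&packed[row * packed_stride], padded + row * padded_stride, packed_stride);
      }
      return packed;  // for 410x308 this is 378840 bytes instead of 399360
    }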

Want both raw and compressed topics

In my project I need the raw data from the Pi camera sampled and processed using CV. I find that the republish approach really drives up CPU utilization (and power and heat) at larger frame sizes. I need to record high-res images at 10 fps and downscale them for image analysis.

It's really useful to have the hardware JPEG encoding, but it would be better to make a separate JPEG encoder node that takes raw images in and uses the Pi hardware to encode them.

Another very good addition would be to expose the Pi's H.264 hardware encoder in the node,

or maybe a new node needs to be developed for that.

Good stuff, but I'm going to have to try out dbangolds raspicam_node, which offers raw output.

raspicam_node stop publishing after a few seconds of launching

Hi everyone,

I was trying to run the raspicam_node with a v1.3 camera attached to a Raspberry Pi 3B+ installed with the Ubiquity Robotics Raspberry Pi image for camera calibration. I tried the following launch command -

roslaunch raspicam_node camerav2_1280x960_10fps.launch enable_raw:=true

I can see with rostopic list that it was publishing correctly for a few seconds, then it stops. I tried camerav1_1280x720.launch instead, with the same behaviour. If I drop the fps in the launch file to 10fps from 90fps, the publishing only stops after a minute or so. Lowering the resolution did not change this. The behaviour repeats when killing and rerunning the launch file without rebooting, so the connection with the camera is presumably still alive?

However I can run the motion vector command with no issue of stopping -

roslaunch raspicam_node camerav2_410x308_30fps.launch enable_imv:=true

Has anyone here seen this behaviour before?

Thank you.

The topic names shouldn't start with raspicam_node

Hi,

Thank you very much for your work. I have only one problem: the topic names shouldn't start with raspicam_node. Topic names should be as minimal as possible, to allow the use of namespaces.

In my project I want the topics to be /robotname/image, /robotname/image/compressed...
Normally this should be as easy as putting a namespace on the raspicam node, but currently I have to manually remap the topics, which is not future proof.
This is part of ROS good practice.

Thanks in advance.
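
For reference, the usual way to get namespace-friendly names is for the node to advertise relative topic names and let the launch file's namespace do the rest; a minimal sketch (not this node's actual code) looks like:

    // Hedged sketch: advertise a relative topic name so the surrounding
    // namespace decides the final name.
    #include <image_transport/image_transport.h>
    #include <ros/ros.h>

    int main(int argc, char** argv) {
      ros::init(argc, argv, "raspicam_node");
      ros::NodeHandle nh;  // resolves names in the parent namespace
      image_transport::ImageTransport it(nh);
      image_transport::Publisher pub = it.advertise("image", 1);
      // Launched inside <group ns="robotname">, this publishes /robotname/image
      // and /robotname/image/compressed without any manual remapping.
      ros::spin();
      return 0;
    }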

Framerate at 1/3

I have been using this package for a few weeks now, but suddenly the frame rate of the published images is around 1/3 of the frame rate I set when initializing the node: 10 -> 3.3 or 60 -> 20.

I use the v2 cam and ROS Kinetic. I have already re-installed the package, both from source and not from source.

Any idea?

[raspicam_node-2] process has died on Raspberry pi3, Linux raspberrypi 4.9.59-v7+

... logging to /home/pi/.ros/log/dc2f88fc-f841-11e7-b831-b827eb75bfae/roslaunch-raspberrypi-1075.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://raspberrypi:36793/

SUMMARY

PARAMETERS

  • /raspicam_node/camera_frame_id: raspicam
  • /raspicam_node/camera_info_url: package://raspica...
  • /raspicam_node/exposure_mode: antishake
  • /raspicam_node/framerate: 10
  • /raspicam_node/height: 960
  • /raspicam_node/shutter_speed: 0
  • /raspicam_node/width: 1280
  • /rosdistro: kinetic
  • /rosversion: 1.12.12

NODES
/
raspicam_node (raspicam_node/raspicam_node)

auto-starting new master
process[master]: started with pid [1085]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to dc2f88fc-f841-11e7-b831-b827eb75bfae
process[rosout-1]: started with pid [1106]
started core service [/rosout]
process[raspicam_node-2]: started with pid [1117]
[ INFO] [1515834746.562220740]: Loading CameraInfo from package://raspicam_node/camera_info/camerav2_1280x960.yaml
[ INFO] [1515834746.899745114]: Camera component done

[ INFO] [1515834746.906983500]: Encoder component done

[ INFO] [1515834746.932041469]: camera calibration URL: package://raspicam_node/camera_info/camerav2_1280x960.yaml
[raspicam_node-2] process has died [pid 1117, exit code -11, cmd /home/pi/ros_ws/devel/lib/raspicam_node/raspicam_node __name:=raspicam_node __log:=/home/pi/.ros/log/dc2f88fc-f841-11e7-b831-b827eb75bfae/raspicam_node-2.log].
log file: /home/pi/.ros/log/dc2f88fc-f841-11e7-b831-b827eb75bfae/raspicam_node-2*.log

Performance issues with raspicam_node

Hi, thanks a lot for contributing raspicam_node. Just out of curiosity, what kind of publishing rates do you experience with this node running on a Raspberry Pi 3? I currently have it running at a frequency of ~3.8 Hz and was wondering whether this is the expected publishing rate.

The current setup I have is:
Master machine: Ubuntu 16.04.2 LTS
Raspi: Ubuntu MATE 16.04.2
