[SOLVED] Randomly moving objects and collision detection and LIDAR sensors


#1

Hey :smile:

I created my own environment and added some objects (boxes, spheres, cylinders). Now I want to move those objects randomly in the scene with a predefined speed, and also detect collisions between those objects so that they do not get stuck on each other but rather move apart after a collision.
I am aware that I have to implement different states in a State Machine. However, I have almost no experience with the SMACH framework, which leads to my question:
Are there any good examples/templates with State Machines including moving objects or collision detection, so i can get more familiar with this topic?

Any help is very much appreciated. I already had a look at the SMACH tutorial section in the guide book, but thought there might already exist some experiments facing the same challenge.

Thanks a lot,
Michael


#2

Dear Michael,

Did you find this tutorial here from our guide book?
https://developer.humanbrainproject.eu/docs/projects/HBP%20Neurorobotics%20Platform/2.0/nrp/tutorials/experiment/index.html
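
If you just want to get a feel for the SMACH framework itself, a bare-bones state machine (independent of the NRP specifics; the state and outcome names below are only placeholders) looks roughly like this:

import smach

# Example state; in a real experiment its execute() method would, for
# instance, pick a new random target pose for one of your objects.
class MoveObject(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['moved'])

    def execute(self, userdata):
        # do the actual work here and report the outcome
        return 'moved'

sm = smach.StateMachine(outcomes=['finished'])
with sm:
    smach.StateMachine.add('MOVE_OBJECT', MoveObject(),
                           transitions={'moved': 'finished'})
outcome = sm.execute()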

Hope this helps
Axel


#3

Dear Axel,

thanks for your help. I already read through this tutorial and managed to implement my own State Machine. All of my objects already move randomly within my environment.
The next part I have to solve is the collision detection between the Husky robot and the dynamic obstacles, which I haven’t found any tutorials for.
Do you have some literature suggestions for me to read through as well?

Best,
Michael


#4

Dear Michael,

The collisions are taken care of as long as your objects have a collision model. The physics, though, are calculated only if the static tag is not set in the sdf model.

Axel


#5

Hi Axel,

thanks for your help, I guess I have everything I need!

Best
Michael


#6

You can also check for objects in collision from the gazebo topic /gazebo/default/physics/contacts. You can print gazebo messages published on that topic to the screen by running this command:

gz topic -e /gazebo/default/physics/contacts

This will give you information about objects that are in contact. Here’s an example of a printed message when two objects are in contact:

 time {
 sec: 883
 nsec: 65000000
 }
 contact {
  collision1: "cricket_ball::link::collision"
  collision2: "google_table::table::VIS_base_link_table_1"
  position {
    x: 0.11189242865928939
    y: 0.75254470920056571
    z: 0.80541542439723779
  }
  normal {
    x: 1.575676779922674e-08
    y: -0.0020068037381166218
    z: 0.99999798636735093
  }
  depth: 2.482008419418813e-05
  wrench {
    body_1_name: "cricket_ball::link::collision"
    body_1_id: 6
    body_2_name: "google_table::table::VIS_base_link_table_1"
    body_2_id: 13
    body_1_wrench {
      force {
        x: 6.4642662686211782e-05
        y: -0.00061505375368574506
        z: 0.54482728522598123
      }
      torque {
        x: 1.7565163713578517e-05
        y: -2.5286415452464464e-06
        z: 3.4584856813321769e-05
      }
    }
    body_2_wrench {
      force {
        x: -6.5644156477974387e-05
        y: -0.00047216259739397715
        z: -0.54482742767763892
      }
      torque {
        x: -0.41050065145078374
        y: 0.060925683580282233
        z: -3.7930036771771328e-05
      }
    }
  }
  time {
     sec: 883
    nsec: 66000000
  }
  world: "default"
  mechanicalContact: true
}

I’m sure you can listen to Gazebo messages from a Python script, which would allow you to use this information in transfer functions, but I haven’t looked into that.
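
Untested, but if the contact information is also republished on a ROS topic (I am guessing here, something like /gazebo/contact_point_data with gazebo_msgs/ContactsState messages), a transfer function along these lines might be a starting point:

@nrp.MapRobotSubscriber("contacts", Topic('/gazebo/contact_point_data', gazebo_msgs.msg.ContactsState))
@nrp.Robot2Neuron()
def log_contacts(t, contacts):
    # contacts.value stays None until the first message has been received
    if contacts.value is not None:
        for state in contacts.value.states:
            clientLogger.info('Contact between %s and %s'
                              % (state.collision1_name, state.collision2_name))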

Hope this helps.


#7

Hi,

thanks for your reply, this helps a lot!

Another problem I am facing is the following (maybe also related to collisions):
I’d like to add a LIDAR sensor to the Husky robot, in order to get the distances to every obstacle sensed by the robot. I found a Gazebo tutorial (https://www.clearpathrobotics.com/2013/11/husky-simulation-in-gazebo/) which explains how to add such a LIDAR sensor to the Husky robot. However, according to that tutorial I need to find a .urdf.xacro file of the Husky robot in which I can add the lines printed in the tutorial. The problem is, I couldn’t find such a file. The only Husky-specific files I found were the model.config & model.sdf files (in the $HBP/Models/husky_model/ directory).

Do you have any suggestions on how I could add the suggested LIDAR sensor to the model.sdf file, in order to attach it to our Husky robot? Unfortunately, I am not very familiar with ROS and Gazebo, which is why I’m asking for some help here :slight_smile: .

Thank you very much,
Michael


#8

Hi Michael,

The .xacro file is just an XML file with macros that generates other XML files, such as URDF or SDF. You could either add the sensor in a .xacro file (for example here https://github.com/husky/husky/tree/kinetic-devel/husky_description/urdf ) and then generate an SDF as seen here http://answers.gazebosim.org/question/2818/please-help-how-to-open-urdfxacro-file-in-gazebo/

Another alternative would be to add the LIDAR sensor directly to the SDF as seen here http://gazebosim.org/tutorials?tut=guided_i1

Let me know how it goes, hope it helps.

Regards,
Manos


#9

Hi Manos,

thanks for these links. I’ll definitely try out the last one, adding the LIDAR sensor directly into the robot’s SDF file.

I will keep you updated on how it is going.
Thanks again :smile:

Best,
Michael


#10

Hey again,

I managed to add the LIDAR sensor to the model.sdf file of my Husky robot and attached it via a fixed joint.
However, I could use some help on how to implement the Transfer Function, in order to actually use the sensor, constantly rotate it and get the measured sensor values.
It would be great if you could help me out here!

Best,
Michael


#11

Hi Michael,

Great news for the sensor, but I am afraid that what you just did only added the geometrical model on top of the Husky and does not yet receive data from the sensor. If that is the case, I would recommend looking for another tutorial which shows how you can read the data of the LIDAR sensor from ROS. Once you have the data in ROS it is trivial to get it from a TF. Let me know when you have made some progress or if you are stuck.

Regards,
Manos


#12

Hi Manos,

unfortunately, I couldn’t find any further information on how to receive the sensor data.
My question is:
Can I implement the Gazebo plugin as explained in the last two chapters of the tutorial you gave me (http://gazebosim.org/tutorials?tut=guided_i1), connect it to ROS and then somehow use it in the NRP transfer function?

I don’t have any experience with ROS or Gazebo, so I’m quite stuck here :smiley:

Thanks,
Michael


#13

Hi Michael,

Maybe you are lucky and ROS already knows that the sensor data is there. That would be the most fortunate case, so I would recommend checking that first. A simple test can be done either from the ROS console in the NRP or from just a Linux terminal. You just type

rostopic list

This shows you a list of the available topics from which you can read robot sensory data. If you are lucky, something like /gazebo/robot/laser/scan, or anything at all relevant to the LIDAR, should appear in the list. If you cannot find anything, paste the output of the command here and I can take a look.

Regards,
Manos


#14

Hi Manos,

/base_bumper
/bl_bumper
/br_bumper
/clock
/fl_bumper
/fr_robot_bumper
/gazebo/contact_point_data
/gazebo/default/physics/contacts
/gazebo/link_states
/gazebo/model_states
/gazebo/set_link_state
/gazebo/set_model_state
/husky/cmd_vel
/husky/wheel_speeds
/joint_states
/odom
/ros_cle_simulation/0/lifecycle
/ros_cle_simulation/cle_error
/ros_cle_simulation/logs
/ros_cle_simulation/status
/rosout
/rosout_agg
/tf

That’s all I found, unfortunately nothing similar to what you wrote.
Thanks for your help :smile:

Regards,
Michael


#15

Hi Michael,

You can try adding these lines (modified according to your model) to the SDF of the model, and pray that it works:

<sensor name='head_hokuyo_sensor' type='ray'>
    <visualize>0</visualize>
    <update_rate>50</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>360</samples>
          <resolution>1</resolution>
          <min_angle>-1.5708</min_angle>
          <max_angle>1.5708</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.1</min>
        <max>30</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name='gazebo_ros_head_hokuyo_controller' filename='libgazebo_ros_laser.so'>
      <topicName><YOUR MODEL NAMESPACE HERE, FOR EXAMPLE husky>/laser/scan</topicName>
      <frameName>lms100</frameName>
    </plugin>
    <pose frame=''>0.16 0 0.15 0 -0 0</pose>
  </sensor>

If that does not work, then I am afraid you will have to write your own code, which might be tricky.

Regards,
Manos


#16

Hi Manos,

thanks for your effort!
I’m pretty sure it works properly.
I added the plugin lines to my sensor’s .sdf file:

 <!-- Add a ray sensor, and give it a name -->
    <sensor type="ray" name="sensor">

      <!-- Position the ray sensor based on the specification. Also rotate
           it by 90 degrees around the X-axis so that the <horizontal> rays
           become vertical -->
      <pose>0 0 -0.004645 1.5707 0 0</pose>

      <!-- Enable visualization to see the rays in the GUI -->
      <visualize>true</visualize>

      <!-- Set the update rate of the sensor -->
      <update_rate>30</update_rate>
      <ray>
        <noise>
          <!-- Use gaussian noise -->
          <type>gaussian</type>
          <mean>0.0</mean>
          <stddev>0.02</stddev>
        </noise>
        <!-- The scan element contains the horizontal and vertical beams.
             We are leaving out the vertical beams for this tutorial. -->
        <scan>

          <!-- The horizontal beams -->
          <horizontal>
            <!-- The velodyne has 32 beams(samples) -->
            <samples>32</samples>

            <!-- Resolution is multiplied by samples to determine number of
                 simulated beams vs interpolated beams. See:
                 http://sdformat.org/spec?ver=1.6&elem=sensor#horizontal_resolution
                 -->
            <resolution>1</resolution>

            <!-- Minimum angle in radians -->
            <min_angle>-0.53529248</min_angle>

            <!-- Maximum angle in radians -->
            <max_angle>0.18622663</max_angle>
          </horizontal>
        </scan>

        <!-- Range defines characteristics of an individual beam -->
        <range>

          <!-- Minimum distance of the beam -->
          <min>0.05</min>

          <!-- Maximum distance of the beam -->
          <max>70</max>

          <!-- Linear resolution of the beam -->
          <resolution>0.02</resolution>
        </range>
    </ray>
    <plugin name='gazebo_ros_head_hokuyo_controller' filename='libgazebo_ros_laser.so'>
      <topicName>husky/laser/scan</topicName>
      <frameName>lms100</frameName>
    </plugin>
    </sensor>

My transfer function now looks like this:

@nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
@nrp.Robot2Neuron()
def read_lidar_sensor (t, lidar):
    clientLogger.info(lidar.value.ranges)

which prints me a vector of 32 range values. I guess that’s because there are 32 beams in my sensor.
So far so good.
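
Just as a sanity check, I plan to turn those ranges into a distance and bearing of the closest obstacle, roughly like this (only a sketch based on the LaserScan fields, not tested yet):

@nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
@nrp.Robot2Neuron()
def closest_obstacle(t, lidar):
    scan = lidar.value
    if scan is None:
        return
    # keep only readings within the sensor's valid range limits
    valid = [(r, i) for i, r in enumerate(scan.ranges)
             if scan.range_min <= r <= scan.range_max]
    if not valid:
        return
    distance, index = min(valid)
    # bearing of the closest beam, relative to the sensor frame
    angle = scan.angle_min + index * scan.angle_increment
    clientLogger.info('Closest obstacle: %.2f m at %.2f rad' % (distance, angle))
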
However, I don’t see the laser beams visualized inside the NRP. Whenever I load the sensor model into an empty Gazebo world, all 32 beams are shown, which I would like to have for better visualization. Do you have any further ideas on how to accomplish that?

Anyways, thank you very much for your assistance!
Best,
Michael


#17

Hi Michael,

I am very happy that you managed to get the sensor data. As for the visualization part, I am afraid there is no easy way to do it in the NRP. I would ask @lguyot to also comment in case he has another idea. If you want to go down the long road of implementing the visualization yourself, we could guide you through it and then you could contribute to our code!

An easy solution would be to have a Gazebo client running simultaneously with the platform; then you could see the rays there, of course with the overhead of running two separate, computationally intensive processes.

Regards,
Manos


#18

Hey @ManosAngelidis,

thanks again!
I would be interested in implementing the visualization myself. However, I am currently working on a project due on Sunday, 24th September, and am studying for exams as well. As soon as I am done with my exams (start of October) I would like to try out the implementation, if you could guide me through it :smile:

For now, I would like to try the easy solution: Is it possible to run a Gazebo client simultaneously with the platform and see in Gazebo everything that happens in the NRP simulation (together with the functionality implemented in State Machines and Transfer Functions)? Or would it just be a visualization of the robot with its beams?

Thanks again for your great support!

Regards,
Michael


#19

Hi Michael,

We could discuss in technical detail when you have some time. In fact we are actively looking for master students to work with us on their thesis. If you are interested I would be more than happy to discuss.

Back to our topic: running standalone Gazebo is done from the command line. Just type

gzclient

and this should open a Gazebo GUI. Whatever you do to the simulation from the NRP (State Machines, TFs etc.) is immediately reflected in the Gazebo client.

Regards,
Manos


#20

Hi Manos,

that sounds great.
I will definitely try out the Gazebo client!

However, I found another little issue:
I can listen to the LIDAR sensor, as well as the ranges it measures, from inside the ROS terminal via:

rostopic echo /husky/laser/scan

With my transfer function looking like this:

@nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
@nrp.Robot2Neuron()
def read_lidar_sensor (t, lidar):
    clientLogger.info(lidar.value)

I get an output like the following:

header: 
  seq: 0
  stamp: 
    secs: 261
    nsecs: 120000000
  frame_id: "lms100"
angle_min: -1.57079994678
angle_max: 1.57079994678
angle_increment: 0.165347367525
time_increment: 0.0
scan_time: 0.0
range_min: 0.0500000007451
range_max: 70.0
ranges: [11.119016647338867, 11.276110649108887, 11.76264762878418, 12.633127212524414, 13.717867851257324, 13.573426246643066, 12.840919494628906, 11.745536804199219, 11.089377403259277, 10.800564765930176, 10.780200004577637, 11.114788055419922, 11.700382232666016, 12.87326717376709, 13.52707576751709, 13.617573738098145, 12.547259330749512, 11.655122756958008, 11.20589542388916, 11.018685340881348]
intensities: [4.611424017015466e+24, 0.0, 3.587324068671532e-43, 4.200548419618381e-37, 1.4048981859558812e-31, 1.4048981859558812e-31, 1.464617060559576e-31, 0.0, 1.3831035802961553e-31, 1.3833631294488169e-31, 0.0, 89128.9609375, 1.421413411387194e-31, 1.6815581571897805e-44, 1.825119699469524e-12, 1.825119699469524e-12, 1.4547222193120943e-31, 4.183307740144345e-37, 4.136909290640148e-37, 1.6815581571897805e-44]

To access the range values specifically, I tried modifying the Transfer Function to the following:

@nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
@nrp.Robot2Neuron()
def read_lidar_sensor (t, lidar):
    clientLogger.info(lidar.value.ranges)

The problem is, sometimes I receive an array containing all the measured ranges, and sometimes I get the error “NoneType object has no attribute ‘ranges’”. It worked the other day without me changing anything. It seems like I can only access the range values inside the Transfer Function via lidar.value.ranges while I am simultaneously listening to the topic in the ROS terminal, which seems kind of strange.
Do you see my mistake here?
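
For now I am simply guarding against the missing value like this (just a workaround sketch, I am not sure it is the proper fix):

@nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
@nrp.Robot2Neuron()
def read_lidar_sensor(t, lidar):
    # lidar.value is None until the first LaserScan message has arrived
    if lidar.value is not None:
        clientLogger.info(lidar.value.ranges)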

Thank you for your great support!
Best,
Michael