[SOLVED] Uploading custom robots/models


#1

Hi NRP Community,

I am having a bit of trouble understanding how to upload custom models. Specifically, I am looking to upload a UAV model to the platform. The NRP tutorials seem to show a drone with a DVS camera. This is the timestamped video I am referring to: https://youtu.be/ohbBo1qYrU8?t=92.

I am running the Docker install of NRP 3.2.0 on Ubuntu 18.04.

  1. Is the drone model in the video something that is publicly available?

  2. I have attempted to upload other gazebo models I found online to the NRP platform but they all seem to error out. I managed to connect to the backend and see the directory structure of the existing models. Even after finding models that mimic that structure, I have had no luck.

  3. How does one add a DVS plugin to the robot? I cannot seem to find any tutorials regarding this.

  4. Is there a public space where people can share their NRP robots/environments?


#2

Hi,

  1. The drone in the video was used for a demo and is not included in the public Models repo.
  2. The public spaces are the Models and Experiments repositories, which you are welcome to fork and submit to.

We can, however, gladly provide some information on where to find a suitable drone model and how to add a DVS plugin in the next message on this thread. I have also taken note for the dev team to add the drone model and experiment as an NRP template, since you’re not the first one to ask.

https://hbpneurorobotics.atlassian.net/browse/NUIT-285

Best regards
Axel


#3

Hi Axel,

Thanks for your response!

It would be amazing if you could share how to go about finding a drone model and attaching a DVS plugin! Also, thank you for raising the issue with the dev team!

Best,
Nishant


#4

Hi,

We are preparing a stripped-down version of our experiment with the Hector drone (https://github.com/RAFALAMAO/hector_quadrotor_noetic)
and will send it to you later today.

Best


#5

Hi Axel,

This is very kind of you. I am grateful for your help and looking forward to it :slight_smile:

Best,
Nishant


#6

Here is a template experiment with a Hector drone, an attached DVS camera and a transfer function that displays the output of the camera.

To set up the experiment:

  1. Rename both attached files with a .zip extension.

  2. Import the zipped experiment (droneDVS.zip).

  3. Install the drone plugin in GazeboRosPackages, following these instructions: https://github.com/RAFALAMAO/hector_quadrotor_noetic (if you have the Docker install, you need to do this step in the backend container).

  4. To control the drone, you need to add a namespace in your plugin: in the package hector_quadrotor_noetic/hector_quadrotor/hector_quadrotor_controller, add the namespace quadrotor before the namespace controller in controller.yaml in the params folder. In the launch folder, replace the controller.launch content with:
<launch>
  <rosparam file="$(find hector_quadrotor_controller)/params/controller.yaml" />

  <node name="controller_spawner" ns="quadrotor" pkg="controller_manager" type="spawner" respawn="false" output="screen" args="controller/twist"/>
</launch>
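For reference, the namespace change described above might look like this in params/controller.yaml. This is only a sketch: the controller entries and the TwistController type are assumptions based on the hector_quadrotor controller package, so check them against your own copy of the file.

```yaml
# Hypothetical sketch: wrap the existing controller entries in a
# "quadrotor" namespace so the spawner (ns="quadrotor") can find them.
quadrotor:
  controller:
    twist:
      type: hector_quadrotor_controller/TwistController
```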

Don’t forget to build and source the drone plugin you installed.

You also need to import the drone model into the NRP: extract quadrotor_model.zip in NRP/Models and execute the copy-to-storage.sh script, which is in the same folder. If you have the Docker install, you must do this in the backend container; you must also go into the frontend container, go to NRP/nrpBackendProxy, and run: npm run update_template_models.

The drone is static by default; you can change this and other properties (including the DVS camera parameters) in quadrotor/model.sdf in the experiment. The drone can be controlled with Twist topics and needs to be raised first to engage the motors (the z linear component of the Twist message has to be positive).
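As a ROS-free illustration of that engage-then-move logic, here is a small sketch in plain Python (no NRP or ROS APIs; the function name and the 5-second engage window are illustrative assumptions, not part of the shipped experiment):

```python
def twist_for(t):
    """Return (linear_z, angular_z) for simulation time t, in seconds.

    For the first few seconds we command a positive linear z to engage
    the motors; afterwards we hold altitude and yaw in place.
    """
    engage_window = 5.0  # assumed engage duration, in seconds
    if t < engage_window:
        return (2.0, 0.0)  # positive z: climb to engage the motors
    return (0.0, 1.0)      # motors engaged: stop climbing, start yawing

print(twist_for(2.0))  # (2.0, 0.0)
print(twist_for(6.0))  # (0.0, 1.0)
```

In a real transfer function the two returned values would fill the linear.z and angular.z fields of a geometry_msgs/Twist message published on the drone's cmd_vel topic.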

droneDVS.pdf (58.5 KB)
quadrotor_model.pdf (245.2 KB)


#7

Hi Axel,

Thank you so much for your quick and thorough response. I will work with all of this when I get a chance. Hopefully, when I get to this task, I can reach back out for any clarification.

Thanks again for this; it is super helpful and you are very kind.


#8

Looks like everything installed and builds smoothly.

However, I believe I am doing something wrong when uploading the quadrotor_model.

  1. I extract the quadrotor_model.zip file in the nrp/Models dir in the backend container

  2. I execute the copy-to-storage.sh script

  3. I execute npm run update_template_models in the frontend container

  4. I restart both containers

The Templates UI does not display the drone model. I also noticed that when I executed update_template_models in the frontend container, it did not add the drone model.

Anyway, I tried to work around this by manually uploading the model from the UI. However, when I execute the simulation, it says:

"Error loading some assets

quadrotor::base_link::base_link_fixed_joint_lump__sonar_link_visual_1

quadrotor::base_link::base_link_visual

Some assets have not been found!
"

Do you have any suggestions?

Thank You


#9

Hi,

Looks like you need more in-depth help. Please post your last message as a ticket at
https://support.humanbrainproject.eu/
(Neurorobotics section)
and we will help you more directly.

best


#10

Hi,

We have written a much more detailed guide on how to install this experiment; see attached. It also works in Docker installations: just connect to your backend container.

Thanks for your ticket in the support channel. Try this guide and post here if it does not work for you.

Best regards

Hector_Quadrotor_NRP (1).pdf (94.2 KB)


#12

Hi there,

Thanks for the PDF. I was able to get the model uploaded correctly and load the simulation environment.

I’m now trying to move the drone.

However, I think I’m running into a permissions issue.

Edit: To clarify, sending the publish command in the built-in ROS terminal works fine; it is the TF editor from which I cannot seem to publish. Additionally, the display_rendered_full video stream is not showing any event data; it looks exactly like the /quadrotor/classic/camera stream.


#13

Hi naswani,
your transfer function is not correct. You can get some inspiration from this transfer function that I am using with the drone:

from geometry_msgs.msg import Twist, Vector3

@nrp.MapRobotPublisher('straight_trajectory', Topic('/quadrotor/cmd_vel', Twist))
@nrp.Neuron2Robot()
def straight_trajectory(t, straight_trajectory):
    # Go up first to engage the motors
    if t < 5:
        new_cmd = Twist(linear=Vector3(x=0.0, y=0.0, z=2.0),
                        angular=Vector3(x=0.0, y=0.0, z=0.0))
    else:
        new_cmd = Twist(linear=Vector3(x=0.0, y=0.0, z=0.0),
                        angular=Vector3(x=0.0, y=0.0, z=1.0))
    straight_trajectory.send_message(new_cmd)

Concerning the DVS plugin, one issue might be the very high threshold of the event camera. You can lower it in quadrotor/model.sdf in the experiment, in the corresponding threshold tag; I would say 10 is a good standard value. If it displays only the classic camera stream, it may be because of wrongly named topics, in the model.sdf or in the transfer function. Let me know if that helps or if you have other questions.
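For orientation, the threshold lives in the DVS plugin block of quadrotor/model.sdf. The fragment below is only a hedged sketch: the plugin filename and the eventThreshold tag are assumptions based on the HBP gazebo_dvs_plugin, so verify the actual names in your model.sdf.

```xml
<!-- Hypothetical sketch of the DVS plugin block in quadrotor/model.sdf -->
<plugin name="dvs_camera" filename="libgazebo_dvs_plugin.so">
  <cameraName>dvs</cameraName>
  <eventThreshold>10</eventThreshold>  <!-- lower value: more events -->
</plugin>
```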
Jules


#14

Ticket 285 has been implemented and is waiting for deployment.


#15

Ticket 285 is closed.