Indoor Autonomous Flight with Cube, ROS and Aruco Board (Ongoing)

Edit 24 Aug 2021: I’m no longer working on this, because I realized the approach requires the Aruco board to be constantly in sight; otherwise the copter loses its position data. This is not feasible, as aruco markers would need to be placed on the ground everywhere the copter flies. The flying altitude is also limited, since the camera may lose sight of the markers due to poor focusing.

Hi, does anyone remember the indoor autonomous flight tutorial in ardupilot.org ?

It was done with a Raspberry Pi 3 and a Pixhawk1. I am rewriting the tutorial with newer hardware: a Pi 3B and the Cube. I hope my tutorial is easy enough to follow that anyone can set it up. No more tutorials inside another tutorial and confusing hyperlinks!

Remark:
In this setup, a dummy GPS is set and the aruco marker board is used for positioning. This means the GPS failsafe is never actually disabled. If the camera loses sight of the aruco board for too long, the positioning system will be turned off, and you will need to restart the Cube to recover. Therefore, I suggest using a small/tiny drone for testing. I am working on possible solutions, such as adding more aruco markers on the ground.

Contents:
1. Hardware Setup and ROS Installation / Using the ROS in Pi
2. SSH Setup in Companion Computer and Remote Computer
3. Time Synchronization / Pi Camera and Cube Parameter Setup
4. Install Necessary Packages in ROS
5. Prepare Your Aruco Board / Camera Calibration
6. Visualization with RVIZ / Robot model
7. Program the Autonomous Path


Install ROS on Raspberry Pi

This tutorial is based on ardupilot.org

Setup is as shown below:

Components:
FMU: The Cube Black
Companion Computer: Raspberry Pi 3B
System: Ubuntu Mate 16.04
Camera: Pi Camera v2
SD card: At least 8 GB
Power Source for Companion Computer: 5000 mAh Portable USB Power Bank

You will need a keyboard, a mouse, and an HDMI monitor to set up your Pi.

Prepare the Operating System on the Pi

You can find the detailed installation procedure here, but it is troublesome.

You may directly download a working image at https://www.raspberrypi.org/forums/viewtopic.php?p=1295937#p1295937

Use SD Card Formatter to clear everything on your card, then burn the image with balenaEtcher.

Plug the SD card into your Pi and turn it on. Follow the instructions to finish all initial settings.

Install ROS

Detailed instructions can be found on ROS.org

Setup your computer to accept software from packages.ros.org.

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'

Set up your keys

sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 421C365BD9FF1F717815A3895523BAEEB01FA116

Make sure your Debian package index is up-to-date

sudo apt-get update
sudo apt-get upgrade

Install ROS Desktop-Full

sudo apt-get install ros-kinetic-desktop-full 

Initialize rosdep

sudo rosdep init
rosdep update

Environment setup

echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
source ~/.bashrc

Install dependencies for building packages

sudo apt install python-rosinstall python-rosinstall-generator python-wstool build-essential

Install MAVROS

Detailed instructions can be found here

Or you may type these commands in the terminal to install

sudo apt-get install ros-kinetic-mavros ros-kinetic-mavros-extras
wget https://raw.githubusercontent.com/mavlink/mavros/master/mavros/scripts/install_geographiclib_datasets.sh
chmod a+x install_geographiclib_datasets.sh
./install_geographiclib_datasets.sh

Install RQT tools

sudo apt-get install ros-kinetic-rqt ros-kinetic-rqt-common-plugins ros-kinetic-rqt-robot-plugins

Install catkin tools

sudo apt-get install python-catkin-tools

Setup the Cube

Connect the Cube to a ground station (e.g. Mission Planner). Go to Config/Tuning > Full Parameter List and modify the following parameters:

SERIAL2_PROTOCOL = 1	(enable MAVLink on the serial port)
SERIAL2_BAUD = 57	(57600 baud; the rate can be up to 921600)
LOG_BACKEND_TYPE = 3	(enable dataflash log files)

Connect your Raspberry Pi to the Cube

There are 2 ways to connect them.

Connect via USB

Simply connect the Raspberry Pi and the Cube with a USB-to-micro-USB cable. The Raspberry Pi should recognize the device automatically.

Connected USB devices can be checked by running the following command in a terminal:

# List connected USB devices with their /dev names and serial descriptions
for sysdevpath in $(find /sys/bus/usb/devices/usb*/ -name dev); do
    (
        syspath="${sysdevpath%/dev}"                    # strip trailing /dev
        devname="$(udevadm info -q name -p $syspath)"   # e.g. ttyACM0
        [[ "$devname" == "bus/"* ]] && continue         # skip bus entries
        eval "$(udevadm info -q property --export -p $syspath)"
        [[ -z "$ID_SERIAL" ]] && continue               # skip devices without a serial ID
        echo "/dev/$devname - $ID_SERIAL"
    )
done

Note the tty port of the FMU; it will be used later.

Connect via GPIO

Modify one of the telemetry cables from the Cube:

(image: cube_and_pi)

IT IS NOT RECOMMENDED to connect the 5V pin of the Cube to the Pi, as the Cube is not a proper power source. But if you still want to, connect them as shown in the image.

On the Raspberry Pi 3B and newer models, the Bluetooth module occupies the UART serial port. To disable Bluetooth, type the following command in the terminal:

sudo nano /boot/config.txt

Then, add the following lines at the end of config.txt:

dtoverlay=pi-disable-bt
enable_uart=1

Type

sudo nano /boot/cmdline.txt

Modify the file to the following:

dwc_otg.lpm_enable=0 console=tty1 root=/dev/mmcblk0p2 rootfstype=ext4 elevator=deadline fsck.repair=yes rootwait quiet splash plymouth.ignore-serial-consoles

Type

sudo raspi-config

Select Advanced Options > Interfacing Options. Disable the serial login shell and keep the serial interface enabled.

The Cube will not be recognized as a USB device. The port can be checked by typing the following command in a terminal:

ls /dev/tty*

/dev/ttyS0 or /dev/ttyAMA0 should be used to communicate with the Cube. In my case, it is /dev/ttyS0.


Run ROS on Raspberry Pi

This tutorial is based on ardupilot.org

Baud Rate Selection

The Raspberry Pi can communicate with the Cube at up to 921600 baud. However, sometimes ROS does not receive a heartbeat after launch. This may be caused by the inaccurate clock on the Raspberry Pi; selecting a slower baud rate solves the problem. 57600 baud is the most stable and should be enough for general purposes.

Launch ROS

Open a terminal. Type:

roslaunch mavros apm.launch fcu_url:=/dev/ttyS0:57600

You may change the launch settings according to your own setup:

roslaunch mavros apm.launch fcu_url:=<tty port>:<baud rate>

If you connected the Cube and Pi via UART, there is a chance the terminal shows No Heartbeat Packet Received. This may be because the Bluetooth service has not been disabled. You can disable Bluetooth completely by typing sudo systemctl disable hciuart in the terminal.

If permission denied is shown after roslaunch, grant access to your connected tty port:

sudo chmod 666 /dev/<tty port>

Set Stream Rate to Enable Data Stream

Open a new terminal. Type:

rosservice call /mavros/set_stream_rate 0 10 1

You can set it in this format according to your application:

rosservice call /mavros/set_stream_rate 0 <frequency in Hz> 1

ROS Topics

Type the following in the terminal to see the currently available topics:

rostopic list

To see the data in a topic, type:

rostopic echo <topic>

Set Default Connection Port

Open a terminal and type:

roscd mavros
cd launch
sudo nano apm.launch

Find the line <arg name="fcu_url" default="/dev/ttyS0:57600" /> and modify it according to your need in appropriate format:

default="<tty port>:<baud rate>"

After setting, the launch command can be simplified to

roslaunch mavros apm.launch

Just curious: why mavlink1? The original setup was using mavlink2.

Do you mean SERIAL1_PROTOCOL and SERIAL2_PROTOCOL?

The original setup is mavlink1

SSH Setup in Companion Computer and Remote Computer

This tutorial is based on ardupilot.org

SSH is necessary; otherwise, ROS and the programs on the Raspberry Pi cannot be observed during flight.

First, connect both computers to the same LAN (e.g. the same WiFi network).

Setup

In Your Raspberry Pi

Check the IP address of the device by opening a terminal and typing:

ifconfig 

Install OpenSSH Server

sudo apt-get install openssh-server openssh-client

Enable SSH service on Raspberry Pi

raspi-config

then select the option to enable SSH

In Your Remote Computer

Check the IP address of the device by opening a terminal and typing:

ifconfig 

Install OpenSSH Server

sudo apt-get install openssh-server openssh-client

Check the Connection

In Your Raspberry Pi

Open a terminal, type:

ping <ip address of remote computer>

In Your Remote Computer

Open a terminal, type:

ping <ip address of Raspberry Pi>

Define Name for Devices

Open a terminal, type:

sudo nano /etc/hosts

Then, type the IP address and hostname into the file in the format <ip address> <hostname>:

192.168.xx.xxx pi-name

This way, a name can be used in place of the IP address when using the SSH service.
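As a runnable sketch of the step above (the address and hostname are placeholders, and a temporary file stands in for /etc/hosts so the sketch is safe to run anywhere):

```shell
# Append an "<ip address> <hostname>" mapping; a temp file stands in
# for /etc/hosts here for illustration.
HOSTS=$(mktemp)
echo "192.168.10.23 pi-name" >> "$HOSTS"
grep "pi-name" "$HOSTS"   # confirm the entry is present
```

On the Pi itself, the same line would be appended with echo "192.168.10.23 pi-name" | sudo tee -a /etc/hosts.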

Connect the Raspberry Pi

In Your Remote Computer

Open a terminal, type:

ssh <user name>@<device name>

OR

ssh <user name>@<ip address of Pi>

If login is successful, resources on the Raspberry Pi will be accessible from the remote computer.

Type exit to the terminal to exit the SSH server.

Note that you need to log in again in every new terminal.


Clock/Time synchronization

Confirm Setting is Correct on Raspberry Pi

Open a terminal, type:

roscd mavros
cd launch
nano apm_config.yaml

Confirm that the connection and time settings are the same as the following:

conn:
  timesync_rate: 10.0
  system_time_rate: 1.0

time:
  timesync_mode: MAVLINK

With this, the Raspberry Pi publishes the system time once every second and runs timesync at 10 Hz.

Ensure the sys_time plugin is installed in mavros:

roscd mavros
nano mavros_plugins.xml

Connect the Cube to a ground station (e.g. Mission Planner). Go to Config/Tuning > Full Parameter List and modify the following parameter:

BRD_RTC_TYPES = 2	(Mavlink system time)

If there is no BRD_RTC_TYPES in Mission Planner, update your Mission Planner to a newer version.

Set Time Interval with Mavproxy

Time interval can also be set by Mavproxy if it is installed on the Raspberry Pi.

Connect to Mavproxy and type:

module load system_time

Check current time of the device by typing:

time 

Set update interval by typing:

system_time set interval <sec>

Pi Camera and Cube Parameter Setup

Enable the Pi Camera

Detailed procedure can be found here

Connect your Camera to the camera port at the centre of Raspberry Pi

Open a terminal, type:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install raspi-config rpi-update

Type

sudo raspi-config

then select Enable Camera

Open a terminal and type:

sudo nano /boot/config.txt

Ensure the settings are the same as below:

start_x=1
gpu_mem=128

Run

sudo rpi-update

Test the Camera

Open a terminal and type:

raspivid -p 0,0,640,480 -t 0

A 640x480 window should pop up at the top-left corner showing the real-time image from the camera.

Mount the Camera

You are supposed to mount your camera in the default orientation:
(image: Pi-Camera-front)

If you need to mount the camera in a different orientation, you may modify the parameters in a GUI. This is mentioned in the part Run all Components.

Since the system calculates the position of the camera, not the aircraft itself, it is recommended to mount the camera at the bottom, facing the ground.

Setup the Cube

Connect the Cube to a ground station (e.g. Mission Planner). Go to Config/Tuning > Full Parameter List and modify the following parameters:

AHRS_EKF_TYPE = 2
BRD_RTC_TYPES = 2
EK2_ENABLE = 1
EK3_ENABLE = 0
EK2_GPS_TYPE = 3
EK2_POSNE_M_NSE = 0.1
EK2_VELD_M_NSE = 0.1
EK2_VELNE_M_NSE = 0.1
GPS_TYPE = 0
COMPASS_USE = 0
COMPASS_USE2 = 0
COMPASS_USE3 = 0
SERIAL2_BAUD = 57		(the serial port used to connect to Raspberry Pi)
SERIAL2_PROTOCOL = 1
SYSID_MYGCS = 1			(to accept control from mavros)
VISO_TYPE = 0

Install Necessary Packages in ROS

Install raspicam_node

Detailed procedure can be found here

Installation

Add source to apt lists

sudo sh -c 'echo "deb https://packages.ubiquityrobotics.com/ubuntu/ubiquity xenial main" > /etc/apt/sources.list.d/ubiquity-latest.list'

Add signing key to trusted list

sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key C3032ED8

Run

sudo apt-get update

Build catkin_ws

Detailed procedure can be found here
Jump to the next section if you have already installed catkin_ws.

Install pre-built package

sudo apt-get install ros-kinetic-catkin

Source your environment

source /opt/ros/kinetic/setup.bash

Create a catkin workspace

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make

Source your workspace

nano ~/.bashrc

Add source ~/catkin_ws/devel/setup.bash at the bottom of the file.
OR
Run source devel/setup.bash in the terminal every time before running any package in catkin_ws.

Build from Source

Go to your catkin_ws

cd ~/catkin_ws/src

Download the source for raspicam_node

git clone https://github.com/UbiquityRobotics/raspicam_node.git

Create recognizable file for ROS dependencies

sudo touch /etc/ros/rosdep/sources.list.d/30-ubiquity.list
sudo nano /etc/ros/rosdep/sources.list.d/30-ubiquity.list

Add the following line to the file

yaml https://raw.githubusercontent.com/UbiquityRobotics/rosdep/master/raspberry-pi.yaml

Run rosdep update

Install the ros dependencies

cd ~/catkin_ws
rosdep install --from-paths src --ignore-src --rosdistro=kinetic -y

Compile the code

catkin_make

Run the Node to Confirm Successful Installation

Most Pi cameras on the market today should be Pi Camera v2.

For a V2.x camera, run roslaunch raspicam_node camerav2_1280x960.launch

For a V1.x camera, run roslaunch raspicam_node camerav1_1280x720.launch

Install OpenCV

Detailed procedure can be found here. You may check the estimated time for each step in the link.
This part has been tested working on the 3B and 3B+ only.

Warning: Compiling OpenCV is a CPU-intensive task. Make sure your Raspberry Pi has heatsinks and a fan (or place a powerful external fan next to it) to avoid overheating. The Raspberry Pi will not die from overheating, but overheating throttles its CPU, potentially increasing the build time.

The Raspberry Pi may stop responding several times. Reboot the Pi to let it continue its work.

Make sure you have enough time to stick with the Raspberry Pi.

Confirm OS is up-to-date

sudo apt-get update
sudo apt-get upgrade

Install useful utilities

sudo apt-get install screen
sudo apt-get install htop

Install dependencies

sudo apt-get install build-essential cmake pkg-config
sudo apt-get install libjpeg-dev libtiff5-dev libjasper-dev libpng12-dev
sudo apt-get install libavcodec-dev libavformat-dev libswscale-dev libv4l-dev
sudo apt-get install libxvidcore-dev libx264-dev
sudo apt-get install libgtk2.0-dev libgtk-3-dev
sudo apt-get install libatlas-base-dev gfortran

Install Python 3

sudo apt-get install python3-dev

Install pip3

sudo apt-get install python3-pip

Get OpenCV 3.2.0 source code

wget -O opencv.zip https://github.com/opencv/opencv/archive/3.2.0.zip
wget -O opencv_contrib.zip https://github.com/opencv/opencv_contrib/archive/3.2.0.zip
unzip opencv.zip
unzip opencv_contrib.zip

Install Numpy, Scipy

sudo pip3 install numpy scipy

Compile OpenCV

cd ~/opencv-3.2.0/
mkdir build
cd build

cmake -D CMAKE_BUILD_TYPE=RELEASE \
    -D CMAKE_INSTALL_PREFIX=/usr/local \
    -D INSTALL_PYTHON_EXAMPLES=ON \
    -D OPENCV_EXTRA_MODULES_PATH=~/opencv_contrib-3.2.0/modules \
    -D BUILD_EXAMPLES=ON ..

Build OpenCV

The Raspberry Pi will overheat if there is no proper cooling. Reboot the Pi and run the command again if it overheats; the build progress is kept after a reboot.

Use all 4 cores to build

make -j4

Once OpenCV builds successfully, continue the installation:

sudo make install
sudo ldconfig
sudo apt-get update

Reboot the Raspberry Pi after installation

sudo reboot

Confirm Successful Installation

Open a terminal, type:

python3

Then run import cv2 and cv2.__version__. The version of OpenCV will be shown.

Install aruco_gridboard

Detailed procedure can be found here

If the OpenCV extra modules failed to build, rebuild them with the following commands:

cd <opencv_build_directory>
cmake -DOPENCV_EXTRA_MODULES_PATH=<opencv_contrib>/modules <opencv_source_directory>
make -j4

Rebuild vision_opencv from source

Clone vision_opencv into your catkin_ws/src and rebuild everything

cd ~/catkin_ws/src
git clone https://github.com/ros-perception/vision_opencv.git
cd ..
catkin_make -DOpenCV_DIR=/path-to-build-opencv-3.2

If building failed, try

catkin_make -DOpenCV_DIR=/path-to-build-opencv-3.2 -DENABLE_PRECOMPILED_HEADERS=OFF

Install these dependencies if any of them are missing during the build:

sudo apt-get install ros-kinetic-cv-bridge
sudo apt-get install ros-kinetic-vision-opencv

Check the cv_bridge version

Open a terminal and type:

cd ~/catkin_ws/devel/lib
ldd libcv_bridge.so | grep opencv

A result similar to the following should be shown (paths will differ on your system):

libopencv_core.so.3.2 => /home/jokla/Software/opencv-3.2.0/build/lib/libopencv_core.so.3.2 (0x00007f7acb240000)
libopencv_imgcodecs.so.3.2 => /home/jokla/Software/opencv-3.2.0/build/lib/libopencv_imgcodecs.so.3.2 (0x00007f7acaffe000)
libopencv_imgproc.so.3.2 => /home/jokla/Software/opencv-3.2.0/build/lib/libopencv_imgproc.so.3.2 (0x00007f7ac97ca000)

Clone aruco_gridboard

I modified a lot of things starting from here. If you want to know what I have done, follow my instructions. If you just want it to work on your system, clone my fork for simplicity.

Simple build:

cd ~/catkin_ws/src
git clone https://github.com/AlphenLai/aruco_gridboard.git
cd ..
catkin_make
source devel/setup.bash

Then skip to the next section.

Detailed build:

Clone it into your catkin_ws and build with catkin_make

cd ~/catkin_ws/src
git clone https://github.com/jokla/aruco_gridboard.git
cd ..
catkin_make
source devel/setup.bash

The catkin_make should fail at this point.

Go to ~/catkin_ws/src/aruco_gridboard/data/detector_params.yml and ~/catkin_ws/src/aruco_gridboard/src/node.cpp.
Open them with a text editor and replace cornerRefinementMethod with doCornerRefinement.
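If you prefer not to edit the two files by hand, the rename can be scripted with sed. This is a sketch; the paths assume the default clone location from the steps above:

```shell
# Replace every cornerRefinementMethod with doCornerRefinement in both files
for f in ~/catkin_ws/src/aruco_gridboard/data/detector_params.yml \
         ~/catkin_ws/src/aruco_gridboard/src/node.cpp; do
    sed -i 's/cornerRefinementMethod/doCornerRefinement/g' "$f"
done
```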

Confirm Successful Installation

Add the new OpenCV to the path

cd ~/catkin_ws
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/jokla/Software/opencv-3.2.0/build/lib/

Launch the detection node

roslaunch aruco_gridboard detection.launch

Prepare the Aruco Board

This tutorial is based on ardupilot.org

Remark

The example grid board assumes you are flying a tiny drone. Otherwise, please create your own board.

Check the IDs You Need

Open the file
~/catkin_ws/src/aruco_gridboard/data/layout-my.yaml

and you will see the following configuration:

ids:
 - 104
 - 111
 - 106
 - 105
 - 101
 - 110
 - 103
 - 102
 - 107
 - 109
 - 100
 - 108

These 12 IDs represent the aruco tags that will be recognized by the system. Edit them if you want to work on a customized board.

Download the Board

The aruco board can be downloaded from here if you do not want to modify it.

Make Your Own Board

Go to
~/catkin_ws/src/aruco_gridboard/src

Compile create_board.cpp with:

g++ -ggdb create_board.cpp -o createboard `pkg-config --cflags --libs opencv`

Then, run createboard to generate the image file and content of yaml file.

./createboard

A message will list all necessary input parameters.

Parameters
w  number of markers in the x direction
h  number of markers in the y direction
l  length of a marker (in pixels)
s  space between markers (in pixels)
d  type (dictionary) of marker
f  ID of the first marker (not available if you are not using my fork)

Example input:

./create_board_wID board.jpg -w=10 -h=6 -l=700 -s=140 -d=16 -f=100

An image board.jpg will be created. Copy the output from the terminal and create a yaml file. A yaml file layout_<markers in x direction>x<markers in y direction>_<markerLength>.yaml will be created automatically if my fork was used.

Put the new yaml file in ~/catkin_ws/src/aruco_gridboard/data
Open detection_rpicam.launch and change the parameter

<param name="board_path" value="$(find aruco_gridboard)/data/layout-my.yaml" />

to your new yaml file.

Print your image.

Remark

The size of your grid board determines the operating range of your aircraft. As mentioned before, the grid board acts as a mock GPS system. If the board is too small, the aircraft will easily lose sight of the markers. It is recommended to make the markers big enough for your aircraft to see at its flight altitude. At least 4 markers should be visible to the aircraft at all times.
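To relate the pixel parameters to a physical print size: a hypothetical example, taking the 700-pixel marker length from the example above and assuming a 300 DPI printout (the DPI value is my assumption, not from the tutorial):

```shell
# Physical marker length in mm = pixels / DPI * 25.4
MARKER_PX=700   # the -l value from the example above
DPI=300         # assumed print resolution
awk -v px="$MARKER_PX" -v dpi="$DPI" 'BEGIN { printf "%.1f mm\n", px / dpi * 25.4 }'
# → 59.3 mm
```

Scale the pixel parameters (or the print DPI) until the resulting marker size suits your flight altitude.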


Camera Calibration

This tutorial is based on ROS.org

Prepare the checkerboard

Download the checkerboard here

The checkerboard has known dimensions: 108 mm squares. To print it on an A4-sized paper, open the printing options dialog before printing and adjust the percentage size. I shrank the checkerboard to 20% of its original size, so the squares became 21.6 mm.
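The scaling is a simple proportion; as a quick check of the numbers above:

```shell
# 108 mm squares printed at 20% of their original size
awk 'BEGIN { printf "%.1f mm\n", 108 * 20 / 100 }'   # → 21.6 mm
```

Whatever scale you choose, measure a printed square and pass that value to the calibrator's --square option (in metres).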

Calibration

Make sure raspicam_node has been installed on the Raspberry Pi.

Launch the ROS node and enable raw images:

roslaunch raspicam_node camerav2_1280x960_10fps.launch enable_raw:=true

Run the Calibration Program

rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0216 image:=/raspicam_node/image camera:=/raspicam_node

or

rosrun camera_calibration cameracalibrator.py --size 8x6 --square <square size> image:=/raspicam_node/image camera:=/raspicam_node

if you have a customized checkerboard

Remark

If camera_calibration is not found, run:
sudo apt-get install ros-kinetic-image-pipeline
You will need to remake raspicam_node after running the command.

A calibration window highlighting the checkerboard will pop up. Rotate, tilt, and move the checkerboard slowly until the X, Y, and Size bars are filled up. When the Calibrate button lights up, press it. The window will freeze, but it is working; wait for it patiently.

Something like this will be shown in the terminal:

D =  [-0.33758562758914146, 0.11161239414304096, -0.00021819272592442094, -3.029195446330518e-05]
K =  [430.21554970319971, 0.0, 306.6913434743704, 0.0, 430.53169252696676, 227.22480030078816, 0.0, 0.0, 1.0]
R =  [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
P =  [1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0]
# oST version 5.0 parameters


[image]

width
640

height
480

[narrow_stereo/left]

camera matrix
430.215550 0.000000 306.691343
0.000000 430.531693 227.224800
0.000000 0.000000 1.000000

distortion
-0.337586 0.111612 -0.000218 -0.000030 0.0000

rectification
1.000000 0.000000 0.000000
0.000000 1.000000 0.000000
0.000000 0.000000 1.000000

projection
1.000000 0.000000 0.000000 0.000000
0.000000 1.000000 0.000000 0.000000
0.000000 0.000000 1.000000 0.000000

Switch back to the calibration window and click commit to save the result.

Remark

If Segmentation fault is shown after commit, check whether this file exists:

	~/.ros/camera_info/camerav2_1280x960.yaml

ROS needs to be re-launched to apply the calibration data.

Troubleshooting

If package not found is shown, run:

sudo apt-get install ros-kinetic-rqt-reconfigure

If found no plugin matching is shown, run:

rqt --force-discover

and select the reconfigure in the pop up window


Okay I was just checking

Another page on ardupilot.org tells you to set mavlink1.

Both of them should work; it is an option. I have tested that both mavlink1 and mavlink2 work.


Run all Components

Correct Some Codes Before First Run

If you cloned the simple build, there is no need to edit the file.

Edit the file ~/catkin_ws/src/aruco_gridboard/launch/detection_rpicam.launch

Change the line

<param name="camera_info_url" value="package://raspicam_node/camera_info/camerav2_640x480.yaml"/>

to

<param name="camera_info_url" value="package://raspicam_node/camera_info/camerav2_1280x960.yaml"/>

This is because the camera was calibrated at a resolution of 1280x960.

Start all ROS Nodes

Connect the aircraft to ground station (eg: Mission Planner) on your remote computer.

Switch back to the Raspberry Pi. Use SSH to log in to the Raspberry Pi in every terminal if you are not testing on the ground.

Launch raspberry pi camera node module:

roslaunch aruco_gridboard detection_rpicam.launch

Launch mavros module:

sudo chmod 666 <tty port>
roslaunch mavros apm.launch

OR

sudo chmod 666 <tty port>
roslaunch mavros apm.launch fcu_url:=<tty port>:<baud rate>

if you did not set a default communication port.

Wait for EKF2 to be ready, until Request for home position shows up. Open a new terminal and run:

rosservice call /mavros/set_stream_rate 0 10 1
rosrun aruco_gridboard set_origin.py

to set the data rate and the origin point of your drone. After the python script starts, the aircraft should appear to be located somewhere in Italy.

For the first time or if you want to configure the camera, open a new terminal and run:

rosrun rqt_reconfigure rqt_reconfigure

A window will show up where camera parameters can be modified. For example, select these 2 checkboxes as shown below if you mounted the camera upside down.

Open a new terminal, run:

rosrun rviz rviz -d catkin_ws/src/aruco_gridboard/data/aruco_grid.rviz

RVIZ should open, and two 3-axis poses should appear in the middle of the screen. They represent the /local_position/pose and vision_position/pose topics, i.e. the position relative to the aruco board.

Go to the top-left corner and find the image tab. Change the topic to /image/raw. The real-time image from the camera will be shown in the bottom-left corner.

You will see that vision_position/pose changes according to the angle of the aruco board observed by the camera. local_position/pose corresponds to the EKF estimate on the Cube; because of the EKF, local_position/pose will not change immediately but will slowly converge to vision_position/pose.

There will be a small delay between the actual movement and the pose. The delay can be reduced by increasing the stream rate, but please consider the processing power of the Raspberry Pi.

The vision_position/pose may not be perfectly level even if the camera is level. The problem is probably in the image-processing part; it is still functional anyway.


Load Robot Model to RVIZ

If you want to learn more about URDF model, please visit ROS.org

Install URDF_tutorial Package

Open a terminal and run:

sudo apt install ros-kinetic-urdf-tutorial

Test if Successfully Installed

Go to the package location

roscd urdf_tutorial

Run the default URDF

roslaunch urdf_tutorial display.launch model:=urdf/01-myfirst.urdf

You should be able to see a red cylinder at the middle of RVIZ.

Add Necessary Files

If you cloned the simple build, skip to the next section.

Go to my github and clone the following files/folders to the corresponding folders in ~/catkin_ws/src/aruco_gridboard/

data/aruco_grid_model.rviz
launch/detection_rpicam_model.launch
meshes
script/pose_to_tf.py
urdf

Prepare Robot Model

A robot model can be built directly in the urdf file, or a mesh file can be loaded into the urdf file. If you are using a Raspberry Pi, it probably will not have enough processing power to load a mesh file.

You can read the urdf_tutorial on ROS.org or look into ~/catkin_ws/src/aruco_gridboard/urdf/test.urdf to learn how urdf works.

You may also download a mesh file, such as .STL, from other websites and load into your urdf file.

To do this, put your mesh file in ~/catkin_ws/src/aruco_gridboard/meshes

Then, change the content of ~/catkin_ws/src/aruco_gridboard/urdf/aircraft.urdf to:

<?xml version="1.0"?>
<robot name="aircraft">
  <link name="robot_link">
	<visual>
      <geometry>
		<mesh filename="file://$(arg meshDirectory)part1.STL"/>
      </geometry>
    </visual>
  </link>
</robot>

where “part1.STL” is the file name of your mesh file.
~/catkin_ws/src/aruco_gridboard/urdf/aircraft.urdf and ~/catkin_ws/src/aruco_gridboard/meshes/20_min_drone.STL were drawn by me. You may use them as defaults.

Solidworks

If you used Solidworks to draw the STL file, put it in the meshes folder and run:

sed -i 's/^solid/robot/' *

before using it.
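The sed above rewrites a leading "solid" keyword (which is how ASCII STL files begin, and which can confuse mesh loaders) to "robot". A self-contained demonstration on a throwaway file, with a hypothetical part name:

```shell
# Minimal stand-in for an STL header, just to show the substitution
f=$(mktemp)
printf 'solid part1\nendsolid part1\n' > "$f"
sed 's/^solid/robot/' "$f" | head -n 1   # → robot part1
```

Note that the ^ anchor only touches lines starting with "solid", so the "endsolid" footer line is left intact.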

Run Everything

Connect the aircraft to ground station (eg: Mission Planner) on your remote computer.

Switch back to the Raspberry Pi. Use SSH to log in to the Raspberry Pi in every terminal if you are not testing on the ground.
Open a new terminal, run:

roslaunch aruco_gridboard detection_rpicam_model.launch

Open a new terminal, run:

sudo chmod 666 <tty port>
roslaunch mavros apm.launch

OR

sudo chmod 666 <tty port>
roslaunch mavros apm.launch fcu_url:=<tty port>:<baud rate>

if you did not set a default communication port.

Wait for EKF2 to be ready, until Request for home position shows up. Open a new terminal, run:

rosservice call /mavros/set_stream_rate 0 10 1
rosrun aruco_gridboard set_origin.py

After the python script starts, the aircraft should appear to be located somewhere in Italy.

Convert /mavros/local_position/pose to the tf datatype:

rosrun aruco_gridboard pose_to_tf.py

Open a new terminal, run:

rosrun rviz rviz -d catkin_ws/src/aruco_gridboard/data/aruco_grid_model.rviz

A robot model will be bound to the 3-axis pose of /mavros/local_position/pose.