Avoidance!

Hello,
I wanted to start a thread where we can share cool ways of adding obstacle avoidance to drones built around the Cube. The Cube is an absolutely awesome flight controller that puts DJI's FCs to shame. The only problem is that it needs the same capabilities as the DJI, and to get there we need to build some very good avoidance software and sensor systems.

This isn't very formal, but I have been using DroneKit and a stereo camera as my method for avoidance.
I haven't used many deep learning techniques, but if anyone has, SPEAK UP!
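For anyone curious about the stereo approach, here's a minimal sketch (not my actual flight code, just the standard stereo geometry) of how a disparity measurement turns into an obstacle distance. The focal length and baseline values below are made-up examples:

```python
# Hypothetical sketch: the basic stereo geometry behind obstacle ranging.
# depth (m) = focal_length (px) * baseline (m) / disparity (px)

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a stereo disparity in pixels to a depth in meters."""
    if disparity_px <= 0:
        return float("inf")  # no match found -> treat as infinitely far
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm baseline, 100 px disparity
print(disparity_to_depth(100, 700, 0.10))
```

In practice you'd take the minimum depth over a region of the disparity map in front of the drone and feed that into your avoidance logic.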

I have seen some people with lidar systems, and that isn't a bad idea, but I think it requires too many lidars, and the 360° lidars seem to be expensive without being particularly high quality. Prove me wrong?

Anyway, please let me know your ideas and help everyone make some amazing drones!!!

2 Likes

There is already work on object avoidance in ArduPilot.

Check this out:
https://ardupilot.org/dev/docs/code-overview-object-avoidance.html
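For anyone following that link: the built-in avoidance is configured almost entirely through parameters. Roughly like this (parameter names and legal values vary by firmware version, so verify against the ArduPilot parameter docs before copying):

```text
AVOID_ENABLE  7   # enable all configured avoidance sources
AVOID_MARGIN  2   # meters to keep clear of obstacles in pilot-flown modes
AVOID_BEHAVE  1   # 0 = slide around obstacle, 1 = stop
PRX1_TYPE     ?   # proximity driver matching your sensor (PRX_TYPE on older firmware)
OA_TYPE       1   # path planning for auto modes: 1 = BendyRuler, 2 = Dijkstra
```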

1 Like

I think what @t3drones is trying to say is about the interface, i.e. where the sensor data is displayed. DJI has a very nice way of showing you when you are near something. We, on the other hand, have QGC, where you can only show data from one sensor on the HUD, and when running QGC on a tablet the data is not displayed correctly. In Solex, we don't have this option at all.

The worst part is that no one helps with these issues. I have posted my problem with this, and no one here has given me a hand with it.

I have been testing a 360° lidar and it seems to be working. It feels very similar to DJI, but it works in all directions. It's also cool that it is used in auto missions.

That’s great! What 360 LiDAR are you using at the moment and what kind of ranges can you expect?

That is absolutely correct! What I am talking about is a way to interface with these sensors that isn't necessarily run directly on the Cube, but instead tells the Cube what to do. If you have seen it, "Guidance" is what DJI used for avoidance: it offloads some of the sensors to a computer, which then tells the flight controller (Naza) where to move. That is the approach I took, pairing a sensor with a companion computer (a Raspberry Pi) and coding it with DroneKit to decide where to go when something gets close. Is that the best way? I have no idea; that's what I'm here to figure out with everyone.

Also, using DroneKit we can build some very cool graphical user interfaces on a desktop computer or in an app, where we can program complicated missions and run them from our phones or laptops. I know that is a lot, and I'm not the best at explaining, but I'm trying. So far the only path I see to make these drones better is to incorporate onboard computers that can use high-quality sensors like stereo vision to create a very good avoidance system.
-Tim
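To make the companion-computer idea above concrete, here's a rough, hypothetical sketch (untested on hardware, and not Tim's actual code): a decision function that brakes when something gets close, plus a helper that pushes the resulting velocity to the flight controller via DroneKit's MAVLink `SET_POSITION_TARGET_LOCAL_NED` message. The serial port, speeds, and distances are placeholder assumptions:

```python
# Hypothetical companion-computer avoidance sketch.
# A Raspberry Pi reads an obstacle distance and commands the Cube in
# GUIDED mode instead of letting it fly blindly forward.

def avoidance_velocity(distance_m, cruise_ms=1.0, stop_m=2.0):
    """Pick a forward (body-frame) speed from the nearest obstacle distance."""
    if distance_m < stop_m:
        return 0.0       # something is close: brake and hold
    return cruise_ms     # path looks clear: keep cruising

def send_body_velocity(vehicle, vx, vy, vz):
    """Send a body-frame velocity setpoint through DroneKit.
    Imports are lazy so the decision logic above runs without a vehicle."""
    from pymavlink import mavutil
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                                   # time_boot_ms, target system/component
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED, # velocities relative to heading
        0b0000111111000111,                        # type mask: use velocity fields only
        0, 0, 0,                                   # x, y, z position (ignored)
        vx, vy, vz,                                # velocity in m/s
        0, 0, 0,                                   # acceleration (ignored)
        0, 0)                                      # yaw, yaw rate (ignored)
    vehicle.send_mavlink(msg)

# On the Pi, the control loop would look roughly like:
#   vehicle = dronekit.connect("/dev/serial0", baud=921600, wait_ready=True)
#   while True:
#       send_body_velocity(vehicle, avoidance_velocity(read_sensor_m()), 0, 0)
```

The nice thing about this split is that the avoidance policy is just Python you can unit-test on a desk, while the flight controller keeps doing the hard real-time stabilization.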

1 Like

I am using the LightWare SF40/c sensor. It's supposed to have a range of 100 m.

1 Like