Hoop-Passing Drone

Yoga Satwik Chappidi, Nitin Vegesna, Evan Ellis, Aryan Jain

Introduction

We combine trajectory prediction and motion planning to autonomously fly a DJI Tello drone through a rolling hula hoop. Using a motion capture room, we precisely localize the hoop, infer its velocity, and predict its trajectory. The drone's position is tracked the same way, and the goal point is transformed into the drone's frame of reference.

Our initial plan was to fly the drone through a thrown hoop; however, the drone's slow top speed (1 m/s) made it impossible to reach the hoop before it hit the ground. We expect that a faster drone could fly through a hoop mid-air.

This system is applicable to tracking tasks more broadly, such as sports broadcasting, where cameras must follow the players and the ball and determine the best pose from which to capture the game. It could also be useful for emergency dynamic obstacle avoidance, where a drone must infer the path of a sudden obstacle and plan around it. In our problem we aim to fly to the predicted point rather than avoid it, but the underlying tracking is similar.

To encourage future development, we release our code at https://github.com/festusev/HulaHoopDrone.

Goal

Figure 1. The desired behavior of the system.

Design

We had three main objectives/design criteria:

  • Accuracy and reliability: the drone should pass through the hoop consistently, ideally through its middle.
  • Efficiency: the drone should pass through the hoop whenever feasible, rather than always taking the safest trajectory.
  • Adaptability: the system should handle different hoop throws, in terms of speed and apex height.
System Design Diagram

Figure 2: Input/Output flow of the system. Vicon publishes MoCap data at 125Hz.

We made several design decisions that were critical for successfully flying through the hoop:

Reflectors Configuration

Motion capture relies on infrared (IR) reflectors attached to an object in order to track it. We attached four reflectors symmetrically to the hoop and designated their mean position as the hoop's center. Determining a marker configuration for the drone was trickier because we had to preserve the weight distribution, or the drone would be unstable in flight. It was also important to use an asymmetric configuration so that the Vicon tracking system could identify the drone's orientation unambiguously: with a symmetric configuration, there may be two or even four orientations consistent with the markers.

Trajectory Prediction

To predict the trajectory of the hoop, we kept a buffer of its previous positions. Assuming that the errors in the position estimate were Gaussian and that the trajectory was best fit by a parabola, we used least squares to estimate the hoop's initial position, velocity, and acceleration. Using these parameters, we predicted the hoop's positions for the next five seconds. We experimented with different strategies for clipping the position buffer, where there is a tradeoff between speed and accuracy, as well as a danger of using points that are from when the hoop was at rest.
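A minimal sketch of one such clipping strategy, with hypothetical window and motion thresholds (not our tuned values): drop samples older than a time window, then drop the leading samples from when the hoop was still at rest.

```python
import math

WINDOW_SEC = 0.5      # illustrative time window, not our tuned value
MOTION_THRESH = 0.05  # metres; displacement below this counts as "at rest"

def clip_buffer(samples, now):
    """Clip a buffer of (t, (x, y, z)) samples, oldest first.

    Keeps only samples inside the time window that were taken after the
    hoop started moving, plus one pre-motion sample as the launch point.
    """
    recent = [(t, p) for (t, p) in samples if now - t <= WINDOW_SEC]
    if not recent:
        return []
    origin = recent[0][1]
    for i, (t, p) in enumerate(recent):
        if math.dist(p, origin) > MOTION_THRESH:
            # Motion detected: discard the flat "at rest" prefix.
            return recent[max(0, i - 1):]
    return recent  # hoop never moved within the window
```

A longer window gives more points (and a smoother fit) but risks mixing in stale, at-rest samples; a shorter window reacts faster at the cost of a noisier fit.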

Feasible Point Selection

We experimented with several methods for selecting the goal point that the drone would fly through. To compensate for the drone's slow speed, we targeted the point the hoop would reach three seconds in the future. This gave the highest success rate because the drone had time to accelerate and arrive at the position simultaneously with the hoop.

Implementation

Hardware Setup and Localization

We used the following hardware:

  • 69 cm diameter hoop for the drone to fly through
  • Drone: DJI Tello, controlled via the Tello SDK to fly autonomously to a predicted future location of the hoop
  • MoCap room: provides precise world coordinates of the drone and hoop

In the Vicon Tracker app, we created objects for the hoop and drone from their markers and defined each body's coordinate frame.

Localizing the Hoop:

  • We attached 4 IR-reflecting markers to the hoop
  • We defined the center of mass to be the average of these 4 points, which is the midpoint of the hoop
  • We configured Vicon to publish this position to a rostopic at ~125 Hz
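As a concrete sketch of the center computation (the marker coordinates below are made up for illustration; in practice the positions arrive over the Vicon rostopic):

```python
import numpy as np

# Hypothetical marker positions in the Vicon world frame (metres),
# placed symmetrically on a 69 cm diameter hoop lying at z = 0.9 m.
markers = np.array([
    [0.345, 1.200, 0.900],
    [1.035, 1.200, 0.900],
    [0.690, 0.855, 0.900],
    [0.690, 1.545, 0.900],
])

# The hoop's center is defined as the mean of the four marker positions.
hoop_center = markers.mean(axis=0)
```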

Localizing the Drone:

  • We attached 5 IR-reflecting markers to the drone in an asymmetric pattern
  • The asymmetric configuration reduces ambiguity when identifying the body's markers
  • We defined the origin to be the top marker of the configuration and configured Vicon to publish the pose to a rostopic

Figure 3: Drone and hoop with IR reflectors.

Vicon Tracker App

Figure 4: Visualization of drone and hoop objects on Vicon

Drone Control and Trajectory Prediction for Thrown Hoop

Controlling the drone using Tello SDK:

  • Use DJITelloPy, a wrapper of the Tello SDK
  • Connect laptop to the Tello Wifi
  • Useful commands:
    • send_rc_control(left_right_velocity, forward_backward_velocity, up_down_velocity, yaw_velocity)
    • go_xyz_speed(x, y, z, speed)
  • Use ROS to transform Vicon world coordinates into drone coordinates
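In our pipeline the coordinate transform is handled by ROS; the underlying math can be sketched as below, under the simplifying (hypothetical) assumption that the Tello hovers roughly level, so only its yaw matters:

```python
import numpy as np

def world_to_drone(goal_w, drone_pos_w, drone_yaw):
    """Express a world-frame goal point in the drone's body frame.

    Assumes the drone is level so only yaw matters; the full pipeline
    uses the complete Vicon orientation via ROS instead.
    """
    c, s = np.cos(drone_yaw), np.sin(drone_yaw)
    # Body-to-world rotation about z; its transpose maps world to body.
    R_bw = np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])
    return R_bw.T @ (np.asarray(goal_w) - np.asarray(drone_pos_w))
```

The resulting body-frame offset is what gets fed to the Tello SDK commands, which expect coordinates relative to the drone.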

The steps of the entire process for the hoop-passing controller:

  1. Drone Takeoff
  2. Drone Warm-up (move forward as fast as possible)
  3. Hoop is thrown into the air
  4. Hoop Trajectory Prediction
  5. Drone uses trajectory to determine point to go to
  6. Drone control using Tello SDK to go to in the direction of the desired point at a desired velocity
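Step 6 can be sketched as a pure function mapping a body-frame goal offset to a `send_rc_control` command; DJITelloPy's `send_rc_control` accepts velocities in [-100, 100]. The gain and the axis sign conventions here are illustrative assumptions, not our tuned values:

```python
import numpy as np

RC_LIMIT = 100   # send_rc_control accepts values in [-100, 100]
GAIN = 150       # illustrative proportional gain, not our tuned value

def rc_command(goal_body):
    """Map a body-frame goal offset (metres; x fwd, y left, z up,
    an assumed convention) to (left_right, forward_backward, up_down,
    yaw) rc velocities."""
    vx, vy, vz = GAIN * np.asarray(goal_body, dtype=float)
    # A positive y offset (goal to the left) needs a negative
    # left_right command under this convention.
    lr = int(np.clip(-vy, -RC_LIMIT, RC_LIMIT))
    fb = int(np.clip(vx, -RC_LIMIT, RC_LIMIT))
    ud = int(np.clip(vz, -RC_LIMIT, RC_LIMIT))
    return lr, fb, ud, 0  # no yaw correction in this sketch
```

The returned tuple would then be passed as `tello.send_rc_control(*cmd)` at each control tick.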

Predicting the Hoop’s Trajectory


Figure 5: The motion along each axis can be modeled as a parabola; we fit a parameter matrix β to the observed positions.

Using least squares, we solve for the unknowns in

$$x(t) = x_0 + v_0t + \frac{1}{2}at^2$$

There are 6 unknowns: 3 components of the initial velocity and 3 of the initial position. The acceleration is known: $(0, 0, g)$.

$$\beta = (X^T X)^{-1}X^T Y$$

$$\beta = \begin{bmatrix} 0 & v_x & p_x \\ 0 & v_y & p_y \\ g & v_z & p_z \end{bmatrix}$$
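The fit can be reproduced in a few lines of numpy on synthetic, noise-free data. We use `np.linalg.lstsq` rather than the explicit normal-equations inverse for numerical stability; here all three columns are fitted, and the acceleration column simply comes back as ≈ (0, 0, g), consistent with treating it as known:

```python
import numpy as np

g = -9.81  # m/s^2, vertical acceleration of the free-flying hoop

# Synthetic ground truth: initial position, velocity, and acceleration.
p0 = np.array([0.0, 1.0, 1.2])
v0 = np.array([1.5, 0.0, 3.0])
a = np.array([0.0, 0.0, g])

t = np.linspace(0.0, 0.4, 50)
Y = p0 + np.outer(t, v0) + 0.5 * np.outer(t**2, a)  # (50, 3) positions

# Design matrix with columns [t^2/2, t, 1]; each column of beta then
# holds [a, v, p] for one axis (the transpose of the matrix above).
X = np.column_stack([0.5 * t**2, t, np.ones_like(t)])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

a_hat, v0_hat, p0_hat = beta  # rows: acceleration, velocity, position

def predict(tq):
    """Predicted hoop position at time tq using the fitted parameters."""
    return p0_hat + v0_hat * tq + 0.5 * a_hat * tq**2
```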

Figure 6: Synchronized hoop-throwing prediction (left: Vicon visualization, right: real-world footage).

Deciding on a Point to Fly Through:

  • We target the point the hoop will be at 0.1 seconds before hitting the ground
  • Gives the drone the most time to reach it
    • The drone is slow, so this is important

Figure 7: Illustration of the chosen feasible point.

Challenges with the thrown hoop-passing problem:

  • The drone is slow
    • Max allowed speed is 1 m/s
  • Gravity is strong
    • The hoop is only in the air for at most 2 seconds
  • MoCap room loses tracking if the hoop goes above a certain height (~2m)
  • Although the drone acknowledges a command, it acts on it only after a variable delay
    • We mitigate this with the forward warm-up
  • Least Squares with few data points is susceptible to noise, even on the order of millimeters

We could not get the thrown-hoop pass-through to work, so we tried rolling the hoop instead.

Drone Control and Trajectory Prediction for Rolling Hoop

We used least squares to predict the rolling hoop trajectory:

  • 4 unknowns: 2 for initial velocity, and 2 for initial position
    • The acceleration is known: assumed to be constantly 0 (no friction)
    • We solve for the unknowns in $$x(t) = x_0 + v_0t$$
  • First approach: determine the velocity required to reach a future point (the arrival time is known) and command that velocity
    • This did not work: due to the variable latency, and because the drone is slow in general, the commanded velocity was never enough
  • Second approach: go to a feasible point (within the bounds of a command) at top speed
    • The feasible point is the first predicted point for which the required speed is at or below the 1 m/s cap. It can be found by dividing the distance between each predicted point and the drone's current position by the time until the hoop reaches that point.

$$\beta = \begin{bmatrix} v_x & p_x \\ v_y & p_y \end{bmatrix}$$
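The feasible-point search can be sketched as follows (hypothetical helper; constant-velocity hoop model as above, 1 m/s speed cap):

```python
import numpy as np

MAX_SPEED = 1.0  # m/s, the Tello's speed cap

def feasible_point(drone_pos, hoop_p0, hoop_v0, horizon=5.0, dt=0.05):
    """First predicted hoop position the drone can reach in time.

    hoop_p0, hoop_v0: fitted 2-D initial position and velocity (x, y).
    Returns (t, point) or None if nothing on the horizon is reachable.
    """
    drone_pos = np.asarray(drone_pos, dtype=float)
    for t in np.arange(dt, horizon, dt):
        # Constant-velocity prediction of the rolling hoop's center.
        point = np.asarray(hoop_p0) + np.asarray(hoop_v0) * t
        required_speed = np.linalg.norm(point - drone_pos) / t
        if required_speed <= MAX_SPEED:
            return t, point
    return None
```

Walking forward in time and taking the first hit gives the earliest reachable intercept; smoother trajectories would require optimizing over the whole horizon instead.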

Figure 8: Synchronized hoop-rolling prediction (left: Vicon visualization, right: real-world footage).

Results

The drone reliably flew through the rolling hoop from any starting configuration. For a thrown hoop, we were able to send the drone to the correct position, but it was too slow and would arrive after the hoop had landed.

Conclusion

As mentioned, we could only get the drone to fly through a rolling hoop. Some constraints remain, such as the speed of the hoop and the distance between the hoop and the drone; these are solely due to the hardware limits of the drone. Because of this, we did not meet the design criterion of flying the drone through a hoop in mid-air.

We were still able to predict the hoop's trajectory well and get the drone to fly toward the "feasible" point. However, due to the latency and its inability to fly quickly, the drone was always too late.

We also encountered difficulties with drone control: the interface is very limited (we can only command velocities, with no direct pitch/roll control), and it has variable latency. Handling the noise in the control commands, plus the delay before the drone acted on a command, was difficult.

One hack/flaw in our approach was simply picking the first feasible point within the velocity limit, which may not yield the smoothest hoop-passing trajectory for the drone. In practice it still works reliably: since the hoop is launched in front of the drone, the hoop's plane will likely always present an angle the drone can pass through.

Takeaways and Improvements:

  • Use a faster drone in the future
  • Filter the points used for least squares
    • We have since worked on this and made progress toward better trajectory predictions, particularly for the thrown-hoop case
  • Implement a low-latency, real-time control policy beyond the go_xyz function
  • Fuse MoCap data for more accurate control
  • We implemented a Kalman filter for better target estimation, but it still needs improvement
  • Use the camera mounted on the drone to estimate the hoop's location from its apparent radius and forecast its trajectory
  • Add freefall detection for hoop trajectory prediction, replacing our "press Enter to go" workaround

Team

Nitin

EECS MS student interested in the intersection of Deep Learning and Robotics, with experience in ROS and machine learning.

Satwik

EECS MS student with a passion for real-world robotics applications and experience developing autonomous drone software stacks for various competitions.

Evan

4th Year EECS Undergrad student studying empowerment, reward learning, and reinforcement learning.

Aryan

5th Year EECS MS student interested in Reinforcement Learning, Robotics, and Computer Vision.

Additional Materials