[FEATURE] Add a Drone Controller and Test Scripts #1602
base: main
Conversation
This is exactly what I want to do! But isn't using the remote even more complicated? Why not just use the keyboard? Also, can you add some classic test cases: circles, figure 8s, etc.?
Hi @XXLiu-HNU. For the second, if you want to do some tracking tasks, trajectory planning or reinforcement learning is necessary. For the latter, you can apply this controller to existing RL examples.
@yun-long do you have advice on this controller?
Very nice results, good job! I can take a look at the implementation at a later stage, probably this weekend.
if (torch.any(self.has_nan)):
    print("get_quat NaN env_idx:", torch.nonzero(self.has_nan).squeeze())
    self.body_quat_inv[self.has_nan] = inv_quat(self.body_quat[self.has_nan])
    self.reset(self.has_nan.nonzero(as_tuple=False).flatten())
When calling reset(), only one parameter is passed in
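For reference, a minimal sketch of this NaN guard with the affected environment indices computed once and reused; it assumes, as the quoted call suggests, that reset() accepts a flat tensor of environment indices, and the variable name is illustrative:
has_nan_idx = self.has_nan.nonzero(as_tuple=False).flatten()
if has_nan_idx.numel() > 0:
    # log which environments produced NaN quaternions
    print("get_quat NaN env_idx:", has_nan_idx)
    # recompute the inverse quaternion only for the affected environments
    self.body_quat_inv[has_nan_idx] = inv_quat(self.body_quat[has_nan_idx])
    # reset only those environments
    self.reset(has_nan_idx)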
@YifeiNie Thanks a lot for your contribution.
@yun-long Thanks for the feedback.
@YifeiNie Hello, I tested your latest uploaded code and, after modifying some typos, here is the result: output_crf23.mp4
@XXLiu-HNU Sorry about the drone dashing upward in simulation; wrong parameters caused this issue. I have created a new commit to resolve this problem, you can check it.
Hi @XXLiu-HNU, thanks for your test. result.mp4
Looks great! I've used your controller (with rate control) for training tasks and it works great. Thanks for your excellent work.
I added a simple example of a drone flying in a circle, hoping it can enrich the drone's functionality. output.mp4
# circle_example.py
import os
import torch
import math
import yaml

import genesis as gs
from pid import PIDcontroller
from odom import Odom
from mavlink_sim import rc_command


def gs_rand_float(lower, upper, shape, device):
    return (upper - lower) * torch.rand(size=shape, device=device) + lower


class TrackerEnv:
    def __init__(self, num_envs, show_viewer=False):
        self.num_envs = num_envs
        self.rendered_env_num = min(10, self.num_envs)
        self.device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

        # Define dt and circular trajectory parameters
        self.dt = 0.01
        self.circle_radius = 1.0
        self.circle_omega = 0.5
        self.drone_height = 1.0
        self.circle_center = torch.tensor([0.0, 0.0, self.drone_height], device=self.device)

        # create scene
        self.scene = gs.Scene(
            sim_options=gs.options.SimOptions(dt=self.dt, substeps=2),
            viewer_options=gs.options.ViewerOptions(
                max_FPS=100,
                camera_pos=(3.0, 0.0, 3.0),
                camera_lookat=(0.0, 0.0, 1.0),
                camera_fov=40,
            ),
            vis_options=gs.options.VisOptions(rendered_envs_idx=list(range(self.rendered_env_num))),
            rigid_options=gs.options.RigidOptions(
                dt=self.dt,
                constraint_solver=gs.constraint_solver.Newton,
                enable_collision=True,
                enable_joint_limit=True,
            ),
            show_viewer=show_viewer,
            profiling_options=gs.options.ProfilingOptions(show_FPS=False),
        )

        # add plane
        self.scene.add_entity(gs.morphs.Plane())

        # add drone
        self.initial_angle = gs_rand_float(0, 2 * math.pi, (self.num_envs,), self.device)
        # Set the drone's initial position on the circle to prevent initial abrupt movements
        self.drone_init_pos = torch.zeros((self.num_envs, 3), device=self.device)
        self.drone_init_pos[:, 0] = self.circle_center[0] + self.circle_radius * torch.cos(self.initial_angle)
        self.drone_init_pos[:, 1] = self.circle_center[1] + self.circle_radius * torch.sin(self.initial_angle)
        self.drone_init_pos[:, 2] = self.circle_center[2]
        self.drone_init_quat = torch.tensor([1.0, 0.0, 0.0, 0.0], device=self.device).repeat(self.num_envs, 1)
        self.drone = self.scene.add_entity(gs.morphs.Drone(file="examples/drone/controller/drone_urdf/drone.urdf"))

        script_dir = os.path.dirname(os.path.abspath(__file__))
        with open(os.path.join(script_dir, "config/pos_ctrl_eval/flight.yaml"), "r") as file:
            self.pos_ctrl_config = yaml.load(file, Loader=yaml.FullLoader)

        self.set_drone_imu()
        self.set_drone_controller()

        # Build scene
        self.scene.build(n_envs=num_envs)

        # initialize buffers
        self.episode_length_buf = torch.zeros((self.num_envs,), device=self.device, dtype=torch.int)

        # Set initial state for the drone
        self.drone.set_pos(self.drone_init_pos)
        self.drone.set_quat(self.drone_init_quat)

    def step(self):
        # Increment episode length
        self.episode_length_buf += 1
        # Get target position on the circular trajectory
        circle_traj = self.get_circle_traj()
        # Call PID controller to compute actions
        self.drone.controller.step(circle_traj)
        # Physics simulation step
        self.scene.step()
        # Log or print status
        current_pos = self.drone.get_pos()
        print(
            f"Step: {self.episode_length_buf[0].item():d}, "
            f"Curr Pos: ({current_pos[0, 0]:.2f}, {current_pos[0, 1]:.2f}, {current_pos[0, 2]:.2f}), "
            f"Target Pos: ({circle_traj[0, 0]:.2f}, {circle_traj[0, 1]:.2f}, {circle_traj[0, 2]:.2f})"
        )

    def get_circle_traj(self):
        """
        Calculates the target position on a circular trajectory based on the current step count.
        """
        # Convert steps to radians
        # self.episode_length_buf is a tensor of shape (num_envs,)
        angle = self.initial_angle + self.episode_length_buf.float() * self.dt * self.circle_omega
        # Calculate new x and y coordinates
        x = self.circle_center[0] + self.circle_radius * torch.cos(angle)
        y = self.circle_center[1] + self.circle_radius * torch.sin(angle)
        z = self.circle_center[2] * torch.ones_like(x)
        t = torch.zeros_like(x)
        # Return target position tensor, shape (num_envs, 4)
        return torch.stack([x, y, z, t], dim=1)

    def set_drone_imu(self):
        odom = Odom(
            num_envs=self.num_envs,
            device=self.device,
        )
        odom.set_drone(self.drone)
        setattr(self.drone, "odom", odom)

    def set_drone_controller(self):
        pid = PIDcontroller(
            num_envs=self.num_envs,
            rc_command=rc_command,
            odom=self.drone.odom,
            config=self.pos_ctrl_config,
            device=self.device,
            controller="position",
            use_rc=False,
        )
        pid.set_drone(self.drone)
        setattr(self.drone, "controller", pid)


if __name__ == "__main__":
    gs.init()
    env = TrackerEnv(num_envs=1, show_viewer=True)
    for i in range(2000):
        env.step()
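To try the example, place circle_example.py next to pid.py, odom.py, mavlink_sim.py, and the config/ directory from this PR (the imports and the YAML path above assume that layout), then run it with: python circle_example.py. The num_envs and show_viewer arguments in the __main__ block can be adjusted as needed.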
Description
This PR adds a drone PID controller that includes angle, angle-rate, and position controllers, with parameters adjustable through YAML files. The controllers use a modified URDF drone model similar to an FPV quadrotor. In addition, odometry and MAVLink simulation classes are implemented; with a modified Betaflight firmware and a remote controller, users can fly a simple FPV demo in Genesis.
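To make the configuration flow concrete, here is a minimal sketch of wiring the odometry and controller to a drone entity, based on the PIDcontroller / Odom / rc_command interface used in the circle example above. The YAML path and the "position" mode string are taken from that example; the exact strings for the angle and angle-rate modes are defined in pid.py and are not repeated here.
# controller_setup_sketch.py -- illustrative only, not part of this PR
import yaml
import torch
import genesis as gs
from pid import PIDcontroller
from odom import Odom
from mavlink_sim import rc_command

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

gs.init()
scene = gs.Scene(sim_options=gs.options.SimOptions(dt=0.01, substeps=2), show_viewer=False)
scene.add_entity(gs.morphs.Plane())
drone = scene.add_entity(gs.morphs.Drone(file="examples/drone/controller/drone_urdf/drone.urdf"))

# Controller gains and limits live in a YAML file; this path mirrors the circle example.
with open("config/pos_ctrl_eval/flight.yaml", "r") as f:
    config = yaml.load(f, Loader=yaml.FullLoader)

# Attach odometry to the drone, then the controller that consumes it.
odom = Odom(num_envs=1, device=device)
odom.set_drone(drone)

pid = PIDcontroller(
    num_envs=1,
    rc_command=rc_command,
    odom=odom,
    config=config,
    device=device,
    controller="position",  # select position / angle / angle-rate control here
    use_rc=False,           # set True to take setpoints from the simulated remote controller
)
pid.set_drone(drone)

# Build the scene, then call pid.step(target) followed by scene.step() every
# simulation tick, where target is a (num_envs, 4) tensor like the one produced
# by get_circle_traj() in the circle example.
scene.build(n_envs=1)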
Motivation and Context
Genesis provides a drone interface, and several great contributions based on it have been accepted, including motion testing and RL. However, there is no suitable controller for the drone, which widens the gap between simulation and the real world: most quadrotors need a low-level controller because commanding motor RPM directly is unstable and lacks both hardware and software support in the FPV community.
A simple controller can solve the problems above in some cases, and packaging the controller together with odometry is also more user-friendly.
How Has This Been / Can This Be Tested?
See README.md
Screenshots (if appropriate):
Use_RC_to_control_a.drone.mp4