Conversation

@YifeiNie

@YifeiNie YifeiNie commented Aug 19, 2025

Description

This PR adds a drone PID controller, including angle, angle-rate, and position controllers, with parameters adjustable via YAML files. The controllers use a modified URDF drone model similar to an FPV quadcopter. In addition, odometry and MAVLink simulation classes are implemented; with a modified Betaflight firmware and a remote controller, users can fly a simple FPV demo in Genesis.

Motivation and Context

Genesis provides a drone interface, and several great contributions built on it have been accepted, including motion testing and RL. However, there is no suitable controller for the drone, which widens the sim-to-real gap: most quadrotors need a low-level controller because commanding motor RPM directly is unstable, and such a controller lacks both hardware and software support in the FPV community.
A simple controller solves these problems in many cases, and packaging the controller together with odometry is also more user-friendly.

How Has This Been / Can This Be Tested?

See README.md

  • Use RC to control the sim drone:
    python examples/drone/controller/eval/rc_FPV_eval.py
    
  • Fly toward a target with no planning (expect poor performance):
    python examples/drone/controller/eval/pos_ctrl_eval.py
    

Screenshots (if appropriate):

Use_RC_to_control_a.drone.mp4

@XXLiu-HNU

This is exactly what I want to do! But isn't using the remote even more complicated? Why not just use the keyboard? Also, can you add some classic test cases: circles, figure 8s, etc.?

@YifeiNie
Author

This is exactly what I want to do! But isn't using the remote even more complicated? Why not just use the keyboard? Also, can you add some classic test cases: circles, figure 8s, etc.?

Hi @XXLiu-HNU.
For the first question: since a keyboard key only has two states, 0 and 1 (pressed and released), it may not fully exercise the controller. Moreover, in simulation drones mostly run in offboard mode, so manual input is not necessary. If you really need to use the keyboard as a controller, examples/drone/interactive_drone.py gives some references.
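As an aside, one hedged sketch of how binary key states could still be softened into continuous, stick-like commands is to low-pass filter them toward a target value; all names below are illustrative, not part of this PR's or Genesis's API:

```python
class KeySmoother:
    """Turn 0/1 key states into a smoothed command in [-1, 1],
    mimicking the proportional input an RC transmitter provides."""

    def __init__(self, rate=0.2):
        self.rate = rate    # fraction of the gap closed per tick
        self.value = 0.0    # current smoothed command

    def update(self, key_pos, key_neg):
        # key_pos / key_neg are 0 or 1 (e.g. arrow-up / arrow-down held)
        target = float(key_pos) - float(key_neg)
        self.value += self.rate * (target - self.value)
        return self.value


if __name__ == "__main__":
    s = KeySmoother(rate=0.2)
    for _ in range(10):
        cmd = s.update(1, 0)   # key held down: command ramps toward +1
    print(round(cmd, 3))
```

With the key held, the command approaches 1 geometrically rather than jumping, which is closer to what the controller expects from an RC stick.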

For the second: tracking tasks require trajectory planning or reinforcement learning. For the latter, you can apply this controller to the existing RL example examples/drone/hover_train.py, or see GenesisDroneEnv for details. And if the code owners accept this PR, future updates and maintenance will be provided.

@YilingQiao
Collaborator

@yun-long do you have advice on this controller?

@yun-long
Contributor

yun-long commented Sep 3, 2025

Very nice results, good job! I can take a look at the implementation at a later stage, probably this weekend.

if torch.any(self.has_nan):
    print("get_quat NaN env_idx:", torch.nonzero(self.has_nan).squeeze())
    self.body_quat_inv[self.has_nan] = inv_quat(self.body_quat[self.has_nan])
    self.reset(self.has_nan.nonzero(as_tuple=False).flatten())


When calling reset(), only one parameter is passed in

@yun-long
Contributor

yun-long commented Sep 12, 2025

@YifeiNie Thanks a lot for your contribution.
Could you create a PR to the GenesisPlayground instead?
I feel this PR is more suitable for the newly established GenesisPlayground repo.
I can help with the migration.

@YifeiNie
Author

@yun-long Thanks for the feedback.
A new PR has been created for the GenesisPlayground repo; please check it when you get a chance.

@XXLiu-HNU

@YifeiNie Hello, I tested your latest uploaded code

python examples/drone/controller/eval/pos_ctrl_eval.py

and after fixing some typos in flight.yaml, I found the drone performed very poorly after takeoff.

output_crf23.mp4

@YifeiNie
Author

@XXLiu-HNU Sorry about the drone dashing upward in simulation; wrong parameters caused this issue. I have created a new commit to resolve it, which you can check.
But the performance of the position controller is still poor. As I mentioned in this conversation, the position controller is just a demo; it cannot work with random target points as input, since a position PID controller alone cannot replace a planner.
If you want to use the position controller, an executable trajectory generated by a motion planner is necessary. I think you can:

  1. Sample the trajectory uniformly or with some better scheme
  2. Fine-tune the position controller parameters to fit your sampling method
  3. Feed the sampled, dense, and continuous way-points to the position controller
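For step 1, a minimal sketch of uniform sampling; the trajectory function and the sampling interface here are illustrative assumptions, not this PR's API:

```python
def sample_trajectory(traj_fn, duration, dt):
    """Uniformly sample traj_fn(t) -> (x, y, z) every dt seconds."""
    n = round(duration / dt) + 1
    return [traj_fn(i * dt) for i in range(n)]


def line_traj(t, speed=0.5, height=1.0):
    # Placeholder trajectory: a straight line at constant height.
    # In practice this would come from a motion planner.
    return (speed * t, 0.0, height)


# Dense way-points spaced 10 ms apart, ready to feed to the
# position controller one per control step.
waypoints = sample_trajectory(line_traj, duration=4.0, dt=0.01)
```

Matching dt to the controller's update rate keeps consecutive way-points close together, which is what lets a plain position PID track them at all.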

@YifeiNie
Author

YifeiNie commented Sep 14, 2025

Hi @XXLiu-HNU, thanks for your test.
I fixed this by changing the position controller to a cascaded structure; now it works, although not well enough yet. You can have a try!
This update will also be synchronized to the PR (#28) for the GenesisPlayground repo.
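For anyone curious what "cascaded" means here, a minimal 1-D sketch of the idea: an outer position loop commands a bounded velocity setpoint, and an inner velocity loop turns it into an acceleration command. Gains, limits, and names are illustrative placeholders, not this PR's tuned values:

```python
class CascadedPosCtrl:
    """1-D cascaded position controller: position P -> velocity PI."""

    def __init__(self, kp_pos=1.5, kp_vel=2.0, ki_vel=0.1, vel_limit=2.0, dt=0.01):
        self.kp_pos, self.kp_vel, self.ki_vel = kp_pos, kp_vel, ki_vel
        self.vel_limit, self.dt = vel_limit, dt
        self.vel_int = 0.0  # integral of velocity error

    def step(self, pos_target, pos, vel):
        # Outer loop: position error -> bounded velocity setpoint.
        vel_sp = self.kp_pos * (pos_target - pos)
        vel_sp = max(-self.vel_limit, min(self.vel_limit, vel_sp))
        # Inner loop: velocity error -> acceleration command.
        vel_err = vel_sp - vel
        self.vel_int += vel_err * self.dt
        return self.kp_vel * vel_err + self.ki_vel * self.vel_int


if __name__ == "__main__":
    # Drive a toy double integrator toward a 1 m step target.
    ctrl = CascadedPosCtrl()
    pos, vel = 0.0, 0.0
    for _ in range(3000):               # 30 s of simulated flight
        acc = ctrl.step(1.0, pos, vel)
        vel += acc * ctrl.dt
        pos += vel * ctrl.dt
    print(f"final position: {pos:.3f}")
```

The velocity-setpoint clamp is what prevents the "dashing" behavior on a large position error: no matter how far the target is, the inner loop is only ever asked for a bounded speed.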

result.mp4

@XXLiu-HNU

Hi @XXLiu-HNU, Thanks for your test. I fixed this by change the position controller to a cascaded structure, now it can work althrough not good enough, you can have a try! And this update will also be synchronized to the PR (#28) for GenesisPlayground repo. https://github.com/user-attachments/assets/f2a5ced1-36a6-408c-a522-e3b3d00edca2

Looks great! I've used your controller (with rate control) for training tasks and it works great. Thanks for your excellent work.

@XXLiu-HNU

XXLiu-HNU commented Sep 15, 2025

I added a simple drone flying in a circle example, hoping it can enrich the drone's functionality.

output.mp4
Click to view code
# circle_example.py
import os
import torch
import math
import yaml
import genesis as gs
from pid import PIDcontroller
from odom import Odom
from mavlink_sim import rc_command

def gs_rand_float(lower, upper, shape, device):
    return (upper - lower) * torch.rand(size=shape, device=device) + lower

class TrackerEnv:
    def __init__(self, num_envs, show_viewer=False):
        self.num_envs = num_envs
        self.rendered_env_num = min(10, self.num_envs)
        self.device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        
        # Define dt and circular trajectory parameters
        self.dt = 0.01
        self.circle_radius = 1.0
        self.circle_omega = 0.5
        self.drone_height = 1.0
        self.circle_center = torch.tensor([0.0, 0.0, self.drone_height], device=self.device)

        # create scene
        self.scene = gs.Scene(
            sim_options=gs.options.SimOptions(dt=self.dt, substeps=2),
            viewer_options=gs.options.ViewerOptions(
                max_FPS=100,
                camera_pos=(3.0, 0.0, 3.0),
                camera_lookat=(0.0, 0.0, 1.0),
                camera_fov=40,
            ),
            vis_options=gs.options.VisOptions(rendered_envs_idx=list(range(self.rendered_env_num))),
            rigid_options=gs.options.RigidOptions(
                dt=self.dt,
                constraint_solver=gs.constraint_solver.Newton,
                enable_collision=True,
                enable_joint_limit=True,
            ),
            show_viewer=show_viewer,
            profiling_options=gs.options.ProfilingOptions(show_FPS=False)
        )

        # add plane
        self.scene.add_entity(gs.morphs.Plane())

        # add drone
        self.initial_angle = gs_rand_float(0, 2 * math.pi, (self.num_envs,), self.device)
        
        # Set the drone's initial position on the circle to prevent initial abrupt movements
        self.drone_init_pos = torch.zeros((self.num_envs, 3), device=self.device)
        self.drone_init_pos[:, 0] = self.circle_center[0] + self.circle_radius * torch.cos(self.initial_angle)
        self.drone_init_pos[:, 1] = self.circle_center[1] + self.circle_radius * torch.sin(self.initial_angle)
        self.drone_init_pos[:, 2] = self.circle_center[2]
        
        self.drone_init_quat = torch.tensor([1,0,0,0], device=self.device).repeat(self.num_envs, 1)

        self.drone = self.scene.add_entity(gs.morphs.Drone(file="examples/drone/controller/drone_urdf/drone.urdf"))
        
        script_dir = os.path.dirname(os.path.abspath(__file__))
        with open(os.path.join(script_dir, "config/pos_ctrl_eval/flight.yaml"), "r") as file:
            self.pos_ctrl_config = yaml.load(file, Loader=yaml.FullLoader)

        self.set_drone_imu()
        self.set_drone_controller()

        # Build scene
        self.scene.build(n_envs=num_envs)

        # initialize buffers
        self.episode_length_buf = torch.zeros((self.num_envs,), device=self.device, dtype=torch.int)
        
        # Set initial state for the drone
        self.drone.set_pos(self.drone_init_pos)
        self.drone.set_quat(self.drone_init_quat)

    def step(self):
        # Increment episode length
        self.episode_length_buf += 1
        
        # Get target position on the circular trajectory
        circle_traj = self.get_circle_traj()
        
        # Call PID controller to compute actions
        self.drone.controller.step(circle_traj)
        
        # Physics simulation step
        self.scene.step()
        
        # Log or print status
        current_pos = self.drone.get_pos()
        print(f"Step: {self.episode_length_buf[0].item():d}, Curr Pos: ({current_pos[0,0]:.2f}, {current_pos[0,1]:.2f}, {current_pos[0,2]:.2f}), Target Pos: ({circle_traj[0,0]:.2f}, {circle_traj[0,1]:.2f}, {circle_traj[0,2]:.2f})")

    def get_circle_traj(self):
        """
        Calculates the target position on a circular trajectory based on the current step count.
        """
        # Convert steps to radians
        # self.episode_length_buf is a tensor of shape (num_envs,)
        angle = self.initial_angle + self.episode_length_buf.float() * self.dt * self.circle_omega
        
        # Calculate new x and y coordinates
        x = self.circle_center[0] + self.circle_radius * torch.cos(angle)
        y = self.circle_center[1] + self.circle_radius * torch.sin(angle)
        z = self.circle_center[2] * torch.ones_like(x)
        t = torch.zeros_like(x)

        # Return target position tensor, shape (num_envs, 4)
        return torch.stack([x, y, z, t], dim=1)


    def set_drone_imu(self):
        odom = Odom(
            num_envs = self.num_envs,
            device = self.device
        )
        odom.set_drone(self.drone)
        setattr(self.drone, 'odom', odom)

    def set_drone_controller(self):
        pid = PIDcontroller(
            num_envs = self.num_envs,
            rc_command= rc_command,
            odom = self.drone.odom,
            config = self.pos_ctrl_config,
            device = self.device,
            controller = "position",
            use_rc = False,
        )
        pid.set_drone(self.drone)
        setattr(self.drone, 'controller', pid)


if __name__ == "__main__":
    gs.init()
    env = TrackerEnv(num_envs=1, show_viewer=True)
    for i in range(2000):
        env.step()

@duburcqa duburcqa marked this pull request as draft September 23, 2025 08:34
