Conversation

@diegoferigo-rai

Description

This PR updates the pyproject.toml configuration to improve compatibility with tooling such as black and pyright. The changes affect both standalone usage of these tools and their integration when invoked from IDEs like VSCode.

This allows users who are not interested in autogenerating the settings.json file to have their IDEs configured out of the box.

This approach could also be extended to other tools like flake8, pylint, etc.
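
As a sketch of the idea, the relevant pyproject.toml sections look roughly like this (illustrative only; the exact keys and values in this PR may differ):

[tool.black]
line-length = 120
target-version = ["py310"]

[tool.pyright]
include = ["source"]
typeCheckingMode = "basic"
reportMissingImports = "warning"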

Note

This PR targets feature/newton, which is the branch I'm currently interested in. It can be backported to main if needed.

Type of change

  • New feature (non-breaking change which adds functionality)

Checklist

  • I have read and understood the contribution guidelines
  • I have run the pre-commit checks with ./isaaclab.sh --format
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • I have updated the changelog and the corresponding version in the extension's config/extension.toml file
  • I have added my name to the CONTRIBUTORS.md or my name already exists there

@Mayankm96
Contributor

@ClemensSchwarke Following up here with our discussion to switch to ruff entirely :)

@diegoferigo-rai
Author

> @ClemensSchwarke Following up here with our discussion to switch to ruff entirely :)

I'd be super happy about a ruff migration. Let me know what works best for you. I can update this PR to introduce a ruff configuration compatible with black and isort, or close this and wait for your upcoming PRs.
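
For the formatting side, a black-compatible ruff setup could look roughly like this (a sketch; the option names are real ruff settings, the values are placeholders):

[tool.ruff]
line-length = 120

[tool.ruff.format]
# Ruff's formatter aims to be a drop-in replacement for black,
# so the defaults are already close; the quote style is shown explicitly.
quote-style = "double"

[tool.ruff.lint]
# "I" enables the isort-compatible import-sorting rules.
select = ["E", "F", "I"]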

@ClemensSchwarke
Collaborator

ClemensSchwarke commented Oct 10, 2025

Hey @diegoferigo-rai, I think it would be super nice if you could add the ruff configuration for the parts you are interested in. I can take care of the remaining integration.

@diegoferigo-rai
Author

The black → ruff formatting transition seems straightforward; only a few minor formatting changes are needed.

However, the isort → ruff migration is more complex. IsaacLab defines a long list of custom import-sorting rules and sections. Are all of these custom rules actually required?

It would be helpful if someone could provide context on why the default isort behavior or ruff's built-in import sorting isn't sufficient for this project.

From my perspective, many of these custom rules seem like a workaround for other underlying issues rather than something inherently required.

IsaacLab/pyproject.toml

Lines 10 to 64 in f52aa98

# Order of imports
sections = [
    "FUTURE",
    "STDLIB",
    "THIRDPARTY",
    "ASSETS_FIRSTPARTY",
    "FIRSTPARTY",
    "EXTRA_FIRSTPARTY",
    "TASK_FIRSTPARTY",
    "LOCALFOLDER",
]
# Extra standard libraries considered as part of python (permissive licenses)
extra_standard_library = [
    "numpy",
    "h5py",
    "open3d",
    "torch",
    "tensordict",
    "bpy",
    "matplotlib",
    "gymnasium",
    "gym",
    "scipy",
    "hid",
    "yaml",
    "prettytable",
    "toml",
    "trimesh",
    "tqdm",
    "torchvision",
    "transformers",
    "einops",  # Needed for transformers, doesn't always auto-install
]
# Imports from Isaac Sim and Omniverse
known_third_party = [
    "isaacsim.core.api",
    "isaacsim.replicator.common",
    "omni.replicator.core",
    "pxr",
    "omni.kit.*",
    "warp",
    "carb",
    "Semantics",
]
# Imports from this repository
known_first_party = "isaaclab"
known_assets_firstparty = "isaaclab_assets"
known_extra_firstparty = [
    "isaaclab_rl",
    "isaaclab_mimic",
]
known_task_firstparty = "isaaclab_tasks"
# Imports from the local folder
known_local_folder = "config"
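
For reference, ruff can express a similar layout through its lint.isort settings; here is a rough, untested sketch of the mapping (the option names are real ruff settings, the custom section names mirror the excerpt above, and the lists are abbreviated):

[tool.ruff.lint.isort]
extra-standard-library = ["numpy", "torch", "gymnasium"]  # ..., as in the excerpt
known-third-party = ["pxr", "warp", "carb"]  # ..., as in the excerpt
known-first-party = ["isaaclab"]
known-local-folder = ["config"]
section-order = [
    "future",
    "standard-library",
    "third-party",
    "assets-firstparty",
    "first-party",
    "extra-firstparty",
    "task-firstparty",
    "local-folder",
]

[tool.ruff.lint.isort.sections]
# User-defined sections referenced in section-order above.
assets-firstparty = ["isaaclab_assets"]
extra-firstparty = ["isaaclab_rl", "isaaclab_mimic"]
task-firstparty = ["isaaclab_tasks"]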

@ClemensSchwarke
Collaborator

I am not sure of the reasons for this isort setup (maybe @Mayankm96 can help), but I think it should be fairly easy to have a similar configuration for ruff.

@Mayankm96
Contributor

I think back then isort was sorting them purely alphabetically (I don't remember 100%). I wanted to group the imports based on their type (as expressed in the list) to make it easier to read through the imports.
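
For illustration, the custom sections make a typical module's imports read roughly like this (a made-up example, not taken from the repository):

# STDLIB (including the "extra" standard libraries)
import os
import torch

# THIRDPARTY (Isaac Sim / Omniverse)
import carb
import omni.kit.app

# ASSETS_FIRSTPARTY
import isaaclab_assets

# FIRSTPARTY
import isaaclab.sim as sim_utils

# TASK_FIRSTPARTY
import isaaclab_tasks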

@Mayankm96
Contributor

Closing this in favor of #4329 and #4328

@Mayankm96 Mayankm96 closed this Jan 4, 2026
@github-project-automation github-project-automation bot moved this from Ready to Done in Isaac Lab Jan 4, 2026
@diegoferigo-rai
Author

Thanks @Mayankm96 for handling this! Using black for formatting was straightforward, but porting the isort rules felt too time-consuming, so I kept procrastinating on it. Really happy to see it finally landing upstream!

I still personally feel that custom isort rules can do more harm than good. The Python community is used to the three import blocks: standard library, external libraries, and internal modules. Custom rules tend to break the principle of least astonishment without adding much benefit. But that's just my opinion :)

@Mayankm96
Contributor

Thank you for the feedback! I agree, I'd also like to avoid them.

For now, it is better to avoid mixing all Omniverse-related dependencies with other pip-installable ones. Keeping them separate makes it clearer where each dependency is coming from. Omniverse libraries are only available at runtime, so separating them helps a bit. We’re also actively trying to reduce these dependencies, and hopefully, at some point, we won’t need to group them separately at all :)

Would you prefer removing the separate “standard libraries” grouping and treating them all as third-party? That seems reasonable to me.
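
In isort terms, that's just deleting the extra_standard_library list; installed packages fall back into THIRDPARTY by default, so no replacement setting is needed. Something like (a sketch):

[tool.isort]
# extra_standard_library removed: numpy, torch, gymnasium, etc.
# now classify as THIRDPARTY automatically.
known_first_party = ["isaaclab"]
known_local_folder = ["config"]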

@diegoferigo-rai
Author

diegoferigo-rai commented Jan 5, 2026

> Omniverse libraries are only available at runtime

I understand your point, and I will come back to this aspect later. My personal view is that changing the order of imports is usually only beneficial in a limited set of cases, typically when circular imports occur frequently. In practice, this often points to underlying package design issues. Reordering imports can help work around the symptoms, but it does not really address the root cause.

Even if this is not problematic in this specific case, the current isort configuration merges standard library modules with pip-installable dependencies, which I find confusing from a Python user perspective. The usual and widely understood convention is to have three blocks: (1) standard library imports, (2) external dependencies declared in pyproject.toml, and (3) internal modules. This structure makes it easy to reason about dependencies and to spot missing ones, and it is what most people expect when reading Python code.
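
For example (module names are arbitrary, purely illustrative):

# (1) standard library
import os
import pathlib

# (2) external dependencies declared in pyproject.toml
import numpy as np
import torch

# (3) internal modules
import isaaclab.sim as sim_utils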

I do see the rationale for treating isaaclab* packages differently from isaacsim|omni|carb|... and similar runtime-provided modules. That said, I would personally limit the special handling to those packages only (introducing two new blocks below (2)?), and keep the rest of the import structure aligned with the standard convention. When import ordering becomes too custom or complex, it can give the impression that something is off, even when it is not, and it tends to increase maintenance overhead.
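
In ruff terms, that structure could be expressed roughly as follows (a sketch; the "omniverse" and "isaaclab" section names are hypothetical):

[tool.ruff.lint.isort]
section-order = [
    "future",
    "standard-library",
    "third-party",
    "omniverse",
    "isaaclab",
    "first-party",
    "local-folder",
]

[tool.ruff.lint.isort.sections]
# Two custom blocks below (2): runtime-provided modules first,
# then this repository's packages.
omniverse = ["isaacsim", "omni", "carb", "pxr", "warp"]
isaaclab = ["isaaclab", "isaaclab_assets", "isaaclab_rl", "isaaclab_mimic", "isaaclab_tasks"]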


Coming back to the point about Omniverse libraries being available only at runtime, this is indeed true. While there are valid reasons for this design, it is also somewhat confusing and limiting. In particular, it prevents the use of tools like pyright or other static analysis tools, since the modules are not visible to the interpreter before execution.

There is a possible workaround. When isaacsim is installed from pip into a virtual or conda environment, an internal script I developed symlinks all extensions, no matter how deeply nested, effectively exposing their modules to the Python interpreter. This does not affect or alter how isaacsim resolves or loads its modules at runtime; it only makes them visible externally to the Python tooling ecosystem. With this setup, tools like pyright can resolve the imports correctly, which enables static analysis and improves the developer experience (e.g. VSCode autocompletion without custom paths).

Here is the script.

flatten_isaacsim_extensions.py
import collections
import logging
import os
import pathlib
import site

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(pathlib.Path(__file__).name)


def find_nested_python_packages(root_path: pathlib.Path) -> set[pathlib.Path]:
    """
    Recursively find all nested Python packages within the given root path.

    A Python package is identified by the presence of an `__init__.py` file.

    Args:
        root_path: The root directory to search for packages.

    Returns:
        A set of paths to the found packages.
    """

    nested_packages = set()

    # Find all __init__.py files, sort by path length to ensure parents are processed before children.
    for p in sorted(
        root_path.rglob("**/__init__.py"), key=lambda x: (len(str(x)), str(x))
    ):
        # Ignore pip prebundle directories.
        if "pip_prebundle" in p.parts:
            logger.debug(f"Skipping pip_prebundle directory: {p}")
            continue

        # Avoid adding subpackages of already found packages.
        # Note: a proper path-based check avoids false positives that a plain
        # substring comparison would give for sibling directories with a common prefix.
        if any(p.parent.is_relative_to(pkg) for pkg in nested_packages):
            continue

        nested_packages.add(p.parent)

    return nested_packages


def get_site_packages_dir() -> pathlib.Path:
    """
    Get the site-packages directory for the current Python environment.

    Returns:
        The path to the site-packages directory.
    """

    conda_prefix = os.environ.get("CONDA_PREFIX", None)
    logger.debug(f"CONDA_PREFIX={conda_prefix}")

    if conda_prefix is None:
        raise RuntimeError("CONDA_PREFIX environment variable is not set")

    site_packages_dirs = site.getsitepackages(prefixes=[conda_prefix])

    if len(site_packages_dirs) > 1:
        raise RuntimeError(
            f"Multiple site-packages directories found: {site_packages_dirs}"
        )

    return pathlib.Path(site_packages_dirs[0]).resolve()


def find_nested_python_packages_in_extensions(
    root_path: pathlib.Path, target_packages: set[str]
) -> dict[str, set[tuple[pathlib.Path, pathlib.Path]]]:
    """
    Find all Python packages corresponding to the target packages within the given root extension path.

    Args:
        root_path: The extension root directory to search for target packages.
        target_packages: A set of target package names to look for.

    Returns:
        A dictionary of sets of paths for each target package.
        The set contains tuples of (absolute_path_to_extension, relative_path_to_package).

    Note:
        This is useful for finding nested python packages of extension-based projects
        like IsaacSim.
    """

    nested_python_packages_dict = collections.defaultdict(set)

    # Find all nested python subpackages within the root path.
    # Subpackages are the Python packages included in IsaacSim extensions.
    for python_package in sorted(
        find_nested_python_packages(root_path),
        key=lambda x: (len(str(x)), str(x)),
    ):
        for target_package in target_packages:
            # Check if the target package is in the package path.
            if target_package not in python_package.relative_to(root_path).parts:
                continue

            # Get the relative path between the root path and the Python package.
            package_relative = python_package.relative_to(root_path)
            assert sum([p == target_package for p in package_relative.parts]) == 1

            # Split the relative path into:
            # - absolute_path_to_extension: the path to the extension root.
            # - relative_path_to_package: relative path from the extension root to the target package.
            idx = package_relative.parts.index(target_package)
            absolute_path_to_extension = root_path / pathlib.Path(
                *package_relative.parts[:idx]
            )
            relative_path_to_package = pathlib.Path(*package_relative.parts[idx:])

            # Add to the dictionary the information about the found package.
            nested_python_packages_dict[target_package].add(
                (absolute_path_to_extension, relative_path_to_package)
            )

    return nested_python_packages_dict


def find_isaacsim_python_packages() -> dict[
    str, set[tuple[pathlib.Path, pathlib.Path]]
]:
    """
    Find all Python packages within IsaacSim extensions.

    Returns:
        A dictionary combining the results of searching multiple root paths for
        nested Python packages within IsaacSim extensions. The output dictionary
        matches the format of `find_nested_python_packages_in_extensions`.
    """

    # Create a flat Python package structure for the following packages
    # nested within IsaacSim extensions.
    target_packages = {"isaacsim", "omni", "carb"}

    site_packages_dir = get_site_packages_dir()
    python_packages_dict = collections.defaultdict(set)

    # Search the following root paths for extensions.
    # An extension is a nested python package (a folder containing __init__.py).
    for root_path in {
        site_packages_dir / "isaacsim" / "exts",
        site_packages_dir / "isaacsim" / "extscache",
        site_packages_dir / "isaacsim" / "kit" / "kernel" / "py",
    }:
        if not root_path.is_dir():
            logger.warning(f"Directory not found, skipping: {root_path}")
            continue

        # Find all nested python packages within the root path.
        new_python_packages_dict = find_nested_python_packages_in_extensions(
            root_path=root_path,
            target_packages=target_packages,
        )

        # Merge the new packages into the main dictionary.
        for k, v in new_python_packages_dict.items():
            python_packages_dict[k].update(v)

    return python_packages_dict


def flatten_isaacsim_extensions() -> None:
    """
    Flatten the directory structure of IsaacSim extensions.

    This function finds all nested Python packages within IsaacSim extensions
    and creates symlinks in the site-packages directory to flatten the structure.

    This is useful to expose to the Python environment the packages that are nested
    within IsaacSim extensions, allowing for easier imports and simplifying discovering
    for IDE autocompletion.
    """

    # Find all nested python packages within IsaacSim extensions.
    python_packages_dict = find_isaacsim_python_packages()

    created_symlinks = set()

    for _, package_paths in python_packages_dict.items():
        # Sort by path length to ensure parents are processed before children.
        for prefix, package_path in sorted(
            package_paths, key=lambda x: (len(f"{x[0]}/{x[1]}"), f"{x[0]}/{x[1]}")
        ):
            # The symlink destination is the site-packages directory + the relative path to the package.
            symlink = get_site_packages_dir() / package_path

            # The symlink target is the absolute path to the package within the extension.
            symlink_target = prefix / package_path

            # Create the parent directories if they don't exist.
            symlink.parent.mkdir(exist_ok=True, parents=True)

            if symlink in created_symlinks:
                logger.debug(f"Symlink already created in this run, skipping: {symlink}")
                continue

            # Remove any previously created symlink. Using is_symlink() alone also
            # catches dangling symlinks, for which exists() would return False.
            if symlink.is_symlink():
                logger.debug(f"Symlink already exists, removing: {symlink}")
                symlink.unlink()

            if symlink.exists() and symlink.is_dir():
                logger.debug(f"Directory already exists, skipping: {symlink}")
                continue

            # Create the symlink.
            # Note: it would be better to create a relative symlink, but storing symlinks of folders
            # inside other symlinked folders could create problems. Since the pixi environment is not
            # meant to be relocatable, we can use absolute symlinks for simplicity.
            assert not symlink.exists(), symlink
            symlink.symlink_to(symlink_target)

            created_symlinks.add(symlink)
            logger.info(f"Created symlink: file={symlink}")
            logger.info(f"Created symlink: dest={symlink_target}\n")


if __name__ == "__main__":
    flatten_isaacsim_extensions()
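
If it helps: with the environment activated, running python flatten_isaacsim_extensions.py once after installing isaacsim from pip should be enough. After that, pyright and VSCode resolve isaacsim, omni, and carb imports directly from site-packages, with no custom analysis paths needed.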

@Mayankm96
Contributor

Awesome! I made an issue to continue this proposal there :)

#4336
