Store black configuration in pyproject.toml and enhance pyright support
#3619
Conversation
Signed-off-by: Diego Ferigo <[email protected]>
Signed-off-by: Diego Ferigo <[email protected]>
@ClemensSchwarke Following up here on our discussion to switch to ruff entirely :)
I'd be super happy about a
Hey @diegoferigo-rai, I think it would be super nice if you could add the ruff configuration for the parts you are interested in. I can take care of the remaining integration.
The transition itself seems doable. However, it would be helpful if someone could provide context on why the default isort behavior or ruff's built-in linter isn't sufficient for this project. From my perspective, many of these custom rules seem like a workaround for other underlying issues rather than something inherently required. (See lines 10 to 64 in f52aa98.)
I am not sure of the reasons for this isort setup (maybe @Mayankm96 can help), but I think it should be fairly easy to have a similar configuration for ruff.
I think back then isort was sorting them alphabetically (I don't remember 100%). I wanted to group the imports based on their types (as expressed in the list) to make it easier to read through the imports.
Thanks @Mayankm96 for handling this! Using black for formatting was straightforward, but porting the isort rules felt too time-consuming, so I kept putting it off. Really happy to see it finally landing upstream! I still personally feel that custom isort rules can do more harm than good. The Python community is used to the three import blocks: standard library, external libraries, and internal modules. Custom rules tend to break the principle of least astonishment without adding much benefit. But that's just my opinion :)
Thank you for the feedback! I agree, I'd also like to avoid them. For now, it is better to avoid mixing all Omniverse-related dependencies with other pip-installable ones; keeping them separate makes it clearer where each dependency is coming from. Omniverse libraries are only available at runtime, so separating them helps a bit. We're also actively trying to reduce these dependencies, and hopefully, at some point, we won't need to group them separately at all :) Would you prefer removing the separate "standard libraries" grouping and treating them all as third-party? That seems reasonable to me.
I understand your point, and I will come back to this aspect later. My personal view is that changing the order of imports is usually only beneficial in a limited set of cases, typically when circular imports occur frequently. In practice, this often points to underlying package design issues. Reordering imports can help work around the symptoms, but it does not really address the root cause.

Even if this is not problematic in this specific case, the current isort configuration merges standard library modules with pip-installable dependencies, which I find confusing from a Python user perspective. The usual and widely understood convention is to have three blocks: (1) standard library imports, (2) external dependencies, and (3) internal first-party modules. I do see the rationale for treating the Omniverse-related dependencies separately, though.

Coming back to the point about Omniverse libraries being available only at runtime, this is indeed true. While there are valid reasons for this design, it is also somewhat confusing and limiting. In particular, it prevents the use of static analysis tools like pyright out of the box.

There is a possible workaround. If isaacsim is installed from pip in a virtual environment or conda environment, I developed an internal script that symlinks all extensions, regardless of how deeply nested they are, effectively exposing their modules to the Python interpreter. This does not affect or alter how isaacsim resolves or loads its modules at runtime; it only makes them visible externally to the Python tooling ecosystem. With this setup, tools like pyright can resolve the imports correctly, which enables static analysis and improves the developer experience (e.g. VSCode autocompletion without custom paths). Here below the script.

flatten_isaacsim_extensions.py

```python
import collections
import logging
import os
import pathlib
import site

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(pathlib.Path(__file__).name)


def find_nested_python_packages(root_path: pathlib.Path) -> set[pathlib.Path]:
    """
    Recursively find all nested Python packages within the given root path.

    A Python package is identified by the presence of an `__init__.py` file.

    Args:
        root_path: The root directory to search for packages.

    Returns:
        A set of paths to the found packages.
    """
    nested_packages = set()

    # Find all __init__.py files, sorted by path length to ensure parents
    # are processed before children.
    for p in sorted(
        root_path.rglob("**/__init__.py"), key=lambda x: (len(str(x)), str(x))
    ):
        # Ignore pip prebundle directories.
        if "pip_prebundle" in p.parts:
            logger.debug(f"Skipping pip_prebundle directory: {p}")
            continue

        # Avoid adding subpackages of already found packages.
        # Note: checking the parents is more robust than a plain substring
        # match, which could wrongly match sibling folders sharing a prefix.
        if any(pkg in p.parents for pkg in nested_packages):
            continue

        nested_packages.add(p.parent)

    return nested_packages


def get_site_packages_dir() -> pathlib.Path:
    """
    Get the site-packages directory for the current Python environment.

    Returns:
        The path to the site-packages directory.
    """
    conda_prefix = os.environ.get("CONDA_PREFIX", None)
    logger.debug(f"CONDA_PREFIX={conda_prefix}")

    if conda_prefix is None:
        raise RuntimeError("CONDA_PREFIX environment variable is not set")

    site_packages_dirs = site.getsitepackages(prefixes=[conda_prefix])

    if len(site_packages_dirs) > 1:
        raise RuntimeError(
            f"Multiple site-packages directories found: {site_packages_dirs}"
        )

    return pathlib.Path(site_packages_dirs[0]).resolve()


def find_nested_python_packages_in_extensions(
    root_path: pathlib.Path, target_packages: set[str]
) -> dict[str, set[tuple[pathlib.Path, pathlib.Path]]]:
    """
    Find all Python packages corresponding to the target packages within the
    given root extension path.

    Args:
        root_path: The extension root directory to search for target packages.
        target_packages: A set of target package names to look for.

    Returns:
        A dictionary of sets of paths for each target package.
        The set contains tuples of (absolute_path_to_extension, relative_path_to_package).

    Note:
        This is useful for finding nested Python packages of extension-based
        projects like IsaacSim.
    """
    nested_python_packages_dict = collections.defaultdict(set)

    # Find all nested Python subpackages within the root path.
    # Subpackages are the Python packages included in IsaacSim extensions.
    for python_package in sorted(
        find_nested_python_packages(root_path),
        key=lambda x: (len(str(x)), str(x)),
    ):
        for target_package in target_packages:
            # Check if the target package is in the package path.
            if target_package not in python_package.relative_to(root_path).parts:
                continue

            # Get the relative path between the root path and the Python package.
            package_relative = python_package.relative_to(root_path)
            assert sum(p == target_package for p in package_relative.parts) == 1

            # Split the relative path into:
            # - absolute_path_to_extension: the path to the extension root.
            # - relative_path_to_package: relative path from the extension root
            #   to the target package.
            idx = package_relative.parts.index(target_package)
            absolute_path_to_extension = root_path / pathlib.Path(
                *package_relative.parts[:idx]
            )
            relative_path_to_package = pathlib.Path(*package_relative.parts[idx:])

            # Add to the dictionary the information about the found package.
            nested_python_packages_dict[target_package].add(
                (absolute_path_to_extension, relative_path_to_package)
            )

    return nested_python_packages_dict


def find_isaacsim_python_packages() -> dict[
    str, set[tuple[pathlib.Path, pathlib.Path]]
]:
    """
    Find all Python packages within IsaacSim extensions.

    Returns:
        A dictionary combining the results of searching multiple root paths for
        nested Python packages within IsaacSim extensions. The output dictionary
        matches the format of `find_nested_python_packages_in_extensions`.
    """
    # Create a flat Python package structure for the following packages
    # nested within IsaacSim extensions.
    target_packages = {"isaacsim", "omni", "carb"}
    site_packages_dir = get_site_packages_dir()
    python_packages_dict = collections.defaultdict(set)

    # Search the following root paths for extensions.
    # An extension is a nested Python package (a folder containing __init__.py).
    for root_path in {
        site_packages_dir / "isaacsim" / "exts",
        site_packages_dir / "isaacsim" / "extscache",
        site_packages_dir / "isaacsim" / "kit" / "kernel" / "py",
    }:
        if not root_path.is_dir():
            logger.warning(f"Directory not found, skipping: {root_path}")
            continue

        # Find all nested Python packages within the root path.
        new_python_packages_dict = find_nested_python_packages_in_extensions(
            root_path=root_path,
            target_packages=target_packages,
        )

        # Merge the new packages into the main dictionary.
        for k, v in new_python_packages_dict.items():
            python_packages_dict[k].update(v)

    return python_packages_dict


def flatten_isaacsim_extensions() -> None:
    """
    Flatten the directory structure of IsaacSim extensions.

    This function finds all nested Python packages within IsaacSim extensions
    and creates symlinks in the site-packages directory to flatten the structure.
    This is useful to expose to the Python environment the packages that are
    nested within IsaacSim extensions, allowing for easier imports and
    simplifying discovery for IDE autocompletion.
    """
    # Find all nested Python packages within IsaacSim extensions.
    python_packages_dict = find_isaacsim_python_packages()

    created_symlinks = set()
    site_packages_dir = get_site_packages_dir()

    for package_paths in python_packages_dict.values():
        # Sort by path length to ensure parents are processed before children.
        for prefix, package_path in sorted(
            package_paths, key=lambda x: (len(f"{x[0]}/{x[1]}"), f"{x[0]}/{x[1]}")
        ):
            # The symlink destination is the site-packages directory plus the
            # relative path to the package.
            symlink = site_packages_dir / package_path

            # The symlink target is the absolute path to the package within
            # the extension.
            symlink_target = prefix / package_path

            # Create the parent directories if they don't exist.
            symlink.parent.mkdir(exist_ok=True, parents=True)

            if symlink in created_symlinks:
                logger.debug(f"Symlink already created in this run, skipping: {symlink}")
                continue

            if symlink.exists() and symlink.is_symlink():
                logger.debug(f"Symlink already exists, removing: {symlink}")
                symlink.unlink()

            if symlink.exists() and symlink.is_dir():
                logger.debug(f"Directory already exists, skipping: {symlink}")
                continue

            # Create the symlink.
            # Note: it would be better to create a relative symlink, but storing
            # symlinks of folders inside other symlinked folders could create
            # problems. Since the pixi environment is not meant to be
            # relocatable, we can use absolute symlinks for simplicity.
            assert not symlink.exists(), symlink
            symlink.symlink_to(symlink_target)
            created_symlinks.add(symlink)

            logger.info(f"Created symlink: file={symlink}")
            logger.info(f"Created symlink: dest={symlink_target}\n")


if __name__ == "__main__":
    flatten_isaacsim_extensions()
```
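The parent-before-child pruning idea used by the script can be shown in miniature. This is a self-contained sketch (the paths are illustrative, not real IsaacSim extensions): sorting candidate `__init__.py` paths by length guarantees a parent package is registered before its subpackages, so the subpackages can then be skipped by checking the ancestors.

```python
import pathlib

# Illustrative candidate __init__.py files found by a recursive glob.
candidates = [
    pathlib.Path("exts/foo/omni/isaac/core/__init__.py"),
    pathlib.Path("exts/foo/omni/__init__.py"),
    pathlib.Path("exts/foo/omni/isaac/__init__.py"),
]

packages: set[pathlib.Path] = set()

# Shorter paths first, so parent packages are processed before children.
for p in sorted(candidates, key=lambda x: (len(str(x)), str(x))):
    # Skip anything nested under an already-registered package.
    if any(pkg in p.parents for pkg in packages):
        continue
    packages.add(p.parent)

print(packages)  # only the top-level package exts/foo/omni remains
```

Using `p.parents` membership (rather than a string `in` check) avoids false positives on sibling directories that merely share a name prefix.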
Awesome! I made an issue to continue this proposal there :)
Description
This PR updates the `pyproject.toml` configuration to improve compatibility with tooling such as `black` and `pyright`. The changes affect both standalone usage of these tools and their integration when invoked from IDEs like VSCode.

This allows users who are not interested in autogenerating the `settings.json` file to have their IDEs configured out of the box.

This approach could also be extended to other tools like flake8, pylint, etc.
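As an illustrative sketch of what storing such tool settings in `pyproject.toml` looks like (the values below are placeholders, not the exact diff of this PR):

```toml
# Formatting options picked up by black when run standalone or from an IDE.
[tool.black]
line-length = 120
target-version = ["py310"]

# Static analysis options picked up by pyright (and by Pylance in VSCode).
[tool.pyright]
include = ["source"]
typeCheckingMode = "basic"
reportMissingImports = "none"
```

Because both tools read `[tool.*]` tables from `pyproject.toml` automatically, no per-user `settings.json` is required for consistent behavior.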
Note

This PR targets `feature/newton`, which is the branch I'm currently interested in. It can be backported to `main` if needed.

Type of change
Checklist

- I have run the `pre-commit` checks with `./isaaclab.sh --format`
- I have updated the `config/extension.toml` file
- I have added my name to `CONTRIBUTORS.md`, or my name already exists there