- [2026/02/11] 🔥 MolmoSpaces code for scene conversion, grasp generation, teleoperation, and benchmark evaluation
- [2026/02/11] 🔥 Datasets for assets and scenes in MJCF and USDa format
- [2026/02/11] 🔥 Benchmark for 8 tasks, including pick, open, and close tasks, in JSONs
- [Coming Soon] 🔥 Code for scripted planners, data generation, and benchmark creation
Installing MolmoSpaces is easy!

First, set up a conda environment with Python 3.10:

```shell
conda create -n mlspaces python=3.10
conda activate mlspaces
```

Then, clone and install the project:

```shell
git clone git@github.com:allenai/molmospaces.git
cd molmospaces
```

You can use either uv or pip; for example, with uv:

```shell
uv pip install -e .[dev,grasp]
```

The installation options are:
- `dev` installs dependencies for code development
- `grasp` installs dependencies for the grasp generation pipeline
- `housegen` installs dependencies for the house generation pipeline from iTHOR, ProcTHOR, or Holodeck JSONs
You may wish to set some environment variables to configure behavior. Environment variables beginning with the `MLSPACES` prefix customize MolmoSpaces:
| Environment Variable | Effect | Default |
|---|---|---|
| `MLSPACES_ASSETS_DIR` | Where to place downloaded assets | `../assets` relative to the `molmo-spaces` directory |
| `MLSPACES_AUTO_INSTALL` | Update assets without prompting | `True` |
| `MLSPACES_FORCE_INSTALL` | Override existing assets | `True` |
| `MLSPACES_PINNED_ASSETS_FILE` | A `.json` file containing pinned versions for each asset, used to override the versions specified in `molmo_spaces_constants.py` | |
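To illustrate how such variables are typically consumed, here is a minimal sketch of prefix-based configuration lookup with defaults. The helper name and its fallback behavior are assumptions for illustration, not the actual MolmoSpaces implementation:

```python
import os

def get_mlspaces_setting(name: str, default: str) -> str:
    """Hypothetical helper: read an MLSPACES_* variable, falling back to a default."""
    return os.environ.get(f"MLSPACES_{name}", default)

# With MLSPACES_ASSETS_DIR unset, the relative default is used.
os.environ.pop("MLSPACES_ASSETS_DIR", None)
assets_dir = get_mlspaces_setting("ASSETS_DIR", "../assets")

# Setting the variable overrides the default.
os.environ["MLSPACES_ASSETS_DIR"] = "/data/mlspaces_assets"
assets_dir_override = get_mlspaces_setting("ASSETS_DIR", "../assets")
```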
MolmoSpaces provides scenes, objects, robots, and benchmarks. These can be downloaded with an asset manager that automatically fetches and version-controls asset dependencies.
A number of assets are provided; this overview explains the naming of the assets in code:
| Type | Code Name | Paper Name | Description | Size |
|---|---|---|---|---|
| objects | thor | | hand-crafted kitchen assets | ~1.1k |
| objects | objaverse | | converted Objaverse assets | ~130k |
| scenes | ithor | MSCrafted | hand-crafted, many articulated assets | |
| scenes | procthor-10k | MSProc | procedurally generated with THOR assets | |
| scenes | procthor-objaverse | MSProcObja | procedurally generated with Objaverse assets | |
| scenes | holodeck | MSMultiType | LLM-generated with Objaverse assets | |
| benchmark | | MS-Bench v1 | base benchmark for atomic tasks | |
**Scene downloading.** Assuming we have exported a convenient `MLSPACES_ASSETS_DIR`, we can install our first scene:

```python
from molmo_spaces.utils.lazy_loading_utils import install_scene_with_objects_and_grasps_from_path
from molmo_spaces.molmo_spaces_constants import get_scenes

install_scene_with_objects_and_grasps_from_path(get_scenes("ithor", "train")["train"][1])
```

and view it with:

```shell
python -m mujoco.viewer --mjcf $MLSPACES_ASSETS_DIR/scenes/ithor/FloorPlan1_physics.xml
```

**Object downloading.** All thor objects are downloaded, extracted, and symlinked upon instantiation of the resource manager. If we want to download some asset of, e.g., category "apple", we can do so like:
```python
import random
from pprint import pprint

from molmo_spaces.utils.object_metadata import ObjectMeta
from molmo_spaces.utils.lazy_loading_utils import install_uid

annotation = ObjectMeta.annotation()
# We exclude thor assets, which are already installed.
apple_annotations = [
    anno for anno in annotation.values()
    if "apple" in anno["category"].lower() and anno["isObjaverse"]
]
random_apple = random.choice(apple_annotations)
print("Object annotation:")
pprint(random_apple)
apple_model_path = install_uid(random_apple["assetId"])
print(f"Object downloaded and symlinked to {apple_model_path}")
```

Please refer to this README.md for instructions on how to set up and use the MolmoSpaces assets in IsaacSim.
Please refer to this README.md for instructions on how to set up and use the MolmoSpaces assets in ManiSkill.
The pinned assets file should have the same structure as `DATA_TYPE_TO_SOURCE_TO_VERSION` in `molmo_spaces_constants.py`. For example:

```json
{
  "robots": {
    "franka_droid": "20260127"
  },
  "scenes": {
    "ithor": "20251217"
  }
}
```

Currently, installing and running the benchmark is only supported in the MuJoCo simulator.
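A pinned-assets file with the structure shown above can also be written from Python; a minimal sketch (the file path and version strings are illustrative):

```python
import json
import os
import tempfile

# Illustrative pinned versions, mirroring the DATA_TYPE_TO_SOURCE_TO_VERSION layout.
pinned = {
    "robots": {"franka_droid": "20260127"},
    "scenes": {"ithor": "20251217"},
}

path = os.path.join(tempfile.gettempdir(), "pinned_assets.json")
with open(path, "w") as f:
    json.dump(pinned, f, indent=2)

# Point MolmoSpaces at the file via the environment variable described earlier.
os.environ["MLSPACES_PINNED_ASSETS_FILE"] = path

# Round-trip check: the file parses back to the same mapping.
with open(path) as f:
    loaded = json.load(f)
```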
```shell
export MLSPACES_ASSETS_DIR=/path/to/symlink/resources
python -m molmo_spaces.molmo_spaces_constants
```

```shell
python molmo_spaces/evaluation/eval_main.py \
    molmo_spaces.evaluation.configs.evaluation_configs:PiPolicyEvalConfig \
    --benchmark_dir assets/benchmarks/path-to-benchmark/ \
    --task_horizon_steps 500
```

For more information, please refer to the instructions in benchmark.
To control a robot via phone-based teleoperation, do the following (currently only iPhones are supported):

- Install TeleDex from the App Store (see here).
- Run the datagen pipeline with the teleop policy:

  ```shell
  python molmo_spaces/evaluation/eval_main.py \
      molmo_spaces.evaluation.configs.evaluation_configs:TeleopPolicyEvalConfig \
      --benchmark_dir assets/bench/path-to-benchmark.json \
      --task_horizon_steps 1000
  ```

- Ensure your phone and the machine running the pipeline are connected to the same network.
- Scan the QR code that shows up using the app (or manually enter the ip:port). Example terminal output:

  ```
  TeleDex Session Starting on port 8888...
  Session Started. Details:
  IP Address: xxx.xxx.xx.xxx
  Port: 8888
  Waiting for a device to connect...
  ```

- Start teleoperating!
- Click the toggle to grasp.
- Click the button to go to the next episode.
Before committing, ensure your code is formatted:

```shell
ruff format .
```

To generate type stubs for mujoco and open3d and save them in the `typings` folder:

```shell
pybind11-stubgen mujoco -o ./typings/
```

- Documentation for the viewer can be found here; there are many keyboard shortcuts.
- If you have red boxes on top of your objects, go to the left panel and toggle Group Enable > Site groups > Site 0.
- Interact with objects by double-clicking, then Ctrl + right mouse drag (only with active viewers, not passive ones).
- Robot base conventions: +x=forward, +y=left, +z=up
- Robot parallel-jaw gripper conventions: +z=forward, fingers open along the y axis
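To make the base convention concrete, a small sketch (pure Python, no MolmoSpaces APIs) of where the base's forward axis points after a yaw rotation about +z:

```python
import math

def base_forward(yaw_rad: float) -> tuple[float, float, float]:
    """Forward direction of a robot base after yawing about +z.

    Under the convention +x=forward, +y=left, +z=up, a positive yaw
    turns the base to the left.
    """
    return (math.cos(yaw_rad), math.sin(yaw_rad), 0.0)

# yaw = 0 keeps the base pointing along +x (forward).
fwd0 = base_forward(0.0)
# A +90-degree yaw rotates the forward axis onto +y (left).
fwd90 = base_forward(math.pi / 2)
```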
The codebase is licensed under Apache 2.0. The public MolmoSpaces data endpoint is available here. The public MolmoSpaces Isaac data endpoint is available here. The Objaverse subsets in these buckets are licensed under ODC-BY 1.0. All other data subsets are licensed under CC BY 4.0. The artifacts are intended for research and educational use in accordance with Ai2's Responsible Use Guidelines.
The XML files have been modified from the original versions provided by the following sources:
- mujoco_menagerie / franka_fr3 - Developed by Franka Robotics
- mujoco_menagerie / robotiq_2f85_v4 - Copyright (c) 2013, ROS-Industrial
- Rainbow Robotics / rby1-sdk - Copyright 2024-2025 Rainbow Robotics
- CAP Gripper - Copyright (c) 2026 NYU Generalizable Robotics and AI Lab (GRAIL)
```bibtex
@misc{molmospaces2026,
  title={MolmoSpaces: A Large-Scale Open Ecosystem for Robot Navigation and Manipulation},
  author={Yejin Kim and Wilbert Pumacay and Omar Rayyan and Max Argus and Winson Han and Eli VanderBilt and Jordi Salvador and Abhay Deshpande and Rose Hendrix and Snehal Jauhri and Shuo Liu and Nur Muhammad Mahi Shafiullah and Maya Guru and Arjun Guru and Ainaz Eftekhar and Karen Farley and Donovan Clay and Jiafei Duan and Piper Wolters and Alvaro Herrasti and Ying-Chun Lee and Georgia Chalvatzaki and Yuchen Cui and Ali Farhadi and Dieter Fox and Ranjay Krishna},
  year={2026},
}
```


