## Repository Summary
| Field | Value |
|---|---|
| Checkout URI | https://github.com/metadriverse/metadrive.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-03-20 |
| Dev Status | UNMAINTAINED |
| CI Status | No Continuous Integration |
| Released | UNRELEASED |
| Tags | No category tags. |
| Contributing | Help Wanted (0), Good First Issues (0), Pull Requests to Review (0) |
## Packages
| Name | Version |
|---|---|
| metadrive_example_bridge | 0.0.0 |
## README
# MetaDrive: an Open-source Driving Simulator for AI and Autonomy Research
MetaDrive is a driving simulator with the following key features:
- Compositional: It supports synthesizing infinite scenes with various road maps and traffic settings, or loading real-world driving logs, for research on generalizable RL (see the configuration sketch after this list).
- Lightweight: It is easy to install and run on Linux/Windows/macOS with sensor simulation support. It can run at 1,000+ FPS on a standard PC.
- Realistic: Accurate physics simulation and multiple sensory inputs, including point cloud, RGB/Depth/Semantic images, top-down semantic map, and first-person view images.
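Scene composition is driven by the environment config dictionary. The sketch below is illustrative only: the config keys used here (`num_scenarios`, `start_seed`, `map`, `traffic_density`) are assumptions meant to mirror the documented config system, so check the documentation for the authoritative names.

```python
# A hedged sketch of procedural scene composition via the environment config.
# The config keys used here are assumptions; consult the MetaDrive docs.
from metadrive.envs.metadrive_env import MetaDriveEnv

env = MetaDriveEnv(config={
    "num_scenarios": 100,    # how many procedurally generated scenarios to cycle through
    "start_seed": 0,         # seed of the first scenario
    "map": 4,                # number of road blocks per generated map
    "traffic_density": 0.2,  # density of traffic vehicles
})
obs, info = env.reset()      # each reset loads one scenario from the set
env.close()
```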
## Quick Start
Install MetaDrive via:
```bash
git clone https://github.com/metadriverse/metadrive.git
cd metadrive
pip install -e .
```
You can verify the installation of MetaDrive by running the test script:
```bash
# Go to a folder that has no sub-folder named metadrive
python -m metadrive.examples.profile_metadrive
```
Note: do not run the above command in a folder that contains a sub-folder called `./metadrive`.
## Examples
We provide examples to demonstrate the features and basic usage of MetaDrive after local installation.
There is an .ipynb example that can be opened directly in Colab. You can also try the examples from the documentation directly in Colab; see more details in the Documentation section.
### Single Agent Environment
Run the following command to launch a simple driving scenario with auto-drive mode on. Press W, A, S, D to drive the vehicle manually.
```bash
python -m metadrive.examples.drive_in_single_agent_env
```
Run the following command to launch a safe driving scenario, which includes more complex obstacles and yields a cost signal.
```bash
python -m metadrive.examples.drive_in_safe_metadrive_env
```
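The same kind of scenario can also be set up from a script. Below is a hedged sketch that assumes the `manual_control` and `use_render` config keys documented by MetaDrive; with manual control enabled, keyboard input overrides the programmatic action.

```python
# A minimal sketch of a manually controlled single-agent environment.
# "manual_control" and "use_render" are assumed config keys; verify in the docs.
from metadrive.envs.metadrive_env import MetaDriveEnv

env = MetaDriveEnv(config={"use_render": True, "manual_control": True})
obs, info = env.reset()
for _ in range(1000):
    # With manual_control enabled, W/A/S/D input overrides this placeholder action.
    obs, reward, terminated, truncated, info = env.step([0.0, 0.0])
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```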
### Multi-Agent Environment
You can also launch an instance of a multi-agent scenario as follows:
```bash
python -m metadrive.examples.drive_in_multi_agent_env --env roundabout
```
`--env` accepts the following parameters: `roundabout` (default), `intersection`, `tollgate`, `bottleneck`, `parkinglot`, `pgmap`.
Adding `--top_down` launches the top-down pygame renderer.
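In a script, the multi-agent environments use a dictionary-based interface keyed by agent name. The following is a hedged sketch: the `MultiAgentRoundaboutEnv` class path, the `num_agents` config key, and the RLlib-style `__all__` termination flag are assumptions based on the MetaDrive documentation.

```python
# A hedged sketch of the multi-agent API; class path, "num_agents", and the
# "__all__" termination key are assumptions based on the MetaDrive docs.
from metadrive.envs.marl_envs import MultiAgentRoundaboutEnv

env = MultiAgentRoundaboutEnv(config={"num_agents": 4})
obs, info = env.reset()
for _ in range(100):
    # Sample one random action for every currently active agent.
    actions = {agent_id: env.action_space[agent_id].sample() for agent_id in obs}
    obs, rewards, terminateds, truncateds, infos = env.step(actions)
    if terminateds.get("__all__", False):
        break
env.close()
```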
### Real Environment
Running the following script enables driving in scenarios constructed from the nuScenes or Waymo dataset.
```bash
python -m metadrive.examples.drive_in_real_env
```
The default real-world dataset is nuScenes. Use `--waymo` to visualize Waymo scenarios.
Traffic vehicles cannot respond to surrounding vehicles if they are directly replayed. Add the argument `--reactive_traffic` to control them with an IDM policy and make them reactive.
Press `r` to load a new scenario, and `b` or `q` to switch the perspective.
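Programmatically, the same replay/reactive behavior is exposed through the scenario environment's config. This is a hedged sketch only: the `ScenarioEnv` class and the `reactive_traffic` config key are assumptions, and a real run typically needs a converted dataset (the packaged sample data is assumed here when no dataset path is given).

```python
# A hedged sketch of driving in logged real-world scenarios.
# ScenarioEnv and "reactive_traffic" are assumptions; check the docs for the
# exact class name, config keys, and how to point at a converted dataset.
from metadrive.envs.scenario_env import ScenarioEnv

env = ScenarioEnv(config={
    "reactive_traffic": True,  # control replayed traffic with an IDM policy
    "use_render": True,
})
obs, info = env.reset()
for _ in range(300):
    obs, reward, terminated, truncated, info = env.step([0.0, 1.0])
    if terminated or truncated:
        obs, info = env.reset()  # loads another logged scenario
env.close()
```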
### Basic Usage
To build an RL environment in a Python script, you can simply follow the Farama Gymnasium format:
```python
from metadrive.envs.metadrive_env import MetaDriveEnv

env = MetaDriveEnv(config={"use_render": True})
obs, info = env.reset()
for i in range(1000):
    obs, reward, terminated, truncated, info = env.step(env.action_space.sample())
    if terminated or truncated:
        env.reset()
env.close()
```
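Beyond the 3D window, the scene can also be drawn with the lightweight 2D renderer. A hedged sketch follows, assuming `env.render` accepts `mode="topdown"` as described in the MetaDrive rendering documentation.

```python
# A hedged sketch of the top-down (pygame) renderer; the mode="topdown"
# argument is assumed from the MetaDrive rendering docs.
from metadrive.envs.metadrive_env import MetaDriveEnv

env = MetaDriveEnv(config={"use_render": False})
obs, info = env.reset()
for _ in range(100):
    obs, reward, terminated, truncated, info = env.step([0.0, 1.0])
    env.render(mode="topdown")  # draw the current scene in a 2D top-down view
    if terminated or truncated:
        break
env.close()
```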
## Documentation
Please find more details in: https://metadrive-simulator.readthedocs.io
### Running Examples in the Documentation
The documentation is built with .ipynb files, so every example can run locally or in Colab. To run in Colab, click "GitHub" on the Colab interface, enter the URL of MetaDrive (https://github.com/metadriverse/metadrive), and hit the search icon. After running the examples, you are expected to get the same output and visualization results as the documentation! For example, clicking the Colab icon in the documentation section Environments opens its source .ipynb file.
## References
If you use MetaDrive in your own work, please cite:
```latex
@article{li2022metadrive,
  title={Metadrive: Composing diverse driving scenarios for generalizable reinforcement learning},
  author={Li, Quanyi and Peng, Zhenghao and Feng, Lan and Zhang, Qihang and Xue, Zhenghai and Zhou, Bolei},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2022}
}
```
## Acknowledgement
The simulator could not be built without help from the Panda3D community and the following open-source projects:
- panda3d-simplepbr: https://github.com/Moguri/panda3d-simplepbr
- panda3d-gltf: https://github.com/Moguri/panda3d-gltf
- RenderPipeline (RP): https://github.com/tobspr/RenderPipeline
- Water effect for RP: https://github.com/kergalym/RenderPipeline
- procedural_panda3d_model_primitives: https://github.com/Epihaius/procedural_panda3d_model_primitives
- DiamondSquare for terrain generation: https://github.com/buckinha/DiamondSquare
- KITSUNETSUKI-Asset-Tools: https://github.com/kitsune-ONE-team/KITSUNETSUKI-Asset-Tools