
The missing infrastructure for Physical AI post-training in AD. Open-source. Production-validated.


Joint effort by OpenDriveLab at The University of Hong Kong, Huawei Inc. and Shanghai Innovation Institute (SII).

Highlights

  • A post-training framework for Physical AI: Systematically addresses the long-tail safety-critical data scarcity problem in autonomous driving.
  • Data-driven long-tail discovery: Failure-prone scenarios are automatically identified from real-world driving logs by the pre-trained agent itself; no manual design, no synthetic perturbations.
  • Photorealistic interactive simulation via 3D Gaussian Splatting (3DGS): Each discovered scenario is reconstructed into a fully controllable, real-time-renderable simulation environment.
  • Behavior-driven scenario generation: Leverages Behavior World Model (BWM) to generalize and synthesize diverse traffic variations from long-tail scenarios, expanding sparse safety-critical events into a dense, learnable distribution.
  • RL-based post-training on safety-critical rollouts substantially outperforms scaling pre-training data alone, and is competitive with a ~10x increase in pre-training data.
  • Production-scale validation: Deployed on a mass-produced ADAS platform trained on 80,000+ hours of driving logs, reducing collision rate by up to 45.5% and achieving zero disengagements in a 200 km on-road test.

News

  • [2026/04/09] Official data release.

📦 Dataset Overview

This dataset uses a modular data structure where each subsystem (AlgEngine, SimEngine) has its own data requirements while sharing common formats.

| Module | Function | Data Types |
|---|---|---|
| Raw Data | nuPlan & OpenScene base datasets | Sensor data, maps, annotations |
| AlgEngine | End-to-end model training & evaluation | Preprocessed annotations, ckpts, caches |
| SimEngine | Closed-loop simulation environments | Scene assets, config files |
WorldEngine/
└── data/                          # Main data directory
    ├── raw/                       # Raw datasets (nuPlan, OpenScene)
    ├── alg_engine/                # AlgEngine-specific data
    └── sim_engine/                # SimEngine-specific data
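As a quick sanity check of this layout, a small shell helper can report which of the three top-level data directories are present. `check_layout` is a hypothetical helper written for this card, not a script shipped with the repository; it assumes only the `data/` tree shown above.

```shell
# Hypothetical helper: report which top-level data directories exist
# under a given WorldEngine root (layout as shown above).
check_layout() {
  root="$1"
  for d in raw alg_engine sim_engine; do
    if [ -d "$root/data/$d" ]; then
      echo "ok: data/$d"
    else
      echo "missing: data/$d"
    fi
  done
}
```

Usage: `check_layout /path/to/WorldEngine`.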

📂 Directory Structure

1๏ธโƒฃ Raw Data (data/raw/)


After downloading the nuPlan and OpenScene raw datasets, set up the following structure via symlinks (ln -s):

data/raw/
├── nuplan/                        # nuPlan raw dataset
│   └── dataset/
│       ├── maps/                  # HD maps (required for all modules)
│       │   ├── nuplan-maps-v1.0.json
│       │   ├── us-nv-las-vegas-strip/
│       │   ├── us-ma-boston/
│       │   ├── us-pa-pittsburgh-hazelwood/
│       │   └── sg-one-north/
│       └── nuplan-v1.1/
│           ├── sensor_blobs/      # Camera images and LiDAR
│           └── splits/            # Train/val/test splits
│
└── openscene-v1.1/                # OpenScene dataset (nuPlan-based)
    ├── sensor_blobs/
    │   ├── trainval/              # Training sensor data
    │   └── test/                  # Test sensor data
    └── meta_datas/
        ├── trainval/              # Training metadata
        └── test/                  # Test metadata

2๏ธโƒฃ AlgEngine Data (data/alg_engine/)


Data for end-to-end model training and evaluation:

data/alg_engine/
├── openscene-synthetic/           # Synthetic data generated by SimEngine (needs to be generated)
│   ├── sensor_blobs/
│   ├── meta_datas/
│   └── pdms_pkl/
│
├── ckpts/                         # Pre-trained model checkpoints
│   ├── bevformerv2-r50-t1-base_epoch_48.pth
│   ├── e2e_vadv2_50pct_ep8.pth
│   ├── track_map_nuplan_r50_navtrain_100pct_bs1x8.pth
│   └── track_map_nuplan_r50_navtrain_50pct_bs1x8.pth
│
├── pdms_cache/                    # Pre-computed PDM metric caches
│   ├── pdm_8192_gt_cache_navtest.pkl
│   └── pdm_8192_gt_cache_navtrain.pkl
│
├── merged_infos_navformer/        # Preprocessed annotations
│   ├── nuplan_openscene_navtest.pkl
│   └── nuplan_openscene_navtrain.pkl
│
└── test_8192_kmeans.npy           # K-means clustering for PDM

3๏ธโƒฃ SimEngine Data (data/sim_engine/)


Data for closed-loop simulation:

data/sim_engine/
├── assets/                        # Scene assets for simulation
│   ├── navtest/
│   │   ├── assets/
│   │   └── configs/
│   ├── navtrain/
│   └── navtest_failures/
│
└── scenarios/                     # Scenario configurations
    ├── original/                  # Original logged scenarios
    │   ├── navtest_failures/
    │   ├── navtrain_50pct_collision/
    │   ├── navtrain_ep_per1/
    │   ├── navtrain_failures_per1/
    │   └── navtrain_hydramdp_failures/
    │
    └── augmented/                 # Augmented scenarios (from BWM)
        ├── navtrain_50pct_collision/
        ├── navtrain_50pct_ep_1pct/
        └── navtrain_50pct_offroad/

โš™๏ธ Environment Setup

Configure the following environment variables for proper data access:

Quick Configuration

# Add to ~/.bashrc or ~/.zshrc
export WORLDENGINE_ROOT="/path/to/WorldEngine"
export NUPLAN_MAPS_ROOT="${WORLDENGINE_ROOT}/data/raw/nuplan/maps"
export PYTHONPATH=$WORLDENGINE_ROOT:$PYTHONPATH

Apply Changes

source ~/.bashrc  # or source ~/.zshrc

💡 Tip: After adding the above to your shell config file, these environment variables will be loaded automatically every time you open a new terminal.
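To confirm the variables are set and point at real directories, a small check can help. `check_env` is a hypothetical helper written for this card, not part of the repository:

```shell
# Hypothetical helper: verify each required environment variable is set
# and points at an existing directory.
check_env() {
  for var in WORLDENGINE_ROOT NUPLAN_MAPS_ROOT; do
    eval "val=\${$var}"
    if [ -z "$val" ]; then
      echo "unset: $var"
    elif [ ! -d "$val" ]; then
      echo "not a directory: $var=$val"
    else
      echo "ok: $var"
    fi
  done
}
```

Usage: run `check_env` in a fresh terminal; every line should read `ok: ...` before proceeding.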


📖 Usage

Quick Start

Follow these steps to set up the dataset:

| Step | Action | Description |
|---|---|---|
| 1 | Download dataset | Use the Hugging Face Hub or `git clone` |
| 2 | Extract scene assets | Extract the split archives in data/sim_engine/assets/ |
| 3 | Set environment variables | Configure WORLDENGINE_ROOT and related paths |
| 4 | Create symlinks | Link raw datasets (if needed) |
| 5 | Verify installation | Run the quick test script |
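Step 2 refers to split archives under data/sim_engine/assets/. A generic extraction sketch follows; `extract_split` is a hypothetical helper, and the `"$prefix"*` part naming is an assumption, so adjust it to the actual file names in the release:

```shell
# Hypothetical helper: concatenate split archive parts (in glob order)
# and extract the reassembled tarball in place.
extract_split() {
  dir="$1"      # directory containing the archive parts
  prefix="$2"   # common filename prefix of the parts (assumed naming)
  ( cd "$dir" && cat "$prefix"* | tar -xzf - )
}
```

Usage (with assumed names): `extract_split data/sim_engine/assets/navtest assets.tar.gz.part-`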

Detailed Setup

4. Create Symlinks (Optional)

If you have already downloaded nuPlan and OpenScene data, use symlinks to avoid data duplication:

cd WorldEngine/data/raw
ln -s /path/to/nuplan nuplan
ln -s /path/to/openscene-v1.1 openscene-v1.1
cd openscene-v1.1
ln -s ../nuplan/maps maps
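To confirm the links resolve, the paths can be checked afterwards. `check_links` is a hypothetical helper written for this card; the link names follow the commands above:

```shell
# Hypothetical helper: report whether each expected symlink under data/raw/
# resolves to something that exists ([ -e ] follows symlinks).
check_links() {
  root="$1"
  for link in nuplan openscene-v1.1 openscene-v1.1/maps; do
    if [ -e "$root/data/raw/$link" ]; then
      echo "ok: data/raw/$link"
    else
      echo "broken or missing: data/raw/$link"
    fi
  done
}
```

Usage: `check_links "$WORLDENGINE_ROOT"`.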

Next Steps

After dataset setup, refer to the main project documentation.


๐Ÿ“ Citation

If this project is helpful to your research, please consider citing:


If you use the Render Assets (MTGS), please also cite:

@article{li2025mtgs,
  title={MTGS: Multi-Traversal Gaussian Splatting},
  author={Li, Tianyu and Qiu, Yihang and Wu, Zhenhua and Lindstr{\"o}m, Carl and Su, Peng and Nie{\ss}ner, Matthias and Li, Hongyang},
  journal={arXiv preprint arXiv:2503.12552},
  year={2025}
}

If you use the scenario data generated by Behavior World Model (BWM), please also cite:

@inproceedings{zhou2025nexus,
  title={Decoupled Diffusion Sparks Adaptive Scene Generation},
  author={Zhou, Yunsong and Ye, Naisheng and Ljungbergh, William and Li, Tianyu and Yang, Jiazhi and Yang, Zetong and Zhu, Hongzi and Petersson, Christoffer and Li, Hongyang},
  booktitle={ICCV},
  year={2025}
}
@article{li2025optimization,
  title={Optimization-Guided Diffusion for Interactive Scene Generation},
  author={Li, Shihao and Ye, Naisheng and Li, Tianyu and Chitta, Kashyap and An, Tuo and Su, Peng and Wang, Boyang and Liu, Haiou and Lv, Chen and Li, Hongyang},
  journal={arXiv preprint arXiv:2512.07661},
  year={2025}
}

If you find AlgEngine helpful, please also cite:

@ARTICLE{11353028,
  author={Liu, Haochen and Li, Tianyu and Yang, Haohan and Chen, Li and Wang, Caojun and Guo, Ke and Tian, Haochen and Li, Hongchen and Li, Hongyang and Lv, Chen},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
  title={Reinforced Refinement With Self-Aware Expansion for End-to-End Autonomous Driving}, 
  year={2026},
  volume={48},
  number={5},
  pages={5774-5792},
  keywords={Adaptation models;Self-aware;Autonomous vehicles;Pipelines;Planning;Training;Reinforcement learning;Uncertainty;Data models;Safety;End-to-end autonomous driving;reinforced finetuning;imitation learning;motion planning},
  doi={10.1109/TPAMI.2026.3653866}}

📄 License

This dataset is released under the CC-BY-NC-SA-4.0 license.

Terms of Use

  • ✅ Allowed: Modification, distribution, private use
  • 📝 Required: Attribution, share-alike
  • ⚠️ Restricted: No commercial use; copyright and license notices must be retained

🔗 Related Links

| Resource | Link |
|---|---|
| 🏠 Project Home | WorldEngine GitHub |
| 🤗 Hugging Face | Dataset Page |
| 📦 ModelScope | Dataset Page |
| 💬 Discussions | Hugging Face Discussions |
| 📖 Full Documentation | Documentation |
| 🎨 Scene Reconstruction | MTGS Repository |

📧 Contact

For questions or suggestions, feel free to reach out:


โญ If you find WorldEngine useful, please consider giving us a Star! โญ

Thank you for your support of the WorldEngine project!
