MedOpenClaw

Auditable Medical Imaging Agents Reasoning over Uncurated Full Studies

*Shared senior authors.

1 Technical University of Munich (TUM) 2 TUM University Hospital 3 LMU Munich 4 Imperial College London
5 University of Oxford 6 Carnegie Mellon University 7 National University of Singapore 8 Munich Center for Machine Learning

MedOpenClaw studies medical-imaging agents over uncurated full studies, exposing complex clinical interactions and task execution as auditable runtime traces.

Project website v1. Status: under active development; we are currently building toward our v2 release. Last update: March 30, 2026.

From pre-selected inputs to auditable full-study workflows

Main figure of MedOpenClaw
MedOpenClaw shifts from static image settings toward full-study interaction loops with visible traces and grounded final answers.

Representative behaviors on full studies

Main Demo

Brain Tumor Localization and Differentiation

Study-level inspection that combines tumor localization and differentiation across uncurated, full-volume imaging data.

Demo 02

Longitudinal Analysis of Tumor Size

A before-versus-after comparison of longitudinal change, centered on tumor size and its study-to-study evolution.

Demo 03

Adjusting to the Most Informative Tumor View

A focused viewer-control example where the agent adjusts the view to the slice or perspective that makes the tumor most evident.

Demo 04

Failure Case: Segmentation Breakdown

An explicit failure case that makes segmentation breakdowns visible instead of hiding them behind a final prediction.

Auditable runtime traces

Representative trace visualizations reconstructed from auditable runtime behavior.
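To make the idea of an auditable runtime trace concrete, the sketch below shows one way such a trace could be recorded as an append-only log of tool calls and observations, serialized to JSON lines for later inspection. This is a minimal illustration only: the class names (`TraceStep`, `RuntimeTrace`), the tool names, and the field layout are assumptions for this example, not the actual MedOpenClaw trace format.

```python
import json
from dataclasses import dataclass, field, asdict


@dataclass
class TraceStep:
    """One auditable step: which tool ran, with what arguments, and what came back."""
    step: int
    tool: str          # hypothetical tool name, e.g. "load_series" or "adjust_view"
    arguments: dict
    observation: str


@dataclass
class RuntimeTrace:
    """Append-only log of an agent's interaction with one imaging study."""
    study_id: str
    steps: list = field(default_factory=list)

    def log(self, tool: str, arguments: dict, observation: str) -> None:
        # Steps are numbered in execution order so the trace replays deterministically.
        self.steps.append(TraceStep(len(self.steps) + 1, tool, arguments, observation))

    def to_jsonl(self) -> str:
        # One JSON object per line keeps the trace easy to diff, grep, and audit.
        return "\n".join(json.dumps(asdict(s)) for s in self.steps)


# Example usage with made-up tool calls mirroring the demos above.
trace = RuntimeTrace(study_id="study-001")
trace.log("load_series", {"modality": "MR"}, "loaded 3 series")
trace.log("adjust_view", {"plane": "axial", "slice": 42}, "tumor most evident")
print(trace.to_jsonl())
```

Because every step is an explicit record rather than hidden intermediate state, failure cases (such as a segmentation breakdown) remain visible in the log instead of being masked by the final answer.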

BibTeX

@misc{shen2026medopenclawauditablemedicalimaging,
  title={MedOpenClaw: Auditable Medical Imaging Agents Reasoning over Uncurated Full Studies},
  author={Weixiang Shen and Yanzhu Hu and Che Liu and Junde Wu and Jiayuan Zhu and Chengzhi Shen and Min Xu and Yueming Jin and Benedikt Wiestler and Daniel Rueckert and Jiazhen Pan},
  year={2026},
  eprint={2603.24649},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2603.24649},
}