Decision Forests

All research related to Decision Forests. A full list of topics is available on my research page.

  1. Minimizing and quantifying uncertainty in AI-informed decisions: Applications in medicine

    Samuel D. Curtis*, Sambit Panda*, Adam Li*, Haoyin Xu, Yuxin Bai, Itsuki Ogihara, Eliza O’Reilly, Yuxuan Wang, Lisa Dobbyn, Maria Popoli, Janine Ptak, Nadine Nehme, Natalie Silliman, Jeanne Tie, Peter Gibbs, Lan T. Ho-Pham, Bich N. H. Tran, Thach S. Tran, Tuan V. Nguyen, Ehsan Irajizad, Michael Goggins, Christopher L. Wolfgang, Tian-Li Wang, Ie-Ming Shih, Amanda Fader, Anne Marie Lennon, Ralph H. Hruban, Chetan Bettegowda, Lucy Gilbert, Kenneth W. Kinzler, Nickolas Papadopoulos, Bert Vogelstein, Joshua T. Vogelstein, Christopher Douville
    PNAS, 2025

    Introduces MIGHT, a method for quantifying the predictive information in very high-dimensional data, which was then used to develop and evaluate a biomedical assay for early cancer detection.

  2. 📝 Simplest Streaming Trees

    Haoyin Xu, Jayanta Dey, Sambit Panda, Joshua T. Vogelstein
    arXiv, 2023

    Develops a streaming algorithm for decision trees based on the simplest possible extension of batch-trained trees (a toy sketch of one such extension, splitting existing leaves as new batches arrive, appears after this list).

  3. 📝 Learning Interpretable Characteristic Kernels via Decision Forests

    Sambit Panda*, Cencheng Shen*, Joshua T. Vogelstein
    arXiv, 2023

    Demonstrates that the kernel induced by a random forest is characteristic and develops a hypothesis test, KMERF, based on that fact (a sketch of a forest-induced kernel appears after this list).

  4. 📝 When are Deep Networks really better than Decision Forests at small sample sizes, and how?

    Haoyin Xu, Kaleab A. Kinfu, Will LeVine, Sambit Panda, Jayanta Dey, Michael Ainsworth, Yu-Chung Peng, Madi Kusmanov, Florian Engert, Christopher M. White, Joshua T. Vogelstein, Carey E. Priebe
    arXiv, 2021

    Illustrates that forest-based methods excel at tabular data classification at small sample sizes, while deep networks excel at larger sample sizes.
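
For the streaming trees paper (item 2), the sketch below illustrates one way a "simplest possible" streaming extension can look: an ordinary decision tree is fitted on the first batch, and later batches are routed to its leaves, which are then split further. This is a toy illustration built on scikit-learn, not the authors' implementation; the class name `ToyStreamingTree` and the `min_leaf_batch` parameter are my own.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class ToyStreamingTree:
    """Toy sketch: grow a fitted tree batch-by-batch by splitting its leaves further."""

    def __init__(self, min_leaf_batch=20):
        self.root = DecisionTreeClassifier()
        self.subtrees = {}            # leaf id in self.root -> subtree fitted on later batches
        self.min_leaf_batch = min_leaf_batch
        self._fitted = False

    def partial_fit(self, X, y):
        if not self._fitted:
            self.root.fit(X, y)       # the first batch initializes the tree
            self._fitted = True
            return self
        leaves = self.root.apply(X)   # route each new sample to a leaf of the existing tree
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            # split a leaf further only when enough impure new data lands on it
            if mask.sum() >= self.min_leaf_batch and np.unique(y[mask]).size > 1:
                self.subtrees[leaf] = DecisionTreeClassifier(max_depth=3).fit(X[mask], y[mask])
        return self

    def predict(self, X):
        leaves = self.root.apply(X)
        preds = self.root.predict(X)
        for leaf, subtree in self.subtrees.items():
            mask = leaves == leaf
            if mask.any():            # defer to the extended split where one exists
                preds[mask] = subtree.predict(X[mask])
        return preds

# Usage: fit on the first batch, then stream the remaining batches in
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.3 * rng.normal(size=300) > 0).astype(int)
model = ToyStreamingTree()
for start in range(0, 300, 100):
    model.partial_fit(X[start:start + 100], y[start:start + 100])
print((model.predict(X) == y).mean())  # training accuracy over all seen batches
```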
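
For the KMERF paper (item 3), this sketch shows the kind of forest-induced kernel the test is built on: two samples are similar when they land in the same leaves of a fitted random forest. The helper name `forest_kernel` is hypothetical, and the test statistic itself and its calibration are developed in the paper rather than shown here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def forest_kernel(X, y, n_estimators=500, random_state=0):
    """Proximity-style kernel induced by a random forest.

    K[i, j] is the fraction of trees in which samples i and j fall into the
    same leaf, which is the flavor of forest-derived kernel KMERF builds on.
    """
    forest = RandomForestClassifier(
        n_estimators=n_estimators, random_state=random_state
    ).fit(X, y)
    leaves = forest.apply(X)                          # shape (n_samples, n_trees)
    return (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)

# Example on a small synthetic classification problem
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=100) > 0).astype(int)
K = forest_kernel(X, y)
print(K.shape, K.diagonal().min())                    # (100, 100) 1.0
```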