TabPFN
| TabPFN | |
|---|---|
| Developers | Noah Hollmann, Samuel Müller, Lennart Purucker, Arjun Krishnakumar, Max Körfer, Shi Bin Hoo, Robin Tibor Schirrmeister, Frank Hutter, Leo Grinsztajn, Klemens Flöge, Oscar Key & Sauraj Gambhir[1] |
| Initial release | September 16, 2023[2][3] |
| Stable release | v2.5 / November 6, 2025 |
| Written in | Python[3] |
| Operating system | Linux, macOS, Microsoft Windows[3] |
| Type | Machine learning |
| License | Apache License 2.0 |
| Website | GitHub |
TabPFN (Tabular Prior-data Fitted Network) is a transformer-based machine learning model for tabular datasets, first proposed in 2022.[1] It is intended for supervised classification and regression on small- to medium-sized datasets, e.g., up to 10,000 samples.[1] TabPFN-2.5 is the latest version of the foundation model.
History
TabPFN was first introduced in a 2022 preprint and presented at ICLR 2023.[2] TabPFN v2 was published in Nature in 2025 by Hollmann and co-authors.[1] The source code is published on GitHub under a modified Apache License and on PyPI.[4] Writing for the ICLR Blogposts track, McCarter notes that the model has attracted attention for its performance on small-dataset benchmarks.[5] TabPFN v2.5, the next generation of the foundation model, was released on November 6, 2025.[6]
Prior Labs, founded in 2024, aims to commercialize TabPFN.[7]
Overview and pre-training
TabPFN supports classification, regression and generative tasks.[1] It builds on the "prior-data fitted networks" (PFN) approach[8] to model tabular data.[1] Because the transformer is pre-trained on synthetic tabular datasets,[2][5] TabPFN avoids both benchmark contamination and the cost of curating real-world training data.[2]
TabPFN v2 was pre-trained on approximately 130 million such datasets.[1] The synthetic datasets are generated using causal models or Bayesian neural networks, and generation can simulate missing values, imbalanced data, and noise.[1] Random inputs are passed through these models to produce outputs, with a bias towards simpler causal structures.[1] During pre-training, TabPFN predicts the masked target values of new data points given training data points and their known targets, effectively learning a generic learning algorithm that is executed by a neural network forward pass.[1] At inference time, a new dataset is likewise processed in a single forward pass, without retraining.[2] The model's transformer encoder processes features and labels by alternating attention across rows and columns.[9] TabPFN v2 handles numerical and categorical features and missing values, and supports tasks such as regression and synthetic data generation.[1]
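The following is a minimal toy sketch of the synthetic-prior idea described above, not Prior Labs' actual data-generating code; the function name, graph size, and constants are all illustrative. It samples a small random structural causal model as a sparse DAG, pushes random noise through it, and reads off some nodes as features and one node as the target:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_synthetic_dataset(n_rows=256, n_nodes=8, n_features=4):
    """Draw one toy dataset from a random structural causal model (SCM)."""
    # Random weighted DAG: node j may depend only on nodes i < j,
    # so the nodes are already in topological order.
    weights = np.tril(rng.normal(size=(n_nodes, n_nodes)), k=-1)
    # Bias towards simpler causal structures: drop most edges at random.
    weights *= rng.random(weights.shape) < 0.3

    # Propagate random root noise through the graph, node by node.
    values = np.zeros((n_rows, n_nodes))
    for j in range(n_nodes):
        parents = values @ weights[j]  # contributions from earlier nodes
        values[:, j] = np.tanh(parents) + rng.normal(scale=0.1, size=n_rows)

    # Read off a random subset of nodes as features and one node as the target.
    cols = rng.choice(n_nodes, size=n_features + 1, replace=False)
    return values[:, cols[:-1]], values[:, cols[-1]]

X, y = sample_synthetic_dataset()
print(X.shape, y.shape)  # (256, 4) (256,)
```

In pre-training, millions of such randomly drawn datasets stand in for real-world data: the model sees the features of all rows plus the targets of the "training" rows, and learns to predict the masked targets of the remaining rows.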
Because TabPFN comes pre-trained, it does not require the costly hyperparameter optimization typical of other deep learning methods.[9]
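The tabpfn package published on PyPI exposes a scikit-learn-style estimator interface, so a minimal usage sketch with default settings and no tuning might look like the following (exact constructor options vary between versions):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier  # pip install tabpfn

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# No hyperparameter search: fit() essentially stores the training rows as
# in-context examples; prediction runs a single forward pass of the
# pre-trained transformer.
clf = TabPFNClassifier()
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```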
Research
TabPFN is the subject of ongoing research. Its applications have been investigated in domains such as chemoproteomics,[10] insurance risk classification,[11] and metagenomics.[12]
Limitations
TabPFN has been criticized for its "one large neural network is all you need" approach to modeling problems.[5] Its performance is also limited on high-dimensional and large-scale datasets.[13]
References
- ^ a b c d e f g h i j k Hollmann, N.; Müller, S.; Purucker, L. (2025). "Accurate predictions on small data with a tabular foundation model". Nature. 637 (8045): 319–326. Bibcode:2025Natur.637..319H. doi:10.1038/s41586-024-08328-6. PMC 11711098. PMID 39780007.
- ^ a b c d e Hollmann, Noah (2023). TabPFN: A transformer that solves small tabular classification problems in a second. International Conference on Learning Representations (ICLR).
- ^ a b c "tabpfn". Python Package Index (PyPI). https://pypi.org/project/tabpfn/
- ^ PriorLabs/TabPFN, Prior Labs, 2025-06-22, retrieved 2025-06-23
- ^ a b c McCarter, Calvin (May 7, 2024). "What exactly has TabPFN learned to do? | ICLR Blogposts 2024". iclr-blogposts.github.io. Retrieved 2025-06-22.
- ^ "TabPFN-2.5 Model Report – Prior Labs". priorlabs.ai. Retrieved 2025-11-06.
- ^ Kahn, Jeremy (5 February 2025). "AI has struggled to analyze tables and spreadsheets. This German startup thinks its breakthrough is about to change that". Fortune.
- ^ Müller, Samuel (2022). Transformers can do Bayesian inference. International Conference on Learning Representations (ICLR).
- ^ a b McElfresh, Duncan C. (8 January 2025). "The AI tool that can interpret any spreadsheet instantly". Nature. 637 (8045): 274–275. Bibcode:2025Natur.637..274M. doi:10.1038/d41586-024-03852-x. PMID 39780000.
- ^ Offensperger, Fabian; Tin, Gary; Duran-Frigola, Miquel; Hahn, Elisa; Dobner, Sarah; Ende, Christopher W. am; Strohbach, Joseph W.; Rukavina, Andrea; Brennsteiner, Vincenth; Ogilvie, Kevin; Marella, Nara; Kladnik, Katharina; Ciuffa, Rodolfo; Majmudar, Jaimeen D.; Field, S. Denise; Bensimon, Ariel; Ferrari, Luca; Ferrada, Evandro; Ng, Amanda; Zhang, Zhechun; Degliesposti, Gianluca; Boeszoermenyi, Andras; Martens, Sascha; Stanton, Robert; Müller, André C.; Hannich, J. Thomas; Hepworth, David; Superti-Furga, Giulio; Kubicek, Stefan; Schenone, Monica; Winter, Georg E. (26 April 2024). "Large-scale chemoproteomics expedites ligand discovery and predicts ligand behavior in cells". Science. 384 (6694) eadk5864. Bibcode:2024Sci...384k5864O. doi:10.1126/science.adk5864. PMID 38662832.
- ^ Chu, Jasmin Z. K.; Than, Joel C. M.; Jo, Hudyjaya Siswoyo (2024). "Deep Learning for Cross-Selling Health Insurance Classification". 2024 International Conference on Green Energy, Computing and Sustainable Technology (GECOST). pp. 453–457. doi:10.1109/GECOST60902.2024.10475046. ISBN 979-8-3503-5790-5.
- ^ Perciballi, Giulia; Granese, Federica; Fall, Ahmad; Zehraoui, Farida; Prifti, Edi; Zucker, Jean-Daniel (10 October 2024). Adapting TabPFN for Zero-Inflated Metagenomic Data. Table Representation Learning Workshop at NeurIPS 2024.
- ^ "A Closer Look at TabPFN v2: Strength, Limitation, and Extension". arxiv.org. Retrieved 2025-07-08.