Foundation Models for Wireless: Pre-Training on Network Data at Scale
Dr. Ahmed Alkhateeb, Dr. Umut Demirhan
Arizona State University
Abstract
We introduce WirelessFM, a foundation model pre-trained on 100TB of diverse wireless network data including channel measurements, traffic patterns, KPIs, and configuration parameters from 50 operators worldwide. WirelessFM can be fine-tuned for any downstream wireless task with minimal data, achieving state-of-the-art results on 12 benchmark tasks including channel estimation, traffic prediction, and anomaly detection. The model reduces the data requirement for new task adaptation by 20x.
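The paper's core workflow — pre-train a large encoder once, then adapt to a new task with minimal labeled data — can be sketched in miniature. The sketch below is illustrative only: the frozen random-feature "encoder", the layer sizes, and the ridge-regression head are stand-ins chosen for brevity, not WirelessFM's actual architecture or training procedure.

```python
import numpy as np

# Hypothetical sketch of pre-train-then-fine-tune adaptation: a frozen
# "foundation" encoder plus a small task head trained on only a handful
# of labeled examples. All names and sizes are illustrative.
rng = np.random.default_rng(0)

# Stand-in for a pre-trained encoder: a fixed projection + nonlinearity.
W_enc = rng.normal(size=(64, 128))  # frozen "pre-trained" weights

def encode(x):
    """Map raw measurements (n, 64) to latent features (n, 128)."""
    return np.tanh(x @ W_enc)

# Downstream task: predict a scalar KPI from 64-dim measurements,
# in the "minimal data" regime of only 32 labeled examples.
X_train = rng.normal(size=(32, 64))
w_true = rng.normal(size=64)                     # unknown ground truth
y_train = X_train @ w_true + 0.01 * rng.normal(size=32)

# Fine-tuning here = fitting only a linear head on the frozen features
# (closed-form ridge regression); the encoder is never updated.
Z = encode(X_train)
lam = 1e-2
head = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y_train)

# The adapted model is encode(.) @ head.
X_test = rng.normal(size=(200, 64))
pred = encode(X_test) @ head
mse = float(np.mean((pred - X_test @ w_true) ** 2))
```

Training only a lightweight head on frozen features is one standard way foundation models achieve large reductions in per-task data requirements; whether WirelessFM uses head-only tuning, full fine-tuning, or adapters is detailed in the paper itself.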
AI Summary
- WirelessFM foundation model pre-trained on 100TB of diverse network data.
- State-of-the-art results on 12 benchmark wireless tasks.
- 20x reduction in data required for new task adaptation.
- Data from 50 operators worldwide used in pre-training.
Key Findings
1. Pre-training captures universal wireless patterns that transfer across tasks.
2. Multi-modal pre-training on diverse data types outperforms single-modality approaches.
3. The model discovers latent correlations between network metrics not obvious to domain experts.
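Finding 2 — that multi-modal pre-training outperforms single-modality approaches — implies per-modality encoders feeding a shared latent space. The fragment below is a minimal sketch of that input-handling pattern, assuming two modalities (channel measurements and traffic KPIs) and simple averaging fusion; the modality names, dimensions, and fusion rule are assumptions, not the paper's design.

```python
import numpy as np

# Illustrative multi-modal fusion (not from the paper): each modality
# gets its own encoder into a shared 48-dim latent space, and the fused
# embedding is what downstream task heads would consume.
rng = np.random.default_rng(1)

W_chan = rng.normal(size=(32, 48))  # channel-measurement encoder (stand-in)
W_traf = rng.normal(size=(16, 48))  # traffic-KPI encoder (stand-in)

def fuse(chan, traffic):
    """Encode each modality, then average into one shared embedding."""
    z_chan = np.tanh(chan @ W_chan)      # (n, 48)
    z_traf = np.tanh(traffic @ W_traf)   # (n, 48)
    return 0.5 * (z_chan + z_traf)       # (n, 48) fused representation

batch = fuse(rng.normal(size=(8, 32)), rng.normal(size=(8, 16)))
```

In practice, fusion in foundation models is usually done with cross-attention or token concatenation rather than averaging, but the structural point is the same: heterogeneous network data types are projected into one representation before any task head sees them.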
Industry Implications
Democratizes wireless AI by reducing data and expertise requirements.
Could become the GPT-equivalent for the wireless/telecom domain.
Enables rapid prototyping of AI solutions for new network problems.
Read the Original Paper
Access the full paper on arXiv for complete methodology, results, and references.
Related Papers
AI-Native Air Interface Design: End-to-End Learning for 6G Physical Layer
University of Stuttgart — 41 citations
Digital Twin Networks: AI-Driven Real-Time Network Simulation for 6G
Oulu University / Ruhr University Bochum — 29 citations
Intent-Based Network Management with Large Language Models
Universidad Carlos III de Madrid — 16 citations