AI/ML Papers · 15 min read · 5 citations

Efficient On-Device Training for Adaptive 6G Receivers

Dr. Junmo Kim, Prof. Youngchul Sung

KAIST

Jan 20, 2026

Abstract

We propose an efficient on-device training framework that enables 6G receiver neural networks to continuously adapt to changing channel conditions without cloud connectivity. Using a combination of pruned backpropagation and mixed-precision training, our approach reduces on-device training memory by 8x and energy by 5x compared to standard backpropagation, while maintaining adaptation quality. This enables continuous learning on mobile device chipsets with limited resources.
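
The summary does not spell out the paper's pruning criterion, so the following is a minimal sketch of the core idea, assuming magnitude-based selection: in each step, gradients are applied only to a small fraction of parameters, and the skipped parameters (in a real kernel, their backward computation as well) are left untouched, which is where the memory and energy savings come from. The function name `pruned_sgd_step` and the `keep_frac` parameter are illustrative, not from the paper.

```python
import numpy as np

def pruned_sgd_step(w, grad, keep_frac=0.2, lr=1e-2):
    """One SGD step that updates only the largest-gradient fraction of weights.

    In an on-device kernel, the masked-out parameters would also skip the
    backward pass entirely, saving activation memory and compute.
    """
    k = max(1, int(keep_frac * w.size))
    # Indices of the k largest-magnitude gradient entries.
    keep = np.argpartition(np.abs(grad).ravel(), -k)[-k:]
    mask = np.zeros(w.size, dtype=bool)
    mask[keep] = True
    w_new = w.copy()
    w_new.ravel()[mask] -= lr * grad.ravel()[mask]
    return w_new, mask

rng = np.random.default_rng(0)
w = rng.standard_normal(100)
g = rng.standard_normal(100)
w_new, mask = pruned_sgd_step(w, g, keep_frac=0.2)
# With keep_frac=0.2, 20 of the 100 weights are updated; the other 80
# are bit-identical to before the step.
```

Here `keep_frac=0.2` mirrors the paper's claim of skipping gradient work for 80% of parameters; the actual selection rule used by the authors may differ.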

AI-Generated Summary
  • On-device training framework for adaptive 6G receiver neural networks.
  • 8x memory reduction and 5x energy reduction versus standard backpropagation.
  • Maintains adaptation quality while fitting mobile device constraints.
  • Enables continuous learning without cloud connectivity.

Key Findings

  1. Pruned backpropagation skips gradient computation for 80% of parameters without accuracy loss.
  2. Mixed-precision training at INT8 for most operations preserves model quality.
  3. Adaptation converges within 100 iterations, taking less than 1 second on mobile hardware.
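
The INT8 scheme behind the second finding is not detailed in this summary; a common pattern, sketched here as an assumption, is symmetric per-tensor fake-quantization: weights and activations are rounded to an int8 grid for the matrix multiply while a full-precision master copy of the weights is kept for updates. The helper `quantize_int8` is hypothetical, not the paper's API.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor INT8 fake-quantization: snap values to an
    int8 grid, then dequantize back to float for the next operation."""
    m = np.abs(x).max()
    scale = m / 127.0 if m > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q.astype(np.float32) * np.float32(scale), scale

# Linear-layer forward pass with INT8-grid weights and activations but a
# full-precision (FP32) master weight copy and accumulator.
rng = np.random.default_rng(1)
w_master = rng.standard_normal((4, 8)).astype(np.float32)
x = rng.standard_normal(8).astype(np.float32)
w_q, _ = quantize_int8(w_master)
x_q, _ = quantize_int8(x)
y = w_q @ x_q  # low-precision operands, higher-precision accumulate
```

Per-element quantization error is bounded by half a quantization step (scale / 2), which is why most operations can run at INT8 with little impact on adaptation quality.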

Industry Implications

  • Enables truly adaptive AI receivers for 6G without cloud dependency.
  • Supports the 6G vision of device-level intelligence.
  • Reduces latency and energy cost of AI model updates in the field.

On-Device Training · Adaptive Receiver · Efficient Training · Mobile AI

Read the Original Paper

Access the full paper on arXiv for complete methodology, results, and references.
