Official implementation for the AAAI'25 paper "Federated Foundation Models on Heterogeneous Time Series"
- Overview
- Key Features
- Method
- Installation
- Quick Start
- Repository Structure
- Citation
- Acknowledgments
- License
Training general-purpose time series foundation models across diverse domains is challenging due to severe statistical heterogeneity. FFTS tackles this with a federated learning formulation where each dataset owner is a client with its own local model.
Its core design choices:

- Client-specific local models
- Shared knowledge alignment
- Dual-side regularization
- A unified adaptation architecture
```
┌───────────────────────────────────────────────────────────┐
│                      Central Server                       │
│  ┌─────────────────────────────────────────────────────┐  │
│  │   Global Model Aggregation + Knowledge Alignment    │  │
│  └─────────────────────────────────────────────────────┘  │
└─────────────────────────────┬─────────────────────────────┘
                              │
              ┌───────────────┼───────────────┐
              ▼               ▼               ▼
        ┌──────────┐    ┌──────────┐    ┌──────────┐
        │ Client 1 │    │ Client 2 │    │ Client N │
        │  Domain  │    │  Domain  │    │  Domain  │
        └──────────┘    └──────────┘    └──────────┘
```
The resulting foundation model generalizes well across forecasting, imputation, and anomaly detection tasks.
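One way to picture the unified adaptation architecture is a single pretrained backbone shared across tasks, with a lightweight head per downstream task. The sketch below is a hypothetical illustration (the name `UnifiedAdapter` and the linear heads are assumptions, not the architecture defined in models/ffts_model.py):

```python
import torch.nn as nn

class UnifiedAdapter(nn.Module):
    """Illustrative sketch: one shared pretrained backbone feeding
    lightweight task-specific heads. Not the actual FFTS architecture."""

    def __init__(self, backbone, d_model, seq_len, pred_len):
        super().__init__()
        # Pretrained federated encoder, assumed here to pool each input
        # series into a (batch, d_model) representation.
        self.backbone = backbone
        self.heads = nn.ModuleDict({
            "forecast": nn.Linear(d_model, pred_len),  # future values
            "impute": nn.Linear(d_model, seq_len),     # reconstructed series
            "anomaly": nn.Linear(d_model, seq_len),    # per-step scores
        })

    def forward(self, x, task="forecast"):
        h = self.backbone(x)        # shared representation
        return self.heads[task](h)  # task-specific projection
```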
```mermaid
graph TD
    A[FFTS Framework] --> B[Federated Foundation Model]
    A --> C[Client-Specific Local Models]
    A --> D[Unified Adaptation Architecture]
    A --> E[Learnable Time-Scale Weights]
    B --> B1[Heterogeneous Dataset Support]
    C --> C1[Domain-Specific Pattern Preservation]
    D --> D1[Multi-Task Adaptation]
    E --> E1[Temporal Pattern Learning]
    style A fill:#6e42f5,color:#fff
    style B fill:#e7f2ff
    style C fill:#e7f2ff
    style D fill:#e7f2ff
    style E fill:#e7f2ff
```
Feature Details
| Feature | Description | Benefit |
|---|---|---|
| Federated Learning | Each dataset owner operates as an independent client | Privacy-preserving collaboration |
| Client-Specific Models | Local models preserve dataset-specific characteristics | Better domain adaptation |
| Knowledge Alignment | Client- and server-side regularization align shared knowledge | Effective cross-domain learning |
| Unified Adaptation | Single architecture for multiple downstream tasks | Efficient fine-tuning |
| Learnable Time-Scale Weights | ATM module with adaptive temporal weights | Enhanced pattern recognition |
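The learnable time-scale weights can be pictured as softmax-normalized coefficients that mix several temporal resolutions of the input. The following PyTorch sketch is illustrative only; the class name and the pooling/upsampling scheme are assumptions, and the actual ATM module lives in models/ffts_model.py:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableTimeScaleWeights(nn.Module):
    """Illustrative sketch: mix several temporal resolutions of a series
    using learnable, softmax-normalized weights (hypothetical, not the
    repo's ATM implementation)."""

    def __init__(self, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        # One learnable logit per time scale.
        self.scale_logits = nn.Parameter(torch.zeros(len(scales)))

    def forward(self, x):
        # x: (batch, seq_len, channels)
        weights = torch.softmax(self.scale_logits, dim=0)
        out = torch.zeros_like(x)
        for w, s in zip(weights, self.scales):
            # Downsample by average pooling at scale s, then upsample back
            # to the original length before taking the weighted sum.
            pooled = F.avg_pool1d(x.transpose(1, 2), kernel_size=s,
                                  stride=s, ceil_mode=True)
            restored = F.interpolate(pooled, size=x.size(1), mode="linear",
                                     align_corners=False)
            out = out + w * restored.transpose(1, 2)
        return out
```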
Detailed Training Process
```mermaid
graph LR
    A[Data Collection] --> B[Local Preprocessing]
    B --> C[Client Model Training]
    C --> D[Local Regularization]
    D --> E[Model Upload]
    E --> F[Server Aggregation]
    F --> G[Global Regularization]
    G --> H[Model Distribution]
    H --> C
    style A fill:#e1f5ff
    style C fill:#e1f5ff
    style F fill:#fff4e1
    style H fill:#e1ffe1
```
Key Components (illustrated in the sketch after this list):

- Client-Side Training
  - Local model optimization on private data
  - Client-specific pattern preservation
  - Local regularization for knowledge alignment
- Server-Side Aggregation
  - Secure model averaging
  - Global knowledge alignment
  - Federated model distribution
- Regularization Mechanism
  - Dual-side alignment (client + server)
  - Balances shared vs. domain-specific knowledge
  - Ensures generalization across domains
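To make the round trip concrete, here is a minimal PyTorch sketch of one federated round: clients fit on private data with a proximal penalty toward the global model, and the server averages the returned weights. The proximal term is an illustrative stand-in for FFTS's dual-side regularization, and all names here (`local_update`, `aggregate`, `mu`) are assumptions, not the implementation in flcore/:

```python
import copy
import torch

def local_update(client_model, global_model, loader, loss_fn,
                 mu=0.01, epochs=5, lr=1e-3):
    """Client side: train on private data while a proximal term (weight mu)
    keeps the local model aligned with the shared global model."""
    opt = torch.optim.Adam(client_model.parameters(), lr=lr)
    global_params = [p.detach().clone() for p in global_model.parameters()]
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(client_model(x), y)
            # Client-side alignment: penalize drift from the global model.
            for p, g in zip(client_model.parameters(), global_params):
                loss = loss + (mu / 2) * (p - g).pow(2).sum()
            loss.backward()
            opt.step()
    return client_model.state_dict()

def aggregate(client_states):
    """Server side: FedAvg-style parameter averaging; FFTS additionally
    applies a server-side alignment step on top of this."""
    avg = copy.deepcopy(client_states[0])
    for k in avg:
        avg[k] = torch.stack([s[k].float() for s in client_states]).mean(dim=0)
    return avg
```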
- Python >= 3.8
- PyTorch >= 2.0
- CUDA (recommended for GPU acceleration)
```bash
# Clone the repository
git clone https://github.com/shengchaochen82/FFTS.git
cd FFTS

# Create a virtual environment
python -m venv ffts_env
source ffts_env/bin/activate  # On Windows: ffts_env\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

Requirements List
```
torch>=2.0.0
numpy>=1.21.0
pandas>=1.3.0
scikit-learn>=1.0.0
matplotlib>=3.4.0
tensorboard>=2.8.0
```

```bash
# Download datasets from the Monash Time Series Repository
# Visit: https://forecastingdata.org/

# Follow the preprocessing steps in the notebook
jupyter notebook preprocessing.ipynb
```

Tip: The preprocessing.ipynb notebook provides unified preprocessing for all supported datasets.
```bash
# Basic pretraining command
python main.py \
  --task pretrain \
  --task_note demo_run \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --global_rounds 10 \
  --local_epochs 5
```

Advanced Configuration
| Argument | Description | Default |
|---|---|---|
| `--task` | Task type (pretrain, forecasting, imputation, anomaly_detection) | pretrain |
| `--dataset` | Dataset name (weather, traffic, electricity, etc.) | weather |
| `--algorithm` | Federated algorithm (FFTS, FedAvg) | FFTS |
| `--global_rounds` | Number of federated learning rounds | 10 |
| `--local_epochs` | Local training epochs per client | 5 |
| `--batch_size` | Training batch size | 32 |
| `--learning_rate` | Learning rate | 0.001 |
| `--num_clients` | Number of federated clients | 10 |
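For orientation, these flags map onto an argument parser roughly like the sketch below (simplified and hypothetical; main.py defines the authoritative flag set):

```python
import argparse

def build_parser():
    # Simplified sketch mirroring the table above; see main.py for the full set.
    p = argparse.ArgumentParser(description="FFTS training")
    p.add_argument("--task", default="pretrain",
                   help="pretrain, forecasting, imputation, anomaly_detection")
    p.add_argument("--dataset", default="weather",
                   help="weather, traffic, electricity, etc.")
    p.add_argument("--algorithm", default="FFTS", choices=["FFTS", "FedAvg"])
    p.add_argument("--global_rounds", type=int, default=10)
    p.add_argument("--local_epochs", type=int, default=5)
    p.add_argument("--batch_size", type=int, default=32)
    p.add_argument("--learning_rate", type=float, default=0.001)
    p.add_argument("--num_clients", type=int, default=10)
    return p
```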
Long-term forecasting:

```bash
python main.py \
  --task long_term_forecast \
  --task_note weather_forecast \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --pred_len 96 \
  --seq_len 96
```

Imputation:

```bash
python main.py \
  --task imputation \
  --task_note weather_impute \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --mask_rate 0.2
```

Anomaly detection:

```bash
python main.py \
  --task anomaly_detection \
  --task_note traffic_anomaly \
  --is_training 1 \
  --algorithm FFTS \
  --dataset traffic
```

```
FFTS/
├── data_provider/              # Dataset loading and preprocessing
│   ├── data_base.py            # Base dataset class
│   ├── monash_data.py          # Monash dataset loader
│   └── pre_loader.py           # Data preprocessing utilities
│
├── flcore/                     # Federated learning core components
│   ├── servers/                # Server implementations
│   │   ├── serveravg.py        # FedAvg server
│   │   └── serverffts.py       # FFTS server
│   ├── clients/                # Client implementations
│   │   ├── clientbase.py       # Base client class
│   │   └── clientavg.py        # FedAvg client
│   └── layers/                 # Neural network layers
│       ├── Transformer_EncDec.py
│       ├── SelfAttention_Family.py
│       └── Embed.py
│
├── models/                     # Model definitions
│   └── ffts_model.py           # FFTS model architecture
│
├── utils/                      # Utilities and helpers
│   ├── metrics.py              # Evaluation metrics
│   ├── tools.py                # Helper functions
│   ├── losses.py               # Loss functions
│   └── timefeatures.py         # Time feature extraction
│
├── assest/                     # Assets (images, diagrams)
│   ├── difference.png          # Architecture diagram
│   ├── adaption.png            # Adaptation architecture
│   └── pretrain_data.png       # Dataset visualization
│
├── preprocessing.ipynb         # Unified preprocessing notebook
├── main.py                     # Training entry point
├── requirements.txt            # Python dependencies
├── LICENSE                     # MIT License
└── README.md                   # This file
```
If you find this work useful, please cite our paper:
```bibtex
@inproceedings{chen2025federated,
  title={Federated foundation models on heterogeneous time series},
  author={Chen, Shengchao and Long, Guodong and Jiang, Jing and Zhang, Chengqi},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={39},
  number={15},
  pages={15839--15847},
  year={2025},
  organization={AAAI Press}
}
```

Note on development status: We are continuously improving the codebase, so some interfaces may change as we enhance the framework.
This work was inspired and supported by:
- Time-Series-Library - Excellent time series modeling framework
- PFLlib - Personalized federated learning library
- Monash Time Series Repository - Comprehensive time series datasets
This project is licensed under the MIT License - see the LICENSE file for details.
Development Timeline
```mermaid
timeline
    title FFTS Development Roadmap
    section 2024
        Dec 2024 : Paper accepted at AAAI 2025
        Dec 2024 : Preprint posted on arXiv
    section 2025
        Jan 2025 : Pretraining datasets available
        Jan 2025 : Preprocessing tutorials released
        Aug 2025 : Codebase restructured
        Aug 2025 : Learnable time-scale weights added
    section Future
        Q1 2026 : Extended experiments
        Q2 2026 : Additional dataset support
        Q3 2026 : Documentation enhancement
```
- Release core codebase
- Release detailed training tutorials
- Pretraining data download and tutorials
- Release AAAI 2025 paper
- Implement federated learning framework
- Add ATM module with learnable weights
- Time-Series-Library - A unified library for time series analysis
- PFLlib - Personalized federated learning library
- FederatedScope - A comprehensive federated learning platform


