Federated Foundation Models on Heterogeneous Time Series (FFTS)


Official implementation for the AAAI'25 paper "Federated Foundation Models on Heterogeneous Time Series"


πŸš€ Overview

Training general-purpose time series foundation models across diverse domains is challenging due to severe statistical heterogeneity. FFTS tackles this with a federated learning formulation where each dataset owner is a client with its own local model.

🎯 Core Innovation

| Challenge | Solution |
| --- | --- |
| ⚠️ Statistical heterogeneity across datasets | βœ… Client-specific local models |
| ⚠️ Loss of domain-specific patterns | βœ… Shared knowledge alignment |
| ⚠️ Limited generalization | βœ… Dual-side regularization |
| ⚠️ Single-task limitations | βœ… Unified adaptation architecture |

πŸ”„ How It Works

```text
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                     Central Server                          β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”‚
β”‚  β”‚  Global Model Aggregation + Knowledge Alignment     β”‚   β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                          β”‚
          β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
          β–Ό               β–Ό               β–Ό
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚ Client 1 β”‚    β”‚ Client 2 β”‚    β”‚ Client N β”‚
    β”‚  Domain  β”‚    β”‚  Domain  β”‚    β”‚  Domain  β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```
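
In each round the server broadcasts the global model, every client runs local epochs on its private data, and the server aggregates the returned weights. A minimal sketch of one round, assuming plain sample-weighted averaging (the class and method names here are illustrative, not the repo's API):

```python
import copy

def federated_round(global_model, clients):
    """One round: broadcast, local training, sample-weighted averaging.

    Illustrative only: each client is assumed to expose `num_samples`
    and a `train(model)` method that returns an updated state_dict.
    """
    client_states, weights = [], []
    for client in clients:
        local_model = copy.deepcopy(global_model)  # broadcast global weights
        client_states.append(client.train(local_model))
        weights.append(client.num_samples)

    total = float(sum(weights))
    avg_state = {
        key: sum((w / total) * state[key].float()
                 for w, state in zip(weights, client_states))
        for key in client_states[0]
    }
    global_model.load_state_dict(avg_state)
    return global_model
```

FFTS adds knowledge alignment on top of this plain average; the client half of that regularization is sketched under Method below.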

The resulting foundation model generalizes well across forecasting, imputation, and anomaly detection tasks.


✨ Key Features

🌟 Highlights

```mermaid
graph TD
    A[FFTS Framework] --> B[Federated Foundation Model]
    A --> C[Client-Specific Local Models]
    A --> D[Unified Adaptation Architecture]
    A --> E[Learnable Time-Scale Weights]

    B --> B1[Heterogeneous Dataset Support]
    C --> C1[Domain-Specific Pattern Preservation]
    D --> D1[Multi-Task Adaptation]
    E --> E1[Temporal Pattern Learning]

    style A fill:#6e42f5,color:#fff
    style B fill:#e7f2ff
    style C fill:#e7f2ff
    style D fill:#e7f2ff
    style E fill:#e7f2ff
```

πŸ“‹ Feature Details

| Feature | Description | Benefit |
| --- | --- | --- |
| πŸ”— Federated Learning | Each dataset owner operates as an independent client | Privacy-preserving collaboration |
| 🧠 Client-Specific Models | Local models preserve dataset-specific characteristics | Better domain adaptation |
| 🀝 Knowledge Alignment | Client and server regularization align shared knowledge | Effective cross-domain learning |
| 🎯 Unified Adaptation | Single architecture for multiple downstream tasks | Efficient fine-tuning |
| ⏰ Learnable Time-Scale Weights | ATM module with adaptive temporal weights | Enhanced pattern recognition |
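
The last row refers to the ATM module's learnable time-scale weights. A minimal sketch of the general idea, assuming softmax-normalized weights over a few pooling scales (an illustration of the concept, not the repo's exact module):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeScaleWeights(nn.Module):
    """Mix views of a series at several temporal scales with learned weights.

    Illustrative sketch: pool the input at each scale, upsample back to the
    original length, and blend the views with softmax-normalized weights.
    """
    def __init__(self, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.logits = nn.Parameter(torch.zeros(len(scales)))

    def forward(self, x):  # x: (batch, channels, time)
        weights = F.softmax(self.logits, dim=0)
        out = torch.zeros_like(x)
        for w, scale in zip(weights, self.scales):
            view = x if scale == 1 else F.interpolate(
                F.avg_pool1d(x, kernel_size=scale, stride=scale),
                size=x.size(-1), mode="linear", align_corners=False,
            )
            out = out + w * view
        return out
```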

🎯 Method

Architecture Overview

(Figure: architecture diagram, see `assest/difference.png`)

Pretraining Datasets

(Figure: pretraining dataset overview, see `assest/pretrain_data.png`)

Training Pipeline

πŸ”§ Detailed Training Process

```mermaid
graph LR
    A[Data Collection] --> B[Local Preprocessing]
    B --> C[Client Model Training]
    C --> D[Local Regularization]
    D --> E[Model Upload]
    E --> F[Server Aggregation]
    F --> G[Global Regularization]
    G --> H[Model Distribution]
    H --> C

    style A fill:#e1f5ff
    style C fill:#e1f5ff
    style F fill:#fff4e1
    style H fill:#e1ffe1
```

Key Components:

1. Client-Side Training
   - Local model optimization on private data
   - Client-specific pattern preservation
   - Local regularization for knowledge alignment
2. Server-Side Aggregation
   - Secure model averaging
   - Global knowledge alignment
   - Federated model distribution
3. Regularization Mechanism (a client-side sketch follows this list)
   - Dual-side alignment (client + server)
   - Balances shared vs. domain-specific knowledge
   - Ensures generalization across domains
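
One concrete reading of the client half of the dual-side alignment: minimize the task loss plus a penalty that keeps the local weights close to the broadcast global model. A hedged sketch of that objective (the proximal form and the `lam` coefficient are assumptions for illustration; the paper's exact regularizer may differ):

```python
import torch

def client_objective(task_loss, local_model, global_state, lam=0.01):
    """Task loss plus alignment to the broadcast global model (illustrative).

    `global_state` is the global model's state_dict received this round.
    """
    align = torch.zeros((), device=task_loss.device)
    for name, param in local_model.named_parameters():
        g = global_state[name].to(param.device)
        align = align + (param - g).pow(2).sum()
    return task_loss + lam * align
```

The server side mirrors this by aligning the aggregated model with client knowledge before redistribution.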

πŸ› οΈ Installation

Prerequisites

  • Python >= 3.8
  • PyTorch >= 2.0
  • CUDA (for GPU acceleration, recommended)

Setup

```bash
# Clone the repository
git clone https://github.com/shengchaochen82/FFTS.git
cd FFTS

# Create virtual environment
python -m venv ffts_env
source ffts_env/bin/activate  # On Windows: ffts_env\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
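
A quick sanity check after installation (a CUDA build of PyTorch is assumed if you plan to use a GPU; CPU-only installs will print `False`):

```python
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```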

Dependencies

πŸ“¦ Requirements List

```text
torch>=2.0.0
numpy>=1.21.0
pandas>=1.3.0
scikit-learn>=1.0.0
matplotlib>=3.4.0
tensorboard>=2.8.0
```

⚑ Quick Start

1️⃣ Prepare Data

```bash
# Download datasets from the Monash Time Series Repository
# Visit: https://forecastingdata.org/

# Follow the preprocessing steps in the notebook
jupyter notebook preprocessing.ipynb
```

πŸ’‘ Tip: The preprocessing.ipynb notebook provides unified preprocessing for all supported datasets.

2️⃣ Run Training

```bash
# Basic pretraining command
python main.py \
  --task pretrain \
  --task_note demo_run \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --global_rounds 10 \
  --local_epochs 5
```

3️⃣ Configuration Options

βš™οΈ Advanced Configuration
Argument Description Default
--task Task type (pretrain, forecasting, imputation, anomaly_detection) pretrain
--dataset Dataset name (weather, traffic, electricity, etc.) weather
--algorithm Federated algorithm (FFTS, FedAvg) FFTS
--global_rounds Number of federated learning rounds 10
--local_epochs Local training epochs per client 5
--batch_size Training batch size 32
--learning_rate Learning rate 0.001
--num_clients Number of federated clients 10
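
If you script around these flags, they map onto a standard argparse interface. A minimal sketch with defaults copied from the table above (illustrative; the actual `main.py` defines more options):

```python
import argparse

# Defaults mirror the configuration table above; main.py may differ.
parser = argparse.ArgumentParser(description="FFTS training (sketch)")
parser.add_argument("--task", default="pretrain")
parser.add_argument("--task_note", default="demo_run")
parser.add_argument("--is_training", type=int, default=1)
parser.add_argument("--algorithm", default="FFTS")
parser.add_argument("--dataset", default="weather")
parser.add_argument("--global_rounds", type=int, default=10)
parser.add_argument("--local_epochs", type=int, default=5)
parser.add_argument("--batch_size", type=int, default=32)
parser.add_argument("--learning_rate", type=float, default=0.001)
parser.add_argument("--num_clients", type=int, default=10)
args = parser.parse_args()
```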

Example: Forecasting Task

```bash
python main.py \
  --task long_term_forecast \
  --task_note weather_forecast \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --pred_len 96 \
  --seq_len 96
```
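
With `--seq_len 96 --pred_len 96`, each training sample pairs a 96-step input window with the following 96 steps as the forecast target. A small NumPy sketch of that windowing (illustrative, not the repo's data loader):

```python
import numpy as np

def make_windows(series, seq_len=96, pred_len=96):
    """Slice a 1-D series into (input, target) pairs for forecasting."""
    xs, ys = [], []
    for start in range(len(series) - seq_len - pred_len + 1):
        xs.append(series[start:start + seq_len])
        ys.append(series[start + seq_len:start + seq_len + pred_len])
    return np.stack(xs), np.stack(ys)

x, y = make_windows(np.sin(np.arange(500) / 10.0))
print(x.shape, y.shape)  # (309, 96) (309, 96)
```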

Example: Imputation Task

```bash
python main.py \
  --task imputation \
  --task_note weather_impute \
  --is_training 1 \
  --algorithm FFTS \
  --dataset weather \
  --mask_rate 0.2
```
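
`--mask_rate 0.2` hides a random 20% of the observed values, which the model is then trained to reconstruct. A minimal masking sketch (illustrative; the loss is typically evaluated only on the hidden positions):

```python
import torch

def random_mask(x, mask_rate=0.2):
    """Return the masked input and a mask (1 = observed, 0 = hidden)."""
    mask = (torch.rand_like(x) >= mask_rate).float()
    return x * mask, mask

x = torch.randn(32, 96, 7)   # (batch, time, channels)
x_masked, mask = random_mask(x)
# e.g. loss = (((pred - x) ** 2) * (1 - mask)).sum() / (1 - mask).sum()
```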

Example: Anomaly Detection

```bash
python main.py \
  --task anomaly_detection \
  --task_note traffic_anomaly \
  --is_training 1 \
  --algorithm FFTS \
  --dataset traffic
```
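
A reconstruction-based detector scores each time step by its reconstruction error and flags scores above a threshold. A hedged sketch of that scoring step (the repo's criterion may differ):

```python
import torch

def anomaly_scores(model, x):
    """Point-wise reconstruction error as the anomaly score (illustrative)."""
    model.eval()
    with torch.no_grad():
        recon = model(x)                    # assumes the model reconstructs x
    return ((x - recon) ** 2).mean(dim=-1)  # (batch, time)

# Flag points above, e.g., the 99th-percentile score:
# threshold = torch.quantile(scores.flatten(), 0.99)
# anomalies = scores > threshold
```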

πŸ“ Repository Structure

```text
FFTS/
β”œβ”€β”€ πŸ“‚ data_provider/           # Dataset loading and preprocessing
β”‚   β”œβ”€β”€ data_base.py           # Base dataset class
β”‚   β”œβ”€β”€ monash_data.py         # Monash dataset loader
β”‚   └── pre_loader.py          # Data preprocessing utilities
β”‚
β”œβ”€β”€ πŸ“‚ flcore/                  # Federated learning core components
β”‚   β”œβ”€β”€ servers/               # Server implementations
β”‚   β”‚   β”œβ”€β”€ serveravg.py       # FedAvg server
β”‚   β”‚   └── serverffts.py      # FFTS server
β”‚   β”œβ”€β”€ clients/               # Client implementations
β”‚   β”‚   β”œβ”€β”€ clientbase.py      # Base client class
β”‚   β”‚   └── clientavg.py       # FedAvg client
β”‚   └── layers/                # Neural network layers
β”‚       β”œβ”€β”€ Transformer_EncDec.py
β”‚       β”œβ”€β”€ SelfAttention_Family.py
β”‚       └── Embed.py
β”‚
β”œβ”€β”€ πŸ“‚ models/                  # Model definitions
β”‚   └── ffts_model.py          # FFTS model architecture
β”‚
β”œβ”€β”€ πŸ“‚ utils/                   # Utilities and helpers
β”‚   β”œβ”€β”€ metrics.py             # Evaluation metrics
β”‚   β”œβ”€β”€ tools.py               # Helper functions
β”‚   β”œβ”€β”€ losses.py              # Loss functions
β”‚   └── timefeatures.py        # Time feature extraction
β”‚
β”œβ”€β”€ πŸ“‚ assest/                  # Assets (images, diagrams)
β”‚   β”œβ”€β”€ difference.png         # Architecture diagram
β”‚   β”œβ”€β”€ adaption.png           # Adaptation architecture
β”‚   └── pretrain_data.png      # Dataset visualization
β”‚
β”œβ”€β”€ preprocessing.ipynb         # Unified preprocessing notebook
β”œβ”€β”€ main.py                     # Training entry point
β”œβ”€β”€ requirements.txt            # Python dependencies
β”œβ”€β”€ LICENSE                     # MIT License
└── README.md                   # This file
```

πŸ“œ Citation

If you find this work useful, please cite our paper:

```bibtex
@inproceedings{chen2025federated,
  title={Federated foundation models on heterogeneous time series},
  author={Chen, Shengchao and Long, Guodong and Jiang, Jing and Zhang, Chengqi},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={39},
  number={15},
  pages={15839--15847},
  year={2025},
  organization={AAAI Press}
}
```

🀝 Acknowledgments

Note

Development Status: We are continuously improving the codebase. Some interfaces may change as we enhance the framework.

This work was inspired and supported by:


πŸ“œ License

This project is licensed under the MIT License - see the LICENSE file for details.


πŸ“¬ Contact

  • Shengchao Chen - Bio
  • For questions, please open an issue

πŸ—ΊοΈ Roadmap

πŸš€ Development Timeline

```mermaid
timeline
    title FFTS Development Roadmap
    section 2024
        Dec 2024 : Paper accepted at AAAI 2025
        Dec 2024 : Preprint posted on arXiv
    section 2025
        Jan 2025 : Pretraining datasets available
        Jan 2025 : Preprocessing tutorials released
        Aug 2025 : Codebase restructured
        Aug 2025 : Learnable time-scale weights added
    section Future
        Q1 2026 : Extended experiments
        Q2 2026 : Additional dataset support
        Q3 2026 : Documentation enhancement
```

Completed βœ…

  • Release core codebase
  • Release detailed training tutorials
  • Pretraining data download and tutorials
  • Release AAAI 2025 paper
  • Implement federated learning framework
  • Add ATM module with learnable weights

πŸ”— Related Projects



Made with ❀️ by Shengchao Chen
