This repository contains the official implementation of our paper, accepted at Transactions on Machine Learning Research (TMLR), 2025.
To install and run the code locally:
```shell
git clone https://github.com/s-omranpour/HOT.git
cd HOT
pip install -r requirements.txt
```

First, download and extract the datasets from here. Then, run the following example command:
```shell
python timeseries_main.py --data_path=data/timeseries --name=weather --attention_type=kronecker_product ...
```

For a full list of available arguments, please refer to the `timeseries_main.py` file.
To run the MedMNIST3D classification experiments, run the following example command:

```shell
python medmnist_main.py --data_path=data/medmnist --name=organ --attention_type=kronecker_product ...
```

The corresponding MedMNIST3D datasets will be downloaded automatically when running the script.
To run the segmentation experiments, run the following example command:

```shell
python segmentation_main.py --data_path=data/ssl4eo-l --task=cdl --attention_type=kronecker_product ...
```

The SSL4EO-L dataset will be downloaded automatically when running the script.
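For intuition about the `--attention_type=kronecker_product` option, here is a minimal NumPy sketch of Kronecker-structured attention on a 2-mode input. It is an illustrative toy, not the repository's implementation: the mean-pooled per-mode queries/keys and the absence of learned projections and multiple heads are simplifying assumptions made here. The key point it demonstrates is that a per-mode factorization lets you apply the full Kronecker attention matrix without ever materializing it.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kronecker_attention(x):
    """Toy Kronecker-structured attention for a 2-mode input x of shape (N1, N2, d).

    One attention matrix is computed per mode (here, from mode-wise mean
    summaries; a simplifying assumption). Their Kronecker product acts on the
    flattened input, but is never materialized: each factor is contracted
    along its own axis instead.
    """
    N1, N2, d = x.shape
    q1 = k1 = x.mean(axis=1)                          # (N1, d) mode-1 summaries
    q2 = k2 = x.mean(axis=0)                          # (N2, d) mode-2 summaries
    A1 = softmax(q1 @ k1.T / np.sqrt(d), axis=-1)     # (N1, N1) mode-1 attention
    A2 = softmax(q2 @ k2.T / np.sqrt(d), axis=-1)     # (N2, N2) mode-2 attention
    # (A1 ⊗ A2) vec(x), computed mode-by-mode: cost O(N1^2·N2·d + N1·N2^2·d)
    # instead of the O(N1^2·N2^2·d) of a dense attention over all N1·N2 tokens.
    out = np.einsum('ij,kl,jld->ikd', A1, A2, x)
    return out, A1, A2
```

Contracting `A1` and `A2` along their own axes is exactly equivalent to multiplying the flattened input by `np.kron(A1, A2)`, which is what makes the structured variant cheaper than dense attention over all `N1 * N2` tokens.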
If you find this work useful for your research, please consider citing our paper:
```bibtex
@article{omranpour2025hot,
  title={Higher-Order Transformers with Kronecker-Structured Attention},
  author={Omranpour, Soroush and Rabusseau, Guillaume and Rabbany, Reihaneh},
  journal={Transactions on Machine Learning Research},
  year={2025}
}
```