
Lund University Publications


MHCCL : Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series

Meng, Qianwen; Qian, Hangwei; Liu, Yong; Cui, Lizhen; Xu, Yonghui and Shen, Zhiqi (2023) 37th AAAI Conference on Artificial Intelligence, AAAI 2023, 37, pp. 9153–9161
Abstract


Learning semantic-rich representations from raw unlabeled time series data is critical for downstream tasks such as classification and forecasting. Contrastive learning has recently shown its promising representation learning capability in the absence of expert annotations. However, existing contrastive approaches generally treat each instance independently, which leads to false negative pairs that share the same semantics. To tackle this problem, we propose MHCCL, a Masked Hierarchical Cluster-wise Contrastive Learning model, which exploits semantic information obtained from the hierarchical structure consisting of multiple latent partitions for multivariate time series. Motivated by the observation that fine-grained clustering preserves higher purity while coarse-grained one reflects higher-level semantics, we propose a novel downward masking strategy to filter out fake negatives and supplement positives by incorporating the multi-granularity information from the clustering hierarchy. In addition, a novel upward masking strategy is designed in MHCCL to remove outliers of clusters at each partition to refine prototypes, which helps speed up the hierarchical clustering process and improves the clustering quality. We conduct experimental evaluations on seven widely-used multivariate time series datasets. The results demonstrate the superiority of MHCCL over the state-of-the-art approaches for unsupervised time series representation learning.
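The two masking strategies in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, toy cluster labels, and the `keep_ratio` parameter are all assumptions made for illustration. Downward masking uses a two-level cluster hierarchy to drop likely false negatives (same fine-grained cluster) and supplement positives (same coarse-grained cluster); upward masking drops outliers before recomputing a cluster prototype.

```python
import numpy as np

def downward_mask(fine_labels, coarse_labels):
    """Build positive/negative masks for cluster-wise contrastive learning.

    For each anchor instance i:
      - instances in the same fine-grained cluster are removed from the
        negative set (likely false negatives sharing the same semantics);
      - instances in the same coarse-grained cluster are treated as extra
        positives (higher-level semantics from the hierarchy).
    """
    fine = np.asarray(fine_labels)
    coarse = np.asarray(coarse_labels)
    same_fine = fine[:, None] == fine[None, :]
    same_coarse = coarse[:, None] == coarse[None, :]
    self_mask = np.eye(len(fine), dtype=bool)
    negatives = ~same_fine & ~self_mask   # keep only "true" negatives
    positives = same_coarse & ~self_mask  # supplemented positive pairs
    return positives, negatives

def refine_prototype(X, keep_ratio=0.8):
    """Upward-masking sketch: drop the points farthest from the cluster
    mean, then recompute the prototype from the remaining points."""
    proto = X.mean(axis=0)
    dist = np.linalg.norm(X - proto, axis=1)
    k = max(1, int(np.ceil(keep_ratio * len(X))))
    keep = np.argsort(dist)[:k]           # indices of the k closest points
    return X[keep].mean(axis=0)

# Toy example: 6 instances, 3 fine clusters nested in 2 coarse clusters.
fine =   [0, 0, 1, 1, 2, 2]
coarse = [0, 0, 0, 0, 1, 1]
pos, neg = downward_mask(fine, coarse)

# Upward masking on a toy cluster where the last point is an outlier.
X = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 10.0]])
proto = refine_prototype(X, keep_ratio=0.6)
```

In the toy example, instance 2 is no longer a negative for instance 0 (they share coarse cluster 0 and become a positive pair), while instance 4 remains a valid negative; the refined prototype averages only the two inlier points.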

author
Meng, Qianwen; Qian, Hangwei; Liu, Yong; Cui, Lizhen; Xu, Yonghui and Shen, Zhiqi
organization
publishing date
2023
type
Chapter in Book/Report/Conference proceeding
publication status
published
subject
host publication
Proceedings of the 37th AAAI Conference on Artificial Intelligence
editor
Williams, Brian ; Chen, Yiling and Neville, Jennifer
volume
37
pages
9 pages
publisher
AAAI Press
conference name
37th AAAI Conference on Artificial Intelligence, AAAI 2023
conference location
Washington, DC, United States
conference dates
2023-02-07 - 2023-02-14
external identifiers
  • scopus:85168234624
ISBN
9781577358800
DOI
10.1609/aaai.v37i8.26098
language
English
LU publication?
yes
id
ccecb081-570f-4a36-afb0-9888998d08a5
date added to LUP
2023-11-06 10:01:23
date last changed
2023-11-06 10:01:23
@inproceedings{ccecb081-570f-4a36-afb0-9888998d08a5,
  abstract     = {{Learning semantic-rich representations from raw unlabeled time series data is critical for downstream tasks such as classification and forecasting. Contrastive learning has recently shown its promising representation learning capability in the absence of expert annotations. However, existing contrastive approaches generally treat each instance independently, which leads to false negative pairs that share the same semantics. To tackle this problem, we propose MHCCL, a Masked Hierarchical Cluster-wise Contrastive Learning model, which exploits semantic information obtained from the hierarchical structure consisting of multiple latent partitions for multivariate time series. Motivated by the observation that fine-grained clustering preserves higher purity while coarse-grained one reflects higher-level semantics, we propose a novel downward masking strategy to filter out fake negatives and supplement positives by incorporating the multi-granularity information from the clustering hierarchy. In addition, a novel upward masking strategy is designed in MHCCL to remove outliers of clusters at each partition to refine prototypes, which helps speed up the hierarchical clustering process and improves the clustering quality. We conduct experimental evaluations on seven widely-used multivariate time series datasets. The results demonstrate the superiority of MHCCL over the state-of-the-art approaches for unsupervised time series representation learning.}},
  author       = {{Meng, Qianwen and Qian, Hangwei and Liu, Yong and Cui, Lizhen and Xu, Yonghui and Shen, Zhiqi}},
  booktitle    = {{Proceedings of the 37th AAAI Conference on Artificial Intelligence}},
  editor       = {{Williams, Brian and Chen, Yiling and Neville, Jennifer}},
  isbn         = {{9781577358800}},
  language     = {{eng}},
  pages        = {{9153--9161}},
  publisher    = {{AAAI Press}},
  title        = {{MHCCL : Masked Hierarchical Cluster-Wise Contrastive Learning for Multivariate Time Series}},
  url          = {{http://dx.doi.org/10.1609/aaai.v37i8.26098}},
  doi          = {{10.1609/aaai.v37i8.26098}},
  volume       = {{37}},
  year         = {{2023}},
}