The Recurrent Sticky Hierarchical Dirichlet Process Hidden Markov Model

Bibliographic Details
Title: The Recurrent Sticky Hierarchical Dirichlet Process Hidden Markov Model
Authors: Słupiński, Mikołaj; Lipiński, Piotr
Publication Year: 2024
Collection: Computer Science; Mathematics; Statistics
Subject Terms: Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Mathematics - Dynamical Systems, Statistics - Machine Learning
More Details: The Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) is a natural Bayesian nonparametric extension of the classical Hidden Markov Model for learning from (spatio-)temporal data. The sticky HDP-HMM was proposed to strengthen the self-persistence probability in the HDP-HMM, and the disentangled sticky HDP-HMM was subsequently proposed to disentangle the strength of the self-persistence prior from that of the transition prior. However, the sticky HDP-HMM assumes that the self-persistence probability is stationary, which limits its expressiveness. Here, we build on the sticky HDP-HMM and the disentangled sticky HDP-HMM to develop a more general model: the recurrent sticky HDP-HMM (RS-HDP-HMM). We also develop a novel Gibbs sampling strategy for efficient inference in this model. We show that the RS-HDP-HMM outperforms the disentangled sticky HDP-HMM, the sticky HDP-HMM, and the HDP-HMM on both synthetic and real data segmentation tasks. (A brief sketch of the baseline sticky HDP-HMM formulation follows these bibliographic details.)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2411.04278
Accession Number: edsarx.2411.04278
Database: arXiv
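
Background sketch for the abstract above: a minimal statement of the standard sticky HDP-HMM generative formulation (Fox et al.), which the paper's RS-HDP-HMM is described as generalizing. The symbols (concentration parameters gamma and alpha, stickiness kappa, global weights beta, per-state transition rows pi_j, hidden states z_t, emission parameters theta_k, observations y_t) follow the usual literature convention and are not drawn from this record.

  \begin{align*}
    \beta \mid \gamma &\sim \mathrm{GEM}(\gamma) && \text{(global transition weights)} \\
    \pi_j \mid \alpha,\kappa,\beta &\sim \mathrm{DP}\!\Big(\alpha+\kappa,\ \tfrac{\alpha\beta + \kappa\,\delta_j}{\alpha+\kappa}\Big), \quad j = 1, 2, \dots && \text{(sticky transition rows)} \\
    z_t \mid z_{t-1} &\sim \pi_{z_{t-1}}, \qquad \theta_k \sim H, \qquad y_t \mid z_t \sim F(\theta_{z_t}) && \text{(states and emissions)}
  \end{align*}

Setting kappa = 0 recovers the plain HDP-HMM, while a larger kappa biases each row pi_j toward self-transition. Per the abstract, this self-persistence bias is fixed over time in the sticky model, and the recurrent variant relaxes that stationarity; the exact mechanism is not specified in this record.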
FullText Availability: 0
CustomLinks:
  – Url: http://arxiv.org/abs/2411.04278
    Name: EDS - Arxiv
    Category: fullText
    Text: View this record from Arxiv
    MouseOverText: View this record from Arxiv
  – Url: https://resolver.ebsco.com/c/xy5jbn/result?sid=EBSCO:edsarx&genre=article&issn=&ISBN=&volume=&issue=&date=20241106&spage=&pages=&title=The Recurrent Sticky Hierarchical Dirichlet Process Hidden Markov Model&atitle=The%20Recurrent%20Sticky%20Hierarchical%20Dirichlet%20Process%20Hidden%20Markov%20Model&aulast=S%C5%82upi%C5%84ski%2C%20Miko%C5%82aj&id=DOI:
    Name: Full Text Finder (for New FTF UI) (s8985755)
    Category: fullText
    Text: Find It @ SCU Libraries
    MouseOverText: Find It @ SCU Libraries
Header DbId: edsarx
DbLabel: arXiv
An: edsarx.2411.04278
AccessLevel: 3
PubType: Report
PubTypeId: report
Items – Name: Title
  Label: Title
  Group: Ti
  Data: The Recurrent Sticky Hierarchical Dirichlet Process Hidden Markov Model
– Name: Author
  Label: Authors
  Group: Au
Data: Słupiński, Mikołaj; Lipiński, Piotr
– Name: DatePubCY
  Label: Publication Year
  Group: Date
  Data: 2024
– Name: Subset
  Label: Collection
  Group: HoldingsInfo
Data: Computer Science; Mathematics; Statistics
– Name: Subject
  Label: Subject Terms
  Group: Su
Data: Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Mathematics - Dynamical Systems; Statistics - Machine Learning
– Name: Abstract
  Label: Description
  Group: Ab
Data: The Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) is a natural Bayesian nonparametric extension of the classical Hidden Markov Model for learning from (spatio-)temporal data. The sticky HDP-HMM was proposed to strengthen the self-persistence probability in the HDP-HMM, and the disentangled sticky HDP-HMM was subsequently proposed to disentangle the strength of the self-persistence prior from that of the transition prior. However, the sticky HDP-HMM assumes that the self-persistence probability is stationary, which limits its expressiveness. Here, we build on the sticky HDP-HMM and the disentangled sticky HDP-HMM to develop a more general model: the recurrent sticky HDP-HMM (RS-HDP-HMM). We also develop a novel Gibbs sampling strategy for efficient inference in this model. We show that the RS-HDP-HMM outperforms the disentangled sticky HDP-HMM, the sticky HDP-HMM, and the HDP-HMM on both synthetic and real data segmentation tasks.
– Name: TypeDocument
  Label: Document Type
  Group: TypDoc
  Data: Working Paper
– Name: URL
  Label: Access URL
  Group: URL
Data: http://arxiv.org/abs/2411.04278
– Name: AN
  Label: Accession Number
  Group: ID
  Data: edsarx.2411.04278
PLink https://login.libproxy.scu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&site=eds-live&scope=site&db=edsarx&AN=edsarx.2411.04278
RecordInfo BibRecord:
  BibEntity:
    Subjects:
      – SubjectFull: Computer Science - Machine Learning
        Type: general
      – SubjectFull: Computer Science - Artificial Intelligence
        Type: general
      – SubjectFull: Mathematics - Dynamical Systems
        Type: general
      – SubjectFull: Statistics - Machine Learning
        Type: general
    Titles:
      – TitleFull: The Recurrent Sticky Hierarchical Dirichlet Process Hidden Markov Model
        Type: main
  BibRelationships:
    HasContributorRelationships:
      – PersonEntity:
          Name:
            NameFull: Słupiński, Mikołaj
      – PersonEntity:
          Name:
            NameFull: Lipiński, Piotr
    IsPartOfRelationships:
      – BibEntity:
          Dates:
            – D: 06
              M: 11
              Type: published
              Y: 2024