Optimal fractional linear prediction with restricted memory

Tomas Skovranek*, Vladimir Despotovic, Zoran Peric

*Corresponding author for this work

Research output: Contribution to journal › Article › Research › peer-review

9 Citations (Scopus)

Abstract

Linear prediction is extensively used in the modeling, compression, coding, and generation of speech signals. Various formulations of linear prediction exist, in both the time and frequency domains, which start from different assumptions but lead to the same solution. In this letter, we propose a novel, generalized formulation of optimal low-order linear prediction using fractional (non-integer) derivatives. The proposed fractional-derivative formulation allows the definition of a predictor with versatile behavior governed by the order of the fractional derivative. We derive closed-form expressions for the optimal fractional linear predictor with restricted memory, and prove that the optimal first-order and optimal second-order linear predictors are special cases of it. Furthermore, we show empirically that the optimal order of the fractional derivative can be approximated by the inverse of the predictor memory, and is thus known a priori. The complexity is therefore reduced by optimizing and transferring only one predictor coefficient, i.e., one parameter fewer than the second-order linear predictor, at the same level of performance.
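The abstract's claim that the first-order predictor is a special case of the fractional formulation can be illustrated with a small sketch. The closed-form expressions are in the letter itself; the one-coefficient predictor form below (a single gain `a` applied to Grünwald–Letnikov binomial weights over a memory of `M` past samples) is an assumption for illustration, not the paper's exact derivation. Note that for fractional order `alpha = 1` the weights collapse to a single tap on `x[n-1]`, recovering the classical first-order linear predictor.

```python
import numpy as np

def gl_weights(alpha, memory):
    """Grünwald–Letnikov binomial weights w_k = (-1)^k * C(alpha, k),
    truncated to a restricted memory of `memory` samples."""
    w = np.empty(memory + 1)
    w[0] = 1.0
    for k in range(1, memory + 1):
        # Recurrence for (-1)^k * C(alpha, k)
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def fractional_predict(x, a, alpha, memory):
    """One-coefficient fractional predictor (hypothetical illustrative form):
        x_hat[n] = a * sum_{k=1}^{M} (-(-1)^k C(alpha, k)) * x[n-k]
    For alpha = 1 this reduces to x_hat[n] = a * x[n-1]."""
    taps = -gl_weights(alpha, memory)[1:]  # weights on x[n-1], ..., x[n-M]
    x_hat = np.zeros_like(x, dtype=float)
    for n in range(memory, len(x)):
        past = x[n - memory:n][::-1]       # [x[n-1], ..., x[n-M]]
        x_hat[n] = a * np.dot(taps, past)
    return x_hat
```

With `alpha = 1` and any memory, only the first tap is nonzero, so the predictor degenerates to the optimal first-order form; per the abstract's empirical finding, a practical choice of the fractional order would be `alpha = 1 / memory`, leaving only `a` to optimize and transfer.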

Original language: English
Article number: 8676355
Pages (from-to): 760-764
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 26
Issue number: 5
DOIs
Publication status: Published - May 2019
Externally published: Yes

Keywords

  • Fractional calculus
  • linear prediction
  • restricted memory
  • speech processing
