Type of Document: Dissertation
Author: Zakaria, Gaguk
Author's Email Address: email@example.com
URN: etd-02242000-10250027
Title: Cascade RLS with Subsection Adaptation
Degree: PhD
Department: Electrical and Computer Engineering

Advisory Committee:
- Beex, A. A. Louis, Committee Chair
- Ball, Joseph A., Committee Member
- Moose, Richard L., Committee Member
- Reed, Jeffrey Hugh, Committee Member
- VanLandingham, Hugh F., Committee Member

Keywords:
- Adaptive Filtering
- AR Process
- Cascade Structure
Date of Defense: 2000-02-14
Availability: unrestricted

Abstract

CASCADE RLS WITH SUBSECTION ADAPTATION
Prof. A. A. (Louis) Beex, Chairman
The Bradley Department of Electrical and Computer Engineering
Speech coding, or speech compression, is an important aspect of modern speech communications. Coding the speech reduces the rate required to transmit the digitized signal, called the bit rate. For a given speech communications channel, the lower the bit rate of the speech coder, the more communicating parties the channel can carry. The main application of this research is the extraction of the parameters of human speech for speech coding purposes.
We propose an RLS-based cascade adaptive filter structure that significantly reduces the computational effort required by the RLS algorithm for inverse filtering types of applications; we name it the Cascade RLS with Subsection Adaptation (CRLS-SA) algorithm. The reduction in computational effort comes from the fact that, for inverse filtering applications, the gradients of each section in the cascade are almost uncorrelated with the gradients of the other sections. Hence, the gradient autocorrelation matrix is assumed to be block diagonal. Since we use a second-order filter for each section, the adaptation of a section involves only the 2x2 gradient autocorrelation matrix for that section, while still being based on a global minimization criterion. The gradient signal of a section is defined as the derivative of the overall output error with respect to the coefficients of that particular section; it can be computed efficiently by passing the overall output of the cascade through a filter whose coefficients are derived from the coefficients of that section. The computational effort of the CRLS-SA algorithm is approximately 20*L*N/2, where L is the data record length and N is the order of the filter.
We analyze the convergence rate of the CRLS-SA algorithm using the convergence time constant concept, defined as the ratio of the condition number to the sensitivity. The CRLS-SA structure is shown to satisfy the DeBrunner-Beex conjecture, which states that a structure with a smaller convergence time constant converges faster than a structure with a larger one. We show that the convergence time constant of CRLS-SA is lower than that of the Direct Form RLS (DFRLS) algorithm, and that CRLS-SA converges faster. The convergence behavior is verified by observing how fast the estimated system approaches the true system, using the Itakura distance as the measure of closeness between the two. The Itakura distance associated with the CRLS-SA algorithm approaches zero faster than that associated with the direct form RLS algorithm.
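The Itakura distance used as the closeness measure can be sketched as follows. This is the standard formulation (the log ratio of prediction-error powers under the true process autocorrelation), not code from the dissertation; the FFT-based autocorrelation helper and the function names are our assumptions.

```python
import numpy as np

def ar_autocorr(a, nlags, nfft=8192):
    """Autocorrelation sequence of a unit-variance-driven AR process with
    prediction polynomial a (a[0] == 1), via the inverse FFT of its power
    spectrum 1/|A(e^jw)|^2.  nfft is chosen large so aliasing of the
    slowly decaying correlation is negligible."""
    A = np.fft.rfft(a, nfft)
    r = np.fft.irfft(1.0 / np.abs(A) ** 2, nfft)
    return r[: nlags + 1]

def itakura_distance(a_true, a_est):
    """Itakura distance d = log((a_est' R a_est) / (a_true' R a_true)),
    where R is the autocorrelation matrix of the true process.  Since the
    true prediction polynomial minimizes the quadratic form over monic
    polynomials, d >= 0, with d = 0 iff the models coincide."""
    a_t = np.asarray(a_true, dtype=float)
    a_e = np.asarray(a_est, dtype=float)
    p = len(a_t) - 1
    r = ar_autocorr(a_t, p)
    idx = np.arange(p + 1)
    R = r[np.abs(idx[:, None] - idx[None, :])]   # Toeplitz autocorrelation matrix
    return float(np.log((a_e @ R @ a_e) / (a_t @ R @ a_t)))
```

Tracking this distance over time for two adaptive structures makes "approaches the true system faster" a concrete, plottable comparison.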
The CRLS-SA algorithm is applied in this dissertation to general linear prediction, to the direct adaptive computation of the line spectral frequencies (LSF) and their representation in quantized form using a split vector quantization (VQ) approach, and to the detection and tracking of the frequencies in signals consisting of multiple sinusoids in noise.
Approximate download times are given as Hours:Minutes:Seconds.

Filename                   Size        28.8 Modem   56K Modem    ISDN (64 Kb)  ISDN (128 Kb)  Higher-speed Access
abstract.PDF               13.16 Kb    00:00:03     00:00:01     00:00:01      < 00:00:01     < 00:00:01
Acknowledgements.pdf       6.55 Kb     00:00:01     < 00:00:01   < 00:00:01    < 00:00:01     < 00:00:01
Biography.PDF              16.19 Kb    00:00:04     00:00:02     00:00:02      00:00:01       < 00:00:01
chapter1.PDF               27.30 Kb    00:00:07     00:00:03     00:00:03      00:00:01       < 00:00:01
chapter2.PDF               716.02 Kb   00:03:18     00:01:42     00:01:29      00:00:44       00:00:03
chapter3.PDF               210.97 Kb   00:00:58     00:00:30     00:00:26      00:00:13       00:00:01
chapter4.PDF               154.72 Kb   00:00:42     00:00:22     00:00:19      00:00:09       < 00:00:01
chapter5.PDF               355.26 Kb   00:01:38     00:00:50     00:00:44      00:00:22       00:00:01
chapter6.PDF               12.73 Kb    00:00:03     00:00:01     00:00:01      < 00:00:01     < 00:00:01
chapter7.PDF               62.25 Kb    00:00:17     00:00:08     00:00:07      00:00:03       < 00:00:01
list_of_abbreviations.PDF  9.05 Kb     00:00:02     00:00:01     00:00:01      < 00:00:01     < 00:00:01
list_of_figures.PDF        25.15 Kb    00:00:06     00:00:03     00:00:03      00:00:01       < 00:00:01
list_of_tables.PDF         7.82 Kb     00:00:02     00:00:01     < 00:00:01    < 00:00:01     < 00:00:01
table_of_contents.PDF      16.78 Kb    00:00:04     00:00:02     00:00:02      00:00:01       < 00:00:01
title_page.pdf             6.14 Kb     00:00:01     < 00:00:01   < 00:00:01    < 00:00:01     < 00:00:01