Current Issue: October-December | Volume: 2013 | Issue Number: 4 | Articles: 5
This paper presents an efficient electrocardiogram (ECG) signal compression technique based on QRS detection, estimation, and 2-D DWT coefficient thresholding. First, the original ECG signal is preprocessed by detecting the QRS complex; then the difference between the preprocessed ECG signal and the estimated QRS-complex waveform is computed. 2-D approaches exploit the fact that ECG signals generally show redundancy between adjacent beats and between adjacent samples. The error signal is cut and aligned to form a 2-D matrix, the matrix is wavelet transformed, and the resulting wavelet coefficients are segmented into groups and thresholded. Two grouping techniques are proposed to segment the DWT coefficients. The threshold level of each group of coefficients is calculated from the entropy of the coefficients. The thresholded DWT coefficients are then coded using the coding technique given by Abo-Zahhad and Rajoub (2002). The compression algorithm is tested on 24 different records selected from the MIT-BIH Arrhythmia Database. The experimental results show that the proposed method achieves a high compression ratio with relatively low distortion and low computational complexity in comparison with other methods...
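The core pipeline described above (align beats into a 2-D matrix, wavelet-transform it, threshold the detail coefficients, count the surviving coefficients) can be sketched as follows. This is a minimal illustration, not the paper's implementation: a one-level Haar transform stands in for the paper's wavelet, a single magnitude threshold stands in for the entropy-derived per-group thresholds, synthetic "beats" stand in for MIT-BIH records, and the `haar2d`/`ihaar2d` helpers are hypothetical.

```python
import numpy as np

def haar2d(x):
    """One-level 2-D Haar DWT (rows, then columns); both dims of x must be even."""
    s2 = np.sqrt(2.0)
    lo = (x[:, ::2] + x[:, 1::2]) / s2          # row low-pass
    hi = (x[:, ::2] - x[:, 1::2]) / s2          # row high-pass
    cA, cV = (lo[::2] + lo[1::2]) / s2, (lo[::2] - lo[1::2]) / s2
    cH, cD = (hi[::2] + hi[1::2]) / s2, (hi[::2] - hi[1::2]) / s2
    return cA, cH, cV, cD

def ihaar2d(cA, cH, cV, cD):
    """Exact inverse of haar2d."""
    s2 = np.sqrt(2.0)
    def up(a, d):
        m = np.empty((2 * a.shape[0], a.shape[1]))
        m[::2], m[1::2] = (a + d) / s2, (a - d) / s2
        return m
    lo, hi = up(cA, cV), up(cH, cD)
    x = np.empty((lo.shape[0], 2 * lo.shape[1]))
    x[:, ::2], x[:, 1::2] = (lo + hi) / s2, (lo - hi) / s2
    return x

# Toy "beat matrix": 16 aligned beats (rows) of 64 samples each, built from a
# shared QRS-like spike plus small beat-to-beat variation.
t = np.linspace(0.0, 1.0, 64)
beats = np.vstack([np.exp(-((t - 0.5) ** 2) / 0.002) + 0.01 * np.sin(40 * t + i)
                   for i in range(16)])

cA, cH, cV, cD = haar2d(beats)

# Magnitude threshold on the detail subbands (stands in for the paper's
# entropy-based group thresholds); the approximation subband cA is kept intact.
thr = 0.05
cH, cV, cD = [np.where(np.abs(c) >= thr, c, 0.0) for c in (cH, cV, cD)]

rec = ihaar2d(cA, cH, cV, cD)
kept = cA.size + sum(int(np.count_nonzero(c)) for c in (cH, cV, cD))
cr = beats.size / kept                                      # crude compression ratio
prd = 100.0 * np.linalg.norm(rec - beats) / np.linalg.norm(beats)
```

On this toy data most detail coefficients fall below the threshold, so the coefficient count drops well below the sample count while the percentage root-mean-square difference (PRD) stays small.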
Proper control of distillation columns requires estimating some key variables that are challenging to measure online (such as compositions), which are usually estimated using inferential models. Commonly used inferential models include latent variable regression (LVR) techniques, such as principal component regression (PCR), partial least squares (PLS), and regularized canonical correlation analysis (RCCA). Unfortunately, measured practical data are usually contaminated with errors, which degrade the prediction abilities of inferential models. Therefore, noisy measurements need to be filtered to enhance the prediction accuracy of these models. Multiscale filtering has been shown to be a powerful feature extraction tool. In this work, the advantages of multiscale filtering are utilized to enhance the prediction accuracy of LVR models by developing an integrated multiscale LVR (IMSLVR) modeling algorithm that integrates modeling and feature extraction. The idea behind the IMSLVR modeling algorithm is to filter the process data at different decomposition levels, model the filtered data from each level, and then select the LVR model that optimizes a model selection criterion. The performance of the developed IMSLVR algorithm is illustrated using three examples, one using synthetic data, one using simulated distillation column data, and one using experimental packed bed distillation column data. All examples clearly demonstrate the effectiveness of the IMSLVR algorithm over the conventional methods...
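The level-selection loop at the heart of this idea (filter at several levels, fit a model per level, keep the best) can be sketched as follows, under loud assumptions: a moving-average low-pass stands in for proper wavelet multiscale filtering, PCR is the only LVR technique tried, validation MSE is the model selection criterion, the data are synthetic, and the helper names (`lowpass`, `pcr_fit`, `pcr_predict`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def lowpass(X, width):
    """Per-column moving average; a simple stand-in for the wavelet
    approximation signal at one decomposition level."""
    if width == 1:
        return X
    k = np.ones(width) / width
    return np.column_stack([np.convolve(X[:, j], k, mode="same")
                            for j in range(X.shape[1])])

def pcr_fit(X, y, n_comp):
    """Principal component regression via SVD of the centered inputs."""
    Xm, ym = X.mean(axis=0), y.mean()
    U, s, Vt = np.linalg.svd(X - Xm, full_matrices=False)
    # Regress y on the first n_comp score vectors, map back to input space.
    coef = Vt[:n_comp].T @ (U[:, :n_comp].T @ (y - ym) / s[:n_comp])
    return Xm, ym, coef

def pcr_predict(model, X):
    Xm, ym, coef = model
    return ym + (X - Xm) @ coef

# Synthetic "process data": slowly varying signals plus measurement noise.
n, p = 300, 5
t = np.arange(n)
X_clean = np.column_stack([np.sin(0.02 * (j + 1) * t + j) for j in range(p)])
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X_clean @ w_true + 0.1 * rng.normal(size=n)
X = X_clean + 0.3 * rng.normal(size=(n, p))

train, val = slice(0, 200), slice(200, n)
mses = {}
for width in (1, 2, 4, 8):            # wider window ~ deeper decomposition level
    Xf = lowpass(X, width)
    model = pcr_fit(Xf[train], y[train], n_comp=4)
    pred = pcr_predict(model, Xf[val])
    mses[width] = float(np.mean((pred - y[val]) ** 2))
best_width = min(mses, key=mses.get)  # level whose model optimizes the criterion
```

Because the unfiltered case (width 1) is always a candidate, the selected model can never do worse on the validation criterion than fitting the raw data directly, which is the point of integrating filtering with model selection.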
We used a physically motivated internal state variable plasticity/damage model containing a mathematical length scale to idealize the material response in finite element simulations of a large-scale boundary value problem. The problem consists of a moving striker colliding against a stationary hazmat tank car. The motivations are (1) to reproduce with high fidelity the finite deformation and temperature histories, damage, and high-rate phenomena that may arise during the impact accident and (2) to address the pathological mesh size issues of the material postbifurcation regime. We introduce the mathematical length scale in the model by adopting a nonlocal evolution equation for the damage, as suggested by Pijaudier-Cabot and Bazant in the context of concrete. We implement this evolution equation into existing finite element subroutines of the plasticity/failure model. The results of the simulations, carried out with the aid of the Abaqus/Explicit finite element code, show that the material model, accounting for temperature histories and nonlocal damage effects, satisfactorily predicts the damage progression during the tank car impact accident and significantly reduces the pathological mesh size effects...
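The integral-type nonlocal averaging that underlies this kind of regularization can be sketched in one dimension. This is a generic illustration of the Pijaudier-Cabot and Bazant weighted-average idea, not the paper's evolution equation or its Abaqus subroutines; the Gaussian weight function and the `nonlocal_damage` helper are assumptions.

```python
import numpy as np

def nonlocal_damage(d_local, x, length_scale):
    """Replace each point's local damage by a normalized Gaussian-weighted
    average over its neighborhood; length_scale is the internal (material)
    length that smears localization over a fixed physical width."""
    r = x[:, None] - x[None, :]                      # pairwise distances
    w = np.exp(-(r / length_scale) ** 2)             # Gaussian weights
    w /= w.sum(axis=1, keepdims=True)                # each row sums to 1
    return w @ d_local

# 1-D bar whose local damage has collapsed onto a single point, the kind of
# mesh-dependent localization the nonlocal formulation is meant to control.
x = np.linspace(0.0, 1.0, 101)
d_loc = np.zeros_like(x)
d_loc[50] = 0.9
d_nl = nonlocal_damage(d_loc, x, length_scale=0.05)
```

Because each nonlocal value is a convex combination of local values, the spike is capped and spread over a band set by the length scale rather than by the mesh spacing, which is why refining the mesh no longer sharpens the damage zone indefinitely.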
Computer-aided modeling and simulation are crucial steps in developing, integrating, and optimizing unit operations and, subsequently, entire processes in the chemical/pharmaceutical industry. This study details two methods of reducing the computational time needed to solve complex process models, namely population balance models, which, depending on the source terms, can be very computationally intensive. Population balance models are widely used to describe the time evolutions and distributions of many particulate processes, and their efficient and quick simulation would be very beneficial. The first method utilizes MATLAB's Parallel Computing Toolbox (PCT), and the second makes use of another toolbox, JACKET, to speed up computations on the CPU and GPU, respectively. Results indicate a significant reduction in computational time for the same accuracy using multicore CPUs. Many-core platforms such as GPUs are also promising for computational time reduction on larger problems, despite the limitations of lower clock speed and device memory. This lends credence to the use of high-fidelity models (in place of reduced-order models) for control and optimization of particulate processes...
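To make concrete why population balance models are computationally heavy, here is a minimal growth-only population balance, dn/dt + G dn/dx = 0, solved with first-order upwind differencing. It is a generic textbook discretization, not the paper's model or its PCT/JACKET code; in practice many such time-stepping runs (parameter sweeps, multiple size classes, source terms) are independent of one another, which is exactly the concurrency that multicore CPUs and GPUs exploit.

```python
import numpy as np

def grow_pbm(n0, G, dx, dt, steps):
    """Explicit first-order upwind solve of dn/dt + G*dn/dx = 0 (G > 0):
    a growth-only population balance on a uniform size grid.
    Each of these runs is independent of any other parameter set, so a
    sweep over (G, n0) pairs parallelizes trivially across cores."""
    c = G * dt / dx                    # CFL number; need c <= 1 for stability
    assert c <= 1.0
    n = n0.astype(float).copy()
    for _ in range(steps):
        # n_new[i] = (1 - c) * n[i] + c * n[i-1]: a convex combination,
        # so the distribution stays non-negative.
        n[1:] = n[1:] - c * (n[1:] - n[:-1])
        n[0] = n[0] - c * n[0]         # zero-nucleation inflow boundary
    return n

dx, dt, G = 1.0, 0.5, 1.0              # CFL number c = 0.5
x = np.arange(200, dtype=float)        # size grid
n0 = np.exp(-0.5 * ((x - 40.0) / 5.0) ** 2)   # initial size distribution
n = grow_pbm(n0, G, dx, dt, steps=80)  # pulse advects 80 * 0.5 = 40 cells
```

Even this stripped-down scheme costs O(grid points x time steps) per run; realistic source terms (aggregation, breakage) add integral terms per step, which is what makes the full models expensive enough to justify parallel hardware.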
A mathematical model was developed for predicting the drying kinetics of spherical particles in a rotary dryer. Drying experiments were carried out by drying fermented ground cassava particles in a bench-scale rotary dryer at inlet air temperatures of 115-230 °C, air velocities of 0.83-1.55 m/s, feed masses of 50-500 g, a drum drive speed of 8 rpm, and a feed drive speed of 100 rpm to validate the model. The data obtained from the experiments were used to calculate the experimental moisture ratio, which compared well with the theoretical moisture ratio calculated from the newly developed Abowei-Ademiluyi model. The comparisons and correlations of the results indicate that the validation and performance of the established model are rather reasonable...
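The moisture-ratio comparison described above can be sketched generically. The exact form of the Abowei-Ademiluyi model is not given here, so the common Lewis (Newton) thin-layer model MR = exp(-kt) stands in for the theoretical curve, and the "measured" moisture data are synthetic; `moisture_ratio` and `lewis_model` are hypothetical helpers.

```python
import numpy as np

def moisture_ratio(M, M0, Me):
    """Experimental moisture ratio from dry-basis moisture contents:
    M at time t, initial M0, equilibrium Me."""
    return (M - Me) / (M0 - Me)

def lewis_model(t, k):
    """Lewis (Newton) thin-layer drying model, MR = exp(-k t); a generic
    stand-in for a theoretical moisture-ratio curve."""
    return np.exp(-k * t)

t = np.linspace(0.0, 120.0, 25)        # drying time in minutes (illustrative)
M0, Me, k = 1.8, 0.08, 0.03            # kg water / kg dry solid; rate in 1/min
M = Me + (M0 - Me) * np.exp(-k * t)    # synthetic "measured" moisture contents
mr_exp = moisture_ratio(M, M0, Me)     # experimental moisture ratio
mr_model = lewis_model(t, k)           # theoretical moisture ratio
```

Validation then amounts to comparing `mr_exp` against `mr_model` over the drying time, typically via correlation or residual statistics.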