Adaptive inter frame compression using image segmented technique

The computer vision branch of the artificial intelligence field is concerned with developing algorithms for analyzing video image content. Extracting edge information is the essential process in most pictorial pattern recognition problems. A new edge detection technique for detecting boundaries is introduced in this research.

Selection of typical lossy techniques for encoding edge video images is also discussed in this research. The concentration is devoted to the Block-Truncation coding technique and the Discrete Cosine Transform (DCT) coding technique. In order to reduce the volume of pictorial data that must be stored or transmitted, the research modifies a method for video image data compression based on a two-component code: the video image is partitioned into regions of slowly varying intensity, the contours separating the regions are coded by DCT, and the remaining image regions are coded by Block-Truncation coding. This hybrid coding technique is called segmented image coding (SIC). A modified four-step search scheme for motion estimation is also introduced, which contributes to decreasing the motion-estimation search time over successive inter frames.
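The Block-Truncation side of a hybrid scheme like SIC can be illustrated with a minimal sketch. This is classic one-bit, moment-preserving BTC over square blocks, written in plain numpy; it is not the paper's exact codec, and the function names are illustrative.

```python
import numpy as np

def btc_block(block):
    """Encode/decode one block with Block-Truncation Coding: keep the mean,
    standard deviation, and a 1-bit plane; reconstruct with two output
    levels that preserve the block's first two sample moments."""
    m = block.size
    mean, std = block.mean(), block.std()
    bitplane = block >= mean
    q = bitplane.sum()
    if q == 0 or q == m:                     # flat block: reproduce the mean
        return np.full_like(block, mean, dtype=float)
    lo = mean - std * np.sqrt(q / (m - q))   # level for "below mean" pixels
    hi = mean + std * np.sqrt((m - q) / q)   # level for "above mean" pixels
    return np.where(bitplane, hi, lo)

def btc_encode_decode(img, n=4):
    """Tile the image into n-by-n blocks and reconstruct each one."""
    out = np.empty(img.shape, dtype=float)
    for r in range(0, img.shape[0], n):
        for c in range(0, img.shape[1], n):
            out[r:r+n, c:c+n] = btc_block(img[r:r+n, c:c+n].astype(float))
    return out
```

Because the two levels are moment-preserving, each reconstructed block keeps the original block mean exactly, which is why BTC suits the slowly varying interior regions.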

Publication Date
Mon Oct 02 2023
Journal Name
Journal Of Engineering
Skull Stripping Based on the Segmentation Models

Skull image separation is one of the initial procedures used to detect brain abnormalities. In an MRI image of the brain, this process involves distinguishing the tissue that makes up the brain from the tissue that does not. Even for experienced radiologists, separating the brain from the skull is a difficult task, and the accuracy of the results can vary considerably from one individual to the next. Therefore, skull stripping in brain magnetic resonance volumes has become increasingly popular due to the requirement for a dependable, accurate, and thorough method for processing brain datasets. Furthermore, skull stripping must be performed accurately for neuroimaging diagnostic systems since neither no…

Publication Date
Thu Jul 01 2021
Journal Name
University Of Northampton Pue
Validating a Proposed Data Mining Approach (SLDM) for Motion Wearable Sensors to Detect the Early Signs of Lameness in Sheep

Publication Date
Wed Jan 01 2020
Journal Name
Periodicals Of Engineering And Natural Sciences
Bayesian and non-Bayesian estimation of the lomax model based on upper record values under weighted LINEX loss function

In this article, we develop a new loss function as a simplification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, we use a Monte Carlo simulation to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, the reliability function, and the hazard function.
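The Bayesian machinery being compared can be sketched generically: under LINEX loss L(Δ) = e^{aΔ} − aΔ − 1 (Δ the estimation error), the Bayes estimator is −(1/a)·ln E[e^{−aθ}] over the posterior. The snippet below is a Monte Carlo illustration over posterior draws, not the paper's record-value derivation for the Lomax scale parameter.

```python
import numpy as np

def linex_loss(est, theta, a):
    """LINEX loss: asymmetric, penalising over- and under-estimation
    differently depending on the sign of a."""
    d = est - theta
    return np.exp(a * d) - a * d - 1

def bayes_linex(posterior_draws, a):
    """Bayes estimator under LINEX loss, approximated from posterior
    draws: theta_hat = -(1/a) * log E[exp(-a * theta)]."""
    return -np.log(np.mean(np.exp(-a * posterior_draws))) / a
```

For a Gaussian posterior N(mu, s^2) this reduces to mu − a·s²/2, shifted away from the posterior mean (the SE-loss Bayes estimator) by the asymmetry parameter a.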

Publication Date
Tue Oct 22 2024
Journal Name
Iraqi Statisticians Journal
Inferential Methods for the Dagum Regression Model

The Dagum Regression Model, introduced to address limitations in traditional econometric models, provides enhanced flexibility for analyzing data characterized by heavy tails and asymmetry, which is common in income and wealth distributions. This paper develops and applies the Dagum model, demonstrating its advantages over other distributions such as the Log-Normal and Gamma distributions. The model's parameters are estimated using Maximum Likelihood Estimation (MLE) and the Method of Moments (MoM). A simulation study evaluates both methods' performance across various sample sizes, showing that MoM tends to offer more robust and precise estimates, particularly in small samples. These findings provide valuable insights into the ana…
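A small sketch of the distributional pieces involved, assuming the common parameterisation F(x) = (1 + (x/b)^(−a))^(−p): inverse-CDF sampling (used to drive simulation studies) and the log-likelihood that MLE maximises. The MoM estimator and the paper's regression structure are not reproduced here.

```python
import numpy as np

def dagum_cdf(x, a, b, p):
    """Dagum CDF: F(x) = (1 + (x/b)^(-a))^(-p), for x > 0."""
    return (1.0 + (x / b) ** (-a)) ** (-p)

def dagum_ppf(u, a, b, p):
    """Inverse CDF, used for sampling: x = b * (u^(-1/p) - 1)^(-1/a)."""
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

def dagum_loglike(x, a, b, p):
    """Log-likelihood of the Dagum density
    f(x) = (a*p/x) * (x/b)^(a*p) / (1 + (x/b)^a)^(p+1)."""
    z = x / b
    return np.sum(np.log(a * p / x) + a * p * np.log(z)
                  - (p + 1) * np.log1p(z ** a))
```

Feeding uniform draws through `dagum_ppf` gives Dagum samples whose likelihood peaks near the true parameters, the starting point for an MLE-versus-MoM comparison.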

Publication Date
Fri Sep 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of Reliability through the Wiener Degradation Process Based on the Genetic Algorithm to Estimating Parameters

In this paper, the researcher suggests using the genetic algorithm to estimate the parameters of the Wiener degradation process, which is based on the Wiener process, in order to estimate the reliability of high-efficiency products, given the difficulty of estimating their reliability using traditional techniques that depend only on product failure times. Monte Carlo simulation is applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it is compared with the maximum likelihood estimation method. The results show that the genetic algorithm method is the best according to the AMSE comparison criterion; then the reliab…
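The idea of fitting Wiener-process drift and diffusion with a genetic algorithm can be sketched as follows: degradation increments satisfy dX ~ N(mu·dt, sigma²·dt), and a GA searches for the (mu, sigma) pair minimising the negative log-likelihood. This toy GA (truncation selection, arithmetic crossover, Gaussian mutation) and all its tuning constants are illustrative, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(4)

def neg_loglike(params, dx, dt):
    """Negative log-likelihood of Wiener-process increments
    dX ~ N(mu*dt, sigma^2*dt)."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    var = sigma ** 2 * dt
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (dx - mu * dt) ** 2 / var)

def genetic_fit(dx, dt, pop=60, gens=80):
    """Toy genetic algorithm: keep the best half (truncation selection),
    recombine random elite pairs (arithmetic crossover), perturb
    (Gaussian mutation), and return the fittest individual."""
    P = np.column_stack([rng.uniform(-2, 4, pop), rng.uniform(0.05, 3, pop)])
    for _ in range(gens):
        fit = np.array([neg_loglike(p, dx, dt) for p in P])
        elite = P[np.argsort(fit)[:pop // 2]]
        pa = elite[rng.integers(len(elite), size=pop)]
        pb = elite[rng.integers(len(elite), size=pop)]
        w = rng.uniform(size=(pop, 1))
        P = w * pa + (1 - w) * pb              # crossover
        P += rng.normal(0, 0.05, P.shape)      # mutation
        P[:, 1] = np.abs(P[:, 1]) + 1e-6       # keep sigma > 0
    fit = np.array([neg_loglike(p, dx, dt) for p in P])
    return P[np.argmin(fit)]
```

With (mu, sigma) in hand, the reliability at time t follows from the first-passage-time distribution of the fitted Wiener process.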

Publication Date
Wed Jun 30 2021
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of Hurst exponent estimation methods

Over recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV), and Absolute Moments (AM); others depend on filtration techniques, such as Discrete Variations (DV), Variance versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison is carried out by a simulation study to find the most efficient method in terms of MASE. The simulation results show that the performance of the meth…
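Of the methods compared, the rescaled-range estimator is the easiest to sketch: the Hurst exponent is read off as the slope of log E[R/S] against log n over a range of window sizes n. A minimal implementation, assuming dyadic window sizes (details such as bias corrections vary across implementations):

```python
import numpy as np

def hurst_rs(x, min_win=8):
    """Estimate the Hurst exponent of an increment series by the
    rescaled-range (R/S) method: slope of log(mean R/S) vs log(n)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    sizes, rs_vals = [], []
    n = min_win
    while n <= N // 2:
        rs = []
        for start in range(0, N - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())   # cumulative deviations
            R = dev.max() - dev.min()           # range
            S = seg.std()                       # scale
            if S > 0:
                rs.append(R / S)
        sizes.append(n)
        rs_vals.append(np.mean(rs))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope
```

For white-noise increments the estimate should sit near H = 0.5 (the uncorrected R/S statistic is known to be biased slightly upward at small n), while long-memory series push it toward 1.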

Publication Date
Sun Jun 01 2014
Journal Name
Baghdad Science Journal
Survival estimation for singly type one censored sample based on generalized Rayleigh distribution

This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution model based on singly type-one censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates for all unknown parameters through an iterative procedure, the Newton–Raphson method; confidence interval estimates are then derived based on the Fisher information matrix. Finally, we test whether the current model (GRD) fits a set of real data, then compute the survival function and hazard function for these real data.
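The iterative MLE step can be illustrated in a simplified one-parameter setting: with the scale λ treated as known, Newton–Raphson on the score for the shape parameter α of F(x) = (1 − e^{−(λx)²})^α. This is a sketch of the Newton–Raphson idea only, not the paper's full censored-sample iteration.

```python
import numpy as np

def grd_alpha_newton(x, lam, alpha0=1.0, tol=1e-10, max_iter=100):
    """Newton-Raphson for the shape alpha of the generalized Rayleigh
    distribution with known scale lam.  The score is n/alpha + T with
    T = sum(log(1 - exp(-(lam*x)^2))), whose root is alpha = -n/T."""
    T = np.sum(np.log1p(-np.exp(-(lam * x) ** 2)))
    n = len(x)
    a = alpha0
    for _ in range(max_iter):
        score = n / a + T          # d(loglike)/d(alpha)
        hess = -n / a ** 2         # d2(loglike)/d(alpha)^2
        step = score / hess
        a -= step                  # Newton update
        if abs(step) < tol:
            break
    return a
```

Because this one-parameter score has a closed-form root, the iteration is easy to verify; in the full problem both parameters are updated jointly and no closed form exists, which is why the iterative method is needed.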

Publication Date
Tue Feb 01 2022
Journal Name
Baghdad Science Journal
New White Method of Parameters and Reliability Estimation for Transmuted Power Function Distribution

In this paper, the parameters and the reliability function of the transmuted power function (TPF) distribution are estimated using several estimation methods: a proposed new technique for the White method, the percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data following the (TPF) distribution in three experiments (E1, E2, E3) of real parameter values, with sample sizes (n = 10, 25, 50, and 100), (N = 1000) replicated samples, and reliability times (0 < t < 0). Comparisons were made between the results obtained from the estimators using the mean square error (MSE). The results showed the…
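A sketch of the (TPF) building blocks under the common transmuted parameterisation F(x) = (1 + λ)G(x) − λG(x)² with base G(x) = (x/θ)^β on (0, θ): the CDF, the reliability function R(t) = 1 − F(t), and inverse-CDF sampling of the kind used to drive such simulations. The proposed White-method estimator itself is not reproduced here.

```python
import numpy as np

def tpf_cdf(x, beta, theta, lam):
    """Transmuted power function CDF on (0, theta):
    F(x) = (1 + lam) * G(x) - lam * G(x)^2, G(x) = (x/theta)^beta."""
    g = (x / theta) ** beta
    return (1 + lam) * g - lam * g ** 2

def tpf_reliability(t, beta, theta, lam):
    """Reliability function R(t) = 1 - F(t)."""
    return 1.0 - tpf_cdf(t, beta, theta, lam)

def tpf_ppf(u, beta, theta, lam):
    """Inverse CDF for sampling: solve the quadratic
    lam*g^2 - (1+lam)*g + u = 0 for g in [0, 1], then invert G."""
    if lam == 0:
        g = u
    else:
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return theta * g ** (1.0 / beta)
```

Each estimation method in the comparison would be scored by the MSE of its parameter and R(t) estimates over the N replicated samples.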

Publication Date
Sat Apr 20 2024
Journal Name
Baghdad Science Journal
A Proposed Method for Image Resizing Using the Bezier Curve

In the field of image processing, changing the size of an image using geometric transformations without changing its resolution is known as image scaling or image resizing. Image resizing has wide applications in computers, mobile phones, and other electronic devices. This research proposes a method for resizing images using the equations of the Bezier curve, and shows how to obtain the best results. The Bezier curve has been used in previous work in various fields, but in this research it is used…
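A hedged sketch of Bezier-based scanline interpolation: de Casteljau evaluation plus one cubic Bezier segment per pixel gap. The inner-control-point choice below is a generic tangent-style smoothing heuristic, not the construction proposed in the paper.

```python
import numpy as np

def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by de Casteljau's
    algorithm: repeated linear interpolation of the control points."""
    pts = np.array(points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def resize_row(row, new_len):
    """Upscale a 1-D scanline: each gap between adjacent pixels is bridged
    by a cubic Bezier segment whose inner control points are derived from
    neighbouring samples (an illustrative smoothing choice)."""
    row = np.asarray(row, dtype=float)
    n = len(row)
    out = np.empty(new_len)
    for j in range(new_len):
        pos = j * (n - 1) / (new_len - 1)    # position in source coords
        i = min(int(pos), n - 2)
        t = pos - i
        p0, p3 = row[i], row[i + 1]
        p1 = p0 + (p3 - row[max(i - 1, 0)]) / 6      # tangent-style
        p2 = p3 - (row[min(i + 2, n - 1)] - p0) / 6  # inner controls
        out[j] = de_casteljau([p0, p1, p2, p3], t)
    return out
```

Applying `resize_row` first along rows and then along columns gives a simple 2-D upscaler; the segment endpoints guarantee original pixel values are interpolated exactly.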

Publication Date
Sun Sep 03 2017
Journal Name
Baghdad Science Journal
Scale-Invariant Feature Transform Algorithm with Fast Approximate Nearest Neighbor

A great many systems dealing with image processing are being used and developed on a daily basis. Those systems require the deployment of some basic operations, such as detecting regions of interest, matching those regions, and describing their properties. These operations play a significant role in the decision making required for the subsequent operations, depending on the assigned task. Various algorithms have been introduced over the years to accomplish these tasks. One of the most popular is the Scale Invariant Feature Transform (SIFT). The efficiency of this algorithm lies in its performance in detection and property description, and that is due to the fact that…
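The matching stage that typically follows SIFT description can be sketched with Lowe's ratio test; exact brute-force search stands in below for FLANN's fast approximate nearest-neighbour search, and the descriptors in the test are synthetic stand-ins for real 128-dimensional SIFT descriptors.

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.8):
    """Match each descriptor in image A to its nearest neighbour in
    image B, keeping the match only when the nearest distance is
    clearly smaller than the second-nearest (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, best))
    return matches
```

The ratio test discards ambiguous matches, which is what makes nearest-neighbour matching of local descriptors robust; FLANN accelerates only the search, leaving the acceptance criterion unchanged.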
