Suggesting Multiphase Regression Model Estimation with Some Threshold Point

Estimation of the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the changing behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of this paper is to suggest a new hybrid estimator obtained by an ad-hoc algorithm that relies on a data-driven strategy to overcome outliers. A secondary goal is to introduce a new application of an unweighted estimation method named "winsorization", which achieves robustness in regression estimation through a technique that reduces the effect of outliers. Another specific contribution of this paper is the suggestion of employing a kernel function as a new weight (to the best of the researchers' knowledge). Moreover, two weighted estimations are based on the robust weight functions "Cauchy" and "Talworth". Simulations were constructed with contamination levels (0%, 5%, and 10%) combined with sample sizes (n = 40, 100). A real-data application showed the superior performance of the suggested method compared with other methods using the RMSE and R2 criteria.
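The winsorization idea described above can be sketched in a few lines. This is a generic illustration of clamping extreme sample values at their cutoff order statistics, not the paper's hybrid estimator; the data and the trimming fraction `p` are hypothetical:

```python
def winsorize(xs, p=0.10):
    # Clamp the lowest and highest p-fraction of the sample to the
    # cutoff order statistics instead of discarding them.
    s = sorted(xs)
    k = int(p * len(s))
    lo, hi = s[k], s[-k - 1]
    return [min(max(x, lo), hi) for x in xs]

data = [9.8, 10.1, 10.0, 9.9, 10.2, 55.0]   # one gross outlier
clean = winsorize(data, p=0.2)
```

Clamping rather than deleting the outlier keeps the sample size fixed while pulling the mean back toward the bulk of the data.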

Publication Date
Tue Dec 31 2024
Journal Name
Bulletin Of The Chemical Society Of Ethiopia
Extraction and preconcentration of many metal ions by using the CPE technique

Cloud point extraction (CPE) is a simple, safe, and environmentally friendly technique for preparing many different kinds of samples. In this review, we discuss the CPE method and how to apply it to environmental sample data, along with the benefits, problems, and likely developments of CPE. The technique has received a great deal of attention for preconcentration and extraction. It has been used as a separation and subsequent preconcentration system prior to the analysis of organic compounds (nutrients, polybrominated biphenyl ethers, pesticides, polycyclic aromatic hydrocarbons, polychlorinated compounds, and aromatic amines) and inorganic compounds, including many metals (silver, lead, cadmium, mercury, and so on). We also find …

Publication Date
Mon Dec 01 2014
Journal Name
Journal Of Economics And Administrative Sciences
A comparative study of stylistic kriging and multivariate Co-kriging on the barley crop in Iraq

This paper deals with the prediction of spatially random data with two properties: the first is called the primary variable and the second the secondary variable. The technique used in the prediction process for this type of data is Co-kriging. The method is usually applied when the measurements of the primary variable to be predicted are few at any particular location (because of the cost or difficulty of obtaining them) compared with the secondary variable, for which many measurements are available and which is highly correlated with the primary variable, as was the …

Publication Date
Sat Oct 01 2022
Journal Name
Baghdad Science Journal
A Crime Data Analysis of Prediction Based on Classification Approaches

Crime is considered an unlawful activity of all kinds, and it is punished by law. Crime has an impact on a society's quality of life and economic development. With a large rise in crime globally, there is a necessity to analyze crime data to bring down the crime rate. This encourages the police and the public to take the required measures and restrict crime more effectively. The purpose of this research is to develop predictive models that can aid in crime pattern analysis and thus support the Boston department's crime prevention efforts. The geographical location factor has been adopted in our model because it is an influential factor in several situations, whether traveling to a specific area or living …
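As an illustration of how a geographical location factor can drive prediction, here is a minimal nearest-neighbour sketch over (latitude, longitude) pairs. This is not the paper's model; the coordinates and crime labels are hypothetical:

```python
import math

def nearest_label(train, point):
    # 1-NN over (lat, lon): predict the label of the closest past incident.
    return min(train, key=lambda rec: math.dist(rec[0], point))[1]

incidents = [((42.35, -71.06), "theft"),     # hypothetical Boston-area points
             ((42.36, -71.10), "assault")]
label = nearest_label(incidents, (42.351, -71.065))
```

In practice a model would combine location with time, incident type, and other attributes, but the distance computation above is the core of the geographic factor.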

Publication Date
Fri Jun 16 2023
Journal Name
Iraqi Journal of Pharmaceutical Sciences (P-ISSN 1683-3597, E-ISSN 2521-3512)
Optimization of Bis-anthraquinones Production from Endophytic Fungi Diaporthe sp. GNBP-10

Bis-anthraquinones with a unique molecular backbone, (+)-2,2′-epicytoskyrin A (epi) and (+)-1,1′-bislunatin (bis), were produced by the endophytic fungus Diaporthe sp. GNBP-10 associated with the Gambir plant (Uncaria gambier). Epi and bis possess robust antimicrobial activity toward various pathogens. This study focuses on finding the optimum conditions for epi and bis production from Diaporthe sp. GNBP-10. A series of culture media with various nutrient compositions was investigated for epi and bis production. The content of epi and bis was determined by measuring the area under the curve from a TLC-densitometric (scanner) experiment. Linear regression analysis was then applied to obtain the results. The optimi…

Publication Date
Sun Oct 03 2021
Journal Name
Al-manhaj
Fundamentals of Financial Statistics (Third Edition)

This book is intended as a textbook for an undergraduate course in financial statistics in the Department of Financial Sciences and Banking, and is designed for use in the semester system. To achieve its goals, the book is divided into the following chapters. Chapter one introduces basic concepts. Chapter two is devoted to frequency distributions and data representation. Chapter three discusses measures of central tendency (all types of means, the mode, and the median). Chapter four deals with measures of dispersion (standard deviation, variance, and coefficient of variation). Chapter five is concerned with correlation and regression analysis, while chapter six is concerned with testing hypotheses (one population mean test, two "independent" populati…
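The dispersion measures covered in chapter four can be computed directly with Python's standard statistics module; the return series here is hypothetical:

```python
import statistics

returns = [2.1, 1.8, 2.4, 1.9, 2.3]        # hypothetical monthly returns (%)

mean = statistics.mean(returns)             # central tendency
var = statistics.pvariance(returns)         # population variance
sd = statistics.pstdev(returns)             # population standard deviation
cv = sd / mean                              # coefficient of variation (unitless)
```

The coefficient of variation divides the standard deviation by the mean, which makes dispersion comparable across series measured on different scales.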

Publication Date
Thu Dec 31 2020
Journal Name
Journal Of Accounting And Financial Studies ( Jafs )
Application of data envelopment analysis (DEA) to evaluate performance efficiency: applied research in the General Tax Authority

The aim of the research is to use data envelopment analysis (DEA) to evaluate the performance efficiency of the eight branches of the General Tax Authority located in Baghdad, represented by Karrada, Karkh parties, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad, and Rusafa. The inputs are determined as the number of non-accountable taxpayers, according to the categories of professions and commercial business, deduction, transfer of property ownership, real estate, and tenders. In addition, the outputs are determined according to a checklist containing nine dimensions for assessing the efficiency with which the investigated branches invest their available resources …
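As a rough illustration of the DEA idea, the sketch below scores units by their output-per-input ratio relative to the best ratio observed. This is a single-input, single-output simplification (the full CCR model solves a linear program per unit over multiple inputs and outputs); the branch names and figures are hypothetical:

```python
def dea_efficiency(units):
    # Each unit's output/input ratio is scaled by the best observed ratio,
    # so the frontier unit scores 1.0 and the rest score below it.
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

branches = {"Karrada": (10, 20), "Dora": (8, 8)}   # hypothetical (input, output)
scores = dea_efficiency(branches)
```

A branch scoring below 1.0 could, in principle, produce the same output with proportionally fewer inputs relative to the frontier branch.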

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Natural Language Processing for Requirement Elicitation in University Using KMeans and MeanShift Algorithm

Data-Driven Requirement Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to keep the trust of their users, such an approach is needed within a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering. The problem addressed in this study was data discrepancies that hampered the elicitation process, so that in the end the developed software contained discrepancies and could not meet the need …
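A minimal k-means sketch, assuming requirements have already been embedded as numeric vectors (the embeddings below are hypothetical); in practice scikit-learn's KMeans and MeanShift would be the natural choice:

```python
import math

def kmeans(points, k, iters=20):
    # Plain Lloyd's algorithm; the first k points seed the centroids.
    cents = list(points[:k])
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, cents[j]))
            groups[nearest].append(p)
        # Recompute each centroid as the mean of its group (keep it if empty).
        cents = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else cents[j]
                 for j, g in enumerate(groups)]
    return cents

reqs = [(0, 0), (10, 0), (0, 1), (10, 1)]   # hypothetical 2-D requirement embeddings
centroids = kmeans(reqs, k=2)
```

Unlike k-means, MeanShift does not need k up front; it finds cluster centers by iteratively shifting points toward local density maxima.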

Publication Date
Mon Jan 01 2018
Journal Name
Communications In Computer And Information Science
Automatically Recognizing Emotions in Text Using Prediction by Partial Matching (PPM) Text Compression Method

In this paper, we investigate the automatic recognition of emotion in text. We perform experiments with a new classification method based on the PPM character-based text compression scheme. These experiments involve both coarse-grained classification (whether a text is emotional or not) and fine-grained classification, such as recognising Ekman's six basic emotions (Anger, Disgust, Fear, Happiness, Sadness, Surprise). Experimental results with three datasets show that the new method significantly outperforms traditional word-based text classification methods. The results show that the PPM compression-based classification method is able to distinguish between emotional and non-emotional text with high accuracy, and between texts invo…
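The compression-based classification idea can be sketched with zlib standing in for PPM: a text is assigned to the class whose corpus compresses it most cheaply when appended. The corpora below are hypothetical toy data, not the paper's datasets:

```python
import zlib

def classify(text, corpora):
    # Append `text` to each class corpus and measure how much the compressed
    # size grows; the class whose corpus "explains" the text best grows least.
    # This approximates the PPM cross-entropy criterion; zlib is used here
    # purely for illustration.
    def growth(corpus):
        base = len(zlib.compress(corpus.encode()))
        return len(zlib.compress((corpus + " " + text).encode())) - base
    return min(corpora, key=lambda label: growth(corpora[label]))

corpora = {   # tiny hypothetical class corpora
    "happy": "joy joy happy glad smile joy happy",
    "angry": "rage anger furious mad rage anger",
}
label = classify("joy happy smile", corpora)
```

Because the method works at the character level, it needs no tokenisation or feature engineering, which is one reason it can beat word-based classifiers on noisy text.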

Publication Date
Tue Dec 01 2015
Journal Name
Journal Of Engineering
Data Aggregation in Wireless Sensor Networks Using Modified Voronoi Fuzzy Clustering Algorithm

A data-centric technique, data aggregation via a modified fuzzy clustering algorithm combined with a Voronoi diagram, called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram; these cells are then clustered by the fuzzy C-means method (FCM) to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used for this election: the energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which leads …
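The cluster-head election step can be sketched as a weighted score over the three parameters named above. The weights and node data are hypothetical and this is not the VFCA's actual scoring rule:

```python
import math

def elect_head(nodes, w_energy=0.5, w_dist=0.3, w_loss=0.2):
    # Hypothetical scoring: higher residual energy, lower mean distance to
    # the other nodes, and lower packet loss make a better CH candidate.
    def score(n):
        others = [m for m in nodes if m is not n]
        mean_d = sum(math.dist(n["pos"], m["pos"]) for m in others) / len(others)
        return w_energy * n["energy"] - w_dist * mean_d - w_loss * n["loss"]
    return max(nodes, key=score)

nodes = [  # hypothetical sensors: position, residual energy, packet-loss rate
    {"id": "a", "pos": (0.0, 0.0), "energy": 0.9, "loss": 0.1},
    {"id": "b", "pos": (1.0, 0.0), "energy": 0.5, "loss": 0.3},
    {"id": "c", "pos": (0.5, 0.1), "energy": 0.9, "loss": 0.1},
]
head = elect_head(nodes)
```

The central, high-energy node wins here: centrality minimises the total transmission distance once the CH starts aggregating its cluster's data.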

Publication Date
Tue Feb 27 2024
Journal Name
Tem Journal
Supervised Classification Accuracy Assessment Using Remote Sensing and Geographic Information System

Assessing the accuracy of classification algorithms is paramount, as it provides insight into their reliability and effectiveness in solving real-world problems. Accuracy assessment is essential in any remote-sensing-based classification exercise, given that classification maps consistently contain misclassified pixels and classification errors. In this study, two satellite images of Duhok province, Iraq, captured at regular intervals, were analyzed using spatial analysis tools to produce supervised classifications. Several processes, such as smoothing, were applied to enhance the classification. The classification results indicate that Duhok province is divided into four classes: vegetation cover, buildings, …
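Accuracy assessment of a classification map reduces, at its simplest, to comparing predicted pixel classes against reference (ground-truth) classes; the labels below are hypothetical:

```python
def overall_accuracy(reference, predicted):
    # Share of pixels whose predicted class matches the ground-truth class.
    agree = sum(r == p for r, p in zip(reference, predicted))
    return agree / len(reference)

ref  = ["vegetation", "vegetation", "buildings", "water"]
pred = ["vegetation", "water",      "buildings", "water"]
acc = overall_accuracy(ref, pred)
```

In practice the comparison is tabulated as a confusion matrix, from which per-class producer's and user's accuracies and the kappa coefficient are derived alongside this overall figure.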
