Estimating General Linear Regression Model of Big Data by Using Multiple Test Technique

Publication Date: Mon Jun 05 2023
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Poisson Regression and Conway Maxwell Poisson Models Using Simulation

Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used, the Poisson regression model and the Conway-Maxwell-Poisson model. The study aimed to compare the two models and choose the better one using the simulation method at different sample sizes (n = 25, 50, 100) with r = 1000 replications. The Matlab program was adopted to conduct the simulation experiment, and the results showed the superiority of the Poisson model according to the mean square error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
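
As a rough illustration of the simulation design described above (sample sizes n = 25, 50, 100, r = 1000 replications, models compared by MSE and AIC), the sketch below fits only the Poisson side with statsmodels in Python rather than Matlab; the true coefficients, covariate, and seed are illustrative assumptions, and a Conway-Maxwell-Poisson fit would have to be supplied by a dedicated implementation at the marked line.

```python
# Illustrative sketch (not the authors' code): Monte Carlo comparison of
# count-regression fits by MSE and AIC using statsmodels' Poisson GLM.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
sample_sizes = (25, 50, 100)   # n values used in the study
replications = 1000            # r = 1000 replications
beta = np.array([0.5, 0.8])    # illustrative true coefficients (assumption)

for n in sample_sizes:
    mse_vals, aic_vals = [], []
    for _ in range(replications):
        x = rng.uniform(0, 1, size=n)
        X = sm.add_constant(x)                  # design matrix [1, x]
        mu = np.exp(X @ beta)                   # Poisson mean via log link
        y = rng.poisson(mu)                     # simulated counts
        res = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        # --- a Conway-Maxwell-Poisson model would be fitted here as well ---
        mse_vals.append(np.mean((y - res.fittedvalues) ** 2))
        aic_vals.append(res.aic)
    print(f"n={n:4d}  mean MSE={np.mean(mse_vals):.4f}  mean AIC={np.mean(aic_vals):.2f}")
```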

Publication Date: Wed Jan 01 2020
Journal Name: Periodicals Of Engineering And Natural Sciences
Solving multicollinearity problem of gross domestic product using ridge regression method

This study is dedicated to solving the multicollinearity problem of the general linear model by using the ridge regression method. The basic formulation of this method, together with suggested forms of the ridge parameter, is applied to Gross Domestic Product data for Iraq, which follow a normal distribution. The best linear regression model is obtained after solving the multicollinearity problem using ten suggested values of the ridge parameter k.
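
A minimal sketch of the ridge estimator referred to above, assuming standardized predictors and illustrative data rather than the Iraqi GDP series; the closed form beta_hat(k) = (X'X + kI)^(-1) X'y is evaluated for a few candidate k values.

```python
# Minimal ridge-regression sketch for a collinear linear model; the data below
# are illustrative (not the GDP series) and the k values are arbitrary candidates.
import numpy as np

def ridge_coefficients(X, y, k):
    """Ridge estimator: beta_hat(k) = (X'X + k I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3 * x1 + rng.normal(size=100)

for k in (0.0, 0.1, 1.0):                    # candidate ridge parameters
    print(k, ridge_coefficients(X, y, k))
```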

Publication Date: Sat Dec 02 2017
Journal Name: Al-khwarizmi Engineering Journal
Speech Signal Compression Using Wavelet And Linear Predictive Coding

A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample and are very small compared to the original signals. The compression ratio is calculated from the size of the…
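
The sketch below, which is not the authors' code, walks through the stated pipeline: db4 wavelet decomposition keeping only the approximation coefficients, a rectangular window, and the Levinson-Durbin recursion for the LP and reflection coefficients. It requires PyWavelets, and the synthetic signal, decomposition level, and LP order are assumptions.

```python
# Sketch of the described compression pipeline: wavelet approximation + LPC.
import numpy as np
import pywt

def levinson_durbin(r, order):
    """Levinson-Durbin recursion on autocorrelation values r[0..order]."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    e = r[0]
    refl = np.zeros(order)
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / e
        refl[i - 1] = k
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        e *= (1.0 - k * k)
    return a, refl, e          # LP coefficients, reflection coefficients, error

# Synthetic stand-in for a speech frame (assumption).
fs = 8000
t = np.arange(0, 0.2, 1 / fs)
signal = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 800 * t)

level, lp_order = 3, 10
approx = pywt.wavedec(signal, 'db4', level=level)[0]   # keep approximation, drop details
frame = approx                                         # rectangular window: samples unchanged
r = np.correlate(frame, frame, mode='full')[len(frame) - 1:len(frame) + lp_order]
lp, refl, err = levinson_durbin(r, lp_order)
# Rough ratio: original samples versus stored LP coefficients (+1 for the previous sample).
print("compression ratio ~", len(signal) / (lp_order + 1))
```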

Publication Date: Thu Aug 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Aggregate production planning using linear programming with practical application

Abstract:

The study aims at building a mathematical model for aggregate production planning for the Baghdad soft drinks company. The study is based on a set of aggregate planning strategies (control of working hours and a storage-level control strategy) for the purpose of exploiting the available resources and productive capacities in an optimal manner and minimizing production costs, using the Matlab program. The most important finding of the research is the importance of exploiting the available production capacity in the months when demand is less than the available capacity, so that it can cover the subsequent months when demand exceeds the available capacity, and to minimize the use of overtime…
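
For a concrete formulation, the following is a hedged sketch of an aggregate production planning LP with regular production, overtime, and inventory variables, solved with SciPy instead of Matlab; the demands, capacities, and costs are illustrative assumptions, not the company's data.

```python
# Minimal aggregate-production-planning LP: regular/overtime production plus inventory.
import numpy as np
from scipy.optimize import linprog

demand  = np.array([100, 150, 220, 180])   # units per month (assumed)
cap_reg = 160                               # regular capacity per month (assumed)
cap_ot  = 40                                # overtime capacity per month (assumed)
c_reg, c_ot, c_hold = 5.0, 8.0, 1.0         # unit costs (assumed)

T = len(demand)
# Decision vector x = [P_1..P_T, O_1..O_T, I_1..I_T]
c = np.concatenate([np.full(T, c_reg), np.full(T, c_ot), np.full(T, c_hold)])

# Inventory balance: I_{t-1} + P_t + O_t - I_t = d_t  (with I_0 = 0)
A_eq = np.zeros((T, 3 * T))
for t in range(T):
    A_eq[t, t] = 1.0              # P_t
    A_eq[t, T + t] = 1.0          # O_t
    A_eq[t, 2 * T + t] = -1.0     # -I_t
    if t > 0:
        A_eq[t, 2 * T + t - 1] = 1.0  # +I_{t-1}
b_eq = demand.astype(float)

bounds = [(0, cap_reg)] * T + [(0, cap_ot)] * T + [(0, None)] * T
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("total cost:", res.fun)
print("regular:", res.x[:T], "overtime:", res.x[T:2*T], "inventory:", res.x[2*T:])
```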

Publication Date: Fri Jun 23 2023
Journal Name: Al-mustansiriyah Journal Of Science
Image Encryption Using New Non-Linear Stream Cipher Cryptosystem

In this paper, we designed a new efficient stream cipher cryptosystem that depends on a chaotic map to encrypt (decrypt) different types of digital images. The designed encryption system passed all the basic efficiency criteria (such as randomness, MSE, PSNR, histogram analysis, and key space) that were applied to the key extracted from the random generator as well as to the digital images after completing the encryption process.
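
A hedged sketch of the general idea (not the paper's specific cryptosystem): a keystream derived from a chaotic map, here the logistic map as a stand-in, is XORed with the image bytes, so the same key parameters recover the original; the efficiency tests listed above are not reproduced.

```python
# Illustrative chaotic-keystream XOR cipher; the logistic map, key values, and
# random "image" are assumptions, not the paper's design.
import numpy as np

def logistic_keystream(length, x0=0.731, r=3.99):
    """Generate `length` keystream bytes from the logistic map x <- r*x*(1-x)."""
    x = x0
    out = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(image, x0=0.731, r=3.99):
    """Encrypt/decrypt a uint8 image by XOR with the chaotic keystream (involution)."""
    flat = image.ravel()
    ks = logistic_keystream(flat.size, x0, r)
    return (flat ^ ks).reshape(image.shape)

img = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
enc = xor_cipher(img)               # encrypt
dec = xor_cipher(enc)               # decrypt with the same key (x0, r)
assert np.array_equal(img, dec)
```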

Publication Date: Sun Apr 01 2018
Journal Name: Journal Of Economics And Administrative Sciences
Solving a three dimensional transportation problem using linear programming

The transportation problem is one of the most important mathematical methods that help in making the right decision for transferring goods from supply sources to demand centers at the lowest possible cost. In this research, the mathematical model of the three-dimensional transportation problem, in which the transported goods are not homogeneous, was constructed. The simplex method was used to solve the problem of transporting three food products (rice, oil, paste) from warehouses to the student areas in Baghdad. This model proved its efficiency in reducing the total transport costs of the three products. After the model was solved in the WinQSB program, the results showed that the total cost of transportation is (269,…
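
The following sketch illustrates the transportation formulation for the three products, solved per product with SciPy's LP solver rather than the simplex routine in WinQSB; the warehouses, demand areas, costs, supplies, and demands are illustrative assumptions.

```python
# Per-product transportation LP: ship from warehouses to demand areas at minimum cost.
import numpy as np
from scipy.optimize import linprog

products = ["rice", "oil", "paste"]
m, n = 2, 3                                   # 2 warehouses, 3 demand areas (assumed)
rng = np.random.default_rng(7)

total = 0.0
for p in products:
    cost = rng.integers(1, 10, size=(m, n)).astype(float)   # unit transport costs
    supply = np.array([120.0, 80.0])                         # warehouse availability
    demand = np.array([70.0, 60.0, 70.0])                    # area requirements
    # Variables x[i, j] flattened row-wise; minimise sum(cost * x).
    A_ub = np.zeros((m, m * n))        # supply: sum_j x[i,j] <= supply[i]
    for i in range(m):
        A_ub[i, i * n:(i + 1) * n] = 1.0
    A_eq = np.zeros((n, m * n))        # demand: sum_i x[i,j] = demand[j]
    for j in range(n):
        A_eq[j, j::n] = 1.0
    res = linprog(cost.ravel(), A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
                  bounds=[(0, None)] * (m * n), method="highs")
    total += res.fun
    print(p, "cost:", res.fun)
print("total transport cost:", total)
```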

Publication Date: Tue Apr 01 2025
Journal Name: Journal Of Engineering
Comparative Analysis of The Combined Model (Spatial and Temporal) and Regression Models for Predicting Murder Crime

This research dealt with the analysis of murder crime data in Iraq in its temporal and spatial dimensions. It then focused on building a new model, with an algorithm that combines the characteristics of time series and spatial series, so that it can predict more accurately than other models. We called it the Combined Regression (CR) model; it merges the time-series regression model with the spatial regression model into a single model that can analyze data in its temporal and spatial dimensions. Several models were used for comparison with the combined model, namely Multiple Linear Regression (MLR), Decision Tree Regression (DTR), and Random Forest Regression…
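
The combined CR model itself is not reproduced here; the sketch below only mirrors the baseline comparison mentioned above, fitting MLR, DTR, and Random Forest regressors with scikit-learn on an illustrative synthetic dataset and comparing them by test MSE.

```python
# Baseline model comparison on synthetic data (assumption: the real crime panel
# with its temporal and spatial features is not available here).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
# Illustrative features: e.g. a time index and two spatial coordinates.
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(scale=0.3, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
models = {
    "MLR": LinearRegression(),
    "DTR": DecisionTreeRegressor(random_state=0),
    "RF":  RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "test MSE:", mean_squared_error(y_te, model.predict(X_te)))
```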

Publication Date: Thu Dec 31 2020
Journal Name: Journal Of Accounting And Financial Studies (JAFS)
Application of data content analysis (DEA) technology to evaluate performance efficiency: applied research in the General Tax Authority

The aim of the research is to use the data envelopment analysis (DEA) technique to evaluate the performance efficiency of the eight Baghdad branches of the General Tax Authority: Karrada, Karkh Parties, Karkh Center, Dora, Bayaa, Kadhimiya, New Baghdad, and Rusafa. The inputs are represented by the number of non-accountable taxpayers according to the categories of professions and commercial business, deduction, transfer of property ownership, real estate, and tenders. The outputs are determined according to a checklist containing nine dimensions that assess how efficiently the investigated branches invest their available resources…
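
As a hedged illustration of the technique, the sketch below solves the standard input-oriented CCR envelopment LP for each decision-making unit with SciPy; the branches' real inputs and the nine-dimension output checklist are not available here, so the data are illustrative.

```python
# Input-oriented CCR DEA: one LP per decision-making unit (branch).
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (branches), columns = inputs / outputs (illustrative numbers)
inputs  = np.array([[20.0], [35.0], [30.0], [25.0]])
outputs = np.array([[60.0, 5.0], [80.0, 6.0], [95.0, 9.0], [70.0, 8.0]])

n_dmu = inputs.shape[0]
for o in range(n_dmu):
    # variables: [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.concatenate([[1.0], np.zeros(n_dmu)])
    A_ub, b_ub = [], []
    for i in range(inputs.shape[1]):            # inputs: sum_j lam_j x_ij <= theta x_io
        A_ub.append(np.concatenate([[-inputs[o, i]], inputs[:, i]]))
        b_ub.append(0.0)
    for r in range(outputs.shape[1]):           # outputs: sum_j lam_j y_rj >= y_ro
        A_ub.append(np.concatenate([[0.0], -outputs[:, r]]))
        b_ub.append(-outputs[o, r])
    bounds = [(0, None)] * (n_dmu + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=bounds,
                  method="highs")
    print(f"branch {o + 1}: efficiency = {res.x[0]:.3f}")
```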

Publication Date: Fri Jul 14 2023
Journal Name: International Journal Of Information Technology & Decision Making
A Decision Modeling Approach for Data Acquisition Systems of the Vehicle Industry Based on Interval-Valued Linear Diophantine Fuzzy Set

Modeling data acquisition systems (DASs) can support the vehicle industry in the development and design of sophisticated driver assistance systems. Modeling DASs on the basis of multiple criteria is considered a multicriteria decision-making (MCDM) problem. Although literature reviews have provided models for DASs, the issue of imprecise, unclear, and ambiguous information remains unresolved. Compared with existing MCDM methods, the robustness of the fuzzy decision by opinion score method II (FDOSM II) and fuzzy weighted with zero inconsistency II (FWZIC II) has been demonstrated for modeling DASs. However, these methods are implemented in an intuitionistic fuzzy set environment, which restricts the ability of experts to provide membership…
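
The interval-valued linear Diophantine fuzzy operations of FDOSM II and FWZIC II are not reproduced here; the sketch below only shows the generic MCDM skeleton they extend: a decision matrix of DAS alternatives scored against weighted criteria, with every number and criterion name an illustrative assumption.

```python
# Generic MCDM skeleton (simple additive weighting), not the paper's fuzzy method.
import numpy as np

alternatives = ["DAS-A", "DAS-B", "DAS-C"]
criteria     = ["accuracy", "latency", "cost"]          # assumed criteria
benefit      = np.array([True, False, False])           # latency and cost are cost-type

# Decision matrix: rows = alternatives, columns = criteria (illustrative scores).
M = np.array([[0.90, 120.0, 55.0],
              [0.85,  80.0, 40.0],
              [0.92, 150.0, 70.0]])
weights = np.array([0.5, 0.3, 0.2])   # in the paper, weights come from FWZIC II; assumed here

# Linear normalisation: benefit criteria scaled by column max, cost criteria by min/value.
norm = np.where(benefit, M / M.max(axis=0), M.min(axis=0) / M)
scores = norm @ weights               # weighted aggregate score per alternative
for alt, s in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
    print(f"{alt}: {s:.3f}")
```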

Publication Date: Fri Oct 30 2020
Journal Name: Journal Of Economics And Administrative Sciences
Comparison between method penalized quasi-likelihood and Marginal quasi-likelihood in estimating parameters of the multilevel binary model

Multilevel models are among the most important models widely used in the application and analysis of data characterized by observations that take a hierarchical form. In our research we examined the multilevel logistic regression model (random-intercept and random-slope model). The importance of the research lies in the fact that the usual regression models calculate only the total variance of the model and cannot read the variance and variations between levels; for multilevel data, calculating only the total variance is inaccurate, and therefore these models calculate the variation for each level of the model. The research aims to estimate the parameters of this model…
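
To make the model structure concrete, the sketch below simulates data from a random-intercept, random-slope multilevel logistic model; it does not implement the penalized or marginal quasi-likelihood estimators being compared, and all parameter values are illustrative assumptions.

```python
# Data-generating process for a two-level binary model with random intercept and slope.
import numpy as np

rng = np.random.default_rng(11)
n_groups, n_per_group = 30, 20
b0, b1 = -0.5, 1.2                   # fixed intercept and slope (assumed)
sigma_u, sigma_v = 0.8, 0.4          # SDs of random intercept / random slope (assumed)

u = rng.normal(0.0, sigma_u, n_groups)        # random intercepts u_j
v = rng.normal(0.0, sigma_v, n_groups)        # random slopes v_j

rows = []
for j in range(n_groups):
    x = rng.normal(size=n_per_group)                     # level-1 covariate
    eta = (b0 + u[j]) + (b1 + v[j]) * x                  # linear predictor
    p = 1.0 / (1.0 + np.exp(-eta))                       # logistic link
    y = rng.binomial(1, p)                               # binary responses
    rows.append(np.column_stack([np.full(n_per_group, j), x, y]))

data = np.vstack(rows)                                   # columns: group, x, y
print("overall event rate:", data[:, 2].mean())
```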
