The Cluster Analysis by Using Nonparametric Cubic B-Spline Modeling for Longitudinal Data

Longitudinal data is becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.

This research focuses on grouping and analyzing such data. Cluster analysis plays an important role in identifying and grouping profiles that evolve similarly over time, and here it is applied through a nonparametric smoothing cubic B-spline model. The cubic B-spline provides continuous first and second derivatives, giving a smoother curve with fewer abrupt changes in slope, and it is flexible enough to capture complex patterns and fluctuations in the data.

The balanced longitudinal data profiles were grouped into subgroups by penalizing the pairwise distances between the coefficients of the cubic B-spline model using one of the common penalty functions, the Minimax Concave Penalty (MCP). The number of clusters is determined through a model selection criterion, the Bayesian information criterion (BIC), and optimization methods are used to solve the resulting equations. We therefore applied the Alternating Direction Method of Multipliers (ADMM) algorithm to obtain approximate solutions for the estimators of the nonparametric model, using R statistical software.
Balanced longitudinal data were generated in the simulation study, with 60 subjects and 10 repeated measurements (time points) per subject. The simulation was iterated 100 times and showed that applying the MCP penalty to the cubic B-spline model can group the profiles into clusters, which is the aim of this paper.
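As a rough illustration of the general idea, the sketch below (in R, the software named above) fits a cubic B-spline to each simulated subject's profile and groups the resulting coefficient vectors, choosing the number of clusters with a BIC-type score. For brevity it replaces the paper's MCP pairwise-fusion penalty and ADMM solver with a plain k-means step on the coefficients; the basis dimension, cluster means and noise level are assumptions, not the study's settings.

```r
# Minimal sketch (not the paper's MCP/ADMM estimator): cluster subjects by
# fitting a cubic B-spline to each profile and grouping the coefficient vectors.
library(splines)

set.seed(1)
n_subj <- 60; n_time <- 10                      # balanced design as in the simulation
time   <- seq(0, 1, length.out = n_time)
B      <- bs(time, df = 6, degree = 3)          # cubic B-spline basis (assumed df = 6)

# generate two latent groups of profiles
group <- rep(1:2, each = n_subj / 2)
y <- sapply(seq_len(n_subj), function(i) {
  mu <- if (group[i] == 1) sin(2 * pi * time) else cos(2 * pi * time)
  mu + rnorm(n_time, sd = 0.2)
})                                              # n_time x n_subj matrix

# per-subject B-spline coefficients
coefs <- t(apply(y, 2, function(yi) coef(lm(yi ~ B - 1))))

# choose the number of clusters with a BIC-type score over k-means fits
bic_k <- sapply(1:5, function(k) {
  km <- kmeans(coefs, centers = k, nstart = 20)
  n  <- nrow(coefs); p <- ncol(coefs)
  n * log(km$tot.withinss / n) + k * p * log(n)
})
k_hat    <- which.min(bic_k)
clusters <- kmeans(coefs, centers = k_hat, nstart = 20)$cluster
table(clusters, group)                          # recovered clusters vs. true groups
```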

 

Paper type: Research paper.

Publication Date
Sun Mar 30 2014
Journal Name
Iraqi Journal Of Chemical And Petroleum Engineering
Estimation Liquid Permeability Using Air Permeability Laboratory Data

Permeability data is of major importance in all reservoir simulation studies. Its importance increases in mature oil and gas fields because of its sensitivity to the requirements of certain improved recovery methods. However, the industry has a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert air permeability data, which are conventionally measured during laboratory core analysis, into liquid permeability. This correlation offers a feasible estimate in cases of loose and poorly consolidated formations, or in cas…
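Since the correlation itself is not reproduced in this preview, the R sketch below only illustrates how such a conversion could be built from paired core measurements: a log-log (power-law) regression of liquid on air permeability fitted on special core analysis pairs and then applied to routine air permeability values. All numbers and coefficients are simulated placeholders, not the study's correlation.

```r
# Hypothetical sketch: fit a power-law (log-log linear) correlation between paired
# air and liquid permeability core measurements, then predict liquid permeability
# where only routine air permeability is available. Illustrative values only.
set.seed(2)
k_air    <- 10^runif(40, 0, 3)                         # mD, simulated routine measurements
k_liquid <- 0.6 * k_air^1.05 * 10^rnorm(40, sd = 0.05) # simulated paired SCAL values

fit <- lm(log10(k_liquid) ~ log10(k_air))
summary(fit)$coefficients

# predict liquid permeability for new air permeability data
new_air <- data.frame(k_air = c(5, 50, 500))
10^predict(fit, newdata = new_air)
```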

Publication Date
Sat Dec 01 2012
Journal Name
Journal Of Economics And Administrative Sciences
Using panel data in structural equations with application

A non-stationary series is always a problem for statistical analysis; as theoretical work has explained, the statistical properties of regression analysis are lost when non-stationary series are used, giving a spurious (imaginary) slope for the relationship under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by repeated differencing d times, in which case the series is said to be integrated of order d. The theoretical side of the research consists of several parts; the first part covers the research methodology…
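The differencing step described above can be shown with a short R example: a simulated random walk (an I(1) series) fails a unit-root test, while its first difference passes, so the series is integrated of order d = 1. The simulated series and the choice of tseries::adf.test are illustrative assumptions, not the paper's data.

```r
# Minimal illustration of making a non-stationary series stationary by differencing.
library(tseries)   # for adf.test()

set.seed(3)
y <- cumsum(0.2 + rnorm(200))     # random walk with drift: integrated of order 1

adf.test(y)                       # typically fails to reject the unit root
dy <- diff(y)                     # first difference removes the stochastic trend
adf.test(dy)                      # typically rejects: the original series is I(1)
```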

Publication Date
Thu Nov 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Analysis of the relationship between the prices of wheat and rice importer in Iraq and crude oil prices and the exchange rate using the ARDL model

Since the beginning of the 21st century, the prices of agricultural crops have increased. This increase has been accompanied by rising crude oil prices and fluctuations in the dollar exchange rate, the dominant currency in global trade. The paper aims to analyze the short-run and long-run cointegration relationships between the prices of some agricultural crops imported by Iraq, namely wheat and rice, and both crude oil prices and the Iraqi dinar exchange rate against the US dollar, using the ARDL model. The results show a long-run equilibrium among the three variables through the error correction mechanism. The results also show significant and economically sound effects of cru…
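As a hedged illustration of the ARDL idea, the base-R sketch below estimates an ARDL(1,1) regression on simulated cointegrated series standing in for the wheat, oil and exchange-rate variables, then recovers the implied long-run multipliers. The data, lag orders and coefficient values are assumptions, not the paper's estimates.

```r
# Minimal ARDL(1,1) sketch in base R on simulated series (placeholders for the
# paper's wheat, oil and exchange-rate data).
set.seed(4)
n     <- 120
oil   <- cumsum(rnorm(n))                  # simulated I(1) crude-oil price (log)
fx    <- cumsum(rnorm(n))                  # simulated I(1) exchange rate (log)
wheat <- 0.5 * oil + 0.3 * fx + arima.sim(list(ar = 0.5), n)  # cointegrated response

lag1 <- function(x) c(NA, x[-length(x)])   # simple one-period lag helper
ardl <- lm(wheat ~ lag1(wheat) + oil + lag1(oil) + fx + lag1(fx))
summary(ardl)

# long-run multipliers implied by the ARDL(1,1) coefficients
b <- coef(ardl)
lr_oil <- (b["oil"] + b["lag1(oil)"]) / (1 - b["lag1(wheat)"])
lr_fx  <- (b["fx"]  + b["lag1(fx)"])  / (1 - b["lag1(wheat)"])
c(lr_oil = unname(lr_oil), lr_fx = unname(lr_fx))
```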

Publication Date
Sun Oct 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
The use of factor analysis to identify the leading factors to high blood pressure: A field study in Baghdad hospitals

Abstract:

High blood pressure is one of the serious human diseases that a person can develop without feeling it, and it is caused by many factors. It therefore became necessary to research this subject and to reduce these many factors to specific causes by studying them using factor analysis.

The researcher arrived at five factors that explain only 71% of the total variation in the phenomenon under study, where overweight, heavy alcohol consumption, smoking, and lack of exercise are the causes with the greatest influence on the incidence of this disease.
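A minimal R sketch of the method, using factanal() on simulated data: the variable names (weight, alcohol, smoking, exercise, ...) and the two-factor structure are illustrative assumptions, not the study's survey data or its five extracted factors.

```r
# Sketch of exploratory factor analysis with base R's factanal() on simulated data;
# variable names are illustrative stand-ins for the survey items.
set.seed(5)
n      <- 200
latent <- matrix(rnorm(n * 2), ncol = 2)                 # two hypothetical latent factors
load   <- rbind(matrix(c(runif(3, 0.7, 0.9), rep(0.1, 3)), ncol = 2),
                matrix(c(rep(0.1, 3), runif(3, 0.7, 0.9)), ncol = 2))
x      <- latent %*% t(load) + matrix(rnorm(n * 6, sd = 0.5), ncol = 6)
colnames(x) <- c("weight", "alcohol", "smoking", "exercise", "salt", "stress")

fa <- factanal(x, factors = 2, rotation = "varimax")
fa$loadings                                              # which variables load on which factor
sum(fa$loadings[, 1]^2 + fa$loadings[, 2]^2) / 6         # proportion of variance explained
```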

Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
Spatial Distribution of Heavy Element in Erbil's Municipal Landfills by Using GIS

Untreated municipal solid waste (MSW) release onto land is prevalent in developing countries. To reduce the high levels of harmful components in polluted soils, a proper evaluation of heavy metal concentrations in Erbil's Kani Qrzhala dump between August 2021 and February 2022 is required. The purpose of this research was to examine the impact of improper solid waste disposal on soil properties within a landfill by assessing the contamination risks of eight heavy elements in two separate soil layers, using the geoaccumulation index (I-geo) and the pollution load index (PLI). ArcGIS software was employed to map the spatial distribution of heavy element pollution and the potential ecological risks. The I-geo values in summe…
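Both indices follow simple standard formulas: I-geo = log2(Cn / (1.5 Bn)) and PLI is the geometric mean of the contamination factors Cn / Bn. The R sketch below evaluates them for one hypothetical sample; the concentrations and geochemical background values are illustrative, not the study's measurements.

```r
# Sketch of the geoaccumulation index (I-geo) and pollution load index (PLI)
# for a single sample; all values are illustrative placeholders.
conc <- c(Pb = 45, Cd = 0.8, Zn = 120, Cu = 60)        # measured concentrations (mg/kg)
bkg  <- c(Pb = 20, Cd = 0.3, Zn = 95,  Cu = 45)        # assumed geochemical background values

igeo <- log2(conc / (1.5 * bkg))                       # I-geo = log2(Cn / (1.5 * Bn))
cf   <- conc / bkg                                     # contamination factor per element
pli  <- prod(cf)^(1 / length(cf))                      # PLI = (CF1 * CF2 * ... * CFn)^(1/n)

round(igeo, 2); round(pli, 2)
```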

Publication Date
Sun Oct 23 2022
Journal Name
Baghdad Science Journal
Comparison Between Deterministic and Stochastic Model for Interaction (COVID-19) With Host Cells in Humans

In this paper, deterministic and stochastic models are proposed to study the interaction of the Coronavirus (COVID-19) with host cells inside the human body. In the deterministic model, the value of the basic reproduction number R0 determines the persistence or extinction of COVID-19. If R0 < 1, one infected cell will transmit the virus to less than one cell, and as a result the person carrying the Coronavirus will get rid of the disease. If R0 > 1, the infected cell will be able to infect all cells that contain ACE receptors. The stochastic model proves that if the noise intensities are sufficiently large, the disease may ultimately go extinct even though R0 > 1, a fact also confirmed by computer simulation.
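For intuition, the sketch below solves a standard target-cell-limited within-host model with deSolve and computes its basic reproduction number; the equations and parameter values are generic textbook assumptions and are not claimed to be the paper's deterministic or stochastic system.

```r
# Generic within-host (target cell) viral dynamics sketch, solved with deSolve;
# a standard illustrative model, not the system used in the paper.
library(deSolve)

model <- function(t, state, parms) {
  with(as.list(c(state, parms)), {
    dT <- lambda - d * T - beta * T * V     # susceptible target cells (e.g. ACE2-bearing)
    dI <- beta * T * V - delta * I          # infected cells
    dV <- p * I - cl * V                    # free virus
    list(c(dT, dI, dV))
  })
}

parms <- c(lambda = 1e4, d = 0.1, beta = 1e-6, delta = 1, p = 100, cl = 5)
state <- c(T = 1e5, I = 0, V = 10)

# basic reproduction number for this model: beta * p * T0 / (delta * cl), T0 = lambda / d
R0 <- with(as.list(parms), beta * p * (lambda / d) / (delta * cl))
R0                                          # > 1: infection takes off; < 1: it dies out

out <- ode(y = state, times = seq(0, 30, by = 0.1), func = model, parms = parms)
head(out)
```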

Publication Date
Thu Dec 01 2011
Journal Name
Iraqi Journal Of Physics
Multilayer Perceptron for analyzing satellite data

Different multilayer perceptron (MLP) architectures have been trained by backpropagation (BP) and used to analyze Landsat TM images. Two different training approaches have been applied: an ordinary approach (one hidden layer, M-H1-L, and two hidden layers, M-H1-H2-L) and a one-against-all strategy (one hidden layer, (M-H1-1)xL, and two hidden layers, (M-H1-H2-1)xL). Classification accuracy of up to 90% has been achieved using the one-against-all strategy with the two-hidden-layer architecture. The performance of the one-against-all approach is slightly better than that of the ordinary approach.
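A small R sketch of the one-against-all strategy with single-hidden-layer networks (nnet): one binary network is trained per class and the class with the highest score wins. The iris data and network sizes are stand-ins for the Landsat TM pixels and the M-H1-L architectures, not the paper's setup.

```r
# One-against-all classification with single-hidden-layer MLPs (nnet),
# using iris as a stand-in for multispectral pixel samples.
library(nnet)

set.seed(6)
x       <- as.matrix(iris[, 1:4])
classes <- levels(iris$Species)

# train one binary network per class (class vs. rest)
nets <- lapply(classes, function(cl) {
  target <- as.numeric(iris$Species == cl)
  nnet(x, target, size = 5, decay = 1e-3, maxit = 200, trace = FALSE)
})

# predict: each network scores its own class; the highest score wins
scores <- sapply(nets, function(net) predict(net, x))
pred   <- classes[max.col(scores)]
mean(pred == iris$Species)      # overall training accuracy
```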

Publication Date
Tue Jan 01 2019
Journal Name
Indian Journal Of Ecology
Horizontal variability of some soil properties in Wasit governorate by using time series analysis

Publication Date
Sun Jun 01 2014
Journal Name
Baghdad Science Journal
The Prevalence of Hepatitis B Virus in High Risk Groups in Nineveh Governorate / Iraq

Hepatitis B is an inflammation of the liver caused by the Hepatitis B virus (HBV), a DNA virus that infects humans and some kinds of animals such as chimpanzees and birds. The disease is considered a major disease of mankind and a serious global public health problem. HBsAg, HBeAg, HBcAb, HBeAb and HBsAb are markers used to detect the presence and the stage of infection. The current study included 181 individuals of both sexes, 137 males and 44 females, a ratio of 3.11:1. The mean age of patients was 2.4033 ± 0.83519 (range 18-73) years, distributed as follows: <20 (11.6%), 21-40 (47.5%), 41-60 (29.8%) and >60 (11.0%). These patients were 73 (40.4%) blood donors from the Central Blood Bank, 88 (48.6%) chronic kidney failure patients at Ibn –…

Publication Date
Mon Apr 01 2019
Journal Name
2019 International Conference On Automation, Computational And Technology Management (ICACTM)
Multi-Resolution Hierarchical Structure for Efficient Data Aggregation and Mining of Big Data

Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data need time- and memory-efficient techniques, albeit at the cost of some possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
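One hypothetical way such a structure could look is sketched below in R: incoming records are summarized at several bucket widths by keeping only counts, sums and sums of squares, so means and variances can be recovered at any resolution without storing the raw data. The widths, keys and update rule are assumptions, not the authors' implementation.

```r
# Hypothetical multi-resolution aggregation sketch: each (time, value) record updates
# a bucket at every resolution, storing only count, sum and sum of squares.
new_aggregator <- function(widths = c(1, 10, 100)) {
  list(widths = widths,
       levels = lapply(widths, function(w) list()))    # one bucket table per resolution
}

add_record <- function(agg, t, x) {
  for (k in seq_along(agg$widths)) {
    key <- as.character(t %/% agg$widths[k])           # bucket index at this resolution
    b <- agg$levels[[k]][[key]]
    if (is.null(b)) b <- c(n = 0, s = 0, ss = 0)
    agg$levels[[k]][[key]] <- b + c(1, x, x^2)          # incremental update
  }
  agg
}

summarise_level <- function(agg, k) {
  do.call(rbind, lapply(agg$levels[[k]], function(b) {
    n <- unname(b["n"]); s <- unname(b["s"]); ss <- unname(b["ss"])
    c(n = n, mean = s / n, var = ss / n - (s / n)^2)   # moments recovered per bucket
  }))
}

# feed a simulated stream and inspect the coarsest resolution
agg <- new_aggregator()
set.seed(7)
for (t in 1:500) agg <- add_record(agg, t, rnorm(1, mean = sin(t / 50)))
head(summarise_level(agg, 3))
```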
