Breast cancer has received much attention in recent years, as it is one of the complex diseases that can threaten people's lives. It can be detected from the levels of secreted proteins in the blood. In this project, we developed a method for finding a threshold to classify the probability of being affected by the disease in a population, based on the levels of the related proteins in relatively small case-control samples. We applied the method to simulated and real data. The results showed that the method was accurate in estimating the probability of being diseased in both the simulated and the real data. Moreover, we were able to calculate the sensitivity and specificity under the null hypothesis of our research question.
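As an illustration of the threshold idea described above (a minimal sketch, not the paper's actual threshold-estimation procedure), sensitivity and specificity for a given cut-off on protein levels can be computed directly from case and control samples:

```python
def sensitivity_specificity(cases, controls, threshold):
    """Given protein levels for diseased (cases) and healthy (controls)
    subjects, classify values above `threshold` as 'diseased' and return
    (sensitivity, specificity). Illustrative only: the paper's method for
    choosing the threshold itself is not reproduced here."""
    true_positives = sum(x > threshold for x in cases)      # cases correctly flagged
    true_negatives = sum(x <= threshold for x in controls)  # controls correctly cleared
    return true_positives / len(cases), true_negatives / len(controls)
```

Sweeping the threshold over the observed range of protein levels traces out the trade-off between the two quantities.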
The logistic regression model is one of the nonlinear models that aims at obtaining highly efficient capabilities. It also gives the researcher an idea of the effect of the explanatory variable on the binary response variable.
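The "effect of the explanatory variable" mentioned above is usually read off the fitted coefficients. A minimal sketch (with hypothetical coefficients b0, b1, not taken from the paper) of how a fitted logistic model maps a covariate to a probability:

```python
import math

def sigmoid(z):
    """Inverse of the logit link: maps log-odds to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predicted_probability(b0, b1, x):
    """For a fitted model logit(p) = b0 + b1*x, return P(Y = 1 | x).
    The coefficient b1 is the change in log-odds per unit increase in x,
    so exp(b1) is the corresponding odds ratio."""
    return sigmoid(b0 + b1 * x)
```

For example, with b0 = -1 and b1 = 2, a subject at x = 0.5 sits exactly at log-odds 0, i.e. probability 0.5.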
Background: One of the most commonly encountered problems is postburn contracture, which has both functional and aesthetic impacts on patients. Various surgical methods have been proposed to treat this problem. Aim: To evaluate the effectiveness of the square flap in the management of postburn contracture in several parts of the body. Patients and methods: From April 2019 to June 2020, a total of 20 patients who had postburn contracture in various parts of their bodies underwent scar contracture release using a square flap. The follow-up period ranged from 6 to 12 months. Results: All of our patients achieved complete release of their band, with maximum postoperative motion together with an acceptable aesthetic outcome.
The change in project cost, or cost growth, arises from many factors, some of which are related to soil problems that may occur during construction and/or during the site-investigation period. This paper describes a new, minimum-cost soil improvement method using polymer fiber materials 3 cm long in both directions and 2.5 mm thick, distributed in uniform medium-dense sandy soil at different depths (B, 1.5B and 2B) below the footings. Three square footings (5, 7.5 and 10 cm) were used to carry out the investigation, using a lever-arm loading system designed for this purpose. The fibers were distributed from a depth of 0.1B below the footing base down to the investigated depth.
The current research aimed to identify the tasks performed by internal auditors when developing a business continuity plan to face the COVID-19 crisis, and to identify the recovery and resumption plan for the business environment. The research followed a descriptive survey to elicit the views of 34 internal auditors at various functional levels in the Kingdom of Saudi Arabia. Spreadsheets (Excel) were used to analyze the data, which were collected by a questionnaire composed of 43 statements covering the tasks that internal auditors can perform to face the COVID-19 crisis. Results revealed the tasks performed by internal auditors when developing a business continuity plan to face the crisis.
In this paper, we propose two new predictor-corrector methods for solving Kepler's equation in the hyperbolic case using a quadrature formula, which plays an important role in the evaluation of the integrals. The two procedures solve the hyperbolic orbit equation in two or three iterations, in a very efficient manner and to an accuracy that proves to be consistently better than 10^-15. The solution was examined over ranges of the eccentricity e and the hyperbolic mean anomaly M, using first guesses for the hyperbolic eccentric anomaly expressed in terms of e and M.
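For context, the hyperbolic Kepler equation being solved is M = e sinh(H) - H for the hyperbolic eccentric anomaly H. The sketch below uses plain Newton iteration with a common first guess; it stands in for, and is not, the paper's predictor-corrector scheme:

```python
import math

def solve_hyperbolic_kepler(M, e, tol=1e-15, max_iter=50):
    """Solve M = e*sinh(H) - H for H, given hyperbolic mean anomaly M
    and eccentricity e > 1, by Newton's method. Illustrative stand-in
    for the paper's (unspecified) predictor-corrector procedures."""
    # A common first guess for moderate M: invert the sinh term alone.
    H = math.asinh(M / e)
    for _ in range(max_iter):
        f = e * math.sinh(H) - H - M     # residual of Kepler's equation
        fp = e * math.cosh(H) - 1.0      # derivative df/dH (> 0 for e > 1)
        dH = f / fp
        H -= dH
        if abs(dH) < tol:
            break
    return H
```

Because f is monotone increasing for e > 1, Newton's method converges from this guess over a wide range of M; quadratic convergence typically reaches machine precision in a handful of iterations.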
Texture synthesis using genetic algorithms is one way, proposed in previous research, to synthesize texture quickly and easily. In genetic texture-synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection is highly dependent on the user's experience; hence, a wrong selection of blocks will greatly affect the synthesized texture. In this paper, a new method is suggested for selecting the blocks automatically, without the participation of the user. The results show that this method of selection eliminates some of the blending caused by the previous manual method of selection.
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson (COM-Poisson) regression model. This study aimed to compare the two models and choose the better of them using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. MATLAB was used to conduct the simulation experiment. The results showed the superiority of the Poisson model according to the mean squared error (MSE) criterion and also the Akaike information criterion (AIC) for the same distribution.
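The two comparison criteria named above can be made concrete. A minimal sketch (not the study's MATLAB code) of MSE and AIC for a Poisson model with fitted means mu and k estimated parameters, using the Poisson log-likelihood:

```python
import math

def poisson_aic(y, mu, k):
    """AIC = 2k - 2*log-likelihood for a Poisson model, where y are
    observed counts, mu the fitted means, and k the number of
    estimated parameters. Lower AIC indicates the preferred model."""
    loglik = sum(-m + yi * math.log(m) - math.lgamma(yi + 1)
                 for yi, m in zip(y, mu))
    return 2 * k - 2 * loglik

def mse(y, mu):
    """Mean squared error between observed counts and fitted means."""
    return sum((yi - m) ** 2 for yi, m in zip(y, mu)) / len(y)
```

In a simulation study such as the one described, both quantities would be averaged over the r replications at each sample size before comparing the two models.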
Regression analysis is a cornerstone of statistics, and it mostly depends on the ordinary least squares (OLS) method. As is well known, this method requires several conditions to hold in order to operate accurately; otherwise its results can be unreliable, and the absence of certain conditions makes it impossible to complete the analysis. Among those conditions is the multicollinearity problem, which we detect among the independent variables using the Farrar-Glauber test, in addition to checking the linearity of the data.
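The Farrar-Glauber test mentioned above screens for overall multicollinearity via a chi-square statistic on the determinant of the regressors' correlation matrix, chi2 = -[n - 1 - (2p + 5)/6] * ln|R|. A minimal sketch (illustrative, not the paper's implementation):

```python
import math
import numpy as np

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square statistic for an (n x p) regressor
    matrix X. A large value relative to a chi-square distribution with
    p(p-1)/2 degrees of freedom suggests multicollinearity."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)      # p x p correlation matrix
    det_R = np.linalg.det(R)              # near 0 when regressors are collinear
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * math.log(det_R)
    df = p * (p - 1) // 2
    return chi2, df
```

When |R| is close to 1 the regressors are nearly orthogonal and the statistic is small; as |R| shrinks toward 0, the statistic grows without bound.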
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy in big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms entails a very high amount of energy consumption.