Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD.
Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD, and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of CAD in patients with unstable angina/non-ST-elevation myocardial infarction (UA/NSTEMI).
Methods: We conducted a descriptive study of 100 patients admitted with UA/NSTEMI to three major cardiac centers in Iraq (the Iraqi Centre for Heart Disease, Ibn Al-Bitar Hospital for Cardiac Surgery, and Al-Nasyria Cardiac Centre) from January 2010 to January 2011. The frequency of each conventional risk factor and the number of conventional risk factors present were estimated at study entry and compared between men and women and by age. The TIMI risk score was stratified on seven standard variables. The extent of CAD was evaluated by angiography; significant CAD was defined as ≥70% stenosis in any one of the three major epicardial vessels, or ≥50% in the left main stem (LMS).
Results: Among the 100 patients with UA/NSTEMI, 82% had one or more risk factors and only 18% lacked all four conventional risk factors. Smoking was the most common risk factor in male patients, while diabetes mellitus and dyslipidemia were more common among female patients; all of these results were statistically significant. Sixty-four percent of patients had a TIMI score < 4 (low or intermediate TIMI risk) and 36% had a TIMI score > 4 (high TIMI risk). Patients with a TIMI score > 4 were more likely to have significant three-vessel or LMS disease, whereas those with a TIMI score < 4 had less severe disease (single- or two-vessel disease).
Conclusion: Antecedent exposure to the major CAD risk factors was very common among those who developed CAD, emphasizing the importance of considering all major risk factors in CAD risk estimation. Patients with a high TIMI risk score were more likely to have severe multivessel CAD than those with a low or intermediate TIMI risk score. Hence, patients with a TIMI score > 4 should be referred for early invasive coronary evaluation to derive clinical benefit.
Key words: unstable angina, Thrombolysis in Myocardial Infarction score, risk factors
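The seven-variable scoring mentioned in the Methods can be illustrated with a short sketch. The variable names below are descriptive labels for the commonly cited TIMI UA/NSTEMI components and the dichotomization mirrors the study's > 4 cutoff; they are assumptions for illustration, not data taken from the study itself.

```python
# Illustrative sketch of a 7-variable TIMI-style risk score for UA/NSTEMI.
# Each variable present contributes one point (total 0-7). Variable names
# are descriptive labels chosen here, not identifiers from the study.
TIMI_VARIABLES = [
    "age_65_or_older",
    "three_or_more_cad_risk_factors",
    "known_coronary_stenosis_50pct",
    "st_segment_deviation",
    "two_or_more_anginal_events_24h",
    "aspirin_use_past_7_days",
    "elevated_cardiac_markers",
]

def timi_score(patient: dict) -> int:
    """Sum one point for each TIMI variable present (0-7)."""
    return sum(1 for v in TIMI_VARIABLES if patient.get(v, False))

def risk_stratum(score: int) -> str:
    """Dichotomize as in the study: > 4 is high risk, otherwise low/intermediate."""
    return "high" if score > 4 else "low/intermediate"

# Hypothetical patient with three of the seven variables present.
patient = {"age_65_or_older": True,
           "st_segment_deviation": True,
           "elevated_cardiac_markers": True}
print(timi_score(patient), risk_stratum(timi_score(patient)))  # 3 low/intermediate
```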
Organizations today need to move toward strategic thinking, which means analyzing the situations they face, particularly the challenges of change in the external environment. These challenges make it imperative for an organization to reconsider its strategies, orientations, and operations, a so-called re-engineering, in order to meet those challenges and pressures and to achieve fundamental improvement in the organization's structure and working methods, leading to high levels of performance that are reflected in the achievement of its objectives. The current research aims to examine the implications of the characteristics of strategic thinking for the stages of applying business re-engineering in the General Industries Company.
Background: Orthodontic tooth movement is characterized by tissue reactions that consist of an inflammatory response in the periodontal ligament followed by bone remodeling in the periodontium, depending on the forces applied. These processes trigger the secretion of various proteins and enzymes into the saliva. The purpose of this study was to evaluate the activity of alkaline phosphatase (ALP) in saliva during orthodontic tooth movement using different magnitudes of continuous orthodontic force. Materials and Methods: Thirty orthodontic patients (12 males and 18 females) aged 17-23 years with Class II division 1 malocclusion, all requiring bilateral maxillary first premolar extractions, were randomly divided into three groups according to t
There are many techniques that can be used to estimate spray quality traits such as spray coverage, droplet density, droplet count, and droplet diameter. One of the most common is to use water-sensitive papers (WSP) as spray collectors under field conditions and to analyze them with various software packages. However, some droplets may merge after depositing on the WSP, and this can affect the accuracy of the results. In this research, an image processing technique was used to better estimate the spray traits and to overcome the droplet-merger problem. Droplets were classified as non-merged or merged based on their roundness, then the merged droplets were separated based on the average non-m
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB
In recent years, data centre (DC) networks have improved their rapid exchanging abilities. Software-defined networking (SDN) was introduced to change the character of conventional networks by separating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of applications, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior performance when executing traffic load-balancing (LB) jobs. The LB function divides traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur
... Show MoreAcceptable Bit Error rate can be maintained by adapting some of the design parameters such as modulation, symbol rate, constellation size, and transmit power according to the channel state.
An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring the SINAD (Signal-plus-Noise-plus-Distortion to Noise-plus-Distortion ratio), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) fo
In the current paradigms of information technology, cloud computing is the most essential kind of computer service. It satisfies the needs of high-volume customers, offers flexible computing capabilities for a range of applications such as database archiving and business analytics, and provides the extra computing resources that deliver financial value for cloud providers. The purpose of this investigation is to assess the viability of performing data audits remotely within a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses how to safeguard the data that is outsourced and stored in cloud serv
Visual analytics has become an important approach for discovering patterns in big data. As visualization struggles with the high dimensionality of data, issues like a concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. A data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu
Permeability data are of major importance and must be handled carefully in all reservoir simulation studies. Their importance increases in mature oil and gas fields because of their sensitivity to the requirements of certain improved-recovery methods. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
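One classical relation between gas and liquid permeability is the Klinkenberg correction, k_gas = k_liq (1 + b / p_mean), where b is the gas slippage factor and p_mean the mean test pressure. The sketch below simply inverts that relation as a rough illustration of air-to-liquid conversion; the study's own correlation is not reproduced here, and the parameter values are hypothetical.

```python
# Illustrative Klinkenberg-style conversion from gas (air) permeability to an
# equivalent liquid permeability: k_gas = k_liq * (1 + b / p_mean).
# This is NOT the correlation proposed in the study; b and p_mean below are
# hypothetical values for demonstration.
def liquid_permeability(k_air_md: float, b: float, p_mean: float) -> float:
    """Invert the Klinkenberg relation to estimate liquid permeability (mD)."""
    return k_air_md / (1.0 + b / p_mean)

# Example: 10 mD air permeability, slippage factor 0.5, mean pressure 1.0 atm.
print(round(liquid_permeability(10.0, 0.5, 1.0), 3))  # 6.667
```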
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of missing data, in loose and poorly consolidated formations, or in cas