A transient drop in heart rate or transient atrioventricular block (AVB) may be considered the main cause of syncope or presyncope in patients with bifascicular block (BFB); according to the guidelines for cardiac pacing, a pacemaker is considered part of the treatment. The aims of our study were to evaluate whether there is a role for electrophysiological study (EPS) in patients with BFB and to evaluate symptoms after pacing. Forty-two patients with a mean age of 63.4 ± 12.2 years, suffering from an intraventricular conduction defect and syncope, were enrolled; patients underwent EPS on admission, pacemaker implantation was performed accordingly, and the device was followed up on a programmed schedule over the last four years. Our patients were 25 (59.5%) male and 17 (40.5%) female, all of them with syncope or presyncope and good left ventricular systolic function (LVEF ≥ 55%). Left bundle branch block (LBBB) was found in 28 (66.7%) patients, while right bundle branch block was found in 14 (33.3%). The EPS identified an HV-interval cut-off for pacing of 75 ms, which had a sensitivity of 91% and a specificity of 80%; greater HV intervals gave more successful pacing results. A pacemaker was implanted in 27 (64.3%) of the patients, with a significant relation between pacing and the disappearance of syncope after implantation (p = 0.000), while in 15 (35.7%) no pacemaker was implanted and symptoms persisted. Pacing was more frequent among patients with coronary artery disease and LBBB with abnormal EPS findings. Permanent pacemaker implantation can be performed directly in elderly patients with syncope and bifascicular block associated with LBBB and coronary artery disease, without or before an EP study.
Because of the importance of image compression in reducing the volume of data, such compression is permanently necessary: compressed images can be transferred more quickly over communication channels and stored in less memory space. In this study, an efficient compression system is suggested; it depends on transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) together with the LZW compression technique. The suggested scheme was applied to color and gray models, and transform coding was then applied to decompose each color and gray sub-band individually. A quantization process is performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven stand
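The pipeline above (transform coding, quantization, then dictionary coding) can be illustrated with a minimal sketch. This is not the paper's implementation: the 8x8 block size, the single uniform quantization step, and the toy LZW encoder are assumptions chosen only to make the stages concrete.

```python
# Minimal sketch of a transform-coding + LZW pipeline (assumed parameters:
# 8x8 block DCT, uniform quantization step, classic byte-oriented LZW).
import numpy as np
from scipy.fftpack import dct

def block_dct_quantize(gray_image, block=8, qstep=16):
    """Apply a 2-D DCT per 8x8 block and uniformly quantize the coefficients."""
    h, w = gray_image.shape
    h, w = h - h % block, w - w % block            # crop to a multiple of the block size
    out = np.empty((h, w), dtype=np.int32)
    for r in range(0, h, block):
        for c in range(0, w, block):
            patch = gray_image[r:r+block, c:c+block].astype(float)
            coeff = dct(dct(patch, axis=0, norm='ortho'), axis=1, norm='ortho')
            out[r:r+block, c:c+block] = np.round(coeff / qstep)
    return out

def lzw_encode(symbols):
    """Encode a sequence of byte symbols with a classic LZW dictionary."""
    dictionary = {bytes([i]): i for i in range(256)}
    word, codes = b"", []
    for s in bytes(symbols):
        candidate = word + bytes([s])
        if candidate in dictionary:
            word = candidate
        else:
            codes.append(dictionary[word])
            dictionary[candidate] = len(dictionary)
            word = bytes([s])
    if word:
        codes.append(dictionary[word])
    return codes

# Example: quantized coefficients are offset into 0..255 and LZW-coded.
image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in for a gray image
q = block_dct_quantize(image)
stream = np.clip(q + 128, 0, 255).astype(np.uint8).ravel()
print(len(stream), "->", len(lzw_encode(stream)), "LZW codes")
```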
The objective of this research is to assess the economic feasibility of hydroponics technology by estimating the expected demand for green forage for the years 2021-2031, as well as to identify and analyze project data and information in a way that helps the investor make the appropriate investment decision, in addition to preparing a detailed preliminary technical study for the cultivated-barley project focusing on the commercial and financing aspects and on criteria that take account of risks and uncertainties. The results indicate the economic feasibility of the project to produce green forage using hydroponics technology. Cultivated barley as a product falls within the blue ocean strategy. Accordingly, the research recommends the necess
Image classification is the process of finding common features in images from various classes and applying them to categorize and label the images. The main problems of the image classification process are the abundance of images, the high complexity of the data, and the shortage of labeled data, which present the key obstacles in image classification. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training them with machine learning classifiers. This study proposes a new "hybrid learning" approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class
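A minimal sketch of the hybrid-learning idea described above: VGG-16 is used as a fixed convolutional feature extractor and a classical machine-learning classifier is trained on top. The choice of SVM, global average pooling, and the dummy data pipeline are assumptions for illustration only, not details taken from the paper.

```python
# Hybrid learning sketch: pretrained VGG-16 features + a classical ML classifier.
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from sklearn.svm import SVC

# VGG-16 without its dense head; global average pooling yields a 512-d feature vector.
extractor = VGG16(weights="imagenet", include_top=False, pooling="avg")

def extract_features(images):
    """images: float array of shape (n, 224, 224, 3) in the 0..255 range."""
    return extractor.predict(preprocess_input(images.copy()), verbose=0)

# Dummy data stands in for a labeled image set.
X_train = np.random.rand(8, 224, 224, 3) * 255
y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1])

features = extract_features(X_train)             # (8, 512) convolutional features
clf = SVC(kernel="rbf").fit(features, y_train)   # classical classifier on top
print(clf.predict(extract_features(X_train[:2])))
```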
The analysis of survival and reliability is considered among the important topics and methods of vital statistics at the present time because of its importance in various demographic, medical, industrial, and engineering fields. This research focuses on generating random samples from the Generalized Gamma (GG) probability distribution using the Inverse Transformation Method (ITM). Because the distribution's cumulative distribution function involves the incomplete Gamma integral, classical estimation becomes more difficult, so a numerical-approximation method is needed before the survival function can be estimated. The survival function was then estimated by Monte Carlo simulation. The Entropy method was used for the
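A compact sketch of inverse-transform sampling from a Generalized Gamma distribution and a Monte Carlo estimate of its survival function is shown below. The Stacy parameterization with scale a and shape parameters d and p is an assumption; the paper's exact parameterization and its Entropy estimator are not reproduced here.

```python
# Inverse-transform sampling from GG(a, d, p) and Monte Carlo survival estimation.
import numpy as np
from scipy.special import gammainc, gammaincinv

def gg_inverse_transform(n, a=1.0, d=2.0, p=1.5, seed=0):
    """Draw n GG(a, d, p) variates: if X ~ GG, then (X/a)^p ~ Gamma(d/p, 1)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    return a * gammaincinv(d / p, u) ** (1.0 / p)

def gg_survival_exact(x, a=1.0, d=2.0, p=1.5):
    """S(x) = 1 - F(x), with F written via the regularized incomplete Gamma."""
    return 1.0 - gammainc(d / p, (x / a) ** p)

# Monte Carlo estimate of the survival function from the simulated sample.
sample = gg_inverse_transform(100_000)
grid = np.linspace(0.1, 3.0, 6)
s_mc = np.array([(sample > x).mean() for x in grid])
print(np.column_stack([grid, s_mc, gg_survival_exact(grid)]))
```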
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data that we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. A comparison of these methods was made according to their results in estimating the component parameters. Observation membership was also inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the
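As a generic illustration of mixture-regression fitting and membership inference, here is a compact EM sketch for a two-component finite mixture of linear regressions. It is not any of the specific methods compared in the paper; the number of components, the Gaussian error assumption, and the synthetic data are all assumptions.

```python
# EM for a two-component mixture of linear regressions (illustrative sketch).
import numpy as np

def fit_mixture_regression(X, y, K=2, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = rng.normal(size=(K, d))
    sigma2 = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibility of each component for each observation.
        resid = y[:, None] - X @ beta.T                          # (n, K)
        dens = np.exp(-0.5 * resid**2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares per component.
        for k in range(K):
            w = r[:, k]
            Xw = X * w[:, None]
            beta[k] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
            sigma2[k] = (w * (y - X @ beta[k])**2).sum() / w.sum()
        pi = r.mean(axis=0)
    return beta, sigma2, pi, r        # r gives the inferred observation membership

# Synthetic data from two regression lines.
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 400)
X = np.column_stack([np.ones_like(x), x])
z = rng.integers(0, 2, 400)
y = np.where(z == 0, 1 + 2 * x, -1 - 2 * x) + rng.normal(scale=0.3, size=400)
beta, sigma2, pi, r = fit_mixture_regression(X, y)
print(beta)   # recovered intercepts and slopes for the two components
```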
Evolutionary algorithms (EAs), as global search methods, have proved to be more robust than their counterpart local heuristics for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters; these components are the solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust E
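For readers unfamiliar with the canonical components the abstract refers to (representation, selection, crossover, mutation), the following sketch shows them in their traditional form on a toy bit-string problem. It is purely illustrative and is not the complex-detection algorithm proposed in the paper.

```python
# Canonical EA skeleton: representation, tournament selection, one-point
# crossover, and bit-flip mutation on a toy maximize-the-ones objective.
import numpy as np

rng = np.random.default_rng(0)
POP, GENES, GENERATIONS = 40, 30, 100

def fitness(individual):
    return individual.sum()                       # toy objective

def tournament_select(pop, scores, k=3):
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmax(scores[idx])]].copy()

def one_point_crossover(a, b):
    cut = rng.integers(1, GENES)
    return np.concatenate([a[:cut], b[cut:]])

def bitflip_mutation(ind, rate=0.02):
    mask = rng.random(GENES) < rate
    ind[mask] ^= 1
    return ind

population = rng.integers(0, 2, (POP, GENES))     # bit-string representation
for _ in range(GENERATIONS):
    scores = np.array([fitness(ind) for ind in population])
    children = [bitflip_mutation(one_point_crossover(
        tournament_select(population, scores),
        tournament_select(population, scores))) for _ in range(POP)]
    population = np.array(children)
print(max(fitness(ind) for ind in population))
```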
This paper presents research on the magnetohydrodynamic (MHD) flow of an incompressible generalized Burgers' fluid induced by an accelerating plate and flowing under the action of a pressure gradient, where the no-slip assumption between the wall and the fluid is no longer valid. The fractional calculus approach is introduced to establish the constitutive relationship of the generalized Burgers' fluid. By using the discrete Laplace transform of the sequential fractional derivatives, closed-form solutions for the velocity and shear stress are obtained in terms of the Fox H-function for the following two problems: (i) flow due to a constant pressure gradient, and (ii) flow due to a sinusoidal pressure gradient. The solutions for
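For orientation, one common form of the fractional constitutive relation for a generalized Burgers' fluid in unidirectional shear flow is sketched below; the paper's exact relation, notation, and fractional-derivative definition may differ.

```latex
% Hedged sketch of a fractional generalized Burgers' constitutive relation,
% for unidirectional shear flow u = u(y,t) with shear stress \tau(y,t):
\[
\bigl(1 + \lambda_1 D_t^{\alpha} + \lambda_2 D_t^{2\alpha}\bigr)\,\tau
  = \mu \bigl(1 + \lambda_3 D_t^{\beta} + \lambda_4 D_t^{2\beta}\bigr)
    \frac{\partial u}{\partial y},
\qquad 0 < \alpha,\ \beta \le 1,
\]
% where D_t^{\alpha} denotes a fractional time derivative, \mu the dynamic
% viscosity, and \lambda_i the material relaxation/retardation constants.
```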
MPEG-DASH is an adaptive bitrate streaming technology that divides video content into small HTTP-object file segments at different bitrates. In live UHD video streaming, latency is the most important problem. This paper creates a low-delay streaming system using HTTP/2. Based on the network condition, the proposed system adaptively determines the bitrate of the segments. The video is coded using the layered H.265/HEVC compression standard and is then tested to investigate the relationship between video quality and bitrate for various HEVC parameters and degrees of video motion at each layer/resolution. The system architecture includes the encoder/decoder configurations and how the adaptive video streaming is embedded. The encoder includes compression besi
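A minimal sketch of how a client can adapt segment bitrate to the network condition is shown below. The bitrate ladder, the harmonic-mean throughput estimator, and the safety margin are generic assumptions; the paper's exact adaptation logic, HTTP/2 transport, and HEVC layer mapping are not reproduced.

```python
# Throughput-based adaptive bitrate selection for DASH-style segments (sketch).
BITRATE_LADDER_KBPS = [1_500, 4_000, 8_000, 16_000, 35_000]   # e.g. up to UHD

def select_bitrate(throughput_samples_kbps, safety=0.8):
    """Pick the highest ladder rung below a safety margin of recent throughput."""
    if not throughput_samples_kbps:
        return BITRATE_LADDER_KBPS[0]
    # Harmonic mean is a common, conservative throughput estimator.
    n = len(throughput_samples_kbps)
    est = n / sum(1.0 / t for t in throughput_samples_kbps)
    budget = safety * est
    eligible = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return eligible[-1] if eligible else BITRATE_LADDER_KBPS[0]

# Example: recent per-segment download throughputs drive the next request.
history = [22_000, 18_500, 25_000]   # measured kbps for the last segments
print(select_bitrate(history))       # -> 16000 (highest safe rung)
```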
Objective: The aim of the study was to evaluate the nursing care management of diabetes mellitus patients with total hip replacement after a fractured hip.
Methodology: A field study was carried out on patients with diabetes mellitus who had a total hip replacement after a fractured hip, in the orthopedic ward of the hospital of surgical specialization (male and female), during January 2002 to January 2003. A physical and psychological nursing assessment immediately after surgery was done for both subjects (control and experimental), and then scientific management with daily nursing care was provided to the experimental subject, with daily nursing care suited to the patient's condition using scientific and practical methods, and leave th