The present paper describes and analyzes three proposed cogeneration plants, a back-pressure steam-turbine system, a gas-turbine system, and a diesel-engine system, alongside the present Dura refinery plant. Selected actual operating data are employed for the analysis. The same electrical and thermal product outputs are considered for all systems to facilitate comparison. The theoretical analysis was performed according to the first and second laws of thermodynamics. The results demonstrate that exergy analysis is a useful tool for evaluating the performance of cogeneration systems and permits meaningful comparisons of different systems on their merits; they also show that the back-pressure steam-turbine plant is more efficient than the other proposals. Moreover, the results indicate that these alternative plants can produce more electric power than the refinery requires. Industrial cogeneration plants are therefore recommended in Iraq at the present time, especially in the petroleum industry sectors, to help the Ministry of Electricity address the current crisis in electric power generation; the excess power can be sold to the main electric network. The economic analysis confirmed the feasibility of the proposed cogeneration plants, with payback periods of four years and six months, three years and eight months, and ten years for the steam-turbine, gas-turbine, and diesel-engine cogeneration plants, respectively.
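The payback periods quoted above follow from the simple (undiscounted) payback formula: capital cost divided by annual net saving. A minimal sketch, using hypothetical cost and saving figures for illustration only (the paper's actual capital costs and cash flows are not reproduced here):

```python
def payback_period(capital_cost, annual_net_saving):
    """Simple (undiscounted) payback period, returned as (years, months)."""
    years_exact = capital_cost / annual_net_saving
    years = int(years_exact)
    months = round((years_exact - years) * 12)
    if months == 12:  # avoid e.g. (3, 12) from rounding
        years, months = years + 1, 0
    return years, months

# Hypothetical figures (not from the paper): a plant costing $5.4M
# that yields $1.2M per year in net savings pays back in 4.5 years.
print(payback_period(5_400_000, 1_200_000))  # -> (4, 6)
```

A discounted payback calculation would lengthen these periods somewhat, since future savings are worth less than present ones.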
Machine learning offers significant advantages for many problems in the oil and gas industry, especially in resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they employ are outdated and poorly suited to a rigorous permeability computation. To …
The complexity and variety of language found in policy and academic documents make the automatic classification of research papers under the United Nations Sustainable Development Goals (SDGs) difficult. Using both pre-trained and contextual word embeddings to increase semantic understanding, this study presents a complete deep-learning pipeline combining Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures, with the primary aim of improving the comprehensibility and accuracy of SDG text classification, thereby enabling more effective policy monitoring and research evaluation. Document representation is achieved via Global Vectors (GloVe) and Bidirectional Encoder Representations from Transformers …
Radiotherapy is the medical use of ionizing radiation, commonly applied to cancerous tumors because of its ability to control cell growth. The amount of radiation used in photon radiation therapy is called the dose (measured in gray, Gy) and depends on the type and stage of the cancer being treated. In our work, we studied the dose distribution delivered to the tumor at different depths (0–20 cm) treated with different field sizes (4×4 to 23×23 cm). The results show that deeper treated areas receive a lower dose rate at the same beam quality and quantity. It was also noted that increasing the field size increases the depth dose at the same depth, even when the radiation energy is constant. The increase in radiation dose is attributed to the scattered …
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
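Entropy discretization, mentioned in the abstract above, chooses cut points on a numeric attribute that minimize the class-weighted entropy of the resulting bins. A minimal single-cut sketch in plain Python (the paper's multi-resolution variant operates on summarized data; the tiny data set here is purely illustrative):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return the cut point on a numeric attribute that minimizes the
    size-weighted entropy of the two resulting bins (binary split)."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_score, best_cut = float("inf"), None
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no valid cut between equal values
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        score = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        if score < best_score:
            best_score, best_cut = score, cut
    return best_cut

values = [1.0, 1.2, 1.4, 5.0, 5.2, 5.4]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_split(values, labels))  # -> 3.2 (perfect split, zero entropy)
```

Recursive application of this split (with a stopping rule such as MDL) yields a full multi-interval discretization.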
The hydraulic conditions of a flow have previously been shown to change when large-scale geometric roughness elements are placed on the bed of an open channel; these elements impose additional resistance on the flow. The geometry of the roughness elements, their number, and their configuration are parameters that can affect the hydraulic flow characteristics. The aim is to use inclined block elements to control the salt-wedge propagation observed in most estuaries and so prevent its negative effects. Computational Fluid Dynamics (CFD) software was used to simulate the two-phase flow in an estuary model. In this model, the block elements used have 2 cm × 3 cm cross-sections with a face inclined in the flow direction, with a length …
The transportation model is a well-recognized algorithm applied to the distribution of products in enterprise logistics operations. Multiple algorithmic and technological forms of solution are applied to determine the optimal allocation of a single type of product. In this research, the transportation model is formulated generally by means of linear programming, so that the optimal solution integrates different types of related products, and is illustrated through a dynamic, easy-to-follow digital example in the Excel QM program that develops understanding and supports implementation of the model in the organization.
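The transportation model described above is typically solved by first constructing an initial basic feasible solution and then improving it toward optimality. A minimal sketch of the classical northwest-corner rule on a hypothetical balanced instance (the supply and demand figures are illustrative, not from the research):

```python
def northwest_corner(supply, demand):
    """Initial basic feasible allocation for a balanced transportation
    problem (total supply == total demand), via the northwest-corner rule."""
    supply, demand = supply[:], demand[:]  # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])  # ship as much as both sides allow
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1  # source exhausted: move down
        else:
            j += 1  # destination satisfied: move right
    return alloc

# Hypothetical instance: two plants, three warehouses (units balanced at 70).
supply = [30, 40]
demand = [20, 25, 25]
print(northwest_corner(supply, demand))  # -> [[20, 10, 0], [0, 15, 25]]
```

This starting allocation ignores costs; a method such as MODI (or a general LP solver) would then iterate toward the cost-optimal shipment plan.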
Image compression is a suitable technique for reducing the storage space of an image, increasing the available storage in a device, and speeding up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law as a condition to distinguish uniform blocks (i.e., blocks with low, constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero. After that, the lossless Run-Length method is used to further compress the bits representing the bitmaps of these uniform blocks. Via this simple idea, the results are improved …
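The uniform-block test above can be sketched as a Weber-fraction check: a block is treated as uniform when its intensity contrast, relative to its mean brightness, falls below a just-noticeable-difference threshold. A minimal illustration (the 0.03 threshold and the tiny 2×2 blocks are assumptions for demonstration, not the paper's actual parameters):

```python
def is_uniform_block(block, weber_ratio=0.03):
    """Weber's-law test: a block is 'uniform' if its contrast relative to
    its mean intensity is below the threshold (threshold is illustrative)."""
    flat = [p for row in block for p in row]
    mean = sum(flat) / len(flat)
    if mean == 0:
        return True  # an all-black block is trivially uniform
    return (max(flat) - min(flat)) / mean <= weber_ratio

def ambtc_bitmap(block):
    """AMBTC bitmap: 1 where a pixel >= the block mean, 0 otherwise.
    Uniform blocks get an all-zero bitmap, which Run-Length coding
    then compresses very efficiently."""
    flat = [p for row in block for p in row]
    mean = sum(flat) / len(flat)
    if is_uniform_block(block):
        return [[0] * len(block[0]) for _ in block]
    return [[1 if p >= mean else 0 for p in row] for row in block]

smooth = [[120, 121], [121, 122]]  # low-contrast block
edge   = [[10, 200], [12, 198]]    # high-contrast block
print(ambtc_bitmap(smooth))  # -> [[0, 0], [0, 0]]
print(ambtc_bitmap(edge))    # -> [[0, 1], [0, 1]]
```

Because perceived brightness differences scale with background intensity, a ratio test of this kind flags more bright-region blocks as uniform than a fixed absolute-contrast threshold would.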