This paper presents a hybrid approach to the null values problem that combines rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model: a large set of complete data, called the learning data, is used to derive the decision rule sets that are then applied to the incomplete data. The swarm algorithm performs feature selection, with the Bees Algorithm acting as the heuristic search combined with rough set theory as the evaluation function. A second feature selection algorithm, ID3, is also presented; it is a statistical rather than an intelligent algorithm. The two approaches are compared on their performance in null value estimation through rough set theory. The results on most code sets show that the Bees Algorithm outperforms ID3, reducing the number of extracted rules without affecting accuracy and increasing the accuracy of null value estimation, especially as the number of null values grows.
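As a sketch of the evaluation function described above, the rough-set dependency degree of a candidate feature subset can serve as the fitness that the Bees Algorithm maximizes; the toy attributes `a`, `b` and decision `d` below are hypothetical, not the paper's learning data:

```python
from collections import defaultdict

def dependency_degree(rows, features, decision):
    """Rough-set dependency gamma(B, d): the fraction of rows whose
    B-indiscernibility class is consistent on the decision attribute."""
    classes = defaultdict(set)
    for row in rows:
        key = tuple(row[f] for f in features)
        classes[key].add(row[decision])
    # positive region: rows whose class carries a single decision value
    pos = sum(1 for row in rows
              if len(classes[tuple(row[f] for f in features)]) == 1)
    return pos / len(rows)

# toy learning data (hypothetical attributes, not from the paper)
rows = [
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
]
print(dependency_degree(rows, ["a"], "d"))       # 0.5: a alone leaves half ambiguous
print(dependency_degree(rows, ["a", "b"], "d"))  # 1.0: {a, b} fully determines d
```

A bees-style search would score each candidate feature subset with this function and keep subsets with the highest dependency degree.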
This study applies the theoretical framework of conceptual metaphor theory to the analysis of the source and target domains of metaphors used in two English nineteenth-century sonnets, both written by contemporaneous female poets. The quantitative and qualitative results of the textual analysis clearly reveal that Elizabeth Barrett Browning's Sonnet 23 centres around the conceptual mapping of the journey of love and life onto that of possession. In contrast, Christina Rossetti's sonnet Remember tackles the central conceptual mapping of death as a journey in relation to its further experiential connections. In addition, the application of conceptual metaphor theory in identifying the frequencies and densities of metaphors …
Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such as …
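As a sketch of the entropy discretization step, a single cut point can be chosen to minimize the class entropy weighted over the two resulting bins (the recursive, multi-resolution version described in the research is not shown); the values and labels below are illustrative:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Entropy discretization (single split): pick the boundary that
    minimizes the class entropy weighted over the two resulting bins."""
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        if w < best[0]:
            best = (w, cut)
    return best[1]

values = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_cut(values, labels))  # 6.5: clean separation between the two classes
```

Recursive application of this split to each bin, with a stopping criterion, yields a full discretization.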
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to the plain text (original message): the intelligible plain text is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, all built on the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
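A minimal sketch of how a Pascal matrix can encrypt and decrypt text, assuming the lower-triangular form (the paper's exact scheme, including its pseudo-random key, is not reproduced here); the inverse of the lower-triangular Pascal matrix simply alternates the signs of the binomial coefficients, so the round trip is exact over the integers:

```python
from math import comb

def pascal(n, inverse=False):
    """Lower-triangular Pascal matrix P[i][j] = C(i, j); its inverse
    has entries (-1)**(i+j) * C(i, j)."""
    sign = lambda i, j: (-1) ** (i + j) if inverse else 1
    return [[sign(i, j) * comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

msg = "HELLO"
codes = [ord(c) for c in msg]                     # text -> integers
cipher = matvec(pascal(len(codes)), codes)        # encrypt: multiply by P
plain = matvec(pascal(len(codes), True), cipher)  # decrypt: multiply by P^-1
print("".join(chr(x) for x in plain))  # HELLO
```

Because both matrices are integer-valued, no rounding error enters the decryption.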
Potential field data interpretation is significant for subsurface structure characterization. The current study is an attempt to explore the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may cause this anomaly and to present a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone. This tectonic boundary is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques, including the Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters, highlight source boundaries and the …
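As an illustration of the first edge-detection filter named above, the Total Horizontal Gradient of a gridded potential field can be sketched as follows; the synthetic step anomaly is illustrative only, not the study's data:

```python
import numpy as np

def total_horizontal_gradient(field, dx=1.0, dy=1.0):
    """Total Horizontal Gradient edge detector for a gridded field:
    THG = sqrt((df/dx)**2 + (df/dy)**2); its maxima trace source edges."""
    gy, gx = np.gradient(field, dy, dx)  # derivatives along rows, columns
    return np.hypot(gx, gy)

# synthetic anomaly: a step between columns 2 and 3 (illustrative only)
x = np.arange(6)
field = np.tile((x >= 3).astype(float) * 10.0, (5, 1))
thg = total_horizontal_gradient(field)
print(thg[2].argmax())  # 2: the gradient peaks at the step edge
```

The other filters listed (Tilt Angle, Theta Map, etc.) are ratios built from the same horizontal and vertical derivatives.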
Flow-production systems whose units are connected in a row, such as electricity plants, cement plants, and water desalination plants, may lack fixed maintenance scheduling procedures because problems occur at different times. Contemporary software and artificial intelligence (AI) technologies are used to fulfil the research objectives by developing a predictive maintenance program. The data of the fifth thermal unit of the power station for the electricity of Al Dora/Baghdad are used in this study. The research was conducted in three stages. First, missing data without temporal sequences were processed: the data were filled in as an hour-by-hour time series, with the times recorded as system working hours, making the volume of the data relatively …
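The hour-by-hour filling step described above can be sketched as a simple forward fill over an hourly grid; the study's exact imputation rule is not specified here, and the sensor readings below are hypothetical:

```python
from datetime import datetime, timedelta

def fill_hourly(readings):
    """Reindex sparse timestamped readings onto an hour-by-hour grid,
    carrying the last observed value forward into the missing hours."""
    times = sorted(readings)
    t, end = times[0], times[-1]
    filled, last = {}, None
    while t <= end:
        last = readings.get(t, last)  # keep previous value if hour is missing
        filled[t] = last
        t += timedelta(hours=1)
    return filled

# hypothetical thermal-unit readings with a two-hour gap
readings = {
    datetime(2024, 1, 1, 0): 31.5,
    datetime(2024, 1, 1, 3): 33.0,   # hours 1 and 2 are missing
}
grid = fill_hourly(readings)
print(grid[datetime(2024, 1, 1, 2)])  # 31.5 (carried forward)
```

Interpolation between neighbours would be an equally plausible rule; forward fill is shown only because it is the simplest hour-by-hour scheme.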
(Use of game theory models in determining the policies that maximize profits for Pepsi-Cola and Coca-Cola in the province of Baghdad)
Owing to the importance of game theory, especially theories of oligopoly, in studying the reality of competition among companies, governments, and others, the researcher linked theories of oligopoly to econometrics so as to cover all the policies used by companies, since these theories were previously based on price and quantity only. The researcher applied these theories to data taken from Pepsi-Cola and Coca-Cola in Baghdad. Solution steps were stated for the proposed models, and the solutions were found to be equilibrium points for the two companies according to the principle …
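As a minimal sketch of the equilibrium ("balance point") computation the abstract alludes to, a pure-strategy equilibrium of a two-firm game can be found by checking unilateral deviations in a payoff matrix; the payoff values below are hypothetical, not the companies' data:

```python
def pure_nash(A, B):
    """Pure-strategy equilibria of a bimatrix game: cells (i, j) where
    neither firm gains by deviating unilaterally (A pays firm 1, B firm 2)."""
    eq = []
    for i in range(len(A)):
        for j in range(len(A[0])):
            if (A[i][j] == max(A[k][j] for k in range(len(A))) and
                    B[i][j] == max(B[i][l] for l in range(len(A[0])))):
                eq.append((i, j))
    return eq

# hypothetical price-war payoffs: strategy 0 = keep price, 1 = cut price
A = [[10, 2], [12, 4]]   # firm 1's profits
B = [[10, 12], [2, 4]]   # firm 2's profits
print(pure_nash(A, B))   # [(1, 1)]: both firms cutting price is the equilibrium
```

Oligopoly models such as Cournot or Bertrand refine this idea with continuous quantity or price strategies.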
Alpha shape theory for 3D visualization and volumetric measurement of brain tumor progression using magnetic resonance images
The artificial fish swarm algorithm (AFSA) is one of the important swarm intelligence algorithms. In this paper, the authors enhance AFSA with diversity operators (AFSA-DO). The diversity operators produce more diverse solutions for AFSA, leading to more reasonable results. AFSA-DO has been used to solve flexible job shop scheduling problems (FJSSP). The FJSSP is a significant problem in the domain of optimization and operations research, and several research papers have dealt with methods of solving it, including swarm intelligence approaches. In this paper, a set of FJSSP target samples is tested using the improved algorithm to confirm its effectiveness and evaluate its ex…
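One generic form of a diversity operator, assuming a real-valued encoding (the paper's exact operator for FJSSP is not specified here), re-randomizes part of the swarm when its spread around the centroid collapses:

```python
import random

def diversity(swarm):
    """Mean distance of the fish from the swarm centroid: a common
    diversity measure (the paper's own measure may differ)."""
    dim = len(swarm[0])
    centroid = [sum(f[i] for f in swarm) / len(swarm) for i in range(dim)]
    return sum(sum((f[i] - centroid[i]) ** 2 for i in range(dim)) ** 0.5
               for f in swarm) / len(swarm)

def apply_diversity_operator(swarm, bounds, threshold=0.1, frac=0.3):
    """If the swarm has collapsed, re-randomize a fraction of the fish
    uniformly within the search bounds."""
    if diversity(swarm) < threshold:
        for k in random.sample(range(len(swarm)), int(frac * len(swarm))):
            swarm[k] = [random.uniform(lo, hi) for lo, hi in bounds]
    return swarm

random.seed(0)
bounds = [(0.0, 10.0)] * 2
collapsed = [[5.0, 5.0] for _ in range(10)]      # zero diversity
refreshed = apply_diversity_operator(collapsed, bounds)
print(diversity(refreshed) > 0)  # True: diversity was reinjected
```

For FJSSP the same idea would act on operation-sequence encodings, e.g. by reshuffling part of a schedule rather than redrawing real-valued positions.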
The aim of this paper is to present a method for solving third-order ordinary differential equations with two-point boundary conditions. We propose two-point osculatory interpolation to construct a polynomial solution. The original problem is treated using two-point osculatory interpolation that fits equal numbers of derivatives at the end points of the interval [0, 1]. Many examples are presented to demonstrate the applicability, accuracy, and efficiency of the method in comparison with a conventional method.
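A minimal sketch of two-point osculatory (Hermite) interpolation on [0, 1], matching the value and first derivative at each endpoint; the paper's method for third-order ODEs would fit higher derivatives as well, and the sin example below is illustrative only:

```python
import math

def hermite_two_point(y0, d0, y1, d1):
    """Cubic osculatory (two-point Hermite) interpolant on [0, 1] that
    matches the value and first derivative at both endpoints."""
    def p(x):
        # standard cubic Hermite basis on [0, 1]
        h00 = 2 * x**3 - 3 * x**2 + 1
        h10 = x**3 - 2 * x**2 + x
        h01 = -2 * x**3 + 3 * x**2
        h11 = x**3 - x**2
        return h00 * y0 + h10 * d0 + h01 * y1 + h11 * d1
    return p

# interpolate sin on [0, 1] from endpoint values and derivatives
p = hermite_two_point(math.sin(0), math.cos(0), math.sin(1), math.cos(1))
print(abs(p(0.5) - math.sin(0.5)) < 1e-2)  # True: close at the midpoint
```

Fitting second derivatives as well raises the polynomial degree to five and tightens the fit, which is the direction the osculatory approach takes for higher-order problems.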