Finding orthogonal matrices of different sizes is a complex and important problem, because such matrices are used in many applications, such as image processing and communications (e.g., CDMA and OFDM). In this paper we introduce a new method for constructing orthogonal matrices by taking tensor products of two or more orthogonal matrices with real or imaginary entries, and we apply it to image and communication-signal processing. The resulting matrices are also orthogonal, and processing with the new method is much simpler than with classical methods that rely on basic proofs. The results for communication signals and images are sound and acceptable, although further research is needed.
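The key property behind this construction can be checked directly: the tensor (Kronecker) product of two orthogonal matrices is again orthogonal, since (A ⊗ H)ᵀ(A ⊗ H) = (AᵀA) ⊗ (HᵀH) = I ⊗ I = I. A minimal sketch with NumPy (the specific matrices here are illustrative, not taken from the paper):

```python
import numpy as np

# Two small orthogonal matrices: a 2x2 rotation and a normalized Hadamard matrix.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)

# Their Kronecker product is a 4x4 matrix that is again orthogonal.
K = np.kron(A, H)

print(np.allclose(K.T @ K, np.eye(4)))  # True
```

Repeating the product yields orthogonal matrices of size 2ⁿ × 2ⁿ (or any product of the factor sizes) without a separate orthogonality proof at each step.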
The present work involved four steps. The first step included the reaction of acrylamide, N,N′-methylene-bis(acrylamide), and N-tert-butylacrylamide with poly(acryloyl chloride) in the presence of triethylamine (Et3N) as a catalyst. The second step included homopolymerization of all products of the first step using benzoyl peroxide (BPO) as an initiator at 80–90 °C under nitrogen gas (N2). In the third step, the polyacrylimide prepared in the second step was converted into its potassium salt using an alcoholic potassium hydroxide solution. The fourth step included alkylation of the polymeric salts prepared in the third step by reacting them with different alkyl halides (benzyl chloride, allyl bromide, methyl iodide) in DMF as a solvent for 10–12 hours.
Error control schemes became a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error-correcting coding schemes that achieve high error-correction capability with the simplest possible design, in order to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, the authors showed that the proposed scheme can correct 11 random errors, which is considered a high
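As background for the HPC baseline mentioned above, the building block of a Hamming product code is an ordinary Hamming code applied along rows and columns. A minimal sketch of the Hamming(7,4) component code (illustrative only; it does not reproduce MECCRLB or the full product-code construction):

```python
import numpy as np

# Generator and parity-check matrices of the Hamming(7,4) code,
# the per-row/per-column building block of a Hamming product code.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(u):
    # 4 data bits -> 7-bit codeword (arithmetic over GF(2))
    return (u @ G) % 2

def correct(r):
    # The syndrome equals the column of H at the error position.
    s = (H @ r) % 2
    if s.any():
        pos = np.where((H.T == s).all(axis=1))[0][0]
        r = r.copy()
        r[pos] ^= 1
    return r

u = np.array([1, 0, 1, 1])
c = encode(u)
r = c.copy()
r[2] ^= 1                          # inject a single-bit link error
print((correct(r) == c).all())     # True
```

A product code arranges data in a grid and encodes every row and column this way, which is what gives HPC its multi-bit correction capability at the cost of the area and power that MECCRLB aims to reduce.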
This research includes a structural interpretation of the Yamama Formation (Lower Cretaceous) and the Naokelekan Formation (Jurassic) using 2D seismic reflection data from the Tuba oil field region, Basrah, southern Iraq. The two reflectors (Yamama and Naokelekan) were defined and picked as peak and trough in the 2D seismic reflection interpretation process, based on the synthetic seismogram and well-log data. To obtain the structural setting, these horizons were followed over the whole region. Two-way travel-time maps, depth maps, and velocity maps were produced for the top Yamama and top Naokelekan formations. The study concluded that certain longitudinal closures reflect anticlines in the east and west of the study area.
A band-ratioing method is applied to calculate the salinity index (SI) and the Normalized Multi-band Drought Index (NMDI) as a pre-processing step for making agricultural decisions in these areas. To separate the land from the other features in the scene, a classical classification method (maximum likelihood classification) is used, classifying the study area into multiple classes: healthy vegetation (HV), grasslands (GL), water (W), urban (U), and bare soil (BS). A Landsat 8 satellite image of an area in the south of Iraq is used, where the land cover is classified according to the indicator ranges of the SI and NMDI.
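The two indices are simple per-pixel band ratios. A sketch of one common formulation of each, assuming Landsat 8 surface-reflectance bands (NIR = band 5, SWIR1 = band 6, SWIR2 = band 7, blue = band 2, red = band 4); the paper does not state which SI variant it uses, and the reflectance values below are synthetic:

```python
import numpy as np

def nmdi(nir, swir1, swir2):
    """Normalized Multi-band Drought Index:
    NMDI = (NIR - (SWIR1 - SWIR2)) / (NIR + (SWIR1 - SWIR2))."""
    diff = swir1 - swir2
    return (nir - diff) / (nir + diff)

def salinity_index(blue, red):
    """One common salinity-index formulation: SI = sqrt(Blue * Red).
    Other SI variants exist in the literature."""
    return np.sqrt(blue * red)

# Synthetic reflectances standing in for two Landsat 8 pixels
nir   = np.array([0.40, 0.35])
swir1 = np.array([0.25, 0.30])
swir2 = np.array([0.20, 0.22])
blue  = np.array([0.10, 0.12])
red   = np.array([0.20, 0.25])

print(nmdi(nir, swir1, swir2))
print(salinity_index(blue, red))
```

Thresholding these per-pixel index values against the indicator ranges mentioned above is what assigns each pixel to a land-cover class.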
The purpose of this paper is to model and forecast white oil over the period 2012–2019 using GARCH-class volatility models. After showing that the squared returns of white oil exhibit significant long memory in volatility, fractional GARCH models are estimated on the return series and the mean and volatility are forecast by quasi-maximum likelihood (QML) as a traditional method, while the competing approach uses machine learning, namely Support Vector Regression (SVR). The best model for forecasting volatility is selected from among many candidates according to the lowest values of the Akaike and Schwarz information criteria, with the additional requirement that the parameters be significant. In addition, the residuals
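For orientation, the core of any GARCH-class model is a conditional-variance recursion; the simplest member, GARCH(1,1), can be sketched as follows. The parameter values and the return series below are illustrative only (in practice they are estimated by QML, and the paper uses fractional, long-memory variants rather than this plain form):

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional-variance recursion of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = np.var(returns)   # a common initialization choice
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, size=500)   # synthetic return series
s2 = garch11_variance(r, omega=1e-6, alpha=0.08, beta=0.90)
print(s2[:3])
```

Fractional (long-memory) GARCH models replace this short-memory recursion with one whose lag weights decay hyperbolically, which is what captures the long memory detected in the squared returns.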
This research dealt with the cladistic taxonomy of five species belonging to the genera Rumex L. and Polygonum L. of the family Polygonaceae in Iraq, using Mesquite software v2.75. The results strongly support delimiting the species P. aviculare L. and P. lapathifolia L. as suggested in published floras, while R. dentatus L. is placed in a single group and R. vesicarius L. and R. conglomeratus Murray are included in the same group. This study also covered the shape, dimensions, color, and ornamentation of the seeds and fruits; seed shapes ranged from lenticular to trigonous. In terms of size, the seeds of R. vesicarius recorded the highest range (4.0–4.5) mm in length w
This paper describes the use of a microcomputer as a laboratory instrument system. The system focuses on measuring three weather variables: temperature, wind speed, and wind direction. The instrument is a type of data-acquisition system; this paper deals with the design and implementation of a data-acquisition system based on a personal computer (Pentium) using the Industry Standard Architecture (ISA) bus. The design of this system involves mainly a hardware implementation, together with the software programs used for testing, measurement, and control. The system can display the required information transferred and processed from the external field to the system. A Visual Basic language with Microsoft Foundation cl
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called the learning data, is used to find the decision rule sets that are then used in solving the incomplete-data problem. The intelligent swarm algorithm is used for feature selection: the bees algorithm serves as the heuristic search algorithm, combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm instead of an intelligent one. A comparison between the two approaches is made in their performance for null-value estimation.
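The rough-set evaluation function that such a search typically maximizes is the dependency degree γ_B(D): the fraction of objects whose equivalence class under a candidate feature subset B is consistent with a single decision value. A sketch of that measure on a toy decision table (the table and indices are illustrative, not data from the paper):

```python
from collections import defaultdict

def dependency_degree(rows, feature_idx, decision_idx):
    """Rough-set dependency degree gamma_B(D): the fraction of objects in the
    positive region, i.e. whose B-indiscernibility class maps to exactly one
    decision value. Candidate feature subsets proposed by a search such as the
    bees algorithm can be scored with this function."""
    classes = defaultdict(set)
    for row in rows:
        key = tuple(row[i] for i in feature_idx)
        classes[key].add(row[decision_idx])
    pos = sum(1 for row in rows
              if len(classes[tuple(row[i] for i in feature_idx)]) == 1)
    return pos / len(rows)

# Toy decision table: columns (a, b, decision)
table = [(0, 0, 'n'), (0, 1, 'y'), (1, 0, 'y'), (1, 0, 'y'), (0, 0, 'n')]
print(dependency_degree(table, [0, 1], 2))  # 1.0: {a, b} fully determines d
print(dependency_degree(table, [0], 2))     # 0.4: feature a alone is inconsistent
```

A feature subset with γ_B(D) = 1 preserves all the decision information of the full attribute set, which is the usual stopping criterion for rough-set feature selection.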