Array antennas play an important role in radio astronomy: they allow astronomers to obtain high-resolution observations with high sensitivity to weak signals. This paper estimates the positions of meteors entering the Earth's atmosphere and develops a simulation of an array-antenna radar to analyze the meteor echoes. The echoes were processed with GNU Radio, a free, open-source software development toolkit that provides signal processing blocks for building radio projects. The simulation then determines the azimuth and elevation of the meteors. An improved Multiple Signal Classification (MUSIC) algorithm is proposed to analyze these echoes. The detected power of each meteor echo carries a Doppler frequency shift caused by the high speed of the meteors, which affects the accuracy of Direction of Arrival (DOA) estimation. The Doppler shift was taken into account in this simulation, and the results showed that the proposed method has low complexity and high resolution and can estimate the meteors' positions with minimal error.
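To make the subspace idea concrete, the sketch below estimates the direction of arrival of a single simulated source with the classical MUSIC pseudospectrum on a uniform linear array; it is a minimal illustration, not the paper's improved algorithm or its Doppler handling, and the element count, spacing, noise level and test angle are assumed values.

```python
# Minimal sketch of classical MUSIC DOA estimation on a uniform linear array.
# Array geometry, element count and the simulated source angle are assumptions.
import numpy as np

M = 8                 # number of array elements (assumed)
d = 0.5               # element spacing in wavelengths (assumed)
snapshots = 200
true_angle = 40.0     # degrees (assumed test source)

def steering(theta_deg):
    """ULA steering vector for a plane wave arriving from theta_deg."""
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

# Simulate one narrowband source plus noise.
rng = np.random.default_rng(0)
s = rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)
noise = 0.1 * (rng.standard_normal((M, snapshots))
               + 1j * rng.standard_normal((M, snapshots)))
X = np.outer(steering(true_angle), s) + noise

# Sample covariance and its eigendecomposition (eigenvalues ascending).
R = X @ X.conj().T / snapshots
eigvals, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, :-1]          # noise subspace, assuming a single source

# MUSIC pseudospectrum: peaks indicate directions of arrival.
angles = np.arange(-90, 90, 0.5)
spectrum = [1.0 / np.abs(steering(a).conj() @ En @ En.conj().T @ steering(a))
            for a in angles]
print("estimated DOA:", angles[int(np.argmax(spectrum))], "deg")
```

The pseudospectrum peaks where the steering vector is closest to orthogonal to the noise subspace, which is the same property the improved variant exploits.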
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results showed that the local bandwidth is the best for all types of boundary kernel functions.
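As a rough illustration of the kind of estimator being compared, the sketch below smooths the Nelson-Aalen increments of a right-censored sample with an Epanechnikov kernel and a single global bandwidth; the sample, the censoring pattern and the bandwidth are assumptions, and the local-bandwidth and boundary-corrected variants studied here are not implemented.

```python
# Minimal sketch of a kernel-smoothed hazard estimate for right-censored data
# with an Epanechnikov kernel and one global bandwidth. All inputs are
# illustrative assumptions.
import numpy as np

times = np.array([2.0, 3.5, 4.0, 5.5, 7.0, 8.0, 9.5, 11.0])   # observed times
events = np.array([1,   1,   0,   1,   1,   0,   1,   1])     # 1 = event, 0 = censored
b = 2.0                                                        # global bandwidth (assumed)

def epanechnikov(u):
    """Epanechnikov kernel, supported on [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def hazard(t):
    """Smooth the Nelson-Aalen jump sizes d_i / n_i with the kernel."""
    order = np.argsort(times)
    t_sorted, d_sorted = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)              # subjects still at risk (no ties assumed)
    increments = d_sorted / at_risk         # Nelson-Aalen jump sizes
    return np.sum(epanechnikov((t - t_sorted) / b) * increments) / b

grid = np.linspace(1, 12, 6)
print([round(hazard(t), 4) for t in grid])
```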
Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CAs), the order of the sequence matters in the test case generation process. This paper reviews the state of the art of SCA strategies. Earlier works reported that finding a minimal-size test suite is an NP-hard problem. In addition, most existing strategies for SCA generation have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is a challenging process; in addition, this reduction facilitates support for higher strengths of coverage.
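For readers unfamiliar with the notion, the sketch below shows what strength-3 sequence coverage means by counting how many of the required 3-event subsequences a small candidate suite of orderings covers; the event set and the suite are illustrative assumptions, not the output of any reviewed strategy.

```python
# Minimal sketch of strength-3 sequence coverage: count how many of the
# required 3-event subsequences a candidate test suite covers. The event set
# and the suite are illustrative assumptions.
from itertools import combinations, permutations

events = ["a", "b", "c", "d"]
t = 3  # coverage strength

# Candidate test suite: each test is one ordering of the events.
suite = [
    ("a", "b", "c", "d"),
    ("d", "c", "b", "a"),
    ("b", "d", "a", "c"),
    ("c", "a", "d", "b"),
]

def subsequences(test, t):
    """All length-t subsequences that preserve the order within the test."""
    return {tuple(test[i] for i in idx)
            for idx in combinations(range(len(test)), t)}

required = set(permutations(events, t))
covered = set().union(*(subsequences(test, t) for test in suite))

print(f"covered {len(covered)} of {len(required)} required 3-sequences")
print("still missing:", sorted(required - covered))
```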
In this study, we briefly review the ARIMA(p, d, q), EWMA and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
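As one concrete piece of this toolbox, the sketch below runs the scalar Kalman filter recursion of a local-level DLM over generated AR(1) data and reports the one-step-ahead MSE; the model variances and the generating process are assumptions chosen only for illustration.

```python
# Minimal sketch of the Kalman filter recursion for a scalar local-level DLM,
# run over generated autocorrelated (AR(1)) data. The noise variances and the
# AR coefficient are illustrative assumptions, not values used in the study.
import numpy as np

rng = np.random.default_rng(1)

# Generate autocorrelated observations from an AR(1) process.
n, phi = 200, 0.8
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Local-level model: state_t = state_{t-1} + w_t,  y_t = state_t + v_t.
q, r = 0.1, 1.0        # state / observation noise variances (assumed)
m, p = 0.0, 1.0        # prior mean and variance of the state
preds = np.zeros(n)
for t in range(n):
    p_pred = p + q                 # predict the state variance
    preds[t] = m                   # one-step-ahead prediction of y_t
    k = p_pred / (p_pred + r)      # Kalman gain
    m = m + k * (y[t] - m)         # update the state mean
    p = (1 - k) * p_pred           # update the state variance

print("one-step-ahead MSE:", round(float(np.mean((y - preds) ** 2)), 3))
```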
In recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of the deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregate Variance (AV) and Absolute Moments (AM), while others depend on a filtration technique, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL) and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was carried out by a simulation study to find the most efficient method through MASE. The results of the simulation experiments showed that the performance of the method…
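The sketch below illustrates one of the nine estimators, the Rescaled Range (R/S) method: the slope of log E[R/S(n)] against log n over a range of block sizes n estimates H; the test series is white noise (so the estimate should be near 0.5) and the block sizes are assumptions.

```python
# Minimal sketch of the Rescaled Range (R/S) estimator of the Hurst parameter.
# The test series and block sizes are illustrative assumptions.
import numpy as np

def rs_statistic(x):
    """R/S statistic of one block: range of the cumulative deviations from
    the block mean, divided by the block's standard deviation."""
    y = np.cumsum(x - x.mean())
    return (y.max() - y.min()) / x.std()

def hurst_rs(x, block_sizes):
    """The slope of log E[R/S(n)] versus log n is the Hurst estimate."""
    logs_n, logs_rs = [], []
    for n in block_sizes:
        blocks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs = np.mean([rs_statistic(b) for b in blocks])
        logs_n.append(np.log(n))
        logs_rs.append(np.log(rs))
    slope, _ = np.polyfit(logs_n, logs_rs, 1)
    return slope

rng = np.random.default_rng(2)
x = rng.standard_normal(4096)          # white noise, so H should be near 0.5
print("estimated H:", round(hurst_rs(x, [16, 32, 64, 128, 256, 512]), 3))
```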
An experiment was carried out in the fields of the College of Agriculture, Baghdad University, during the spring and autumn of 2015, using a randomized complete block design with three replications. In the first season, hybridization was established among three pure cultivars of cowpea (Vigna unguiculata L.), namely Ramshorn, California Blackeye and Rahawya, in full diallel crosses according to Griffing's first method with a fixed model (3 parents + 3 diallel hybrids + 3 reciprocal hybrids), and a comparison experiment was conducted in the autumn season. The statistical analysis showed significant differences among the parents and their hybrids for all the studied characters. Parent 1 was the highest for root nodule number, leaf number, pod…
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements of the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in stop-band attenuation.
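As a rough illustration of the coefficient-decimation idea that ICDM builds on, the sketch below sets all but every D-th tap of a windowed-sinc low-pass prototype to zero, which turns the single passband into a multi-band response with images at multiples of fs/D; the prototype design, tap count and decimation factor are assumptions, and the full ICDM/CIM combination of the HCDIM architecture is not reproduced here.

```python
# Minimal sketch of coefficient decimation: zeroing all but every D-th tap of
# a low-pass prototype yields a multi-band response with images at multiples
# of fs/D. Prototype design and D are illustrative assumptions.
import numpy as np

def lowpass_prototype(num_taps=128, cutoff=0.05):
    """Windowed-sinc low-pass FIR prototype (cutoff as a fraction of fs)."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    return 2 * cutoff * np.sinc(2 * cutoff * n) * np.hamming(num_taps)

def coefficient_decimation(h, D):
    """Keep every D-th coefficient and set the rest to zero."""
    h_new = np.zeros_like(h)
    h_new[::D] = h[::D]
    return h_new

def count_passbands(h, threshold=0.5):
    """Count separated regions where |H(f)| exceeds threshold * max."""
    mag = np.abs(np.fft.rfft(h, 4096))
    above = mag > threshold * mag.max()
    return int(above[0]) + int(np.sum(np.diff(above.astype(int)) == 1))

h = lowpass_prototype()
D = 4
h_dec = coefficient_decimation(h, D)
print("prototype passbands:     ", count_passbands(h))
print("after decimation by", D, ":", count_passbands(h_dec))
```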
The estimation of the initial oil in place is a crucial topic during the exploration, appraisal and development of a reservoir. In the current work, two conventional methods were used to determine the initial oil in place: a volumetric method and a reservoir simulation method. Each method requires its own data: the volumetric method depends on geological, core, well-log and petrophysical property data, while the reservoir simulation method also needs capillary pressure versus water saturation, fluid production and static pressure data for all active wells in the Mishrif reservoir. The petrophysical properties of the studied reservoir were calculated using a neural network technique.
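For reference, the volumetric method mentioned above reduces, in its simplest tank-type form, to OOIP = 7758 · A · h · φ · (1 − Sw) / Boi in field units; the sketch below evaluates it with made-up inputs that are not properties of the Mishrif reservoir.

```python
# Minimal sketch of the tank-type volumetric OOIP formula in field units
# (area in acres, thickness in ft, result in stock-tank barrels).
# All input values are illustrative assumptions.
def volumetric_ooip(area_acres, thickness_ft, porosity, sw, boi):
    """Original oil in place: 7758 * A * h * phi * (1 - Sw) / Boi."""
    return 7758.0 * area_acres * thickness_ft * porosity * (1.0 - sw) / boi

ooip = volumetric_ooip(area_acres=5000, thickness_ft=80, porosity=0.18,
                       sw=0.25, boi=1.3)
print(f"OOIP ≈ {ooip / 1e6:.1f} million STB")
```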
This research deals with an unusual approach to analyzing simple linear regression via linear programming with the two-phase method known in Operations Research (OR). The estimates here are found by solving an optimization problem after adding artificial variables Ri. Another method for analyzing simple linear regression is also introduced in this research, in which the conditional median of y is considered by minimizing the sum of absolute residuals, instead of finding the conditional mean of y, which depends on minimizing the sum of squared residuals; this is called median regression. Also, an iteratively reweighted least squares procedure, with weights based on the absolute residuals, is performed here as another method to…
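To show the linear-programming formulation this abstract refers to, the sketch below poses least-absolute-deviations (median) regression as an LP, minimizing the sum of the positive and negative residual parts subject to the fitting equations; the data are made up, and scipy's general-purpose linprog solver stands in for the two-phase simplex procedure, so this illustrates the formulation rather than the paper's exact solution steps.

```python
# Minimal sketch of median (least-absolute-deviations) regression as an LP:
# minimize sum(u + v) subject to a + b*x_i + u_i - v_i = y_i, u, v >= 0.
# Data are illustrative; linprog replaces a hand-rolled two-phase simplex.
import numpy as np
from scipy.optimize import linprog

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 30.0])   # last point is an outlier

n = len(x)
# Decision variables: [a, b, u_1..u_n, v_1..v_n], with residual_i = u_i - v_i.
c = np.concatenate([[0.0, 0.0], np.ones(2 * n)])            # minimize sum(u) + sum(v)
A_eq = np.hstack([np.ones((n, 1)), x[:, None], np.eye(n), -np.eye(n)])
b_eq = y
bounds = [(None, None), (None, None)] + [(0, None)] * (2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
a_hat, b_hat = res.x[:2]
print(f"median regression line: y = {a_hat:.2f} + {b_hat:.2f} x")
```

Unlike ordinary least squares, the fitted line is largely unaffected by the single outlying observation, which is the practical motivation for the median-regression formulation.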