In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, the Jeffreys prior and the extension of the Jeffreys prior, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been developed to gain insight into performance on small, moderate, and large samples, and the performance of these estimators has been explored numerically under different conditions. The efficiency of the estimators was compared in terms of the mean squared error (MSE). The comparison by MSE shows that the efficiency of the Bayes estimators of the shape parameter of the Maxwell distribution decreases as the Jeffreys prior constants increase. The results also show that the Bayes estimates are close to the maximum likelihood estimate when the Jeffreys prior constants are small, and are identical in certain cases. Comparison with respect to the loss functions shows that the Bayes estimators under the modified squared error loss function have greater MSE than those under the squared error loss function, especially as r increases.
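The comparison described above can be sketched in a short simulation. The parameterization f(x; θ) = √(2/π) x² θ⁻³ exp(−x²/(2θ²)) and the prior form π(θ) ∝ θ^(−2c) are assumptions for illustration, since the paper's exact parameterization is not reproduced here; under them the MLE is θ̂ = √(Σx²/3n) and the posterior mean under squared error loss follows from a standard gamma-integral calculation:

```python
import math
import numpy as np

def maxwell_sample(theta, n, rng):
    # The speed of a 3-D isotropic Gaussian velocity with std theta
    # follows a Maxwell distribution with scale parameter theta.
    return np.linalg.norm(rng.normal(0.0, theta, size=(n, 3)), axis=1)

def theta_mle(x):
    # MLE under the assumed density f(x; theta) = sqrt(2/pi) x^2 theta^-3 exp(-x^2/(2 theta^2)):
    # theta_hat = sqrt(sum(x_i^2) / (3 n))
    return math.sqrt(float(np.sum(x ** 2)) / (3 * len(x)))

def theta_bayes(x, c=0.5):
    # Posterior mean under the extended Jeffreys prior pi(theta) ~ theta^(-2c)
    # and squared error loss (c = 0.5 recovers the Jeffreys prior 1/theta):
    #   E[theta | x] = sqrt(T/2) * Gamma((3n + 2c - 2)/2) / Gamma((3n + 2c - 1)/2),  T = sum x_i^2
    # lgamma avoids overflow of the large gamma arguments.
    n, T = len(x), float(np.sum(x ** 2))
    return math.sqrt(T / 2.0) * math.exp(
        math.lgamma((3 * n + 2 * c - 2) / 2.0) - math.lgamma((3 * n + 2 * c - 1) / 2.0)
    )

rng = np.random.default_rng(0)
theta, n, reps = 1.0, 50, 1000
samples = [maxwell_sample(theta, n, rng) for _ in range(reps)]
mse_mle = float(np.mean([(theta_mle(x) - theta) ** 2 for x in samples]))
mse_bayes = float(np.mean([(theta_bayes(x) - theta) ** 2 for x in samples]))
```

Repeating this for several values of c and n reproduces the kind of MSE table the simulation study reports.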
The introduction of concrete damage plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. However, research into the accuracy of this method for analyzing complex concrete forms has been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. Damage parameters, such as compressive and tensile damage, can be defined to accurately simulate concrete behavior in a damage-plasticity model. This research aims to propose an analytical model for assessing concrete compressive damage based on stiffness deterioration. The prop
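The abstract does not give the proposed model's equations, but the basic stiffness-deterioration definition of a scalar compressive damage parameter can be sketched as follows; the relation d_c = 1 − E_d/E_0 is the standard continuum-damage form, not necessarily the paper's final analytical model:

```python
def compressive_damage(E0, Ed):
    """Scalar compressive damage from stiffness deterioration: d_c = 1 - E_d / E_0,
    where E0 is the initial (undamaged) elastic modulus and Ed the degraded
    unloading modulus. d_c = 0 means undamaged; d_c -> 1 means fully damaged."""
    if not (0.0 < Ed <= E0):
        raise ValueError("expected 0 < Ed <= E0")
    return 1.0 - Ed / E0
```

For example, a specimen whose unloading modulus has dropped from 30 GPa to 21 GPa carries a compressive damage of 0.3.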
In low-latitude areas (latitude angle less than 10°), the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of water that condenses and then falls back toward the solar-still basin also increases in this case. Consequently, the solar-still yield is significantly decreased, and the accuracy of the prediction method is affected. This reduction in yield and prediction accuracy is inversely proportional to the time the condensed water stays on the inner side of the condensing cover without collection, because more drops fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is
In regression testing, test case prioritization (TCP) is a technique for ordering the available test cases. TCP techniques can improve fault-detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that uses past execution data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To handle it in regression testing, most researchers resort to random ordering of the tied test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
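The APFD metric mentioned above has a standard closed form, which can be sketched directly; the convention of charging an undetected fault position n + 1 is an assumption, since conventions vary:

```python
def apfd(order, faults_by_test):
    """Average Percentage of Faults Detected for one test ordering.

    order:          list of test ids in execution order (n tests).
    faults_by_test: dict test id -> set of fault ids it detects (m faults total).
    APFD = 1 - (TF_1 + ... + TF_m) / (n * m) + 1 / (2 n),
    where TF_i is the 1-based position of the first test revealing fault i.
    """
    n = len(order)
    all_faults = set().union(*faults_by_test.values())
    m = len(all_faults)
    first_pos = {}
    for pos, t in enumerate(order, start=1):
        for f in faults_by_test.get(t, ()):
            first_pos.setdefault(f, pos)
    # Hedged convention: a fault never detected counts as position n + 1.
    tf_sum = sum(first_pos.get(f, n + 1) for f in all_faults)
    return 1.0 - tf_sum / (n * m) + 1.0 / (2 * n)
```

An ordering that reveals faults earlier scores higher; ties in a history-based priority score are exactly the places where the random tie-breaking discussed above changes TF_i and hence APFD.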
In recent years, there has been expanding development in the vehicular sector and in the number of vehicles moving on the roads in all parts of the country. Vehicle number plate identification based on image processing is an active area of this work; the technique is used for security purposes such as tracking stolen cars and controlling access to restricted areas. The License Plate Recognition System (LPRS) uses a digital camera to capture vehicle plate images, which serve as input to the proposed recognition system. Basically, the developed system consists of three phases: vehicle license plate localization, character segmentation, and character recognition. License Plate (LP) detection is presented using the Canny edge detection algo
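The edge-detection step that drives the plate-localization phase can be illustrated with a minimal gradient-magnitude detector; this NumPy Sobel sketch is a simplified stand-in for the full Canny pipeline (which adds smoothing, non-maximum suppression, and hysteresis), and the threshold value is an arbitrary choice:

```python
import numpy as np

def sobel_edges(gray, thresh=0.25):
    """Boolean edge map from normalized gradient magnitude.
    gray: 2-D float array in [0, 1]. A simplified stand-in for Canny:
    plate regions show up as dense clusters of strong vertical edges."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(gray, 1, mode="edge")
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    # Correlate with the two Sobel kernels via shifted slices (no SciPy needed).
    for i in range(3):
        for j in range(3):
            patch = pad[i:i + gray.shape[0], j:j + gray.shape[1]]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)
    return mag / mag.max() > thresh
```

In a real LPRS the resulting edge map would be followed by morphological filtering and contour analysis to isolate the plate rectangle before segmentation.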
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low signal-to-noise ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters. The proposed system can estimate the number of sources under low SNR.
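The baseline AIC detector that the QMF filter bank feeds can be sketched as the classical eigenvalue-based criterion of Wax and Kailath; the array geometry and signal model below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def aic_num_sources(X):
    """Estimate the number of sources from sensor snapshots X (p x N) with
    the eigenvalue-based AIC criterion (Wax-Kailath form):
        AIC(k) = -2 N (p - k) log(g_k / a_k) + 2 k (2p - k),
    where g_k and a_k are the geometric and arithmetic means of the p - k
    smallest eigenvalues of the sample covariance matrix."""
    p, N = X.shape
    R = X @ X.conj().T / N
    ev = np.sort(np.linalg.eigvalsh(R))[::-1]   # descending
    aic = []
    for k in range(p):
        tail = ev[k:]
        g = np.exp(np.mean(np.log(tail)))       # geometric mean
        a = np.mean(tail)                       # arithmetic mean
        aic.append(-2.0 * N * (p - k) * np.log(g / a) + 2.0 * k * (2 * p - k))
    return int(np.argmin(aic))
```

In the proposed system this estimator would be applied to each QMF subband output rather than to the raw sensor data, which is what improves robustness at low SNR.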
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to every individual a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including sever
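For reference, the baseline that DE-BEA modifies can be sketched as a minimal DE/current-to-rand/1 loop; the parameter values and the bound-clipping step are illustrative choices, and the BEA segment-wise cloning that DE-BEA adds is not reproduced here:

```python
import numpy as np

def de_current_to_rand(fobj, bounds, pop_size=30, F=0.7, K=0.5, gens=200, seed=0):
    """Minimal DE/current-to-rand/1 sketch (the baseline DE-BEA builds on):
        v_i = x_i + K (x_r1 - x_i) + F (x_r2 - x_r3)
    with greedy one-to-one selection and no binomial crossover."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    d = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, d))
    fit = np.array([fobj(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            v = pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])
            v = np.clip(v, lo, hi)           # keep the trial inside the box
            fv = fobj(v)
            if fv <= fit[i]:                 # greedy selection
                pop[i], fit[i] = v, fv
    b = int(np.argmin(fit))
    return pop[b], fit[b]
```

DE-BEA replaces the mutation line with a segment-based cloning scheme and adapts crossover and selection accordingly, which is where its extra diversification comes from.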
A substantial concern in exchanging confidential messages over the internet is transmitting the information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. Data hiding can be defined as the technique of embedding data in an image, audio, or video file in a style that meets the security requirements. Steganography is the branch of data-concealment science that aims to reach a desired security level in the interchange of private commercial and military data. This research offers a novel technique for steganography based on hiding data inside the clusters that result from fuzzy clustering. T
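The bit-level hiding step underlying such schemes can be sketched with least-significant-bit (LSB) embedding; this is a generic illustration, since the paper's contribution is the fuzzy-clustering selection of embedding regions, which is not reproduced here:

```python
import numpy as np

def embed_lsb(pixels, message):
    """Hide message bytes in the least significant bits of a uint8 pixel
    array, in flattened order. Each pixel value changes by at most 1, so
    the cover image is visually unchanged."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = pixels.flatten()                  # copy; the input is untouched
    if bits.size > flat.size:
        raise ValueError("cover image too small for the message")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(pixels.shape)

def extract_lsb(pixels, n_bytes):
    """Recover n_bytes of hidden data from the LSBs of the stego image."""
    bits = pixels.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()
```

In the cluster-based scheme described above, the pixels handed to the embedder would be only those belonging to clusters chosen by the fuzzy clustering step rather than the whole image.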