Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area of the same image. The proposed methodology begins by extracting the image's Local Binary Pattern (LBP) features. Two statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, multi-level LBP feature selection is applied to select the most relevant features. This process involves performing the LBP computation at multiple scales or levels, capturing textures at different resolutions. By considering features from multiple levels, the detection algorithm can better capture both the global and local characteristics of the manipulated regions, enhancing the accuracy of forgery detection. To achieve a high accuracy rate, this paper presents a variety of scenarios based on a machine-learning approach. In copy-move detection, artifacts and their properties are used as image features, and a Support Vector Machine (SVM) determines whether an image has been tampered with. The dataset is manipulated to train and test each classifier; the target is to learn the discriminative patterns that reveal instances of copy-move forgery. The Media Integration and Communication Center dataset (MICC-F2000) was utilized in this paper. Experimental evaluations demonstrate the effectiveness of the proposed methodology in detecting copy-move forgery. The implementation phases of the proposed work produced encouraging outcomes: in the best-implemented scenario, involving multiple trials, the detection stage achieved a copy-move detection accuracy of 97.8%.
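A minimal sketch of the kind of pipeline the abstract describes, assuming scikit-image and scikit-learn; the radii, the per-scale STD/ASM statistics, and the SVM settings are illustrative assumptions, not the paper's exact configuration:

```python
# Multi-scale LBP maps with per-scale STD and ASM statistics, fed to an SVM.
# Illustrative sketch only; radii and classifier settings are assumptions.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_stat_features(gray, radii=(1, 2, 3)):
    """Compute STD and ASM of the LBP code map at several scales."""
    feats = []
    for r in radii:
        n_points = 8 * r
        lbp = local_binary_pattern(gray, n_points, r, method="uniform")
        # Standard deviation of the LBP codes (spread of local texture codes).
        feats.append(lbp.std())
        # Angular Second Moment: sum of squared normalized histogram bins.
        hist, _ = np.histogram(lbp, bins=np.arange(n_points + 3))
        hist = hist / hist.sum()
        feats.append(np.sum(hist ** 2))
    return np.array(feats)

# Training: X holds one feature vector per image, y holds 0 (authentic) / 1 (forged).
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# predictions = clf.predict(X_test)
```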
This research aims to identify the mental image that displaced people have formed of aid organizations and to determine whether it is positive or negative. The researchers used a survey as a standard tool to study the community represented by displaced people living in Baghdad camps, including Shiites, Sunnis, Shabak, Turkmen, Christians, and Ezidis.
The research reached important results, the most significant being that the displaced people living in the camps covered by this survey hold a positive opinion of the organizations working to meet their demands, although they complain about the shortfall in health care.
The research also found that displaced people from the (Shabak, Turkmen, and Ezidi) minorities see that internati
Krawtchouk polynomials (KPs) and their moments are promising tools for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which accumulates during the computation of the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation to compute the coefficients of KPs at high orders. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the
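For context, a sketch of the classical three-term recurrence for Krawtchouk polynomials K_n(x; p, N) = 2F1(-n, -x; -N; 1/p), whose numerical behavior degrades exactly where the abstract says (large N, p far from 0.5); this is the standard relation, not the paper's new one:

```python
# Classical three-term recurrence for Krawtchouk polynomials; exhibits the
# instability the paper targets. This is NOT the paper's proposed algorithm.
import numpy as np

def krawtchouk_matrix(N, p):
    """Return K[n, x] = K_n(x; p, N) for n, x = 0..N via the recurrence
    p(N-n) K_{n+1} = (p(N-n) + n(1-p) - x) K_n - n(1-p) K_{n-1}."""
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((N + 1, N + 1))
    K[0] = 1.0
    K[1] = 1.0 - x / (p * N)
    for n in range(1, N):
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

# For p = 0.5 the values stay well scaled; try p = 0.1 with N >= 100 to see
# the coefficient blow-up that motivates more stable recurrence schemes.
```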
A novel median filter based on the crow optimization algorithm (OMF) is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that, first, the crow optimization algorithm detects noise pixels, then replaces them with an optimum median value according to a criterion of maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error are used to test the performance of the suggested filters (the original and the improved median filter) for removing noise from images. The simulation is carried out in MATLAB R2019b, and the resul
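A simplified sketch of the filter's two stages, detect noise pixels and replace them with a local median. The paper selects noisy pixels with a crow search optimizer; here a plain extreme-value test stands in for that step, so this is a baseline in Python rather than the OMF itself:

```python
# Baseline salt-and-pepper filter: flag extreme-valued pixels, replace each
# with its local median, and score the result with PSNR as in the paper.
import numpy as np

def median_sp_filter(img, win=3):
    """Replace suspected salt (255) / pepper (0) pixels with the median
    of their win x win neighborhood; clean pixels are left untouched."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = img.copy()
    noisy = (img == 0) | (img == 255)          # candidate noise pixels
    for i, j in zip(*np.nonzero(noisy)):
        window = padded[i:i + win, j:j + win]  # neighborhood centered at (i, j)
        out[i, j] = np.median(window)
    return out

def psnr(ref, test):
    """Peak signal-to-noise ratio for 8-bit images."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return 10 * np.log10(255 ** 2 / mse)
```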
In this article, the high accuracy and effectiveness of forecasting global gold prices are verified using a hybrid machine-learning algorithm that combines an Adaptive Neuro-Fuzzy Inference System (ANFIS) model with Particle Swarm Optimization (PSO) and the Gray Wolf Optimizer (GWO). The hybrid approach performed well enough to be a good strategy for practical use. The ARIMA-ANFIS hybrid methodology was used to forecast global gold prices: the ARIMA model is fitted to the real data, and its nonlinear residuals are then predicted by ANFIS, ANFIS-PSO, and ANFIS-GWO. The results indicate that the hybrid models improve on the accuracy of single ARIMA and ANFIS models in forecasting. Finally, a comparison was made between the hybrid foreca
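A sketch of the hybrid scheme's structure, linear ARIMA plus a nonlinear model of its residuals. ANFIS (and its PSO/GWO-tuned variants) has no standard Python implementation, so an MLP regressor stands in for the residual model here; the ARIMA order (1, 1, 1) and the lag count are assumptions:

```python
# Hybrid forecast = ARIMA prediction + nonlinear prediction of ARIMA residuals.
# MLP stands in for ANFIS; order and lags are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

def hybrid_forecast(prices, lags=4):
    arima = ARIMA(prices, order=(1, 1, 1)).fit()
    resid = arima.resid
    # Lagged residual features: predict e_t from e_{t-1} .. e_{t-lags}.
    X = np.column_stack([resid[i:len(resid) - lags + i] for i in range(lags)])
    y = resid[lags:]
    nonlinear = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
    # One-step-ahead forecast = linear ARIMA part + predicted residual.
    next_resid = nonlinear.predict(resid[-lags:].reshape(1, -1))[0]
    return arima.forecast(1)[0] + next_resid
```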
In this research, a new system identification algorithm is presented for obtaining an optimal set of mathematical models for a system with perturbed coefficients. The algorithm is then applied practically through an "On-Line System Identification Circuit", based on real-time speed-response data of a permanent-magnet DC motor. Such a set of mathematical models represents the physical plant against all variations that may exist in its parameters, and forms a strong mathematical foundation for stability and performance analysis in control theory problems.
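To illustrate the underlying idea only, a batch least-squares fit of a first-order discrete motor model from sampled speed data; the paper's on-line circuit and its optimal model-set selection are not reproduced here:

```python
# Estimate a first-order discrete model w[k+1] = a*w[k] + b*u[k] for a DC
# motor from input u and measured speed w. Re-fitting over records taken
# under different load/parameter conditions yields a set of models for
# robustness analysis, which is the spirit of the paper's approach.
import numpy as np

def identify_first_order(u, w):
    """Least-squares fit of w[k+1] = a*w[k] + b*u[k]; returns [a, b]."""
    A = np.column_stack([w[:-1], u[:-1]])          # regressor matrix
    theta, *_ = np.linalg.lstsq(A, w[1:], rcond=None)
    return theta
```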
An Optimal Algorithm for HTML Page Building Process
Speech is the essential means of interaction between humans or between human and machine. However, it is always contaminated by different types of environmental noise, so speech enhancement algorithms (SEAs) have emerged as a significant approach in speech processing to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed in the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian modeling of speech and noise based on the distribution of orthogonal transform (Discrete Krawtchouk-Tchebichef transform) coefficients. The Discrete Kra
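A sketch of the general idea of transform-domain gain-based enhancement. The paper uses the Discrete Krawtchouk-Tchebichef transform with Laplacian priors; no standard implementation of that transform exists in SciPy, so the DCT stands in here and the gain is a plain Wiener rule, not the paper's Laplacian-MMSE estimator:

```python
# Transform-domain enhancement: forward transform, per-coefficient gain
# derived from estimated signal/noise power, inverse transform. DCT and the
# Wiener gain are stand-ins for the paper's DKTT and Laplacian-MMSE gain.
import numpy as np
from scipy.fft import dct, idct

def wiener_enhance(noisy_frame, noise_var):
    Y = dct(noisy_frame, norm="ortho")
    # Estimate per-coefficient signal power, floored at zero, and apply the
    # Wiener gain G = var_s / (var_s + var_n).
    sig_var = np.maximum(Y ** 2 - noise_var, 0.0)
    G = sig_var / (sig_var + noise_var + 1e-12)
    return idct(G * Y, norm="ortho")
```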
Regression testing, being expensive, calls for optimization. Typically, test case optimization results in selecting a reduced set or subset of test cases, or in prioritizing the test cases so that potential faults are detected at an earlier phase. Many former studies revealed heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when examining the fault detection capacity along with other parameters is required, the method falls sh
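An illustrative sketch of coverage-based test case prioritization with an explicit tie-breaking rule, since tied test cases are the issue the abstract highlights. The greedy "additional coverage" strategy below is a common baseline, not the paper's own mechanism, and the tie-breaker (cheaper test first) is an assumption for illustration:

```python
# Greedy additional-coverage prioritization. Each test is (name, covered
# items as a set, runtime); ties in coverage gain go to the cheaper test.
def prioritize(tests):
    remaining, order, covered = list(tests), [], set()
    while remaining:
        # Score each test by how much *new* coverage it adds; break ties
        # by negated runtime so the cheaper test sorts first.
        gains = [(len(t[1] - covered), -t[2], t) for t in remaining]
        gains.sort(key=lambda g: (g[0], g[1]), reverse=True)
        best = gains[0][2]
        order.append(best[0])
        covered |= best[1]
        remaining.remove(best)
        if gains[0][0] == 0:          # nothing adds coverage; append the rest
            order += [t[0] for t in remaining]
            break
    return order

# Example: prioritize([("t1", {1, 2}, 5), ("t2", {2, 3}, 1), ("t3", {1}, 2)])
```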