The use of credit cards for online purchases has grown significantly in recent years, but so have fraudulent activities, which cost businesses and consumers billions of dollars annually. Detecting fraudulent transactions is crucial for protecting customers and maintaining the integrity of the financial system. However, fraudulent transactions are far less frequent than legitimate ones, producing a class imbalance that degrades classification performance and biases model evaluation. This paper addresses imbalanced data by proposing a new weighted oversampling method, wADASMO, to generate minority-class data (i.e., fraudulent transactions). The proposed method builds on the Synthetic Minority Over-sampling Technique (SMOTE) and Adaptive Synthetic Sampling (ADASYN), adding weight adjustment to target specific minority regions while retaining data generalization and accurately capturing the patterns associated with fraudulent transactions. Experimental results on two datasets with Autoencoder (AE), Convolutional Neural Network (CNN), and Long Short-Term Memory (LSTM) learning models show that wADASMO surpasses other oversampling methods on three evaluation metrics: accuracy of 95.6%, 98.8%, and 99.2%; detection rate of 90.4%, 93.38%, and 93.38%; and area under the curve (AUC) of 93%, 96%, and 96.3% for the AE, CNN, and LSTM models, respectively.
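The weighted-oversampling idea can be sketched as follows; this is an illustrative, simplified hybrid of SMOTE-style interpolation and ADASYN-style difficulty weighting, not the paper's actual wADASMO implementation (the function name and parameters are hypothetical):

```python
import numpy as np

def weighted_oversample(X_min, X_maj, n_new, k=5, rng=None):
    """Sketch of density-weighted SMOTE-style oversampling.

    For each minority sample, count majority neighbours among its k
    nearest points (ADASYN-style difficulty weight), then interpolate
    synthetic points between hard minority samples and their minority
    neighbours (SMOTE-style).
    """
    rng = np.random.default_rng(rng)
    X_all = np.vstack([X_min, X_maj])
    is_maj = np.r_[np.zeros(len(X_min)), np.ones(len(X_maj))].astype(bool)

    # Difficulty weight: fraction of majority points among k neighbours.
    d = np.linalg.norm(X_min[:, None, :] - X_all[None, :, :], axis=2)
    nbrs = np.argsort(d, axis=1)[:, 1:k + 1]          # skip the point itself
    w = is_maj[nbrs].mean(axis=1)
    w = np.ones(len(X_min)) / len(X_min) if w.sum() == 0 else w / w.sum()

    # SMOTE interpolation between a weight-sampled seed and a minority neighbour.
    d_min = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    min_nbrs = np.argsort(d_min, axis=1)[:, 1:k + 1]
    seeds = rng.choice(len(X_min), size=n_new, p=w)
    mates = min_nbrs[seeds, rng.integers(0, min_nbrs.shape[1], n_new)]
    gap = rng.random((n_new, 1))
    return X_min[seeds] + gap * (X_min[mates] - X_min[seeds])
```

Because each synthetic point is a convex combination of two minority samples, generated data stays inside the minority region rather than drifting into majority territory.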
Cu(II) was determined using a quick and uncomplicated procedure in which it reacts with a freshly synthesized ligand to form an orange complex with an absorbance peak at 481.5 nm in acidic solution. The optimal conditions for complex formation were studied with respect to ligand concentration, medium, the effect of the addition sequence, the effect of temperature, and the time of complex formation. The results show a scatter plot extending from 0.1–9 ppm and a linear range of 0.1–7 ppm. The relative standard deviation (RSD%) for n = 8 is less than 0.5, the recovery (R%) is within acceptable values, the correlation coefficient (r) equals 0.9986, the coefficient of determination (r2) equals 0.9973, and percentage capita
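The calibration statistics quoted above (r, r², RSD%) come from a standard least-squares workflow, which can be sketched as follows; the concentration, absorbance, and replicate values here are hypothetical stand-ins, not the study's data:

```python
import numpy as np

# Hypothetical calibration points within a 0.1-7 ppm linear range.
conc = np.array([0.1, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])   # ppm
absb = np.array([0.013, 0.112, 0.221, 0.335, 0.448, 0.551, 0.662, 0.779])

slope, intercept = np.polyfit(conc, absb, 1)   # least-squares line A = m*C + b
r = np.corrcoef(conc, absb)[0, 1]              # correlation coefficient
r2 = r * r                                     # coefficient of determination

# RSD% is computed on replicate readings of one standard (n = 8, hypothetical).
reps = np.array([0.450, 0.452, 0.449, 0.451, 0.450, 0.448, 0.451, 0.450])
rsd = reps.std(ddof=1) / reps.mean() * 100

print(f"A = {slope:.4f} C + {intercept:.4f}; r = {r:.4f}; r2 = {r2:.4f}; RSD% = {rsd:.2f}")
```

The same three statistics (r, r², RSD%) are what the abstract reports for the real dataset.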
In this paper, estimates are made of the parameters and the reliability function of the Transmuted Power Function (TPF) distribution using several estimation methods: a newly proposed White technique, the percentile, least squares, weighted least squares, and modified moment methods. A simulation was used to generate random data following the TPF distribution in three experiments (E1, E2, E3) over real values of the parameters, with sample sizes (n = 10, 25, 50, and 100), N = 1000 iterated samples, and reliability times (0 < t < 0). Comparisons between the obtained estimators were made using the mean square error (MSE). The results showed the
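The Monte Carlo comparison by MSE can be sketched as follows, using the plain power-function distribution F(x) = x^a on (0, 1) as a simplified stand-in for TPF (the transmutation parameter is omitted) and maximum-likelihood versus method-of-moments estimators as two example methods:

```python
import numpy as np

def mc_mse(a=2.0, n=50, N=1000, seed=0):
    """Monte Carlo MSE of two estimators of the shape parameter a
    for the power-function distribution F(x) = x**a on (0, 1)."""
    rng = np.random.default_rng(seed)
    mse_mle = mse_mom = 0.0
    for _ in range(N):
        x = rng.random(n) ** (1.0 / a)     # inverse-transform sampling
        a_mle = -n / np.log(x).sum()       # maximum likelihood: -n / sum(ln x)
        m = x.mean()
        a_mom = m / (1.0 - m)              # method of moments: E[X] = a/(a+1)
        mse_mle += (a_mle - a) ** 2
        mse_mom += (a_mom - a) ** 2
    return mse_mle / N, mse_mom / N
```

The estimator whose averaged squared error is smallest over the N replications is preferred, which is the comparison criterion the abstract describes.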
This paper presents a new algorithm in an important research field: semantic word-similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed to extract the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper,
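One plausible instantiation of a "common versus total taxonomical features" score is a Jaccard-style ratio over the two words' feature sets; the paper's exact scoring function may differ:

```python
def feature_similarity(feats_a: set, feats_b: set) -> float:
    """Ratio of common to total taxonomical features (Jaccard-style)."""
    if not feats_a and not feats_b:
        return 1.0
    return len(feats_a & feats_b) / len(feats_a | feats_b)

# Hypothetical subsuming-concept sets for two compared words.
w1 = {"entity", "object", "living-thing", "animal"}
w2 = {"entity", "object", "living-thing", "plant"}
print(feature_similarity(w1, w2))
```

Two words whose concepts share most of their subsuming ancestors score close to 1, while unrelated words score close to 0.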
Image compression is a suitable technique to reduce the storage space of an image, increase the available storage on a device, and speed up transmission. In this paper, a new idea for image compression is proposed to improve the performance of the Absolute Moment Block Truncation Coding (AMBTC) method, using Weber's law as a condition to distinguish uniform blocks (i.e., blocks with low, constant detail) from non-uniform blocks in the original image. All elements in the bitmap of each uniform block are then set to zero, after which a lossless method, Run-Length Encoding, is used to further compress the bits representing the bitmaps of these uniform blocks. Via this simple idea, the result is improving
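The uniform-block idea can be sketched as follows; the Weber contrast threshold and the exact contrast test are assumptions for illustration, and the paper's condition may differ:

```python
import numpy as np

WEBER_T = 0.03  # assumed Weber-fraction threshold, not the paper's value

def ambtc_block(block, weber_t=WEBER_T):
    """AMBTC-encode one block; Weber-uniform blocks get an all-zero bitmap."""
    mean = block.mean()
    contrast = (block.max() - block.min()) / max(mean, 1e-9)
    if contrast <= weber_t:                         # uniform: delta-I / I small
        return mean, mean, np.zeros(block.shape, dtype=np.uint8)
    bitmap = (block >= mean).astype(np.uint8)       # standard AMBTC bitmap
    hi = block[bitmap == 1].mean()
    lo = block[bitmap == 0].mean()
    return lo, hi, bitmap

def rle(bits):
    """Run-length encode a flat bit sequence as (value, count) pairs."""
    out, prev, cnt = [], bits[0], 1
    for b in bits[1:]:
        if b == prev:
            cnt += 1
        else:
            out.append((prev, cnt))
            prev, cnt = b, 1
    out.append((prev, cnt))
    return out
```

An all-zero bitmap collapses to a single (0, block-size) run, which is where the extra compression on uniform blocks comes from.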
The research seeks to identify the impact of fraud-detection skills on the settlement of compensatory claims in the fire and accident insurance portfolio, and how these skills help prevent and reduce the payment of undue compensation to those who seek profit and enrichment at the expense of the insurance contract. Compensatory claims in the fire and accident insurance portfolios of the two companies under study show the positive effect and return of detection and settlement skills on the amount of actual compensation paid against claims inflated by some of the insured. The research sample consisted of (70) respondents from a population of (85) individuals between the director and assistan
Merging biometrics with cryptography has become more familiar, and a great scientific field has opened for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual to every person. In this study, a new method is presented for ciphering data based on fingerprint features. This is done by mapping a plaintext message onto the positions of minutiae extracted from a fingerprint within a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed directly inside the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering
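The first scenario can be sketched as follows; the cover length, seed, and helper names are hypothetical illustrations, not the paper's implementation, and real minutiae coordinates would replace the example positions:

```python
import random
import string

def embed_message(message, minutiae_positions, cover_len=500, seed=7):
    """Hide message characters at minutiae-derived positions inside a
    generated random cover text (first-scenario sketch)."""
    rng = random.Random(seed)
    cover = [rng.choice(string.ascii_letters) for _ in range(cover_len)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % cover_len] = ch          # positions assumed distinct mod cover_len
    return "".join(cover)

def extract_message(cipher, minutiae_positions, n):
    """Recover the first n characters using the same minutiae positions."""
    return "".join(cipher[p % len(cipher)] for p in minutiae_positions[:n])
```

Only a party holding the same fingerprint (and hence the same minutiae positions) can locate the message characters inside the cover text.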
Fraud includes acts of deception by multiple parties inside and outside companies in order to obtain economic benefits at the expense of those companies. Fraud is committed when three factors are present: opportunity, motivation, and rationalization. Detecting fraud requires indications of its possible existence. Here, Benford's law can play an important role in directing attention toward possible financial fraud in a company's accounting records, saving the effort and time required to detect and prevent fraud.
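Benford's law is simple to state in code: the expected proportion of leading digit d in naturally occurring amounts is log10(1 + 1/d), which can be compared against the observed first-digit frequencies of a company's recorded amounts:

```python
import math

def benford_expected():
    """Expected first-digit proportions under Benford's law."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit_freqs(amounts):
    """Observed first-digit proportions of positive recorded amounts."""
    counts = {d: 0 for d in range(1, 10)}
    for a in amounts:
        counts[int(str(abs(a)).lstrip("0.")[0])] += 1
    n = sum(counts.values())
    return {d: c / n for d, c in counts.items()}
```

Large gaps between the observed and expected proportions (digit 1 should appear about 30.1% of the time, digit 9 only about 4.6%) flag records worth a closer audit; they indicate possible manipulation, not proof of fraud.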
A new method is presented for the determination of allopurinol at the microgram level, based on its ability to reduce the yellow absorption spectrum of triiodide (I3−) at the maximum wavelength (λmax = 350 nm). The optimum conditions, such as the concentrations of the reactant materials, standing time, and order of addition, were studied to obtain high sensitivity (ε = 27229 L·mol−1·cm−1), a Sandell sensitivity of 0.0053 µg·cm−2, a wide calibration range (1–9 µg·mL−1), good stability (more than 24 hr), and repeatability (RSD%: 2.1–2.6%). The recovery was 98.17–100.5% and the relative error (Erel%) was 0.50–1.83%. The interferences of xanthine, cysteine, creatinine, urea, and glucose at 20-, 40-, and 60-fold excess of the analyte were also studied.
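These figures follow the standard Beer-Lambert relations: Sandell sensitivity is molar mass divided by molar absorptivity, and concentration is absorbance over ε for a 1 cm cell. A quick check (the value computed from the reported ε and allopurinol's molar mass, 136.11 g/mol for C5H4N4O, comes out close to the reported 0.0053 µg·cm−2; the absorbance value below is hypothetical):

```python
eps = 27229.0   # molar absorptivity, L mol^-1 cm^-1 (reported)
mw = 136.11     # molar mass of allopurinol, g/mol

# Sandell sensitivity: ug cm^-2 per 0.001 absorbance unit in a 1 cm cell.
sandell = mw / eps
print(f"Sandell sensitivity ~= {sandell:.4f} ug/cm^2")

# Back-calculating concentration from a hypothetical measured absorbance.
A = 0.400
c_molar = A / eps              # mol/L (1 cm path length)
c_ugml = c_molar * mw * 1e3    # convert g/L to ug/mL
print(f"A = {A} -> C ~= {c_ugml:.2f} ug/mL")
```

The back-calculated concentration lands inside the stated 1–9 µg·mL−1 calibration range, as expected for an absorbance in the mid-range of the curve.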