Blockchain is an innovative technology that has gained interest across all sectors in the era of digital transformation, where it manages transactions and records them in a database. With the growth of financial transactions in a rapidly developing society, many people seeking a financially independent life have moved away from large corporations and organizations to form startups and small businesses. Recently, the increasing demand on employees and institutions to prepare and manage contracts, papers, and verification processes, together with human error, has led to the emergence of the smart contract. The smart contract has been developed to save time and provide more confidence in dealings, to cover the security aspects of digital management, and to resolve negotiation concerns. The smart contract was employed in creating a distributed ledger to eliminate the need for centralization. In this paper, a simple prototype of a smart contract integrated with blockchain has been implemented and simulated on a local server with a set of nodes. Several security objectives, such as confidentiality, authorization, integrity, and non-repudiation, have been achieved in the proposed system. Besides, the paper discusses the importance of the blockchain technique, how it contributes to the management of transactions, and how it was implemented in highly transparent real-estate scenarios. Elliptic-curve public-key cryptography has been adopted as an alternative to RSA in the signature generation/verification process and the encryption protocol. For secure transactions, the Secure Sockets Layer (SSL) has also been adopted as a secure layer in the web browser. The results have been investigated and evaluated from different aspects, and the implementation was carried out in a restricted environment.
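The group arithmetic underlying the elliptic-curve signature scheme can be sketched on a small textbook curve. The curve, generator, and modulus below are illustrative teaching values (y² = x³ + 2x + 2 over GF(17), generator of order 19), not production parameters:

```python
# Toy elliptic-curve arithmetic over GF(17): illustrative textbook parameters,
# not a production curve. Curve: y^2 = x^3 + 2x + 2 (mod 17), G = (5, 1), order 19.
P, A = 17, 2
G = (5, 1)

def ec_add(p1, p2):
    """Add two points on the curve; None is the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # vertical line: result is the point at infinity
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P  # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (s * s - x1 - x2) % P
    return (x3, (s * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication k * pt (the core of EC keygen/signing)."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return result
```

A private key is a scalar k and the public key is k·G; the hardness of recovering k from k·G is what lets EC keys stay far shorter than RSA moduli.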
Experiments compared the time and cost complexity of the ECC and RSA algorithms as a function of key size and length: a 160-bit ECC key corresponds to a 1024-bit RSA key, which is equivalent to 40% for ECC and 30% for RSA. As a result, the ECC algorithm is complex, its key is smaller, and its key generation is faster, so it achieves a high level of security.
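The key-length comparison can be made concrete with the NIST comparable-strength figures from SP 800-57; the 80-bit-security row matches the 160-bit ECC / 1024-bit RSA pairing discussed above:

```python
# NIST SP 800-57 comparable key sizes (bits) per symmetric security level:
# security level -> (RSA modulus bits, ECC key bits)
EQUIV = {
    80:  (1024, 160),
    112: (2048, 224),
    128: (3072, 256),
    192: (7680, 384),
    256: (15360, 512),
}

for sec, (rsa, ecc) in sorted(EQUIV.items()):
    print(f"{sec:>3}-bit security: RSA {rsa:>5} bits vs ECC {ecc:>3} bits "
          f"({rsa / ecc:.0f}x smaller key)")
```

The gap widens with the security level, which is why ECC's advantage in key size and generation time grows for stronger keys.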
This study employs wavelet transforms to address the issue of boundary effects. Additionally, it utilizes probit transform techniques, which are based on probit functions, to estimate the copula density function. This estimation is dependent on the empirical distribution function of the variables. The density is estimated within a transformed domain. Recent research indicates that the early implementations of this strategy may have been more efficient. Nevertheless, in this work, we implemented two novel methodologies utilizing probit transform and wavelet transform. We then proceeded to evaluate and contrast these methodologies using three specific criteria: root mean square error (RMSE), Akaike information criterion (AIC), and log
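The probit step can be illustrated with Python's standard library: pseudo-observations on (0, 1) are mapped through the inverse normal CDF so that density estimation happens on the whole real line, removing the boundary effect, and candidate estimates are then compared by criteria such as RMSE. The copula parameter rho below is an illustrative value, not one from the study:

```python
import math
import random
from statistics import NormalDist

random.seed(0)
nd = NormalDist()

# Simulated pseudo-observations from a Gaussian copula (illustrative rho = 0.6).
rho = 0.6
def sample_pair():
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    return nd.cdf(z1), nd.cdf(z2)

pairs = [sample_pair() for _ in range(2000)]

# Probit transform: (u, v) in (0,1)^2 -> (Phi^{-1}(u), Phi^{-1}(v)) in R^2,
# giving the density unbounded support so standard estimators apply.
transformed = [(nd.inv_cdf(u), nd.inv_cdf(v)) for u, v in pairs]

def rmse(estimate, truth):
    """Root mean square error, one of the three comparison criteria."""
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimate, truth)) / len(truth))
```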
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward one class at the expense of the others. In this paper, we propose three techniques based on the Over-Sampling (O.S.) technique for processing an imbalanced dataset, redistributing it and converting it into a balanced dataset. These techniques are (Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
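A minimal sketch of the over-sampling idea behind these techniques (plain SMOTE, not the improved variants proposed in the paper): each synthetic minority point is interpolated between an existing minority sample and one of its k nearest minority neighbours.

```python
import random

random.seed(1)

def smote(minority, n_new, k=3):
    """Generate n_new synthetic samples (tuples) by interpolating between each
    chosen minority point and one of its k nearest minority neighbours."""
    synthetic = []
    for _ in range(n_new):
        x = random.choice(minority)
        # k nearest neighbours by squared Euclidean distance, excluding x itself
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: sum((a - b) ** 2 for a, b in zip(p, x)))[:k]
        nb = random.choice(neighbours)
        gap = random.random()  # random position along the segment x -> nb
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Because each new point lies on a segment between two real minority samples, the minority region is densified rather than merely duplicated.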
In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method employed to conceal data within various carrier objects such as text, can address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic ventures. Arabic text steganography harn
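As a generic illustration of text steganography (zero-width-character embedding, a common baseline; not the Arabic-feature technique developed in the paper), secret bits can be hidden invisibly between cover characters:

```python
# Zero-width characters render invisibly, so they can carry hidden bits.
ZWNJ, ZWJ = "\u200c", "\u200d"  # encode bit 0 / bit 1

def embed(cover, secret_bits):
    """Insert one zero-width character per cover character (capacity permitting)."""
    out, bits = [], iter(secret_bits)
    for ch in cover:
        out.append(ch)
        b = next(bits, None)
        if b is not None:
            out.append(ZWJ if b == "1" else ZWNJ)
    return "".join(out)

def extract(stego):
    """Recover the bit string by scanning for zero-width characters in order."""
    return "".join("1" if ch == ZWJ else "0"
                   for ch in stego if ch in (ZWNJ, ZWJ))
```

The stego text displays identically to the cover text; capacity is bounded by the cover length, which is why text's "constrained bandwidth" matters.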
Background/Objectives: The purpose of this study was to classify Alzheimer's disease (AD) patients from Normal Control (NC) patients using Magnetic Resonance Imaging (MRI). Methods/Statistical analysis: The performance evaluation is carried out on 346 MR images from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. A Deep Belief Network (DBN) classifier is used for the classification task. The network is trained on a sample training set, and the weights produced are then used to check the system's recognition capability. Findings: As a result, this paper presents a novel automated classification system for AD determination. The suggested method offers good performance; the experiments carried out show that the
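Once trained, a DBN classifies by a feed-forward pass through its stacked sigmoid layers; a minimal sketch of that inference step (the weights below are hand-crafted stand-ins, not trained values, and the two-class label order is assumed):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dbn_forward(x, layers):
    """Forward pass through stacked sigmoid layers (weights assumed pre-trained);
    the index of the largest final activation is the predicted class (NC vs AD)."""
    for W, b in layers:
        x = [sigmoid(sum(w * v for w, v in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x.index(max(x))  # 0 = NC, 1 = AD (label order assumed)
```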
Free-Space Optical (FSO) communication can provide high-speed links when the effect of turbulence is not serious, and Space-Time Block Coding (STBC) is a good candidate to mitigate that seriousness. This paper proposes a hybrid of Optical Code Division Multiple Access (OCDMA) and STBC in FSO communication for last-mile solutions, where access to remote areas is complicated. The main weakness affecting an FSO link is atmospheric turbulence; STBC is employed within OCDMA to mitigate its effects. The current work evaluates the Bit-Error-Rate (BER) performance of OCDMA operating under the scintillation effect, which can be described by the gamma-gamma model. The most obvious finding to emerge from the analysis
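The gamma-gamma model treats the received irradiance as the product of two independent unit-mean Gamma variates for large- and small-scale turbulence; a Monte Carlo sketch (the alpha and beta values are illustrative, not the paper's link parameters):

```python
import random

random.seed(42)

# Gamma-gamma turbulence: I = X * Y, X and Y independent Gamma variates of unit
# mean; alpha, beta set the large-/small-scale scintillation strength (illustrative).
alpha, beta = 4.0, 2.0

def gamma_gamma_sample():
    x = random.gammavariate(alpha, 1.0 / alpha)  # large-scale fading
    y = random.gammavariate(beta, 1.0 / beta)    # small-scale fading
    return x * y

samples = [gamma_gamma_sample() for _ in range(50000)]
mean_I = sum(samples) / len(samples)

# Scintillation index: sigma_I^2 = 1/alpha + 1/beta + 1/(alpha*beta)
si_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)
si_est = sum((s - mean_I) ** 2 for s in samples) / len(samples) / mean_I ** 2
```

Feeding such irradiance samples into the receiver model is how BER under scintillation is typically estimated by simulation.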
This work implements an Electroencephalogram (EEG) signal classifier. The implemented method uses Orthogonal Polynomials (OP) to convert the EEG signal samples to moments. A Sparse Filter (SF) reduces the number of converted moments to increase the classification accuracy. A Support Vector Machine (SVM) is used to classify the reduced moments into two classes. The proposed method's performance is tested and compared with two other methods on two datasets. The datasets are divided into 80% for training and 20% for testing, with 5-fold cross-validation. The results show that this method exceeds the accuracy of the other methods, with best accuracies of 95.6% and 99.5% on the two datasets, respectively. Finally, from the results, it
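The sample-to-moment step can be sketched with Legendre polynomials built by the Bonnet recurrence. This is a generic orthogonal-polynomial projection; the paper's specific polynomial family and normalisation may differ:

```python
def legendre_moments(signal, order):
    """Project a 1-D signal onto Legendre polynomials sampled on [-1, 1] and
    return the first `order` moments (a sketch of the OP feature-extraction step)."""
    n = len(signal)
    ts = [-1 + 2 * i / (n - 1) for i in range(n)]
    # Bonnet recurrence: (k+1) P_{k+1}(t) = (2k+1) t P_k(t) - k P_{k-1}(t)
    polys = [[1.0] * n, ts[:]]
    for k in range(1, order - 1):
        polys.append([((2 * k + 1) * t * pk - k * pk1) / (k + 1)
                      for t, pk, pk1 in zip(ts, polys[-1], polys[-2])])
    # Riemann-sum projection of the signal onto each polynomial
    return [sum(p * s for p, s in zip(poly, signal)) * (2 / n)
            for poly in polys[:order]]
```

A few low-order moments summarise a long EEG window, and a sparse filter then keeps only the most discriminative ones before the SVM.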
Fingerprints are the most widely used biometric feature for person identification and verification. The fingerprint is easy to understand compared to other biometric types such as voice and face, and it can achieve a very high recognition rate for human recognition. In this paper, a geometric rotation transform is applied to the fingerprint image to obtain a new level of features that represent the finger's characteristics and serve for personal identification; local features are used for their ability to reflect the statistical behavior of fingerprint variation across the fingerprint image. The proposed fingerprint system contains three main stages: (i) preprocessing, (ii) feature extraction, and (iii) matching. The preprocessi
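The geometric rotation at the heart of the feature stage rotates pixel or minutiae coordinates about a centre point; a coordinate-level sketch (operating on point lists rather than a full image, for brevity):

```python
import math

def rotate_points(points, theta, cx=0.0, cy=0.0):
    """Rotate (x, y) points by theta radians about centre (cx, cy):
    x' = cx + cos(t)(x-cx) - sin(t)(y-cy), y' = cy + sin(t)(x-cx) + cos(t)(y-cy)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(cx + c * (x - cx) - s * (y - cy),
             cy + s * (x - cx) + c * (y - cy)) for x, y in points]
```

Extracting local statistics at several rotation angles yields features that vary with ridge orientation, which is what the new feature level exploits.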
A variety of new phenolic Schiff base derivatives have been synthesized starting from the terephthalaldehyde compound. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis, and some derivatives were evaluated by thermal analysis (TGA).
Sensitive information of any multimedia must be encrypted before transmission. A dual chaotic algorithm is a good option for encrypting sensitive information, using different parameters and different initial conditions for two chaotic maps. A dual chaotic framework creates a complex chaotic trajectory to prevent the illegal use of information by eavesdroppers. The limited precision of a single chaotic map causes a degradation in the dynamical behavior of the communication system. To overcome this degradation issue, a novel form of dual chaos map algorithm is analyzed. To maintain the stability of the dynamical system, the Lyapunov Exponent (LE) is determined for the single and dual maps. In this paper, the LE of the single and dual maps
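The Lyapunov exponent of a single map can be estimated directly from the orbit by averaging the log of the local stretching rate. The logistic map below stands in for the paper's maps, whose form the excerpt does not give:

```python
import math

def lyapunov_logistic(r, n=200000, x0=0.1, burn=1000):
    """Estimate the Lyapunov exponent of x -> r x (1 - x) by averaging
    log|f'(x)| = log|r (1 - 2x)| along the orbit; LE > 0 indicates chaos."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))
    return acc / n
```

At r = 4 the exact value is ln 2 ≈ 0.693 (chaotic); at r = 3.2 the orbit is a stable 2-cycle and the exponent is negative. A dual-map scheme would require the exponent of the combined trajectory to stay positive.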