One of the most popular and legally recognized behavioral biometrics is the individual's signature, which is used for verification and identification in many industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variation. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. In contrast to other languages, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. The absence of dynamic information in offline Arabic signatures makes it more difficult to attain high verification accuracy. Moreover, the characteristics of Arabic signatures are not very distinct and are subject to a great deal of variation (feature uncertainty). To address this issue, the proposed work offers a novel method of verifying offline Arabic signatures that employs two layers of verification, as opposed to the single level employed by prior attempts or the many classifiers based on statistical learning theory. A static set of signature features is used for layer-one verification. The output of a neutrosophic logic module is used for layer-two verification, with the accuracy depending on the signature characteristics used in the training dataset and on three membership functions that are unique to each signer, based on the degrees of truth, indeterminacy, and falsity of the signature features. The three memberships of a neutrosophic set are more expressive for decision-making than those of fuzzy sets. The developed model is intended to account for several kinds of uncertainty in describing Arabic signatures, including ambiguity, inconsistency, redundancy, and incompleteness.
The experimental results show that the verification system works as intended and can successfully reduce both the false acceptance rate (FAR) and the false rejection rate (FRR).
The aim of this paper is to develop and adjust the Fama-MacBeth procedure using a single-index model. The adjustment is estimated with a penalized smoothing-spline regression technique (SIMPLS), in which the smoothing parameter is selected by two generalized cross-validation techniques: Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV). Due to the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and its implications for excess stock returns and portfolio returns
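For orientation, the two-step structure being adjusted can be sketched in its classical form. This is the textbook Fama-MacBeth procedure with ordinary least squares in both passes; the paper replaces each pass with the penalized-spline single-index fit (SIMPLS), which is not reproduced here, and the data below are synthetic.

```python
import numpy as np

# Classical two-pass Fama-MacBeth with plain OLS in both passes.
# Pass 1: time-series regressions estimate each asset's factor betas.
# Pass 2: period-by-period cross-sectional regressions of returns on
# those betas estimate the risk premia, which are then averaged.

def fama_macbeth(returns, factors):
    """returns: (T, N) excess returns; factors: (T, K) factor series.
    Returns the time-averaged cross-sectional coefficients (K+1,)."""
    T, N = returns.shape
    K = factors.shape[1]
    X = np.column_stack([np.ones(T), factors])           # (T, K+1)
    # Pass 1: one time-series regression per asset -> betas (N, K)
    coefs, *_ = np.linalg.lstsq(X, returns, rcond=None)  # (K+1, N)
    betas = coefs[1:].T
    # Pass 2: one cross-sectional regression per period -> lambda_t
    Z = np.column_stack([np.ones(N), betas])             # (N, K+1)
    lambdas = np.empty((T, K + 1))
    for t in range(T):
        lambdas[t], *_ = np.linalg.lstsq(Z, returns[t], rcond=None)
    return lambdas.mean(axis=0)
```

Replacing each `lstsq` call with a single-index smoothing-spline estimator, with the smoothing level chosen by GGCV or FGCV, yields the four estimator combinations listed in the abstract.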
Photonic crystal fiber interferometers are used in many sensing applications. In this work, an in-reflection photonic crystal fiber (PCF) Mach-Zehnder (MZ) interferometer based on micro-hole collapsing is presented, which exhibits high sensitivity to different volatile organic compounds (VOCs) without the need for any permeable material. The interferometer is robust and compact, consisting of a stub of large-mode-area photonic crystal fiber spliced to standard single-mode fiber (SMF, Corning SMF-28) with an optimized splice loss of 0.19 dB. In the splice regions the voids of the holey fiber are completely collapsed, which allows the excitation and recombination of core and cladding modes. The device reflection
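For context, the interference of the two recombined modes in such a device follows, to first approximation, the standard two-beam interference law; this is a textbook relation added for orientation, not a formula taken from the work itself:

```latex
I(\lambda) \;=\; I_{co} + I_{cl} +
2\sqrt{I_{co}\,I_{cl}}\,
\cos\!\left(\frac{2\pi\,\Delta n_{\mathrm{eff}}\,L}{\lambda}\right)
```

where $I_{co}$ and $I_{cl}$ are the intensities carried by the core and cladding modes, $\Delta n_{\mathrm{eff}}$ is the difference between their effective indices, and $L$ is the interferometer length. A VOC changing $\Delta n_{\mathrm{eff}}$ shifts the fringe pattern, which is the basis of the sensing mechanism.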
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the dependent variable are predicted. In this paper, the multiple linear regression model with three covariates is studied in the presence of autocorrelated errors when the random errors follow an exponential distribution. Three methods were compared (generalized least squares, the M robust method, and the Laplace robust method). We employed simulation studies and computed the mean squared error criterion for sample sizes (15, 30, 60, 100). Furthermore, we applied the best method to real experimental data representing the varieties of
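The flavor of this comparison can be sketched as follows: ordinary least squares against a Huber-type M-estimator fitted by iteratively reweighted least squares, on data whose errors are (centered) exponential. The sample size, design, and tuning constant are illustrative assumptions, not the paper's settings, and the paper's generalized-least-squares and Laplace variants are not reproduced.

```python
import numpy as np

# OLS versus a Huber M-estimator via iteratively reweighted least
# squares (IRLS), on a regression with centered exponential errors.

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def huber_irls(X, y, c=1.345, iters=50):
    beta = ols(X, y)
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r / s)
        w = np.where(u <= c, 1.0, c / u)            # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta
```

In a simulation study, each estimator would be run over many replicates and the coefficient mean squared error averaged, exactly as the abstract describes for sample sizes 15, 30, 60, and 100.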
The rapid development of telemedicine services and the requirements for exchanging medical information between physicians, consultants, and health institutions have made the protection of patients' information an important priority for any future e-health system. The protection of medical information, including the cover (i.e. the medical image), has requirements that differ slightly from those for protecting other information. The cover must be preserved to a great extent because of its importance on the receiving side, where medical staff use this information to provide a diagnosis and save a patient's life. If the cover is tampered with, the goal of telemedicine is not achieved. Therefore, this work provides an in
Shadow removal is crucial for robot and machine vision, as the accuracy of object detection is greatly influenced by the uncertainty and ambiguity of the visual scene. In this paper, we introduce a new algorithm for shadow detection and removal based on Gaussian functions of different shapes, orientations, and spatial extents. Here, the contrast information of the visual scene is utilized for shadow detection and removal through five consecutive processing stages. In the first stage, contrast filtering is performed to obtain the contrast information of the image. The second stage involves a normalization process that suppresses noise and generates a balanced intensity at a specific position compared to the neighboring intensities
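The first two stages described above can be sketched generically: contrast filtering as the difference between an image and its Gaussian blur, followed by divisive normalization by local energy. The kernel size, sigma, and the exact form of each stage are assumptions for illustration, not the paper's filters.

```python
import numpy as np

# Stage 1 (contrast filtering) and stage 2 (normalization), sketched
# with a separable Gaussian blur implemented via 1-D convolutions.

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma=3.0, radius=9):
    k = gaussian_kernel(sigma, radius)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def contrast(img):
    """Stage 1: local contrast as image minus its low-pass version."""
    return img - blur(img)

def normalize(c, eps=1e-6):
    """Stage 2: divisive normalization balances intensity against the
    local neighborhood and suppresses noise."""
    local_energy = np.sqrt(blur(c**2)) + eps
    return c / local_energy
```

On a uniform region the contrast response vanishes, while edges and shadow boundaries produce strong, locally normalized responses that later stages can classify.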
Over the past few decades, surveying fieldwork was usually carried out with classical positioning methods for establishing horizontal and vertical geodetic networks. However, these conventional positioning techniques have many drawbacks: they are time-consuming, costly, and require massive effort. The Global Navigation Satellite System (GNSS) was therefore developed to speed up fieldwork, increase accuracy, and overcome the difficulties inherent in almost every surveying task. This research assesses the accuracy of local geodetic networks using different GNSS techniques, such as Static, Precise Point Positioning, Post-Processing Kinematic, the Session method, a