Information and communication technology has a significant influence on employee procedures. Businesses are investing in e-CRM technologies, yet assessing the performance of their e-CRM platforms remains difficult. The DeLone and McLean Information Systems Success framework can be adapted to current e-CRM assessment challenges, and its dimensions provide a concise structure for organizing the e-CRM key metrics identified in this study. The purpose of this study is to apply and verify that the Updated DeLone and McLean IS Model can explain e-CRM adoption among employees, along with the extended Updated DeLone and McLean Model with its five factors, namely system quality, service quality, information quality, ease of use, and employee satisfaction. For this study, data were collected from 300 employees working with e-CRM and analyzed using PLS-SEM. The results show that most of the hypotheses of the study are supported. Moreover, the framework contributes to research on e-CRM success and individual performance.
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security.
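The GLCM step above can be illustrated with a minimal sketch. The offsets, number of gray levels, and feature set below are generic assumptions, not the configuration used in the study:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized Gray-Level Co-occurrence Matrix for pixel offset (dx, dy).

    img: 2D integer array with values in [0, levels).
    """
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    total = m.sum()
    return m / total if total else m

def glcm_features(m):
    """Three common Haralick-style texture features of a normalized GLCM."""
    i, j = np.indices(m.shape)
    contrast = ((i - j) ** 2 * m).sum()
    energy = (m ** 2).sum()
    homogeneity = (m / (1.0 + np.abs(i - j))).sum()
    return contrast, energy, homogeneity
```

For a perfectly uniform image every co-occurring pair is identical, so contrast is 0 and energy and homogeneity are both 1, which is a quick sanity check on the implementation.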
This paper focuses on developing a self-starting numerical approach for the direct integration of higher-order initial value problems of ordinary differential equations. The method is derived from a power series approximation, with the resulting equations discretized at selected grid and off-grid points. It is applied in a block-by-block fashion as a numerical integrator for higher-order initial value problems. The basic properties of the block method are investigated to establish its performance, and the method is then applied to several test problems to validate its accuracy and convergence.
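For contrast with the direct block integration described above, the conventional route is to reduce a higher-order IVP to a first-order system and apply a standard one-step integrator. A minimal sketch using classical RK4 (not the paper's block method) on y'' = -y, y(0) = 1, y'(0) = 0:

```python
def rk4_system(f, y0, t0, t1, n):
    """Classical RK4 for a first-order system y' = f(t, y) over [t0, t1]."""
    h = (t1 - t0) / n
    t, y = t0, list(y0)
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        t += h
    return y

# y'' = -y rewritten as the first-order system (y, v)' = (v, -y)
def oscillator(t, y):
    return [y[1], -y[0]]
```

The exact solution is y(t) = cos(t), so integrating to t = pi/2 should return values close to (0, -1); a self-starting block method avoids this reduction step entirely.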
Electrocoagulation is an electrochemical method for treating different types of wastewater, whereby sacrificial anodes corrode to release an active coagulant (usually aluminium or iron cations) into solution, while the simultaneous evolution of hydrogen at the cathode allows pollutants to be removed by flotation or settling. The Taguchi method was applied as the experimental design to determine the best conditions for chromium (VI) removal from wastewater. Several parameters were investigated in a batch stirred tank with iron electrodes: pH, initial chromium concentration, current density, distance between electrodes, and KCl concentration; the results were analyzed using the signal-to-noise (S/N) ratio. It was found that the r
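The S/N analysis above uses Taguchi's standard ratios. For a response where larger is better, such as percent removal (assumed here; the study's exact response definition is not given), the ratio is S/N = -10·log10(mean(1/y²)):

```python
import math

def sn_larger_is_better(ys):
    """Taguchi signal-to-noise ratio for a 'larger is better' response.

    ys: repeated measurements of the response (must be positive).
    """
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))
```

For example, two identical measurements of 10 give S/N = -10·log10(1/100) = 20 dB; the factor levels with the highest mean S/N are selected as the best conditions.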
The linear segment with parabolic blend (LSPB) trajectory deviates from the specified waypoints and requires the acceleration to be sufficiently high. In this work, a modified LSPB trajectory is combined with particle swarm optimization (PSO) so as to create through points on the trajectory. The assumption of the standard LSPB method, that the parabolic part is centered in time around each waypoint, is replaced by proposed coefficients for calculating the duration of the linear part. These coefficients are functions of the velocities between through points. The velocities are obtained by PSO so as to force the LSPB trajectory to pass exactly through the specified path points. Also, relations for velocity correction and exact v
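A minimal single-segment LSPB profile helps fix the ideas. This is the textbook form, not the modified version with PSO-tuned coefficients proposed in the work; `tb` is the assumed blend duration at each end:

```python
def lspb(q0, qf, tf, tb, t):
    """Position at time t on an LSPB trajectory from q0 to qf over [0, tf].

    tb: duration of the parabolic blend at each end (requires tb <= tf / 2).
    """
    V = (qf - q0) / (tf - tb)   # cruise velocity of the linear segment
    a = V / tb                  # blend acceleration
    if t < tb:                  # initial parabolic blend
        return q0 + 0.5 * a * t * t
    elif t <= tf - tb:          # linear segment at constant velocity V
        return q0 + 0.5 * a * tb * tb + V * (t - tb)
    else:                       # final parabolic blend
        dt = tf - t
        return qf - 0.5 * a * dt * dt
```

The profile starts at q0, ends at qf, and is symmetric about the midpoint; the blends make position and velocity continuous at t = tb and t = tf - tb.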
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Every bit in the transmitted information has high priority, especially bits carrying information such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
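The limitation of single parity described above is easy to demonstrate with a generic even-parity sketch (the proposed 2D-Checksum variants themselves are not reproduced here):

```python
def parity_bit(bits):
    """Even-parity bit: chosen so the total number of 1s, including it, is even."""
    return sum(bits) % 2

def parity_ok(bits, p):
    """True if the received data word passes the even-parity check."""
    return (sum(bits) + p) % 2 == 0
```

Flipping one bit in transit breaks the check, but flipping two bits cancels out, so an even number of errors goes undetected; this is exactly the weakness the 2D and checksum schemes try to address.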
Most medical datasets suffer from missing data, due to the expense of some tests or to human error while recording them. This issue degrades the performance of machine learning models because the values of some features will be missing. Therefore, there is a need for specific methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
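As a point of reference for the imputation step, here is a minimal column-mean baseline; the paper instead searches for the imputed values with SSA, so this sketch only shows the conventional starting point such metaheuristic methods are compared against:

```python
import numpy as np

def impute_mean(X):
    """Replace NaNs in each column of X with that column's observed mean."""
    X = X.astype(float).copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        mask = np.isnan(col)
        if mask.any() and (~mask).any():
            col[mask] = col[~mask].mean()
    return X
```

After imputation the completed matrix can be fed to any classifier (SVM, KNN, etc.); SSA-based imputation replaces the fixed mean with values optimized against a fitness criterion.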
The main idea of this research is that media research remains useless unless its goals and results are achieved using correct scientific tools. The researcher selected 100 research papers, about 35% of those published; 10 of them were excluded because they fall outside the field of media. A simple random sample was used, covering the three departments of media: journalism, television and radio journalism, and public relations. The researcher adopted statistical methods such as the phi coefficient, the correlation coefficient, the Pearson correlation coefficient, and the straight-line equation.
The researcher used an analytical form, followed by content analysis, then the scale. The results are found in 58 researches, w
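Two of the statistics named above, the Pearson correlation coefficient and the straight-line (least-squares) equation, can be sketched directly:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def fit_line(x, y):
    """Least-squares straight line y = m*x + c; returns (m, c)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return m, my - m * mx
```

A perfectly linear relation such as y = 2x gives r = 1 and recovers the slope 2 and intercept 0 exactly.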
In this research, carbon nanotubes (CNTs) are prepared through the Hummers method with slight changes to some of the steps; thus, a new method for preparing carbon nanotubes has been created, similar to the original Hummers method used to prepare graphene oxide. The carbon nanotube suspension is then transferred to a simple electrodeposition platform consisting of two electrodes and the cell body for the coating and reduction of the carbon nanotubes on ITO glass, which serves as the cathode, while platinum serves as the anode. The deposited layer of carbon nanotubes is examined by scanning electron microscopy (SEM), and the images throughout the research show the