Interface evaluation has been the subject of extensive study and research in human-computer interaction (HCI). Specialists in the field regard it as a crucial tool for promoting the idea that user engagement with computers should resemble casual conversation and interaction between individuals. HCI researchers initially focused on making computer interfaces more usable and thereby improving the user experience. The objective of this study was to evaluate and enhance the user interface of the University of Baghdad's online academic management system using effectiveness, time-based efficiency, and satisfaction rates derived from a task-based questionnaire process. We designed several candidate interfaces for the system and selected the best ones based on the study's findings. The study also examines how user-friendly the interface is for users with little or no computer experience. Twenty-four participants took part in usability testing to evaluate the system's functionality. The results indicate good usability, with an effectiveness rate of 71.43% and a participant error rate of 33.3%. Finally, some suggestions are given for enhancing the human-computer interaction metrics of our university system.
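As a rough illustration of the metrics named above, effectiveness and time-based efficiency can be computed as follows. This is a minimal sketch of the standard usability formulas; the task counts and timings below are hypothetical examples, not the study's raw data (the 15-of-21 figure is merely chosen to reproduce a rate like the 71.43% reported).

```python
def effectiveness(successes, attempts):
    """Percentage of tasks completed successfully."""
    return 100.0 * successes / attempts

def time_based_efficiency(results):
    """results: list of (completed: 0 or 1, seconds) per task attempt.
    Returns the mean number of goals achieved per second across attempts."""
    return sum(done / t for done, t in results) / len(results)

# Hypothetical data: 15 of 21 task attempts succeeded.
print(round(effectiveness(15, 21), 2))            # -> 71.43

# Three attempts: two completed (30 s, 45 s), one failed (60 s).
trials = [(1, 30.0), (1, 45.0), (0, 60.0)]
print(round(time_based_efficiency(trials), 4))    # -> 0.0185
```

Satisfaction is usually gathered separately via a post-task questionnaire (e.g. a Likert scale), so it has no formula here.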
This study estimates the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Estimation (MLE).
- Probability Weighted Moments (PWM).
Simulation was used to generate the samples needed to estimate the parameters and the reliability function for different sample sizes (n = 10, 25, 50, 100), with fixed true values assigned to the parameters, and each simulation experiment was replicated (RP = 1000).
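A minimal pure-Python sketch of the MLE side of such a simulation, assuming the Gumbel (Type I extreme value, maximum) form of the distribution; the parameter values, sample size, and function names here are illustrative, not taken from the paper.

```python
import math
import random

def gumbel_sample(mu, sigma, n, rng):
    """Draw n variates by inverse-CDF sampling: x = mu - sigma*ln(-ln U)."""
    return [mu - sigma * math.log(-math.log(rng.random())) for _ in range(n)]

def gumbel_mle(x, tol=1e-10, max_iter=1000):
    """MLE of (mu, sigma) via the standard fixed-point iteration
    sigma = xbar - sum(x_i e^{-x_i/sigma}) / sum(e^{-x_i/sigma})."""
    n = len(x)
    xbar = sum(x) / n
    var = sum((v - xbar) ** 2 for v in x) / n
    sigma = math.sqrt(6.0 * var) / math.pi   # method-of-moments start value
    for _ in range(max_iter):
        w = [math.exp(-v / sigma) for v in x]
        new_sigma = xbar - sum(v * wi for v, wi in zip(x, w)) / sum(w)
        if abs(new_sigma - sigma) < tol:
            sigma = new_sigma
            break
        sigma = new_sigma
    mu = -sigma * math.log(sum(math.exp(-v / sigma) for v in x) / n)
    return mu, sigma

def reliability(t, mu, sigma):
    """R(t) = 1 - F(t), with Gumbel(max) CDF F(t) = exp(-exp(-(t-mu)/sigma))."""
    return 1.0 - math.exp(-math.exp(-(t - mu) / sigma))

# One illustrative replication with assumed true values mu=2, sigma=1.
rng = random.Random(7)
data = gumbel_sample(2.0, 1.0, 2000, rng)
mu_hat, sigma_hat = gumbel_mle(data)
```

A full simulation study would wrap this in a loop over the sample sizes and the RP = 1000 replications, averaging estimation errors; the PWM estimator would be computed from ordered-sample moments instead.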
Logistic regression is one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), where the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured vs. uninjured, married vs. unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate a binary-response logistic regression model by adopting the Jackna
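To make the basic model concrete, here is a hedged sketch of maximum-likelihood fitting for a single-predictor binary logistic model via plain gradient ascent; the ridge and jackknife refinements the abstract refers to are not reproduced, and all data below is synthetic.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) by gradient ascent
    on the log-likelihood; returns the estimates (b0, b1)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)          # gradient w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic binary data with assumed true coefficients b0=0.5, b1=1.5.
rng = random.Random(0)
xs = [rng.uniform(-3, 3) for _ in range(500)]
ys = [1 if rng.random() < 1.0 / (1.0 + math.exp(-(0.5 + 1.5 * x))) else 0
      for x in xs]
b0_hat, b1_hat = fit_logistic(xs, ys)
```

With many correlated predictors, this plain MLE becomes unstable; ridge-style shrinkage penalizes the coefficients, and jackknife resampling re-estimates the model leaving out one observation at a time to reduce bias.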
Steganography is defined as hiding confidential information in some other chosen medium without leaving any clear evidence of a change in the medium's features. Most traditional hiding methods embed the message directly in the cover medium (text, image, audio, or video). Some hiding techniques leave a negative effect on the cover image, so the change in the carrier medium can sometimes be detected by humans or machines. The purpose of the suggested hiding method is to make this change undetectable. The current research focuses on using a complex method, based on a spiral search, to prevent detection of the hidden information by humans and machines; Structural Similarity Index (SSIM) measures are used to assess the accuracy and quality
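The spiral idea can be sketched as follows: visit the pixels of the cover image along a clockwise spiral and embed message bits in their least significant bits. This is only an assumed illustration of a spiral traversal with LSB embedding; the paper's actual spiral search strategy and its SSIM scoring are not specified here.

```python
def spiral_order(rows, cols):
    """Coordinates of a rows x cols grid visited in a clockwise spiral."""
    top, bottom, left, right = 0, rows - 1, 0, cols - 1
    order = []
    while top <= bottom and left <= right:
        for c in range(left, right + 1):          # top edge, left -> right
            order.append((top, c))
        for r in range(top + 1, bottom + 1):      # right edge, top -> bottom
            order.append((r, right))
        if top < bottom and left < right:
            for c in range(right - 1, left - 1, -1):  # bottom edge
                order.append((bottom, c))
            for r in range(bottom - 1, top, -1):      # left edge
                order.append((r, left))
        top, bottom, left, right = top + 1, bottom - 1, left + 1, right - 1
    return order

def embed_bits(image, bits):
    """Hide bits in pixel LSBs along the spiral path (image: 2-D int list)."""
    stego = [row[:] for row in image]
    path = spiral_order(len(image), len(image[0]))
    for (r, c), b in zip(path, bits):
        stego[r][c] = (stego[r][c] & ~1) | b
    return stego

cover = [[10, 11, 12],
         [13, 14, 15],
         [16, 17, 18]]
stego = embed_bits(cover, [1, 0, 1])   # alters only the first 3 spiral pixels
```

Because only least significant bits change, the stego image differs from the cover by at most 1 per affected pixel, which is what keeps similarity measures such as SSIM close to 1.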
String matching is seen as one of the essential problems in computer science, and a variety of computer applications provide string matching services to their end users. The remarkable growth in the amount of data created and stored by modern computational devices drives researchers to seek ever more powerful methods for coping with this problem. In this research, the Quick Search string matching algorithm is adapted to run in a multi-core environment using OpenMP directives, which can be employed to reduce the overall execution time of the program. English text, protein, and DNA data types are used to examine the effect of parallelizing the Quick Search string matching algorithm on multi-co
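For reference, here is a sequential sketch of the Quick Search (Sunday) algorithm the abstract builds on. The OpenMP version would typically split the text into overlapping chunks processed by separate threads; that parallel layer is not reproduced here.

```python
def quick_search(text, pattern):
    """Return all start indices of pattern in text (Sunday's Quick Search)."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    # shift[c] = distance from the last occurrence of c in the pattern
    # to the position just past the window; unseen characters shift m + 1.
    shift = {ch: m - i for i, ch in enumerate(pattern)}
    hits, pos = [], 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            hits.append(pos)
        # The character just AFTER the current window decides the next shift.
        nxt = pos + m
        pos += shift.get(text[nxt], m + 1) if nxt < n else m + 1
    return hits

print(quick_search("GCATCGCAGAGAGTATACAGTACG", "GCAGAGAG"))  # -> [5]
```

The large shifts taken when the next character is absent from the pattern (m + 1 positions at once) are what make Quick Search fast in practice on natural-language, protein, and DNA alphabets.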
Combating the COVID-19 epidemic has emerged as one of the most pressing healthcare challenges the world has ever seen. COVID-19 cases must be diagnosed accurately and quickly so that patients receive proper medical treatment and the pandemic is contained. Chest radiography imaging approaches have proven more successful in detecting coronavirus than the RT-PCR approach. Transfer learning is well suited to categorizing patterns in medical images, since the number of available medical images is limited. This paper illustrates a hybrid convolutional neural network (CNN) and recurrent neural network (RNN) architecture for the diagnosis of COVID-19 from chest X-rays. The deep transfer models used were VGG19, DenseNet121
In this study, gold nanoparticles were synthesized in a single-step biosynthetic method using an aqueous leaf extract of Thymus vulgaris L., which acts as both reducing and capping agent. The nanoparticles were characterized using UV-Visible spectroscopy, X-ray diffraction (XRD), and FTIR. The as-prepared gold nanoparticles (GNPs) showed a surface plasmon resonance centered at 550 nm. The XRD pattern showed four strong intense peaks, indicating the crystalline nature and face-centered cubic structure of the gold nanoparticles. The average crystallite size of the AuNPs was 14.93 nm. Field emission scanning electron microscopy (FESEM) was used to s
In aspect-based sentiment analysis (ABSA), implicit aspect extraction is a fine-grained task aimed at extracting the hidden aspects in the in-context meaning of online reviews. Previous methods have shown that handcrafted rules interpolated into a neural network architecture are promising for this task. In this work, we reduce the need for crafted rules, which must be laboriously articulated for each new training domain or text dataset, by proposing a new architecture based on multi-label neural learning. The key idea is to capture the semantic regularities of explicit and implicit aspects using word-embedding vectors and to interpolate these as a front layer in a Bidirectional Long Short-Term Memory (Bi-LSTM) network. First, we
Decision-making is the central concern of many Operations Research studies in real-life applications. However, a drawback of some of these studies is that they are restricted and do not address the nature of values in terms of imprecise data (ID). This paper therefore makes two contributions: first, decreasing the total costs by classifying subsets of costs; second, improving the optimal solution obtained by the Hungarian assignment approach. The newly proposed method is called the fuzzy sub-Triangular form (FS-TF) under ID. The results obtained are excellent compared with previous methods, including the robust ranking technique, arithmetic operations, the magnitude ranking method, and the centroid ranking method. This
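To make the underlying optimization concrete, the following sketch defuzzifies a triangular fuzzy cost matrix with a simple centroid ranking and then solves the resulting crisp assignment problem by exhaustive search. The FS-TF ranking itself is not reproduced, the centroid rule is only one common choice, and all cost values below are invented.

```python
from itertools import permutations

def rank_triangular(tfn):
    """Crisp score of a triangular fuzzy number (a, b, c) via its centroid;
    the paper's FS-TF ranking may differ from this common choice."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def solve_assignment(cost):
    """Optimal assignment by exhaustive search (fine for small matrices);
    the Hungarian method reaches the same optimum in O(n^3)."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_cost, best_perm = total, perm
    return best_perm, best_cost

# Hypothetical triangular fuzzy cost matrix: each entry is (low, mode, high).
fuzzy_cost = [[(3, 4, 5), (0, 1, 2), (2, 3, 4)],
              [(1, 2, 3), (0, 0, 0), (4, 5, 6)],
              [(2, 3, 4), (1, 2, 3), (1, 2, 3)]]

crisp = [[rank_triangular(c) for c in row] for row in fuzzy_cost]
perm, total = solve_assignment(crisp)
# perm[i] is the column (task) assigned to row (worker) i.
```

Classifying subsets of costs before assignment, as the paper proposes, would prune or re-rank entries of `crisp` before the search step.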