Multidimensional Systolic Arrays of LMS Algorithm Adaptive (FIR) Digital Filters

Multidimensional systolic-array realizations of the LMS algorithm are designed by mapping a regular algorithm onto a processor array. They are based on an appropriately selected 1-D systolic array filter that relies on a systolic implementation of the inner-product sum. Various arrays may be derived that exhibit a regular arrangement of cells (processors) and a local interconnection pattern, both of which are important for VLSI implementation. This reduces latency and increases the throughput rate in comparison with classical 1-D systolic arrays. The 3-D multilayered array consists of 2-D layers that are connected to one another only along their edges. Such arrays allow LMS-based adaptive FIR filters to address the fundamental requirement of a fast convergence rate in most adaptive filter applications.
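
The computation that these arrays map into hardware is the standard LMS recursion for an adaptive FIR filter: an inner-product sum followed by an error-driven weight update. As a point of reference only, here is a minimal sequential sketch in Python/NumPy; the signal, tap count, and step size are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lms_fir(x, d, n_taps=8, mu=0.01):
    """Adapt an FIR filter so that its output tracks the desired signal d."""
    w = np.zeros(n_taps)                    # filter weights
    y = np.zeros(len(x))                    # filter output
    e = np.zeros(len(x))                    # error signal
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # tapped delay line: [x[n], x[n-1], ...]
        y[n] = w @ u                        # inner-product sum (the systolic kernel)
        e[n] = d[n] - y[n]                  # estimation error
        w = w + mu * e[n] * u               # LMS weight update
    return y, e, w

# Example: identify an unknown 4-tap channel from its input/output signals.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)]
_, _, w = lms_fir(x, d, n_taps=4, mu=0.02)
print(np.round(w, 2))                       # should end up close to h
```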

                                                                                                              

Publication Date
Sat Jan 01 2011
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
A Study of the Effect of the Direction of Lighting Distribution on Improving Images under Different Lighting Using the Adaptive Histogram Equalization Technique

Publication Date
Tue Aug 23 2022
Journal Name
Journal Of Craniofacial Surgery
Differences Between Impression Stone Pouring and Digital Pouring in Fully Guided Dental Implant Surgery: A Prospective Clinical Study

Publication Date
Wed Jan 01 2025
Journal Name
Journal Of Cybersecurity And Information Management
A New Automated System Approach to Detect Digital Forensics using Natural Language Processing to Recommend Jobs and Courses

A resume is the first impression between you and a potential employer, so its importance can never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, NLTK and Natural Language Processing (NLP) techniques can be used to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best. Therefore, …
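
As a rough illustration of the pipeline described above (vectorize resume text, then rank or classify it with KNN), the following sketch uses scikit-learn's TF-IDF vectorizer and nearest-neighbour classifier; the sample resumes, labels, and parameters are invented, and the paper's own NLTK preprocessing is not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Tiny, invented corpus of resume snippets and their job categories.
resumes = ["python machine learning pandas",
           "java spring microservices backend",
           "seo content marketing campaigns"]
labels = ["Data Science", "Backend", "Marketing"]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(resumes)           # term-weighted feature vectors

knn = KNeighborsClassifier(n_neighbors=1)       # nearest labelled resume wins
knn.fit(X, labels)

new_resume = ["experience with python, scikit-learn and data analysis"]
print(knn.predict(vectorizer.transform(new_resume)))   # expected: ['Data Science']
```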

Publication Date
Thu Jan 20 2022
Journal Name
Webology
Hybrid Intrusion Detection System based on DNA Encoding, Teiresias Algorithm and Clustering Method

Researchers have utilized and applied various techniques for intrusion detection systems (IDS), among which DNA encoding and clustering are widely used. The two other major detection techniques are anomaly detection and misuse detection: anomaly detection is based on user behavior, while misuse detection is based on known attack signatures. However, both techniques have drawbacks, such as a high false-alarm rate. A hybrid IDS therefore combines the strengths of both techniques to overcome their limitations. In this paper, a hybrid IDS is proposed based on DNA encoding and a clustering method. The proposed DNA encoding is based on the UNSW-NB15 …
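
The sketch below is only a toy illustration of two of the ingredients named above: a base-4 "DNA" encoding of numeric feature values and a clustering step that separates records. The encoding, features, and cluster count are assumptions; the paper's UNSW-NB15-specific scheme and its Teiresias pattern-matching stage are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans

BASES = "ACGT"

def dna_encode(value, length=8):
    """Encode a non-negative integer feature as a base-4 string over A/C/G/T."""
    digits = []
    for _ in range(length):
        digits.append(BASES[value % 4])
        value //= 4
    return "".join(reversed(digits))

# Toy connection records: (duration, bytes transferred).
records = np.array([[1, 120], [2, 130], [50, 9000], [55, 8800]])
print([dna_encode(int(v)) for v in records[:, 1]])     # DNA-like signatures

# Cluster the records so that dissimilar traffic falls into different groups.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(records)
print(labels)
```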

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Natural Language Processing For Requirement Elicitation In University Using Kmeans And Meanshift Algorithm

Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional requirements engineering methods to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs in order to gain the trust of their users, such an approach is needed within a continuous software engineering process. This need gives rise to new challenges in the discipline of requirements engineering. The problem addressed in this study was discrepancies in the data, which hampered the requirements elicitation process, so that the developed software ultimately could not meet the need …
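
As a sketch of the clustering step suggested by the title, the snippet below groups short pieces of user feedback into candidate requirements with scikit-learn's KMeans and MeanShift; the feedback texts, cluster count, and bandwidth are illustrative assumptions, not the paper's data or settings.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans, MeanShift

# Invented user-feedback sentences that should collapse into two requirements.
feedback = ["the portal should email my grades",
            "please email grades to students automatically",
            "the login page is too slow",
            "logging in to the page is slow"]

X = TfidfVectorizer(stop_words="english").fit_transform(feedback).toarray()

print(KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X))
print(MeanShift(bandwidth=1.2).fit_predict(X))   # bandwidth chosen by hand for this toy data
```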

Publication Date
Sat Sep 30 2017
Journal Name
Al-khwarizmi Engineering Journal
Robot Arm Path Planning Using Modified Particle Swarm Optimization based on D* algorithm

Abstract

Much attention has been paid to the use of robot arms in various applications, so optimal path finding plays a significant role in guiding and improving arm movement. The essential function of path planning is to create a path that satisfies the aims of motion, including avoiding obstacle collisions, reducing the time interval, decreasing the path traveling cost, and satisfying the kinematic constraints. In this paper, the free Cartesian-space map of a 2-DOF arm is constructed to obtain the joint variables at each point without collision. The D* algorithm and the Euclidean distance are applied to obtain the exact and estimated distances to the goal, respectively. The modified Particle Swarm Optimization algorithm …
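
The following toy sketch shows particle swarm optimization applied to a planar 2-DOF arm: the swarm searches for joint angles that bring the end-effector to a target point. It only illustrates the PSO mechanics; the paper's D*-based distance estimates, obstacle map, and PSO modifications are not reproduced, and the link lengths and coefficients are assumptions.

```python
import numpy as np

L1, L2 = 1.0, 1.0                          # link lengths (assumed)
target = np.array([1.2, 0.8])              # reachable goal for the end-effector

def end_effector(q):
    """Forward kinematics of a planar two-link arm for joint angles q."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def cost(q):
    """Distance from the end-effector to the target (the fitness to minimize)."""
    return np.linalg.norm(end_effector(q) - target)

rng = np.random.default_rng(0)
pos = rng.uniform(-np.pi, np.pi, (30, 2))  # particle positions = joint angles
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.apply_along_axis(cost, 1, pos)
gbest = pbest[np.argmin(pbest_cost)].copy()

for _ in range(200):                       # standard global-best PSO iterations
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.apply_along_axis(cost, 1, pos)
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("joint angles:", np.round(gbest, 3), "position error:", round(cost(gbest), 6))
```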

Publication Date
Thu Jun 01 2023
Journal Name
Bulletin Of Electrical Engineering And Informatics
A missing data imputation method based on salp swarm algorithm for diabetes disease

Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes (NB), …
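
A sketch of the evaluation pipeline implied above: impute the missing values, then compare SVM, KNN, and Naïve Bayes classifiers. A simple mean imputer stands in for the proposed SSA-based imputation (ISSA), and the data here are synthetic rather than the PIDD dataset.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # synthetic labels
X[rng.random(X.shape) < 0.1] = np.nan            # knock out roughly 10% of values

X_imputed = SimpleImputer(strategy="mean").fit_transform(X)  # stand-in for ISSA

for clf in (SVC(), KNeighborsClassifier(), GaussianNB()):
    score = cross_val_score(clf, X_imputed, y, cv=5).mean()
    print(type(clf).__name__, round(score, 3))
```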

Publication Date
Sun Apr 30 2023
Journal Name
Iraqi Journal Of Science
An Evolutionary Algorithm with Gene Ontology-Aware Crossover Operator for Protein Complex Detection

Evolutionary algorithms (EAs), as global search methods, have proved to be more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters; the components are the solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components. Further, the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA …
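
Purely as a toy illustration of what an ontology-aware crossover might look like (this is not the paper's operator), the sketch below lets a child inherit each protein's cluster label from whichever parent places it in the semantically closer cluster; the similarity scores are random stand-ins for real GO semantic similarities.

```python
import random

random.seed(0)
proteins = list(range(10))
# Random stand-ins for pairwise GO semantic-similarity scores.
sim = {(i, j): random.random() for i in proteins for j in proteins}

def cluster_affinity(p, label, assignment):
    """Mean similarity of protein p to the other members of a cluster."""
    members = [q for q, l in assignment.items() if l == label and q != p]
    return sum(sim[(p, q)] for q in members) / len(members) if members else 0.0

def ontology_aware_crossover(parent_a, parent_b):
    """Child copies each protein's label from the parent giving it higher affinity."""
    child = {}
    for p in proteins:
        a_score = cluster_affinity(p, parent_a[p], parent_a)
        b_score = cluster_affinity(p, parent_b[p], parent_b)
        child[p] = parent_a[p] if a_score >= b_score else parent_b[p]
    return child

parent_a = {p: random.randint(0, 2) for p in proteins}   # cluster labels 0..2
parent_b = {p: random.randint(0, 2) for p in proteins}
print(ontology_aware_crossover(parent_a, parent_b))
```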

Publication Date
Sun Sep 07 2014
Journal Name
Baghdad Science Journal
An Algorithm for nth Order Integro-Differential Equations by Using Hermite Wavelets Functions

In this paper, the construction of Hermite wavelet functions and their operational matrix of integration is presented. The Hermite wavelets method is applied to solve nth-order Volterra integro-differential equations (VIDE) by expanding the unknown functions as series of Hermite wavelets with unknown coefficients. Finally, two examples are given.
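
In generic notation (assumed here, not taken from the paper), wavelet methods of this kind expand the unknown solution in a truncated series and replace integration by an operational matrix P:

```latex
% Truncated Hermite-wavelet expansion of the unknown solution and the
% operational matrix of integration (generic notation, assumed):
y(t) \approx \sum_{n=1}^{2^{k-1}} \sum_{m=0}^{M-1} c_{nm}\, \psi_{nm}(t) = C^{T}\Psi(t),
\qquad
\int_{0}^{t} \Psi(\tau)\, \mathrm{d}\tau \approx P\, \Psi(t)
```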

Publication Date
Sun Jan 01 2017
Journal Name
Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a higher level of secure communication, the key plays an important role. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. The encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
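
For orientation only, the sketch below shows the symmetric (Triple DES) half of such a scheme using the pycryptodome package; the paper's NTRU-based key handling is not reproduced, and the message and mode of operation are illustrative assumptions.

```python
# Requires the pycryptodome package.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

key = DES3.adjust_key_parity(get_random_bytes(24))    # three independent 8-byte keys
cipher = DES3.new(key, DES3.MODE_EAX)
ciphertext, tag = cipher.encrypt_and_digest(b"a secret message")

# Decrypting with the same key and nonce recovers (and authenticates) the message.
decipher = DES3.new(key, DES3.MODE_EAX, nonce=cipher.nonce)
print(decipher.decrypt_and_verify(ciphertext, tag))
```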
