IMPROVED STRUCTURE OF DATA ENCRYPTION STANDARD ALGORITHM

The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; security is therefore of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved structure of the DES algorithm is needed. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new method of generating two keys: one key is simple, and the other is encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds, and the encrypted key 2 from round 9 to round 16. The results show that the improved structure increases DES encryption security, performance, and search complexity compared with standard DES, so that differential cryptanalysis cannot be performed on the ciphertext.
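A minimal sketch of the two-key idea described above, not the paper's full implementation: the "improved Caesar algorithm" is not specified here, so a plain Caesar shift over hex digits stands in for it, and the key values and subkey derivation below are illustrative assumptions (real DES derives subkeys via PC-1/PC-2 and rotations).

```python
# Sketch of the two-key schedule: rounds 1-8 draw subkeys from the simple
# key 1, rounds 9-16 from the Caesar-encrypted key 2. All values illustrative.

def caesar_encrypt_hex(key_hex: str, shift: int = 3) -> str:
    """Stand-in for the paper's improved Caesar step: shift each hex digit."""
    digits = "0123456789abcdef"
    return "".join(digits[(digits.index(c) + shift) % 16] for c in key_hex.lower())

def derive_round_keys(key_hex: str) -> list[str]:
    """Illustrative 8-subkey derivation via simple rotations of the key."""
    return [key_hex[i:] + key_hex[:i] for i in range(8)]

key1 = "133457799bbcdff1"                       # simple key (example value)
key2 = caesar_encrypt_hex("0e329232ea6d0d73")   # Caesar-encrypted key

schedule = derive_round_keys(key1) + derive_round_keys(key2)
for rnd, subkey in enumerate(schedule, start=1):
    print(f"round {rnd:2d} uses subkey {subkey}")
```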

Publication Date
Sun Oct 01 2017
Journal Name
Journal Of Economics And Administrative Sciences
Assessing Service Quality Using Data Envelopment Analysis: A Case Study at the Iraqi Middle East Investment Bank

The data envelopment analysis method helps organizations improve their performance by exploiting their resources efficiently in order to improve service quality. The study addresses the Iraqi Middle East Investment Bank's need to assess the performance of its branches according to the service quality they provide; its importance thus lies in contributing a scientific, systematic method by applying data envelopment analysis to assess the service quality provided by the bank's branches. The study focused on determining the efficiency of the service quality provided by the branches in a manner that reflects the extent of utilization of…
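For concreteness, a minimal sketch of the standard input-oriented CCR model that data envelopment analysis is built on, solved with SciPy's linear programming; the branch inputs and outputs below are made-up illustrations, not the bank's figures:

```python
# Input-oriented CCR DEA: minimise theta such that a convex combination of
# all branches uses at most theta times branch k's inputs while producing
# at least branch k's outputs. Data are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 300.0], [25.0, 200.0], [40.0, 100.0]])   # inputs per branch
Y = np.array([[1000.0, 80.0], [900.0, 85.0], [1200.0, 75.0]]) # outputs per branch

def ccr_efficiency(k: int) -> float:
    """Efficiency score of branch k (1.0 = efficient)."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]               # decision vars: [theta, lambdas]
    A_in = np.c_[-X[k], X.T]                  # sum_j lambda_j x_ij <= theta x_ik
    b_in = np.zeros(X.shape[1])
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T] # sum_j lambda_j y_rj >= y_rk
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(3):
    print(f"branch {k}: efficiency = {ccr_efficiency(k):.3f}")
```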

Publication Date
Fri Aug 05 2016
Journal Name
Wireless Communications And Mobile Computing
A comparison study on node clustering techniques used in target tracking WSNs for efficient data aggregation

Wireless sensor applications are susceptible to energy constraints. Most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly; hence, deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as selection of an appropriate clustering algorithm may reflect positively on the data aggregation process.
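A toy sketch of the cluster-based aggregation pattern the surveyed approaches share (illustrative only, not from any specific scheme in the survey): member nodes report target readings to their cluster head, which forwards a single aggregated packet instead of every raw reading, cutting radio traffic.

```python
# Each cluster head averages its members' readings and sends one packet.
from collections import defaultdict
from statistics import mean

# (node_id, cluster_head_id, sensed_distance_to_target) - made-up sample data
readings = [(1, "A", 10.2), (2, "A", 10.6), (3, "A", 10.4),
            (4, "B", 22.9), (5, "B", 23.1)]

by_head: dict[str, list[float]] = defaultdict(list)
for node, head, value in readings:
    by_head[head].append(value)

for head, values in by_head.items():
    # one aggregated packet per cluster instead of len(values) packets
    print(f"head {head}: sends {mean(values):.2f} (saved {len(values) - 1} packets)")
```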

Publication Date
Fri Apr 14 2023
Journal Name
Journal Of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application dependent. This issue is the main barrier for…
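One of the standard remedies for data scarcity that surveys of this kind cover is data augmentation. A minimal sketch under that assumption, with placeholder arrays rather than any dataset from the survey:

```python
# Multiply a tiny labelled image set with simple label-preserving transforms.
import numpy as np

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Return simple label-preserving variants of one image."""
    return [image,
            np.fliplr(image),       # horizontal flip
            np.rot90(image, 1),     # 90-degree rotation
            np.rot90(image, 2)]     # 180-degree rotation

tiny_dataset = [np.arange(16).reshape(4, 4)]   # one 4x4 "image" placeholder
augmented = [variant for img in tiny_dataset for variant in augment(img)]
print(f"{len(tiny_dataset)} image -> {len(augmented)} training samples")
```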
Publication Date
Tue Dec 01 2020
Journal Name
Journal Of Economics And Administrative Sciences
Analysis of the salary structure of public sector workers for strategic planning purposes: An analytical study in the General Authority for Groundwater

The research discusses the problem of salaries in the public sector in terms of analyzing their structure and the possibility of benefiting from the information this analysis provides for strategic planning. The General Authority for Groundwater, one of the formations of the Ministry of Water Resources and centrally funded, was adopted as the field of research, with the salary structure of its (1,117) employees analyzed for the period 2014-2019 using the quantitative approach and relying on a number of statistical tools, including arithmetic means, upper limits, lower limits, and…
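A small sketch of the kind of descriptive statistics the analysis relies on (arithmetic mean, upper and lower limits); the salary figures below are invented, not the Authority's data:

```python
from statistics import mean

salaries = [450_000, 620_000, 580_000, 910_000, 735_000]  # hypothetical amounts

print(f"arithmetic mean: {mean(salaries):,.0f}")
print(f"upper limit:     {max(salaries):,}")
print(f"lower limit:     {min(salaries):,}")
```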

Publication Date
Sat May 31 2025
Journal Name
Iraqi Journal For Computers And Informatics
Discussion on techniques of data cleaning, user identification, and session identification phases of web usage mining from 2000 to 2022

The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. For pattern-discovery algorithms to remain scalable and efficient, a preprocessing step must be applied. In this study, the sequential methodologies utilized in preprocessing web server log data, with an emphasis on the sub-phases of data cleansing, user identification, and session identification, are comprehensively evaluated and meticulously examined.
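A condensed sketch of the three sub-phases on a web server log. The log lines, the IP-based user identification, and the 30-minute session timeout are common illustrative choices, not values mandated by the study:

```python
# 1) clean non-page requests, 2) identify users, 3) cut sessions on a timeout.
from datetime import datetime, timedelta

raw_log = [
    "10.0.0.1 2022-01-01T10:00:00 GET /index.html",
    "10.0.0.1 2022-01-01T10:00:01 GET /logo.png",      # asset -> cleaned out
    "10.0.0.1 2022-01-01T11:00:00 GET /prices.html",   # >30 min -> new session
    "10.0.0.2 2022-01-01T10:05:00 GET /index.html",
]

TIMEOUT = timedelta(minutes=30)
sessions: dict[str, list[list[str]]] = {}
last_seen: dict[str, datetime] = {}

for line in raw_log:
    ip, ts, _, url = line.split()
    if url.endswith((".png", ".jpg", ".css", ".js")):   # 1) data cleaning
        continue
    t = datetime.fromisoformat(ts)
    user = ip                                           # 2) user identification
    if user not in sessions or t - last_seen[user] > TIMEOUT:
        sessions.setdefault(user, []).append([])        # 3) session identification
    sessions[user][-1].append(url)
    last_seen[user] = t

print(sessions)  # {'10.0.0.1': [['/index.html'], ['/prices.html']], '10.0.0.2': [['/index.html']]}
```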

Publication Date
Mon Aug 05 2019
Journal Name
Gen. Lett. Math
Building a three-dimensional maritime transport model to find the best solution by using the heuristic algorithm

The aim of this research is to construct a three-dimensional maritime transport model for transporting non-homogeneous goods (k) by different transport modes (v) from their sources (i) to their destinations (j), determining the optimum quantities $x_{ijk}^{v}$ to be transported at the lowest possible cost $c_{ijk}^{v}$ and time $t_{ijk}^{v}$ using a heuristic algorithm. Transport problems have been widely studied in computer science and operations research; they are usually used to reduce the cost or time of transporting goods from a number of sources to a number of destinations by various transport modes while meeting supply and demand conditions. Transport models are a key tool in logistics and…
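Under the index convention above, the classical three-index formulation such a model extends reads as follows; this is a sketch, and the paper's exact constraint set may differ:

```latex
\begin{align*}
\min \quad & \sum_{i}\sum_{j}\sum_{k}\sum_{v} c_{ijk}^{v}\, x_{ijk}^{v} \\
\text{s.t.} \quad
& \sum_{j}\sum_{v} x_{ijk}^{v} \le a_{ik} && \text{(supply of good $k$ at source $i$)} \\
& \sum_{i}\sum_{v} x_{ijk}^{v} \ge b_{jk} && \text{(demand for good $k$ at destination $j$)} \\
& x_{ijk}^{v} \ge 0,
\end{align*}
```

with an analogous objective over $t_{ijk}^{v}$ for the time criterion.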

Publication Date
Fri Nov 01 2019
Journal Name
Civil Engineering Journal
Time-Cost-Quality Trade-off Model for Optimal Pile Type Selection Using Discrete Particle Swarm Optimization Algorithm

The cost of pile foundations is part of the structure's cost, and it became necessary to reduce this cost by studying the available pile types and then making a decision on the optimal pile type in terms of cost, time of production, and quality. The main objective of this study is therefore to solve the time-cost-quality trade-off (TCQT) problem by finding an optimal pile type with the target of minimizing cost and time while maximizing quality. Many pile types exist, but in this paper the researcher proposed five, one of which is non-traditional, developed a model for the problem, and then employed the particle swarm optimization (PSO) algorithm, one of the evolutionary algorithms, with…
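A toy discrete-PSO sketch for picking one of five pile types by a weighted time-cost-quality score. The pile data, weights, and PSO update rule are all invented for illustration; the paper's model is richer than this:

```python
import random

# (cost, time, quality) per pile type - hypothetical values
piles = [(100, 30, 0.70), (120, 25, 0.80), (90, 40, 0.60),
         (150, 20, 0.90), (110, 28, 0.85)]
w_cost, w_time, w_qual = 0.4, 0.3, 0.3

def score(idx: int) -> float:
    """Weighted objective: lower cost/time are better, higher quality is better."""
    cost, time, quality = piles[idx]
    return w_cost * cost / 150 + w_time * time / 40 - w_qual * quality

random.seed(1)
positions = [random.randrange(5) for _ in range(8)]   # 8 particles
pbest = positions[:]
gbest = min(pbest, key=score)

for _ in range(20):
    for p in range(len(positions)):
        r = random.random()
        # discrete "velocity": move toward personal/global best, else explore
        if r < 0.5:
            positions[p] = gbest
        elif r < 0.8:
            positions[p] = pbest[p]
        else:
            positions[p] = random.randrange(5)
        if score(positions[p]) < score(pbest[p]):
            pbest[p] = positions[p]
    gbest = min(pbest, key=score)

print("selected pile type:", gbest, "with score", round(score(gbest), 3))
```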

Publication Date
Sat Jan 12 2013
Journal Name
Pierb
RADAR SENSING FEATURING BICONICAL ANTENNA AND ENHANCED DELAY AND SUM ALGORITHM FOR EARLY-STAGE BREAST CANCER DETECTION

A biconical antenna has been developed for ultra-wideband sensing. A wide impedance bandwidth of around 115% over the 3.73-14 GHz band is achieved, which shows that the proposed antenna is a fairly sensitive sensor for microwave medical imaging applications. The sensor and instrumentation are used together with an improved version of the delay-and-sum image reconstruction algorithm on both fatty and glandular breast phantoms. The relatively new imaging set-up provides robust reconstruction of complex permittivity profiles, especially in glandular phantoms, producing results that are well matched to the geometries and composition of the tissues. Respectively, the signal-to-clutter and signal-to-mean ratios of the improved method are consistently…
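A bare-bones sketch of the plain delay-and-sum beamformer that the paper's improved version builds on: for each image pixel, time-align and sum the echo recorded at every antenna. The geometry, wave speed, and synthetic echoes below are placeholders, not the paper's measured data:

```python
import numpy as np

c = 2.0e8                     # assumed propagation speed in tissue (m/s)
fs = 50e9                     # sampling rate (Hz)
antennas = np.array([[0.0, 0.0], [0.05, 0.0], [0.10, 0.0]])   # (x, y) in metres

# synthetic recordings: one idealised echo per antenna from a scatterer
target = np.array([0.05, 0.04])
n_samples = 4000
signals = np.zeros((len(antennas), n_samples))
for a, pos in enumerate(antennas):
    delay = 2 * np.linalg.norm(target - pos) / c     # round-trip travel time
    signals[a, int(delay * fs)] = 1.0

def das_pixel(pixel: np.ndarray) -> float:
    """DAS intensity: sum each channel's sample at the pixel's round-trip delay."""
    total = 0.0
    for a, pos in enumerate(antennas):
        idx = int(2 * np.linalg.norm(pixel - pos) / c * fs)
        if idx < n_samples:
            total += signals[a, idx]
    return total

print(das_pixel(np.array([0.05, 0.04])))  # focus point -> coherent sum (3.0)
print(das_pixel(np.array([0.02, 0.07])))  # off-target -> near zero
```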

Publication Date
Fri May 17 2013
Journal Name
International Journal Of Computer Applications
Applied Minimized Matrix Size Algorithm on the Transformed Images by DCT and DWT used for Image Compression

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Dynamic Robust Bootstrap (DRBLTS) and Weighted Bootstrap with Probability (WBP) algorithms to estimate robust regression parameters using the Bootstrap technique (comparative study)

Bootstrap is an important re-sampling technique that has received researchers' attention recently. The presence of outliers in the original data set may cause a serious problem for the classical bootstrap when the percentage of outliers in a resample is higher than in the original data. Many methods have been proposed to overcome this problem, such as the Dynamic Robust Bootstrap for LTS (DRBLTS) and the Weighted Bootstrap with Probability (WBP). This paper tries to show the accuracy of parameter estimation by comparing the results of both methods, considering the bias, MSE, and RMSE. The criterion of accuracy is based on the RMSE value, since the method that provides a smaller RMSE value than the other is considered the more accurate one.
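A classical-bootstrap sketch for a regression slope, with the bias/MSE/RMSE criteria the comparison uses. The data are simulated here, and the DRBLTS and WBP variants themselves are beyond this illustration:

```python
import random

random.seed(0)
true_slope = 2.0
x = [i / 10 for i in range(50)]
y = [true_slope * xi + random.gauss(0, 0.5) for xi in x]

def ols_slope(xs: list[float], ys: list[float]) -> float:
    """Ordinary least-squares slope estimate."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

B = 500
estimates = []
for _ in range(B):
    idx = [random.randrange(len(x)) for _ in range(len(x))]  # resample w/ replacement
    estimates.append(ols_slope([x[i] for i in idx], [y[i] for i in idx]))

mean_est = sum(estimates) / B
bias = mean_est - true_slope
mse = sum((e - true_slope) ** 2 for e in estimates) / B
print(f"bias={bias:.4f}  MSE={mse:.4f}  RMSE={mse ** 0.5:.4f}")
```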
