Portable devices such as smartphones, tablet PCs, and PDAs are a useful combination of hardware and software aimed at mobile workers. While they allow users to review documents, communicate via electronic mail, and manage appointments and meetings, they usually lack a variety of essential security features. To address concerns about sensitive data, many individuals and organizations, aware of the associated threats, mitigate them by strengthening user authentication, encrypting content, protecting against malware, and deploying firewalls and intrusion prevention systems. However, no standards have yet been developed to determine whether such mobile data management systems adequately provide the fundamental security functions demanded by organizations, or whether those functions have been securely developed. This paper therefore proposes a security framework for mobile data that combines core security mechanisms to avoid these problems and protect sensitive information without spending time and money deploying several new applications.
Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a unique aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where the data is stored, they must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak
This paper discusses estimating the two scale parameters of the Exponential-Rayleigh distribution for singly Type-I censored data, one of the most important right-censoring schemes, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods. The method relies on an iterative procedure such as Newton-Raphson to find estimates of the two scale parameters, applied to real COVID-19 data obtained from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The study covered the interval 4/5/2020 to 31/8/2020, equivalent to 120 days, during which the number of patients who entered the hospital gave a sample size of (n=785). The number o
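The Newton-Raphson iteration mentioned above can be sketched as follows. Since the paper's exact two-parameter Exponential-Rayleigh log-likelihood is not reproduced here, the sketch applies the same scheme to a simpler, hypothetical case: the rate parameter of an exponential model under Type-I right censoring, purely as an illustration of the iterative procedure.

```python
import numpy as np

def censored_exp_mle(times, observed, lam0=None, tol=1e-10, max_iter=100):
    """Newton-Raphson MLE of an exponential rate under Type-I right
    censoring (illustrative stand-in for the paper's two-parameter
    Exponential-Rayleigh likelihood). `times` holds follow-up times;
    `observed` is 1 for an event, 0 for a censored case."""
    d = observed.sum()            # number of uncensored events
    T = times.sum()               # total time at risk
    lam = 1.0 / times.mean() if lam0 is None else lam0
    for _ in range(max_iter):
        score = d / lam - T       # first derivative of the log-likelihood
        hess = -d / lam**2        # second derivative
        step = score / hess
        lam -= step               # Newton-Raphson update
        if abs(step) < tol:
            break
    return lam
```

For this toy model the estimate has the closed form d/T, which makes the iteration easy to check; in the two-parameter case of the paper, the scalar score and Hessian become a gradient vector and a 2x2 Hessian matrix, but the update rule is the same.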
A novel design and implementation of a cognitive methodology for on-line auto-tuning of a robust PID controller in a real heating system is presented in this paper. The aim of the proposed work is to construct a cognitive control methodology that delivers an optimal control signal to the heating system and achieves the following objectives: fast and precise search efficiency in finding the on-line optimal PID controller parameters, so as to obtain the optimal output temperature response of the heating system. The cognitive methodology (CM) consists of three engines: a breeding engine based on the Routh-Hurwitz stability criterion, a search engine based on particle swarm optimization (PSO), and an aggregation knowledge engine based on a cultural algorithm (CA).
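The PSO search engine described above can be sketched in a few lines. The cost function below is a hypothetical quadratic stand-in: in the paper it would be a performance index (e.g. integral squared error) of the simulated heating-system step response, and the "ideal" gains (2.0, 0.5, 0.1) are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(gains):
    """Stand-in objective: distance from hypothetical ideal PID gains."""
    kp, ki, kd = gains
    return (kp - 2.0)**2 + (ki - 0.5)**2 + (kd - 0.1)**2

def pso(n_particles=20, n_iter=60, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over (Kp, Ki, Kd)."""
    pos = rng.uniform(0, 5, (n_particles, 3))   # candidate gain triples
    vel = np.zeros_like(pos)
    pbest = pos.copy()                          # per-particle best positions
    pbest_val = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()    # swarm-wide best
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([cost(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest
```

In the full methodology, candidate gain sets would first pass the Routh-Hurwitz breeding engine (discarding destabilizing gains) before the swarm evaluates them.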
This growing interest of the specialized international scientific commissions is due to the role the audit committee can play, as one of the tools of corporate governance, in increasing the accuracy and transparency of the financial information disclosed by companies: through its oversight of the process of preparing financial reports, its supervision of the internal audit function within companies and support for its independence, and its coordination between the internal control units and the external auditor (represented by the Board of Supreme Audit) to clear observations and irregularities and thereby reduce cases of fraud.
This research was built on an applied sample of audit committee work
It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open-source geospatial data is discussed, highlight
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these vital data centers also face heightened vulnerability to hacking because they are convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigating the threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the
The research aims to measure the impact of knowledge management processes, individually and in total, on innovative marketing.
The research starts from a problem expressed in a number of intellectual and practical questions. It was applied in the General Company for Vegetable Oil Industry on a sample of (63) respondents (the Director General, the Deputy Director General, department directors, and division heads). A measuring instrument was designed to collect the necessary data, and the statistical tools used were the percentage, the arithmetic mean, the standard deviation, the coefficient of variation, the simple correlation coefficient, and model
With the revolutionary expansion of the Internet, worldwide information growth increases the application of communication technology, and the rapid growth of significant data volumes raises the need for secure, robust, and trustworthy techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with the circular queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each, which produces eight different outputs for
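The idea of driving a substitution cipher from a circular queue of keys can be sketched as follows. The paper's exact homophonic/polyalphabetic combination and eight-output scheme are not specified here, so this toy shows only the core mechanism: a polyalphabetic shift cipher whose four keys (invented for illustration) rotate through a circular queue, one key per character.

```python
from collections import deque

ALPHA = "abcdefghijklmnopqrstuvwxyz"

def encrypt(plaintext, keys):
    """Polyalphabetic substitution driven by a circular queue of keys:
    each letter is shifted by the key at the queue head, then the queue
    rotates so the next letter uses the next key."""
    queue = deque(keys)
    out = []
    for ch in plaintext.lower():
        if ch in ALPHA:
            shift = queue[0]
            out.append(ALPHA[(ALPHA.index(ch) + shift) % 26])
            queue.rotate(-1)          # advance the circular queue
        else:
            out.append(ch)            # pass non-letters through unchanged
    return "".join(out)

def decrypt(ciphertext, keys):
    """Decryption reuses encrypt with the shifts negated."""
    return encrypt(ciphertext, [-k for k in keys])
```

Because non-letters do not advance the queue, encryption and decryption stay aligned on mixed text, and swapping in a different key queue yields a different output stream from the same plaintext.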
This paper presents a hybrid approach for solving the null values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called learning data, is used to find the decision rule sets that are then used to solve the incomplete data problem. The intelligent swarm algorithm is used for feature selection; it combines the bees algorithm, as a heuristic search, with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. A comparison is made between the two approaches in their performance for null value estima
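The rough-set evaluation function used in such feature selection can be sketched as follows. The sketch computes the classical dependency degree of a decision attribute on a feature subset; for brevity the bees-algorithm search is replaced here by an exhaustive scan over subsets, and the tiny decision table is invented purely for illustration.

```python
from itertools import combinations

def dependency(rows, features, decision):
    """Rough-set dependency degree: fraction of rows whose equivalence
    class (under the chosen features) maps to a single decision value."""
    classes = {}
    for row in rows:
        key = tuple(row[f] for f in features)
        classes.setdefault(key, []).append(row[decision])
    consistent = sum(len(v) for v in classes.values() if len(set(v)) == 1)
    return consistent / len(rows)

def min_reduct(rows, features, decision):
    """Exhaustive stand-in for the bees-algorithm search: return the
    smallest feature subset preserving the full dependency degree."""
    full = dependency(rows, features, decision)
    for k in range(1, len(features) + 1):
        for subset in combinations(features, k):
            if dependency(rows, list(subset), decision) == full:
                return list(subset)
    return list(features)
```

In the hybrid approach, the bees algorithm explores candidate subsets heuristically instead of exhaustively, using this dependency degree as its fitness function.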