This research examines the extent to which the Independent High Electoral Commission applies information security risk management in accordance with the international standard ISO/IEC 27005, in terms of policies, administrative and technical procedures, and the techniques used in managing information security risks. It draws on the opinions of experts in the sector who occupy relevant positions (the director general of the directorate, department heads and their deputies, project managers, heads of divisions, and those authorized to access systems and software). The importance of the research lies in giving a clear picture of information security risk management in the organization under study, because of its significant role in identifying risks and setting appropriate controls to manage or eliminate them, in providing flexibility when setting controls at work, and in gaining the confidence of stakeholders and customers that their data is protected. Compliance with these controls assures customers that the organization is a reliable supplier, raises its ability to meet tender requirements, and thus opens new business opportunities. This motivated addressing the topic by focusing on the basic requirements of the standard, studying them, and identifying the most critical problems that prevent its application in the commission under study.
The Independent High Electoral Commission / National Office in Baghdad was chosen as the research site, following a case-study and applied-research approach through field presence, observations, interviews, and access to documents and information extracted from records, in order to determine the extent of the gap between the commission's information security practice and the system prescribed by the standard, analyze the causes of the gaps, and develop solutions. The research relied on the checklists prepared by the International Organization for Standardization. For data analysis, a seven-point (heptagonal) scale was used in the checklists to measure how closely actual implementation and documentation conform to the requirements of the standard, with weights assigned to the answers by allocating a specific weight to each point of the scale. Two statistical measures, the percentage and the weighted mean, were used to express the extent of application and documentation of the standard's clauses, and the main reasons behind the emergence of the gaps were identified. The results revealed several reasons that prevented the application of information security risk management, in light of which treatments were developed to reduce the gaps that appeared. The most important findings are that the commission has not adopted a clear, documented strategy to address risks, and that its information security risk management is ineffective and does not fully secure information against external and internal threats.
Attention was also given to documenting fixed and portable hardware: the computers used at the directorate's headquarters, the servers and small computers used as workstations in divisions and departments and their connection to senior management, as well as laptops and personal digital assistants. This revealed a gap attributed to the completely undocumented application for hardware (automatic data-processing equipment, processing accessories, and electronic media), while the application was partial and undocumented for other electronic media, including disk drives, printers, paper, and documents.
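The two statistics the abstract names, the weighted mean over the seven-point scale and the percentage of conformance, can be sketched as a small computation. The weights and answer counts below are hypothetical, since the abstract does not give the actual values used in the checklists:

```python
def weighted_mean(counts, weights):
    """Weighted mean of checklist answers.

    counts[i]  -- number of answers falling in scale category i
    weights[i] -- weight allocated to scale category i
    """
    total = sum(counts)
    if total == 0:
        raise ValueError("no answers recorded")
    return sum(c * w for c, w in zip(counts, weights)) / total


def application_percentage(mean, max_weight=7):
    """Express the weighted mean as a percentage of full conformance."""
    return 100.0 * mean / max_weight


# Hypothetical tally of 10 answers across the seven scale categories.
weights = [1, 2, 3, 4, 5, 6, 7]   # assumed: 1 = not applied, 7 = fully applied and documented
counts  = [2, 1, 0, 3, 2, 1, 1]

m = weighted_mean(counts, weights)   # (2 + 2 + 0 + 12 + 10 + 6 + 7) / 10 = 3.9
p = application_percentage(m)        # 3.9 / 7 * 100, roughly 55.7 %
```

A clause scoring below some agreed threshold would then be flagged as a gap requiring treatment; the threshold itself is a judgment call the standard leaves to the assessors.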
The key objective of the study is to understand the best processes that are currently used in managing talent in Australian higher education (AHE) and design a quantitative measurement of talent management processes (TMPs) for the higher education (HE) sector.
Three qualitative multi-method approaches commonly used in empirical studies were considered, namely brainstorming, focus group discussions, and semi-structured individual interviews. Twenty
The continued acceleration of the business environment has led organizations to pay great attention to the quality applied within them, in order to meet customer needs and remain in the market for as long as possible.
The research stems from an underlying problem: concentrations of defects and waste plaguing the company. To achieve the goal of the study, the level of quality applied in the vessels and reservoirs factory of the General Company for Heavy Engineering Equipment was assessed, wastage rates occurring in the production process were calculated, and the relationship between the quality level and the defect ratios in each type of waste was examined, using quantitative measures.
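A relationship between quality level and defect ratios of the kind described above is conventionally quantified with a correlation coefficient. As an illustration only (the abstract does not state which statistic the study used, and the sample data below are invented), a minimal Pearson correlation:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples.

    Returns a value in [-1, 1]; values near -1 here would indicate that
    higher quality levels coincide with lower defect ratios.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)


# Hypothetical monthly data: quality score vs. defect ratio (%).
quality = [62, 70, 75, 81, 88]
defects = [9.1, 7.4, 6.8, 5.0, 3.9]

r = pearson_r(quality, defects)   # strongly negative for this invented sample
```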
The primary goal of in-situ load testing is to evaluate the safety and performance of a structural system under particular loading conditions. Advancements in building techniques, analytical tools, and monitoring instruments are prompting the evaluation of the appropriate loading value, loading process, and examination criteria. The procedure for testing reinforced concrete (RC) structures on-site, as outlined in the ACI Building Code, involves conducting a 24-hour load test and applying specific evaluation criteria. This article details a retrofitting project for an RC slab-beam system using carbon fiber-reinforced polymer (CFRP) sheets to strengthen the structure following a fire incident. The RC structure showed indicators of deterioration.
The study investigates the water quality of the Orontes River, one of the important water resources in Syria, as it is used for drinking, irrigation, swimming, and industrial needs. A database of 660 measurements of 13 parameter concentrations was used, taken from 11 monitoring points distributed along the Orontes River over a period of five years (2015-2019). To study the correlation between parameters and their impact on water quality, statistical analysis was performed using the SPSS program. Cluster analysis was applied to classify the pollution areas along the river, yielding two groups (low pollution and high pollution), with the areas classified according to the sources of pollution.
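The study performed its cluster analysis in SPSS. Purely as an illustration of the two-group idea (not the study's actual method or data), a minimal k-means with k = 2 on a hypothetical one-dimensional pollution index per monitoring point:

```python
def two_means_1d(values, iters=20):
    """Split 1-D values into two clusters (k-means, k = 2).

    Returns (low_group, high_group); centres are seeded at the
    minimum and maximum so group 0 is the low-pollution cluster.
    """
    centres = [min(values), max(values)]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            i = 0 if abs(v - centres[0]) <= abs(v - centres[1]) else 1
            groups[i].append(v)
        # Recompute each centre as the mean of its group.
        centres = [sum(g) / len(g) if g else centres[i]
                   for i, g in enumerate(groups)]
    return groups


# Hypothetical pollution index for six monitoring points.
index = [0.8, 1.0, 1.1, 4.9, 5.2, 6.0]
low, high = two_means_1d(index)
```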
Background: The risk of antibiotic resistance (AR) increases due to excessive use of antibiotics, whether by health care providers or by patients.
Objective: To assess the practice of self-medication with over-the-counter (OTC) drugs and other prescription drugs and its associated risk factors.
Subjects and Methods: Study design: A descriptive study was conducted from 20 December 2019 to 8 January 2021. A pre-validated, structured questionnaire in English and Urdu was created to avoid language barriers, covering personal details, reasons, sources, and knowledge about over-the-counter drugs and antibiotics. The study sample was randomly selected.
Multilayer reservoirs are currently modeled as a single-zone system by averaging the reservoir parameters associated with each zone. However, this type of modeling is rarely accurate, because a single-zone system does not account for the fact that each zone's pressure declines independently. The pressure drop in each zone affects the total output and can result in inter-flow and the premature depletion of one of the zones. Understanding reservoir performance requires a precise estimate of each layer's permeability and skin factor. Multilayer Transient Analysis is a well-testing technique designed to determine formation properties in more than one layer, and its effectiveness over the past two decades has been demonstrated.
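The single-zone averaging the abstract criticizes is commonly a thickness-weighted mean of the layer permeabilities (the standard average for parallel, commingled flow). A sketch under that assumption, with hypothetical layer values:

```python
def thickness_weighted_permeability(k, h):
    """Equivalent single-zone permeability for parallel layers.

    k -- layer permeabilities (e.g. in millidarcies)
    h -- layer thicknesses (same length as k)

    k_avg = sum(k_i * h_i) / sum(h_i); this averaging discards the
    per-layer detail that multilayer transient analysis recovers.
    """
    return sum(ki * hi for ki, hi in zip(k, h)) / sum(h)


# Hypothetical two-layer reservoir: a thin high-permeability layer
# over a thick tight layer.
k_avg = thickness_weighted_permeability(k=[100.0, 10.0], h=[10.0, 30.0])
# (100*10 + 10*30) / 40 = 32.5 mD
```

Note how the lumped 32.5 mD value hides the fact that most of the flow comes from the thin 100 mD layer, which would deplete first, exactly the inter-flow and premature-depletion issue the abstract raises.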
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault-management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are heterogeneous (different types of devices). The proposed system reduces the network traffic load by reducing the requests and responses between server and clients, which achieves less downtime for each node when a fault occurs at the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system.
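The availability metric mentioned above is conventionally defined from mean time between failures (MTBF) and mean time to repair (MTTR). The paper's exact formulas are not given here, so the following is a sketch of the standard steady-state definition only:

```python
def availability(mtbf, mttr):
    """Steady-state availability: fraction of time a node is operational.

    mtbf -- mean time between failures
    mttr -- mean time to repair (reduced when faults are detected
            and handled automatically, as in the proposed NSFM)
    """
    return mtbf / (mtbf + mttr)


# Hypothetical node: fails every 99 hours on average, 1 hour to repair.
a = availability(mtbf=99.0, mttr=1.0)   # 0.99, i.e. 99 % availability
```

Under this definition, the claimed benefit of self-fault management shows up as a smaller MTTR, which directly raises availability.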
In the modern era, which requires the use of networks to transmit data across distances, the transport and storage of such data must be safe, and protection methods are developed to ensure data security. New schemes have been proposed that merge cryptographic principles with other systems to enhance information security. Chaotic maps are one of the interesting systems merged with cryptography for better encryption performance, and biometrics is considered an effective element in many access-security systems. In this paper, two systems, fingerprint biometrics and the chaotic logistic map, are combined in the encryption of a text message to produce a strong cipher that can withstand many types of attacks. The histogram analysis o
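A logistic-map keystream cipher of the kind described can be sketched as follows. The parameters x0 and r below are illustrative stand-ins for the secret key, and the fingerprint component of the paper's actual scheme (which would bind the key to biometric data) is omitted here:

```python
def logistic_keystream(n, x0=0.6123, r=3.99):
    """Generate n keystream bytes from the logistic map x <- r*x*(1-x).

    x0 in (0, 1) and r near 4 act as the key; r near 4 keeps the
    map in its chaotic regime, so the byte stream looks random.
    """
    ks, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return bytes(ks)


def crypt(data, x0=0.6123, r=3.99):
    """XOR the data with the keystream; the same call encrypts and decrypts."""
    ks = logistic_keystream(len(data), x0, r)
    return bytes(a ^ b for a, b in zip(data, ks))


ciphertext = crypt(b"secret message")
plaintext = crypt(ciphertext)   # XOR is its own inverse
```

Because XOR with the same keystream is self-inverse, decryption is simply re-encryption with the same key parameters; security rests entirely on keeping x0 and r secret and on the map's sensitivity to those initial conditions.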