The migration from IPv4 to IPv6 cannot be achieved in a brief period, so both protocols will coexist for a number of years. The IETF Next Generation Transition Working Group (NGtrans) has developed IPv4/IPv6 transition mechanisms. Iraqi infrastructure, including universities, companies, and institutions, still uses the IPv4 protocol only. This article highlights and discusses a required transition roadmap and extends local knowledge and practice on IPv6. It also introduces a prototype model built in Packet Tracer (a network simulator) for the design and implementation of IPv6 migration. Finally, it compares and evaluates the performance of IPv6, IPv4, and dual-stack networks in OPNET using key QoS metrics such as throughput, delay, and point-to-point utilization, with address allocation and router configuration supported by the Open Shortest Path First (OSPF) routing protocol. In addition, it compares the dual-stack mechanism with 6to4 tunneling in OPNET. The results show that the IPv6 network produces higher throughput, response time, and Ethernet delay, with little difference in packets dropped, while its TCP delay and point-to-point utilization show small values compared with dual-stack networks. The worst performance is observed with 6to4 tunneling, which produces higher delay than the other scenarios.
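The 6to4 mechanism compared above derives each site's IPv6 prefix deterministically from its public IPv4 address (RFC 3056), which is why tunnel endpoints need no manual configuration. As a minimal sketch (the function name is illustrative, not from the paper), the prefix can be computed with the standard-library `ipaddress` module:

```python
import ipaddress

def ipv4_to_6to4_prefix(ipv4: str) -> ipaddress.IPv6Network:
    """Derive the 2002::/16-based 6to4 /48 prefix for a public IPv4 address."""
    v4 = ipaddress.IPv4Address(ipv4)
    # RFC 3056: the 32-bit IPv4 address occupies bits 16..47 of the IPv6 prefix.
    prefix_int = (0x2002 << 112) | (int(v4) << 80)
    return ipaddress.IPv6Network((prefix_int, 48))

print(ipv4_to_6to4_prefix("192.0.2.1"))   # 2002:c000:201::/48
```

Because every packet must be encapsulated in IPv4 and decapsulated at the far relay, this convenience is exactly where the extra delay measured in the tunneling scenario comes from.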
Cyber-attacks keep growing, so stronger methods of protecting images are needed. This paper presents DGEN, a Dynamic Generative Encryption Network, which combines Generative Adversarial Networks with a key system that adapts to context. Unlike a fixed cipher such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design adds randomness, uses learning, and generates keys that depend on each image, which should provide strong security and flexibility while keeping computational cost low. Tests were run on several public image sets. The results show that DGEN outperforms AES, chaos-based schemes, and other GAN-based approaches, with entropy reaching 7.99 bits per pixel.
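The 7.99 bits-per-pixel figure is close to the 8.0-bit theoretical maximum for 8-bit images, meaning the ciphertext histogram is nearly uniform. A short sketch of how such an entropy score is computed (the function name is illustrative; the paper does not give its measurement code):

```python
import math
from collections import Counter

def shannon_entropy(pixels) -> float:
    """Shannon entropy in bits per pixel for 8-bit intensity values (max 8.0)."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly uniform 8-bit histogram reaches the 8.0-bit upper bound;
# well-encrypted images approach it, as in the 7.99 reported above.
uniform = [v for v in range(256) for _ in range(4)]
print(round(shannon_entropy(uniform), 2))   # 8.0
```

Values well below 8.0 would indicate residual statistical structure that an attacker could exploit.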
Psychological burnout is considered one of the dangerous phenomena that emerged in the 1970s and from which most classes of society suffer. The term is widely used with various meanings, such as emotional, mental, and physical exhaustion and chronic weakness. The present research aims to identify psychological burnout among kindergarten teachers and the significance of differences according to kindergarten type (private or governmental). The research sample contains 400 female teachers, divided into 170 governmental kindergarten teachers and 230 private kindergarten teachers. The researcher prepared an instrument to measure psychological burnout for these teachers.
Electronic Health Record (EHR) systems are used as an efficient and effective method of exchanging patients' health information with doctors and other key stakeholders in the health sector, in order to improve treatment decisions and diagnoses. As a result, questions regarding the security of sensitive user data arise. To encourage people to move their sensitive health records to cloud networks, a secure authentication and access control mechanism that protects users' data should be established. Authentication and access control schemes are essential in the protection of health data, as numerous responsibilities exist to ensure security and privacy in a network.
This paper compares denoising techniques based on a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter, applied to a grayscale image degraded by constant-power additive noise and based on statistics estimated from a local neighborhood of each pixel; a median filter, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel of the noisy input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results compare the LPG-PCA method against these filters.
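The median filter described above can be sketched in a few lines with only the standard library. This is an illustrative implementation, not the paper's code; in particular, the abstract does not specify border handling, so clipping the neighborhood at the image edge is an assumption:

```python
import statistics

def median_filter(img, m=3, n=3):
    """Apply an m-by-n median filter to a 2-D grayscale image (list of lists).
    Border pixels use the neighborhood clipped to the image (an assumption)."""
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Gather the M-by-N neighborhood around (r, c), clipped at the edges.
            neigh = [img[i][j]
                     for i in range(max(0, r - m // 2), min(rows, r + m // 2 + 1))
                     for j in range(max(0, c - n // 2), min(cols, c + n // 2 + 1))]
            out[r][c] = statistics.median(neigh)
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],   # impulse ("salt") noise at the centre pixel
         [10, 10, 10]]
print(median_filter(noisy)[1][1])   # 10 — the outlier is suppressed
```

Unlike the Wiener and Gaussian low-pass filters, the median filter replaces outliers with a rank statistic rather than a weighted average, which is why it handles impulse noise without smearing edges.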
The aim of the research is to identify the direct and indirect relationship between the population growth of cities, driven by the urbanization witnessed in the Arab region, and the urban development of city structures and their formative fabrics; the change in the planning criteria of some cities; the extent of their variation in spatial and temporal dimensions; and their relation to the standards of Western cities. These forces are changing the concept of the modern Arab city, for example through the emergence of new functional uses that alter the pattern of formal composition of its urban fabric, which is tied to its ancient morphology and distinctive human character.
The research aimed to measure the compatibility of big data with the organizational ambidexterity dimensions of the Asia Cell mobile telecommunications company in Iraq, in order to determine whether the big data triple can be adopted as an approach to achieving organizational ambidexterity.
The study adopted the descriptive analytical approach, collecting data with a questionnaire developed on a Likert scale after a comprehensive review of the literature related to the two basic study dimensions. The data were then subjected to a number of statistical treatments.