The idea of carrying out research on incomplete data arose from the circumstances of our country and the horrors of war, which led to the loss of much important data across all aspects of economic, environmental, health, and scientific life. The reasons for missingness differ: some lie beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. In this study, missing data were handled using Principal Component Analysis and self-organizing map methods, evaluated by simulation. Variables describing child health and variables affecting children's health were taken into account: breastfeeding …
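The abstract names Principal Component Analysis as one imputation method but does not give its algorithm. A common way to impute with PCA is iterative low-rank approximation: fill missing cells with column means, fit a truncated SVD, replace the missing cells with the model's reconstruction, and repeat. The sketch below is one plausible reading of such a procedure, not the authors' implementation; the function name and parameters are illustrative.

```python
import numpy as np

def pca_impute(X, n_components=2, n_iter=50, tol=1e-6):
    """Iteratively impute missing values (NaN) with a low-rank PCA model.

    Missing entries are first filled with column means; the matrix is then
    repeatedly approximated by its leading principal components and the
    missing entries replaced by that approximation until convergence.
    """
    X = X.astype(float).copy()
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[mask] = np.take(col_means, np.where(mask)[1])  # initial mean fill
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        Xc = X - mu
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        approx = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        delta = np.abs(X[mask] - approx[mask]).max()
        X[mask] = approx[mask]                       # update only missing cells
        if delta < tol:
            break
    return X
```

In a simulation study like the one described, one would delete known entries from a complete data set, impute them, and compare the imputed values against the originals.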
The presented work describes a preliminary analytic method for estimating load and pressure distributions on low-speed wings with flow-separation and wake roll-up phenomena. A higher-order vortex panel method is coupled with numerical lifting-line theory through an iterative procedure that includes models of separation and wake roll-up. The computer programs, written in FORTRAN, are stable and efficient.

The capability of the present method is investigated through a number of test cases with different wing sections (NACA 0012 and GA(W)-1) at various aspect ratios and angles of attack. The results include lift and drag curves, and lift and pressure distributions along the wing s…
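The numerical lifting-line component can be illustrated with the classical Prandtl formulation. The sketch below (Python rather than the paper's FORTRAN, assuming a 2π section lift slope and no separation or wake roll-up model) solves the monoplane equation by Fourier collocation and returns lift and induced-drag coefficients:

```python
import numpy as np

def lifting_line(b, chord, alpha_deg, alpha_L0_deg=0.0, n_terms=20):
    """Classical Prandtl lifting-line solution via Fourier collocation.

    b         : wing span
    chord     : callable theta -> local chord c(theta), theta in (0, pi)
    alpha_deg : geometric angle of attack in degrees (constant along span)
    Returns (CL, CDi) under a 2*pi section lift-curve slope.
    """
    theta = np.pi * np.arange(1, n_terms + 1) / (n_terms + 1)  # collocation points
    alpha = np.radians(alpha_deg) - np.radians(alpha_L0_deg)
    c = np.array([chord(t) for t in theta])
    n = np.arange(1, n_terms + 1)
    # Monoplane equation: sum_n A_n [2b/(pi c) sin(n th) + n sin(n th)/sin(th)] = alpha
    M = (2 * b / (np.pi * c))[:, None] * np.sin(np.outer(theta, n)) \
        + n[None, :] * np.sin(np.outer(theta, n)) / np.sin(theta)[:, None]
    A = np.linalg.solve(M, np.full(n_terms, alpha))
    # Planform area: S = integral of c dy with y = -(b/2) cos(theta)
    S = (b / 2) * np.sum(c * np.sin(theta)) * (np.pi / (n_terms + 1))
    AR = b**2 / S
    CL = np.pi * AR * A[0]                 # lift coefficient from A_1
    CDi = np.pi * AR * np.sum(n * A**2)    # induced drag from all A_n
    return CL, CDi
```

For an elliptic planform this reproduces the textbook result CL = 2πα·AR/(AR + 2), a useful sanity check before adding separation and roll-up corrections.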
In this research work, a simulator with time-domain visualizers and configurable parameters, built with a continuous-time simulation approach in Matlab R2019a, is presented for modeling and investigating the performance of optical-fiber and free-space quantum channels as part of a generic quantum key distribution system simulator. The modeled optical-fiber quantum channel supports a maximum distance of 150 km with 0.2 dB/km attenuation at λ = 1550 nm, while at λ = 900 nm and λ = 830 nm the attenuation values are 2 dB/km and 3 dB/km, respectively. The modeled free-space quantum channel is characterized by 0.1 dB/km attenuation at λ = 860 nm, also with a maximum distance of 150 km. The simulator was investigated in terms of the execution of the BB84 protocol …
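The channel figures quoted above follow from the standard decibel loss model: total loss is attenuation times length, and the surviving photon fraction is 10^(−loss/10). A minimal helper, using the abstract's own numbers:

```python
def channel_transmission(alpha_db_per_km: float, length_km: float) -> float:
    """Fraction of photons surviving a lossy quantum channel.

    Total loss in dB is alpha * L; transmittance T = 10**(-loss/10).
    """
    loss_db = alpha_db_per_km * length_km
    return 10 ** (-loss_db / 10)

# Fiber at 1550 nm: 0.2 dB/km over 150 km -> 30 dB loss -> T = 1e-3,
# i.e. one photon in a thousand reaches the receiver.
```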
3D models derived from digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models depends fundamentally on the data-processing cycle and the adopted solution tool, in addition to data quality. Agisoft PhotoScan is professional image-based 3D modelling software that seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source of precise 3D data for different applications. How reliable are these data for accurate 3D mo…
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ is challenging because of discrepancies between theoretical calculation and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ, using several filters to generate the WCPs. The second uses a variable attenuator to generate the WCPs, with µ estimated from the photons detected by the BB84 …
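The Poissonian statistics behind WCPs make both directions of the problem explicit: the probability of n photons in a pulse is P(n) = e^(−µ)µⁿ/n!, and, under the idealisation of unit detector efficiency and a lossless channel (an assumption, not the paper's setup), µ can be inverted from the fraction of non-empty pulses. A small sketch:

```python
import math

def poisson_pn(mu: float, n: int) -> float:
    """Probability that a weak coherent pulse with mean photon number mu
    contains exactly n photons (Poissonian photon-number statistics)."""
    return math.exp(-mu) * mu**n / math.factorial(n)

def estimate_mu(non_empty_fraction: float) -> float:
    """Invert P(n >= 1) = 1 - exp(-mu) to estimate mu from the observed
    fraction of pulses containing at least one photon. Assumes unit
    efficiency and no channel loss -- an idealisation only."""
    return -math.log(1.0 - non_empty_fraction)
```

For µ = 0.1, roughly 90.5 % of pulses are empty and about 9 % carry exactly one photon, which is why µ must stay small to suppress multi-photon pulses.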
In current information-technology paradigms, cloud computing is the most essential kind of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that create financial value for cloud providers. The purpose of this investigation is to assess the viability of auditing data remotely within a cloud-computing setting. The theory behind cloud computing and distributed storage systems is discussed, along with the method of remote data auditing. This research considers how to safeguard data that is outsourced and stored in cloud serv…
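The abstract does not specify the auditing scheme, but the core idea of remote data auditing can be sketched in a deliberately simplified challenge-response form: before outsourcing, the owner precomputes nonce-keyed digests of random blocks; later, each audit spends one challenge and checks the server's answer. All names here are illustrative, and this is far weaker than real provable-data-possession protocols:

```python
import hashlib
import os
import random

def precompute_challenges(blocks, n_challenges=4):
    """Owner side, run before outsourcing: keep (index, nonce, digest)
    tuples so each later audit round needs no local copy of the data."""
    challenges = []
    for _ in range(n_challenges):
        i = random.randrange(len(blocks))
        nonce = os.urandom(16)
        digest = hashlib.sha256(nonce + blocks[i]).hexdigest()
        challenges.append((i, nonce, digest))
    return challenges

def server_respond(blocks, i, nonce):
    """Cloud side: prove possession of block i under the given nonce."""
    return hashlib.sha256(nonce + blocks[i]).hexdigest()

def audit(challenge, response):
    """Owner side: a matching digest means the challenged block is intact."""
    i, nonce, digest = challenge
    return response == digest
```

The nonce prevents the server from caching answers; the random block index means even partial data loss is detected with high probability over repeated audits.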
The need for means of transmitting data confidentially and securely has become one of the most important subjects in the world of communications. The search therefore began for techniques that achieve not only the confidentiality of information sent through communication channels but also high transmission speed and minimal energy consumption; encryption based on DNA was developed to fulfil all these requirements [1]. The proposed system aims to achieve high protection of data sent over the Internet through the following objectives: 1. The message is encrypted using one of the DNA methods with a key generated by the Diffie-Hellman Ephemeral algorithm; part of this key is secret, and this makes the pro…
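The two building blocks named in objective 1 can be sketched together: an ephemeral Diffie-Hellman exchange yields a shared secret, and the key-mixed message is mapped two bits per nucleotide onto A/C/G/T. This is a toy illustration only; the prime, the XOR keystream, and the bit-to-base table are assumptions, not the paper's scheme, and the parameters are far too small for real security:

```python
import secrets

# Toy Diffie-Hellman group (largest 64-bit prime) -- illustrative, NOT secure;
# real deployments use standardised 2048-bit-plus groups.
P = 0xFFFFFFFFFFFFFFC5
G = 5

DNA = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
DNA_INV = {v: k for k, v in DNA.items()}

def dh_ephemeral_pair():
    """Fresh (private, public) pair per session -- the 'ephemeral' part."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def dh_shared(priv, other_pub):
    return pow(other_pub, priv, P)

def dna_encode(data: bytes) -> str:
    """Map every 2 bits onto one nucleotide."""
    bits = ''.join(f'{b:08b}' for b in data)
    return ''.join(DNA[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_decode(strand: str) -> bytes:
    bits = ''.join(DNA_INV[ch] for ch in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

def encrypt(msg: bytes, shared: int) -> str:
    key = shared.to_bytes(8, 'big')
    xored = bytes(m ^ key[i % len(key)] for i, m in enumerate(msg))
    return dna_encode(xored)

def decrypt(strand: str, shared: int) -> bytes:
    key = shared.to_bytes(8, 'big')
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(dna_decode(strand)))
```

Because both private exponents are discarded after the session, a later key compromise does not expose past traffic, which is the forward-secrecy argument behind the ephemeral variant.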
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to all individuals a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including sever…
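The baseline the abstract starts from, DE/current-to-rand/1/bin, has a standard form: the trial vector is v = xᵢ + K(x_r1 − xᵢ) + F(x_r2 − x_r3), followed by binomial crossover with the parent. The sketch below shows that baseline step only (the BEA-inspired clone-and-segment modifications are not specified in the abstract, so they are not attempted here); parameter defaults are illustrative:

```python
import numpy as np

def de_current_to_rand_1_bin(pop, i, F=0.8, K=0.5, CR=0.9, rng=None):
    """One DE/current-to-rand/1/bin trial vector for individual i.

    Mutation: v = x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3), with r1, r2, r3
    distinct and different from i; then binomial crossover with the parent.
    """
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
    v = pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])
    cross = rng.random(d) < CR
    cross[rng.integers(d)] = True          # guarantee at least one mutant gene
    return np.where(cross, v, pop[i])
```

In a full DE loop, the trial replaces the parent only if it scores at least as well on the objective; this greedy selection is one of the pieces the proposed DE-BEA modifies.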