The Internet provides vital communication between millions of individuals and is increasingly used as a commerce tool, so security is of high importance for protecting communications and vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES to make it more secure and immune to attacks. The improved structure was built from standard DES with a new two-key generation scheme: the key generation system produces two keys, one simple and the other encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds and the encrypted key 2 from round 9 to round 16. The results show that the improved structure increases DES encryption security, performance, and search complexity compared with standard DES, so that differential cryptanalysis cannot be performed on the ciphertext.
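The round-key selection described above can be sketched as follows. The paper's "improved Caesar algorithm" is not detailed in this abstract, so a simple per-byte Caesar shift stands in for it here; the shift amount and the example key are illustrative assumptions, not the authors' design:

```python
# Hedged sketch of the two-key round schedule: rounds 1-8 use a plain key,
# rounds 9-16 use a second key derived by a Caesar-style transformation.
# ASSUMPTION: a fixed per-byte shift stands in for the paper's improved
# Caesar algorithm, which is not specified in the abstract.

def caesar_shift(key: bytes, shift: int = 3) -> bytes:
    """Shift every key byte by a fixed amount modulo 256 (stand-in cipher)."""
    return bytes((b + shift) % 256 for b in key)

def round_key(key1: bytes, key2: bytes, round_no: int) -> bytes:
    """Select the key used in a given DES round (1-16)."""
    return key1 if round_no <= 8 else key2

key1 = bytes.fromhex("133457799bbcdff1")   # example 64-bit DES key
key2 = caesar_shift(key1)                  # "encrypted" second key

schedule = [round_key(key1, key2, r) for r in range(1, 17)]
```

The point of the sketch is only the split schedule: an attacker who recovers the round keys of the last eight rounds still faces a differently derived key in the first eight.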
The philosopher and social psychologist Erich Fromm (1900-1980), in his book "Escape from Freedom", highlighted the distinction between the "I" of the authoritarian personality and the "I" of the destructive personality based on their stance towards "the other." The former (the authoritarian self) relies on a submissive, enslaving formula in which the "I" is the master/dominator/controller/strong, while "the other" is the servant/submissive/controlled/weak, essential for perpetuating this formula. In contrast, the latter (the destructive self) relies on an annihilating, negating formula in which the "I" is existence/killer/destroyer/pe
The aim of the study was to perform a factor analysis of the Bar-On & Parker scale; the analysis yielded fourteen first-order factors, and five second-order factors were also extracted.
The scale consists of (60) items, applied to a sample of (200) students (male and female) aged (15-18) years, randomly chosen from preparatory schools. The scale showed satisfactory validity and reliability. Another aim was to know the emotional intelligence level and the statistical differences across the sex, age, and specialization variables. The results showed no statistically significant difference for the sex and specialization variables, but the difference appear
In this paper, copper oxide (CuO) thin films were prepared by the vacuum thermal evaporation method at a pressure.
In this research, titanium dioxide nanoparticles (TiO2 NPs) were prepared through the sol-gel process in an acidic medium (pH 3). The TiO2 nanoparticles were prepared from titanium trichloride (TiCl3) as a precursor with ammonium hydroxide (NH4OH) in a 1:3 ratio at 50 °C. The resulting gel was dried at 70 °C to obtain the nanocrystalline powder, which was then treated thermally at 500 °C and 700 °C. The crystalline structure, surface morphology, and particle size were studied using X-ray diffraction (XRD), atomic force microscopy (AFM), and scanning electron microscopy (SEM). The results showed the anatase phase of titanium dioxide with the average grain size
Simulations were made of the 3D flow structure and heat transfer, with and without a longitudinal riblet upstream of the leading-edge vane-endwall junction of a first-stage nozzle guide vane. The research explores the concept of weakening the secondary flows and reducing their harmful effects. The numerical investigation examined the secondary flows, velocity, and heat transfer rates by solving the governing equations (continuity, Navier-Stokes, and energy) using the well-known package FLUENT, version 12.1. The governing equations were solved for three-dimensional, incompressible, turbulent flow with an appropriate turbulence model (k-ω SST). The numerical solution was carried out for 25 mode
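In standard steady, incompressible, constant-property form (with the turbulence closure terms of the k-ω SST model omitted for brevity), the governing equations referenced above can be written as:

```latex
% Continuity (incompressible flow)
\nabla \cdot \mathbf{u} = 0
% Momentum (Navier-Stokes, constant properties)
\rho\,(\mathbf{u} \cdot \nabla)\,\mathbf{u} = -\nabla p + \mu \nabla^{2}\mathbf{u}
% Energy (temperature transport)
\rho c_p\,(\mathbf{u} \cdot \nabla)\,T = k \nabla^{2} T
```

Here \(\mathbf{u}\) is velocity, \(p\) pressure, \(T\) temperature, \(\rho\) density, \(\mu\) dynamic viscosity, \(c_p\) specific heat, and \(k\) thermal conductivity; this is the textbook form, not a reproduction of the paper's exact discretized equations.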
In this research, we analyse some indicators and their classifications related to the teaching process and the scientific level of graduate studies in the university, using analysis of variance for ranked data with repeated measurements instead of ordinary analysis of variance. We reach many conclusions about the important classifications for each indicator that affect the teaching process.
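An analysis of variance for ranked data with repeated measurements is commonly carried out as a Friedman-type test. A minimal pure-Python sketch, assuming tie-free illustrative data (not the study's data):

```python
# Minimal Friedman chi-square for k related samples (repeated measurements).
# Ties are not handled, a simplification; the data below are illustrative.

def friedman_statistic(*samples):
    """Friedman chi-square statistic for k equal-length related samples."""
    k, n = len(samples), len(samples[0])
    rank_sums = [0.0] * k
    for i in range(n):                    # rank the k values of subject i
        order = sorted(range(k), key=lambda j: samples[j][i])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3 * n * (k + 1)

# Ratings of six subjects under three hypothetical indicators.
stat = friedman_statistic([4, 5, 3, 4, 5, 4],
                          [2, 3, 2, 3, 2, 3],
                          [5, 4, 5, 5, 4, 5])
```

A large statistic (compared against a chi-square with k-1 degrees of freedom) indicates that at least one indicator differs systematically across the repeated measurements.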
The development of information systems in recent years has contributed various methods of gathering information to evaluate IS performance. The most common approach used to collect information is the survey system. This method, however, suffers one major drawback: decision makers consume considerable time transforming data from survey sheets into analytical programs. As such, this paper proposes a method called 'survey algorithm based on R programming language', or SABR, for transforming data from survey sheets inside R environments by treating the arrangement of data as a relational format. R and the relational data format provide an excellent opportunity to manage and analyse the accumulated data. Moreover, a survey syste
This paper delves into some significant performance measures (PMs) of a bulk-arrival queueing system with constant batch size b, in which the arrival rates and service rates are fuzzy parameters. The bulk-arrival queueing system deals with customers arriving in groups of constant size before individual customers enter service. This leads to a new tool, obtained with the aid of generating-function methods. The corresponding traditional bulk queueing model is made more convenient under an uncertain environment. The α-cut approach is applied with Zadeh's extension principle (ZEP) to transform the triangular membership functions (Mem. Fs) of the fuzzy queues into a family of conventional b
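The α-cut step can be sketched as follows; the triangular fuzzy rates (a, m, b) and the utilization measure ρ = batch·λ/μ are illustrative assumptions standing in for the paper's full set of performance measures:

```python
# Alpha-cut of a triangular fuzzy number (a, m, b): as alpha rises from 0
# to 1, the interval shrinks from the support [a, b] to the peak {m}.

def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def utilization_interval(lam_tri, mu_tri, batch, alpha):
    """Interval for rho = batch*lam/mu; rho rises with lam, falls with mu,
    so the interval endpoints pair low-lam/high-mu and high-lam/low-mu
    (monotone case of Zadeh's extension principle)."""
    lam_lo, lam_hi = alpha_cut(lam_tri, alpha)
    mu_lo, mu_hi = alpha_cut(mu_tri, alpha)
    return (batch * lam_lo / mu_hi, batch * lam_hi / mu_lo)

# Fuzzy arrival rate near 2, fuzzy service rate near 10, batches of size 2.
interval = utilization_interval((1, 2, 3), (8, 10, 12), 2, alpha=0.5)
```

Sweeping α over [0, 1] yields the family of conventional interval-valued queues from which the fuzzy performance measure's membership function is rebuilt.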
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
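The Downhill Simplex (Nelder-Mead) estimation step can be sketched as below. A two-parameter Weibull likelihood stands in for the four-parameter compound exponential Weibull-Poisson, whose density is not reproduced in this abstract, so the model, seed, and data are illustrative assumptions:

```python
# Derivative-free likelihood maximization via the Downhill Simplex
# (Nelder-Mead) method. ASSUMPTION: a plain two-parameter Weibull model
# stands in for the paper's four-parameter compound distribution.
import math
import random
from scipy.optimize import minimize

random.seed(0)
# Simulated lifetimes from Weibull(scale=2.0, shape=1.5).
data = [random.weibullvariate(2.0, 1.5) for _ in range(500)]

def neg_log_lik(params):
    """Negative log-likelihood of a two-parameter Weibull sample."""
    scale, shape = params
    if scale <= 0 or shape <= 0:
        return float("inf")        # keep the simplex in the valid region
    return -sum(math.log(shape / scale) + (shape - 1) * math.log(x / scale)
                - (x / scale) ** shape for x in data)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
scale_hat, shape_hat = res.x
```

Because the simplex method needs no derivatives, it tolerates likelihood surfaces distorted by contaminated data better than gradient-based maximum likelihood routines, which matches the comparison the study describes.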