The Internet provides vital communication between millions of individuals. It is also increasingly used as a tool of commerce, so security is highly important for protecting communications and vital information. Cryptographic algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), and this is the main reason for using an improved structure of the DES algorithm. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using standard DES with a new way of generating two keys: one key is simple, and the other is encrypted using an improved Caesar algorithm. The encryption algorithm uses the simple key 1 in the first 8 rounds, and from round 9 to round 16 it uses the encrypted key 2. Using the improved structure of the DES algorithm, the results of this paper show increased encryption security, performance, and key-search complexity compared with standard DES, which means that differential cryptanalysis cannot be performed on the ciphertext.
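As a rough illustration of the two-key idea described above (a minimal sketch, not the authors' implementation: the key values, the shift amount, and the helper names are assumptions, and the DES round function itself is omitted):

```python
def caesar_shift(key: bytes, shift: int = 3) -> bytes:
    """Stand-in for the 'improved Caesar' step: byte-wise shift of key 1."""
    return bytes((b + shift) % 256 for b in key)

def select_round_key(round_no: int, key1: bytes, key2: bytes) -> bytes:
    """Rounds 1-8 use the simple key 1; rounds 9-16 use the encrypted key 2."""
    return key1 if round_no <= 8 else key2

key1 = b"8bytekey"           # simple key from the key-generation system
key2 = caesar_shift(key1)    # second key, derived by the Caesar-style step

for r in range(1, 17):
    round_key = select_round_key(r, key1, key2)
    # ... the standard DES round function would consume round_key here ...
```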
Twitter data analysis is an emerging field of research that uses data collected from Twitter to address many issues, such as disaster response, sentiment analysis, and demographic studies. The success of data analysis relies on collecting data that are accurate and representative of the studied group or phenomenon. Various Twitter analysis applications rely on collecting the locations of the users sending the tweets, but this information is not always available. There have been several attempts at estimating location-based aspects of a tweet; however, there is a lack of work investigating data collection methods that focus on location. In this paper, we investigate the two methods for obtaining location-based data.
The right of the patient to know the medical risks surrounding a medical intervention is one of the most prominent rights based on the principle of "physical safety". This right has gone through several stages of development, culminating in the patient's independence in making medical decisions without relying on the doctor, on the basis of the patient's prior informed consent about his or her medical condition. We will study this development in light of the French legislation of 4 March 2002 on the rights of patients in the health system, both before and after its enactment. We will highlight the development of the patient's right to "know the medical risks surrounding medical intervention" in that legislation and its comparison with th
Utilizing the Turbo C programming language, an Earth atmosphere model from sea level to 86 km was created and used to determine atmospheric parameters in this study. Analytical derivations of these parameters are made using the balancing-forces theory and the hydrostatic equation. The effects of altitude on density, pressure, temperature, gravitational acceleration, sound speed, scale height, and molecular weight are examined. About 50% of the atmosphere's mass lies between sea level and 5.5 km. At 50 km altitude, g is equal to 9.65 m/s2, about 1.5% lower than the sea-level value of 9.8 m/s2; at 86 km altitude, g is close to 9.51 m/s2, about 3% smaller than at sea level. These resu
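For reference, two standard relations that such a model typically rests on are the hydrostatic equation and the inverse-square variation of gravity with altitude (a sketch in generic notation, not necessarily the authors' exact formulation):

\[
\frac{dP}{dh} = -\rho(h)\,g(h), \qquad g(h) = g_0\left(\frac{R_E}{R_E + h}\right)^{2},
\]

where \(g_0 \approx 9.8\ \mathrm{m/s^2}\) and \(R_E \approx 6371\ \mathrm{km}\). As a quick check, \(h = 50\ \mathrm{km}\) gives \(g \approx 9.8 \times (6371/6421)^2 \approx 9.65\ \mathrm{m/s^2}\), consistent with the value reported above.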
The present paper focuses on the nature of the different interactions between the cometary nucleus and tail and the solar wind. The dynamics of the comet impose many features that give the comet a unique behavior when entering the solar system. These features are reviewed in this paper and a few investigations are made. The calculations in this work represent the analysis and interpretation of different features of the comet, such as the dependence of perihelion and eccentricity on the gas production rate, and the dependence of the latter on the composition of the comet nucleus. The dependences of the heliocentric, bow shock, contact surface, and stand-off distances on the gas production rate for many types of comets that cover linear and n
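As an illustrative order-of-magnitude relation (generic symbols, not the paper's own expressions), the stand-off distance of the cometary obstacle can be estimated by balancing the solar-wind ram pressure against the momentum flux of the outflowing cometary gas:

\[
\rho_{sw} v_{sw}^{2} \;\sim\; \frac{Q\,m\,v_{g}}{4\pi r^{2}}
\quad\Longrightarrow\quad
r \;\sim\; \left(\frac{Q\,m\,v_{g}}{4\pi\,\rho_{sw} v_{sw}^{2}}\right)^{1/2},
\]

where \(Q\) is the gas production rate (molecules s\(^{-1}\)), \(m\) the mean molecular mass, \(v_g\) the neutral outflow speed, and \(\rho_{sw}\), \(v_{sw}\) the solar-wind mass density and speed. This balance gives the characteristic \(r \propto Q^{1/2}\) growth of the stand-off distance with production rate.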
Intrinsic viscosities have been studied for polyethylene oxide in water, a polymer with wide industrial applications. The polyethylene oxide samples had two different structures: the first was linear and covered a wide range of molecular weights (1, 3, 10, 20, 35, 99, 370, 1100, 4600, and 8000 kg/mol), and the second was branched, with molecular weights of 0.55 and 40 kg/mol.
Intrinsic viscosities and Huggins constants were determined for all the types and molecular weights mentioned above at 25°C using a capillary viscometer. The values of the Mark-Houwink parameters (K and a) were 0.0068 ml/g and 0.67, respectively, and had not previously been published for this range of molecular weights.
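For reference, these quantities enter the standard Mark-Houwink and Huggins relations (generic notation):

\[
[\eta] = K M^{a}, \qquad \frac{\eta_{sp}}{c} = [\eta] + k_H [\eta]^{2} c,
\]

where \([\eta]\) is the intrinsic viscosity, \(M\) the molar mass, \(c\) the polymer concentration, \(\eta_{sp}\) the specific viscosity, and \(k_H\) the Huggins constant obtained from the slope of \(\eta_{sp}/c\) versus \(c\).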
The phenomenon of spatial variation in levels of economic, social, and urban development is prevalent in most economic and social systems. It arises from the concentration of most activities in certain regions and their scarcity in others, which has led to the emergence of a sharp contrast between the most developed and the least developed areas within the same region or within the regions of the same country.
Reducing these variations, in addition to developing areas by pursuing and relying on effective regional development, makes it possible to reduce unemployment as well as to stop unplanned migration of the population,
and the ideal use of available
Credit risk assessment has become an important topic in financial risk administration, and fuzzy clustering analysis has been applied to credit scoring. The Gustafson-Kessel (GK) algorithm has been utilised to cluster creditworthy customers as against non-creditworthy ones. A good clustering analysis requires well-selected initial cluster centres. To overcome this problem of the Gustafson-Kessel (GK) algorithm, we propose a modified version of the Kohonen Network (KN) algorithm to select the initial centres. By utilising the degree of similarity between points to obtain a similarity density and then selecting the maximum-density points, the modified Kohonen Network method generates initial cluster centres that give more reasonable clustering results.
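One plausible reading of the similarity-density idea sketched above, as a minimal illustration rather than the authors' exact modified Kohonen Network procedure (the Gaussian similarity kernel and the penalty rule are assumptions):

```python
import numpy as np

def density_based_centres(X, n_clusters, sigma=1.0):
    """Pick initial cluster centres as high-density points that are far apart.

    Illustrative only: the Gaussian similarity and the penalty rule are
    assumptions, not the paper's exact procedure.
    """
    # Pairwise Gaussian similarity between all points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    sim = np.exp(-d2 / (2 * sigma ** 2))
    density = sim.sum(axis=1)              # similarity density of each point

    centres = [int(np.argmax(density))]    # densest point becomes the first centre
    for _ in range(n_clusters - 1):
        # Down-weight points that already resemble a chosen centre,
        # then take the highest-scoring remaining point.
        penalty = sim[:, centres].max(axis=1)
        score = density * (1.0 - penalty)
        score[centres] = -np.inf
        centres.append(int(np.argmax(score)))
    return X[centres]

# Toy usage: two well-separated groups of applicants described by two features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
print(density_based_centres(X, n_clusters=2))
```

The returned centres could then seed the GK fuzzy clustering step in place of a random initialisation.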
Copula modeling is widely used in modern statistics. The boundary bias problem is one of the problems faced in nonparametric estimation, and kernel estimators are the most common nonparametric estimators. In this paper, the copula density function was estimated using the probit-transformation nonparametric method in order to remove the boundary bias problem that kernel estimators suffer from. A simulation study compared three nonparametric methods for estimating the copula density function, and we proposed a new method that performed better than the other methods across five types of copulas, with different sample sizes, different levels of correlation between the copula variables, and different parameters for the function. The
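A minimal sketch of a probit-transformation kernel estimator of a copula density, in one common formulation (not necessarily the exact estimator used in the paper): the pseudo-observations are mapped to the normal scale with \(\Phi^{-1}\), a Gaussian kernel density estimate is formed there, and the normal-density Jacobian maps the estimate back to the unit square.

```python
import numpy as np
from scipy import stats

def probit_copula_density(u, v, U, V):
    """Probit-transformation kernel estimate of a copula density at (u, v).

    U, V are pseudo-observations in (0, 1). Sketch of one common formulation,
    not necessarily the paper's estimator.
    """
    # Transform the data and the evaluation point to the normal (unbounded) scale.
    S = np.column_stack([stats.norm.ppf(U), stats.norm.ppf(V)])
    s = np.array([stats.norm.ppf(u), stats.norm.ppf(v)])

    # Ordinary Gaussian KDE on the transformed data -- no boundary to bias it.
    kde = stats.gaussian_kde(S.T)
    f_hat = kde(s.reshape(2, 1))[0]

    # Jacobian of the probit transform maps the estimate back to [0, 1]^2.
    return f_hat / (stats.norm.pdf(s[0]) * stats.norm.pdf(s[1]))

# Usage with simulated Gaussian-copula data and rank-based pseudo-observations.
rng = np.random.default_rng(1)
Z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
U = stats.rankdata(Z[:, 0]) / (len(Z) + 1)
V = stats.rankdata(Z[:, 1]) / (len(Z) + 1)
print(probit_copula_density(0.5, 0.5, U, V))
```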
The need for steganography methods that hide a secret message inside an image has been rising. Thereby, this study aims to develop a practical steganography procedure for hiding text inside an image. The user provides the system with both the text and a cover image, and obtains a resulting image that contains the hidden text. The suggested technique hides the text inside the header formats of a digital image. The Least Significant Bit (LSB) method is used to hide the message or text, in order to keep the features and characteristics of the original image. A new method is applied using the whole image (header formats) to hide the text. From the experimental results, the suggested technique gives a higher embe
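As a generic illustration of LSB embedding (a minimal sketch of the general technique, not the authors' header-format scheme; the function names and the flat grayscale cover are assumptions):

```python
import numpy as np

def lsb_embed(pixels: np.ndarray, message: bytes) -> np.ndarray:
    """Embed message bits into the least significant bits of a flat uint8 array."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    if bits.size > pixels.size:
        raise ValueError("cover image too small for this message")
    stego = pixels.copy()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
    return stego

def lsb_extract(pixels: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of message from the least significant bits."""
    bits = pixels[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Usage on a fake 8-bit grayscale "image".
cover = np.random.default_rng(0).integers(0, 256, size=64 * 64, dtype=np.uint8)
stego = lsb_embed(cover, b"hidden text")
print(lsb_extract(stego, len(b"hidden text")))   # b'hidden text'
```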
A modified version of the generalized standard addition method (GSAM) was developed. This modified version was used for the quantitative determination of arginine (Arg) and glycine (Gly) in an arginine acetyl salicylate-glycine complex. According to this method, two linear equations are solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the coloured complexes of Arg and Gly with ninhydrin. The second equation was obtained by measuring the total acid consumed by the total amino groups of Arg and Gly; the titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed metho
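In schematic form (generic symbols; the sensitivity coefficients below are placeholders, not values from the paper), the two measurements give a pair of linear equations in the unknown amounts \(C_{\mathrm{Arg}}\) and \(C_{\mathrm{Gly}}\):

\[
A_{\mathrm{tot}} = k_{A,\mathrm{Arg}}\, C_{\mathrm{Arg}} + k_{A,\mathrm{Gly}}\, C_{\mathrm{Gly}}, \qquad
n_{\mathrm{HClO_4}} = k_{T,\mathrm{Arg}}\, C_{\mathrm{Arg}} + k_{T,\mathrm{Gly}}\, C_{\mathrm{Gly}},
\]

where \(A_{\mathrm{tot}}\) is the total absorbance of the ninhydrin complexes, \(n_{\mathrm{HClO_4}}\) the total perchloric acid consumed in the non-aqueous titration, and the \(k\) coefficients the corresponding sensitivities of each amino acid; solving the two equations simultaneously yields \(C_{\mathrm{Arg}}\) and \(C_{\mathrm{Gly}}\).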