The Internet provides vital communications between millions of individuals and is increasingly used as a tool of commerce; security is therefore essential for protecting communications and vital information, and cryptographic algorithms are central to that security. Brute-force attacks are the major threat against the Data Encryption Standard (DES), and this is the main reason an improved structure of the DES algorithm is needed. This paper proposes a new, improved structure for DES to make it secure and immune to such attacks. The improved structure was accomplished using standard DES with a new method of two-key generation
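To illustrate why brute force is the principal threat to standard DES, here is a back-of-the-envelope sketch of the 56-bit effective key space; the attacker speed is a hypothetical figure for illustration, not a value from the paper:

```python
# DES uses a 64-bit key, but only 56 bits are effective,
# so an exhaustive attacker must try at most 2**56 keys.
keyspace = 2 ** 56
print(keyspace)  # 72057594037927936 candidate keys

# Assuming a hypothetical rate of 10**9 trial decryptions per second,
# the worst-case exhaustive search takes on the order of a couple of years.
seconds = keyspace / 10**9
years = seconds / (3600 * 24 * 365)
print(round(years, 1))  # 2.3
```

Dedicated hardware searches far faster than this single-machine assumption, which is why the abstract treats brute force as a practical rather than theoretical attack on DES.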
This study includes the fabrication of four ternary alloys, S60Se40−xPbx, with weight ratios x = 0, 10, 20, and 30, by the melting-point method. The components of each alloy were mixed separately, placed in quartz ampoules, and evacuated to a vacuum better than 10−4 Torr. Each ampoule was heated in two stages to avoid sudden dissipation and precipitation of selenium on the inner walls of the quartz tube: it was gradually heated and kept at 450 °C for approximately 4 hours, followed by 950 °C for 10 hours. The temperature of the electric furnace was raised at a rate of 10 degrees Celsius
Prediction of daily rainfall is important for flood forecasting, reservoir operation, and many other hydrological applications. Artificial intelligence (AI) algorithms are generally used for stochastic rainfall forecasting, but they are not capable of simulating unseen extreme rainfall events, which have become common due to climate change. In this study, a new model is developed for predicting daily rainfall at different lead times based on sea level pressure (SLP), which is physically related to rainfall on land and is thus able to predict unseen rainfall events. Daily rainfall on the east coast of Peninsular Malaysia (PM) was predicted using SLP data over the climate domain. Five advanced AI algorithms, such as extreme learning machine (ELM), Bay
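As a rough illustration of one of the algorithms named above, here is a minimal extreme learning machine (ELM) regressor in NumPy: the hidden-layer weights are random and fixed, and only the output weights are solved in closed form by least squares. The toy data stands in for SLP predictors and is not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy regression data (stand-in for SLP predictors -> daily rainfall)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# ELM: random fixed hidden layer, closed-form output weights
n_hidden = 50
W = rng.normal(size=(5, n_hidden))   # random input weights (never trained)
b = rng.normal(size=n_hidden)        # random biases (never trained)
H = np.tanh(X @ W + b)               # hidden-layer activations

# output weights via least squares -- the only "training" step in an ELM
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

y_hat = H @ beta
print(np.mean((y - y_hat) ** 2))     # training mean squared error
```

Because training reduces to one linear solve, ELMs are far faster to fit than backpropagation-trained networks, which is part of their appeal for hydrological forecasting.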
This paper focuses on the existing relation between 'logic' and 'meaning', and on how 'meaning' is viewed from a logical perspective. It also adopts simple logical symbols to represent some aspects of meaning.
Since meaning is still regarded as a thorny area that needs further study to determine its nature and borderline, this paper proposes resorting to logic and logical rules. It points out how logical rules are used and how they clarify some oblique sentences. The paper also sheds light on how meaningful sentences are logically symbolized and how logic can define the borderline of meaning in an adequate manner. This paper hypothesizes that logic, l
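As a toy illustration of symbolizing sentence meaning with logical notation (the example sentence and code are illustrative, not drawn from the paper), the truth table for material implication 'p → q' — the standard symbolization of conditionals such as "if it rains, the ground is wet" — can be enumerated directly:

```python
from itertools import product

# material implication: "p -> q" is false only when p is true and q is false
for p, q in product([True, False], repeat=2):
    implies = (not p) or q
    print(f"p={p!s:5} q={q!s:5}  p->q={implies}")
```

Tabulating the connective this way makes explicit why an oblique conditional sentence is counted true whenever its antecedent is false, a point natural-language intuition often obscures.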
This study aims at investigating the partial Islamic rules on preparing and distributing cartoons in order to issue an overall Islamic rule. To this end, descriptive and analytical approaches are adopted to clarify the nature of cartoons and other related concepts. With reference to verses of the Holy Quran, the Prophetic tradition (Hadith), and Islamic jurists, the researcher also adopts a deductive approach to issue Islamic rules related to the cartoon industry and its distribution.
The study consists of three sections. The first section addresses the definition of animation and related terminology. The second section covers the origin and history of cartoons and their negative and positive effects. The third section presents the Islamic rules related
Wireless sensor applications are subject to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly, so deployment of effective data aggregation schemes is vital to eliminate this redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, since the selection of an appropriate clustering algorithm may yield positive results in the data aggregation.
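A minimal sketch of the clustering-plus-aggregation idea described above (the cluster-head names and sensor readings are made up for illustration): each cluster head reduces its members' redundant samples to a single value and forwards one packet instead of one per node:

```python
# member-node samples grouped under their (hypothetical) cluster heads
readings = {
    "head_A": [21.0, 21.5, 21.1, 21.4],
    "head_B": [19.8, 20.1, 19.9, 20.2],
}

def aggregate(samples):
    """Cluster head collapses its members' samples into one mean value."""
    return sum(samples) / len(samples)

# one aggregated packet per cluster instead of one packet per node
summary = {head: aggregate(s) for head, s in readings.items()}
print(summary)
```

With eight raw readings reduced to two transmitted values, the radio cost of reporting drops by a factor of four here; the surveyed schemes differ mainly in how clusters are formed and which aggregation function the head applies.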
Data scarcity is a major challenge when training deep learning (DL) models, which demand a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, typically involving human annotators with extensive background knowledge; this annotation process is costly, time-consuming, and error-prone. Every DL framework must usually be fed a significant amount of labeled data to automatically learn representations, and while more data generally yields a better DL model, performance is also application dependent. This issue is the main barrier for
We consider the outflow of water from the peak of a triangular ridge into a channel of finite depth. Solutions are computed for different flow rates and bottom angles. A numerical method is used to compute the flow from the source for small flow rates, and it is found that there is a maximum flow rate beyond which steady solutions do not appear to exist. Limiting flows are computed for each geometrical configuration. One application of this work is as a model of saline water being returned to the ocean after desalination.