In this research work, a new type of concrete based on a sulfur-melamine modifier was introduced and its various properties were studied. The concrete was prepared from the sulfur-melamine modifier and various ingredients. The new sulfur-melamine modifier was fabricated, and its formation was confirmed by IR spectroscopy and thermogravimetric (TG) analysis. The surface morphology resulting from this modifier was studied by SEM and EDS analysis. The component ratios in the concrete, the chemical and physical characteristics imparted by the sulfur-melamine modifier, the chemical and corrosion resistance of the concrete, its stability against water absorption and against freezing, its physical and mechanical properties and durability, the modulus of elasticity, and the thermal expansion coefficient of the studied sulfur concrete were investigated. The IR results confirmed the amino functional groups (attached to the melamine ring) and the formation of polymeric sulfur chains. The thermal mass loss of the sulfur-melamine modifier occurred in a single step, and the mass-loss processes were endothermic. The SEM results revealed that the sulfur-melamine modifier had a porous structure without any crystalline forms. EDS analysis showed that nitrogen accounted for 51.33% of the total mass, while carbon accounted for 30.94%. The stability of the sulfur-melamine modifier-based concrete was very high in various aggressive solutions. Concrete based on small-sized aggregates had the highest density, 2417 kg/m³, and the density decreased slowly as the aggregate size increased. The average deformation of the studied concrete was 0.0030-0.0033, confirming that its deformation performance was better than that of traditional concretes. The results also confirmed that the thermal expansion coefficient of the sulfur-melamine modified concrete was 17.2×10⁻⁶/°C.
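For context, a minimal worked example of what the reported coefficient implies (the element length and temperature rise below are illustrative values, not taken from the study): the free linear expansion of an unrestrained member follows

```latex
\Delta L = \alpha \, L_0 \, \Delta T
         = \left(17.2\times10^{-6}\ {}^{\circ}\mathrm{C}^{-1}\right)\left(10\ \mathrm{m}\right)\left(30\ {}^{\circ}\mathrm{C}\right)
         \approx 5.2\ \mathrm{mm}
```

so a 10 m sulfur-concrete element heated by 30 °C would lengthen by roughly 5 mm if free to expand.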
A novel optimized median filter (OMF) based on the crow optimization algorithm is suggested to reduce random salt-and-pepper noise and improve the quality of RGB-color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noisy pixels and then replaces them with an optimum median value according to a fitness-maximization criterion. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute square error, and mean square error were used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation was carried out in MATLAB R2019b, and the results …
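As a rough illustration of this detect-then-replace pipeline (not the paper's implementation, which is in MATLAB and uses the crow search algorithm for detection), the sketch below flags suspect salt-and-pepper pixels in a grayscale array with a simple 0/255 intensity test, replaces them with a local median, and scores the result with PSNR; the window size `k` and the detection rule are assumptions, and an RGB image would be processed channel by channel.

```python
import numpy as np

def detect_salt_pepper(img):
    """Flag candidate noise pixels (pure black or white values).
    Stand-in for the paper's crow-search-based detection step."""
    return (img == 0) | (img == 255)

def median_replace(img, mask, k=3):
    """Replace flagged pixels with the median of a k x k neighbourhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = img.copy()
    for r, c in zip(*np.nonzero(mask)):
        out[r, c] = np.median(padded[r:r + k, c:c + k])
    return out

def psnr(clean, restored, peak=255.0):
    """Peak signal-to-noise ratio between the clean and restored images."""
    mse = np.mean((clean.astype(np.float64) - restored.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```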
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods; one of the most popular is based on something the user knows, such as a password or a Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying a living individual's physiological or behavioral attributes. Keystroke authentication is a new behavioral access-control approach that identifies legitimate users via their typing behavior. The objective of this paper is to provide user …
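The excerpt does not list the paper's exact typing features, so the sketch below shows two standard keystroke-dynamics measurements, dwell time and flight time, as an illustrative assumption of what such a system might record; the `KeyEvent` structure and the example timings are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class KeyEvent:
    key: str
    press_t: float    # key-down timestamp in seconds
    release_t: float  # key-up timestamp in seconds

def keystroke_features(events):
    """Dwell time (how long each key is held) and flight time (gap between
    releasing one key and pressing the next) -- common typing-rhythm features."""
    dwell = [e.release_t - e.press_t for e in events]
    flight = [b.press_t - a.release_t for a, b in zip(events, events[1:])]
    return dwell, flight

# Hypothetical timings for typing "abc"
events = [KeyEvent("a", 0.00, 0.09), KeyEvent("b", 0.21, 0.30), KeyEvent("c", 0.45, 0.52)]
dwell, flight = keystroke_features(events)
```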
The quality of Global Navigation Satellite Systems (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration of GNSS baselines, in terms of their number and distribution, that improves the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
FOD for optimum precision (FOD-p) was the proposed model, based on the A-optimality and E-optimality design criteria. These design criteria were selected as objective functions of precision, which …
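A minimal sketch of the two precision criteria named above, assuming the usual least-squares formulation in which the parameter cofactor matrix is Q_xx = (AᵀPA)⁻¹; the design matrix `A`, the weight matrix `P`, and the use of a pseudo-inverse for rank-deficient free networks are illustrative choices, not necessarily the paper's exact model.

```python
import numpy as np

def precision_criteria(A, P=None):
    """A- and E-optimality scores for a network design matrix A.
    A: design (configuration) matrix of the observed baselines.
    P: optional observation weight matrix (identity if omitted)."""
    P = np.eye(A.shape[0]) if P is None else P
    Qxx = np.linalg.pinv(A.T @ P @ A)        # cofactor matrix of the parameters
    a_opt = np.trace(Qxx)                    # A-optimality: minimize the trace
    e_opt = np.max(np.linalg.eigvalsh(Qxx))  # E-optimality: minimize the largest eigenvalue
    return a_opt, e_opt
```

Candidate baseline configurations can then be compared by evaluating these scores and keeping the configuration with the smaller values.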
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and messy, which makes finding topics in them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less effective when applied to short-text content like Twitter. Fortunately, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags that serve as keywords. In this paper, we exploit the hashtag feature to improve the topics learned …
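One common way to exploit hashtags with a standard topic model is to pool tweets that share a hashtag into longer pseudo-documents before running LDA; the sketch below illustrates that idea with scikit-learn as an assumption rather than the paper's exact method, and the sample tweets are invented.

```python
import re
from collections import defaultdict
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def pool_by_hashtag(tweets):
    """Group tweets that share a hashtag into one pseudo-document,
    so the topic model sees longer texts than individual tweets."""
    pools = defaultdict(list)
    for t in tweets:
        for tag in re.findall(r"#(\w+)", t.lower()):
            pools[tag].append(t)
    return [" ".join(docs) for docs in pools.values()]

# Invented example tweets
tweets = ["Great match tonight #football", "New #python release is out",
          "#football transfer rumours again", "Learning #python for data science"]
docs = pool_by_hashtag(tweets)
X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
```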
Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics which, in turn, reduces the effectiveness of the system as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and then parameterizing the iris location, resulting in an accurate segmentation of the iris from the origin…
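A minimal sketch of the bit-plane step described above, assuming an 8-bit grayscale input; which planes count as "most significant" and how the iris location is then parameterized are not specified in this excerpt, so the plane selection and the random stand-in image are illustrative placeholders.

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 binary bit planes.
    Plane 7 is the most significant bit, plane 0 the least."""
    gray = gray.astype(np.uint8)
    return [(gray >> b) & 1 for b in range(8)]

# Illustrative use: keep only the top planes before locating the iris
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in eye image
planes = bit_planes(img)
significant = planes[6] | planes[7]   # one simple choice of "most significant" planes
```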
In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively; addressing this task can be considered the main contribution of this research. This article addresses the research question of ho…