Data deduplication is a data reduction technology that works by detecting and eliminating redundant data, keeping only one copy, and is often used to reduce storage space and network bandwidth. While our main motivation has been low-bandwidth synchronization applications such as the Low Bandwidth Network File System (LBNFS), deduplication is also useful in archival file systems, and a number of researchers have advocated such a scheme for archival storage. Data deduplication is now one of the most active research topics in the backup storage area. In this paper, a survey of different chunking algorithms for data deduplication is presented, and the most widely used chunking algorithm, Two Threshold Two Divisor (TTTD), is studied and evaluated.
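For context, TTTD is a content-defined chunking scheme that combines a minimum and a maximum chunk-size threshold with a main and a backup divisor applied to a rolling hash over the data stream. The following is a minimal Python sketch of that idea only; the window size, threshold values, divisors, and the simple additive rolling hash are illustrative choices, not the parameters of the original TTTD proposal.

# Minimal sketch of Two Threshold Two Divisor (TTTD) chunking.
# All constants below are illustrative, not the original TTTD parameters.
WINDOW = 48                   # sliding-window size for the rolling hash
T_MIN, T_MAX = 2048, 8192     # two thresholds: minimum and maximum chunk size
D_MAIN, D_BACKUP = 540, 270   # two divisors: main and backup

def tttd_chunks(data: bytes):
    """Return (start, end) offsets of content-defined chunks."""
    chunks, start, n = [], 0, len(data)
    while start < n:
        rolling, backup_cut = 0, -1
        limit = min(start + T_MAX, n)
        cut = limit                        # default: forced cut at T_MAX (or end of data)
        for i in range(start, limit):
            rolling += data[i]             # additive rolling hash over the last WINDOW bytes
            if i - start >= WINDOW:
                rolling -= data[i - WINDOW]
            if i - start + 1 < T_MIN:
                continue                   # ignore breakpoints before the minimum threshold
            if rolling % D_BACKUP == D_BACKUP - 1:
                backup_cut = i + 1         # remember the latest backup breakpoint
            if rolling % D_MAIN == D_MAIN - 1:
                cut = i + 1                # main divisor matched: cut here
                break
        else:
            # no main breakpoint before T_MAX: fall back to the backup breakpoint if any
            if backup_cut != -1 and limit - start >= T_MAX:
                cut = backup_cut
        chunks.append((start, cut))
        start = cut
    return chunks

The backup divisor only matters when no main breakpoint appears before the maximum threshold, which keeps even the forced cuts content-dependent rather than purely positional.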
Detecting and subtracting moving objects from backgrounds is one of the most important areas of computer vision. The development of cameras and their widespread use in security, surveillance, and other domains has made this problem unavoidable. The main difficulty is the instability in classifying pixels as foreground or background. This paper proposes a background subtraction algorithm based on the histogram, in which the classification threshold is calculated adaptively according to many tests. The performance of the proposed algorithm was compared with state-of-the-art methods in complex dynamic scenes.
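The paper's exact formulation is not given here, so the following is only a generic sketch of histogram-based background modeling with an adaptive threshold, assuming NumPy and a stack of grayscale frames; the per-pixel peak-of-histogram model and the mean-plus-standard-deviation threshold rule are stand-ins for the authors' method, not a reproduction of it.

# Generic sketch of histogram-based background subtraction (illustrative only).
import numpy as np

def build_background(frames: np.ndarray) -> np.ndarray:
    """Model each pixel's background as the most frequent intensity
    (the peak of its temporal histogram) over the training frames."""
    t, h, w = frames.shape                      # frames: (T, H, W) uint8 intensities
    background = np.empty((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            hist = np.bincount(frames[:, y, x], minlength=256)
            background[y, x] = np.argmax(hist)  # histogram peak as background value
    return background

def subtract(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Classify pixels as foreground when they deviate from the background
    by more than an adaptive, image-dependent threshold."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    threshold = diff.mean() + diff.std()        # assumed adaptive threshold rule
    return (diff > threshold).astype(np.uint8)  # 1 = foreground, 0 = background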
The city of Karbala is one of the most important holy places for visitors and pilgrims of the Islamic faith, especially during the Arba'in visit, when crowds of millions gather to commemorate the martyrdom of Imam Hussein. Offering services and medical treatment during this time is very important, especially when the crowds head to their destination (the holy shrine of Imam Hussein (a.s)). In recent years, the Arba'in visit has witnessed a clear growth in the number of participants. The biggest challenges are the health risks and the preventive measures required of both organizers and visitors. Researchers have identified various challenges and factors in facilitating the Arba'in visit. The purpose of this research is to deal with the religious an
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. Given the scalability and efficiency requirements of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used in preprocessing web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as session identification, user identification, and data cleansing.
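As an illustration of the session identification sub-phase, the sketch below groups already cleansed log records into sessions using the common 30-minute inactivity timeout; the record format (user key, timestamp, URL) and the timeout value are assumptions for the example, not details taken from this study.

# Sketch of timeout-based session identification for preprocessed web log records.
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)   # assumed inactivity timeout

def identify_sessions(records):
    """Group (user_key, timestamp, url) records into per-user sessions,
    starting a new session when the gap between requests exceeds the timeout."""
    by_user = defaultdict(list)
    for user_key, ts, url in records:
        by_user[user_key].append((ts, url))

    sessions = []
    for user_key, hits in by_user.items():
        hits.sort()                        # order each user's requests by time
        current, last_ts = [], None
        for ts, url in hits:
            if last_ts is not None and ts - last_ts > SESSION_TIMEOUT:
                sessions.append((user_key, current))
                current = []               # inactivity gap: start a new session
            current.append(url)
            last_ts = ts
        if current:
            sessions.append((user_key, current))
    return sessions

# Example: two requests 40 minutes apart fall into separate sessions.
log = [("10.0.0.1|Mozilla", datetime(2024, 1, 1, 9, 0), "/index"),
       ("10.0.0.1|Mozilla", datetime(2024, 1, 1, 9, 40), "/about")]
print(identify_sessions(log))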
Using remote sensing technology and modeling methodologies to monitor changes in land surface temperature (LST) and urban heat islands (UHI) has become an essential reference for making decisions on sustainable land use. This study estimates LST and UHI in Salah al-Din Province to contribute to land management, urban planning, and climate resilience in the region in light of the environmental changes of recent years. Landsat satellite imagery from 2014 to 2024 was used to estimate the LST and UHI indices in Salah al-Din Province, ArcGIS 10.7 was used to calculate the indices, and the normalized difference vegetation index (NDVI) was calculated because it is closely related to extracting (LST
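As a reference for the indices involved, the sketch below computes NDVI = (NIR - Red) / (NIR + Red) and derives LST from brightness temperature through an NDVI-based emissivity; the constants and the emissivity relation are the commonly cited values for Landsat 8 band 10 and may differ from the study's exact ArcGIS workflow.

# Sketch of the standard NDVI and NDVI-based LST computation (illustrative values).
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), from reflectance arrays."""
    return (nir - red) / (nir + red + 1e-10)

def lst_from_bt(bt_kelvin: np.ndarray, ndvi_arr: np.ndarray) -> np.ndarray:
    """Estimate LST (degrees C) from brightness temperature using an NDVI-based
    emissivity: Pv = ((NDVI - NDVImin) / (NDVImax - NDVImin))^2,
    eps = 0.004 * Pv + 0.986, LST = BT / (1 + (lambda * BT / rho) * ln(eps))."""
    ndvi_min, ndvi_max = ndvi_arr.min(), ndvi_arr.max()
    pv = ((ndvi_arr - ndvi_min) / (ndvi_max - ndvi_min + 1e-10)) ** 2
    emissivity = 0.004 * pv + 0.986
    wavelength = 10.895e-6            # Landsat 8 band 10 central wavelength (m)
    rho = 1.438e-2                    # h * c / sigma (m K)
    lst_k = bt_kelvin / (1 + (wavelength * bt_kelvin / rho) * np.log(emissivity))
    return lst_k - 273.15             # convert Kelvin to Celsius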