Cloud storage provides scalable, low-cost resources that benefit from economies of scale through a cross-user architecture. As the volume of outsourced data grows explosively, data deduplication, a technique that eliminates redundant data, becomes essential. Data storage is the most important cloud service. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing with video deduplication to maximize the deduplication ratio. Our approach uses data deduplication to remove identical copies of a video. Our experimental results show significant storage savings while providing a strong level of security.
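The identical-copy removal step described in this abstract can be sketched as content-addressed storage: each video chunk is hashed, and only previously unseen digests are stored. A minimal illustration, assuming a fixed chunking scheme and SHA-256 as the fingerprint (neither is the paper's exact design):

```python
import hashlib

def dedup_store(chunks, store=None):
    """Store only unique chunks, keyed by SHA-256 digest.
    Returns (store, recipe), where recipe lists the digests needed
    to reconstruct the original chunk sequence in order."""
    store = {} if store is None else store
    recipe = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # identical copy: skip storing it again
            store[digest] = chunk
        recipe.append(digest)
    return store, recipe

store, recipe = dedup_store([b"frame-A", b"frame-B", b"frame-A"])
print(len(store), len(recipe))  # 2 unique chunks, 3 references
```

The deduplication ratio here is the number of logical chunks divided by the number of unique chunks actually stored (3/2 in this toy run).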
The aim of this study is to assess changes in land cover within Mosul City in northern Iraq using Geographic Information Systems (GIS) and remote sensing techniques over the period 2014-2018. Landsat 8 satellite images from this period were selected and classified in order to measure the normalized difference vegetation index (NDVI) and assess land cover changes within Mosul City. The results indicated that the vegetative distribution ratio in 2014 was 4.98% of the total study area, decreased to 4.77% in 2015, and then decreased to 4.54% ...
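NDVI is computed per pixel from the near-infrared and red reflectance bands as (NIR - Red) / (NIR + Red). A minimal sketch (the band values below are illustrative, not the study's data):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near +1 indicate dense vegetation; near 0, bare soil."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# For Landsat 8 OLI, band 5 is NIR and band 4 is red.
print(ndvi([0.40], [0.10]))  # -> [0.6]
```

In practice the two bands are full raster arrays, and the vegetated fraction is obtained by thresholding the NDVI grid and counting pixels.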
An optical video communication system is designed and constructed using the pulse frequency modulation (PFM) technique. In this work, PFM pulses of 50 ns width are generated at the transmitter using a voltage-controlled oscillator (VCO). Double-frequency, equal-width, narrow pulses are produced in the receiver before demodulation. The use of the frequency-doubling technique in such a system results in a narrow transmission bandwidth (25 ns pulses) and high receiver sensitivity.
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on clustering and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroups ...
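The smoothing step described above can be sketched with SciPy's cubic B-spline routines; k=3 gives the continuous first and second derivatives the abstract mentions. The data below are synthetic, and the clustering step is not reproduced:

```python
import numpy as np
from scipy.interpolate import splev, splrep

# Noisy longitudinal profile: repeated measurements of one subject over time.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)
y = np.sin(t) + rng.normal(scale=0.1, size=t.size)

# Cubic (k=3) smoothing B-spline; the smoothing factor s > 0 trades
# fidelity to the noisy points for smoothness of the fitted curve.
tck = splrep(t, y, k=3, s=0.5)
fitted = splev(t, tck)
print(fitted.shape)  # (40,)
```

Increasing `s` yields a smoother curve with fewer abrupt slope changes; `s=0` would interpolate the noise exactly.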
Potential-field data interpretation is significant for subsurface structure characterization. The current study attempts to explore the magnetic low lying between the Najaf and Diwaniyah cities in central Iraq. It aims to understand the subsurface structures that may give rise to this anomaly and to provide a better subsurface structural image of the region. The study area is situated in the transition zone known as the Abu Jir Fault Zone. This tectonic boundary is an inherited basement weak zone extending in the NW-SE direction. Gravity and magnetic data processing and enhancement techniques (Total Horizontal Gradient, Tilt Angle, Fast Sigmoid Edge Detection, Improved Logistic, and Theta Map filters) highlight source boundaries and the ...
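Two of the edge-detection filters named above have simple closed forms: the Total Horizontal Gradient is THG = sqrt((dT/dx)^2 + (dT/dy)^2), and the Tilt Angle is arctan((dT/dz) / THG), whose zero crossings trace source edges. A sketch under stated assumptions (random data stands in for the survey grids, and the vertical derivative is assumed precomputed, e.g. by an FFT method not shown):

```python
import numpy as np

def tilt_angle(dT_dz, dT_dx, dT_dy):
    """Tilt derivative: arctan2(vertical derivative, total horizontal gradient).
    Output is bounded in [-pi/2, pi/2]; zero crossings mark source edges."""
    thg = np.hypot(dT_dx, dT_dy)        # Total Horizontal Gradient
    return np.arctan2(dT_dz, thg)

field = np.random.default_rng(1).normal(size=(64, 64))  # gridded anomaly (toy)
gy, gx = np.gradient(field, 1.0, 1.0)   # horizontal derivatives, unit spacing
gz = np.zeros_like(field)               # vertical derivative: assumed given
print(tilt_angle(gz, gx, gy).shape)     # (64, 64)
```

The arctan normalization is what makes the Tilt Angle equally sensitive to shallow and deep sources, unlike the raw THG.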
This research studies dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good estimates of the parameters, so the problem must be dealt with directly. Two approaches were used to address high-dimensional data: the non-classical Sliced Inverse Regression (SIR) method together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the general method used for dimension reduction. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear ...
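PCA's linear-combination construction can be sketched via the singular value decomposition of the centered data matrix (illustrative random data; SIR and the proposed WSIR weighting are not reproduced here):

```python
import numpy as np

def pca_reduce(X, k):
    """Project X (n samples x p variables) onto its top-k principal
    components, the k orthogonal directions of maximum variance."""
    Xc = X - X.mean(axis=0)                       # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                          # n x k reduced data

X = np.random.default_rng(2).normal(size=(100, 50))  # p = 50, "high" dimensional
Z = pca_reduce(X, 3)
print(Z.shape)  # (100, 3)
```

Unlike PCA, which ignores the response, SIR slices the response variable and uses the inverse regression curve E[X | y] to find the reduction directions.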
MPEG-DASH is an adaptive bitrate streaming technology that divides video content into small HTTP-object file segments encoded at different bitrates. For live UHD video streaming, latency is the most important problem. This paper presents a low-delay streaming system built on HTTP/2. Based on the network conditions, the proposed system adaptively determines the bitrate of the segments. The video is coded using the layered H.265/HEVC compression standard and is then tested to investigate the relationship between video quality and bitrate for various HEVC parameters and video motion at each layer/resolution. The system architecture includes the encoder/decoder configurations and how the adaptive video streaming is embedded. The encoder includes compression ...
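A throughput-based segment selection rule of the kind described can be sketched as follows; the bitrate ladder and the 0.8 safety margin are illustrative assumptions, not the paper's parameters:

```python
def pick_bitrate(available_kbps, throughput_kbps, safety=0.8):
    """Throughput-based rate adaptation: choose the highest rendition whose
    bitrate fits within a safety fraction of the measured throughput."""
    budget = throughput_kbps * safety
    candidates = [b for b in sorted(available_kbps) if b <= budget]
    return candidates[-1] if candidates else min(available_kbps)

ladder = [1500, 4000, 8000, 16000]   # example HEVC layer bitrates, kbps
print(pick_bitrate(ladder, 10000))   # -> 8000 (budget = 8000 kbps)
```

The safety margin absorbs throughput estimation error; shrinking it lowers stall risk at the cost of average quality, the central trade-off in low-latency live streaming.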
Many images require large storage space. Even with the continued evolution of computer storage technology, there is a pressing need to reduce the storage space required for images by compressing them effectively; the wavelet transform method ...
The need for cloud services has risen globally to provide a platform for healthcare providers to efficiently manage their citizens' health records and thus provide treatment remotely. In Iraq, the healthcare records of public hospitals are growing progressively under poor digital management. While recent work points to cloud computing as a platform for all sectors globally, a lack of empirical evidence demands a comprehensive investigation to identify the significant factors that influence the utilization of cloud health computing. Here we provide a cost-effective, modular, and computationally efficient model of utilizing cloud computing based on organization theory and the theory of reasoned action. A total of ...
Applying load to a structural member may produce a bottle-shaped compression field, especially when the width of the loading is less than the width of the bearing concrete member. At the Building and Construction Department, University of Technology-Iraq, a series of tests was carried out on fibre-reinforced concrete specimens subjected to compression forces at the top and bottom to produce a compression field. The effects of steel fibre content, concrete compressive strength, transverse tension reinforcement, the height of the test specimen, and the ratio of the width of the loading plate to the specimen width were studied by testing a total of ten normal-strength concrete blocks with steel fibre and one normal-strength ...