Cloud computing is a recently developed paradigm that aims to provide computing resources in the most effective and economical manner. Its fundamental idea is to share computing resources among a group of users. Cloud computing security is a collection of control-based techniques and strategies that intend to comply with regulatory rules and protect cloud-related information, data, applications, and infrastructure. Data integrity, in turn, is the guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., that data consistency, accuracy, and trustworthiness are maintained). This review presents an overview of cloud computing concepts, its importance in many applications, and tools that can be used to provide integrity and security for data located in the cloud environment.
The city is a built-up urban space of multifunctional structures that ensure safety, health, and shelter for humans. Its built structures carry various urban roofs shaped by different climatic circumstances, which creates peculiarities and changes in the local urban climate and increases the impact of urban heat islands (UHI), with wasted energy. The research gap is the limited information on renovating existing urban roofs using color as a strategy to mitigate the impact of UHI. In order to achieve local urban sustainability, the research focused on solutions using different materials and treatments to reduce urban surface heating emissions. The results showed that the new and old technologies, produ
This study employs wavelet transforms to address the issue of boundary effects. Additionally, it utilizes probit-transform techniques, based on probit functions, to estimate the copula density function. The estimation depends on the empirical distribution function of the variables, and the density is estimated within a transformed domain. Recent research indicates that early implementations of this strategy may have been more efficient. Nevertheless, in this work we implemented two novel methodologies utilizing the probit transform and the wavelet transform, and then evaluated and contrasted them using three specific criteria: root mean square error (RMSE), Akaike information criterion (AIC), and log
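As a minimal sketch of the probit-transform idea (not the paper's implementation; the data, sample size, and dependence structure below are hypothetical), the variables are mapped through their empirical distribution functions, probit-transformed onto the whole real line so that kernel estimation no longer suffers boundary effects on the unit square, and the estimate is transformed back into a copula density:

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(1)
# hypothetical dependent sample standing in for the paper's variables
x = rng.normal(size=500)
y = 0.6 * x + 0.8 * rng.normal(size=500)

def pseudo_obs(v):
    """Empirical-distribution-function ranks, scaled into (0, 1)."""
    return (np.argsort(np.argsort(v)) + 1) / (len(v) + 1)

# probit transform: pseudo-observations -> R^2, removing the [0,1]^2 boundary
z = norm.ppf(np.vstack([pseudo_obs(x), pseudo_obs(y)]))
kde = gaussian_kde(z)  # kernel estimate in the transformed domain

def copula_density(u, v):
    """Back-transform the estimate on R^2 into a copula density on (0, 1)^2."""
    s, t = norm.ppf(u), norm.ppf(v)
    return kde([[s], [t]])[0] / (norm.pdf(s) * norm.pdf(t))
```

The paper's wavelet-based estimator would replace `gaussian_kde` in the transformed domain; the back-transformation step is the same.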
Cyber-attacks keep growing, so stronger techniques for protecting images are needed. This paper presents DGEN, a Dynamic Generative Encryption Network that combines Generative Adversarial Networks with a context-adaptive key system. Unlike a fixed scheme such as AES, the method can potentially adjust itself as new threats appear, aiming to resist brute-force, statistical, and quantum attacks. The design injects randomness, uses learning, and derives keys that depend on each individual image, which should give strong security and flexibility while keeping computational cost low. Tests were run on several public image datasets, and the results show DGEN outperforming AES, chaos-based schemes, and other GAN approaches. Entropy reached 7.99 bits per pixel
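The entropy figure can be reproduced for any 8-bit image with a short Shannon-entropy routine. The sketch below uses a uniformly random array standing in for an actual DGEN ciphertext (an assumption for illustration); a value near 8 bits/pixel indicates a near-uniform grey-level histogram, which is what a good cipher image should show:

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
# random stand-in for a cipher image; a real evaluation would load the ciphertext
cipher_like = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
```

For a 256x256 uniformly random image the entropy lands just under the 8-bit maximum, consistent with the 7.99 bits/pixel figure quoted in the abstract.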
Background: Secondary skull tumors are malignant bone tumors that are increasing in incidence. Objective: The objectives of this study were to present the clinical features, assess the outcome of patients with secondary skull tumors, characterize the MRI features, locations, and extent of these tumors, and determine the frequency of symptomatic disease. Type of the study: This is a prospective study. Methods: This prospective study ran from February 2000 to February 2008. Patients were selected from five neurosurgical centers and one oncology hospital in Baghdad, Iraq. The inclusion criteria were an MRI study of the head (either as an initial radiological study or following a head CT scan when a secondary brain tumor was suspected, vis
The calibration of a low-speed wind tunnel (LSWT) test section was carried out in the present work. The tunnel was designed and constructed at the Aerodynamics Lab in the Mechanical Engineering Department, University of Baghdad. The test-section design speed is 70 m/s. Frictional losses and flow uniformity inside the test section were tested and calibrated against the British standards for flow inside ducts and conduits. A Pitot-static tube and a boundary-layer Pitot tube were the main instruments used to measure the flow characteristics, with emphasis on velocity uniformity and boundary-layer growth along the test-section walls. It was found that the maximum calibrated velocity for the empty test section
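For incompressible flow, the Pitot-static measurement reduces to Bernoulli's relation v = √(2Δp/ρ), where Δp is the measured dynamic pressure. A minimal sketch (standard sea-level air density assumed; the pressure value is illustrative, not a measurement from this calibration):

```python
import math

RHO_AIR = 1.225  # kg/m^3, standard sea-level density (assumed)

def pitot_velocity(delta_p, rho=RHO_AIR):
    """Flow speed from Pitot-static dynamic pressure: v = sqrt(2*dp/rho)."""
    return math.sqrt(2.0 * delta_p / rho)

# dynamic pressure corresponding to the 70 m/s design speed
q_design = 0.5 * RHO_AIR * 70.0 ** 2  # about 3001 Pa
```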
Gold nanoparticles (GNPs) were synthesized by the reduction of HAuCl4·3H2O with aluminum metal in aqueous solution, using gum arabic as a stabilizing agent. The GNPs were characterized by TEM, AFM, and zeta-potential spectroscopy. The reduction process was monitored over time by measuring ultraviolet spectra in the range λ = 520-525 nm, together with the color change from yellow to ruby red. The shape and size of the GNPs were studied by TEM: the shape was spherical and the particle size was 12-17.5 nm. The best results were obtained at pH 6.
This study aims to encapsulate atenolol within floating alginate-ethylcellulose beads as an oral controlled-release delivery system using the aqueous colloidal polymer dispersion (ACPD) method. To optimize the drug entrapment efficiency and dissolution behavior of the prepared beads, different parameters were examined: drug:polymer ratio, polymer mixture ratio, and gelling agent concentration. The prepared beads were investigated with respect to their buoyancy, encapsulation efficiency, and dissolution behavior in three media: 0.1 N HCl (pH 1.2), acetate buffer (pH 4.6), and phosphate buffer (pH 6.8). The release kinetics and mechanism of the drug from the prepared beads were investigated. All prepared atenolol beads remained f
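One common way to characterize release kinetics and mechanism from beads is a log-linear fit of the Korsmeyer-Peppas power law, M_t/M_inf = k·t^n. The sketch below uses synthetic release data (not the study's measurements) purely to illustrate the fitting step:

```python
import numpy as np

# hypothetical fractional-release profile (t in hours), standing in
# for a measured atenolol release curve
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
frac_released = 0.25 * t ** 0.45  # synthetic data generated from the model

# Korsmeyer-Peppas: M_t/M_inf = k * t**n  ->  log f = n*log t + log k
n, log_k = np.polyfit(np.log(t), np.log(frac_released), 1)
k = np.exp(log_k)
# the fitted exponent n is then read against the usual (geometry-dependent)
# thresholds to classify transport as Fickian diffusion or anomalous release
```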
A particular solution of the two- and three-dimensional unsteady-state thermal or mass diffusion equation is obtained by introducing a combination of variables of the form
η = (x + y)/√(ct) and η = (x + y + z)/√(ct)
for the two- and three-dimensional equations, respectively. The corresponding solutions are
θ(t,x,y) = θ0 erfc((x + y)/√(8ct)) and θ(t,x,y,z) = θ0 erfc((x + y + z)/√(12ct))
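These closed forms can be checked numerically. The sketch below (diffusivity c, θ0, and the sample point are chosen arbitrarily) verifies by central finite differences that the two-dimensional solution satisfies the diffusion equation θ_t = c(θ_xx + θ_yy):

```python
import numpy as np
from scipy.special import erfc

c, theta0 = 1.0, 1.0  # arbitrary diffusivity and boundary value

def theta(t, x, y):
    """Two-dimensional similarity solution from the abstract."""
    return theta0 * erfc((x + y) / np.sqrt(8.0 * c * t))

# central finite differences at an arbitrary interior point
t, x, y, h = 0.5, 0.3, 0.4, 1e-4
d_t = (theta(t + h, x, y) - theta(t - h, x, y)) / (2 * h)
d_xx = (theta(t, x + h, y) - 2 * theta(t, x, y) + theta(t, x - h, y)) / h**2
d_yy = (theta(t, x, y + h) - 2 * theta(t, x, y) + theta(t, x, y - h)) / h**2
residual = d_t - c * (d_xx + d_yy)  # should vanish up to discretization error
```

With s = x + y, the Laplacian gives θ_xx + θ_yy = 2θ_ss, so θ_t = 2c·θ_ss, whose erfc solution has the argument s/√(8ct) — consistent with the factor 8 (and, in three dimensions, 12) above.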
This paper presents a comparison between denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve the denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, where each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter.
Experimental results show LPG-
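The median and adaptive Wiener baselines named above are available off the shelf. A hedged sketch (the synthetic ramp image and noise level are illustrative stand-ins, not the paper's test data), with PSNR as the comparison metric:

```python
import numpy as np
from scipy.signal import wiener, medfilt2d

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # synthetic grayscale ramp
noisy = clean + rng.normal(0.0, 0.1, clean.shape)      # constant-power additive noise

den_wiener = wiener(noisy, mysize=5)        # adaptive Wiener low-pass, 5x5 window
den_median = medfilt2d(noisy, kernel_size=5)  # 5x5 median filter

def psnr(ref, img):
    """Peak signal-to-noise ratio in dB, for images on [0, 1]."""
    mse = np.mean((ref - img) ** 2)
    return 10.0 * np.log10(1.0 / mse)
```

Both filters should raise the PSNR over the noisy input on smooth regions; the PCA-LPG stage itself is not shown here, as its grouping step is specific to the paper's procedure.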
In this paper a system is designed on an FPGA using a Nios II soft-core processor to detect the colour of a specific surface and move a robot arm accordingly. The surface being detected is bounded by a starting mark and an ending mark that define the region of interest. The surface is also divided into sections, as rows and columns, and each section can have any colour. Such a system has many uses: for example, in warehouses or stores, the storage areas can be divided into coloured sections so that a robot arm collects objects from these sections according to each section's colour, or organizes objects into sections according to the section's colour.
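The classification step can be sketched as nearest-reference-colour matching; the host-side Python below only illustrates the logic the Nios II system would apply per section, and the reference colours and names are hypothetical, not taken from the paper:

```python
# hypothetical reference colours for the surface sections (RGB, 0-255)
REFERENCE = {
    "red": (255, 0, 0),
    "green": (0, 255, 0),
    "blue": (0, 0, 255),
}

def classify(pixel):
    """Label a sampled RGB pixel with the nearest reference colour."""
    def dist2(a, b):
        # squared Euclidean distance in RGB space
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(REFERENCE, key=lambda name: dist2(REFERENCE[name], pixel))
```

In the embedded system this comparison would run over an averaged sample of each section's pixels, then drive the arm toward the matching section.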