The present work provides a theoretical investigation of one-dimensional laser photoacoustic imaging to detect a blood vessel or tumor embedded within normal tissue. The key task in photoacoustic imaging is to obtain an acoustic signal that helps determine the size and location of the target object inside normal tissue. The analytical simulation used a spherical wave model representing the target object (blood vessel or tumor) inside normal tissue, and a computer program was written in the MATLAB environment to realize the simulation. The model generates a time-resolved acoustic wave signal that includes both the expansion and contraction parts of the wave. The photoacoustic signal from the target object was simulated for laser pulse durations of 10 ns-10 μs emitted from an Nd:YAG laser, target-object depths of 0.3-3 cm, object-to-detector distances of 0.7-3 cm, and target-object diameters of 0.1-0.6 cm. The object diameter computed by the simulation is consistently 75% of its true value. The amplitude of the signal is directly proportional to the laser pulse energy and inversely proportional to the depth of the target object and the object-to-detector distance. The PA signal is fully generated in the nanosecond pulse-duration range, as such pulses are short enough to satisfy the stress confinement condition.
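The size estimate described above can be illustrated with a minimal sketch of the idealized bipolar (N-shaped) pressure wave radiated by a uniformly heated spherical absorber in the stress-confined limit. All numerical values below (sound speed, absorber radius, detector distance) are illustrative assumptions, not parameters taken from the paper; the point is only that the peak-to-trough time of the N-wave gives the object diameter.

```python
import numpy as np

# Hedged sketch: idealized N-shaped photoacoustic signal from a spherical
# absorber (delta-pulse / stress-confinement limit). Parameter values are
# illustrative assumptions, not the paper's.
c = 1500.0    # speed of sound in soft tissue, m/s
a = 0.002     # absorber radius, m (0.4 cm diameter)
r = 0.02      # object-to-detector distance, m (2 cm)
p0 = 1.0      # peak amplitude, arbitrary units (1/r scaling folded in)

t = np.linspace(0, 30e-6, 20000)               # time axis, s
u = r - c * t                                  # retarded coordinate
p = np.where(np.abs(u) <= a, p0 * u / a, 0.0)  # bipolar N-wave

# Peak-to-trough time equals the acoustic transit time across the sphere,
# so the object diameter follows as d = c * (t_trough - t_peak).
t_peak, t_trough = t[np.argmax(p)], t[np.argmin(p)]
d_est = c * (t_trough - t_peak)
print(round(d_est * 100, 2), "cm")  # ≈ 0.4 cm, the assumed diameter
```

The compression (expansion) lobe arrives first and the rarefaction (contraction) lobe second, matching the "expansion and contraction parts" of the simulated signal described above.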
In this study, a hydromorphodynamic simulation of a stretch of the Euphrates River was conducted. The stretch extends from Haditha Dam to the city of Heet in Al-Anbar Governorate and is approximately 124.4 km long. Samples were taken from three sites along the banks of the river stretch using sampling equipment and brought to the laboratory for grain-size analysis, where the median grain size (D50) and sediment load were determined. The hydromorphodynamic simulation was conducted using the Nays2DH solver of the iRIC model. The model was calibrated using the Manning roughness, sediment load, and median particle size, and the validation process showed that the error between the …
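The laboratory step above reads the median grain size D50 off the cumulative grain-size curve. A minimal sketch of that interpolation is shown below; the sieve openings and percent-passing values are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hedged sketch: estimating D50 (the size at 50% finer) from a sieve
# analysis by interpolating percent passing against log grain size.
# The sieve data below are illustrative, not the study's measurements.
sieve_mm = np.array([4.75, 2.0, 0.85, 0.425, 0.25, 0.15, 0.075])  # opening, mm
passing = np.array([98.0, 90.0, 72.0, 48.0, 30.0, 15.0, 5.0])     # % finer

# np.interp needs ascending x-values, so sort by percent passing.
order = np.argsort(passing)
d50 = 10 ** np.interp(50.0, passing[order], np.log10(sieve_mm)[order])
print(f"D50 ≈ {d50:.3f} mm")
```

Interpolating in log(size) rather than size follows the usual semi-log plotting convention for grain-size distribution curves.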
Phenomena often suffer from disturbances in their data as well as from difficulty of formulation, especially when the response lacks clarity or when many essential differences plague the experimental units from which the data are taken. Thus emerged the need for an estimation method that implicitly rates these experimental units, either by discrimination or by creating blocks for groups of these experimental units, in the hope of controlling their responses and making them more homogeneous. Owing to developments in computing and the principle of the integration of the sciences, it has been found that modern algorithms from computer science, such as the genetic algorithm or ant colony …
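The blocking idea above can be sketched with a tiny genetic algorithm that assigns experimental units to blocks so that within-block variance is minimized, i.e. each block becomes more homogeneous. The unit responses and GA settings below are illustrative assumptions, not the abstract's data or its exact algorithm.

```python
import random

# Hedged sketch: a minimal genetic algorithm for grouping experimental
# units into homogeneous blocks. Responses and GA settings are illustrative.
random.seed(0)
responses = [2.1, 2.3, 2.0, 5.8, 6.1, 5.9, 9.7, 9.9, 10.2]  # unit responses
n_blocks = 3

def fitness(assign):
    """Total within-block sum of squared deviations (lower = more homogeneous)."""
    total = 0.0
    for b in range(n_blocks):
        vals = [r for r, a in zip(responses, assign) if a == b]
        if vals:
            m = sum(vals) / len(vals)
            total += sum((v - m) ** 2 for v in vals)
    return total

def mutate(assign):
    child = assign[:]
    child[random.randrange(len(child))] = random.randrange(n_blocks)
    return child

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

pop = [[random.randrange(n_blocks) for _ in responses] for _ in range(30)]
for _ in range(300):
    pop.sort(key=fitness)
    elite = pop[:10]                       # keep the 10 best assignments
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
best = min(pop, key=fitness)

# Greedy polish: keep moving single units between blocks while it helps.
improved = True
while improved:
    improved = False
    for i in range(len(best)):
        for b in range(n_blocks):
            cand = best[:]
            cand[i] = b
            if fitness(cand) < fitness(best):
                best, improved = cand, True
```

With well-separated response groups, the GA plus the greedy polish recovers the natural clustering into three homogeneous blocks.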
Incremental sheet metal forming is a modern sheet-forming technique in which a uniform sheet is locally deformed by the progressive action of a forming tool whose movement is governed by a CNC milling machine. In this way the tool locally deforms the sheet by pure stretching. In the single-point incremental forming (SPIF) process, research is concentrated on developing predictive models to estimate product quality. Surface quality in SPIF has been modeled using a simulated annealing algorithm (SAA). In developing this predictive model, spindle speed, feed rate, and step depth were considered as model parameters, and maximum peak height (Rz) and arithmetic mean surface roughness (Ra) were used as response parameters to assess the …
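A minimal sketch of how simulated annealing can search the three process parameters named above is given below. The quadratic response surface `ra(...)` is an illustrative stand-in, not the fitted model from the paper, and all bounds and annealing settings are assumptions.

```python
import math
import random

# Hedged sketch: simulated annealing over SPIF parameters (spindle speed,
# feed rate, step depth) to minimize a surface-roughness response surface.
# The model and all numbers are illustrative, not the paper's fitted model.
random.seed(1)

def ra(speed, feed, step):
    # Assumed response surface with a minimum at (1000 rpm, 800 mm/min, 0.3 mm).
    return (0.8 + 1e-6 * (speed - 1000) ** 2
                + 2e-6 * (feed - 800) ** 2
                + 4.0 * (step - 0.3) ** 2)

bounds = [(200, 2000), (100, 1500), (0.1, 1.0)]  # speed, feed, step depth
x = [lo + random.random() * (hi - lo) for lo, hi in bounds]
best, best_val = x[:], ra(*x)

temp = 1.0
for _ in range(5000):
    # Propose a bounded Gaussian perturbation of one parameter.
    cand = x[:]
    i = random.randrange(3)
    lo, hi = bounds[i]
    cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 0.05 * (hi - lo))))
    delta = ra(*cand) - ra(*x)
    # Metropolis acceptance: always take improvements, sometimes take worse.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = cand
        if ra(*x) < best_val:
            best, best_val = x[:], ra(*x)
    temp *= 0.999  # geometric cooling schedule

print(best, best_val)
```

The same loop works unchanged if `ra` is replaced by a fitted Rz or Ra model from experimental data.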
The main aim of this paper is to study how different estimators of the two unknown parameters (shape and scale) of the generalized exponential distribution behave for different sample sizes and different parameter values. In particular, maximum likelihood, percentile, and ordinary least squares estimators were implemented for small, medium, and large sample sizes, with several contrasting initial values assumed for the two parameters. Two performance indicators, mean squared error and mean percentile error, were used, and the comparisons between the estimation methods were carried out using the Monte Carlo simulation technique. It was observed …
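The Monte Carlo comparison described above can be sketched as follows for the shape parameter of the generalized exponential distribution, F(x) = (1 - exp(-λx))^α. The sample size, replicate count, true parameter values, and the coarse grid search over λ are illustrative simplifications, not the paper's experimental design.

```python
import numpy as np

# Hedged sketch: Monte Carlo MSE comparison of a maximum-likelihood and a
# percentile-type estimator of the shape parameter alpha of the generalized
# exponential distribution F(x) = (1 - exp(-lam*x))**alpha. All settings
# are illustrative.
rng = np.random.default_rng(42)
alpha_true, lam_true, n, R = 1.5, 1.0, 50, 300
lam_grid = np.linspace(0.3, 3.0, 100)

def sample(n):
    # Inverse-CDF sampling: X = -ln(1 - U**(1/alpha)) / lam.
    u = rng.uniform(size=n)
    return -np.log(1 - u ** (1 / alpha_true)) / lam_true

def mle_alpha(x):
    # Profile likelihood: for fixed lam, alpha_hat = -n / sum(ln(1-e^{-lam x})).
    best, best_ll = None, -np.inf
    for lam in lam_grid:
        s = np.log1p(-np.exp(-lam * x)).sum()
        a = -len(x) / s
        ll = len(x) * (np.log(a) + np.log(lam)) - lam * x.sum() + (a - 1) * s
        if ll > best_ll:
            best, best_ll = a, ll
    return best

def percentile_alpha(x):
    # Percentile estimator: fit ln p_i = alpha * ln(1 - e^{-lam x_(i)}) by
    # least squares through the origin, minimizing over the lam grid.
    xs = np.sort(x)
    p = np.arange(1, len(x) + 1) / (len(x) + 1)
    best, best_sse = None, np.inf
    for lam in lam_grid:
        z = np.log1p(-np.exp(-lam * xs))
        a = (z * np.log(p)).sum() / (z * z).sum()
        sse = ((np.log(p) - a * z) ** 2).sum()
        if sse < best_sse:
            best, best_sse = a, sse
    return best

est = np.array([[mle_alpha(x), percentile_alpha(x)]
                for x in (sample(n) for _ in range(R))])
mse = ((est - alpha_true) ** 2).mean(axis=0)
print("MSE(MLE) =", mse[0], " MSE(percentile) =", mse[1])
```

Repeating this loop over several sample sizes and true parameter values reproduces the kind of comparison table the study reports.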
A simulation study of using 2D tomography to reconstruct a 3D object is presented. The 2D Radon transform is used to create a 2D projection of each slice of the 3D object at different heights, and the 2D back-projection and Fourier slice theorem methods are used to reconstruct each projected slice. The results showed that the Fourier slice theorem method can reconstruct the general shape of the body together with its internal structure, unlike the 2D back-projection method, which, because of the blurring artefact, was able to reconstruct the general shape of the body only. Moreover, the Fourier slice theorem could not remove all blurring artefacts; therefore, this research suggested a threshold technique to eliminate the …
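The per-slice pipeline above can be sketched with a discrete Radon transform followed by the two reconstruction routes: plain back-projection, which blurs, and filtered back-projection, whose ramp filter is the consequence of the Fourier slice theorem. The phantom, grid size, and angle set below are illustrative choices, not the study's test object.

```python
import numpy as np
from scipy.ndimage import rotate

# Hedged sketch: one 2D slice is projected (discrete Radon transform) and
# reconstructed by unfiltered vs filtered back-projection. The phantom and
# angle count are illustrative.
N = 64
phantom = np.zeros((N, N))
phantom[24:40, 24:40] = 1.0                    # a simple square "organ"
angles = np.arange(0.0, 180.0, 3.0)            # projection angles, degrees

# Forward Radon transform: rotate the slice and sum along one axis.
sino = np.array([rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                 for a in angles])

def backproject(sinogram, filtered):
    recon = np.zeros((N, N))
    ramp = np.abs(np.fft.fftfreq(N))           # ramp filter (Fourier slice theorem)
    for a, proj in zip(angles, sinogram):
        if filtered:
            proj = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
        smear = np.tile(proj, (N, 1))          # smear back along the beam
        recon += rotate(smear, -a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles))

bp = backproject(sino, filtered=False)   # blurred: general shape only
fbp = backproject(sino, filtered=True)   # sharper, residual artefacts remain
```

Stacking such reconstructed slices at successive heights rebuilds the 3D volume; the residual artefacts in the filtered result are what motivate the thresholding step mentioned above.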
In this research, the parameters of the Type 1 Gumbel distribution for maximum values were estimated using two methods: the method of moments (MoM) and the modified moments (MM) method. Simulation was used to compare the estimation methods and identify the best one: random data following the Gumbel distribution were generated for three models of real parameter values and for different sample sizes, with R = 500 replicates. The results of the assessment were placed in tables prepared for the purpose of comparison, which was made according to the mean squared error (MSE).
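A minimal sketch of one cell of such a simulation table is given below: the method-of-moments estimator of the Type 1 Gumbel (maximum) parameters inside a Monte Carlo loop, scored by MSE. The true parameter values and sample size are illustrative, not the three models used in the paper.

```python
import numpy as np

# Hedged sketch: method-of-moments estimation of the Gumbel (maximum)
# location mu and scale beta, scored by MSE over R replicates.
# True values and sample size are illustrative.
rng = np.random.default_rng(7)
mu_true, beta_true = 10.0, 2.0
n, R = 100, 500
gamma = 0.5772156649               # Euler-Mascheroni constant

est = np.empty((R, 2))
for r in range(R):
    x = rng.gumbel(mu_true, beta_true, size=n)
    beta_hat = x.std(ddof=1) * np.sqrt(6) / np.pi  # from Var = pi^2 beta^2 / 6
    mu_hat = x.mean() - gamma * beta_hat           # from E[X] = mu + gamma*beta
    est[r] = (mu_hat, beta_hat)

mse = ((est - [mu_true, beta_true]) ** 2).mean(axis=0)
print("MSE(mu) =", mse[0], " MSE(beta) =", mse[1])
```

The modified-moments variant replaces the sample moments with order-statistic-based quantities; the surrounding Monte Carlo loop and MSE scoring stay the same.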
Using computer simulation to quantify the effectiveness of blowing agents can be an effective tool for optimizing formulations and for adopting new blowing agents. This paper focuses on a mass balance on the blowing agent during foaming, including quantification of the amount that stays in the resin, the amount that ends up in the foam cells, and the pressure of the blowing agent in the foam cells. Experimental data are presented both for developing the simulation capabilities and for validating the simulation results.
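The three quantities in that mass balance can be sketched as follows: the total charge splits into the amount remaining dissolved in the resin (here via an assumed Henry's-law solubility) and the amount in the foam cells, whose pressure then follows from the ideal gas law. Every number below (agent, solubility constant, temperature, cell volume) is an illustrative assumption, not data from the paper.

```python
# Hedged sketch: blowing-agent mass balance during foaming.
# total charge = dissolved in resin + in foam cells; cell pressure from
# the ideal gas law. All values are illustrative assumptions.
R_GAS = 8.314          # J/(mol*K)

m_total = 0.050        # kg of blowing agent charged
M = 0.044              # kg/mol (a CO2-like agent, assumed)
henry = 1.2e-4         # kg agent / (kg resin * bar), assumed solubility
m_resin = 1.0          # kg resin
p_equil = 8.0          # bar, pressure the dissolved phase equilibrates at
T = 380.0              # K, foaming temperature
v_cells = 4.0e-3       # m^3, total foam-cell volume

m_dissolved = henry * m_resin * p_equil   # Henry's law: stays in the resin
m_in_cells = m_total - m_dissolved        # the remainder ends up in the cells
n_gas = m_in_cells / M                    # moles of gas in the cells
p_cells = n_gas * R_GAS * T / v_cells     # ideal gas law, Pa

print(f"in resin: {m_dissolved*1e3:.2f} g, in cells: {m_in_cells*1e3:.2f} g, "
      f"cell pressure: {p_cells/1e5:.2f} bar")
```

Closing this balance against measured residual agent and cell-gas pressure is exactly the kind of validation against experimental data the paper describes.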