Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging uses fluorescence optics, in which laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced routinely during research, and these images require unbiased quantification methods for meaningful analysis. Increasing efforts to tie reimbursement to outcomes will likely raise the need for objective data in confocal image analysis in the coming years. Visual quantification of confocal images with the naked eye is an essential but often underreported outcome measure because of the time required for manual counting and estimation. This visual quantification method is time-consuming and cumbersome, and manual measurement is imprecise owing to natural differences in human visual ability. Objective outcome evaluation can therefore obviate the drawbacks of the current method and facilitate record keeping for documentation and research purposes. To obtain a fast and reliable objective estimate of fluorescence in each image, an algorithm based on machine vision techniques was designed to extract the targeted objects from confocal images and then estimate the covered area as a percentage comparable to the outcome of the current method; it is expected to contribute to sustainable biotechnology image analysis by reducing time and labor. The results provide strong evidence that the designed objective algorithm can replace the current manual, visual quantification method, with an Intraclass Correlation Coefficient (ICC) of 0.9.
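As a rough illustration of this kind of objective area estimate, the following Python sketch segments a single-channel fluorescence image and reports the percentage of pixels covered by signal. The Gaussian smoothing step, the Otsu thresholding, and the file name are illustrative assumptions rather than the exact pipeline used in the study.

# Minimal sketch: percentage of a confocal image covered by fluorescent signal.
# The smoothing and Otsu thresholding are assumed steps, not the authors' exact method.
import cv2
import numpy as np

def fluorescence_coverage(path: str) -> float:
    """Return the percentage of pixels classified as fluorescent signal."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(path)
    blurred = cv2.GaussianBlur(img, (5, 5), 0)          # suppress speckle noise
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return 100.0 * np.count_nonzero(mask) / mask.size

# Example with a hypothetical file name:
# print(f"Covered area: {fluorescence_coverage('confocal_slice.png'):.1f}%")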
To obtain good and accurate estimates, an appropriate estimation method must be chosen. Most of the equations arising in classical methods are nonlinear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution, which in turn leads to optimal estimates of the survival function. The genetic algorithm is employed in the method of moments, the least squares method and the weighted
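As a sketch of how a genetic algorithm can drive such an estimation, the code below fits the two-parameter Weibull survival function S(t) = exp(-(t/scale)^shape) by minimizing a least-squares criterion against empirical plotting positions. It is simplified to complete (uncensored) data, and the population size, mutation rate, and search bounds are illustrative choices rather than the settings used in the paper.

# Minimal sketch: GA-based least-squares fit of a two-parameter Weibull survival function.
# Simplified to uncensored data; all GA settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def ls_error(params, t_sorted, s_emp):
    shape, scale = params
    s_model = np.exp(-(t_sorted / scale) ** shape)
    return np.sum((s_model - s_emp) ** 2)

def ga_fit_weibull(times, pop=60, gens=200, bounds=((0.1, 10.0), (0.1, 100.0))):
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    s_emp = 1.0 - (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank survival estimate
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    popu = rng.uniform(lo, hi, size=(pop, 2))
    for _ in range(gens):
        fit = np.array([ls_error(p, t, s_emp) for p in popu])
        order = np.argsort(fit)
        parents = popu[order[: pop // 2]]                    # truncation selection
        idx = rng.integers(0, len(parents), size=(pop, 2))   # random parent pairs
        w = rng.random((pop, 1))
        children = w * parents[idx[:, 0]] + (1 - w) * parents[idx[:, 1]]  # arithmetic crossover
        children += rng.normal(0.0, 0.05 * (hi - lo), size=children.shape)  # Gaussian mutation
        popu = np.clip(children, lo, hi)
        popu[0] = parents[0]                                 # elitism: keep the current best
    fit = np.array([ls_error(p, t, s_emp) for p in popu])
    return popu[np.argmin(fit)]                              # (shape, scale)

# Example with synthetic, hypothetical lifetime data:
# shape, scale = ga_fit_weibull(rng.weibull(1.5, 50) * 20.0)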
Data communication has grown considerably in recent years, making data encryption essential for secure transmission and storage and for protecting data contents from intruders and unauthorized persons. In this paper, a fast technique for text encryption based on a genetic algorithm is presented. The encryption is achieved with the genetic operators crossover and mutation: the plain-text characters are divided into pairs, the crossover operation is applied between the characters of each pair, and the mutation operation is then applied to obtain the encrypted text. The experimental results show that the proposal provides a notable improvement in encryption rate with a comparatively high-speed process.
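A minimal sketch of the pair-wise crossover-and-mutation idea is given below. The concrete operator definitions (nibble-swapping crossover within each character pair and XOR-based mutation with a key byte) are illustrative assumptions, not the exact operators of the proposed technique, and decryption simply reverses them.

# Minimal sketch: encrypt text by applying crossover to character pairs, then mutation.
# The specific crossover/mutation definitions below are assumptions for illustration.
def crossover(a: int, b: int) -> tuple[int, int]:
    """Swap the low nibbles of two character codes (single-point bit crossover)."""
    return (a & 0xF0) | (b & 0x0F), (b & 0xF0) | (a & 0x0F)

def mutate(c: int, key_byte: int) -> int:
    """Flip the bits selected by the key byte (mutation via XOR)."""
    return c ^ key_byte

def encrypt(plain: str, key: bytes) -> bytes:
    data = plain.encode("utf-8")
    if len(data) % 2:                      # pad to an even length so characters pair up
        data += b"\x00"
    out = bytearray()
    for i in range(0, len(data), 2):
        x, y = crossover(data[i], data[i + 1])
        out.append(mutate(x, key[i % len(key)]))
        out.append(mutate(y, key[(i + 1) % len(key)]))
    return bytes(out)

# Example with a hypothetical key:
# cipher = encrypt("hello world", b"\x3a\x5c")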
Phase-change materials (PCMs) have remarkable potential for use as efficient energy-storage media. However, their poor response rates during energy storage and retrieval require the use of heat-transfer enhancers to combat these limitations. This research marks the first attempt to explore the potential of dimple-shaped fins for enhancing PCM thermal response in a shell-and-tube casing. Fin arrays with different dimensions and diverse distribution patterns were designed and studied to assess the effect of modifying the fin geometric parameters and distribution patterns in various spatial zones of the physical domain. The results indicate that increasing the number of
The estimation of the parameters of linear regression is usually based on the ordinary least squares method, which rests on several basic assumptions, so the accuracy of the parameter estimates depends on the validity of these assumptions. Among these assumptions are the homogeneity of the variance and the normal distribution of the errors; they are not achievable when studying a problem that may involve complex data of more than one model, and the use of the model then becomes unrealistic. The most successful technique in this case has been the robust estimation method of minimizing the maximum likelihood estimator (MM-estimator), which has proved its efficiency for this purpose. To
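For illustration, the sketch below fits a robust regression with statsmodels and compares it with ordinary least squares on data containing gross outliers. It uses an M-estimator with Tukey's biweight function as a readily available stand-in for the full MM-estimation procedure described here (statsmodels does not provide an MM-estimator with an S-estimate of scale), and the simulated data are purely hypothetical.

# Minimal sketch: robust regression (Tukey biweight M-estimation) vs. ordinary least squares.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)
y[:5] += 15                                   # inject a few gross outliers
X = sm.add_constant(x)

ols_fit = sm.OLS(y, X).fit()                  # classical least squares
rob_fit = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit()  # robust fit

print("OLS params:   ", ols_fit.params)
print("Robust params:", rob_fit.params)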
Texture synthesis using genetic algorithms is one approach, proposed in previous research, to synthesize texture in a fast and easy way. In genetic texture-synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection depends heavily on the experience of the user, so a wrong selection of blocks greatly affects the synthesized texture. In this paper a new method is suggested for selecting the blocks automatically, without user participation. The results show that this method of selection eliminates some of the blending caused by the previous manual selection method.
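As an illustration of automatic block selection, the sketch below extracts candidate blocks from a sample texture and ranks them by how closely each block's grey-level histogram matches that of the whole sample, keeping the best-matching blocks for the chromosome. The histogram-distance criterion is an assumption for illustration; the paper's actual selection rule may differ.

# Minimal sketch: pick texture blocks automatically by histogram similarity to the sample.
# The similarity criterion and block/bin sizes are illustrative assumptions.
import numpy as np

def select_blocks(texture: np.ndarray, block: int = 32, k: int = 8):
    """Return the k blocks whose grey-level histograms best match the whole sample."""
    ref, _ = np.histogram(texture, bins=32, range=(0, 255), density=True)
    candidates = []
    h, w = texture.shape
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = texture[y:y + block, x:x + block]
            hist, _ = np.histogram(patch, bins=32, range=(0, 255), density=True)
            dist = np.abs(hist - ref).sum()      # L1 distance between histograms
            candidates.append((dist, patch))
    candidates.sort(key=lambda c: c[0])          # smallest distance = most representative
    return [patch for _, patch in candidates[:k]]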
Contour extraction from two-dimensional echocardiographic images has long been a challenge in digital image processing. This is essentially due to the heavy noise and poor quality of these images, and to artifacts such as papillary muscles, intra-cavity structures such as chordae, and valves that can interfere with endocardial border tracking. In this paper we present a technique to extract the contours of heart boundaries from a sequence of echocardiographic images, starting with pre-processing to reduce noise and improve image quality. By pre-processing the images, unclear edges are avoided and an accurate detection of both the heart boundary and the movement of the heart valves can be obtained.
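The sketch below illustrates one common pre-processing-then-tracking pipeline for such frames using OpenCV: speckle noise is reduced with a median filter, edges are detected, and the largest closed contour is taken as the cardiac boundary. The specific filter sizes and the Canny/largest-contour choice are assumptions for illustration, not the exact technique of the paper.

# Minimal sketch: extract a dominant boundary contour from one echocardiographic frame.
# Filter sizes and the Canny/largest-contour choice are illustrative assumptions.
import cv2

def extract_heart_contour(path: str):
    frame = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if frame is None:
        raise FileNotFoundError(path)
    denoised = cv2.medianBlur(frame, 7)            # suppress speckle noise
    edges = cv2.Canny(denoised, 30, 90)            # detect candidate edges
    edges = cv2.dilate(edges, None, iterations=2)  # close small gaps in the edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)      # largest contour as the heart boundary

# Example with a hypothetical frame:
# contour = extract_heart_contour('echo_frame_001.png')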