Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging relies on fluorescence optics, in which laser light is focused onto a specific spot at a defined depth in the sample. Research routinely produces a considerable number of such images, and meaningful analysis requires unbiased quantification methods. Increasing efforts to tie reimbursement to outcomes are likely to increase the need for objective data in confocal image analysis in the coming years. Visual quantification of confocal images with the naked eye is an essential but often underreported outcome measure because of the time required for manual counting and estimation. This current method is time-consuming and cumbersome, and manual measurement is imprecise because of natural differences in human visual ability. Objective outcome evaluation can therefore overcome the drawbacks of the current method and facilitate recording for documentation and research purposes. To achieve a fast and useful objective estimate of fluorescence in each image, an algorithm based on machine vision techniques was designed to extract the targeted objects in confocal images and then estimate the covered area as a percentage value comparable to the outcome of the current method; it is expected to contribute to sustainable biotechnology image analysis by reducing time and labor. The results show strong evidence that the designed objective algorithm can replace the current manual, visual quantification method, with an Intraclass Correlation Coefficient (ICC) of 0.9.
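The abstract does not give implementation details, but a minimal sketch of the core idea, thresholding a fluorescence channel and reporting the covered area as a percentage of the field, might look like the following. Otsu thresholding and the synthetic image are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np
from skimage import filters

def fluorescence_coverage(image):
    """Return the percentage of the field covered by fluorescent signal.

    Illustrative sketch only: Otsu thresholding stands in for whatever
    segmentation the paper's machine-vision algorithm actually uses.
    """
    thresh = filters.threshold_otsu(image)    # separate signal from background
    mask = image > thresh                     # binary mask of fluorescent pixels
    return 100.0 * mask.sum() / mask.size     # covered area as a percentage

# Synthetic stand-in for a single-channel confocal image.
rng = np.random.default_rng(0)
img = rng.random((512, 512))
img[100:200, 100:300] += 2.0                  # bright "stained" region
print(f"coverage: {fluorescence_coverage(img):.1f}%")
```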
Unconventional techniques called quick-look techniques have been developed to present well log calculations in a form that can be scanned easily to identify the zones that warrant more detailed analysis. These techniques, generated by service companies at the well site, are among the most useful because they provide the information needed to make decisions quickly when time is of the essence. The techniques used in this paper are:
- Apparent water resistivity (Rwa)
- Rxo/Rt
The above two methods were used to evaluate the formations of the Nasiriyah oil field (well NS-3) to identify the hydrocarbon-bearing formations. A compu…
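The abstract is truncated before the computational details, but the two quick-look indicators it names are conventional: apparent water resistivity is typically computed as Rwa = Rt/F with Archie formation factor F = a/φ^m, and Rxo/Rt compares flushed-zone to true resistivity. A minimal sketch under those textbook assumptions (the Archie parameters a and m are illustrative defaults, not values from the paper):

```python
def apparent_water_resistivity(rt, porosity, a=1.0, m=2.0):
    """Quick-look Rwa = Rt / F, with Archie formation factor F = a / phi**m.

    Rwa well above the trend of the water-bearing zones flags a potential
    hydrocarbon-bearing interval.
    """
    formation_factor = a / porosity**m
    return rt / formation_factor

def rxo_rt_ratio(rxo, rt):
    """Flushed-zone to true resistivity ratio, used as a movable-hydrocarbon flag."""
    return rxo / rt

# Example zone: Rt = 20 ohm-m, porosity = 0.18, Rxo = 8 ohm-m
print(apparent_water_resistivity(20.0, 0.18))  # ~0.65 ohm-m
print(rxo_rt_ratio(8.0, 20.0))                 # 0.4
```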
The influx of data in bioinformatics is primarily in the form of DNA, RNA, and protein sequences, which places a significant burden on scientists and computers. Some genomics studies depend on clustering techniques to group similarly expressed genes into one cluster. Clustering is a type of unsupervised learning that divides unlabeled data into clusters; the k-means and fuzzy c-means (FCM) algorithms are examples of algorithms that can be used for this purpose. Clustering is thus a common approach that divides an input space into several homogeneous zones and can be achieved with a variety of algorithms. This study used three models to cluster a brain tumor dataset. The first model uses FCM, whic…
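The truncated abstract names k-means and FCM but not their configuration, so the following is only a generic sketch of the two algorithms on toy data (the cluster count, fuzzifier m, and synthetic features are assumptions, not the study's setup):

```python
import numpy as np
from sklearn.cluster import KMeans

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]  # membership-weighted centroids
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        u = 1.0 / dist ** (2 / (m - 1))
        u /= u.sum(axis=1, keepdims=True)             # renormalize memberships
    return centers, u

# Toy data standing in for the brain tumor features; the study's dataset is not reproduced here.
X = np.vstack([np.random.randn(50, 2) + c for c in ([0, 0], [5, 5], [0, 5])])
hard_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
centers, memberships = fuzzy_c_means(X, n_clusters=3)
soft_labels = memberships.argmax(axis=1)              # defuzzify for comparison with k-means
```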
Social networking has come to dominate the world as a platform for information dissemination, and people often share information without knowing whether it is true. Social networks are now used to gain influence in many fields, such as elections and advertising, so it is not surprising that social media has become a weapon for manipulating sentiment by spreading disinformation. Propaganda is a systematic and deliberate attempt to influence people for political or religious gain. In this research paper, efforts were made to distinguish propagandist from non-propagandist text using supervised machine learning algorithms. Data were collected from news sources from July 2018 to August 2018. After annota…
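The abstract names neither the feature representation nor the specific classifiers, so the following is only a generic sketch of the kind of supervised text-classification setup it describes (TF-IDF features, logistic regression, and the placeholder corpus are assumptions, not the authors' reported configuration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Placeholder corpus; the annotated news data from the paper is not publicly reproduced here.
texts = ["sample propagandist sentence ...", "sample neutral news sentence ..."] * 50
labels = [1, 0] * 50                              # 1 = propagandist, 0 = non-propagandist

X_train, X_test, y_train, y_test = train_test_split(texts, labels, test_size=0.3, random_state=42)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```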
In this paper, we used four classification methods to classify objects and compared among them: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classifying and detecting objects; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from color to gray level, enhanced using histogram equalization, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification metho…
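A minimal sketch of the preprocessing and comparison pipeline the abstract outlines, using synthetic images in place of the MCOCO-derived data (the PCA dimensionality and classifier hyperparameters are illustrative assumptions, not the paper's values):

```python
import numpy as np
from skimage import color, exposure, transform
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

def preprocess(rgb_image):
    """Gray-level conversion, histogram equalization, and 20x20 resize, as in the abstract."""
    gray = color.rgb2gray(rgb_image)
    equalized = exposure.equalize_hist(gray)
    return transform.resize(equalized, (20, 20)).ravel()

# Synthetic stand-in data; the paper's MCOCO-derived images are not reproduced here.
rng = np.random.default_rng(0)
images = [rng.random((64, 64, 3)) for _ in range(200)]
labels = rng.integers(0, 4, size=200)

X = np.array([preprocess(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)

pca = PCA(n_components=50).fit(X_train)           # feature extraction step
X_train_p, X_test_p = pca.transform(X_train), pca.transform(X_test)

classifiers = {
    "KNN": KNeighborsClassifier(),
    "SGD": SGDClassifier(),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(max_iter=500),
}
for name, clf in classifiers.items():
    clf.fit(X_train_p, y_train)
    print(name, clf.score(X_test_p, y_test))
```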
With the wide development of computer applications and networks, the security of information has received great attention in many areas of life. One of the most important issues is how to control and prevent unauthorized access to secure information; therefore, this paper presents a combination of two efficient encryption algorithms that adds a new level of encryption to the Rijndael-AES algorithm. The paper proposes a Rijndael encryption and decryption process combined with the NTRU algorithm; the Rijndael algorithm is widely accepted due to its strong encryption, complex processing, and resistance to brute-force attack. The proposed modifications are implemented by encrypting and decrypting the Rijndael…
With the wide development of computer science and network applications, the security of information must be increased and made more complex. One of the most important issues is how to control and prevent unauthorized access to secure information; therefore, this paper presents a combination of two efficient encryption algorithms that adds a new level of encryption to the Rijndael-AES algorithm. The paper proposes a Rijndael encryption and decryption process combined with the NTRU algorithm; the Rijndael algorithm is important because of its strong encryption. The proposed updates encrypt and decrypt the Rijndael S-Box using the NTRU algorithm. These modifications enhance the degree of…
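Neither of the two related abstracts gives the construction, and no standard library exposes an "NTRU-encrypted S-Box" primitive, so the following is only a conceptual sketch of the layering they describe: the 256-byte Rijndael S-Box is passed through an additional encryption step before being used for substitution. The `ntru_encrypt_bytes` stub is a hypothetical placeholder, not a real API.

```python
# Standard Rijndael/AES S-Box (first 16 of 256 entries shown; the rest omitted for brevity).
AES_SBOX = [
    0x63, 0x7C, 0x77, 0x7B, 0xF2, 0x6B, 0x6F, 0xC5,
    0x30, 0x01, 0x67, 0x2B, 0xFE, 0xD7, 0xAB, 0x76,
    # ... remaining 240 entries ...
]

def ntru_encrypt_bytes(data: bytes, public_key) -> bytes:
    """Hypothetical placeholder for an NTRU encryption routine; how the papers
    map ciphertext back to a valid 256-byte substitution table is not stated
    in the truncated abstracts."""
    raise NotImplementedError

def modified_sbox(public_key) -> bytes:
    """Sketch of the proposed idea: derive a modified substitution table by
    running the standard S-Box through an extra NTRU encryption layer."""
    return ntru_encrypt_bytes(bytes(AES_SBOX), public_key)
```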
Abstract:
The research aims to diagnose the relationship between the environmental tax and the development of the sustainable social dimension. The environmental tax is considered a tool for promoting sustainable development across its economic, social, and environmental dimensions through the application of environmental-protection legislation and instructions, and imposing an environmental tax is expected to have a clear impact on achieving the dimensions of sustainable development and compliance, particularly with regard to the social dimension. The research relied on financial data for the years 2019-2022. The research reached a set of results, the most prominent of which was…
The problem of poverty and deprivation constitutes a humanitarian tragedy, and its continuation may threaten the political achievements reached by the state. In Iraq in particular, although the country is very rich owing to its huge economic wealth, poverty indicators are still high. In addition, a main factor in the decline in the standard of living is the weakness of the government's performance in delivering public services such as water, electricity, and sanitation. The human development index has therefore been addressed, as it expresses the achievements the state can attain at both the physical and human levels, in order to formulate appropriate strategies and policies aimed at elimin…