Prediction of material characteristics is currently one of the topical areas of application for machine learning methods. The aim of this work is to develop machine learning models that determine the rheological properties of polymers from experimental stress relaxation curves. The paper presents an overview of the main directions of metaheuristic approaches (local search, evolutionary algorithms) to solving combinatorial optimization problems. Metaheuristic algorithms for several important combinatorial optimization problems are described, with special emphasis on the construction of decision trees. A comparative analysis of algorithms for solving the regression problem in the CatBoost Regressor was carried out. The object of the study is a collection of generated data sets obtained from theoretical stress relaxation curves. Tables of initial data for training the models on all samples are presented, and a statistical analysis of the characteristics of the initial data sets is carried out. The total number of numerical experiments across all samples was 346,020 variations. The models were developed with the CatBoost machine learning library; regularization methods (Weight Decay, Decoupled Weight Decay Regularization, Augmentation) were applied to improve model accuracy, and the Z-Score method was used to normalize the data. As a result of the study, intelligent models were developed to determine the rheological parameters of polymers entering the generalized nonlinear Maxwell-Gurevich equation (initial relaxation viscosity, velocity modulus), using generated data sets for the EDT-10 epoxy binder as an example. Based on the testing results, the quality of the models was assessed, and plots of the predictions on the training and test samples, as well as of the prediction errors, were produced. The intelligent models are based on the CatBoost algorithm and are implemented in the Jupyter Notebook environment in Python.
The constructed models passed quality assessment on the following metrics: MAE, MSE, RMSE, and MAPE. The largest prediction error was 0.86 on the MAPE metric, and the smallest was 0.001 on the MSE metric. The model performance estimates obtained during testing are valid.
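The Z-Score normalization and the four error metrics named above can be sketched in a few lines of Python. This is a minimal illustration of the standard formulas, not the authors' code; in the paper these quantities would be computed on the CatBoost model's held-out predictions.

```python
import math

def z_score(values):
    """Z-Score normalization: rescale a feature column to zero mean, unit variance."""
    mean = sum(values) / len(values)
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / std for v in values]

def regression_metrics(y_true, y_pred):
    """MAE, MSE, RMSE and MAPE -- the four metrics used to assess the models."""
    n = len(y_true)
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = math.sqrt(mse)
    mape = 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / n
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape}
```

In practice the normalized features would be passed to CatBoost's `CatBoostRegressor`, and `regression_metrics` applied to its predictions on the test sample.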
CNC machines are widely used in production because they produce similar parts in minimum time, at high speed, and with minimal error. A control system was designed, implemented, and tested to operate a laboratory CNC milling machine with three axes, each moved by an attached stepper motor. The control system has two parts, hardware and software. The hardware part uses a PC (acting as the controller) connected to the CNC machine through its parallel port via a purpose-built interface circuit. The software part comprises the algorithms needed to control the machine. The sample to be machined is drawn using drawing software such as AUTOCAD or 3D MAX and is saved in a we…
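A controller of this kind must coordinate step pulses on several axes so the tool follows a straight path between drawing coordinates. A common way to do this is Bresenham-style integer interpolation; the sketch below is our illustration of that general technique for two axes, not the authors' algorithm.

```python
def line_steps(dx, dy):
    """Return a list of (step_x, step_y) single-step pulses that move two
    stepper axes dx, dy steps along an approximately straight line,
    using an integer Bresenham-style error term (no floating point)."""
    sx = 1 if dx >= 0 else -1          # step direction per axis
    sy = 1 if dy >= 0 else -1
    dx, dy = abs(dx), abs(dy)
    err = dx - dy                      # accumulated deviation from the ideal line
    x = y = 0
    steps = []
    while x != dx or y != dy:
        e2 = 2 * err
        step_x = step_y = 0
        if e2 > -dy and x != dx:       # error favors an x step
            err -= dy
            x += 1
            step_x = sx
        if e2 < dx and y != dy:        # error favors a y step
            err += dx
            y += 1
            step_y = sy
        steps.append((step_x, step_y))
    return steps
```

Each tuple would be translated by the interface circuit into one pulse (and a direction bit) on the corresponding stepper motor.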
Image compression plays an important role in reducing the size and storage of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently deep learning has achieved great successes in many areas of image processing, including image compression, where its use is growing steadily. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye…
Two unsupervised classifiers for optimal multilevel thresholding are presented: fast Otsu and k-means. These non-parametric methods provide an efficient procedure for separating regions (classes) by selecting optimal levels, either on the gray levels of the image histogram (the Otsu classifier) or on the gray levels of the image intensities (the k-means classifier); these levels represent the threshold values of the classes. To compare the experimental results of the two classifiers, the computation time is recorded, along with the number of iterations the k-means classifier needs to converge to optimal class centers. The variation in the recorded computation time of the k-means classifier is discussed.
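As an illustration of the histogram branch, the classic single-threshold Otsu criterion can be written in a few lines: pick the gray level that maximizes the between-class variance of the two resulting classes. This is our sketch of the textbook algorithm; the paper's fast multithreshold variant extends the same criterion to several levels.

```python
def otsu_threshold(hist):
    """Return the gray level maximizing between-class variance for a
    256-bin gray-level histogram (classic single-threshold Otsu)."""
    total = sum(hist)
    sum_all = sum(g * h for g, h in enumerate(hist))   # sum of gray * count
    w0 = 0            # pixel count of class 0 (levels <= t)
    sum0 = 0.0        # gray-level sum of class 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                       # class means
        m1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

The k-means classifier reaches comparable levels iteratively on the raw intensities, which is why the abstract tracks its iteration count and computation time.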
Segmentation is the process of partitioning digital images into different parts depending on texture, color, or intensity, and can be used in many fields to isolate the region to be analyzed. In this work, images of the Moon were obtained through observations at the Department of Astronomy and Space, College of Science, University of Baghdad (using telescopes and a widely used CCD camera). Different segmentation methods were applied to segment lunar craters. Craters form when celestial objects such as asteroids and meteorites crash into the surface of the Moon. Thousands of craters appear on the Moon's surface, ranging in size from a meter to many kilometers, and they provide insights into the age and geology…
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Firefly Algorithm (FA), and Simulated Annealing (SA), across workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across the different workloads was carried out. Results from the experiment…
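Of the five metaheuristics compared, simulated annealing is the simplest to sketch. The toy below assigns tasks to nodes to minimize finish time (makespan, the load of the busiest node); the task costs, move rule, and cooling schedule are our illustrative assumptions, not the paper's experimental setup.

```python
import math
import random

def makespan(assign, costs, n_nodes):
    """Finish time of a schedule: the total load of the busiest node."""
    load = [0.0] * n_nodes
    for task, node in enumerate(assign):
        load[node] += costs[task]
    return max(load)

def anneal(costs, n_nodes, iters=2000, t0=10.0, seed=0):
    """Simulated annealing over task-to-node assignments: perturb one
    task's node, accept worse schedules with probability exp(-delta/T)."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in costs]
    cur = best = makespan(assign, costs, n_nodes)
    best_assign = assign[:]
    for i in range(iters):
        temp = t0 * (1 - i / iters) + 1e-9      # linear cooling
        task = rng.randrange(len(costs))
        old = assign[task]
        assign[task] = rng.randrange(n_nodes)   # random move
        new = makespan(assign, costs, n_nodes)
        if new <= cur or rng.random() < math.exp((cur - new) / temp):
            cur = new
            if new < best:
                best, best_assign = new, assign[:]
        else:
            assign[task] = old                  # reject the move
    return best, best_assign
```

GA, PSO, ACO, and FA replace the single-solution perturbation loop with population-based search, but are scored against the same finish-time objective.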
Objective: This study aimed to evaluate the effectiveness of the Benson Relaxation Technique (BRT) in reducing pain during femoral artery sheath removal after percutaneous coronary intervention (PCI). Methods and Materials: A randomized controlled trial was conducted at three cardiac centers in Iraq. A total of 58 patients undergoing therapeutic PCI were randomly assigned to two groups: intervention (n=27) and control (n=31). The intervention group received BRT for 10 minutes before and after sheath removal. Pain was assessed using the Visual Analogue Scale (VAS) immediately after the procedure. Demographic data and clinical variables were collected. Data were analyzed using SPSS v26 and non-parametric tests (Mann-Whitney U, Krus…
The complexity and variety of language in policy and academic documents make the automatic classification of research papers by the United Nations Sustainable Development Goals (SDGs) difficult. Using both pre-trained and contextual word embeddings to increase semantic understanding, this study presents a complete deep learning pipeline combining Bidirectional Long Short-Term Memory (BiLSTM) and Convolutional Neural Network (CNN) architectures; its primary aim is to improve the comprehensibility and accuracy of SDG text classification, thereby enabling more effective policy monitoring and research evaluation. Document representation via Global Vectors (GloVe), Bidirectional Encoder Representations from Tra…
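The CNN branch of such a pipeline slides small filters over the sequence of word embeddings and max-pools each filter's response over time, yielding one feature per filter. The pure-Python sketch below shows only that operation; the dimensions and weights are illustrative, not the paper's configuration.

```python
def conv1d_maxpool(embeddings, filters):
    """Apply 1-D convolution filters over a sequence of word embeddings
    (seq_len x emb_dim) and max-pool each filter's response over time,
    as in CNN text classifiers. Returns one feature per filter."""
    features = []
    for filt in filters:                      # filt: window x emb_dim weights
        window = len(filt)
        responses = []
        for start in range(len(embeddings) - window + 1):
            acc = 0.0                         # dot product of filter and window
            for i in range(window):
                for j, w in enumerate(filt[i]):
                    acc += w * embeddings[start + i][j]
            responses.append(acc)
        features.append(max(responses))       # max-over-time pooling
    return features
```

In the full pipeline these pooled features would be concatenated with the BiLSTM's sequence representation before the classification layer.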
Melanoidins can be characterized using the Fourier transform infrared (FTIR) technique, and UV/Vis spectroscopy is an effective tool for both qualitative and quantitative analysis of the chemical components of melanoidin polymers. According to the FTIR spectra, the structural and vibrational features of melanoidins synthesized from D-glucose and D-fructose are identical, differing only in band intensity. Using the FTIR spectra, the melanoidin skeleton is divided into seven major regions. The presence of C=C, C=N, and C=O groups in all melanoidins formed from fructose and glucose with ammonia is confirmed by the region from 1600 to 1690 cm⁻¹, where the band appears largely as a broad shoulder. Both melan…