The primary objective of this paper is to propose and implement effective computational methods (DECMs) for obtaining analytic and approximate solutions of the nonlocal one-dimensional parabolic equation, which is used to model certain real-world applications. Powerful and elegant methods that use orthogonal basis functions to express the solution as a double power series are developed, based on the Bernstein, Legendre, Chebyshev, Hermite, and Bernoulli polynomials. The given partial differential equation is thereby reduced to a system of linear algebraic equations that can be solved with Mathematica®12. The DECMs are applied to several specific cases of time-dependent diffusion equations, and the maximum absolute error is computed to demonstrate the accuracy of the proposed techniques.
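The reduction described above, expanding the solution in a double series of orthogonal polynomials and collocating the PDE to obtain a linear algebraic system, can be sketched for a simple model problem. This is an illustrative sketch, not the paper's method: it uses only the Chebyshev basis, a standard heat equation u_t = u_xx on the unit square with exact solution exp(-π²t)·sin(πx), and a least-squares collocation of PDE, boundary, and initial conditions; the degree N and the weight w on the side conditions are assumptions.

```python
import numpy as np
from itertools import product
from numpy.polynomial.chebyshev import chebval2d, chebder, chebpts1

# Model problem: u_t = u_xx on (0,1)x(0,1], u(0,t)=u(1,t)=0, u(x,0)=sin(pi x).
# Exact solution: u(x,t) = exp(-pi^2 t) * sin(pi x).
N = 10                                    # degree in each variable (assumption)
xs = 0.5 * (chebpts1(N + 1) + 1.0)        # collocation nodes mapped to [0,1]
ts = 0.5 * (chebpts1(N + 1) + 1.0)

def basis_row(x, t, dx=0, dt=0):
    """Values of all (N+1)^2 Chebyshev basis derivatives at one point (x, t)."""
    row = np.empty((N + 1) * (N + 1))
    for k, (i, j) in enumerate(product(range(N + 1), range(N + 1))):
        c = np.zeros((N + 1, N + 1))
        c[i, j] = 1.0
        if dx:
            c = chebder(c, dx, scl=2.0, axis=0)   # chain rule for s = 2x - 1
        if dt:
            c = chebder(c, dt, scl=2.0, axis=1)   # chain rule for s = 2t - 1
        row[k] = chebval2d(2 * x - 1, 2 * t - 1, c)
    return row

rows, rhs = [], []
for x, t in product(xs, ts):                      # PDE residual equations
    rows.append(basis_row(x, t, dt=1) - basis_row(x, t, dx=2))
    rhs.append(0.0)
w = 10.0                                          # weight on side conditions
for t in np.linspace(0, 1, N + 2):                # boundary conditions
    for xb in (0.0, 1.0):
        rows.append(w * basis_row(xb, t)); rhs.append(0.0)
for x in np.linspace(0, 1, 2 * N):                # initial condition
    rows.append(w * basis_row(x, 0.0)); rhs.append(w * np.sin(np.pi * x))

# The PDE has been reduced to a linear algebraic system for the coefficients.
coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
C = coef.reshape(N + 1, N + 1)

# Maximum absolute error against the exact solution on a fine grid.
xg, tg = np.meshgrid(np.linspace(0, 1, 60), np.linspace(0, 1, 60))
approx = chebval2d(2 * xg - 1, 2 * tg - 1, C)
exact = np.exp(-np.pi ** 2 * tg) * np.sin(np.pi * xg)
max_err = np.abs(approx - exact).max()
```

With a smooth solution the coefficients of the double series decay rapidly, which is why such spectral reductions reach small maximum absolute errors at modest degrees.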
Computational thinking (CT) is highly useful to undergraduates in solving everyday problems. In terms of content, computational thinking involves solving problems, studying data patterns, decomposing problems using algorithms and procedures, running simulations, computer modeling, and reasoning about abstractions. However, there is a lack of studies on CT and on the skills that can be developed and utilized in information technology for learning and teaching. The descriptive research method was used, and a test instrument consisting of 24 multiple-choice items was prepared to measure the level of CT. The research study group consists of
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation memberships were also inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others
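The core of mixture regression, estimating per-component coefficients and inferring observation memberships, can be sketched with a standard EM algorithm on synthetic data. This is an illustrative sketch only: the two generating lines, the noise level, the pooled-fit initialization, and the iteration count are all assumptions, not the paper's datasets or estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-component regression data (illustrative only):
#   component 0: y = 1 + 2x + eps,   component 1: y = 5 - x + eps
n = 400
x = rng.uniform(0, 10, n)
z = (rng.random(n) < 0.5).astype(int)            # true hidden memberships
y = np.where(z == 0, 1 + 2 * x, 5 - x) + rng.normal(0, 0.5, n)
X = np.column_stack([np.ones(n), x])

K = 2
# Initialize responsibilities by the sign of the pooled-fit residual.
b0 = np.linalg.lstsq(X, y, rcond=None)[0]
above = (y - X @ b0 > 0).astype(float)
r = np.vstack([above, 1.0 - above])              # (K, n) responsibilities

beta = np.zeros((K, 2))
sigma = np.ones(K)
pi = np.full(K, 1.0 / K)

for _ in range(50):                              # EM iterations
    # M-step: weighted least squares for each component's coefficients.
    for k in range(K):
        W = r[k]
        beta[k] = np.linalg.solve((X.T * W) @ X, X.T @ (W * y))
        sigma[k] = np.sqrt((W * (y - X @ beta[k]) ** 2).sum() / W.sum())
    pi = r.mean(axis=1)
    # E-step: posterior membership probabilities under Gaussian errors.
    resid = y[None, :] - beta @ X.T
    logp = (np.log(pi)[:, None] - np.log(sigma)[:, None]
            - 0.5 * (resid / sigma[:, None]) ** 2)
    logp -= logp.max(axis=0)                     # numerical stability
    r = np.exp(logp)
    r /= r.sum(axis=0)

labels = r.argmax(axis=0)                        # inferred memberships
```

On well-separated components this recovers both regression lines and assigns most observations to their true component; membership assessment then amounts to comparing `labels` (up to label switching) with the known groups.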
The transportation problem is studied here because of its great importance to the country's economy. This paper examines several methods for finding a near-optimal solution and applies them to a practical case involving one oil derivative, the benzene product. The first aim of the study is to reduce the total cost of transporting petrol from warehouses in the province of Baghdad to stations in the Karkh and Rusafa districts of the same province. The second aim is to meet each station's demand with the required quantity, which depends on the absorptive capacity of the warehouses (the supplied quantities). And through r
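A balanced transportation problem of this kind, warehouses with fixed supplies, stations with fixed demands, and a unit-cost matrix, can be written as a linear program. The sketch below uses a small made-up cost matrix, not the study's Baghdad data, and solves it with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative cost matrix (NOT the study's real data): cost[i][j] is the
# unit cost of shipping from warehouse i to station j.
cost = np.array([[4.0, 6.0, 8.0],
                 [5.0, 3.0, 7.0]])
supply = np.array([60.0, 40.0])         # warehouse capacities
demand = np.array([30.0, 40.0, 30.0])   # station demands (balanced: 100 = 100)

m, n = cost.shape
# Equality constraints: each warehouse ships exactly its supply,
# each station receives exactly its demand.
A_eq = np.zeros((m + n, m * n))
for i in range(m):
    A_eq[i, i * n:(i + 1) * n] = 1.0    # supply rows
for j in range(n):
    A_eq[m + j, j::n] = 1.0             # demand rows
b_eq = np.concatenate([supply, demand])

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
plan = res.x.reshape(m, n)              # optimal shipping quantities
total_cost = res.fun                    # minimal total transportation cost
```

Classical heuristics such as the north-west corner rule or Vogel's approximation give starting solutions close to this optimum; the LP solver confirms the minimal cost exactly.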
In this work, a novel technique for obtaining accurate solutions to nonlinear problems, a multi-step combination with the Laplace-variational approach (MSLVIM), is introduced. Compared with the traditional variational iteration approach, it overcomes several difficulties and provides more accurate solutions with an extended convergence region, covering larger intervals and giving a continuous representation of the approximate analytic solution, which yields better information about the solution over the whole time interval. The technique also makes the general Lagrange multiplier easier to obtain, reducing the time and calculations required. It converges rapidly to the exact solution with simply computable terms
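The role of the Lagrange multiplier in the correction functional can be illustrated with the classical (single-step) variational iteration method on a simple test problem; the paper's multi-step Laplace variant is more elaborate, so this is a simplified sketch only:

```python
import sympy as sp

t, s = sp.symbols('t s')

# Classical variational iteration method for the test problem
#   u'(t) + u(t) = 0,  u(0) = 1   (exact solution: exp(-t)).
# Correction functional with optimal Lagrange multiplier lambda = -1:
#   u_{n+1}(t) = u_n(t) - Integral_0^t (u_n'(s) + u_n(s)) ds
u = sp.Integer(1)                      # u_0(t) = u(0) = 1
for _ in range(6):
    residual = sp.diff(u, t) + u       # u_n' + u_n
    u = sp.expand(u - sp.integrate(residual.subs(t, s), (s, 0, t)))

# After n iterations u equals the degree-n Taylor polynomial of exp(-t),
# i.e. the iterates converge rapidly with simply computable terms.
```

Each iteration adds exactly one further Taylor term of the exact solution, which is the rapid convergence with simply computable terms the abstract refers to; the multi-step idea restarts such iterations on subintervals to extend the region of convergence.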
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional data mining and machine learning algorithms do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances, as well as data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an
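The idea of a summary that is built once, updated incrementally, and queried at a chosen resolution can be sketched as follows. This is a minimal illustration under assumed design choices (one numeric attribute, dyadic bins, per-bin count and sum), not the paper's actual structure:

```python
from collections import defaultdict

class MultiResolutionAggregator:
    """Sketch of a multi-resolution summary. Values in [lo, hi) are binned
    at several widths; each bin keeps (count, sum) so the structure can be
    updated incrementally and queried without revisiting raw instances."""

    def __init__(self, lo, hi, levels=4):
        self.lo, self.hi = lo, hi
        self.levels = levels
        # level k has 2**k bins; each bin stores [count, sum]
        self.bins = [defaultdict(lambda: [0, 0.0]) for _ in range(levels)]

    def add(self, value):
        """Incremental update: O(levels) work per new instance."""
        frac = (value - self.lo) / (self.hi - self.lo)
        for k in range(self.levels):
            idx = min(int(frac * (2 ** k)), 2 ** k - 1)
            cell = self.bins[k][idx]
            cell[0] += 1
            cell[1] += value

    def mean(self, level, idx):
        """Approximate mean of one bin at the chosen resolution."""
        count, total = self.bins[level][idx]
        return total / count if count else None

agg = MultiResolutionAggregator(0.0, 100.0, levels=3)
for v in [5, 15, 40, 60, 95]:
    agg.add(v)
coarse = agg.mean(0, 0)   # whole-range summary (cheapest, least precise)
fine = agg.mean(2, 0)     # first quarter [0, 25) only
```

Coarse levels answer queries from a handful of cells, fine levels cost more memory but lose less accuracy, which is the efficiency/accuracy trade-off the abstract describes.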
Bacteria can produce bacterial nanocellulose through a two-step procedure, polymerization and crystallization, which occurs in the bacterial cytoplasm. Glucose residues polymerize into (β-1,4) linear glucan chains that the bacterial cell secretes extracellularly; these linear glucans are converted into microfibrils, which then assemble into a very pure, porous, three-dimensional network. Pure cellulose can thus be obtained from certain microorganisms, one of the most active producers being the acetic acid bacteria (AAB), which are gram-negative, motile, and aerobic. Bacterial nanocellulose (BNC) has received great consideration in many fields because of its flexible properties and features