A space X is called πp-normal if for each closed set F and each π-closed set F′ in X with F ∩ F′ = ∅, there are p-open sets U and V of X with U ∩ V = ∅ such that F ⊆ U and F′ ⊆ V. Our work studies and discusses a new kind of normality in generalized topological spaces. We define ϑπp-normal, ϑ-mildly normal, ϑ-almost normal, ϑp-normal, ϑ-mildly p-normal, ϑ-almost p-normal, and ϑπ-normal spaces, and discuss some of their properties.
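For reference, the opening definition can be set out in standard notation; the restatement below is a direct transcription of the sentence above, with no added assumptions.

```latex
% Restatement of the definition quoted in the abstract above.
\begin{definition}
  A space $X$ is \emph{$\pi p$-normal} if for every closed set $F$ and every
  $\pi$-closed set $F'$ in $X$ with $F \cap F' = \varnothing$, there exist
  $p$-open sets $U$ and $V$ of $X$ with $U \cap V = \varnothing$ such that
  $F \subseteq U$ and $F' \subseteq V$.
\end{definition}
```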
This research, titled "Lighting and Formal Treatments for the Interior Spaces of Shops Selling Mobile Phones," aims to reveal the relationship between lighting treatments and the formal design of the interior selling spaces of mobile phone shops, and to identify the design criteria governing lighting techniques and formal treatments in such shops. The importance of the research lies in the active role of lighting in building the formal treatments, preparing the designs, and presenting a final form that is functionally and aesthetically integrated and ready for implementation, taking into consideration its expressive role for the recipient and making these spaces more than meet the require
The general-topology view of continuous mappings is a general framework that also applies to topological graph theory. Separation axioms can be regarded as tools for distinguishing objects in information systems, and rough set theory is one way of mapping topology onto uncertainty. The aim of this work is to present graphs, continuity, separation properties, and rough sets as a new approach to uncertainty. To introduce various levels of approximation, we introduce several levels of continuity and separation axioms on graphs in Gm-closure approximation spaces.
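As a hedged illustration of the approximation idea that Gm-closure approximation spaces generalize, the sketch below computes classical (Pawlak) lower and upper rough-set approximations on a toy universe; the function names and toy data are illustrative and are not taken from the paper.

```python
# Classical Pawlak rough-set approximations (a special case of the closure
# approximation spaces discussed above).  Toy data only.

def lower_approximation(blocks, target):
    """Union of the equivalence classes fully contained in the target set."""
    return set().union(*[b for b in blocks if b <= target])

def upper_approximation(blocks, target):
    """Union of the equivalence classes that intersect the target set."""
    return set().union(*[b for b in blocks if b & target])

# Toy universe partitioned by an indiscernibility (equivalence) relation.
blocks = [frozenset({1, 2}), frozenset({3}), frozenset({4, 5, 6})]
X = {2, 3, 4}

low = lower_approximation(blocks, X)   # {3}: only the block {3} lies inside X
up = upper_approximation(blocks, X)    # {1,...,6}: every block meets X
boundary = up - low                    # the "rough" (uncertain) region of X
print(low, up, boundary)
```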
The design of interior spaces is a product of intellectual civilization: it expresses the prevailing thought, principles, and beliefs, reflects the present, and generates a variety of graphical languages that have caused a revolution in the design structure. Because of the complex nature of interior spaces, they had to be a reflection of cultural reality, being a form of cultural expression and a true embodiment of the scientific developments prevailing at each stage in which they were born. The changes occurring in human thought, together with the divergence and variation of tastes among individuals in all communities, are factors that have caused a change in the design structure, involving modernization an
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and here it is applied, with its predictive capabilities, to environmental studies. This research seeks to demonstrate a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the within-subject correlation of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be largely invalid.
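As a hedged illustration of the conditional-likelihood idea only (not the paper's environmental data or its longitudinal inference procedure), the sketch below fits conditional logistic regression to synthetic 1:1 matched case-control pairs; in that special case the conditional likelihood reduces to an intercept-free logistic model on within-pair covariate differences.

```python
# Minimal sketch: conditional logistic regression for synthetic 1:1 matched
# pairs.  Data, variable names, and the matched design are illustrative
# assumptions, not from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_pairs, p = 300, 2
beta_true = np.array([0.8, -0.5])

# Covariates of the two members of each matched pair.
x1 = rng.normal(size=(n_pairs, p))
x2 = rng.normal(size=(n_pairs, p))

# Under the conditional model, P(member 1 is the case | one case per pair)
# = sigmoid(beta' (x1 - x2)).
p_case1 = 1.0 / (1.0 + np.exp(-(x1 - x2) @ beta_true))
case_is_1 = rng.random(n_pairs) < p_case1
d = np.where(case_is_1[:, None], x1 - x2, x2 - x1)   # case minus control

def neg_cond_loglik(beta):
    # For 1:1 matching the conditional likelihood reduces to an
    # intercept-free logistic model on the within-pair differences d.
    eta = d @ beta
    return np.sum(np.log1p(np.exp(-eta)))

fit = minimize(neg_cond_loglik, np.zeros(p), method="BFGS")
print("estimated beta:", fit.x)   # should be close to beta_true
```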
Liquid electrodes of a domperidone maleate (DOMP) imprinted polymer were synthesized based on a precipitation polymerization mechanism. The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized using DOMP as a template, methyl methacrylate (MMA) as the monomer, N,N-methylenebisacrylamide (NMAA) and ethylene glycol dimethacrylate (EGDMA) as cross-linkers, and benzoyl peroxide (BP) as the initiator. The molecularly imprinted membranes were prepared using acetophenone (APH), dibutyl sebacate (DBS), dioctyl phthalate (DOPH), and triolyl phosphate (TP) as plasticizers in a PVC matrix. The slopes and limits of detection of l
This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve single-objective optimization problems: it first runs GSA, followed by BAT as the second stage. The proposed approach relies on a parameter between 0 and 1 to address the problem of becoming trapped in local search, because the lack of a local search mechanism increases the intensity of the search while diversity remains high, so the search easily falls into a local optimum. The improvement keeps the speed of the original BAT while reaching the best solution faster, and all solutions in the population are updated before the end of a run of the proposed algorithm. The diversification f
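The sketch below illustrates the two-stage GSA-then-BAT structure on a toy sphere function. The parameter schedules, the population size, and the mixing parameter `alpha` in [0, 1] (the fraction of the budget given to GSA) are illustrative assumptions, not the authors' exact settings; both stages are deliberately simplified.

```python
# Hedged sketch of a two-stage GSA -> BAT hybrid on a toy objective.
import numpy as np

rng = np.random.default_rng(1)
dim, pop, iters = 5, 20, 200
alpha = 0.5                                  # split of iterations: GSA then BAT
f = lambda x: np.sum(x**2, axis=-1)          # sphere function (minimize)

X = rng.uniform(-5, 5, (pop, dim))           # shared population
V = np.zeros((pop, dim))

# ---- Stage 1: simplified gravitational search (exploration) ----
gsa_iters = int(alpha * iters)
for t in range(gsa_iters):
    fit = f(X)
    worst, best = fit.max(), fit.min()
    m = (worst - fit) / (worst - best + 1e-12)        # raw masses (better = heavier)
    M = m / (m.sum() + 1e-12)
    G = 100 * np.exp(-20 * t / gsa_iters)             # decaying gravitational constant
    acc = np.zeros_like(X)
    for i in range(pop):
        diff = X - X[i]
        dist = np.linalg.norm(diff, axis=1) + 1e-12
        acc[i] = np.sum(rng.random((pop, 1)) * G * M[:, None] * diff / dist[:, None], axis=0)
    V = rng.random((pop, dim)) * V + acc
    X = X + V

# ---- Stage 2: simplified bat algorithm (intensification) ----
loud, pulse = 0.9, 0.5
best_x = X[np.argmin(f(X))].copy()
for t in range(iters - gsa_iters):
    freq = rng.uniform(0, 2, (pop, 1))                # pulse frequencies
    V = V + (X - best_x) * freq
    Xnew = X + V
    walk = rng.random(pop) > pulse                    # local random walk near the best bat
    Xnew[walk] = best_x + 0.01 * loud * rng.normal(size=(walk.sum(), dim))
    improve = (f(Xnew) < f(X)) & (rng.random(pop) < loud)
    X[improve] = Xnew[improve]                        # greedy acceptance
    best_x = X[np.argmin(f(X))].copy()

print("best value found:", f(best_x))
```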
Blockchain has garnered great attention as one of the most important new technologies supporting digital transactions in e-government. The most critical challenge for public e-government systems is reducing bureaucracy and increasing the efficiency and performance of their administrative processes, and blockchain technology can play a role here because it operates in a decentralized environment and executes transactions with a high level of security and transparency. The main objectives of this work are therefore to survey different proposed models of e-government system architecture based on blockchain technology and to examine how these models are validated. This work studies and analyzes some research trends focused on blockchain
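As a minimal, illustrative aside (not any of the surveyed e-government architectures), the sketch below shows the basic tamper-evidence property that underlies the transparency claim: each block commits to its predecessor's hash, so altering one record invalidates the rest of the chain. The record contents are hypothetical.

```python
# Toy hash-chained ledger to illustrate blockchain tamper evidence.
import hashlib, json, time

def make_block(prev_hash, payload):
    block = {"time": time.time(), "prev": prev_hash, "payload": payload}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, cur in zip(chain, chain[1:]):
        body = {k: cur[k] for k in ("time", "prev", "payload")}
        ok = (cur["prev"] == prev["hash"] and
              cur["hash"] == hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest())
        if not ok:
            return False
    return True

chain = [make_block("0" * 64, {"record": "genesis"})]
chain.append(make_block(chain[-1]["hash"], {"record": "hypothetical permit issued"}))
print(verify(chain))                                   # True
chain[1]["payload"]["record"] = "hypothetical permit revoked"   # tamper with one record
print(verify(chain))                                   # False: chain no longer verifies
```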
A variety of new phenolic Schiff base derivatives have been synthesized starting from terephthalaldehyde. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis, and some derivatives were evaluated by thermal analysis (TGA).
In this research we estimated the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II, 2012), for data on five-year age groups that follow the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME), and a bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems arising from the integrals contained in this distribution, in particular the incomplete gamma function, alongside the traditional maximum likelihood (ML) method. The comparison on t
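As a hedged sketch of the traditional ML route mentioned above, the example below fits a generalized gamma distribution to simulated lifetime-like data and evaluates the survival function S(t) = 1 − F(t). It does not reproduce the POME or kernel-bootstrap estimators, and the data are synthetic, not the IHSES II sample.

```python
# Maximum-likelihood fit of a generalized gamma distribution and its
# survival function, on simulated data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated nonnegative "durations" from a generalized gamma with known shapes.
sample = stats.gengamma.rvs(a=2.0, c=1.5, scale=10.0, size=500, random_state=rng)

# ML fit with the location fixed at 0, as appropriate for nonnegative durations.
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(sample, floc=0)

t = np.linspace(1, 60, 5)
surv = stats.gengamma.sf(t, a_hat, c_hat, loc=loc_hat, scale=scale_hat)
print("S(t) at", t, "=", np.round(surv, 3))
```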
In this work, a novel technique for obtaining accurate solutions of nonlinear problems, a multi-step combination with the Laplace variational iteration approach (MSLVIM), is introduced. Compared with the traditional variational approach, it overcomes the usual difficulties and provides more accurate solutions with an extended convergence region covering larger intervals, giving a continuous representation of the approximate analytic solution and better information about the solution over the whole time interval. The technique makes it easier to obtain the general Lagrange multiplier and reduces the time and calculations. It converges rapidly to the exact formula with simply computable terms wit
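For orientation, the sketch below runs the classical variational iteration method (VIM) on the toy initial-value problem u′(t) + u(t) = 0, u(0) = 1, whose exact solution is e^(−t), using the simple Lagrange multiplier λ = −1. It illustrates only the correction-functional iteration that MSLVIM builds on; it is not the paper's multi-step Laplace variant.

```python
# Classical VIM iteration for u' + u = 0, u(0) = 1 (exact solution exp(-t)).
import sympy as sp

t, s = sp.symbols("t s")
u = sp.Integer(1)                       # u_0: the initial condition u(0) = 1

for _ in range(6):
    residual = sp.diff(u, t) + u        # R[u_n](t) = u_n' + u_n
    # Correction functional with lambda = -1:
    #   u_{n+1}(t) = u_n(t) - \int_0^t R[u_n](s) ds
    u = sp.expand(u - sp.integrate(residual.subs(t, s), (s, 0, t)))

print(u)                                # partial sum of the series for exp(-t)
print(sp.series(sp.exp(-t), t, 0, 7))   # compare with the Taylor expansion
```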