In this research, several estimators of the hazard function are introduced. These estimators are based on a nonparametric method, namely kernel estimation for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four boundary kernels are used, namely the Rectangle, Epanechnikov, Biquadratic and Triquadratic kernels, and the proposed function was employed alongside all of them. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results have shown that the local bandwidth is the best for all types of boundary kernel functions
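To make the kernel approach concrete, here is a minimal sketch of a Ramlau-Hansen-style kernel-smoothed hazard estimator for right-censored data using the Epanechnikov kernel. The function names, the global-bandwidth simplification, and the simple risk-set handling are illustrative assumptions, not the paper's actual estimators; a local-bandwidth variant would let the bandwidth depend on the evaluation point.

```python
def epanechnikov(u):
    """Epanechnikov kernel: 0.75 * (1 - u^2) on [-1, 1], else 0."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kernel_hazard(t, times, events, bandwidth):
    """Kernel-smoothed hazard estimate at time t for right-censored data.

    times     : observed times (event or censoring), sorted ascending
    events    : 1 if the observation is an event, 0 if censored
    bandwidth : smoothing bandwidth b (global here; a local-bandwidth
                variant would make b a function of t)

    Ramlau-Hansen style: sum of K((t - t_i)/b) / b over event times,
    each weighted by 1 / (number at risk just before t_i).
    """
    n = len(times)
    est = 0.0
    for i, (ti, di) in enumerate(zip(times, events)):
        if di == 1:
            at_risk = n - i  # risk-set size just before ti (sorted input)
            est += epanechnikov((t - ti) / bandwidth) / (bandwidth * at_risk)
    return est
```

A local-bandwidth version would simply replace the scalar `bandwidth` with a callable `b(t)`, which is where the trade-off studied in the abstract enters.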
Secure storage of confidential medical information is critical to healthcare organizations seeking to protect patients' privacy and comply with regulatory requirements. This paper presents a new scheme for the secure storage of medical data using Chaskey cryptography and blockchain technology. The system uses Chaskey encryption to ensure the integrity and confidentiality of medical data, and blockchain technology to provide a scalable and decentralized storage solution. The system also uses Bflow segmentation and vertical segmentation technologies to enhance scalability and manage the stored data. In addition, the system uses smart contracts to enforce access control policies and other security measures. The description of the system detailing and p
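The core idea of MAC-tagged records chained into tamper-evident blocks can be sketched as follows. Chaskey is a lightweight MAC with no Python standard-library implementation, so HMAC-SHA256 stands in for it here; the key, block layout, and function names are illustrative assumptions and not the paper's scheme.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"demo-key"  # hypothetical shared key; Chaskey uses a 128-bit key

def make_block(prev_hash, record, key=SECRET_KEY):
    """Build one block: the record is MAC-tagged for integrity
    (HMAC-SHA256 standing in for Chaskey) and chained to the
    previous block's hash, blockchain-style."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    block = {"prev": prev_hash, "data": record, "tag": tag}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify_chain(chain, key=SECRET_KEY):
    """Check both the MAC on each record and the hash linkage."""
    for i, block in enumerate(chain):
        payload = json.dumps(block["data"], sort_keys=True).encode()
        expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(block["tag"], expected):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

Any modification of a stored record invalidates its MAC, and any reordering or replacement of a block breaks the hash chain, which is the tamper-evidence property the abstract relies on.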
Haruki Murakami (1949-present) is a contemporary Japanese writer whose works have been translated into fifty languages and have earned him numerous Japanese and international awards. His short stories are well constructed in a strange, realistic manner and are mixed with elements of surrealism; his novels and short stories fall under the genre of magical realism. One of the major recurring themes Murakami wrote about is the haunting feeling of emptiness and disconnectedness in a world that seems to care mostly for materialism and self-interest.
The paper explores two of Murakami’s short stories in his book After the Quake (2000) and the relevance of their themes and characters to Iraq after the q
Long memory analysis is one of the most active areas in econometrics and time series analysis, where various methods have been introduced to identify and estimate the long memory parameter in partially integrated time series. One of the most common models used to represent time series with long memory is the ARFIMA (AutoRegressive Fractionally Integrated Moving Average) model, in which the differencing order is a fractional number called the fractional parameter. To analyze and fit an ARFIMA model, the fractional parameter must be estimated. There are many methods for fractional parameter estimation. In this research, the estimation methods were divided into indirect methods, where the Hurst parameter is estimated first
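The fractional differencing operator at the heart of ARFIMA, (1-B)^d, expands into an infinite series of binomial weights; a short sketch of those weights and their application to a series is below. This is the standard textbook expansion, not any particular estimator from the research.

```python
def frac_diff_weights(d, n):
    """First n weights of the fractional differencing operator (1 - B)^d.

    w_0 = 1, and w_k = w_{k-1} * (k - 1 - d) / k, which reproduces
    the binomial expansion w_k = (-1)^k * C(d, k).
    """
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(series, d):
    """Apply fractional differencing of order d, truncated at series length."""
    w = frac_diff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1))
            for t in range(len(series))]
```

For d = 1 the weights reduce to ordinary first differencing (1, -1, 0, ...), while a fractional d between 0 and 0.5 yields the slowly decaying weights that give the process its long memory.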
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can thus be managed by different communities, and the data themselves can vary with respect to quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as the data derived from the OpenStreetMap (OSM) project. Useful descriptions o
Land Use / Land Cover (LULC) classification is one of the basic tasks that decision makers and map makers rely on to evaluate infrastructure, using different types of satellite data, despite the large spectral difference or overlap between spectra of the same land cover, in addition to the problems of aberration and image tilt that may negatively affect classification performance. The main objective of this study is to develop a working method for classifying land cover from high-resolution satellite images using an object-based method. Pixel-based maximum likelihood supervised classification as well as object-based approaches were examined on a QuickBird satellite image of Karbala, Iraq. This study illustrated that
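The pixel-based maximum likelihood step can be sketched as assigning each pixel to the class whose Gaussian model gives it the highest log-likelihood. The sketch below assumes a diagonal (per-band) covariance for brevity, whereas full ML classification uses the complete covariance matrix; the class names and statistics are made up for illustration.

```python
import math

def ml_classify(pixel, class_stats):
    """Pixel-based maximum likelihood classification with a diagonal
    Gaussian per class (a simplification of the full-covariance case).

    pixel       : list of band values for one pixel
    class_stats : {name: (means, variances)}, one mean/variance per band
    Returns the class name with the highest log-likelihood.
    """
    best, best_ll = None, -math.inf
    for name, (means, variances) in class_stats.items():
        ll = 0.0
        for x, mu, var in zip(pixel, means, variances):
            # log of the univariate Gaussian density, summed over bands
            ll += -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
        if ll > best_ll:
            best, best_ll = name, ll
    return best
```

Object-based approaches differ in that they first segment the image and then classify whole segments using aggregated statistics rather than individual pixels.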
Multispectral satellite image data (Landsat-5 and Landsat-7) were used to monitor agricultural change (extension and plant density) in the study area, using the ArcGIS program and the Soil Adjusted Vegetation Index (SAVI) method of analysis. The data cover a selected area west of the Baghdad Governorate together with parts of the Anbar and Karbala Governorates, with satellite images taken during the years 1990, 2001 and 2007. Each scene consists of seven spectral bands: Landsat-5 Thematic Mapper (TM) for the year 1990, and Landsat-7 Enhanced Thematic Mapper Plus (ETM+) for the years 2001 and 2007. The results showed that in the period from 1990 to 2001 the exposed (bare) land area decreased and increased
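The SAVI analysis the abstract refers to is a simple per-pixel formula; a minimal sketch is below. The band assignment in the comment reflects Landsat TM/ETM+ conventions, and the reflectance values in the usage are illustrative.

```python
def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index:

        SAVI = ((NIR - Red) / (NIR + Red + L)) * (1 + L)

    with L = 0.5 as the usual soil-brightness correction factor.
    For Landsat TM/ETM+, Red is band 3 and NIR is band 4.
    """
    return (nir - red) / (nir + red + L) * (1.0 + L)
```

Dense vegetation (high NIR, low Red reflectance) yields values toward 1, while bare soil yields values near 0, which is what makes the index suitable for tracking the bare-land and plant-density changes described above.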
Esterification is the most important reaction in biodiesel production. In this study, oleic acid was used as a feedstock to study and simulate biodiesel production. Batch esterification of oleic acid was carried out under the following operating conditions: temperature from 40 to 70 °C, ethanol to oleic acid molar ratio from 1:1 to 6:1, H2SO4 as the catalyst at 1 and 5 wt% of oleic acid, and reaction time up to 180 min. The optimum conditions for the esterification reaction were an ethanol/oleic acid molar ratio of 6:1, 5 wt% H2SO4 relative to oleic acid, 70 °C, and 90 min, giving an oleic acid conversion of 0.92. The activation energy for the suggested model was 26625 J/mol for the forward reaction and 42189 J/mol for the equilibrium constant. The obtained results s
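An activation energy like the 26625 J/mol reported above translates, via the Arrhenius equation, into how much the forward rate constant grows over the studied temperature range. The helper below is a generic Arrhenius sketch, not the paper's kinetic model.

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def rate_ratio(Ea, T1, T2):
    """Ratio k(T2)/k(T1) from the Arrhenius equation k = A*exp(-Ea/(R*T)),
    which gives k2/k1 = exp((Ea/R) * (1/T1 - 1/T2))."""
    return math.exp(Ea / R * (1.0 / T1 - 1.0 / T2))
```

With Ea = 26625 J/mol, going from 40 °C (313.15 K) to 70 °C (343.15 K) speeds the forward reaction up by roughly a factor of 2.4, consistent with the higher temperature appearing among the optimum conditions.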
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple data sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an
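The idea of an incrementally updated, multi-resolution summary can be sketched with a small class that keeps per-bucket counts and sums at several bucket widths. The class name, resolutions, and (count, sum) statistics are illustrative assumptions, not the structure the abstract describes.

```python
from collections import defaultdict

class MultiResolutionAggregate:
    """Incremental summary of a keyed numeric stream at several resolutions.

    Each resolution r buckets keys by key // r and keeps (count, sum),
    so coarse resolutions are cheap to query and fine ones more accurate,
    a sketch of the efficiency/accuracy trade-off described above.
    """
    def __init__(self, resolutions=(1, 10, 100)):
        self.resolutions = resolutions
        self.bins = {r: defaultdict(lambda: [0, 0.0]) for r in resolutions}

    def add(self, key, value):
        """Incremental update: fold one observation into every resolution."""
        for r in self.resolutions:
            bucket = self.bins[r][key // r]
            bucket[0] += 1
            bucket[1] += value

    def mean(self, key, resolution):
        """Mean of the bucket containing `key` at the chosen resolution."""
        count, total = self.bins[resolution][key // resolution]
        return total / count if count else None
```

Because each observation only touches one bucket per resolution, the structure is built in a single pass and updated in O(number of resolutions) per new instance, which is what lets it serve as a shared input for multiple downstream mining tasks.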