Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting the data into a graph concept frame (CFG). As is well known, DBSCAN groups points of the same kind into clusters, while points that fall outside every cluster are treated as noise or anomalies. DBSCAN can therefore detect abnormal points that lie farther than a set threshold from any cluster (extreme values). However, not all anomalies are of this kind: some data points do not recur, yet are considered abnormal with respect to a known group even though they are not far from it. The analysis showed that DBSCAN using the
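As a concrete illustration of the baseline behaviour described above, here is a minimal sketch of how DBSCAN labels out-of-cluster points as noise/anomalies; the data and parameters are made up for illustration, and scikit-learn's DBSCAN stands in for the paper's implementation:

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two dense clusters plus two far-away points (illustrative data).
cluster_a = rng.normal(loc=0.0, scale=0.3, size=(50, 2))
cluster_b = rng.normal(loc=5.0, scale=0.3, size=(50, 2))
outliers = np.array([[10.0, 10.0], [-8.0, 7.0]])
X = np.vstack([cluster_a, cluster_b, outliers])

# eps is the neighbourhood radius; min_samples the density threshold.
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)

# DBSCAN assigns label -1 to points that belong to no cluster.
anomalies = X[labels == -1]
print(len(anomalies))
```

Note that this is exactly the limitation the abstract points out: only points far from every dense region are flagged, so an anomaly that sits near a known group goes undetected.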
True random number generators are essential components for communications to be confidentially secured. In this paper, a new method is proposed to generate random sequences of numbers based on the differences between the arrival times of photons detected in a coincidence window between two single-photon counting modules
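A toy sketch of the general idea, assuming one common mapping from pairs of inter-arrival intervals to bits (the paper's exact extraction scheme may differ); simulated Poisson arrivals stand in for real detector timestamps:

```python
import random

def bits_from_intervals(timestamps):
    """Emit 1 if the first interval of a pair is longer than the second,
    0 if shorter, and skip ties.  This is one common comparison-based
    extraction scheme, not necessarily the paper's."""
    intervals = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    bits = []
    for a, b in zip(intervals[::2], intervals[1::2]):
        if a != b:
            bits.append(1 if a > b else 0)
    return bits

# Simulated exponential (Poisson-process) arrival times.
random.seed(1)
t, ts = 0.0, []
for _ in range(1000):
    t += random.expovariate(1.0)
    ts.append(t)

bits = bits_from_intervals(ts)
print(bits[:16])
```

Because the two intervals of a pair are identically distributed, comparing them yields an (ideally) unbiased bit regardless of the detector's mean count rate.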
A botnet is a malicious activity that tries to disrupt the traffic of a service on a server or network, causing great harm to the network. In recent years, botnets have become one of the constantly evolving threats. An intrusion detection system (IDS) is one type of solution used to detect network anomalies, and it plays an increasing role in computer security and information systems. It monitors different events on a computer to decide whether an intrusion has occurred, and it is used to build strategic decisions for security purposes. The current paper
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution in Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected
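A minimal sketch of a Gibbs sampler of this type for a Bayesian linear model, assuming a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance; the data, prior settings, and model are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated data (a stand-in for the paper's datasets).
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Priors: beta ~ N(0, tau2 * I), sigma2 ~ Inv-Gamma(a0, b0).
tau2, a0, b0 = 100.0, 2.0, 1.0
Lambda0 = np.eye(p) / tau2

beta, sigma2, draws = np.zeros(p), 1.0, []
for it in range(2000):
    # beta | sigma2, y  ~  multivariate normal full conditional
    Sigma_n = np.linalg.inv(X.T @ X / sigma2 + Lambda0)
    mu_n = Sigma_n @ (X.T @ y / sigma2)
    beta = rng.multivariate_normal(mu_n, Sigma_n)
    # sigma2 | beta, y  ~  inverse gamma (drawn as 1 / gamma)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
    if it >= 500:              # discard burn-in
        draws.append(beta)

beta_hat = np.mean(draws, axis=0)
print(beta_hat)
```

Averaging the post-burn-in draws gives posterior mean estimates of the coefficients, which is the kind of summary of the posterior distribution the abstract describes.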
The measurement data of the raw water quality of the Tigris River were statistically analyzed to relate the salinity value to selected raw water quality parameters. The analyzed data were collected from five water treatment plants (WTPs) located along the Tigris River in Baghdad: Al-Karkh, Al-Karama, Al-Qadisiya, Al-Dora, and Al-Wihda, for the period from 2015 to 2021. The selected parameters are total dissolved solids (TDS), electrical conductivity (EC), pH, and temperature. The main objective of this research is to develop a predictive mathematical model using SPSS software to calculate the value of salinity along the river, in addition to the effect of electrical conductivity
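The kind of model described can be sketched as a simple least-squares fit of TDS against EC; the readings below are invented placeholders, not the WTP monitoring data, and SPSS is replaced by NumPy for the illustration:

```python
import numpy as np

# Illustrative EC (uS/cm) and TDS (mg/L) readings; real values would
# come from the five WTP monitoring records, not these made-up numbers.
ec = np.array([700, 850, 900, 1100, 1250, 1400], dtype=float)
tds = np.array([450, 545, 580, 700, 800, 900], dtype=float)

# Fit TDS = b0 + b1 * EC by ordinary least squares.
b1, b0 = np.polyfit(ec, tds, 1)
r = np.corrcoef(ec, tds)[0, 1]
print(f"TDS = {b0:.1f} + {b1:.3f} * EC,  r = {r:.3f}")
```

The near-linear EC-TDS relationship (with a slope typically around 0.6-0.7 for natural waters) is what makes EC a convenient proxy for salinity in such models.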
A simple, accurate, precise, rapid, economical, and highly sensitive spectrophotometric method has been developed for the determination of tadalafil in pharmaceutical preparations and industrial wastewater samples; the drug shows a maximum absorbance at 204 nm in 1:1 ethanol-water. Beer's law was obeyed in the range of 1-7 µg/mL, with a molar absorptivity and Sandell's sensitivity of 0.783×10⁵ L/mol·cm and 4.97 ng/cm², respectively. The relative standard deviation of the method was less than 1.7%, and the accuracy (average recovery %) was 100 ± 0.13. The limits of detection and quantitation are 0.18 and 0.54 µg/mL, respectively. The method was successfully applied to the determination of tadalafil in some pharmaceutical formulations
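Given the reported molar absorptivity, the Beer-Lambert relation A = εbc can be inverted to estimate concentration from an absorbance reading. A small sketch, where the 1 cm path length and tadalafil's molar mass (~389.4 g/mol, a literature value) are assumptions not stated in the abstract:

```python
# Beer-Lambert law: A = epsilon * b * c, so c = A / (epsilon * b).
EPSILON = 0.783e5      # molar absorptivity from the abstract, L/(mol*cm)
PATH_CM = 1.0          # standard 1 cm cuvette (assumed)
MW_TADALAFIL = 389.4   # g/mol (literature value, not from the abstract)

def conc_ug_per_ml(absorbance, epsilon=EPSILON, b=PATH_CM, mw=MW_TADALAFIL):
    molar = absorbance / (epsilon * b)   # mol/L
    return molar * mw * 1e3              # g/L -> mg/L == ug/mL

print(round(conc_ug_per_ml(0.5), 2))   # -> 2.49, inside the 1-7 ug/mL range
```

An absorbance of 0.5 thus corresponds to roughly 2.5 µg/mL, comfortably within the reported linear (Beer's law) range.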
Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes, such as customer relationship management and accounting operations. It is incredibly challenging to use and extract valuable and meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in the data analysis process of resolving business issues. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has been
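The abstract introduces the Apriori algorithm; its core loop (count candidate itemsets, keep those meeting a minimum support, join the survivors into larger candidates) can be sketched on toy basket data, with the optional subset-pruning refinement omitted for brevity:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return frequent itemsets (as frozensets) with their support counts."""
    n = len(transactions)
    # Level 1 candidates: every single item seen in the data.
    k_sets = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while k_sets:
        # Count each candidate's support in one pass over the transactions.
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        survivors = {c: s for c, s in counts.items() if s / n >= min_support}
        frequent.update(survivors)
        # Join step: build (k+1)-itemset candidates from surviving k-itemsets.
        keys = list(survivors)
        k_sets = {a | b for a, b in combinations(keys, 2)
                  if len(a | b) == len(a) + 1}
    return frequent

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
freq = apriori([frozenset(b) for b in baskets], min_support=0.5)
print(sorted(tuple(sorted(s)) for s in freq))
```

On this data, {milk, butter} appears in only one of four baskets, so it falls below the 0.5 support threshold and never generates larger candidates; this early cut-off is what makes Apriori tractable on large transaction sets.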