For many applications, it is important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered to be the best edge detection algorithms in terms of matching human visual contour perception.
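The paper's MAP/ME-based detector is not reproduced here, but the sharpening/smoothing trade-off it negotiates can be illustrated with a minimal gradient detector applied to a pre-smoothed image; the `sigma` and `threshold` parameters below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's MAP/ME algorithm): it only illustrates the
# sharpening/smoothing trade-off that noise-robust edge detectors must make.
import numpy as np
from scipy import ndimage

def smoothed_gradient_edges(image, sigma=1.5, threshold=0.1):
    """Detect edges on a pre-smoothed image.

    sigma controls the trade-off: larger values suppress noise but blur
    (de-sharpen) true contours; smaller values keep contours sharp but
    admit more noise-induced false edges.
    """
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    gx = ndimage.sobel(smoothed, axis=1)   # horizontal gradient
    gy = ndimage.sobel(smoothed, axis=0)   # vertical gradient
    magnitude = np.hypot(gx, gy)
    magnitude /= magnitude.max() + 1e-12   # normalise to [0, 1]
    return magnitude > threshold           # binary edge map
```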
The majority of DDoS attacks use the TCP protocol, and the TCP SYN flooding attack is the most common among them. The SYN Cookie mechanism is used to defend against TCP SYN flooding. It is an effective defense, but it has the disadvantages of high computational cost and of not differentiating spoofed packets from legitimate ones. Filtering spoofed packets can therefore effectively enhance the SYN Cookie mechanism. Hop Count Filtering (HCF) is another mechanism, used at the server side, to filter spoofed packets. Its drawback is that it is not a complete and final solution for defending against the TCP SYN flooding attack. An enhanced mechanism that integrates the SYN Cookie with Hop Count Filtering (HCF) is therefore proposed.
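As a rough illustration of the HCF side of this combination (not the paper's enhanced mechanism), the sketch below infers a packet's hop count from its final TTL, assuming the common initial TTL values of 32, 64, 128, and 255, and flags sources whose inferred hop count deviates from a previously learned value; the table entries are hypothetical.

```python
# Hedged sketch of the Hop Count Filtering idea: the hop count is inferred
# from the received TTL and compared with a value learned for the source IP.
COMMON_INITIAL_TTLS = (32, 64, 128, 255)

def infer_hop_count(final_ttl: int) -> int:
    # Assume the initial TTL is the smallest common value >= the final TTL.
    initial = next(t for t in COMMON_INITIAL_TTLS if t >= final_ttl)
    return initial - final_ttl

def is_probably_spoofed(src_ip: str, final_ttl: int, hop_table: dict, slack: int = 1) -> bool:
    """Flag a packet as spoofed if its inferred hop count deviates from the
    stored hop count for src_ip by more than `slack` hops."""
    expected = hop_table.get(src_ip)
    if expected is None:
        return False                     # unknown source: fall back to the SYN Cookie
    return abs(infer_hop_count(final_ttl) - expected) > slack

# Example with an illustrative table learned from prior legitimate traffic.
hop_table = {"203.0.113.7": 14}
print(is_probably_spoofed("203.0.113.7", final_ttl=118, hop_table=hop_table))  # 10 hops -> True
```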
The diagnosis of acute appendicitis (AA) is sometimes elusive, and the accompanying clinical and laboratory manifestations cannot be used for a definitive diagnosis. Objective: This study aimed to evaluate the diagnostic value of the neutrophil/lymphocyte ratio (NLR) in the detection of AA. Materials and Methods: This is a cross-sectional study that included a total of 80 adult patients with AA and 62 age- and gender-matched patients with abdominal pain due to causes other than AA. Three milliliters of peripheral blood were collected from each participant. The NLR was calculated by dividing the absolute neutrophil count by the absolute lymphocyte count. A receiver operating characteristic curve was used to assess the diagnostic value of the NLR in the detection of AA.
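The NLR itself is a one-line computation, and its diagnostic value can be assessed with a standard ROC analysis, as sketched below; the counts and labels are purely illustrative, not the study's data.

```python
# Minimal sketch of the NLR computation and its ROC-based evaluation.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

neutrophils = np.array([8.1, 9.4, 3.2, 2.9])   # absolute counts, hypothetical
lymphocytes = np.array([1.1, 1.3, 2.4, 2.2])
has_aa      = np.array([1, 1, 0, 0])           # 1 = acute appendicitis

nlr = neutrophils / lymphocytes                # the ratio the study evaluates
fpr, tpr, thresholds = roc_curve(has_aa, nlr)  # ROC analysis of NLR as a marker
print("AUC:", roc_auc_score(has_aa, nlr))
```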
With the rapid development of computers and network technologies, the security of information on the internet has become compromised, and many threats may affect the integrity of such information. Many researchers have focused their work on providing solutions to these threats. Machine learning and data mining are widely used in anomaly-detection schemes to decide whether or not a malicious activity is taking place on a network. In this paper, a hierarchical classification scheme for anomaly-based intrusion detection is proposed. Two levels of feature selection and classification are used. In the first level, a global feature vector for detecting the basic attacks (DoS, U2R, R2L, and Probe) is selected. In the second level, four local feature vectors are selected.
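A minimal sketch of such a two-level scheme follows; the feature indices, the random-forest models, and the label handling are placeholder assumptions, not the paper's actual selections.

```python
# Hedged sketch of a two-level hierarchical classifier in the spirit of the
# paper: level 1 detects the basic attack category from global features,
# level 2 refines each category using its own local feature subset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

GLOBAL_FEATURES = [0, 2, 5, 7]                  # hypothetical global feature subset
LOCAL_FEATURES = {                              # hypothetical per-category subsets
    "DoS": [1, 3], "U2R": [4, 6], "R2L": [2, 8], "Probe": [0, 9],
}

class HierarchicalIDS:
    def fit(self, X, coarse_labels, fine_labels):
        coarse_labels = np.asarray(coarse_labels)
        fine_labels = np.asarray(fine_labels)
        # Level 1: classify into normal / DoS / U2R / R2L / Probe.
        self.level1 = RandomForestClassifier().fit(X[:, GLOBAL_FEATURES], coarse_labels)
        # Level 2: one specialised classifier per category on its local features.
        self.level2 = {}
        for cat, cols in LOCAL_FEATURES.items():
            mask = coarse_labels == cat
            if mask.any():
                self.level2[cat] = RandomForestClassifier().fit(
                    X[mask][:, cols], fine_labels[mask])
        return self

    def predict(self, X):
        cats = self.level1.predict(X[:, GLOBAL_FEATURES])
        refined = cats.astype(object)
        for cat, clf in self.level2.items():
            mask = cats == cat
            if mask.any():
                refined[mask] = clf.predict(X[mask][:, LOCAL_FEATURES[cat]])
        return refined
```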
The use of credit cards for online purchases has increased significantly in recent years, but it has also led to an increase in fraudulent activities that cost businesses and consumers billions of dollars annually. Detecting fraudulent transactions is crucial for protecting customers and maintaining the integrity of the financial system. However, fraudulent transactions are far fewer than legitimate ones, which results in a data imbalance that affects classification performance and biases the model evaluation results. This paper focuses on processing imbalanced data by proposing a new weighted oversampling method, wADASMO, to generate minority-class data (i.e., fraudulent transactions). The proposed method is based on th
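wADASMO itself is the paper's contribution and is not reproduced here; the sketch below only illustrates the general SMOTE-style idea of synthesizing minority-class samples by interpolating between a fraud record and one of its minority-class neighbours. The function name and parameters are assumptions for illustration.

```python
# Generic SMOTE-style oversampling sketch (not the paper's wADASMO method):
# each synthetic sample lies on the line segment between a minority point
# and one of its k nearest minority neighbours.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_like_oversample(X_min, n_new, k=5, seed=0):
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_min)
    _, idx = nn.kneighbors(X_min)                  # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))               # random minority sample
        j = idx[i, rng.integers(1, k + 1)]         # one of its minority neighbours
        lam = rng.random()                         # interpolation weight
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)
```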
The ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged, repeated, and taken as a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. The accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
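A minimal LMS cancellation loop conveys the idea: the averaged maternal template serves as the reference input, and the residual error after cancellation is the fetal ECG estimate. The filter order and step size below are illustrative assumptions, not the paper's settings.

```python
# Minimal LMS adaptive noise canceller sketch for fetal ECG extraction.
import numpy as np

def lms_cancel(abdominal, reference, order=8, mu=0.01):
    """abdominal and reference are equal-length 1-D arrays."""
    w = np.zeros(order)                     # adaptive filter weights
    fetal = np.zeros(len(abdominal))        # error signal = fetal ECG estimate
    for n in range(order, len(abdominal)):
        x = reference[n - order:n][::-1]    # most recent reference samples
        y = w @ x                           # estimated maternal component
        e = abdominal[n] - y                # residual after cancellation
        w += 2 * mu * e * x                 # LMS weight update
        fetal[n] = e
    return fetal
```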
Artificial intelligence algorithms have been used in many scientific fields in recent years. We suggest employing the TABU search algorithm to find the best estimate of the semi-parametric regression function with measurement errors in both the explanatory variables and the dependent variable; such measurement errors, rather than exact measurements, appear frequently in fields such as sport, chemistry, the biological sciences, medicine, and epidemiological studies.
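A generic Tabu-search skeleton, shown here on a simple least-squares loss, conveys the optimization idea; the neighbourhood generation, tabu-list length, and toy data are assumptions for illustration, not the paper's estimator.

```python
# Generic Tabu-search sketch: minimise a loss over parameter vectors while a
# fixed-length tabu list forbids revisiting recently accepted points.
import numpy as np

def tabu_search(loss, theta0, step=0.1, iters=200, tabu_len=10, seed=0):
    rng = np.random.default_rng(seed)
    best = current = np.asarray(theta0, dtype=float)
    tabu = []                                            # recently visited points
    for _ in range(iters):
        # Candidate moves: random perturbations of the current point.
        neighbours = [current + step * rng.standard_normal(current.shape)
                      for _ in range(20)]
        # Discard candidates too close to anything on the tabu list.
        neighbours = [n for n in neighbours
                      if not any(np.allclose(n, t, atol=step / 2) for t in tabu)]
        if not neighbours:
            continue
        current = min(neighbours, key=loss)              # best admissible neighbour
        tabu.append(current)
        tabu = tabu[-tabu_len:]                          # keep the list bounded
        if loss(current) < loss(best):
            best = current
    return best

# Example: fit slope and intercept of a noisy line by squared-error loss.
x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + 0.05 * np.random.default_rng(1).standard_normal(50)
print(tabu_search(lambda th: np.sum((y - th[0] * x - th[1]) ** 2), [0.0, 0.0]))
```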
Improving" Jackknife Instrumental Variable Estimation method" using A class of immun algorithm with practical application
The estimation of the parameters of linear regression is usually based on the ordinary Least Squares method, which rests on several basic assumptions; the accuracy of the parameter estimates therefore depends on the validity of these assumptions. Among these assumptions are the uniformity of the variance (homoscedasticity) and the normal distribution of the errors; when they do not hold, the use of the model becomes unrealistic. These assumptions are not achievable when the problem under study involves complex data drawn from more than one model. The most successful technique for this purpose has been the robust estimation method minimizing the maximum likelihood estimator (MM-estimator), which has proved its efficiency. To
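The contrast the abstract draws can be seen numerically by fitting contaminated data with ordinary least squares and with a robust estimator. The sketch below uses statsmodels' RLM with a Tukey biweight norm, which performs M-estimation (a close stand-in for the MM-estimator, which statsmodels does not implement directly); the data are synthetic.

```python
# Hedged sketch: OLS versus a robust fit on data whose outliers violate the
# normality assumption; the robust norm downweights the contaminated points.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 100)
y = 3.0 + 2.0 * x + rng.normal(0, 1, 100)
y[::10] += 25                                  # inject outliers that distort OLS

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                       # sensitive to the outliers
robust = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit()
print("OLS:   ", ols.params)
print("Robust:", robust.params)                # should stay near (3.0, 2.0)
```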