In many video and image processing applications, frames are partitioned into blocks that are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), which lets us exploit the symmetry with respect to the spatial variables. The main idea is the construction of auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. Because these matrices are independent of the image itself, they can be pre-calculated, stored, and reused. Experiments show that the proposed method achieves a speed-up of up to roughly 20 times over traditional approaches, depending on the block parameters.
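The baseline computation the abstract refers to can be sketched as follows. For a separable 2D basis, projecting a block onto every basis function reduces to two matrix multiplications instead of a per-pixel sum for each 2D function. This is only a minimal sketch of that separable projection, with an orthonormal polynomial basis built via QR as an assumption (the abstract says only "usually orthogonal polynomials"); it does not reproduce the paper's auxiliary-matrix speed-up for overlapping blocks.

```python
import numpy as np

def separable_features(block, basis_x, basis_y):
    # For separable bases, the 2D projection factors into two 1D projections:
    # F = basis_y^T @ block @ basis_x, instead of one inner product per
    # 2D basis function over every pixel.
    return basis_y.T @ block @ basis_x

rng = np.random.default_rng(0)
N = 8                                   # block size (assumed square here)
block = rng.standard_normal((N, N))

# Orthonormal 1D polynomial basis: QR-orthonormalized Vandermonde columns
# (an illustrative choice, not necessarily the paper's basis).
V = np.vander(np.linspace(-1.0, 1.0, N), 4, increasing=True)
Q, _ = np.linalg.qr(V)                  # columns are orthonormal basis functions

F = separable_features(block, Q, Q)     # 4x4 feature matrix

# Naive reference: explicit 2D basis functions b_y b_x^T, one inner product each.
F_ref = np.array([[np.sum(np.outer(Q[:, i], Q[:, j]) * block)
                   for j in range(4)] for i in range(4)])
assert np.allclose(F, F_ref)
```

The separable form costs two small matrix products per block; the paper's contribution lies in additionally sharing work across overlapping blocks, which this sketch does not attempt.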
The availability of different processing levels for satellite images makes it important to measure their suitability for classification tasks. This study investigates the impact of the Landsat data processing level on the accuracy of land cover classification using a support vector machine (SVM) classifier. The classification accuracy of Landsat 8 (LS8) and Landsat 9 (LS9) data varies notably across processing levels. For LS9, Collection 2 Level 2 (C2L2) achieved the highest accuracy (86.55%) with the polynomial kernel of the SVM classifier, surpassing the Fast Line-of-Sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) (85.31%) and Collection 2 Level 1 (C2L1) (84.93%). The LS8 data exhibit similar behavior.
The core idea of this study revolves around the news coverage of corruption issues by Iraqi satellite channels and its implications for the public's perception of the political process. The researcher designed a content analysis form encompassing both primary and sub-categories of news bulletins from the channels Dijlah and Al-Itijah, spanning 01/06/2021 to 31/08/2021, using a comprehensive enumeration method. The chosen timeframe preceded the parliamentary elections held in October 2021. Employing a descriptive-analytical approach coupled with observation, the researcher derived results that met the study's objectives. Among these findings, news items enhanced with video content topped the categories.
This research, entitled "Artistic Processing of Emotional Scenes in the Narrative Film", deals with how such emotional scenes are processed and embodied. Certain filmic elements play an effective role in deepening the viewer's sense of the importance of these scenes; their presence in the film is necessary and inevitable and cannot be dispensed with, because they form an integral connection with the rest of the film's scenes, in addition to their dramatic and aesthetic value in crystallizing the viewer's feelings and drawing him or her into the scene.
The research was divided into four chapters. The first chapter comprises the methodological framework, which presents the research problem.
This paper proposes a new method for object detection in skin cancer images: the minimum spanning tree (MST) detection descriptor. The descriptor builds on the structure of a minimum spanning tree constructed on the target training set of skin cancer images only. Detection of test objects relies on their distances to the closest edge of that tree. Our experiments show that the MST descriptor performs especially well on foggy images and in high-noise spaces. The proposed method was implemented and tested on different skin cancer images and obtained very good results.
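The core mechanism described above can be sketched in a few lines: build a minimum spanning tree over training feature vectors and score a test point by its distance to the nearest tree edge (point-to-segment distance). This is a minimal sketch under assumptions: toy 2-D feature vectors stand in for the skin-cancer image descriptors, which the abstract does not specify, and Prim's O(n²) algorithm is used for clarity rather than speed.

```python
import numpy as np

def mst_edges(X):
    # Prim's algorithm over pairwise Euclidean distances; returns edge index pairs.
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    in_tree = [0]
    edges = []
    best = D[0].copy()                 # cheapest connection of each node to the tree
    parent = np.zeros(n, dtype=int)    # tree node providing that cheapest connection
    for _ in range(n - 1):
        best[in_tree] = np.inf         # never re-pick nodes already in the tree
        j = int(np.argmin(best))
        edges.append((int(parent[j]), j))
        in_tree.append(j)
        closer = D[j] < best           # does the new node improve any connection?
        parent[closer] = j
        best = np.minimum(best, D[j])
    return edges

def dist_to_segment(p, a, b):
    # Distance from point p to the segment [a, b].
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def mst_score(p, X, edges):
    # Lower score = closer to the training structure = more object-like.
    return min(dist_to_segment(p, X[i], X[j]) for i, j in edges)

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 2))       # toy training set of 2-D features
edges = mst_edges(X)
inlier = X[0] + 0.01                   # near the training data
outlier = np.array([10.0, 10.0])       # far from the training data
assert mst_score(inlier, X, edges) < mst_score(outlier, X, edges)
```

Thresholding this score then separates test objects near the training manifold from those far away, which is the detection rule the abstract describes.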
Among metaheuristic algorithms, population-based algorithms are explorative search methods superior to local search algorithms at exploring the search space for globally optimal solutions. Their primary downside, however, is low exploitative capability, which prevents them from refining search-space neighborhoods toward more optimal solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region while exploring the global regions of the search space.
Scheduling course timetables for large university departments is a very hard problem; it has often been tackled in previous works, though with only partially optimal results. This work applies the principles of an evolutionary algorithm, using genetic operators to solve the timetabling problem and produce a randomized, fully optimized timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the space of constraints, yielding an optimal and flexible schedule with no redundancy through changes to a viable course timetable. The main contribution of this work is the increased flexibility of generating optimal timetables.
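The evolutionary approach sketched in the abstract can be illustrated with a deliberately tiny example: a chromosome assigns each course a timeslot, fitness counts clashes (one lecturer booked twice in the same slot), and selection, one-point crossover, and mutation evolve the population toward a clash-free timetable. The course and lecturer data here are invented for illustration; the paper's actual constraint model is richer than this.

```python
import random

# Toy problem: 6 courses, 3 lecturers, 3 timeslots (all hypothetical data).
COURSES = ["C1", "C2", "C3", "C4", "C5", "C6"]
LECTURER = {"C1": "A", "C2": "A", "C3": "B", "C4": "B", "C5": "C", "C6": "C"}
SLOTS = 3

def clashes(chrom):
    # A clash = the same lecturer scheduled twice in one timeslot.
    seen, count = set(), 0
    for course, slot in zip(COURSES, chrom):
        key = (LECTURER[course], slot)
        count += key in seen
        seen.add(key)
    return count

def evolve(pop_size=40, gens=200, mut_rate=0.2, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(SLOTS) for _ in COURSES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=clashes)                     # best timetables first
        if clashes(pop[0]) == 0:
            break                                 # clash-free timetable found
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(COURSES))
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < mut_rate:           # random-reset mutation
                child[rng.randrange(len(COURSES))] = rng.randrange(SLOTS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=clashes)

best = evolve()
assert clashes(best) == 0
```

Real timetabling adds room capacities, student groups, and soft preferences as weighted penalty terms in the fitness function; the loop structure stays the same.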
Signal denoising is directly related to sample estimation of received signals, whether by estimating the equation parameters of the target reflections or of the surrounding noise and clutter accompanying the data of interest. Radar signals recorded using analogue or digital devices are not immune to noise. Random (white) noise with no coherency is mainly produced in the form of random electrons and is caused by heat, the environment, and stray circuitry losses. These factors influence the output signal voltage, creating detectable noise. Differential Evolution (DE) is an effective, competent, and robust optimisation method used to solve problems in engineering and scientific domains such as signal processing.
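As a concrete illustration of DE applied to signal parameter estimation, the sketch below uses the classic DE/rand/1/bin scheme to fit the amplitude and frequency of a sinusoid buried in white noise. The damped-model-free sinusoid, the bounds, and the DE hyperparameters are all assumptions for illustration; the abstract does not specify the paper's signal model.

```python
import numpy as np

def de(objective, bounds, pop_size=30, F=0.8, CR=0.9, gens=200, seed=0):
    # Classic DE/rand/1/bin: mutate with a scaled difference vector,
    # binomial crossover, greedy selection.
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + rng.random((pop_size, len(bounds))) * (hi - lo)
    cost = np.array([objective(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            others = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(len(bounds)) < CR
            cross[rng.integers(len(bounds))] = True   # at least one gene crosses
            trial = np.where(cross, mutant, pop[i])
            trial_cost = objective(trial)
            if trial_cost <= cost[i]:                 # greedy replacement
                pop[i], cost[i] = trial, trial_cost
    return pop[np.argmin(cost)], cost.min()

# Synthetic noisy observation (hypothetical: amplitude 1.5, frequency 4 Hz).
t = np.linspace(0.0, 1.0, 200)
true_amp, true_freq = 1.5, 4.0
rng = np.random.default_rng(1)
noisy = true_amp * np.sin(2 * np.pi * true_freq * t) + 0.1 * rng.standard_normal(t.size)

def objective(p):
    # Sum of squared residuals between the parametric model and the observation.
    amp, freq = p
    return np.sum((amp * np.sin(2 * np.pi * freq * t) - noisy) ** 2)

best, err = de(objective, np.array([[0.0, 5.0], [0.1, 10.0]]))
assert abs(best[0] - true_amp) < 0.3 and abs(best[1] - true_freq) < 0.3
```

The fitted parameters then give a clean reconstruction of the signal, which is the denoising-by-estimation idea the abstract describes.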
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations.