The need for an efficient method to find the most relevant document for a given search query has become crucial due to the exponential growth in the number of documents readily available on the web. The vector space model (VSM), a classical model used in information retrieval, represents documents as vectors in space and weights their terms via a popular weighting scheme known as term frequency-inverse document frequency (TF-IDF). In this research, a method is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of vectors of TF-IDF weights; two basic and effective similarity measures, Cosine and Jaccard, were used. Using the MS MARCO dataset, this article analyzes and assesses the retrieval effectiveness of the TF-ISF weighting scheme. The results show that the TF-ISF model with the Cosine similarity measure retrieves more relevant documents. The model was evaluated against the conventional TF-IDF technique and performs significantly better on the MS MARCO data (Microsoft-curated data of Bing queries).
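For illustration, here is a minimal sketch of the weighting and matching idea described above, assuming TF-ISF is computed analogously to TF-IDF but with sentences as the counting unit; the function names, toy data layout, and exact formula are assumptions for illustration, not taken from the paper:

```python
import math
from collections import Counter

def tf_isf_vectors(documents):
    """Toy TF-ISF: term frequency times inverse *sentence* frequency.

    Each document is a list of sentences; each sentence is a list of tokens.
    Hedged sketch of the general idea, not the paper's exact weighting.
    """
    sentences = [s for doc in documents for s in doc]
    n_sent = len(sentences)
    # Number of sentences in which each term appears at least once
    sent_freq = Counter(t for s in sentences for t in set(s))
    vocab = sorted(sent_freq)

    doc_vectors = []
    for doc in documents:
        tokens = [t for s in doc for t in s]
        tf = Counter(tokens)
        # Average term weight per document, as the abstract suggests
        vec = [(tf[t] / len(tokens)) * math.log(n_sent / sent_freq[t])
               for t in vocab]
        doc_vectors.append(vec)
    return vocab, doc_vectors

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0
```

A query, vectorized over the same vocabulary, can then be scored against each document vector with `cosine` (or a Jaccard measure) to rank the documents.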
DeepFake is a concern for celebrities and the general public alike because it is simple to create. DeepFake images, especially high-quality ones, are difficult to detect by humans, by local descriptors, and by current approaches. On the other hand, detecting manipulation in video is more tractable than in a single image, and many state-of-the-art systems address it; moreover, video manipulation detection ultimately relies on detecting manipulation in individual frames. Many researchers have worked on DeepFake detection in images, but their methods involve complex mathematical calculations in the preprocessing steps and many limitations, including that the face must be frontal, the eyes must be open, and the mouth should be open with teeth visible. Also, the accuracy of their counterfeit detection ...
The importance of topology as a tool in preference theory motivates this study, in which we characterize topologies generated by digraphs. In this paper, we generalize the notions of rough set theory using two topological structures generated by the out-degree (resp. in-degree) sets of vertices of a general digraph. New types of topological rough sets are introduced and studied using new types of topological sets, and several properties of the topological rough approximations are established through a number of propositions.
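As a hedged illustration of the general construction (standard neighborhood-style definitions, not necessarily the exact ones used in the paper), the out-degree and in-degree sets of a vertex can generate topologies, and lower and upper approximations can then be defined in the usual rough-set style:

```latex
% Hedged sketch for a digraph G=(V,E); the paper's exact constructions may differ.
\[
  N^{+}(v) = \{\, u \in V : (v,u) \in E \,\}, \qquad
  N^{-}(v) = \{\, u \in V : (u,v) \in E \,\}
\]
\[
  \underline{R}(X) = \{\, v \in V : N^{+}(v) \subseteq X \,\}, \qquad
  \overline{R}(X) = \{\, v \in V : N^{+}(v) \cap X \neq \emptyset \,\}
\]
```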
This study applies a discourse analysis framework to explore the portrayal of women in Maysloon Hadi's novel The Black Eyes (2011), using Critical Discourse Analysis (CDA) and Norman Fairclough's three-dimensional model (1989) as the analytical foundation. It investigates the roles women play and the challenges they face in the novel. While there is growing interest in the portrayal of women in literature, Iraqi literature, especially from the perspective of Iraqi women writers, remains underexplored. Hadi's The Black Eyes provides a unique case for examining this intersection. Despite the novel's rich narrative, which offers insight into Iraqi women's lives, there is a lack of comprehensive CDA to understand how its language constructs ...
The study explores the use of ergative verbs in constructing clauses and their impact on backgrounding the agent's role in two selected short stories. Contrary to hypothesis No. 1, the research indicates that changes in sentence patterns do not affect the meaning of the process. Hypothesis No. 2 is also refuted, as the middle structure is found to highlight the agent's role in the science fiction short story Terra Infirmum rather than concealing it, as hypothesized for "The Invisible Man." The analysis reveals that writers utilize ergative processes to narrate stories in various ways, including transitive/active voice, intransitive/active voice, and transitive/passive voice. Furthermore, the findings suggest that writers emp...
This paper presents a comparison of denoising techniques based on a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); the procedure is iterated a second time to further improve the denoising performance, and other enhancement filters are also used: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, where each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method ...
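A minimal sketch of the kind of filter comparison described above, using standard SciPy/scikit-image filters and PSNR as the quality metric; the PCA-LPG step itself is omitted, and the test image, noise level, and filter parameters are illustrative assumptions rather than the paper's settings:

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, gaussian_filter, percentile_filter
from skimage import data, img_as_float
from skimage.metrics import peak_signal_noise_ratio

# Toy grayscale image degraded by constant-power additive Gaussian noise
clean = img_as_float(data.camera())
rng = np.random.default_rng(0)
noisy = np.clip(clean + rng.normal(0.0, 0.1, clean.shape), 0.0, 1.0)

candidates = {
    "adaptive Wiener (5x5)": wiener(noisy, (5, 5)),
    "median (3x3)": median_filter(noisy, size=3),
    "Gaussian low-pass": gaussian_filter(noisy, sigma=1.0),
    "order-statistic (25th pct)": percentile_filter(noisy, percentile=25, size=3),
}

for name, restored in candidates.items():
    psnr = peak_signal_noise_ratio(clean, np.clip(restored, 0.0, 1.0), data_range=1.0)
    print(f"{name:28s} PSNR = {psnr:.2f} dB")
```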
In recent years, many researchers have developed methods to estimate the self-similarity and long-memory parameter, best known as the Hurst parameter. In this paper, we compare nine different methods. Most of them use the slope of deviations to estimate the Hurst parameter, such as Rescaled Range (R/S), Aggregated Variance (AV), and Absolute Moments (AM), while others rely on filtration techniques, such as Discrete Variations (DV), Variance Versus Level using wavelets (VVL), and Second-Order Discrete Derivative using Wavelets (SODDW). The comparison was carried out through a simulation study to identify the most efficient method according to MASE. The simulation results show that the performance of the meth...
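As a hedged illustration of the slope-based idea behind the R/S estimator (a textbook version, not necessarily the exact variant compared in the paper):

```python
import numpy as np

def hurst_rs(x, min_block=8):
    """Estimate the Hurst parameter via the classical rescaled-range (R/S) slope.

    The series is split into blocks of increasing size; for each size the average
    rescaled range R/S is computed, and H is the slope of log(R/S) vs log(size).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_values = [], []
    size = min_block
    while size <= n // 2:
        rs_per_block = []
        for start in range(0, n - size + 1, size):
            block = x[start:start + size]
            dev = np.cumsum(block - block.mean())   # cumulative deviations
            r = dev.max() - dev.min()               # range of deviations
            s = block.std(ddof=1)                   # sample standard deviation
            if s > 0:
                rs_per_block.append(r / s)
        if rs_per_block:
            sizes.append(size)
            rs_values.append(np.mean(rs_per_block))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

# Example: white noise should give an estimate close to H = 0.5
print(hurst_rs(np.random.default_rng(1).standard_normal(4096)))
```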
Dust is a frequent contributor to health risks and climate change, and it is one of the most dangerous issues facing people today. The problem is driven by desertification, drought, agricultural practices, and sand and dust storms from neighboring regions. A deep learning (DL) regression model based on long short-term memory (LSTM) was proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a system that detects dust, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system ...
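A minimal sketch of an LSTM-plus-dense regression model of the kind described, written with the Keras API; the window length, feature count, layer sizes, and training data are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical setup: forecast the next dust reading from a window of past sensor readings
window, n_features = 24, 4          # e.g. 24 time steps, 4 sensor features (assumed)

model = keras.Sequential([
    layers.Input(shape=(window, n_features)),
    layers.LSTM(64),                # recurrent layer summarizing the window
    layers.Dense(32, activation="relu"),
    layers.Dense(1),                # regression output: forecast dust level
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy data just to show the training call
X = np.random.rand(256, window, n_features).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```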
This research aims to study the reflection of accounting for contingent assets, contingent liabilities, and provisions on the faithful representation characteristic of accounting information. To achieve this goal, a questionnaire was designed and distributed to the research sample, which consists of (50) li...