Short Text Semantic Similarity Measurement Approach Based on Semantic Network

Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large daily increase in the volume of textual data produced. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well with short texts, because two similar texts may be written in different terms through the use of synonyms. As a result, short texts should be compared semantically. In this paper, a semantic similarity measurement method between texts is presented that combines knowledge-based and corpus-based semantic information to build a semantic network representing the relationship between the compared texts, from which the degree of similarity between them is extracted. Representing a text as a semantic network is a knowledge representation that comes close to the human mind's understanding of texts, since the network reflects the sentence's semantic, syntactic, and structural knowledge; it is a visual representation of knowledge objects, their qualities, and their relationships. The WordNet lexical database is used as the knowledge-based source, while GloVe pre-trained word-embedding vectors serve as the corpus-based source. The proposed method was tested on three datasets: DSCS, SICK, and MOHLER. Good results were obtained in terms of RMSE and MAE.

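The combination of knowledge-based and corpus-based word similarity described in the abstract can be sketched as follows. This is only an illustrative toy, not the paper's method: the synonym map stands in for WordNet, the tiny 3-d vectors stand in for GloVe embeddings, and the greedy word-alignment scoring is a common simplification rather than the paper's semantic-network construction.

```python
import math

# Toy stand-ins for the two knowledge sources the paper combines:
# a WordNet-style synonym map (knowledge-based) and GloVe-style
# word vectors (corpus-based). All data here is hypothetical.
SYNONYMS = {
    "car": {"car", "automobile"},
    "automobile": {"car", "automobile"},
    "fast": {"fast", "quick"},
    "quick": {"fast", "quick"},
}
VECTORS = {  # tiny made-up 3-d embeddings
    "car": [0.9, 0.1, 0.0],
    "automobile": [0.88, 0.12, 0.02],
    "fast": [0.1, 0.9, 0.1],
    "quick": [0.12, 0.88, 0.09],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def word_sim(w1, w2):
    # Knowledge-based: maximal similarity if the words share a synonym set.
    if w2 in SYNONYMS.get(w1, {w1}):
        return 1.0
    # Corpus-based fallback: cosine of embedding vectors.
    if w1 in VECTORS and w2 in VECTORS:
        return cosine(VECTORS[w1], VECTORS[w2])
    return 1.0 if w1 == w2 else 0.0

def text_sim(t1, t2):
    # Greedy alignment: each word in t1 is matched to its most
    # similar word in t2; scores are averaged symmetrically.
    def one_way(a, b):
        return sum(max(word_sim(w, v) for v in b) for w in a) / len(a)
    a, b = t1.lower().split(), t2.lower().split()
    return (one_way(a, b) + one_way(b, a)) / 2

print(round(text_sim("the car is fast", "the automobile is quick"), 2))  # → 1.0
```

The two texts share no surface words beyond stopwords, yet score as highly similar because the synonym map links "car"/"automobile" and "fast"/"quick", which is exactly the failure mode of word-overlap measures that the paper addresses.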
Publication Date
Sat Dec 02 2017
Journal Name
Al-Khwarizmi Engineering Journal
Design of a Programmable System for Failure Modes and Effect Analysis of Steam-Power Plant Based on the Fault Tree Analysis

In this paper, the power plant system is investigated as a special type of industrial system. It plays a significant role in improving societies, since electrical energy has entered all kinds of industries and is considered the artery of modern life.

The aim of this research is to construct a programming system that can be used to identify the most important failure modes occurring in a steam-type power plant. The effects and causes of each failure mode can also be analyzed with this system, tracing down to the basic events (root causes) behind each failure mode. The construction of this system for FMEA is dependi

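The fault-tree analysis behind such an FMEA system can be sketched as a small evaluator that combines basic-event probabilities through AND/OR gates, assuming independent events. The event names and probabilities below are hypothetical illustrations, not values from the study.

```python
# Minimal fault-tree evaluation in the spirit of the FMEA/FTA system
# described above. Basic events are assumed independent; the events
# and numbers are hypothetical.

def gate_or(probs):
    # P(A or B or ...) = 1 - prod(1 - p) for independent events
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(probs):
    # P(A and B and ...) = prod(p) for independent events
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical failure mode "loss of feedwater": pump failure requires
# both the motor fault AND the standby-pump fault; the top event occurs
# if either the pump fails OR a valve sticks.
basic = {"motor_fault": 0.01, "standby_fault": 0.05, "valve_stuck": 0.02}
pump_failure = gate_and([basic["motor_fault"], basic["standby_fault"]])
top_event = gate_or([pump_failure, basic["valve_stuck"]])
print(round(top_event, 5))  # → 0.02049
```

Tracing which basic events dominate the top-event probability (here, the stuck valve) is what lets the analysis point back to the "main reasons" for each failure mode.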
Publication Date
Tue Feb 26 2019
Journal Name
Journal of Accounting and Financial Studies (JAFS)
Using the Time-Driven Activity-Based Costing (TDABC) Approach and Its Impact on Untapped Resources: An Empirical Study in the General Company for Textile Industries - Wasit

The research aims to identify the importance of using time-driven activity-based costing (TDABC) and its role in determining product costs more equitably, and thus its impact on resource-allocation policy, by reflecting the changes that occur continuously in product specifications and hence in the nature and type of operations. The research was conducted at the General Company for Textile Industries, Wasit / knitting socks factory, and was based on the main hypothesis that it is possible to calculate the cost of the activities that drive production through the time it takes to run these activities, which can then be redistributed to product cost

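The core TDABC calculation described in the hypothesis (costing activities through the time they consume, then exposing unused capacity) can be sketched numerically. All figures and activity names below are illustrative, not taken from the Wasit study.

```python
# TDABC sketch: capacity cost rate = cost of supplied capacity
# divided by practical capacity; activity cost = unit time x rate.
# The numbers and activities are hypothetical.

total_cost = 120_000.0        # cost of capacity supplied per period
practical_capacity = 60_000   # practical capacity in minutes

rate = total_cost / practical_capacity   # cost per minute

# Time equations: (minutes per unit, units produced) per activity.
activities = {"knitting": (1.5, 20_000), "packing": (0.5, 20_000)}
used_minutes = sum(t * units for t, units in activities.values())
used_cost = used_minutes * rate
unused_cost = total_cost - used_cost     # cost of untapped resources
print(rate, used_cost, unused_cost)      # → 2.0 80000.0 40000.0
```

The `unused_cost` line is the point of the method: the gap between supplied and consumed capacity is made visible, which is the "untapped resources" effect named in the title.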
Publication Date
Tue Feb 01 2011
Journal Name
IOP Conference Series: Materials Science and Engineering
Contour extraction of echocardiographic images based on pre-processing

In this work, we present a technique to extract heart contours from noisy echocardiograph images. The technique improves the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images; after applying these techniques, we obtain legible detection of heart boundaries and valve movement with traditional edge-detection methods.

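The pipeline described above (denoise, adjust contrast, then run a traditional edge detector) can be sketched on a toy array standing in for an echocardiograph frame. A real implementation would use library routines (e.g. OpenCV); this NumPy version is only illustrative, and the toy "image" is hypothetical.

```python
import numpy as np

def median3x3(img):
    # 3x3 median filter on the interior: stack the 9 shifted copies
    # of each neighbourhood and take the per-pixel median.
    h, w = img.shape
    stack = [img[i:i + h - 2, j:j + w - 2] for i in range(3) for j in range(3)]
    out = img.copy()
    out[1:-1, 1:-1] = np.median(np.stack(stack), axis=0)
    return out

def stretch_contrast(img):
    # Linear contrast stretch to [0, 1].
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-9)

def sobel_magnitude(img):
    # Sobel gradient magnitude via explicit 3x3 shifts.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    h, w = img.shape
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for i in range(3):
        for j in range(3):
            shifted = img[i:i + h - 2, j:j + w - 2]
            gx[1:-1, 1:-1] += kx[i, j] * shifted
            gy[1:-1, 1:-1] += kx.T[i, j] * shifted
    return np.hypot(gx, gy)

# Toy frame: a bright region (cavity wall) plus one impulse-noise pixel.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
img[2, 1] = 1.0                      # impulse noise
cleaned = stretch_contrast(median3x3(img))
edges = sobel_magnitude(cleaned) > 1.0
```

Running the edge detector only after filtering is the whole point: the impulse pixel is removed by the median step, so the thresholded Sobel response fires on the true boundary rather than on noise.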
Publication Date
Sun Jun 01 2014
Journal Name
Baghdad Science Journal
Multifocus Image Fusion Based on Homogeneity and Edge Measures

Image fusion is one of the most important techniques in digital image processing; it involves developing software to integrate multiple sets of data for the same location. It is one of the new fields adopted to solve digital-image problems and to produce high-quality images that contain more information for purposes such as interpretation, classification, segmentation, and compression. In this research, problems faced by digital images such as multi-focus images are addressed through a simulation process, using the camera to fuse various digital images based on previously adopted fusion techniques such as arithmetic techniques (BT, CNT and MLT), statistical techniques (LMM,

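The multifocus fusion idea can be sketched with a simple block-wise rule: for each block, keep the version from whichever source image is sharper there. Note the focus measure below is plain local variance, a common stand-in; the paper's homogeneity and edge measures are different, and the test images are synthetic.

```python
import numpy as np

def fuse(a, b, block=4):
    # Block-wise multifocus fusion: pick each block from the source
    # with the higher local variance (used here as a focus measure).
    out = np.empty_like(a)
    for i in range(0, a.shape[0], block):
        for j in range(0, a.shape[1], block):
            pa = a[i:i + block, j:j + block]
            pb = b[i:i + block, j:j + block]
            out[i:i + block, j:j + block] = pa if pa.var() >= pb.var() else pb
    return out

# Synthetic test: start from one sharp image and blur a different half
# in each "exposure" by flattening it to its mean.
rng = np.random.default_rng(0)
sharp = rng.random((8, 8))
left_focused = sharp.copy()
left_focused[:, 4:] = sharp[:, 4:].mean()    # right half defocused
right_focused = sharp.copy()
right_focused[:, :4] = sharp[:, :4].mean()   # left half defocused
fused = fuse(left_focused, right_focused)
```

Because each half is sharp in exactly one source, the variance rule recovers the original sharp image block by block.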
Publication Date
Wed May 06 2015
Journal Name
16th Conference in Natural Science and Mathematics
Efficient digital Image filtering method based on fuzzy algorithm

Recently, image enhancement techniques have become one of the most significant topics in the field of digital image processing. The basic problem in enhancement is how to remove noise or improve digital image details. In the current research, a method for digital image de-noising and detail sharpening/highlighting is proposed. The proposed approach uses fuzzy logic to process each pixel inside the entire image, and then decides whether it is noisy or needs further processing for highlighting. This is performed by examining the degree of association with neighboring elements based on a fuzzy algorithm. The proposed de-noising approach was evaluated on some standard images after corrupting them with impulse

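The fuzzy per-pixel decision described above can be sketched with a triangular membership function: a pixel's "noisiness" grows with its distance from the median of its neighbours. The thresholds `a` and `b` below are hypothetical tuning values, not the parameters of the paper's algorithm.

```python
# Fuzzy impulse-noise sketch: membership in the "noisy" set rises
# from 0 to 1 as the pixel departs from its neighbourhood median.
# Thresholds a and b are hypothetical.

def noisiness(pixel, neighbours, a=20, b=60):
    med = sorted(neighbours)[len(neighbours) // 2]
    d = abs(pixel - med)
    if d <= a:
        return 0.0            # certainly clean
    if d >= b:
        return 1.0            # certainly impulse noise
    return (d - a) / (b - a)  # fuzzy transition region

def correct(pixel, neighbours):
    # Blend the pixel with the neighbourhood median according to
    # its membership degree, so clean pixels pass through unchanged.
    mu = noisiness(pixel, neighbours)
    med = sorted(neighbours)[len(neighbours) // 2]
    return (1 - mu) * pixel + mu * med
```

An impulse pixel (e.g. 200 among neighbours around 11) gets membership 1 and is replaced by the median, while a pixel close to its neighbours keeps its value; the soft transition between the two regimes is what distinguishes this from a hard-thresholded median filter.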
Publication Date
Wed Jan 01 2020
Journal Name
AIP Conference Proceedings
Developing a lightweight cryptographic algorithm based on DNA computing

This work aims to develop a secure lightweight cipher algorithm for constrained devices. Secure communication among constrained devices is a critical issue during data transmission from client to server devices. Lightweight cipher algorithms are a security solution for constrained devices, requiring only low computational cost and small memory. However, most lightweight algorithms suffer from a trade-off between complexity and speed when trying to produce a robust cipher. The PRESENT cipher has been successfully employed as a lightweight cryptographic algorithm, surpassing other ciphers in the low complexity of its computational operations. The mathematical model of

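Two of PRESENT's standard building blocks, the 4-bit S-box layer and the bit-permutation layer, can be sketched directly from the published specification. This is not a full cipher (no key schedule, round-key addition, or 31-round loop) and says nothing about the modification this paper develops; it only illustrates why PRESENT is considered hardware-friendly.

```python
# Sketch of two PRESENT round components, per the published spec:
# a single 4-bit S-box applied to all 16 nibbles, and a fixed
# bit permutation over the 64-bit state.

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def sbox_layer(state):
    # Apply the 4-bit S-box to each of the 16 nibbles of the state.
    out = 0
    for i in range(16):
        nibble = (state >> (4 * i)) & 0xF
        out |= SBOX[nibble] << (4 * i)
    return out

def p_layer(state):
    # Bit i moves to position 16*i mod 63; bit 63 stays fixed.
    out = 0
    for i in range(64):
        bit = (state >> i) & 1
        out |= bit << (63 if i == 63 else (16 * i) % 63)
    return out
```

Both layers cost only table lookups and wiring, which is the "low-complexity operations" property the abstract attributes to PRESENT; a full implementation adds the round-key XOR and repeats the three steps for 31 rounds.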
Publication Date
Sat Aug 01 2015
Journal Name
International Journal Of Advanced Research In Computer Science And Software Engineering
Partial Encryption for Colored Images Based on Face Detection

Publication Date
Tue Oct 04 2022
Journal Name
IEEE Access
Plain, Edge, and Texture Detection Based on Orthogonal Moment

Image pattern classification is considered a significant step for image and video processing. Although various image pattern algorithms have been proposed so far that achieved adequate classification, achieving higher accuracy while reducing the computation time remains challenging to date. A robust image pattern classification method is essential to obtain the desired accuracy. This method can accurately classify image blocks into plain, edge, and texture (PET) using an efficient feature extraction mechanism. Moreover, to date, most of the existing studies are focused on evaluating their methods based on specific orthogonal moments, which limits the understanding of their potential application to various Discrete Orthogonal Moments (DOMs). The

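What plain/edge/texture (PET) block classification means can be illustrated with a deliberately simplified classifier: the paper's feature extraction uses orthogonal moments, whereas this sketch substitutes local variance and gradient-direction dominance, and all thresholds are hypothetical.

```python
import numpy as np

def classify_block(block, t_plain=1e-3, edge_ratio=4.0):
    # Simplified PET rule (NOT the paper's moment-based features):
    # near-zero variance -> plain; one dominant gradient direction
    # -> edge; otherwise -> texture.
    if block.var() < t_plain:
        return "plain"
    gx = np.abs(np.diff(block, axis=1)).mean()  # horizontal activity
    gy = np.abs(np.diff(block, axis=0)).mean()  # vertical activity
    ratio = max(gx, gy) / (min(gx, gy) + 1e-9)
    return "edge" if ratio > edge_ratio else "texture"

flat = np.zeros((8, 8))                      # uniform block
edge = np.zeros((8, 8)); edge[:, 4:] = 1.0   # single vertical boundary
rng = np.random.default_rng(1)
noise = rng.random((8, 8))                   # texture-like block
print(classify_block(flat), classify_block(edge), classify_block(noise))
# → plain edge texture
```

A moment-based variant would replace the two gradient features with a handful of low-order DOM coefficients of the block, which is precisely the design space the abstract says has been evaluated only for specific moment families.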
Publication Date
Wed Jan 01 2020
Journal Name
IEEE Access
A New Separable Moments Based on Tchebichef-Krawtchouk Polynomials

Publication Date
Sat Jul 01 2023
Journal Name
International Journal Of Computing And Digital Systems
Human Identification Based on SIFT Features of Hand Image
