The Internet of Medical Things (IoMT) is expected to become one of the most widely distributed technologies worldwide. With fifth-generation (5G) transmission, the market opportunities and hazards related to IoMT can be better exploited and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve the quality of service (QoS). Building on this, we propose the enriched energy-efficient fuzzy (EEEF) data offloading technique to enhance data delivery to the intended destination. Initially, healthcare data are collected; preprocessing is achieved by normalization; the EEEF data offloading scheme is applied; and a fruit fly optimization (FFO) technique is utilized. Performance metrics such as energy consumption, delay, resource utilization, scalability, and packet loss are analyzed and compared with existing techniques. Future work will employ a novel optimization approach for IoMT.
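The abstract names two building blocks that can be sketched compactly: min-max normalization for preprocessing, and a fruit fly optimization loop. The code below is an illustrative sketch only — the toy `cost` function, step size, and population size are assumptions, not the authors' EEEF objective.

```python
import math
import random

def min_max_normalize(values):
    """Scale readings to [0, 1] — a common normalization preprocessing step."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def fruit_fly_optimize(fitness, dim, iters=200, pop=20, seed=1):
    """Minimal fruit fly optimization (FFO) sketch: flies take random steps
    around the swarm location, the 'smell' of each fly is judged by the
    fitness function, and the swarm relocates to the best fly found."""
    rng = random.Random(seed)
    swarm = [rng.uniform(-1, 1) for _ in range(dim)]   # initial swarm location
    best_pos, best_fit = list(swarm), fitness(swarm)
    for _ in range(iters):
        for _ in range(pop):
            # each fly searches randomly near the current swarm location
            fly = [x + rng.uniform(-0.1, 0.1) for x in swarm]
            f = fitness(fly)
            if f < best_fit:                           # stronger smell found
                best_fit, best_pos = f, list(fly)
        swarm = list(best_pos)                         # swarm follows best fly
    return best_pos, best_fit

# hypothetical objective standing in for an offloading cost (energy + delay)
cost = lambda v: sum(x * x for x in v)
pos, fit = fruit_fly_optimize(cost, dim=3)
norm = min_max_normalize([10, 20, 30])
```

In a real offloading scheme the fitness function would score candidate offloading decisions by energy and delay rather than this toy quadratic.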
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
This research focuses on compiling and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. The cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
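The cubic B-spline basis underlying such a smoothing model can be evaluated with the Cox-de Boor recursion. This is a minimal pure-Python sketch (the uniform knot vector is an illustrative choice, not taken from the study); at any interior point the cubic basis functions sum to one, which is what makes the fitted curve and its first two derivatives continuous.

```python
def bspline_basis(i, k, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline of degree k at t."""
    if k == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((t - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, t, knots))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, t, knots))
    return left + right

# cubic (degree 3) basis over an illustrative uniform knot vector
knots = [0, 1, 2, 3, 4, 5, 6, 7]
# partition of unity: the four cubic basis functions active at t = 3.5 sum to 1
total = sum(bspline_basis(i, 3, 3.5, knots) for i in range(4))
```

A smoothing fit then expresses the curve as a linear combination of these basis functions, with coefficients chosen to trade off fidelity against roughness.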
The longitudinal balanced data profile was compiled into subgroup
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to a cluster head (CH). Data packets are then forwarded to a sink node in a single hop or over multiple hops, which can increase the energy depletion of the CH compared to other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co
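The CH-selection idea the abstract describes can be illustrated with the simplest residual-energy rule — pick the node with the most remaining energy so depletion is spread across the cluster. This is a generic sketch, not the specific mechanism of any scheme cited above; the node records are hypothetical.

```python
def select_cluster_head(nodes):
    """Pick the node with the most residual energy as cluster head (CH) —
    the baseline rule that clustering schemes refine to balance depletion."""
    return max(nodes, key=lambda n: n["energy"])

# hypothetical sensor nodes with residual energy as a fraction of capacity
nodes = [
    {"id": 1, "energy": 0.42},
    {"id": 2, "energy": 0.87},
    {"id": 3, "energy": 0.55},
]
ch = select_cluster_head(nodes)
```

Re-running the selection each round, after deducting the transmission cost from the current CH, rotates the role and avoids exhausting any single node.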
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was
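The refinement loop the abstract describes — mutate the parameters, keep the mutation if it lowers the Akaike criterion — can be sketched on a toy one-parameter model. The data, model, and mutation settings below are illustrative assumptions, not the study's hydrological model.

```python
import math
import random

def aic(residual_ss, n, k):
    """Akaike information criterion (Gaussian-error form) for a model
    with k parameters fitted to n observations."""
    return n * math.log(residual_ss / n) + 2 * k

def mutate(params, scale=0.1, rng=None):
    """GA-style mutation: perturb each parameter with Gaussian noise."""
    rng = rng or random.Random()
    return [p + rng.gauss(0, scale) for p in params]

# toy data: y is roughly 2x, standing in for a correlated hydrological series
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

def rss(params):
    a = params[0]                      # single slope parameter y = a*x
    return sum((y - a * x) ** 2 for x, y in zip(xs, ys))

rng = random.Random(0)
params = [1.0]
best = aic(rss(params), n=len(xs), k=1)
for _ in range(500):
    cand = mutate(params, scale=0.1, rng=rng)
    score = aic(rss(cand), n=len(xs), k=1)
    if score < best:                   # keep mutations that lower the AIC
        params, best = cand, score
```

In the full model the parameter vector would hold the cross-variable, cross-site, and two-step lag coefficients, with the same accept-if-AIC-improves logic.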
In this study, we compare the LASSO and SCAD methods, two specialized methods for handling models in partial quantile regression. The Nadaraya-Watson kernel was used to estimate the nonparametric part; in addition, the rule-of-thumb method was used to estimate the smoothing bandwidth (h). Penalty methods proved efficient in estimating the regression coefficients, but the SCAD method was best according to the mean squared error (MSE) criterion after estimating the missing data using the mean imputation method.
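The two nonparametric ingredients named above — the Nadaraya-Watson estimator and a rule-of-thumb bandwidth — have standard forms that can be sketched directly. The data below are illustrative; only the estimator and the Silverman-style bandwidth formula are standard.

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def rule_of_thumb_h(xs):
    """Rule-of-thumb bandwidth: h = 1.06 * sigma * n^(-1/5)."""
    n = len(xs)
    mean = sum(xs) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return 1.06 * sigma * n ** (-0.2)

def nadaraya_watson(x0, xs, ys, h):
    """Kernel-weighted local average of the responses at point x0."""
    w = [gaussian_kernel((x0 - x) / h) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# toy data: y = 2x, so the estimate at x0 = 5 should be close to 10
xs = list(range(10))
ys = [2 * x for x in xs]
h = rule_of_thumb_h(xs)
yhat = nadaraya_watson(5.0, xs, ys, h)
```

The slight downward pull of the estimate at x0 = 5 comes from the boundary: there is a design point at 0 but none at 10, a familiar edge effect of kernel smoothing.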
Machine learning offers significant advantages for many problems in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they employed are obsolete and poorly suited to the permeability computation. To
Objective. This study aimed to evaluate the orthodontic bond strength and enamel-preserving ability of a hydroxyapatite-nanoparticle-containing self-etch system following exposure to various ageing methods. Materials and Methods. Hydroxyapatite nanoparticles (nHAp) were incorporated into an orthodontic self-etch primer (SEP, Transbond™ plus) in three different concentrations (5%, 7%, and 9% wt) and tested against the plain SEP (control) for shear bond strength (SBS), adhesive remnant index (ARI) scores, and enamel damage in range-finding experiments using premolar teeth. The best-performing formulation was further exposed to the following four artificial ageing methods: initial debonding, 24 h water storage, one-month water stora
The stream of consciousness technique has had a great impact on literary writing in the modern age. This technique was broadly used in the late nineteenth century as a result of the decay of plot, especially in novel writing. Novelists began to use the stream of consciousness technique as a new phenomenon because it goes deeper into the human mind and soul by involving them in writing. The modern novel changed after the Victorian age from the traditional novel that treats themes of religion, culture, social matters, and the like, to a group of irregular events and thoughts that interrogate or reveal the inner feelings of readers.
This study simplifies the stream of consciousness technique by clarifying the three levels of conscious
This study aims to analyze in depth the character of the serial killer in Thomas Harris's novels, focusing on his novel Red Dragon. The study examines these complex, deep characters through Erich Fromm's concepts of the destructive nature of the human psyche and the social, psychological, and biological factors affecting serial killers. The study concludes that Thomas Harris portrayed the characters of serial killers professionally and with complexity, taking the reader on a contemplative journey into the mind and soul of the serial killer and thus reaching the climax of artistic perfection.
The coming age demands progress beyond handwork and even sub-machine dependency, and the brain-computer interface (BCI) provides the necessary pathway. As the article suggests, a BCI is a channel between the signals created by a thinking human brain and a computer, which can translate the transmitted signal into action. BCI-processed brain activity is typically measured using electroencephalography (EEG). Throughout this article, we intend to provide an accessible and up-to-date review of EEG-based BCI, concentrating on its technical aspects. In particular, we present essential neuroscience background describing how to build an EEG-based BCI, including how to evaluate which signal processing, software, and hardware techniques to use. Individu
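One signal-processing step common to many EEG-based BCIs is extracting band power (e.g. the 8-13 Hz alpha band) from a recorded window. The sketch below uses a naive DFT on a synthetic signal purely for illustration — the sampling rate and band edges are assumptions, and real pipelines use FFTs and calibrated recordings.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in the [f_lo, f_hi] Hz band via a naive DFT — a toy version of
    the band-power features used as inputs to many EEG BCI classifiers."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n                      # frequency of DFT bin k
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(-2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            im = sum(s * math.sin(-2 * math.pi * k * i / n)
                     for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

# synthetic 10 Hz "alpha" oscillation sampled at 128 Hz for one second
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha = band_power(sig, fs, 8, 13)   # band containing the oscillation
beta = band_power(sig, fs, 14, 30)   # band that should be near-empty
```

A downstream classifier would compare such band powers across channels and time windows to decode the user's intent.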