Ischemic stroke is a significant cause of morbidity and mortality worldwide. Autophagy, a process of intracellular degradation, has been shown to play a crucial role in the pathogenesis of ischemic stroke. Long non-coding RNAs (lncRNAs) have emerged as essential regulators of autophagy in various diseases, including ischemic stroke. Recent studies have identified several lncRNAs that modulate autophagy in ischemic stroke, including MALAT1, MIAT, SNHG12, H19, AC136007.2, C2dat2, MEG3, KCNQ1OT1, SNHG3, and RMRP. These lncRNAs regulate autophagy by interacting with key proteins involved in the autophagic process, such as Beclin-1, ATG7, and LC3. Understanding the role of lncRNAs in regulating autophagy in ischemic stroke may provide new insights into the pathogenesis of this disease and identify potential therapeutic targets for its treatment.
Nigella sativa has various pharmacological properties and has been used throughout history for a variety of purposes. However, there are limited data about the effects of N. sativa (NS) on human cancer cells. This study aimed to examine the effects of a methanolic extract of N. sativa on the apoptosis and autophagy pathways in the human PC3 (prostate cancer) cell line. Cell viability was assessed by MTT assay. A clonogenic assay was performed to evaluate clonogenicity, and Western blotting was used to measure caspase-3, TIGAR, p53, and LC3 protein expression. The results demonstrated that PC3 cell proliferation was inhibited, caspase-3 and p53 protein expression was induced, and LC3 protein expression was modulated. The clonogenic assay showed that PC3
Image compression is a form of data compression applied to digital images to reduce the high cost of their storage and/or transmission. Image compression algorithms can exploit visual sensitivity and the statistical properties of image data to deliver better results than generic data compression schemes used for other digital data. In the first approach, the input image is divided into blocks of 16 x 16, 32 x 32, or 64 x 64 pixels. The blocks are first converted into a string and then encoded using a lossless entropy-coding algorithm known as arithmetic coding. Pixel values that occur more frequently are coded in fewer bits than pixel values that occur less frequently.
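A minimal sketch of the block-partitioning step and the per-symbol code-length idea that arithmetic coding exploits, assuming an 8-bit grayscale image held in a NumPy array; the 16 x 16 block size and the placeholder image are illustrative, and a full arithmetic coder is not shown, only an entropy-based estimate of its output size.

```python
import numpy as np

def partition_into_blocks(image, block=16):
    """Split a grayscale image into non-overlapping block x block tiles."""
    h, w = image.shape
    h, w = h - h % block, w - w % block          # drop any ragged border
    return (image[:h, :w]
            .reshape(h // block, block, w // block, block)
            .swapaxes(1, 2)
            .reshape(-1, block, block))

def ideal_code_length_bits(block):
    """Entropy-based estimate of the bits an arithmetic coder would need:
    frequent pixel values cost few bits, rare values cost more."""
    values, counts = np.unique(block, return_counts=True)
    p = counts / counts.sum()
    return float(np.sum(counts * -np.log2(p)))

if __name__ == "__main__":
    img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)  # placeholder image
    blocks = partition_into_blocks(img, block=16)
    est = sum(ideal_code_length_bits(b) for b in blocks)
    print(f"{len(blocks)} blocks, ~{est / 8:.0f} bytes vs {img.size} raw bytes")
```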
In this paper, an algorithm that can embed more data than conventional spatial-domain methods is introduced. The secret data are first compressed using Huffman coding, and this compressed data is then embedded using a Laplacian sharpening method. Laplacian filters are used to determine the effective hiding places; based on a threshold value, the positions with the highest values acquired from these filters are selected for embedding the watermark. The aim of this work is to increase the capacity of the embedded information by using Huffman coding and, at the same time, to increase the security of the algorithm by hiding the data in the positions with the highest edge values, where it is less noticeable. The perform
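A minimal sketch of the edge-based site selection described above, assuming an 8-bit grayscale cover image in a NumPy array; the quantile threshold, the use of scipy.ndimage.laplace, and the plain LSB substitution at the selected positions are illustrative assumptions, and the Huffman compression of the secret data is not shown.

```python
import numpy as np
from scipy import ndimage

def select_hiding_sites(cover, keep_fraction=0.05):
    """Rank pixels by the magnitude of their Laplacian response and keep
    the strongest ones (edge regions) as candidate embedding positions."""
    response = np.abs(ndimage.laplace(cover.astype(np.float64)))
    threshold = np.quantile(response, 1.0 - keep_fraction)
    rows, cols = np.nonzero(response >= threshold)
    return list(zip(rows.tolist(), cols.tolist()))

def embed_bits(cover, bits, sites):
    """Write one secret bit into the least significant bit of each selected pixel."""
    stego = cover.copy()
    for bit, (r, c) in zip(bits, sites):
        stego[r, c] = (stego[r, c] & 0xFE) | bit
    return stego

if __name__ == "__main__":
    cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # placeholder cover image
    secret_bits = [1, 0, 1, 1, 0, 0, 1, 0]                        # e.g. a Huffman-coded payload
    sites = select_hiding_sites(cover)
    stego = embed_bits(cover, secret_bits, sites)
    print("pixels changed:", int(np.count_nonzero(stego != cover)))
```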
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding. Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and they are very small compared with the original signals. The compression ratio is calculated from the size of the compressed file relative to that of the original signal.
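A minimal sketch of the two stages, assuming PyWavelets is available and using a synthetic signal; the decomposition level, prediction order, and autocorrelation-based Levinson-Durbin recursion shown here are illustrative choices rather than the authors' exact configuration.

```python
import numpy as np
import pywt

def levinson_durbin(r, order):
    """Solve the normal equations from autocorrelation r, returning LP
    coefficients (with a[0] = 1), reflection coefficients, and prediction error."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    error = r[0]
    reflection = []
    for m in range(1, order + 1):
        acc = r[m] + sum(a[i] * r[m - i] for i in range(1, m))
        k = -acc / error
        reflection.append(k)
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + k * a_prev[m - i]
        a[m] = k
        error *= (1.0 - k * k)
    return a, np.array(reflection), error

if __name__ == "__main__":
    fs = 8000
    t = np.arange(fs) / fs
    speech = np.sin(2 * np.pi * 200 * t) + 0.3 * np.random.randn(fs)  # placeholder signal

    coeffs = pywt.wavedec(speech, "db4", level=4)
    approx = coeffs[0]                      # keep the approximation band, ignore the details

    r = np.correlate(approx, approx, mode="full")[len(approx) - 1:]
    lpc, refl, err = levinson_durbin(r, order=10)
    print("LP coefficients:", np.round(lpc, 3))
```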
Stroke is the second leading cause of death worldwide and one of the most common causes of disability. Although several approaches, such as robotic devices and virtual reality systems, have been proposed for stroke patient rehabilitation, researchers have found that brain-computer interface (BCI) approaches can provide better results. In this study, an electroencephalography (EEG) dataset from post-stroke patients was investigated to identify the effects of motor imagery (MI)-based BCI therapy by examining sensorimotor areas using frequency- and time-domain features, and to select methods that help enhance MI-based BCI systems for stroke patients through EEG signal processing. Therefore, to detect
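A minimal sketch of the kind of frequency- and time-domain features mentioned above, assuming a single-channel EEG segment sampled at 250 Hz and using SciPy's Welch estimator; the mu-band limits, window length, and log-variance feature are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

def mu_band_power(eeg, fs=250.0, band=(8.0, 12.0)):
    """Frequency-domain feature: mean Welch PSD inside the mu band,
    commonly associated with sensorimotor activity during motor imagery."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[mask].mean())

def log_variance(eeg):
    """Time-domain feature: log of the signal variance over the trial."""
    return float(np.log(np.var(eeg)))

if __name__ == "__main__":
    fs = 250.0
    t = np.arange(2 * int(fs)) / fs
    trial = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # placeholder trial
    print("mu band power:", mu_band_power(trial, fs))
    print("log variance:", log_variance(trial))
```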
In this paper, we focus on one of the recent applications of PU-algebras in coding theory, namely the construction of codes by soft set PU-valued functions. First, we introduce the notion of soft set PU-valued functions on a PU-algebra and investigate some of their related properties. Moreover, the codes generated by a soft set PU-valued function are constructed and several examples are given. Furthermore, an example, with graphs, of a binary block code constructed from a soft set PU-valued function is presented.
The aim of the research is to prepare motor-sense exercises for developing the motor and physiological abilities underlying the backstroke and forward-stroke service skills in badminton and to investigate their effect. The research adopted the experimental method with a two-group design. The sample consisted of 8 players (13-15 years), divided into two groups of 4 players each. Both groups underwent pre- and post-tests; after the experiment was finished, the results were statistically analyzed. The results showed a positive development of the motor and physiological abilities of the service skill in badminton. Finally, the prepared exercises are recommended for developing players' abilities in badminton.
In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without the need for extra storage. The proposed method uses a cubic Bezier surface (CBI) representation over wide areas of the image in order to prune the image component that shows large-scale variation. The produced cubic Bezier surface is then subtracted from the image signal to obtain the residue component, and a bi-orthogonal wavelet transform is applied to decompose this residue. Scalar quantization and quadtree coding steps are applied to the produced wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain efficient compression.
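A minimal sketch of the first stage only: evaluating a bicubic Bezier patch from a 4 x 4 grid of control points and subtracting it from an image region to form the residue. The control-point choice, patch size, and single-channel region are illustrative assumptions, and the wavelet, quantization, and coding stages are not shown.

```python
import numpy as np

def bernstein_cubic(t):
    """The four cubic Bernstein basis functions evaluated at parameter t in [0, 1]."""
    return np.array([(1 - t) ** 3,
                     3 * t * (1 - t) ** 2,
                     3 * t ** 2 * (1 - t),
                     t ** 3])

def bezier_patch(control, height, width):
    """Sample a bicubic Bezier surface (4 x 4 control points) on a height x width grid."""
    surface = np.zeros((height, width))
    for i, u in enumerate(np.linspace(0.0, 1.0, height)):
        bu = bernstein_cubic(u)
        for j, v in enumerate(np.linspace(0.0, 1.0, width)):
            surface[i, j] = bu @ control @ bernstein_cubic(v)
    return surface

if __name__ == "__main__":
    region = np.random.randint(0, 256, (32, 32)).astype(np.float64)  # placeholder image region
    idx = np.linspace(0, 31, 4).astype(int)        # illustrative control points: coarse samples
    control = region[np.ix_(idx, idx)]
    smooth = bezier_patch(control, *region.shape)
    residue = region - smooth                      # residue goes on to the wavelet stage
    print("residue energy:", float(np.sum(residue ** 2)))
```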
Objective: The underlying molecular basis of ischemic heart diseases (IHDs) has not yet been studied among Iraqi people. This study determined the frequency and types of some cardiovascular genetic risk factors among Iraqi patients with IHDs. Methods: This cross-sectional study recruited 56 patients with acute IHD during a 2-month period, excluding patients >50 years of age and patients with documented hyperlipidemia. Their ages ranged between 18 and 50 years; 54 were male and only 2 were female. Peripheral blood samples were aspirated from all patients for troponin I and DNA testing. Molecular analysis to detect 12 common cardiovascular genetic risk factors using the CVD StripAssay® (ViennaLab Diagnostics GmbH, Austria) was performed