Big data analysis has important applications in many areas such as sensor networks and connected healthcare. The high volume and velocity of big data pose many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as decision trees and nearest neighbor search. The proposed method can handle streaming data efficiently and, for entropy discretization, provides the optimal split value.
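As an illustration of the entropy discretization step, the following is a minimal sketch of how an optimal split value can be chosen by minimizing class-weighted entropy over candidate cut points. This is a generic baseline, not the paper's multi-resolution algorithm; the function names are ours.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_entropy_split(values, labels):
    """Return the cut point minimizing the class-weighted entropy.

    Candidate cuts are midpoints between consecutive distinct values,
    as in standard entropy-based discretization.
    """
    values, labels = np.asarray(values), np.asarray(labels)
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    n = len(values)
    best_cut, best_score = None, np.inf
    for i in range(1, n):
        if values[i] == values[i - 1]:
            continue  # no boundary between equal values
        cut = (values[i] + values[i - 1]) / 2.0
        left, right = labels[:i], labels[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut, best_score

x = np.array([1.0, 1.5, 2.0, 8.0, 8.5, 9.0])
y = np.array([0, 0, 0, 1, 1, 1])
print(best_entropy_split(x, y))  # cut near 5.0, weighted entropy 0.0
```

On a summarized (histogram-like) representation, the same search would run over bin boundaries rather than raw values, which is what makes a summarization-based approach scalable to streams.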
The hydraulic behavior of the flow in open channels can be changed by using large-scale geometric roughness elements. This change can help control erosion and sedimentation along the mainstream of the channel. Roughness elements can be large stone or concrete blocks placed on the channel bed to impose more resistance. The geometry, number, and configuration of the roughness elements are parameters that can affect the flow's hydraulic characteristics. In this paper, the velocity distribution along the flume was theoretically investigated using a series of tests of T-shaped roughness elements of fixed height, arranged in three different configurations that differ in the number of lines of roughness elements.
This article presents the results of an experimental investigation of using carbon fiber–reinforced polymer (CFRP) sheets to enhance the behavior of reinforced concrete deep beams with large web openings in shear spans. A set of 18 specimens was fabricated and tested up to failure to evaluate the structural performance in terms of cracking, deformation, and load-carrying capacity. All tested specimens were 1500 mm long, 500 mm deep, and 150 mm wide. The parameters studied were opening size, opening location, and the strengthening factor. Two deep beams were implemented as control specimens without openings and without strengthening. Eight deep beams were fabricated with openings but without strengthening, while the remaining eight were fabricated with openings and strengthened with CFRP sheets.
The aggregation capacity of human red blood cells lies between that of the non-aggregated erythrocyte and remarkably full sedimentation. As the ability to aggregate is attributed to many factors, such as the availability of macromolecules and plasma lipids, the role of the plasma lipid profile in RBC aggregation and sedimentation changes in normal and diabetic patients is studied. Serum lipid profile measurements (total cholesterol, triglycerides, HDL, LDL, and VLDL) in normal and diabetic subjects were also made. The principle of measurement involves detecting the transmitted laser light through a suspension of 10% diluted red blood cells in plasma. In all diabetics, rouleaux formation and the sedimentation rate are enhanced.
The successful implementation of deep learning nets opens up possibilities for various applications in viticulture, including disease detection, plant health monitoring, and grapevine variety identification. With progressive advancements in the domain of deep learning, further refinements in the models and datasets can be expected, potentially leading to even more accurate and efficient classification systems for grapevine leaves and beyond. Overall, this research provides valuable insights into the potential of deep learning for agricultural applications and paves the way for future studies in this domain. This work employs a convolutional neural network (CNN)-based architecture to perform grapevine leaf image classification.
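The abstract does not specify the network layout; the following is a minimal PyTorch sketch of the kind of CNN such a classifier might use. The layer sizes, input resolution, and class count are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class LeafCNN(nn.Module):
    """Minimal CNN for leaf image classification (illustrative):
    two conv blocks followed by global pooling and a linear classifier."""
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 224 -> 112
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 112 -> 56
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = LeafCNN(num_classes=5)
logits = model(torch.randn(1, 3, 224, 224))  # one RGB leaf image
print(logits.shape)  # torch.Size([1, 5])
```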
Investigating gender differences based on emotional changes is essential to understanding various human behaviors in our daily life. An electroencephalogram (EEG) dataset was recorded from ten students recruited at the University of Vienna while they watched four short emotional video clips (anger, happiness, sadness, and neutral) of audiovisual stimuli. In this study, conventional filter and wavelet transform (WT) denoising techniques were applied as a preprocessing stage, and the Hurst exponent was then computed from the denoised signals.
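For reference, the Hurst exponent of a signal segment is commonly estimated by rescaled-range (R/S) analysis. The sketch below is a generic implementation of that estimator, not necessarily the one used in the study.

```python
import numpy as np

def hurst_rs(signal, min_chunk=8):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis.

    Split the signal into chunks of increasing size, average the R/S
    statistic per size, and fit log(R/S) against log(size); the slope
    of the fit is the Hurst estimate.
    """
    signal = np.asarray(signal, dtype=float)
    n = len(signal)
    sizes, rs_values = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_chunk = []
        for start in range(0, n - size + 1, size):
            chunk = signal[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())  # cumulative deviations
            r = dev.max() - dev.min()              # range of deviations
            s = chunk.std()                        # standard deviation
            if s > 0:
                rs_per_chunk.append(r / s)
        if rs_per_chunk:
            sizes.append(size)
            rs_values.append(np.mean(rs_per_chunk))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

# H is near 0.5 for uncorrelated noise, above 0.5 for persistent signals
print(hurst_rs(np.random.randn(4096)))
```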
In this research, the empirical Bayes method is used to estimate the affiliation parameter in clinical trials, and the result is compared with the moment estimate of this parameter using Monte Carlo simulation. We assumed that the distribution of the observations is binomial, while the distribution of the unknown random parameter is beta. We conclude that the empirical Bayes method for the random affiliation parameter is more efficient in terms of mean squared error (MSE) across different sample sizes.
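A minimal sketch of this kind of Monte Carlo comparison under a beta-binomial model follows. The parameter values, the crude method-of-moments fit of the prior, and the numerical guards are our illustrative assumptions, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_mse(a=2.0, b=5.0, n_trials=20, n_units=50, n_reps=2000):
    """Compare the moment estimator x/n with an empirical-Bayes
    shrinkage estimator when theta_i ~ Beta(a, b) and
    x_i ~ Binomial(n_trials, theta_i)."""
    mse_moment = mse_eb = 0.0
    for _ in range(n_reps):
        theta = rng.beta(a, b, size=n_units)
        x = rng.binomial(n_trials, theta)
        p = x / n_trials  # per-unit moment estimate
        # Crude method-of-moments fit of (a, b) from the proportions
        m, v = p.mean(), max(p.var(), 1e-8)
        common = m * (1 - m) / v - 1
        a_hat = max(m * common, 1e-3)
        b_hat = max((1 - m) * common, 1e-3)
        # Posterior-mean (shrinkage) estimate for each unit
        eb = (x + a_hat) / (n_trials + a_hat + b_hat)
        mse_moment += np.mean((p - theta) ** 2)
        mse_eb += np.mean((eb - theta) ** 2)
    return mse_moment / n_reps, mse_eb / n_reps

# The empirical-Bayes estimator typically attains the smaller MSE
print(simulate_mse())
```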
A database is characterized as a collection of data organized and distributed in a way that allows the user to access the stored data simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time.
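To make the MapReduce pattern concrete, below is a minimal Hadoop Streaming sketch that computes a mean amplitude per EEG channel. The "channel,sample" CSV input format and the per-channel aggregation are illustrative assumptions; the paper's actual jobs are not specified.

```python
# mapper.py -- emits tab-separated (channel, sample) pairs from CSV input
import sys

for line in sys.stdin:
    try:
        channel, value = line.strip().split(",")
        float(value)  # validate the sample is numeric
        print(f"{channel}\t{value}")
    except ValueError:
        continue  # skip malformed records
```

```python
# reducer.py -- input arrives sorted by key (the Hadoop shuffle does this);
# aggregates consecutive records into a mean amplitude per channel
import sys

current, total, count = None, 0.0, 0
for line in sys.stdin:
    channel, value = line.strip().split("\t")
    if channel != current:
        if current is not None:
            print(f"{current}\t{total / count}")
        current, total, count = channel, 0.0, 0
    total += float(value)
    count += 1
if current is not None:
    print(f"{current}\t{total / count}")
```

The pair can be tested locally with `cat data.csv | python mapper.py | sort | python reducer.py`, and submitted to a cluster via the standard Hadoop Streaming jar with `-mapper mapper.py -reducer reducer.py`.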