The seasonal behavior of the light curves of the selected stars SS UMi and EX Dra during their outburst cycles is studied. This behavior reflects the maximum outburst temperature of a dwarf nova. The raw data were mathematically modeled by fitting a Gaussian function, characterized by its full width at half maximum (FWHM) and its peak value. The results of this modeling give the temperature of each dwarf nova system, which in turn helps identify the elements each dwarf nova consists of.
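The Gaussian fitting described above can be sketched as follows. This is an illustrative example only: the light-curve data below are synthetic, and the parameter values are placeholders, not results from the paper.

```python
# Sketch: fitting a Gaussian to an outburst light curve and recovering
# the peak value and FWHM. Data are synthetic, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amplitude, t0, sigma):
    """Gaussian profile; FWHM = 2*sqrt(2*ln 2)*sigma."""
    return amplitude * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

# Synthetic outburst: peak 2.5 (arbitrary flux units) at day 10.
t = np.linspace(0, 20, 200)
rng = np.random.default_rng(0)
flux = gaussian(t, 2.5, 10.0, 2.0) + rng.normal(0, 0.05, t.size)

popt, _ = curve_fit(gaussian, t, flux, p0=[1.0, 9.0, 1.0])
amplitude, t0, sigma = popt
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma
print(f"peak={amplitude:.2f}, centre={t0:.2f} d, FWHM={fwhm:.2f} d")
```

The fitted peak and FWHM are the two quantities the abstract says the temperature estimate is based on.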
The current research discusses the formal data within a methodological framework by defining the research problem, limits, and objectives, and by defining the most important terms mentioned in the research. The first section of the theoretical framework addressed the concept of the Bauhaus school, the philosophy of the Bauhaus school, and the logical bases of this school. The second section dealt with the most important elements and structural bases of the Bauhaus school, which are considered the most important formal data of this school, and their implications for fabric and costume design. The research concluded with the most important indicators resulting from the theoretical framework.
Chapter three defined the
This paper presents an attempt to model the rate of penetration (ROP) for an Iraqi oil field with the aid of mud-logging data. Data from the Umm Radhuma formation were selected for this modeling; they include weight on bit, rotary speed, flow rate, and mud density. A statistical approach was applied to these data to improve ROP modeling. As a result, an empirical linear ROP model was developed with a good fit to the actual data. Nonlinear regression analyses of different forms were also attempted, and the results showed that the power model has good predictive capability relative to the other forms.
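The two regression forms contrasted above can be sketched as follows. The data, coefficient values, and units here are assumptions for illustration, not the paper's field data; the power model is linearised with logarithms so both fits reduce to least squares.

```python
# Sketch of linear vs. power-law ROP regression on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 120
wob = rng.uniform(5, 25, n)       # weight on bit (assumed units: klb)
rpm = rng.uniform(60, 180, n)     # rotary speed (rev/min)
rop = 0.8 * wob**0.9 * rpm**0.5 * rng.lognormal(0, 0.05, n)  # synthetic

# Linear model: ROP = b0 + b1*WOB + b2*RPM, ordinary least squares.
X_lin = np.column_stack([np.ones(n), wob, rpm])
beta_lin, *_ = np.linalg.lstsq(X_lin, rop, rcond=None)

# Power model: ROP = a * WOB^b * RPM^c, fitted after taking logarithms.
X_log = np.column_stack([np.ones(n), np.log(wob), np.log(rpm)])
beta_log, *_ = np.linalg.lstsq(X_log, np.log(rop), rcond=None)
a, b, c = np.exp(beta_log[0]), beta_log[1], beta_log[2]
print(f"power model: ROP ~ {a:.2f} * WOB^{b:.2f} * RPM^{c:.2f}")
```

Because the synthetic ROP is generated from a power law, the log-linear fit recovers the exponents closely, which mirrors the abstract's finding that the power form predicts better than the purely linear one on such data.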
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture, and data storage is its most important service. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. To protect the privacy of data owners, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which is crucial for data storage: traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes combining compressive sensing and video deduplication to maximize
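One standard building block for deduplication over encrypted data is convergent encryption, in which the key is derived from the plaintext itself, so identical files produce identical ciphertexts that the cloud can deduplicate without reading them. This is a generic sketch of that idea, not the compressive-sensing scheme this particular paper proposes, and the toy cipher below is for illustration only.

```python
# Sketch of convergent encryption: key = hash(plaintext), so equal
# plaintexts yield equal ciphertexts and remain deduplicable.
import hashlib

def convergent_encrypt(plaintext: bytes):
    key = hashlib.sha256(plaintext).digest()   # content-derived key
    # Toy keystream cipher for illustration only; a real system would
    # use an authenticated cipher keyed by this content-derived key.
    keystream = b""
    counter = 0
    while len(keystream) < len(plaintext):
        keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return key, ciphertext

k1, c1 = convergent_encrypt(b"same file contents")
k2, c2 = convergent_encrypt(b"same file contents")
print(c1 == c2)  # True: identical plaintexts dedupe despite encryption
```

The determinism that enables deduplication is also the scheme's known weakness: an attacker who can guess a plaintext can confirm it, which is one reason the abstract notes that existing solutions suffer from security weaknesses.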
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler and faster, and they use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations each take expected time O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
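The search and insert operations described above can be sketched as follows. This is a minimal singly-linked variant (forward pointers only, rather than the four-pointer nodes the abstract mentions); names and level constants are illustrative.

```python
# Minimal skip list sketch: search and insert, expected O(log n) each.
import random

MAX_LEVEL = 16
P = 0.5  # probability of promoting a node to the next level

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)

class SkipList:
    def __init__(self):
        self.head = Node(None, MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        # Descend a level whenever the next key would overshoot.
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node          # last node before key on level i
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [3, 7, 9, 12]:
    sl.insert(k)
print(sl.search(9), sl.search(8))  # True False
```

The coin-flip promotion in `_random_level` is what makes the structure a randomized algorithm with expected, rather than worst-case, logarithmic bounds.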
In this research, a factorial experiment (4*4) applied in a completely randomized block design was studied. The design of experiments is used to study the effect of treatments on experimental units and thus to obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observed values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by suggesting an improved threshold that takes the different transform levels into account based on the logarithm of the b
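Wavelet shrinkage as a noise filter can be sketched as follows. This uses a single-level Haar transform with the standard universal threshold as a stand-in; the paper's improved level-dependent threshold is not reproduced here, and the signal is synthetic.

```python
# Sketch of wavelet shrinkage: Haar transform, soft-threshold the
# detail coefficients, invert. Universal threshold, known noise sigma.
import numpy as np

def haar_dwt(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(d, thr):
    return np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)

rng = np.random.default_rng(2)
n = 256
clean = np.sin(np.linspace(0, 4 * np.pi, n))
noisy = clean + rng.normal(0, 0.3, n)

a, d = haar_dwt(noisy)
thr = 0.3 * np.sqrt(2 * np.log(n))   # universal threshold (sigma known)
denoised = haar_idwt(a, soft(d, thr))
print(f"MSE noisy {np.mean((noisy - clean)**2):.3f} -> "
      f"denoised {np.mean((denoised - clean)**2):.3f}")
```

A level-dependent threshold, as the abstract proposes, would replace the single `thr` with one value per decomposition level.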
In this paper, an image compression technique based on the zonal transform method is presented. The DCT, Walsh, and Hadamard transform techniques are also implemented. These different transforms are applied to SAR images using different block sizes, and the effects of implementing them are investigated. The main shortcoming of this radar imaging system is the presence of speckle noise, which affects the compression results.
Cloud computing provides a huge amount of space for data storage, but as the number of users and the size of their data increase, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of data. One of the important methods for saving space in cloud storage is data deduplication, a compression technique that allows only one copy of the data to be saved and eliminates the extra copies. To offer security and privacy for sensitive data while supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of
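The basic deduplication mechanism the attack exploits can be sketched as follows: a content-addressed store keeps one copy per distinct hash. This is a generic illustration, not the paper's scheme; real client-side deduplication protocols add a proof-of-ownership step precisely to block the hash-only attacks the abstract describes.

```python
# Minimal content-addressed deduplication store: one blob per digest.
import hashlib

class DedupStore:
    def __init__(self):
        self.blobs = {}   # digest -> data (one copy per content)
        self.files = {}   # filename -> digest

    def put(self, name, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        existed = digest in self.blobs
        self.blobs.setdefault(digest, data)
        self.files[name] = digest
        return existed    # True means storage was deduplicated

    def get(self, name) -> bytes:
        return self.blobs[self.files[name]]

store = DedupStore()
store.put("a.txt", b"hello")
print(store.put("b.txt", b"hello"))  # True: second copy deduplicated
print(len(store.blobs))              # 1: only one blob is stored
```

If the server accepts the digest alone as proof of possession, anyone who learns a file's short hash can claim (and later retrieve) the file, which is the vulnerability class identified above.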