This paper reports experimentally obtained conditions for fusion splicing of photonic crystal fibers (PCFs) with large mode areas. The physical mechanism of the splice loss and the microhole-collapse behavior of the PCF were studied. By controlling the arc power and arc time of a conventional electric-arc fusion splicer (FSM-60S), the minimum splice loss for fusing two conventional single-mode fibers (SMF-28), which have similar mode field diameters, was 0.00 dB. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased because of the mode-field mismatch.
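The mode-field-mismatch contribution to splice loss invoked above is commonly estimated with the Gaussian-field approximation; the formula below is this textbook estimate, not taken from the paper, and the example radii are purely illustrative:

\[ \alpha_{\mathrm{MFD}} = -20\,\log_{10}\!\left(\frac{2\,w_{1}w_{2}}{w_{1}^{2}+w_{2}^{2}}\right)\ \mathrm{dB}, \]

where \(w_{1}\) and \(w_{2}\) are the mode-field radii of the two fibers. Identical radii give 0 dB, consistent with the reported SMF-28 to SMF-28 result; for illustrative radii \(w_{1}=5.2\,\mu\mathrm{m}\) and \(w_{2}=4.0\,\mu\mathrm{m}\), the mismatch alone contributes about 0.30 dB.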
Nebivolol (NBH) is a third-generation β1-blocker with high selectivity and vasodilation activity. Nevertheless, nebivolol exhibits low oral bioavailability, which may adversely affect its efficacy. The supersaturable self-nanoemulsion (Su-SNE) is an advanced SNE approach that can address low bioavailability. This study aims to prepare a nebivolol-loaded Su-SNE by reducing the amount of the prepared conventional SNE to half. In addition, an appropriate polymer type and concentration to prevent NBH precipitation upon oral administration were investigated. A conventional self-nanoemulsion (formula A) was prepared by dissolving NBH in a 500 mg vehicle mixture of Imwitor® 988: Cremophor-EL: propylene glycol. Then, eight Su-SNE formulations…
Data encryption translates data into another form or symbol so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values of an image are distributed over more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 and…
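As a concrete illustration of the entropy measure described above, this minimal sketch computes the Shannon entropy of an 8-bit grayscale frame from its gray-level histogram; the function name and test frames are placeholders, not from the paper:

import numpy as np

def shannon_entropy(image: np.ndarray) -> float:
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(image.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                 # gray-level probabilities
    p = p[p > 0]                          # drop empty bins (0*log 0 = 0)
    return float(-(p * np.log2(p)).sum())

# A well-encrypted frame should approach the 8-bit maximum log2(256) = 8.
rng = np.random.default_rng(0)
plain  = np.full((64, 64), 128, dtype=np.uint8)          # uniform gray: entropy 0
cipher = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # noise-like: entropy ~8
print(shannon_entropy(plain), shannon_entropy(cipher))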
This research studies panel (paired) data models with mixed random parameters, which contain two types of parameters: one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, while the fixed parameter arises from differences in the fixed intercepts; the random errors of each cross-section exhibit heteroskedasticity as well as first-order serial correlation. The main objective of this research is to use efficient estimation methods suited to paired data with small samples; to achieve this goal, the feasible generalized least squares…
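The estimation approach named above is feasible generalized least squares; in its standard textbook form (not necessarily the paper's exact specification) the estimator is

\[ \hat{\beta}_{\mathrm{FGLS}} = \left(X^{\top}\hat{\Omega}^{-1}X\right)^{-1}X^{\top}\hat{\Omega}^{-1}y, \]

where \(\hat{\Omega}\) estimates the error covariance matrix that, for this model, combines a separate variance \(\sigma_i^2\) for each cross-section (heteroskedasticity) with first-order autoregressive within-section correlation, \(\mathrm{Cov}(u_{it},u_{is}) = \sigma_i^2\,\rho^{|t-s|}\).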
The log-logistic distribution is an important statistical distribution that can be applied in many fields, including biological and other experiments; its importance stems from the need to determine the survival function of those experiments. This research compares the maximum likelihood method, the least squares method, and the weighted least squares method for estimating the parameters and survival function of the log-logistic distribution, using the comparison criteria MSE, MAPE, and IMSE; the methods were applied to real data for breast cancer patients. The results showed that the maximum likelihood method was best for estimating the parameters…
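For reference, the survival function being estimated is the standard log-logistic one (a common parameterization, which may differ from the paper's notation):

\[ S(t) = \frac{1}{1+(t/\alpha)^{\beta}}, \qquad f(t) = \frac{(\beta/\alpha)\,(t/\alpha)^{\beta-1}}{\left[1+(t/\alpha)^{\beta}\right]^{2}}, \qquad t>0, \]

with scale \(\alpha>0\) and shape \(\beta>0\); maximum likelihood then maximizes \(\sum_i \log f(t_i)\) over these two parameters, while the (weighted) least squares methods fit the empirical survival curve.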
The aim of this essay is to use a single-index model to develop and adjust the Fama-MacBeth procedure. The adjustment was carried out with a penalized smoothing-spline single-index regression technique (SIMPLS). The smoothing parameter in this technique was selected with two generalized cross-validation techniques: Generalized Cross Validation Grid (GGCV) and Generalized Cross Validation Fast (FGCV). Because of the two-step nature of the Fama-MacBeth model, this estimation generated four estimators: SIMPLS(FGCV)-SIMPLS(FGCV), SIMPLS(FGCV)-SIMPLS(GGCV), SIMPLS(GGCV)-SIMPLS(FGCV), and SIMPLS(GGCV)-SIMPLS(GGCV). The three-factor Fama-French model (market risk premium, size factor, and value factor) and its implications for excess stock returns and portfolio return…
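The two-step structure that produces the four estimator combinations is the classic Fama-MacBeth procedure. The sketch below implements the plain OLS version for orientation only; the paper replaces each pass with the penalized single-index fit (SIMPLS), and all names and data here are placeholders:

import numpy as np

def fama_macbeth(returns, factors):
    """Classic two-pass Fama-MacBeth with OLS at both stages.

    returns: (T, N) excess returns; factors: (T, K) factor realizations.
    """
    T, N = returns.shape
    X = np.column_stack([np.ones(T), factors])                # add intercept
    # Pass 1: time-series regression per asset -> factor loadings (betas)
    betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T  # (N, K)
    # Pass 2: cross-sectional regression each period -> risk premia
    Z = np.column_stack([np.ones(N), betas])
    gammas = np.array([np.linalg.lstsq(Z, returns[t], rcond=None)[0]
                       for t in range(T)])
    prem = gammas.mean(axis=0)                    # averaged premia
    se = gammas.std(axis=0, ddof=1) / np.sqrt(T)  # Fama-MacBeth standard errors
    return prem, se

rng = np.random.default_rng(2)
F = rng.normal(0, 1, (120, 3))                    # e.g. MKT, SMB, HML
R = F @ rng.normal(1, 0.2, (3, 25)) + rng.normal(0, 1, (120, 25))
print(fama_macbeth(R, F))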
This work presents a completely new analyzer, named NAG-5SX1-1D-SSP, that is simple, accurate, reproducible, and affordable for the determination of cefotaxime sodium (CFS) in both pure form and pharmaceutical drugs. The analyzer was designed around flow injection analysis and coupled to turbidimetric measurement, with ammonium cerium nitrate utilized as the precipitating agent. After optimizing the conditions, the analysis system exhibited a linear range of 0.008–27 mmol·L⁻¹ (n = 29), a limit of detection of 439.3 ng/sample, a limit of quantification of 0.4805 mg/sample, and a correlation coefficient of 0.9988. The repeatability of the responses was assessed by performing six successive injections of CFS at concentrations…
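Detection and quantification limits like those quoted above are conventionally computed from the calibration line; a common ICH-style convention (the abstract does not state which convention was actually used) is

\[ \mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}, \]

where \(\sigma\) is the standard deviation of the blank (or intercept) response and \(S\) is the slope of the calibration line.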
The biggest problem for structural materials in a fusion reactor is the damage caused to them by the fusion-product neutrons. If this problem is overcome, an important milestone on the path to fusion energy will be passed. One key issue is that the nuclei of the structural material, interacting with fusion neutrons, are transmuted into stable or radioactive nuclei via (n, x) reactions (x = alpha, proton, gamma, etc.). In particular, the concentration of helium gas in the structural material increases through deuterium-tritium (D-T) fusion and subsequent (n, α) reactions, and this increase significantly changes the microstructure and the properties of the structural materials.
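The reactions referred to above can be written explicitly; the D-T energy partition below uses the standard values, while the (n, α) line is a generic schematic for an arbitrary structural nucleus X, not a specific reaction from the paper:

\[ {}^{2}\mathrm{H} + {}^{3}\mathrm{H} \rightarrow {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV}), \qquad {}^{A}_{Z}\mathrm{X} + n \rightarrow {}^{A-3}_{Z-2}\mathrm{Y} + {}^{4}_{2}\mathrm{He}. \]

Each (n, α) capture deposits one helium nucleus in the lattice, which is the source of the helium build-up described above.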
In this study, we compared the LASSO and SCAD methods, two penalization methods for dealing with partial quantile regression models. The Nadaraya-Watson kernel estimator was used to estimate the non-parametric part; in addition, the rule-of-thumb method was used to choose the smoothing bandwidth (h). The penalty methods proved efficient in estimating the regression coefficients, and the SCAD method was the best according to the mean squared error (MSE) criterion after the missing data were estimated using the mean imputation method.
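A minimal sketch of the Nadaraya-Watson estimator with a rule-of-thumb bandwidth follows; Silverman's 1.06·σ·n^(-1/5) is one common rule of thumb, the paper does not spell out its exact formula, and all data here are synthetic:

import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=None):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    n = len(x_train)
    if h is None:
        # Rule-of-thumb bandwidth (Silverman-style; one common choice)
        h = 1.06 * np.std(x_train, ddof=1) * n ** (-1 / 5)
    # Gaussian kernel weights for every (evaluation, training) pair
    u = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u ** 2)
    return (w @ y_train) / w.sum(axis=1)  # locally weighted average

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.3, 200)
grid = np.linspace(0, 2 * np.pi, 5)
print(nadaraya_watson(x, y, grid))   # smooth estimate of sin(x) on the grid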
The purpose of this paper is to apply robustness in linear programming (LP) to eliminate the problem of uncertainty in the constraint parameters and to find the robust optimal solution that maximizes the profits of the General Productive Company of Vegetable Oils for the year 2019. This is done by modifying a linear programming model in which some parameters have uncertain values and treating it with the robust counterpart of linear programming, so as to obtain results that are robust to the random changes in the uncertain values of the problem, assuming these values belong to an uncertainty set and selecting the values that cause the worst results, and relying on buil…
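To make the robust-counterpart idea concrete, the toy sketch below tightens an LP's constraint coefficients to their worst-case values over a simple box uncertainty set; every number is illustrative and none of it is the company's actual data:

import numpy as np
from scipy.optimize import linprog

c = np.array([40.0, 30.0])            # unit profits to maximize (illustrative)
A_nom = np.array([[2.0, 1.0],         # nominal resource-use coefficients
                  [1.0, 3.0]])
delta = 0.1 * A_nom                   # +/-10% box uncertainty around nominal
b = np.array([100.0, 90.0])           # resource availabilities

# Box-uncertainty robust counterpart: with x >= 0, the worst case of
# sum_j a_ij x_j <= b_i over a_ij in [nom - delta, nom + delta] is
# attained at a_ij = nom + delta, so the coefficients are simply tightened.
res_nom = linprog(-c, A_ub=A_nom, b_ub=b, bounds=[(0, None)] * 2)
res_rob = linprog(-c, A_ub=A_nom + delta, b_ub=b, bounds=[(0, None)] * 2)
print("nominal profit:", -res_nom.fun)  # higher, but may become infeasible
print("robust  profit:", -res_rob.fun)  # lower, immune to the whole box set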
Objective(s): To determine the impact of psychological distress on women's coping with breast cancer.
Methodology: A descriptive design was used throughout the present study. A convenience sample of (60) women with breast cancer was recruited from the community. Two instruments, a psychological distress scale and a coping scale, were developed for the study, and internal-consistency reliability and content validity were obtained for both. Data were collected through the application of the study instruments and analyzed using both descriptive and inferential statistical data analysis approaches.
Results: The study findings depict that women with breast cancer have experienced…