Metasurface polarizers are essential optical components in modern integrated optics and play a vital role in many optical applications, including Quantum Key Distribution systems in quantum cryptography. However, the inverse design of high-efficiency metasurface polarizers depends on correctly predicting structural dimensions from the required optical response. Deep neural networks can efficiently assist the inverse design process, reducing both time and simulation resource requirements while achieving better results than traditional optimization methods. Here, a COMSOL Multiphysics surrogate model and deep neural networks are used to design a metasurface grating structure with a high extinction ratio of ≈60,000 at the visible wavelength of 632 nm.
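To make the inverse-design workflow concrete, here is a minimal sketch of a neural network that maps a target optical response back to grating dimensions. The layer sizes, file names, and feature layout are illustrative assumptions, not the authors' actual architecture or COMSOL export format.

```python
# Minimal sketch of an inverse-design network: it maps a target optical
# response to grating dimensions. Layer sizes, file names, and feature
# layout are illustrative assumptions, not the authors' model.
import numpy as np
from tensorflow import keras

# Hypothetical training set exported from COMSOL parameter sweeps:
# X = optical responses (e.g., transmission sampled over wavelengths),
# y = structure parameters (e.g., grating period, width, height in nm).
X = np.load("comsol_responses.npy")   # shape (n_samples, n_wavelengths)
y = np.load("comsol_geometry.npy")    # shape (n_samples, 3)

model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(y.shape[1]),   # predicted geometry
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=200, batch_size=32, validation_split=0.2)

# Query: desired response near 632 nm -> predicted grating dimensions.
target = np.load("target_response.npy").reshape(1, -1)
print(model.predict(target))
```

Once trained, querying the network replaces a full simulation sweep, which is where the time and resource savings over iterative optimization come from.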
To expedite the learning process, a group of algorithms known as parallel machine learning algorithms can be executed simultaneously on several computers or processors. As data grows in both size and complexity, and as businesses seek efficient ways to mine that data for insights, algorithms like these will become increasingly crucial. Data parallelism, model parallelism, and hybrid techniques are just some of the methods described in this article for speeding up machine learning algorithms. We also cover the benefits and threats associated with parallel machine learning, such as data splitting, communication, and scalability. We compare how well various methods perform on a variety of machine learning tasks and datasets, and we talk about…
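As a concrete illustration of the data-parallelism pattern the abstract mentions, the following sketch splits a batch across worker processes, has each worker compute the gradient on its shard, and averages the shard gradients into one synchronous update. The toy linear-regression model and all constants are assumptions for illustration only.

```python
# Minimal sketch of data parallelism: each worker computes the gradient of a
# linear-regression loss on its own data shard, and the shard gradients are
# averaged into a single synchronous update. Model and data are toy examples.
import numpy as np
from multiprocessing import Pool

def shard_gradient(args):
    w, X, y = args
    # Gradient of mean squared error on this shard: (2/n) * X^T (Xw - y)
    return 2.0 / len(y) * X.T @ (X @ w - y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4000, 10))
    y = X @ rng.normal(size=10) + 0.01 * rng.normal(size=4000)
    w = np.zeros(10)

    n_workers = 4
    X_shards = np.array_split(X, n_workers)
    y_shards = np.array_split(y, n_workers)

    with Pool(n_workers) as pool:
        for _ in range(100):  # synchronous gradient-descent steps
            grads = pool.map(shard_gradient,
                             [(w, Xs, ys) for Xs, ys in zip(X_shards, y_shards)])
            w -= 0.1 * np.mean(grads, axis=0)  # averaged update
    print("fitted weights:", w)
```

The per-step `pool.map` is the communication cost the abstract alludes to: as the number of workers grows, gradient exchange, not computation, tends to become the bottleneck.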
The aim of this research is to develop qualitative workouts based on certain sensory perceptions for the development of offensive basketball abilities and to determine their impact on female pupils. Several findings, based on the author's extensive expertise instructing basketball materials and our closeness to the sample, revealed deficits in some sensory perceptions in the game of basketball, which impair the accuracy of passing the ball to the best teammate. These deficits also affect the pace of dribbling and the difficulty of selecting the optimal moment and distance to shoot. Therefore, the researcher designed qualitative activities based on many sensory experiences, including distance, speed, force, and direction shift. In addition, the…
This study addresses the insufficient development of initial tennis training at the youthful (student) age. Objective: to develop a methodological and scientific-methodological base for students' tennis in line with current trends in the sport. Summarizing the best practices of modern tennis, we concluded that forming the art of returning backhand strokes when teaching beginner students of sports specialization is key to future success. Current conditions in the development of Russian tennis open the possibility for students to use new technologies and programs. Using these approaches, we developed a training program and tested it in students' tennis in a pedagogical experiment, which demonstrated its effectiveness.
The theory of probabilistic programming may be conceived in several different ways. As a programming method, it analyzes the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. The mechanisms generating such probabilistic variations in economic models include incomplete information about changes in demand, production, and technology; specification errors in the econometric relations presumed for different economic agents; uncertainty of various sorts; and the consequences of imperfect aggregation or disaggregation of economic variables. In this research, we discuss the probabilistic programming problem when the coefficient b_i is a random variable.
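A standard way to handle a random right-hand side b_i is chance-constrained programming: assuming b_i ~ N(mu_i, sigma_i^2), the requirement P(a_i^T x <= b_i) >= alpha reduces to the deterministic equivalent a_i^T x <= mu_i + sigma_i * Phi^{-1}(1 - alpha) (the Charnes–Cooper reduction). The sketch below solves a toy instance of this equivalent with scipy; all numbers are illustrative assumptions, not data from the research.

```python
# Minimal sketch of probabilistic (chance-constrained) linear programming
# with a random right-hand side b_i ~ N(mu_i, sigma_i^2). Requiring
# P(a_i^T x <= b_i) >= alpha tightens each constraint to
#   a_i^T x <= mu_i + sigma_i * Phi^{-1}(1 - alpha).
# All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

alpha = 0.95
mu = np.array([10.0, 8.0])      # means of the random b_i
sigma = np.array([1.0, 0.5])    # standard deviations of the b_i
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
c = np.array([-1.0, -1.0])      # maximize x1 + x2 -> minimize -(x1 + x2)

# Tightened right-hand side: each constraint holds with probability >= alpha.
b_det = mu + sigma * norm.ppf(1 - alpha)

res = linprog(c, A_ub=A, b_ub=b_det, bounds=[(0, None)] * 2)
print("optimal x:", res.x)
```

Note that the higher the required reliability alpha, the more the right-hand side shrinks below its mean, trading objective value for protection against unfavorable realizations of b_i.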
This paper presents nonlinear finite element modeling and analysis of steel fiber reinforced concrete (SFRC) deep beams, with and without web openings, subjected to two-point loading. The beams were modeled using the ANSYS nonlinear finite element software, with the steel fiber content varied from 0 to 1.0%. The influence of fiber content was studied by measuring the mid-span deflection of the deep beams, marking the cracking patterns, computing the failure load of each deep beam, and examining the shearing and first principal stresses for deep beams with and without openings and with different steel fiber ratios. The study indicates that the location of openings an…
This paper investigates the effect of the internal curing technique on the properties of self-compacting concrete. Self-compacting concrete is produced using limestone powder as a partial replacement for cement by weight at 5%; sand is partially replaced by volume with saturated fine lightweight (thermostone) aggregate as the internal curing material at three percentages (5%, 10%, 15%); and two external curing conditions, water and air, are used. The experimental work was divided into three parts: in the first part, the workability tests of fresh self-compacting concrete were conducted. The second part included conducting compressive strength…
This study was conducted to explore the effects of ionized water on the productive and physiological performance of Japanese quail (Coturnix japonica). It was carried out at a private farm from 20 April 2016 to 13 July 2016 (84 d). One hundred 42-day-old Japanese quail chicks were divided randomly into 5 groups with 4 replicates. Treatments consisted of a control group (T1, normal water), alkaline water (T2, pH 8; T3, pH 9), and acidic water (T4, pH 6; T5, pH 5). All birds were fed a diet balanced in energy and protein. The egg production ratio, egg weight, cumulative number of eggs, egg mass, feed conversion ratio, productivity per hen per week, and effects on plasma lipids, uric acid, glucose, calcium, and ph…
This research compared two methods for estimating the four parameters of the compound exponential Weibull–Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
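To illustrate the Downhill Simplex (Nelder–Mead) estimation procedure the abstract compares, here is a minimal sketch that minimizes a negative log-likelihood without derivatives. Since the compound exponential Weibull–Poisson density is not reproduced in the abstract, a two-parameter Weibull likelihood stands in purely to show the mechanics.

```python
# Minimal sketch of parameter estimation via the Downhill Simplex
# (Nelder–Mead) algorithm. A two-parameter Weibull likelihood stands in
# for the compound exponential Weibull–Poisson density, whose form is
# not given in the abstract; data and starting values are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def neg_log_likelihood(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:          # keep the simplex in-bounds
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

res = minimize(neg_log_likelihood, x0=[1.0, 1.0], method="Nelder-Mead")
print("estimated (shape, scale):", res.x)
```

Because the simplex search uses only function values, it tolerates likelihood surfaces where gradient-based maximum likelihood struggles, which is one plausible reason it fared better on contaminated data in the study.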