Ground-based active optical sensors (GBAOS) have been successfully used in agriculture to predict crop yield potential (YP) early in the season and to adjust N rates for optimal crop yield. However, the models were found to be weak or inconsistent due to environmental variation, especially rainfall. The objective of the study was to evaluate whether GBAOS could predict YP across multiple locations, soil types, cultivation systems, and rainfall regimes. This study was carried out from 2011 to 2013 on corn (Zea mays L.) in North Dakota, and in 2017 on potatoes in Maine. Six N rates were used on 50 sites in North Dakota and 12 N rates on two sites, one dryland and one irrigated, in Maine. The two active GBAOS used for this study were the GreenSeeker and the Holland Scientific Crop Circle Sensors ACS 470 (HSCCACS-470) and 430 (HSCCACS-430). Including rainfall data, with or without crop height, improved the reliability and consistency of the YP models. The polynomial model performed relatively better than the exponential model. In the North Dakota corn study, the relationship between sensor readings multiplied by rainfall and crop yield differed significantly by soil type (clay versus medium textured) and by cultivation system (conventional versus no-till). The two potato sites in Maine, irrigated and dryland, differed in total yield, and rainfall data helped improve the sensor YP models. In conclusion, this study strongly advocates the use of rainfall data in sensor-based N calculator algorithms.
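To illustrate the modeling comparison described above, the sketch below fits both a quadratic (polynomial) and an exponential YP model to sensor readings multiplied by rainfall. All numbers are hypothetical values invented for illustration; they are not data from the study.

```python
import numpy as np

# Hypothetical sensor readings (NDVI), cumulative rainfall (mm),
# and observed grain yield (Mg/ha) -- illustrative only.
ndvi = np.array([0.45, 0.52, 0.60, 0.66, 0.71, 0.78, 0.82])
rain = np.array([180., 210., 250., 260., 300., 320., 340.])
yield_obs = np.array([4.1, 5.0, 6.2, 6.8, 7.9, 8.6, 9.1])

x = ndvi * rain                       # sensor reading scaled by rainfall

# Polynomial (quadratic) YP model.
p2 = np.polyfit(x, yield_obs, 2)
yp_poly = np.polyval(p2, x)

# Exponential YP model y = a * exp(b * x), fitted linearly on log(y).
b, log_a = np.polyfit(x, np.log(yield_obs), 1)
yp_exp = np.exp(log_a) * np.exp(b * x)

# Coefficient of determination for comparing the two models.
def r2(y, f):
    return 1.0 - np.sum((y - f) ** 2) / np.sum((y - y.mean()) ** 2)

print(r2(yield_obs, yp_poly), r2(yield_obs, yp_exp))
```

With data of this shape, both models fit well; comparing the two R² values is one simple way to judge which functional form is "relatively better" for a given site-year.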
This research analyzes several indicators, and their classifications, related to the teaching process and the scientific level of graduate studies at the university, using analysis of variance for ranked data with repeated measurements instead of the ordinary analysis of variance. Several conclusions are reached about the important classifications of each indicator that affect the teaching process.
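The ranked-data alternative to ordinary repeated-measures ANOVA mentioned above can be illustrated with a Friedman-type rank statistic, which ranks each subject's responses across treatments before testing. The data below are hypothetical, and this is a sketch of the general technique, not the study's actual computation.

```python
import numpy as np

def friedman_stat(data):
    """Friedman rank statistic for an (n blocks) x (k treatments) matrix.

    data[i, j] = response of block (subject) i under treatment j.
    Ties are not handled in this minimal sketch.
    """
    n, k = data.shape
    # Rank each block's responses (1 = smallest) via the double-argsort trick.
    ranks = data.argsort(axis=1).argsort(axis=1) + 1
    mean_ranks = ranks.mean(axis=0)
    # Chi-square-distributed statistic with k-1 degrees of freedom (large n).
    q = 12.0 * n / (k * (k + 1)) * np.sum((mean_ranks - (k + 1) / 2.0) ** 2)
    return q, mean_ranks

# Hypothetical ratings: 4 subjects, each measured under 3 conditions.
data = np.array([
    [10., 20., 30.],
    [ 5.,  7.,  9.],
    [ 8.,  3., 12.],
    [ 1.,  9., 4.],
])
q, mean_ranks = friedman_stat(data)
print(q, mean_ranks)   # statistic and per-treatment mean ranks
```

Because only the within-subject ranks enter the statistic, the procedure is robust to the non-normality that motivates replacing ordinary ANOVA on repeated measurements.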
This paper delves into some significant performance measures (PMs) of a bulk arrival queueing system with constant batch size b, in which the arrival rates and service rates are fuzzy parameters. The bulk arrival queueing system treats arrivals as groups of constant size entering the system before individual customers are admitted to service. This leads to a new tool obtained with the aid of generating-function methods. The corresponding traditional bulk queueing system model is more convenient under an uncertain environment. The α-cut approach is applied with the conventional Zadeh's extension principle (ZEP) to transform the triangular membership functions (Mem. Fs) of the fuzzy queues into a family of conventional b…
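The α-cut construction mentioned above can be sketched as follows for triangular membership functions. The performance measure used here (server utilization ρ = bλ/μ) and all numbers are illustrative assumptions, not the paper's actual model or data.

```python
def alpha_cut(tri, alpha):
    """alpha-cut interval of a triangular fuzzy number (a, m, c).

    At alpha = 1 the interval collapses to the modal value m;
    at alpha = 0 it is the full support [a, c].
    """
    a, m, c = tri
    return (a + alpha * (m - a), c - alpha * (c - m))

def utilization_cut(lam_tri, mu_tri, b, alpha):
    """Interval for rho = b*lambda/mu at a given alpha level.

    rho is increasing in lambda and decreasing in mu, so the interval
    endpoints follow directly from the endpoint combinations.
    """
    lam_lo, lam_hi = alpha_cut(lam_tri, alpha)
    mu_lo, mu_hi = alpha_cut(mu_tri, alpha)
    return (b * lam_lo / mu_hi, b * lam_hi / mu_lo)

# Hypothetical fuzzy rates: lambda ~ (1, 2, 3), mu ~ (8, 10, 12), batch size 2.
rho_core = utilization_cut((1.0, 2.0, 3.0), (8.0, 10.0, 12.0), b=2, alpha=1.0)
rho_support = utilization_cut((1.0, 2.0, 3.0), (8.0, 10.0, 12.0), b=2, alpha=0.0)
print(rho_core, rho_support)
```

Sweeping alpha from 0 to 1 yields a nested family of crisp intervals for ρ, which is exactly how the ZEP/α-cut approach converts a fuzzy queue into a family of conventional ones.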
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
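A minimal sketch of the Downhill Simplex (Nelder-Mead) estimation idea: minimize a negative log-likelihood without derivatives. For brevity it is applied here to a plain two-parameter Weibull likelihood on simulated data, not to the four-parameter compound exponential Weibull-Poisson distribution studied in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated sample from a Weibull(shape=1.5, scale=2.0) distribution.
rng = np.random.default_rng(0)
x = rng.weibull(1.5, size=500) * 2.0

def nll(params):
    """Negative log-likelihood of a two-parameter Weibull(k, lam)."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf          # keep the simplex inside the valid region
    z = x / lam
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z ** k)

# Downhill Simplex = Nelder-Mead: derivative-free, needs only nll evaluations.
res = minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
print(k_hat, lam_hat)
```

Because the simplex search uses only function values, it tolerates likelihoods that are rough or hard to differentiate, which is one reason it can outperform gradient-based maximum likelihood under contamination.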
The data envelopment analysis method helps organizations improve their performance by exploiting their resources efficiently in order to improve service quality. The study addresses the Iraqi Middle East Investment Bank's need to assess the performance of its branches according to the quality of service provided. The importance of the study therefore lies in contributing a scientific and systematic method, applying data envelopment analysis to assess the service quality provided by the bank's branches and to determine the efficiency of those services in a manner that reflects the extent of utilization of a…
Wireless sensor applications are susceptible to energy constraints, and most of the energy is consumed in communication between wireless nodes. Clustering and data aggregation are the two most widely used strategies for reducing energy usage and increasing the lifetime of wireless sensor networks. In target tracking applications, a large amount of redundant data is produced regularly; hence, deployment of effective data aggregation schemes is vital to eliminate data redundancy. This work conducts a comparative study of research approaches that employ clustering techniques for efficiently aggregating data in target tracking applications, as the selection of an appropriate clustering algorithm may reflect positive results in the data aggregation…
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance, yet many applications have small or inadequate data for training DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data in order to learn representations automatically; ultimately, a larger amount of data generally yields a better DL model, and performance is also application dependent. This issue is the main barrier for…
Today, NOMA has seen exponential growth in its use for optical visible light communication (OVLC) owing to favorable features such as high spectral efficiency, low BER, and flexibility. This growth creates a huge demand for electronic devices with high-speed processing and high data rates, which in turn leads to greater FPGA power consumption. It is therefore a major challenge for scientists and researchers to address this problem by reducing FPGA power and device size. The subject matter of this article is an algorithm model to reduce the power consumption of the Field Programmable Gate Array (FPGA) used in the design of the Non-Orthogonal Multiple Access (NOMA) techniques applied in OVLC systems combined with a blue laser. However, the po…
The present work focuses on the experimental implementation of one type of fiber optic sensor: an optical glass fiber based on surface plasmon resonance. A single-mode no-core optical glass fiber was used in this work, with pre-tapering diameters of 125.1 μm and 125.3 μm, respectively. The taper method can be evaluated by measuring the output power of the optical fiber before and after chemical etching, to show the change in cladding diameter caused by hydrofluoric acid as the taper time increases. The optical glass fiber sensor can be fabricated using the taper method to reduce the cladding diameter of the fibers to 83.12 µm, 64.37 µm, and 52.45 µm for single-mode fibers using hydrofluoric acid…