Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CAs), the order of events matters in the test case generation process. This paper reviews the state of the art of SCA strategies; earlier works reported that finding a minimal-size test suite is an NP-hard problem. In addition, most existing strategies for SCA generation have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach to SCA generation is a challenging process; in addition, this reduction facilitates support for higher strengths of coverage. Motivated by this challenge, this paper proposes a novel SCA strategy called Dynamic Event Order (DEO), in which test case generation is done in a one-parameter-at-a-time fashion. The details of DEO are presented with a step-by-step example to demonstrate the behavior and show the correctness of the proposed strategy. In addition, this paper makes a comparison with existing computational strategies. The practical results demonstrate that the proposed DEO strategy outperforms the existing strategies in terms of minimal test suite size in most cases. Moreover, the significance of DEO increases as the number of sequences and/or the strength of coverage increases. Furthermore, the proposed DEO strategy succeeds in generating SCAs up to t = 7. Finally, the DEO strategy succeeds in finding new upper bounds for SCAs. In fact, the proposed strategy can act as a research vehicle for future variant implementations.
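As an illustration of the coverage criterion behind SCAs (not of the DEO strategy itself, which the abstract does not detail), the following Python sketch checks whether a set of tests covers every ordered t-sequence of distinct events as a subsequence; the event names and tests are hypothetical:

```python
from itertools import permutations

def covers(test, seq):
    """True if `seq` appears in `test` as an ordered (not necessarily
    contiguous) subsequence. `in` on an iterator consumes it, so events
    must appear in order."""
    it = iter(test)
    return all(event in it for event in seq)

def is_sequence_covering_array(tests, events, t):
    """True if every ordered t-sequence of distinct events is covered
    by at least one test in `tests`."""
    return all(
        any(covers(test, seq) for test in tests)
        for seq in permutations(events, t)
    )

# For t = 2, a permutation plus its reverse cover every ordered pair,
# so two tests suffice regardless of the number of events.
events = ["a", "b", "c", "d"]
tests = [("a", "b", "c", "d"), ("d", "c", "b", "a")]
print(is_sequence_covering_array(tests, events, 2))  # True
print(is_sequence_covering_array(tests, events, 3))  # False: e.g. b,a,c uncovered
```

The combinatorial blow-up is visible here: the checker enumerates all n!/(n-t)! ordered t-sequences, which is why strategies that avoid generating every interaction explicitly are attractive at higher strengths.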
The research aims to identify the impact of different methods of calculating the item sensitivity coefficient on the standard characteristics of a criterion-referenced test in the measurement and evaluation course. The research sample consisted of 35 male and female students, chosen purposively. The researcher prepared a learning-teaching program for constructing the content of the measurement and evaluation course for non-specialized departments, prepared an achievement test in equivalent forms, identified the degree of agreement between the methods used in analyzing the items of the criterion-referenced test, and compared the standard characteristics of the achievement test, both according to…
A fast-moving infrared excess source (G2), which is widely interpreted as a core-less gas and dust cloud, approaches Sagittarius A* (Sgr A*) on a presumably elliptical orbit. VLT…
Regression analysis is used to study and predict the surface response using design of experiments (DOE), as well as to calculate roughness by developing a mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments uses a series of structured statistical analytic approaches to investigate the relationship between selected parameters and their responses. Surface roughness is one of the important parameters that play an important role. It is also found that the cutting speed has only a small effect on surface roughness. This work focuses on all considerations needed to model the interaction between the parameters (position of influenc…
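A second-order response surface of the kind used in such DOE studies can be fitted by ordinary least squares. The sketch below is illustrative only: the cutting-speed, feed, and roughness values are invented, not the study's measurements, and the model form (full quadratic in two factors) is an assumption:

```python
import numpy as np

# Hypothetical DOE data: cutting speed v (m/min), feed f (mm/rev),
# measured surface roughness Ra (µm). Values are invented for illustration.
v  = np.array([100, 100, 150, 150, 125, 125, 125, 90, 160], dtype=float)
f  = np.array([0.1, 0.3, 0.1, 0.3, 0.2, 0.2, 0.2, 0.2, 0.2])
Ra = np.array([1.8, 3.6, 1.5, 3.1, 2.2, 2.3, 2.1, 2.6, 2.0])

# Second-order response surface model:
#   Ra ≈ b0 + b1*v + b2*f + b3*v*f + b4*v² + b5*f²
X = np.column_stack([np.ones_like(v), v, f, v * f, v**2, f**2])
coef, *_ = np.linalg.lstsq(X, Ra, rcond=None)

predicted = X @ coef
r2 = 1 - np.sum((Ra - predicted)**2) / np.sum((Ra - Ra.mean())**2)
print("coefficients:", np.round(coef, 4))
print("R^2:", round(r2, 3))
```

The fitted coefficients give the response surface directly; the sign and size of the speed terms would show the small cutting-speed effect the abstract mentions.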
Prediction of formation pore and fracture pressure before constructing a drilling well program is crucial, since it helps to prevent several drilling operation issues, including lost circulation, kicks, pipe sticking, blowouts, and others. IP (Interactive Petrophysics) software is used to calculate pore and fracture pressure. The Eaton, Matthews and Kelly, Modified Eaton, and Barker and Wood equations are used to calculate fracture pressure, whereas only the Eaton method is used to estimate pore pressure. These approaches are based on log data obtained from six wells: three from the north dome (BUCN-52, BUCN-51, BUCN-43) and three from the south dome (BUCS-49, BUCS-48, BUCS-47). Along with the overburden pressure…
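A minimal sketch of the Eaton relations named above, assuming the common sonic-log form of Eaton's pore-pressure equation and his Poisson's-ratio fracture-gradient equation; all numeric inputs are illustrative defaults, not the BUCN/BUCS well data:

```python
def eaton_pore_pressure(obg, pn, dt_normal, dt_observed, exponent=3.0):
    """Eaton pore-pressure gradient (psi/ft) from sonic transit time.
    obg: overburden gradient, pn: normal (hydrostatic) gradient,
    dt_normal / dt_observed: normal-trend and measured transit times (µs/ft).
    The exponent 3.0 is Eaton's classic sonic value; it is often recalibrated."""
    return obg - (obg - pn) * (dt_normal / dt_observed) ** exponent

def eaton_fracture_pressure(obg, pp, poisson=0.4):
    """Eaton fracture gradient (psi/ft) from Poisson's ratio:
    FG = ν/(1−ν) · (OBG − Pp) + Pp."""
    return poisson / (1.0 - poisson) * (obg - pp) + pp

# Illustrative gradients only (typical textbook defaults, not field data):
obg, pn = 1.0, 0.465  # psi/ft
pp = eaton_pore_pressure(obg, pn, dt_normal=90.0, dt_observed=110.0)
fp = eaton_fracture_pressure(obg, pp, poisson=0.4)
print(f"pore gradient = {pp:.3f} psi/ft, fracture gradient = {fp:.3f} psi/ft")
```

An observed transit time slower than the normal trend (110 vs 90 µs/ft here) indicates undercompaction, so the estimated pore gradient sits between hydrostatic and overburden, with the fracture gradient between the two.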
This study aimed to investigate the role of Big Data in forecasting corporate bankruptcy through a field analysis in the Saudi business environment to test that relationship. The study found that Big Data is a recently adopted variable in the business context with multiple accounting effects and benefits. Among these benefits is forecasting and disclosing corporate financial failures and bankruptcies, which relies on three main elements for reporting and disclosure: the firms' internal control system, external auditing, and financial analysts' forecasts. The study recommends: since the greatest risk of Big Data is the slow adaptation of accountants and auditors to these technologies, wh…
The hydrological process has a dynamic nature characterised by randomness and complex phenomena. The application of machine learning (ML) models in forecasting river flow has grown rapidly, owing to their capacity to simulate the complex phenomena associated with hydrological and environmental processes. Four different ML models were developed for forecasting the flow of a river located in a semiarid region of Iraq. The influence of data division on the ML modeling process was investigated: three data division scenarios were inspected, namely 70%–30%, 80%–20%, and 90%–10%. Several statistical indicators were computed to verify the performance of the models. The results revealed the potential of the hybridized s…
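The three data-division scenarios can be sketched as a simple chronological split (the flow record below is a placeholder array, not the Iraqi river data; a chronological rather than random split is assumed, since shuffling a time series would leak future information into training):

```python
import numpy as np

def split_series(data, train_fraction):
    """Chronological split for river-flow forecasting: the first
    `train_fraction` of the record trains the model, the rest tests it."""
    cut = int(len(data) * train_fraction)
    return data[:cut], data[cut:]

flow = np.arange(1000)  # placeholder daily flow record
for frac in (0.70, 0.80, 0.90):
    train, test = split_series(flow, frac)
    print(f"{int(frac * 100)}%–{100 - int(frac * 100)}%: "
          f"{len(train)} train / {len(test)} test samples")
```

Comparing model skill across the three splits then isolates how sensitive each ML model is to the amount of training data.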
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution that is flexible enough to handle such data. In the data of Diyala Company for Electrical Industries, a positive skew was observed in the data collected from the Power and Machinery Department, which required a distribution suited to those data and methods that accommodate this problem and lead to accurate estimates of the reliability function…
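As one illustration of fitting a positively skewed lifetime model and evaluating the reliability function R(t) = P(T > t), the sketch below uses a lognormal distribution fitted by maximum likelihood; the excerpt does not name the paper's actual distribution, and the failure times are invented, not the company's data:

```python
import math

def lognormal_reliability(data, t):
    """Reliability R(t) = P(T > t) under a lognormal model.
    MLE for a lognormal is the mean/std of the log failure times;
    the survival probability comes from the normal CDF via erf."""
    logs = [math.log(x) for x in data]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    z = (math.log(t) - mu) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical positively skewed failure times (hours):
times = [120, 150, 170, 200, 240, 310, 420, 600, 950, 1800]
print(f"R(300 h) = {lognormal_reliability(times, 300):.3f}")
```

The same pattern applies to any flexible skewed model (Weibull, gamma, and so on): fit the parameters from the data, then report R(t) as one minus the fitted cumulative distribution.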