The structure of the interrogation process in cross-examination is said to be diverse and complex in terms of question-response typology, because counsel must extract the truth from an opposing party's witness, whose testimony is expected to advocate that party's view of the case. Accordingly, this study, which is basically quantitative in nature, aims to investigate what the examining party intends to obtain from these questions and which question types are the most prevalent. It also aims to measure the degree of cooperativity in witnesses' responses. To this end, three transcripts of cross-examination have been analyzed using a pragmatically oriented approach. The approach draws on Stenström's (1984) and Archer's (2005) classification of questions, and on Stenström's (1984) and Archer's (2002) classificatory scheme of responses, which is based on the strategies of violating Grice's (1975) maxims, to determine the degree of cooperation on the part of respondents. The analysis revealed diversity in the attorneys' method, which made use of four types of leading questions as well as non-leading ones represented by WH-questions. The latter recorded the lowest percentage in comparison with the overall percentage of leading questions; that is, cross-examining counsel show a preference for leading over non-leading questions. Moreover, the majority of the responses indicated the witnesses' commitment to the purpose and format of the questions posed, showing a high level of cooperativity on the part of those witnesses.
In the field of civil engineering, the adoption and use of Falling Weight Deflectometers (FWDs) is seen as a response to an ever-changing, technology-driven world. Specifically, FWDs are devices that help evaluate the physical properties of a pavement. This paper assesses the concepts of data processing, storage, and analysis via FWDs. The device has been found to play an important role in enabling operators and field practitioners to understand the vertical deflection response of a pavement subjected to impulse loads. In turn, the resulting data and its analysis lead to the backcalculation of pavement stiffness, with initial analyses of the deflection bowl occurring in conjunction with the measured or assumed …
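As a rough illustration of the backcalculation idea only (not the paper's procedure), the sketch below fits a single effective half-space modulus to a hypothetical deflection bowl using the Boussinesq point-load solution. The load, Poisson's ratio, sensor offsets, and deflection values are all assumed for the example; real FWD backcalculation uses multi-layer models.

```python
# Minimal single-layer FWD backcalculation sketch (Boussinesq half-space).
# All numbers are hypothetical; the point-load formula is only reasonable
# at offsets well away from the loading plate, so r = 0 is excluded.
import numpy as np
from scipy.optimize import least_squares

P = 50e3    # peak impulse load [N] (assumed)
nu = 0.35   # Poisson's ratio (assumed)
r = np.array([0.3, 0.6, 0.9, 1.2, 1.5])           # sensor offsets [m]
d_meas = np.array([310, 155, 104, 78, 62]) * 1e-6  # deflections [m] (hypothetical)

def bowl(E, r):
    """Surface deflection of a point load on an elastic half-space."""
    return P * (1 - nu**2) / (np.pi * E * r)

res = least_squares(lambda E: bowl(E[0], r) - d_meas,
                    x0=[100e6], bounds=(1e6, 1e10))
print(f"backcalculated effective modulus E ~ {res.x[0] / 1e6:.0f} MPa")
```

In practice the deflection bowl is matched against a layered-elastic forward model with one modulus per layer, but the fitting loop has the same shape as this single-parameter version.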
Microfluidic devices present unique advantages for the development of efficient drug assays and screening, and microfluidic platforms might offer a more rapid and cost-effective alternative. Fluids are confined in devices whose significant dimensions are on the micrometer scale. Due to this extreme confinement, the volumes used for drug assays are tiny (microliters to femtoliters).
In this research, a microfluidic chip consisting of micro-channels carved into an acrylic (polymethyl methacrylate, PMMA) substrate was designed and fabricated using a carbon dioxide (CO2) laser machine. The CO2 laser parameters influence the width, depth, and roughness of the channels. In order to have regular …
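The dependence of channel geometry on laser parameters can be illustrated with a hedged sketch: fitting the common empirical engraving relation depth ≈ k · power / speed to calibration runs. Both the model form and every number below are assumptions for illustration, not the study's measurements.

```python
# Hedged illustration: fit depth ~ k * power / speed to invented
# CO2-laser calibration runs (all values hypothetical).
import numpy as np

power = np.array([10, 20, 30, 20, 30])       # laser power [W]
speed = np.array([100, 100, 100, 200, 300])  # scan speed [mm/s]
depth = np.array([55, 112, 165, 58, 57])     # channel depth [um]

x = power / speed  # energy-per-length proxy
k, = np.linalg.lstsq(x[:, None], depth, rcond=None)[0]
print(f"fitted k ~ {k:.0f}")
print(f"predicted depth at 25 W, 150 mm/s ~ {k * 25 / 150:.0f} um")
```

A calibration of this kind is one way an operator could choose power and speed settings that yield a regular, repeatable channel cross-section.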
This paper features the modeling and design of pole placement and output feedback control techniques for the Active Vibration Control (AVC) of a smart flexible cantilever beam in a Single Input Single Output (SISO) case. Measurement and actuation are performed using piezoelectric patches bonded to the master structure as sensor and actuator at specific positions on the cantilever beam.
The smart structure is modeled based on piezoelectric theory and Euler-Bernoulli beam theory, using the Finite Element Method (FEM) and state-space techniques. The number of modes is reduced using the controllability and observability Gramians, retaining the first three dominant vibratory modes, and for the reduced system …
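As a minimal sketch of the pole-placement step on a reduced three-mode model (the modal frequencies, damping ratios, and actuator gains below are illustrative placeholders, not the identified beam model), a state-feedback gain can be computed with scipy.signal.place_poles:

```python
# Pole placement on an assumed three-mode modal model of a flexible beam.
# States are (q_i, qdot_i) for each retained mode; single piezo actuator input.
import numpy as np
from scipy.signal import place_poles

wn = np.array([12.0, 75.0, 210.0]) * 2 * np.pi  # modal frequencies [rad/s] (assumed)
z = 0.005                                       # open-loop modal damping (assumed)
b = np.array([0.8, -0.5, 0.3])                  # actuator modal gains (assumed)

A = np.zeros((6, 6)); B = np.zeros((6, 1))
for i, w in enumerate(wn):
    A[2*i, 2*i + 1] = 1.0               # qdot
    A[2*i + 1, 2*i] = -w**2             # stiffness term
    A[2*i + 1, 2*i + 1] = -2 * z * w    # damping term
    B[2*i + 1, 0] = b[i]

# Target poles: keep each modal frequency, raise damping to 10%.
zc = 0.10
poles = np.concatenate([[-zc*w + 1j*w*np.sqrt(1 - zc**2),
                         -zc*w - 1j*w*np.sqrt(1 - zc**2)] for w in wn])
K = place_poles(A, B, poles).gain_matrix
print("state-feedback gain K =", K.round(2))
```

Choosing target poles that preserve the modal frequencies while increasing the damping ratios is a common way to suppress vibration without over-driving the actuator.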
The Dirichlet process is a fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, making it a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample sizes; exhibits a slow decay rate toward its base distribution; shows improved convergence and stability; and performs best with a Gaussian base distribution, which markedly outperforms the Gamma distribution. The performance depends …
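A minimal sketch of how such a process can be simulated (illustrative only, with an assumed concentration parameter and a standard-normal base distribution, as favored by the abstract): a truncated stick-breaking construction draws atom weights and locations, then samples observations from the resulting random measure.

```python
# Truncated stick-breaking draw from DP(alpha, H) with H = N(0, 1).
# alpha and the truncation level K are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
alpha, K = 2.0, 500  # concentration parameter; truncation level

v = rng.beta(1.0, alpha, size=K)                          # stick proportions
w = v * np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))   # atom weights
atoms = rng.normal(0.0, 1.0, size=K)                      # atom locations ~ H

# Sample n observations from G = sum_k w_k * delta_{atoms_k}
n = 1000
obs = rng.choice(atoms, size=n, p=w / w.sum())
print(f"distinct clusters among {n} draws: {np.unique(obs).size}")
```

Because the atoms are drawn from the base distribution while the weights concentrate on a random, data-size-dependent number of clusters, repeated runs show how the induced clustering varies with the concentration parameter alpha.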