The smart city concept has attracted considerable research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, and aerospace. Specific focus has been on increased automation using data-driven approaches, leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-rate data streams, specifically videos, is the trade-off between the quality of video streaming and limited transmission bandwidth. An optimal compromise is needed between video quality on the one hand and the recognition, understanding, and efficient processing of large amounts of video data on the other. This research proposes a novel unified approach to lossy and lossless video frame compression, which benefits the autonomous processing and enhanced representation of high-resolution video data in various domains. The proposed fast block matching motion estimation technique, namely mean predictive block matching, is based on the principle that general motion in any video frame is usually coherent. This coherence implies a high probability that a macroblock has the same direction of motion as the macroblocks surrounding it. The technique employs the partial distortion elimination algorithm to reduce the search time: the summation of the matching distortion between the current macroblock and a candidate macroblock is abandoned as soon as the partial sum surpasses the current lowest error. Experimental results demonstrate the superiority of the proposed approach over state-of-the-art techniques, including the four-step search, three-step search, diamond search, and new three-step search.
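The two ideas in the abstract above — predicting the search centre from neighbouring motion vectors (motion coherence) and abandoning a candidate block once its partial distortion exceeds the best error found so far — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, block size, search range, and the simple mean-of-predictors initialisation are all assumptions.

```python
import numpy as np

def sad_with_pde(cur_block, ref_block, best_so_far):
    """Sum of absolute differences with partial distortion elimination:
    accumulate the SAD row by row and abandon the candidate as soon as
    the partial sum reaches the lowest error found so far."""
    total = 0
    for r in range(cur_block.shape[0]):
        total += np.abs(cur_block[r].astype(int) - ref_block[r].astype(int)).sum()
        if total >= best_so_far:  # early termination: candidate cannot win
            return None
    return total

def mean_predictive_search(cur, ref, bx, by, bs=16, sr=4, predictors=((0, 0),)):
    """Search the reference frame around the mean of neighbouring motion
    vectors (exploiting motion coherence); simplified hypothetical sketch."""
    cur_block = cur[by:by + bs, bx:bx + bs]
    pv = np.mean(np.array(predictors), axis=0).astype(int)  # predicted centre
    best_mv, best_err = (0, 0), float('inf')
    for dy in range(pv[1] - sr, pv[1] + sr + 1):
        for dx in range(pv[0] - sr, pv[0] + sr + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + bs > ref.shape[1] or y + bs > ref.shape[0]:
                continue  # candidate block falls outside the frame
            err = sad_with_pde(cur_block, ref[y:y + bs, x:x + bs], best_err)
            if err is not None and err < best_err:
                best_err, best_mv = err, (dx, dy)
    return best_mv, best_err
```

In a full encoder the predictor set would be the motion vectors of the already-coded neighbouring macroblocks; here it defaults to the zero vector for simplicity.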
Information systems and data exchange between government institutions are growing rapidly around the world, and with them, the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions appears to have been very effective. Based on information system principles, this study proposes a model for providing and evaluating security across all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of
The Tor (The Onion Router) network was designed to enable users to browse the Internet anonymously. It is known for its anonymity and privacy protection against the many agents who wish to observe users' locations or trace their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before being sent and after being received, which introduces delay and can even interrupt the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes del
The transmitting and receiving of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource affecting a WSN's lifespan at the sensor node. Because sensor nodes run on limited batteries, energy saving is therefore necessary. Data aggregation is a procedure for eliminating redundant transmissions; it delivers fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for Wireless Sensor Networks is suggested to reduce redundant data before sending them to the
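The perceptually-important-points idea underlying the abstract above can be illustrated with a common generic PIP selection scheme: keep the series endpoints, then repeatedly add the sample that lies farthest (vertically) from the chord joining its nearest already-selected neighbours. This is an assumed textbook variant for illustration only, not the PIP-DA method itself, whose details may differ.

```python
def pip_select(series, k):
    """Select up to k perceptually important points (PIPs) from a 1-D
    series. Starts with the two endpoints, then greedily adds the point
    with the largest vertical distance to the line segment joining its
    nearest selected neighbours. Returns sorted indices."""
    n = len(series)
    selected = [0, n - 1]
    while len(selected) < min(k, n):
        best_idx, best_dist = None, -1.0
        sel = sorted(selected)
        for a, b in zip(sel, sel[1:]):
            for i in range(a + 1, b):
                # vertical distance from point i to the chord (a, b)
                y_line = series[a] + (series[b] - series[a]) * (i - a) / (b - a)
                d = abs(series[i] - y_line)
                if d > best_dist:
                    best_dist, best_idx = d, i
            # (only points strictly between selected neighbours are tried)
        if best_idx is None:
            break
        selected.append(best_idx)
    return sorted(selected)
```

In a sensor-node setting, transmitting only the selected indices and values in place of the full reading series is what removes the redundant transmissions.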
The aim of the current research is to study one of the Qur'anic topics that few have researched or fully grasped, such that people have come to know it by one name while the Qur'an refers to it by another. Given the topic's antiquity as well as its contemporaneity, I wanted to write about it. The research comprises an introduction, three requirements, and a conclusion with the most important results of the research:
As for the introduction: it indicates the importance of the topic and the reason for choosing it.
As for the first requirement: it included the definition of reasoning, its divisions, and its characteristics.
As for the second requirement, it was to indicate the meaning, types, and methods of labeling it.
As for the third require
Malaysia's growing population and industrialisation have increased solid waste accumulation in landfills, leading to a rise in leachate production. Leachate, a highly contaminated liquid from landfills, poses environmental risks and affects water quality. Conventional leachate treatments are costly and time-consuming because they require additional chemicals; the electrocoagulation process can therefore be used as an alternative. Electrocoagulation is an electrochemical water-treatment method that removes impurities by applying an electric current. In the present study, the optimisation of contaminant removal was investigated using Response Surface Methodology. Three parameters were considered for optimisation: the curr
To evaluate and improve the efficiency of photovoltaic solar modules connected with linear pipes for water supply, a three-dimensional numerical simulation is created and run in commercial software (Ansys-Fluent). The optimization applies the first and second laws of thermodynamics using the Response Surface Method (RSM). Various design parameters, including the coolant inlet velocity, tube diameter, panel dimensions, and solar radiation intensity, are systematically varied to investigate their impacts on energetic and exergetic efficiencies and on the destroyed exergy. The relationship between the design parameters and the system responses is validated through the development of a predictive model. Both single and mult
The purpose of this paper is to apply different transportation models at their minimum and maximum values by finding a starting basic feasible solution and then the optimal solution. The requirements of transportation models are presented together with one of their applications in the case of minimizing the objective function, using real data collected by the researcher over one month in 2015 at one of the poultry farms for egg production.
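A standard way to obtain the starting basic feasible solution the abstract above mentions is the northwest-corner rule for a balanced transportation model: allocate as much as possible to the top-left cell, then move right or down as a demand or supply is exhausted. The sketch below is illustrative only and does not reproduce the paper's data.

```python
def northwest_corner(supply, demand):
    """Northwest-corner rule: build a starting basic feasible solution
    for a balanced transportation problem (total supply == total demand).
    Returns an m x n allocation matrix."""
    supply, demand = list(supply), list(demand)  # work on copies
    m, n = len(supply), len(demand)
    alloc = [[0] * n for _ in range(m)]
    i = j = 0
    while i < m and j < n:
        x = min(supply[i], demand[j])  # ship as much as possible here
        alloc[i][j] = x
        supply[i] -= x
        demand[j] -= x
        if supply[i] == 0:  # row exhausted: move to next source
            i += 1
        else:               # column exhausted: move to next destination
            j += 1
    return alloc
```

The resulting allocation satisfies every row (supply) and column (demand) total and can then be improved toward the optimum, e.g. with the stepping-stone or MODI method.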
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework is usually fed a significant amount of labeled data from which to automatically learn representations. Ultimately, more data generates a better DL model, and its performance is also application-dependent. This issue is the main barrier for