This study aims to demonstrate the role of artificial intelligence and metaverse techniques, mainly logistic regression, in reducing earnings management in Iraqi private banks. Artificial intelligence approaches have shown the capability to detect irregularities in financial statements and mitigate the practice of earnings management. In contrast, many privately owned banks in Iraq historically relied on manual, pen-and-paper processes for recording and posting financial information in their accounting records. The banking sector in Iraq has since undergone technological advancement, leading to the automation of most banking operations. Conventional audit techniques have become outdated owing to demands for data accuracy, cost savings, and faster business completion, so manually auditing a large volume of financial data is no longer sufficient. The metaverse is a novel technological advancement that seeks to fundamentally transform corporate operations and interpersonal interactions, and it has implications for auditing and accounting practices, particularly concerning a company's operational and financial aspects. Economic units have begun to switch from traditional methods of registration and posting to software for financial operations in order to limit earnings management. This research therefore proposes applying a data mining technique, namely logistic regression, to reduce earnings management in a sample of eleven Iraqi private banks. Accounting ratios were computed and then fed to a logistic regression model to detect earnings management.
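A minimal sketch of the approach the abstract describes: fit a logistic regression on accounting ratios and flag bank-years as suspected earnings management. The ratios, labels, and learning-rate settings below are hypothetical illustrations, not the study's data, and the model is trained by plain gradient descent rather than whatever solver the authors used.

```python
import numpy as np

def fit_logistic(X, y, lr=2.0, n_iter=5000):
    """Fit logistic regression by gradient descent.
    Returns the weight vector (last entry is the intercept)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)          # average-gradient step
    return w

def predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)

# Hypothetical accounting ratios per bank-year
# (columns: accruals/assets, receivables growth) with 0/1 labels
# marking suspected earnings management.
X = np.array([[0.02, 0.05], [0.35, 0.40], [0.05, 0.08], [0.30, 0.45]])
y = np.array([0, 1, 0, 1])
w = fit_logistic(X, y)
preds = predict(X, w)
```

The fitted probabilities can then be thresholded per bank-year, which is the "achieve earnings management within the proportions" step in spirit.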
This study employed a biosynthetic technique, which is affordable and user-friendly, for creating vanadium nanoparticles (VNPs); the VNPs were synthesized from vanadium sulfate (VOSO4·H2O) and a plant extract derived from Fumaria Strumii Opiz (E2) at a NaOH concentration of 0.1 M. The study investigates the potential application of the product as an adsorbent for metal ions, aiming at environmentally friendly production, and assesses its antibacterial activity and cytotoxicity. The reaction was conducted in an alkaline environment at pH 8–12. The resulting product was subjected to various characterization techniques, including Fourier-transform infrared spectroscopy, ultraviolet-visible spectroscopy, X-ray diffraction (XRD), t
This article aims to estimate the stock return rate of the private banking sector, for two banks, by adopting a partial linear model based on Arbitrage Pricing Theory (APT), using wavelet and kernel smoothers. The results show that the wavelet method is the best. They also show that the market portfolio and the inflation rate have an adverse effect on the rate of return, while the money supply has a direct effect.
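The nonparametric half of a partial linear model can be estimated with a kernel smoother; a minimal Nadaraya-Watson sketch is shown below. The synthetic sine data, Gaussian kernel, and bandwidth are illustrative assumptions, not the article's return series or its chosen smoother settings.

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.05):
    """Nadaraya-Watson estimator: a Gaussian-kernel weighted local
    average, the kernel-smoother component of a partial linear model."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Hypothetical smooth relation between a factor and returns
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x)
yhat = nadaraya_watson(x, x, y)
```

A wavelet smoother would replace the kernel weights with thresholded wavelet coefficients; the article's comparison found the wavelet variant superior.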
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used, the Poisson regression model and the Conway-Maxwell-Poisson model; this study aimed to compare the two models and choose the better one using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. The Matlab program was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model under both the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
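A minimal sketch of the simulation design in Python rather than Matlab: draw r replications of Poisson data at a fixed sample size, fit an intercept-only Poisson model by maximum likelihood, and record MSE and AIC. The true rate, the reduced replication count, and the intercept-only model are simplifying assumptions for illustration; the study's models and settings may differ.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

def poisson_aic_mse(y):
    """Intercept-only Poisson MLE (lambda_hat = sample mean);
    returns the fitted model's MSE and AIC = 2k - 2 log L with k = 1."""
    lam = y.mean()
    loglik = sum(v * np.log(lam) - lam - lgamma(v + 1) for v in y)
    mse = np.mean((y - lam) ** 2)
    aic = 2 * 1 - 2 * loglik
    return mse, aic

# Replication loop mimicking the paper's (n, r) design, scaled down
n, r, lam_true = 100, 200, 4.0
mses, aics = [], []
for _ in range(r):
    y = rng.poisson(lam_true, n)
    m, a = poisson_aic_mse(y)
    mses.append(m)
    aics.append(a)
avg_mse = float(np.mean(mses))
```

Averaging MSE and AIC over replications, as done here, is how the two candidate models would be ranked; the Conway-Maxwell-Poisson fit would require a numerical MLE and is omitted.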
This review investigates the practice and influence of chatbots and ChatGPT as employable tools in writing for scientific academic purposes. An initial collection of 150 articles was gathered from academic databases and then systematically refined to 30 studies that focused on the use of ChatGPT and chatbot technology in academic writing contexts. Topics covered in the reviewed literature include chatbots and ChatGPT in writing enhancement, support for student learning at higher education institutions, scientific and medical writing, and the evolution of research and academic publishing. The review finds these tools helpful, with their greatest advantages in areas such as structuring writings, gram
The present work examined oxidative desulfurization in a batch system for model fuels with 2250 ppm sulfur content, using air as the oxidant and a ZnO/AC composite prepared by a thermal co-precipitation method. Different factors were studied: composite loading (1, 1.5, and 2.5 g), temperature (25 °C, 30 °C, and 40 °C), and reaction time (30, 45, and 60 minutes). The optimum conditions were obtained using a Taguchi experimental design for the oxidative desulfurization of the model fuel; the highest sulfur removal was about 33% at the optimum conditions. The kinetics and the effect of internal mass transfer were studied for the oxidative desulfurization of the model fuel, and an empirical kinetic model was derived for the model fuels
Pure SnSe thin films and films doped with S at different percentages (0, 3, 5, and 7)% were deposited from an alloy by the thermal evaporation technique onto glass substrates at room temperature, with a thickness of 400±20 nm. The influence of the S dopant ratio on the characteristics of the nanocrystalline SnSe thin films was investigated using atomic force microscopy (AFM), X-ray diffraction (XRD), energy-dispersive spectroscopy (EDS), Hall effect measurements, and UV-Vis absorption spectroscopy to study the morphological, structural, electrical, and optical properties, respectively. XRD showed that all the films are polycrystalline in nature with an orthorhombic structure and a preferred orientation along the (111) plane. These films consist of very fine crystallites in the ra
Denoising a natural image corrupted by Gaussian noise is a classic problem in signal and image processing. Much work has been done on wavelet thresholding, but most of it focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the reconstructed image for the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied by usin
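A minimal 1-D analogue of the scheme just described: split a noisy signal into one level of Haar wavelet coefficients, smooth the approximation band (a crude stand-in for the adaptive Wiener filter), soft-threshold the detail band, and reconstruct. The Haar basis, the single decomposition level, the moving-average smoother, and the threshold value are all simplifying assumptions; the paper uses the stationary wavelet transform on 2-D images.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar denoising: smooth the approximation band,
    soft-threshold the details, then invert the transform."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)   # approximation band
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)   # detail band
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    a = np.convolve(a, np.ones(3) / 3, mode="same")  # Wiener-filter stand-in
    out = np.empty_like(signal, dtype=float)
    out[0::2] = (a + d) / np.sqrt(2)                 # inverse Haar step
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)
denoised = haar_denoise(noisy, threshold=0.4)
```

Combining a smoothed approximation with thresholded details, as here, is the "fusion" idea: each band gets the treatment that suits its statistics.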
Steganography is a technique for concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed that trains a model to hide a video (or images) within another video using convolutional neural networks (CNNs). By using a CNN, two main goals of any steganographic method can be achieved. The first is increased security (difficulty of being observed or broken by a steganalysis program), which was achieved in this work because the weights and architecture are randomized. Thus,
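For contrast with the CNN-based approach, a classical least-significant-bit (LSB) baseline is easy to sketch: the top bits of a secret image are written into the low bits of a cover image. This is plainly a different, much weaker technique than the paper's learned embedding; the 8x8 random images and the 2-bit budget are arbitrary illustration choices.

```python
import numpy as np

def hide_lsb(cover, secret, bits=2):
    """Embed the top `bits` bits of `secret` into the low bits of `cover`."""
    cover = cover & ~np.uint8(2 ** bits - 1)          # clear cover's low bits
    return cover | (secret >> (8 - bits))             # insert secret's top bits

def reveal_lsb(stego, bits=2):
    """Recover an approximation of the secret from the stego image."""
    return ((stego & (2 ** bits - 1)) << (8 - bits)).astype(np.uint8)

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, (8, 8), dtype=np.uint8)
secret = rng.integers(0, 256, (8, 8), dtype=np.uint8)
stego = hide_lsb(cover, secret)
recovered = reveal_lsb(stego)
```

Fixed LSB positions are exactly what steganalysis programs look for, which motivates the paper's point that a CNN with randomized weights and architecture is harder to detect and break.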
Content-based image retrieval (CBIR) is a technique used to retrieve images from an image database. However, the CBIR process suffers from low accuracy when retrieving images from an extensive image database and does not ensure the privacy of images. This paper addresses the accuracy issue using deep learning techniques, namely a CNN, and provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS is proposed that includes two parts. The first part (offline processing) extracts automated high-level features from the flattening layer of a convolutional neural network (CNN) and then stores these features in a
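The retrieval half of such a system reduces to comparing a query's feature vector against stored vectors; a minimal cosine-similarity sketch is below. The 3-dimensional "flatten-layer features" are hypothetical stand-ins for real CNN activations, and the CKKS encryption layer is omitted entirely, so this shows only the plaintext matching step.

```python
import numpy as np

def retrieve(query_vec, db_feats, top_k=2):
    """Rank stored feature vectors by cosine similarity to the query
    and return the indices of the top_k best matches."""
    q = query_vec / np.linalg.norm(query_vec)
    db = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)
    sims = db @ q                     # cosine similarity to each stored image
    return np.argsort(-sims)[:top_k]  # best matches first

# Hypothetical flatten-layer features for four database images
db_feats = np.array([[1.0, 0.0, 0.0],
                     [0.9, 0.1, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
query = np.array([0.95, 0.05, 0.0])
ranked = retrieve(query, db_feats)
```

Under CKKS the dot products would be evaluated on encrypted vectors, which is what lets the server rank matches without seeing the features themselves.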
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced artificial intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev