It must be emphasized that media studies stands among the human sciences that fuse older and more recent disciplines, and that its discoveries are the physics of the new communication. The theoretical physicist Michio Kaku confirms this in his book "Visions": "As a research physicist, I believe that physicists have been particularly successful at predicting the broad outlines of the future. Professionally, I work in one of the most fundamental areas of physics, the quest to complete Einstein's dream of a 'theory of everything.' As a result, I am constantly reminded of the ways in which quantum physics touches many of the key discoveries that shaped the twentieth century." He then observes that physical discoveries stand at the service of the media process: "In the past, the track record of physicists has been formidable: we have been intimately involved with introducing a host of pivotal inventions (TV, radio, radar, X-rays, the transistor, the computer, the laser, the atomic bomb), decoding the DNA molecule, opening new dimensions in probing the body with PET, MRI, and CAT scans, and even designing the Internet and the World Wide Web." The research is delimited to the beginning of the third millennium, i.e., after the emergence of the phenomenon of the Italian Berlusconi, and its importance lies in diagnosing a new phenomenon that runs contrary to the ethics of media work. The research problem nonetheless remains present, represented by the continuous drive to bring the three authorities together; their disengagement will therefore be a priority of the research. The major breakthroughs achieved in the media make this monitoring task a difficult one. It is clear from the foregoing that the ethics of media work face risks as serious as the technological development that has allowed the media to reach such high levels.
A new threat has been diagnosed in "liberal globalization," which seeks to form multinational media groups combining the power of politics, media, and finance. On the other hand, some are working to halt this new media invasion. We have referred to the attempts of the media expert Ignacio Ramonet, who called for the establishment of a fifth authority and of a media watchdog, but this work has yet to yield any material return, while the liberal-globalization front moves like a fast train that can reach any destination. In our view, what is required is the dissemination of a "media culture" that gives the masses a media immunity, meaning that the average recipient can understand and analyse the media game through its source, its dimension, and its goal. It is this civic sense of the media that will help uncover the mysteries of the great media game. It also reduces the clash among the three authorities - the authority of the state, the authority of the media, and the authority of money - but it will not be able to dismantle or cancel that clash at the current stage.
Non-steroidal anti-inflammatory drugs (NSAIDs) contain a free –COOH group, which is thought to be responsible for the GI irritation associated with all traditional NSAIDs. Esterification of this group is one approach to the ultimate aim of reducing gastric irritation; in this study we therefore synthesized and preliminarily evaluated new ester compounds as analogues with expected selectivity toward the COX-2 enzyme. Synthetic procedures were successfully developed for the generation of the target compounds (III a and b). The synthetic approach involved a multi-step procedure: synthesis of 4-hydroxybenzene sulphonamide (I b), synthesis of the naproxen and ibuprofen acyl chlorides, and their subsequent reaction with 4-hydroxybenzene sulphonamide (I b).
Background: The present study aimed to assess the distribution, prevalence, and severity of malocclusion in Baghdad governorate in relation to gender and residency. Materials and Methods: A multi-stage stratified sampling technique was used to make the sample representative of the target population. The sample consisted of 2700 intermediate-school students aged 13 years (1349 males and 1351 females), representing 3% of the total target population. A questionnaire was used to determine the students' perception of occlusion and their demand for orthodontic treatment; occlusal features were assessed by direct intraoral measurement using a vernier caliper and an instrument for measuring rotated and displaced teeth. Results a
In this research, the effect of substituting sucrose with different levels of DS and DG (0, 25, 30, 50, 70 and 100%) on the physicochemical, microbial, and sensory properties of cake was studied. Cake samples were also examined for microbial content and composition before, during, and after 35 days of storage at the experimental temperature. Results showed no significant differences (p < 0.01) in the physicochemical composition of the date and grape test cakes for protein values, while there were significant differences for ash, fiber, and fat content values. Sensory assessment results showed highly significant differences (p < 0.01) among the cake trials, with the exception of texture (6.04-6.
Image segmentation using bi-level thresholds works well for straightforward scenarios; however, dealing with complex images that contain multiple objects or colors presents considerable computational difficulties. Multi-level thresholding is crucial for these situations, but it also introduces a challenging optimization problem. This paper presents an improved Reptile Search Algorithm (RSA) that includes a Gbest operator to enhance its performance. The proposed method determines optimal threshold values for both grayscale and color images, utilizing entropy-based objective functions derived from the Otsu and Kapur techniques. Experiments were carried out on 16 benchmark images, which inclu
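To make the underlying criterion concrete, the following is a minimal sketch of the bi-level Otsu method that the abstract's objective function generalizes to multiple thresholds; it is an illustration of the Otsu criterion only, not the authors' RSA-based optimizer, and the function name `otsu_threshold` and the synthetic bimodal image are assumptions for the example.

```python
import numpy as np

def otsu_threshold(image, levels=256):
    """Return the threshold maximizing Otsu's between-class variance."""
    # Normalized grayscale histogram
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, levels):
        w0, w1 = prob[:t].sum(), prob[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue  # skip degenerate splits
        mu0 = (np.arange(t) * prob[:t]).sum() / w0          # mean of class 0
        mu1 = (np.arange(t, levels) * prob[t:]).sum() / w1  # mean of class 1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal image: dark cluster at 50, bright cluster at 200
img = np.concatenate([np.full(100, 50), np.full(100, 200)]).astype(np.uint8)
t = otsu_threshold(img)
```

Multi-level thresholding replaces the single `t` with a vector of thresholds, which turns the exhaustive search above into the hard optimization problem that metaheuristics such as the improved RSA are designed to solve.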
Data scarcity is a major challenge when training deep learning (DL) models, which demand large amounts of data to achieve exceptional performance. Unfortunately, many applications have too little data to train DL frameworks. Labeled data usually requires manual annotation by human annotators with extensive background knowledge, a process that is costly, time-consuming, and error-prone. Every DL framework must be fed a significant amount of labeled data to learn representations automatically; in general, more data yields a better DL model, though performance is also application-dependent. This issue is the main barrier for