Kilicdaroglu’s “dark web” allegations: experts say distinguishing fakes is often difficult

Kilicdaroglu’s claim that Presidential Communications Director Fahrettin Altun and his team are “cutting deals on the dark web” sparked debate over cyber interventions in the upcoming election. According to experts, users often cannot distinguish deepfakes from real content.

CENGIZ ANIL BOLUKBAS- Republican People’s Party (CHP) Chair and the Nation Alliance’s presidential candidate Kemal Kilicdaroglu said on April 29, during his campaign rally in Kayseri, that “dirty games will be played in the last ten days before the election.” Late on May 1, Kilicdaroglu also shared a post on his Twitter account saying, “There are two more days until the final ten days. Let me make my final warning. Fahrettin Altun, Serhat, and his teammates Cagatay and Evren; the dark web world that you are trying to make a deal with will get you in trouble with foreign security [agencies]. Playing Cambridge Analytica is out of your league, kids. This is my final warning.”

Kilicdaroglu’s allegations sparked a debate on “cyber interventions in the election.” According to claims by CHP members, “dark websites” run by “dark, dirty networks” will be used in an “attrition campaign” of fake voice recordings and video montages designed to put Kilicdaroglu in a bad light.

"IT IS POSSIBLE TO PURCHASE FAKE CONTENT"

NewsLabTurkey Strategy Coordinator Ahmet Alphan Sabanci stated that although the technical definition of the “dark web” is broad, the term is generally used for websites that are accessed anonymously and where many illegal services and products are sold and shared. Sabanci noted that these sites can be used to buy not only illegal products such as drugs, but also fake social media accounts, malicious software, and more. He also said, "These sites can be used for campaign or propaganda purposes during election periods, and you can buy services from someone who produces fake content, stolen personal data, or bot social media accounts. You can also make these bot accounts share the requested content and campaigns on social media to campaign on your behalf."

"ORDINARY PEOPLE MAY NOT BE ABLE TO DIFFERENTIATE BETWEEN REAL AND FAKE CONTENT"

Artificial intelligence expert and Bogazici University faculty member Professor Cem Say noted that "deepfake" technology has been around for a number of years but has advanced significantly of late. Say pointed out that it is possible to replace the image of a speaking person with another person entirely, and that both high-quality montages and amateur versions can be produced. According to Say, it is quite difficult for people without expertise to differentiate between real and fake content:

"Amateur content is immediately recognizable, but only those knowledgeable in the matter can recognize them. Ordinary people may not be able to distinguish between [real and fake content]. Besides that, [we see now] higher-quality works that even experts have difficulty recognizing. People who know their way around a computer can make applications that detect fakes with little effort. However, there are others who will fall for even the lower quality content. These people may be affected by the fake images being spread. Besides, these works can also be produced in high quality. It is all actually a matter of money. If there is $128 billion at stake, a few million dollars can be spared to make high-quality fakes."

Sabanci pointed out that although the quality of the content produced varies, there are telltale signs that reveal the artificiality of “deepfakes” and similar material; still, he agreed with Say that it is not always possible for an ordinary internet user to detect fakes. According to Sabanci, as producers’ skill levels increase, these traces become fainter and harder to detect. He therefore suggests that users adopt a more skeptical approach to the content they encounter during this period and pause to examine it before sharing:

"Details that are not immediately noticeable may become visible after spending a little more time. More importantly, users should pay attention to other details related to the shared content, such as who is sharing it and how, their degree of reliability and social media history of the people who share it, and what journalists, technology experts, and verification experts say about the authenticity of the content.”

"PEOPLE COULD BE DISCOURAGED FROM VOTING"

Say emphasized that fake images and recordings do not need to influence a large group of people; an impact on even a small group can change the outcome of the election. He noted that similar attacks have occurred during election campaigns in many countries, and mentioned the "Cambridge Analytica" scandal.

"The fate of the election may not necessarily be determined by millions of people in Istanbul. The 10,000 people in a smaller city can also change the election’s outcome. Such people may be specifically targeted. Strategies to influence the 3,000 voters who are currently undecided but who are leaning toward the Nation Alliance could be put into motion. After all, there is an alliance, and there may be differences of opinion within it. These people may be discouraged from voting as a result. This was the gist of Cambridge Analytica."

"VERIFICATION TOOLS SHOULD BE PROVIDED"

Sabanci stated that as technology develops, it becomes increasingly difficult for internet users to identify and verify such content on their own and that actions should be taken to help users become more critical media consumers.

Experts agree that it is difficult for ordinary people to distinguish deepfake content from real content, which raises the question of what preventative mechanisms are available.

In this regard, Sabanci expressed that the bulk of the responsibility often falls to journalists, verification platforms, and similar entities:

"One of the most important things that can be done is to support these institutions and provide resources that will help them work as quickly as possible. We need to increase the visibility of these institutions, ensure that news and verifications on these issues reach more people, and increase confidence in these sources."

"INTERNET LITERACY IS A MUST"

Say stated that internet literacy is a necessary precaution, but that it is not well developed in Turkey, and that too many people believe fake images and videos on social media. He added that there is no foolproof method of preventing fakes and that political parties should quickly create organizations able to spread the truth when faced with fake images and recordings:

"The biggest problem is that closed ads are closed. For example, if Kilicdaroglu knew that only one person had been exposed [to fakes], he could talk to that one individual and convince them of the truth. However, there is no way for him to know who has been exposed. There is a need for organizations that can say “Do not spread such content” and which can rapidly prepare responses to fake content when needed. We also need to instill a culture in which people do not immediately believe what they see. This is akin to ‘hacking’ people. Just as we install programs on our computers to prevent them from being hacked, we need to be ready and able to respond in this matter too. However, when it comes to this issue, it may already be too late. Moreover, it is a fact that no response to a lie garners as much attention as the lie itself."