
5 Ways AI Could Fuel Global Persecution

April 28, 2023 by Josh Depenbrok in Persecution updates

In March 2023, prominent technology experts including Elon Musk and Steve Wozniak signed an open letter calling for a pause in AI development to allow more time to research societal impacts, particularly around issues of ethics and safety. If we’re not careful, warned Stephen Hawking, “the development of artificial intelligence could spell the end of the human race.”

The misuse of AI could certainly spell the end of freedom for Christians and religious minorities around the world. We’re already seeing national governments, terrorist groups and other bad actors misuse digital technology for nefarious ends, including violent attacks and oppression. Persecuted Christians and religious minorities are often among the most vulnerable communities around the globe, and the exploitation of AI could make things even worse for them.

Here are five ways AI could fuel global Christian persecution:

Surveillance and facial recognition 

It’s now easier than ever to track someone going to church. AI-powered surveillance and facial recognition cameras can be used to monitor the activities and movements of individuals and groups, making it easier for persecutors to identify and target them. China already leads the way, having built a high-tech surveillance state that uses facial recognition to monitor its citizens. More than 500 million street cameras can pick people out of crowds with facial recognition software, and the recorded data can be automatically searched, analyzed and potentially combined with China’s powerful social credit system.
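
To see why tracking at this scale is cheap, consider a minimal sketch of watchlist matching, the core operation in a face-recognition pipeline. Everything below is illustrative: the embeddings are random stand-ins for the vectors a trained face-recognition network would produce, and the names and threshold are hypothetical.

```python
# Minimal sketch of face-recognition watchlist matching: each face is
# reduced to an embedding vector, and a camera frame is flagged when
# its embedding lies close to a stored identity. The embeddings here
# are random stand-ins for the output of a trained network.
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM = 128  # a typical embedding size for face-recognition models

# Hypothetical watchlist: name -> reference face embedding.
watchlist = {f"person_{i}": rng.normal(size=EMB_DIM) for i in range(1000)}

def identify(face, threshold=0.7):
    """Return the closest watchlist identity if similarity clears the threshold."""
    best_name, best_sim = None, -1.0
    for name, ref in watchlist.items():
        sim = np.dot(face, ref) / (np.linalg.norm(face) * np.linalg.norm(ref))
        if sim > best_sim:
            best_name, best_sim = name, sim
    return (best_name, best_sim) if best_sim >= threshold else (None, best_sim)

# A face seen on camera that resembles a watchlist entry is flagged.
probe = watchlist["person_42"] + rng.normal(scale=0.1, size=EMB_DIM)
print(identify(probe))  # -> ('person_42', ~0.99)
```

Nearest-neighbor search over even very large watchlists is fast on commodity hardware, which is what makes city-scale automated identification feasible.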

Chinese companies have already exported this surveillance technology to some 60 countries. Protesters in Myanmar say that CCTV surveillance cameras powered by Chinese-made facial recognition software are being used to track and arrest them. In Iran, police will soon use smart cameras to identify and penalize violators of the hijab law. Women who violate the strict dress code will receive warnings followed by summonses to appear in court.

Censorship and content filtering

AI-powered tools like ChatGPT reflect the data on which they are trained. That makes them susceptible to censorship by governments that want to target particular groups or ideologies, like persecuted Christians. Queries like “Where should I go to church on Sunday morning?” or “Should I attend church?” could receive responses like “We don’t recommend going to church” or “Attending church will negatively impact your social score.”
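
How little engineering such censorship requires is worth noting. Below is a minimal sketch of an output filter wrapped around a chatbot; everything here is hypothetical, and `generate_answer` is a stand-in for whatever model a censor might deploy.

```python
# Minimal sketch of output-side censorship: a filter layer sits
# between a language model and the user and swaps out any answer that
# touches a banned topic. All names and responses are illustrative.
BANNED_TOPICS = {"church", "worship", "bible study"}

def generate_answer(prompt: str) -> str:
    # Hypothetical stand-in for a real chatbot backend.
    return "Local congregations meet on Sunday mornings at ..."

def censored_chat(prompt: str) -> str:
    answer = generate_answer(prompt)
    text = (prompt + " " + answer).lower()
    if any(topic in text for topic in BANNED_TOPICS):
        return "We don't recommend going to church."
    return answer

print(censored_chat("Where should I go to church on Sunday morning?"))
# -> We don't recommend going to church.
```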

Basic search engine results can be easily, yet subtly, manipulated. A recent study compared word embeddings trained on Chinese-language Wikipedia with embeddings trained on Baidu Baike, an online encyclopedia run by the Beijing-based search company Baidu. In the Wikipedia-trained model, words like “election” and “democracy” were positively associated with nouns like “stability.” In the Baidu Baike model, terms like “surveillance” and “CCP” (Chinese Communist Party) had positive associations, while “democracy” was associated with negative words like “chaos.”
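
For readers curious how such associations are measured, here is a minimal sketch of the underlying technique: computing cosine similarity between word vectors learned from each corpus. The four-dimensional vectors below are toy stand-ins invented for illustration; the actual study trained full word-embedding models on each encyclopedia.

```python
# Toy illustration of measuring word associations in embeddings
# trained on different corpora. Real embeddings have hundreds of
# dimensions and are learned with word2vec/GloVe-style training.
import numpy as np

def cosine(a, b):
    """Cosine similarity: near 1.0 = strongly associated, near 0 = unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings from two corpora (values invented).
wiki = {"democracy": np.array([0.9, 0.1, 0.0, 0.2]),
        "stability": np.array([0.8, 0.2, 0.1, 0.3]),
        "chaos":     np.array([0.0, 0.9, 0.7, 0.1])}
baidu = {"democracy": np.array([0.1, 0.8, 0.6, 0.2]),
         "stability": np.array([0.9, 0.1, 0.0, 0.3]),
         "chaos":     np.array([0.2, 0.9, 0.5, 0.1])}

for name, emb in [("Wikipedia", wiki), ("Baidu Baike", baidu)]:
    print(name,
          "democracy~stability:", round(cosine(emb["democracy"], emb["stability"]), 2),
          "democracy~chaos:", round(cosine(emb["democracy"], emb["chaos"]), 2))
# Wikipedia   -> democracy~stability: 0.98, democracy~chaos: 0.1
# Baidu Baike -> democracy~stability: 0.24, democracy~chaos: 0.98
```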

Deepfakes 

Perhaps the most worrisome abuse of AI is the deepfake. Using a form of artificial intelligence called deep learning, deepfake videos digitally replace one person’s likeness with another’s to fabricate events or speeches, and they’ve become more convincing than ever. Fake videos of Barack Obama and Mark Zuckerberg have made the rounds, and deepfake technology can even create convincing characters who don’t actually exist.

In the hands of the wrong people, this technology could be weaponized with devastating consequences. Videos of pastors or faith leaders could be manipulated by bad actors to make them appear to say something blasphemous or insulting, giving enemies a pretext for harassment, arrests and violence. Fictional churchgoers could be created to coax “fellow” Christians into revealing personal information that can be used against them or divulging the locations of secret underground churches. Doctored videos could be used by hostile states, terrorist groups or criminal organizations for blackmail and shakedowns. Unfortunately, the possibilities are nearly endless.

Predictive policing 

U.S. police departments have begun embracing predictive policing algorithms to anticipate where crimes are likely to occur. The downside is that the data feeding these algorithms is often driven by historical arrest rates, which can disproportionately reflect minority communities. Police departments then concentrate patrols in those communities, generating more recorded arrests, which in turn raise the algorithm’s predictions, a feedback loop that leads to over-policing.
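
That feedback loop is easy to demonstrate. The toy simulation below, in which every number is an illustrative assumption, gives two districts identical true crime rates; the district that happens to start with one more recorded arrest ends up absorbing every patrol.

```python
# Toy simulation of a runaway feedback loop in predictive policing:
# patrols follow past arrest counts, and arrests are only recorded
# where patrols go. All figures are illustrative assumptions.
import random

random.seed(1)
true_crime_rate = [0.3, 0.3]   # two districts with identical crime
arrests = [5, 4]               # district 0 starts one arrest ahead

for day in range(365):
    # "Predictive" step: send the patrol wherever arrests have been highest.
    patrolled = 0 if arrests[0] >= arrests[1] else 1
    # Crime occurs in both districts, but only patrolled crime is recorded.
    if random.random() < true_crime_rate[patrolled]:
        arrests[patrolled] += 1

print(arrests)  # roughly [115, 4]: a huge recorded gap despite equal crime
```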

Similarly, hostile governments could easily weaponize this technology to predict where Christians and religious minorities are likely to meet for worship services, whether in churches or in small groups in homes. Police officers, government agents or terrorists could lie in wait until a religious group gathers and then move in to arrest, violently attack or even kill. Governments could even collude with mobs, sending them to locations where they believe religious minorities might be gathering.

Autonomous weapons 

Another frightening risk for the persecuted is weaponry controlled by AI. Lethal Autonomous Weapon Systems use artificial intelligence to locate and destroy targets on their own, and they are subject to few regulations. These weapons are already dangerous enough, but they pose an even greater threat when they fall into the wrong hands. Hackers working on behalf of bad actors could take control of the weapons and turn them on their enemies with devastating consequences.

Terrorist groups like Islamic State West Africa Province (ISWAP) already operate elaborate communication and technology systems that use drones, social media, satellite wi-fi, archiving software and even a media team. Research shows that ISWAP is already testing delivery drones to carry improvised explosive devices (IEDs), including assessing the weight they can carry and the distance they can travel. Persecuted Christians and religious minorities in countries like Burkina Faso and Nigeria already face extreme levels of violence at the hands of jihadist groups. The addition of IEDs delivered from the air would be catastrophic.

These examples barely scratch the surface. AI also raises concerns about socioeconomic inequity, phishing and invasion of privacy, to say nothing of the unintended consequences we haven’t yet imagined. As artificial intelligence evolves ever faster, we need to pass legislation now to regulate how AI is developed and used. Otherwise, it will be too late, especially for persecuted Christians and other religious minorities around the world who are already among the most vulnerable.

About The Author
Josh Depenbrok is a staff writer for Global Christian Relief, a nonprofit Christian ministry that works to strengthen persecuted believers and raise awareness regarding Christian persecution. For more information, visit our website at GlobalChristianRelief.org.