A viral video allegedly showing Philippine President Ferdinand Marcos Jnr doing drugs has caused serious concern among government officials, who called it a “maliciously crude attempt” to undermine his government.
Analysts warn that the deepfake technology likely used to make the video represents an increasingly serious threat to the Philippines, which has already seen several other high-profile instances of AI being used to generate disinformation meant to cause political instability.
The new viral video was released by Duterte supporters during a rally in the United States before Marcos Jnr’s State of the Nation Address on Monday, according to the Department of National Defence.
“The obviously fake video being circulated emanating from a Maisug gathering in Los Angeles is again a maliciously crude attempt to destabilise the administration of President Marcos Jnr. They will not succeed!” defence ministry spokesman Arsenio Andolong said in a statement.
Maisug, which means brave, is a group associated with the Duterte family.
Defence Secretary Gilberto Teodoro Jnr assured the public the military would continue to support Marcos Jnr’s administration and follow the chain of command.
“It is obvious from the video that that is not our president. Their video is fake and obviously not real,” Teodoro said.
“We will resist and fight these childish attempts to weaken the constitution and our institutions. We will coordinate with all government agencies to suppress those behind this unpleasant and horrible plot.”
The Department of Justice condemned the “fake video”, saying it would pursue necessary actions to identify and prosecute those behind the material.
“The timing of the release of this fake video, occurring just before the President’s State of the Nation Address, unmistakably indicates an intent to undermine the credibility of the President and the critical speech he is set to deliver,” it said in a statement.
Police forensic experts held a news conference on Tuesday to prove the man in the video was not the president, presenting photos of Marcos Jnr and the unidentified man to compare their facial features.
Enlarged images of their faces and right ears were placed side by side to demonstrate that Marcos Jnr’s ear was larger in proportion to his face and had a different shape from those of the other man, whose ear curled over at the top.
“Be it AI or impostor or whatever it was, as far as the [police] is concerned that is not the president,” Interior Secretary Benjamin Abalos said on Tuesday.
“That is a different person based on the ear. That’s not even considering the jawline and the entire facial structure,” Abalos said, describing the video as “malicious”.
Abalos had earlier said those behind the proliferation of the video would be held liable under the Cybercrime Prevention Act of 2012.
This is not the first time Marcos Jnr has been targeted by deepfakes. In April, an “audio deepfake” clip of Marcos Jnr directing his military to act against China also caused serious concern among government officials.
In the manipulated audio, the deepfaked voice of Marcos Jnr can be heard saying he has signalled his military to “take action” if China attacks the Philippines, adding he can no longer allow Filipinos to get hurt by Beijing.
The Philippines is locked in a long-standing territorial dispute with China in the West Philippine Sea, Manila’s name for South China Sea waters that lie within its exclusive economic zone.
Duterte’s denial
On Monday, former president Rodrigo Duterte denied having a hand in the release of the video but said he believed it to be authentic.
Duterte doubled down on his allegation that the president was a drug user, a claim he first made in January, adding that denial was the “weakest form of defence”.
“With due apologies to all the experts who vouched for the authenticity of the video, the refusal of President Marcos to undergo the hair follicle drug test is the best proof not only of the video’s authenticity but worse, his drug addiction,” Duterte said in a statement.
“The members of the Maisug leadership were just as surprised as the rest of the country when they saw the video for the first time.”
Serious threat
Edmund Tayao, president and CEO of the think tank Political Economic Elemental Researchers and Strategists, highlighted the threats that advances in technology, particularly cyberattacks and AI, could pose to the Philippine political landscape.
“Note especially that in terms of infrastructure, we are still catching up, not to mention that our governmental structure requires a more systemic coordination. China is now most advanced in this technology plus undeniably has a stake in Philippine politics, especially as to who is making critical decisions,” Tayao said.
When asked if the Dutertes had knowledge of a fake video of Marcos Jnr, Tayao said he hoped no Filipino was involved.
“If not, then we really are in serious trouble as it shows some of us are willing to work with the proverbial devil just to get what we’re after,” he said.
Ramon Beleno III, head of the political science and history department at Ateneo de Davao University in Davao City, told This Week in Asia that deepfakes were cheap and easy to deploy during elections, and that people could be easily persuaded by them.
“Given that it will be seen and witnessed by people with little knowledge, they will be persuaded that it’s true. If this is our behaviour during election time, we will not be able to choose the right people because we will be swayed by incorrect information,” he said.
Last week, Commission on Elections chairman George Garcia said the agency was set to release guidelines in August on the use of AI to promote candidates in next year’s midterm elections.
Filipinos will vote in the May 2025 polls to choose 12 members of the Senate and more than 300 members of the House of Representatives, as well as thousands of local officials, ranging from governors and mayors to councillors.
Earlier this year, three lawmakers filed a bill seeking heavier penalties for crimes committed using deepfake technology.
The draft bill defines a “deepfake” as “any audio, visual or audiovisual recording created or altered through technical means, such as video recording, motion-picture film, sound recording, electronic image, or photograph, which are so convincing that a reasonable person would mistake it for an authentic representation of an individual’s speech or conduct”.
Illegal deepfakes can “infringe on copyrights, violate data protection, defame individuals, and intrude upon privacy”, according to the bill.
Additional reporting by Agence France-Presse