Google has announced updates to its advertising policies that require advertisers to disclose election ads that use digitally altered content to portray real or realistic people or events. The policy applies to images, video and audio content.
Under the updated political content policy, advertisers must select a checkbox that reads “altered or synthetic content” when setting up their campaigns. The move is part of an ongoing effort to combat election misinformation.
“We believe people should have the information they need to make informed decisions when viewing election ads that contain synthetic content that’s digitally altered or generated. Therefore, certified election advertisers in jurisdictions where certification is required must prominently disclose if their ads contain synthetic content that misrepresents real people or events,” Google said.
The company said the disclosure must be clear and placed in a location where users are likely to notice it.
Ads excluded from this new policy
According to the company, ads that contain synthetic content that is altered or generated in a way that does not materially affect the claims made within the ad are exempt from these disclosure requirements.
“This includes editing techniques such as image resizing, cropping, correcting color or brightness, correcting imperfections (e.g. removing ‘red eye’), and background edits that do not create a realistic depiction of real-life events,” Google said.
Deepfakes ahead of Indian Lok Sabha elections
Earlier this year, ahead of India’s Lok Sabha elections, a fake video of a Bollywood actor criticizing the ruling party went viral online. Separately, ChatGPT developer OpenAI said in May it had disrupted five covert influence operations that sought to use its AI models for online “deception” aimed at manipulating public opinion and influencing political outcomes.
Last year, Facebook’s parent company Meta introduced a similar policy requiring advertisers to disclose their use of AI or other digital tools to alter or create political, social or election-related content on Facebook and Instagram.