LONDON (Reuters) – Cadbury chocolate maker Mondelez, Lidl, Mars and other consumer goods marketers have pulled advertising from YouTube after The Times newspaper found the video-sharing site was showing clips of scantily clad children alongside the ads of major brands.
Comments from hundreds of paedophiles were posted alongside the videos, which appeared to have been uploaded by the children themselves, according to a Times investigation. One clip of a pre-teenage girl in a nightie drew 6.5 million views.
The paper said YouTube, a unit of Alphabet subsidiary Google, had allowed sexualised imagery of children to be easily searchable and had not lived up to its promises to better monitor and police its services to protect children.
In response, a YouTube spokesman said: “There shouldn’t be any ads running on this content and we are working urgently to fix this”.
The UK arm of German discount retailer Lidl, Diageo, the maker of Smirnoff vodka and Johnnie Walker whisky, and chocolate makers Mondelez and Mars confirmed they had pulled advertising campaigns from YouTube.
“We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content,” said Mars in a statement.
“We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally… Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
A Lidl UK spokeswoman said it was “completely unacceptable that this content is available to view, and it is, therefore, clear that the strict policies which Google has assured us were in place to tackle offensive content are ineffective”.
Diageo said it had begun an urgent investigation and halted all YouTube advertising until it was confident that appropriate safeguards were in place. Computer and printer maker HP blamed the problem on a “content misclassification” by Google and instructed Google to suspend all of its advertising globally on YouTube.
The Times investigation alleged that YouTube does not do enough to proactively check for inappropriate images of children and instead relies on software algorithms, external non-government groups and police forces to flag such content.
On Wednesday, YouTube announced a crackdown on sexualised or violent content aimed at “family friendly” sections of YouTube. (goo.gl/dE343u)
Johanna Wright, YouTube’s vice president of product management, promised tougher application of its user guidelines, removing inappropriate ads targeting families, blocking inappropriate comments on videos featuring minors and providing further guidance for creators of family-friendly content.
German sports goods maker Adidas said on Friday it took the issue raised by the Times very seriously and it was working closely with Google on “all necessary steps to avoid any re-occurrences of this situation”.
British telecoms company BT said it manually tested Google’s brand safety measures 20,000 times to check they work, but it was possible that “a small number of ads slip through and appear next to inappropriate content or content with inappropriate comments”. Those ads are removed immediately and the publishers blacklisted, it said.
Britain’s ministry in charge of digital affairs said the government had put in place earlier this year a new code of practice for social media companies requiring them to ensure they offer adequate online safety policies.
“The government expects online platforms to have robust processes in place and to act promptly to remove content and user accounts that do not comply with their own policies,” a spokesman for the Department for Digital, Culture, Media and Sport said.
Reporting by Eric Auchard; Additional reporting by Martinne Geller, Maria Sheahan and Kate Holton; Editing by Tom Pfeiffer