
Fake Videos Create New Problem for Corporate Reputation Management


Advances in video editing enable fraudsters to create deep fakes: doctored videos that portray people doing things they never did or saying words they never said. The videos can dub new words onto a speaker or swap one person's face onto another's body.

Imagine, for instance, a faked video of the CEO making racist or sexist comments or offering a bribe.

To demonstrate the devious nature of deep fakes, filmmaker and comedian Jordan Peele and BuzzFeed teamed up to create a spoof public service announcement of Barack Obama warning of the dangers of such deep fake videos. Peele replaces Obama's original words with his own. "You see, I would never say these things," says Obama – or rather Peele. "But someone else would. Someone like Jordan Peele."

What’s Real and What’s Not Real?

Corporate leaders and communications personnel will have trouble convincing the public that a video isn’t real. Even more troubling, as deep fakes become more common and the public becomes increasingly confused and cynical, businesses will also need to argue that real videos are indeed real.

Corporations need to add deep fakes to their PR crisis management and reputation management plans. They’ll need to be ready to communicate facts quickly and correct fictions before they spread.

Corporations will need to invest in new technology that detects video forgeries – and quickly. Experts predict a technological arms race between creators of fake videos and those trying to spot and debunk the forgeries before they spread widely on social media. Social media analytics will provide a critical weapon, predict experts writing in the Harvard Business Review.

“As with any crisis, social media analytics tools are critical when it comes to tracking the spread of misinformation,” write Aviv Ovadya, founder of the Thoughtful Technology Project and former chief technologist at the University of Michigan’s Center for Social Media Responsibility, and Hal Bienstock, a managing director at communications firm Prosek Partners. “These tools can help executives see whether a story is gaining traction and identify the most-influential people spreading the misinformation, whether wittingly or unwittingly.”
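The tracking Ovadya and Bienstock describe – gauging whether a story is gaining traction and finding the most-influential spreaders – boils down to measuring each account's downstream reach in a reshare tree. The sketch below is purely illustrative and assumes a hypothetical data model of (sharer, source) pairs; it is not drawn from any specific analytics tool.

```python
def reach(edges):
    """Rank users by downstream reach in a reshare tree.

    edges: hypothetical (sharer, source) pairs, meaning `sharer`
    reshared content they saw from `source`.
    Returns {user: total number of reshares anywhere below them}.
    """
    # Build the tree: source -> list of users who reshared from it.
    children = {}
    for sharer, source in edges:
        children.setdefault(source, []).append(sharer)

    # Count every reshare in a user's subtree, direct or indirect.
    def count(user):
        return sum(1 + count(child) for child in children.get(user, []))

    return {user: count(user) for user in children}

# b and c reshare from a; d reshares from b; e reshares from d.
shares = [("b", "a"), ("c", "a"), ("d", "b"), ("e", "d")]
print(reach(shares))  # → {'a': 4, 'b': 2, 'd': 1}
```

Here user "a" has the widest reach (four downstream shares), so a response team would prioritize engaging or flagging that account first.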

Experts Recommend a Unified Response

Social media platforms such as YouTube, Facebook, Snapchat, Twitter and Vimeo can take steps to counter deep fakes, but don't expect swift action without pressure from regulators or advertisers, Ovadya and Bienstock warn. Corporations may need to hit social networks in their wallets to get their attention. Procter & Gamble, for example, pulled $140 million in digital ad spending after its ads were placed next to questionable content.

Industries can also form lobbying coalitions and consider partnerships with consumer groups and NGOs. “We all must pitch in to support cross-company, cross-industry, and even cross-sector efforts to turn the tide. It will be incumbent on everyone with a stake in a reality-based society to work together to ensure that we can continue to discern fact from fiction,” write Ovadya and Bienstock.

The Danger of Shallow Fakes

A video clip that recently circulated on Twitter could be a harbinger of deep fake videos to come. The video shows CNN reporter Jim Acosta fending off a White House staffer who tried to grab his microphone during a press conference. White House Press Secretary Sarah Huckabee Sanders shared the video on Twitter, saying it supported the White House's decision to revoke Acosta's press pass. But CNN and others say the clip, posted by a conspiracy-theory news site that once suggested the 2012 Sandy Hook Elementary School shooting didn't happen, was doctored to make Acosta appear more aggressive. White House counselor Kellyanne Conway denied that the video had been altered but acknowledged it had been sped up.

Some say the clip was a shallow fake: a video with relatively minor, low-tech alterations. If a video with minor changes can fool viewers, imagine how convincing footage produced with high-tech tools could be.
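One low-tech alteration of this kind, speeding up a segment, leaves a simple statistical footprint: the gaps between frame timestamps shrink in the manipulated span. The sketch below is a toy illustration of that idea, not a real forensic tool; the timestamps and tolerance are invented for the example.

```python
from statistics import median

def flag_speed_anomalies(timestamps, tolerance=0.3):
    """Flag frame gaps that deviate sharply from the clip's typical interval.

    timestamps: presentation times (seconds) of consecutive frames.
    Returns indices of gaps differing from the median interval by
    more than `tolerance` (as a fraction of the median).
    """
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    typical = median(intervals)
    return [
        i for i, gap in enumerate(intervals)
        if abs(gap - typical) > tolerance * typical
    ]

# A ~30 fps clip where two frame gaps are half-length (a sped-up splice).
times = [0.000, 0.033, 0.066, 0.099, 0.116, 0.133, 0.166, 0.199]
print(flag_speed_anomalies(times))  # → [3, 4]
```

Real forensic detectors are far more sophisticated, but the principle is the same: manipulation tends to disturb regularities that genuine footage preserves.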

“The incident underscores the fears that video can be easily manipulated to discredit a target of the attacker’s choice — a reporter, a politician, a business, a brand,” writes CSO senior writer J.M. Porup. “Reputational damage for an enterprise can cause stock prices to plummet, and result in long-term consequences for customers and shareholders who may no longer trust the truth about your business when they hear it going forward.”

Bottom Line: Deep fakes, fake videos that look remarkably realistic, present a new risk for corporate communications and PR. Organizations will need to be on guard against the emerging threat. Some experts recommend a concerted cross-industry effort to devise ways to identify and eliminate the fakes.