How we work:
As TikTok becomes the social platform of choice for companies looking to reach a teen audience, brands and agencies are learning the difficulty of maintaining brand safety around TikTok-specific content in influencer campaigns.
In the year and a half since formal ads were sanctioned on the platform, brands have been drawn into scandals involving adult nudity and sexual content, threats to minor safety, and regulatory struggles stemming from user-generated content.
Popcorn Growth provides brands with a toolkit to maintain trust and legal integrity. With clients ranging from brands that are heavily regulated by outside agencies to companies that are rebuilding from PR missteps, our influencer oversight and legal framework ensure the safety of user-generated content in even the most delicate of cases. As demonstrated in case studies from Nurx, Zebit, and BetterHelp, we provide the regulatory services needed to run influencer campaigns that grow, not damage, a brand’s reputation.
Our proprietary software scans past content for vulgar imagery (nudity, etc.) and vulgar language in captions.
We check that all influencers are at least 18 years of age
Check with SAG-AFTRA
In February 2021, the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) officially approved a new “influencer agreement,” extending union coverage to social media influencers and content creators.
You and/or your agency may be unknowingly violating SAG-AFTRA guidelines.
Unless your brand is a SAG-AFTRA signatory, we check the SAG-AFTRA database and confirm each creator's SAG-AFTRA status individually to ensure the creators you work with are not covered by the union.
While technology facilitates and expedites our compliance process, our compliance team acts as an extra layer of defense.
We ensure all creators understand and comply with FTC guidelines.
We conduct four-eye checks against a legal checklist. Our approval rate from client legal teams is 98% to 100%.
Our content specialist team checks content against brand guidelines and checklists.
Our proprietary content moderation software scans (1) images for nudity and other vulgar imagery and (2) text and captions for vulgarities.
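As an illustrative sketch only (not our actual proprietary system), the caption-screening half of a moderation pass like the one above can be thought of as a denylist filter over an influencer's past captions. The word list, function names, and profile format here are all hypothetical:

```python
# Illustrative sketch of a caption screen; the real proprietary software is
# far more sophisticated (image models, curated lexicons, context handling).
import re

# Hypothetical denylist; a production system would use a much larger,
# professionally curated lexicon.
FLAGGED_TERMS = {"damn", "hell", "nsfw"}

def is_clean(caption: str) -> bool:
    """Return True if the caption contains no flagged terms."""
    words = re.findall(r"[a-z']+", caption.lower())
    return FLAGGED_TERMS.isdisjoint(words)

def screen_profiles(profiles: dict[str, list[str]]) -> list[str]:
    """Return the handles whose past captions are all clean."""
    return [handle for handle, captions in profiles.items()
            if all(is_clean(c) for c in captions)]
```

A first pass like this is what lets an automated scan discard ineligible profiles in bulk before a human reviewer ever looks at them.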
NURX is a start-up that sells prescription drugs online (known informally as the “Uber for birth control”).
Working in the digital health space, the brand faces strict limits on the language it can use in advertising, and during their first TikTok campaign they ran into a number of unforeseen restrictions on their proposed copy.
To ease the scripting process, we detailed the legal constraints in a “most commonly-made mistakes” document that served as the foundation for NURX's social playbook, which also included legal and brand safety guidelines specific to TikTok. Separately, we individually trained influencers to pronounce the company name (“Nur-ex”) with custom videos, written phonetics, and personal phone calls, preserving the aesthetic integrity of the brand while guaranteeing legal compliance.
When we began work with Zebit, a buy-now-pay-later fintech company, their campaign brief contained 56 legal “dos” and “don'ts”, a reasonable precaution for a company that deals in credit.
But their list of language restrictions only grew as their legal team revisited it over time. Our influencer manager personally texted and called influencers each time a regulation was added or changed. After influencers submitted their videos, we ran four-eye checks on each one before sharing content with the client's legal team, a service we run for all of our clients. And because we know clients don't have time to go back and forth on legal questions, we take pains to ensure that our content is buttoned up before we submit it for review; across all our clients, our legal approval rate is 98%.
Because Zebit targets people with low credit to defer payment on consumer goods, a number of creators who would have matched the brand profile were ineligible because they had used vulgarities in their profiles. Our proprietary software scanned out nearly half of the prospective profiles on its first pass, identifying clean profiles and saving hours that a brand manager would have otherwise spent sorting through influencers.
BetterHelp, a remote psychotherapy provider, is a minefield of potential controversy: the brand needs to address mental health in its campaigns without letting influencers venture into dark personal territory.
Beyond creator content, comments must also be moderated carefully to weed out damaging replies. When the brand tried to handle its influencer campaigns in-house, it made damaging regulatory missteps: influencers promised “professional” help even though the terms of service explicitly did not guarantee treatment by licensed professionals.
We worked with the brand to verify influencer ages and therapist licensing, and to confirm that each personality had personally used the app. We applied a particularly fine-meshed screen to influencers' past content to filter out anything potentially controversial, and we engaged our 36-hour live comment-monitoring service to keep the conversation clean and brand-safe after creators posted. (We can remove comments on influencer content within 24 hours if necessary.)
We also implemented a custom ad disclosure overlay, going beyond a simple ad hashtag to signal branded content as clearly as possible.
Influencer marketing can pay off handsomely when properly executed, but execution is a resource-intensive process that spans several areas of corporate expertise. Our experience in talent management, regulatory compliance, and platform-specific guidelines streamlines the process for brands looking to reach young audiences, ensuring brand safety at every turn.