The meteoric rise of generative AI tools like ChatGPT and DALL-E has sent shockwaves through the digital landscape. Advertisers, eager to capitalize on the technology's potential, now face increasing restrictions from brands that fear its capacity for missteps. Meanwhile, the gaming industry wrestles with the double-edged sword of AI, which promises innovation but also raises concerns about creative control and the future of artistic roles within game development.
AI in Marketing: The Agency's Perspective on AI as a Force Multiplier
Ad agencies understand one thing clearly – the future is powered by AI. The technology is already embedded in various aspects of advertising, from media buying and campaign analysis to brainstorming and creative ideation. Naturally, many agencies are racing to explore the full potential of generative AI.
“Recently, we won three new pieces of business, and in the [master service agreement] it says, ‘You’re not allowed to use AI of any kind without prior authorization,’” revealed an anonymous CEO of an independent ad agency.
This underscores the growing trend of stringent AI clauses popping up in agency deals. The Association of National Advertisers has even updated its guidance, encouraging brands to include AI-related restrictions and consent clauses.
The Client's Perspective: Fear, Uncertainty, and a Desire for Control
Clients are understandably apprehensive. The rapid rollout of publicly available AI tools – and the associated stumbles – has made them hesitant. We've witnessed high-profile blunders like Google's Gemini chatbot producing problematic imagery and the legal controversy surrounding ChatGPT's training on a publisher's proprietary data, leaving brands concerned about copyright issues and the uncontrollable nature of AI outputs.
The recent debate sparked by Under Armour's generative AI-powered ad using repurposed Anthony Joshua footage highlights the ethical dilemmas surrounding the use of such technology for creative execution.
Practical Concerns: Protecting IP, Maintaining Brand Voice
Brand owners' top concerns center around protecting their intellectual property (IP) and preserving their unique brand voice. Imagine a poorly-trained AI unwittingly incorporating elements of a competitor's creative into your brand's work. Worse yet, sensitive customer information fed into an AI system could find its way into a competitor's toolkit.
Robert Wrubel, managing director at Silverside AI, an AI lab within Pereira O'Dell agency, confirmed, "Every client meeting starts with this conversation...to make sure the systems and the processes we use are working within the standards and frameworks that the brands want to establish."
Walled Gardens: AI in a Bubble
To mitigate the risks brands perceive, innovative agencies and brands are experimenting with closed AI ecosystems. These "walled gardens" protect data and ensure that AI-generated work doesn't inadvertently cross-contaminate other models or leak into public systems.
But what exactly constitutes a "walled garden"? From the agency perspective, this boils down to storing brand data in an isolated system and guaranteeing that generative AI processes never mingle with external models, assuaging brand IP anxieties.
AI's Ubiquity and the Need for a Nuanced Approach
Despite the caution, it's undeniable that generative AI is becoming deeply integrated into both commercial work and internet marketing. Brands from Coca-Cola to Hyundai have experimented with consumer-facing AI tools, letting the public generate personalized content. Major ad platforms like Google and Meta utilize AI automation for campaign creation and targeting.
Clients need to protect themselves and their IP, but excessively cautious "no AI" policies could stifle innovation. As Ashwini Karandikar, from ad agency trade group 4A's, points out, there's a crucial distinction between using generative AI for ad creation and employing machine learning for the vital tasks of data analysis, ad targeting, and measurement. Requiring blanket approval for every AI application might create unnecessary hurdles.
Generative AI in Gaming: Excitement Tempered by Growing Concerns
While the potential benefits of generative AI in gaming are intriguing, a growing wave of unease runs through the gaming community and even parts of the development world. Here's a breakdown of the prevalent worries and criticisms:
Gamer Worries: When Does AI Go Too Far?
- Lazy Design and "Sameness": Gamers fear that the ease of AI-generated content could lead to lazy development practices. If generic assets, quests, or even story elements become commonplace, games could lose their unique spark and start feeling like bland variations on a theme.
- The Death of Artistic Vision: While AI can be a powerful tool, gamers worry about a future where human creativity is sidelined and the distinctive styles of beloved artists and studios fade into homogenized AI-generated visuals and experiences. This concern goes beyond aesthetics: players are drawn to games not just for the mechanics or the challenge, but for the emotional connection they forge with worlds and characters brought to life by human artists. If AI takes over these creative tasks, games risk becoming soulless shells, devoid of the artistic spark that ignites a player's imagination and leaves a lasting impression.
The Threat to Artists and Developers
- Job Insecurity: It's no secret that certain tasks in game development are prime targets for AI automation. Concept artists, modelers, and even some programmers could see their roles diminished or made redundant if AI tools become too powerful. This raises serious questions about the future of creative careers in the gaming industry.
- Creativity as a Commodity: When AI can cheaply and rapidly generate art assets that are "good enough," studios might be tempted to cut costs and diminish the role of human artists. This devalues creativity and turns unique vision into easily replaceable output.
The Path Forward: Responsible AI in a Gamer-Centric Future
It's important to stress that these issues are not insurmountable. Here's how the industry can navigate them:
- Transparency is Key: Developers and publishers need to be upfront about the ways AI is used in-game. Gamers deserve to know when they're interacting with AI-generated content.
- AI as a Co-Pilot, Not a Captain: Imagine AI as a skilled assistant to the creative team. It can generate concept art variations, suggest level layouts, or create musical accompaniments based on a composer's style, but the human artist or designer retains final decision-making power, curating the best outputs, refining them, and weaving them into a cohesive vision. This ensures that the creative spark and the unique voice of the development team come through in the final product. In short, AI should empower human creators to bring their ideas to life more efficiently, not replace their creative judgment and artistic vision.
Looking Ahead
Generative AI, like many disruptive technologies, presents a complex mix of promise and peril in both advertising and gaming. In advertising, a delicate balance must be struck between innovation and maintaining brand identity. For gaming, the stakes are even higher. The unchecked use of AI threatens to erode artistic expression, endanger creative jobs, and diminish the unique spark that makes games a beloved art form. It's time for developers, publishers, and the gaming community to demand a future where AI serves creativity, not replaces it.