
The Newsletter | Edition 114
Progress Report is dedicated to providing inspiration for action. In this special newsletter series, The State Of, we dive a little deeper into the long-term work that comes after, in the places where we’re seeing new types of progress in action. From brand strategy to design, internet trends to sustainability, music to science, beauty to travel, and more.

When news recently broke of OpenAI’s AI-powered voice assistant “Sky,” the tech and culture worlds shivered with Spike Jonze / Scarlett Johansson déjà vu. OpenAI CEO Sam Altman’s fast-follow, one-word post on X fed the fire, leading some—including Scarlett Johansson herself—to express discomfort that Sky’s voice sounds “eerily similar to mine.”

Johansson goes on to say: “In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity.” This situation goes beyond the usual suspects of AI boogeyman-nery—that AI is laced with and amplifies human biases; that AI’s approaching sentience will lead to humanity’s extinction; that increasingly convincing deepfakes will rewrite our understanding of what is ‘real.’

Nope, the OpenAI debacle has set ablaze the creative industry's most confounding question: how should creators interface with technology that can convincingly mimic original thought, works, and even identity?

And creative knowledge workers feel the tension: on one hand, many of us are early adopters who see promise and opportunity in exploring and trialing AI tools. But many corporations are rightly focused on the protectionist aspects of this new technology—namely around protecting IP and how to manage legal exposure. Hell, these concerns are shared by plenty of independent creative knowledge workers as well.

Regardless, a dominant result of this tension is fear, which leads to a slowdown in invention. A slowdown in creation. A slowdown in imagination. In fact, more than 85 percent of innovation practitioners report that fear often or always holds back innovation, but only a quarter of organizations understand this fear, and fewer than 11 percent are doing anything about it. Our friend, colleague, and resident smartypants Joey Camire recently talked about this at length at Web Summit Rio.

The funny thing about these dark realities is that invention—and its inherent optimism—is exactly the light that can quash them.

As evidenced in any religion’s theology and history, the greatest power is the act of creation. Perhaps it's no surprise that many monotheistic religions also underscore the notion of “submitting” or “surrendering” oneself to that power.

Rituals aside, there’s a powerful clue for us to follow toward unlocking the creative potential of AI. What if we gave up that control? What if we came to grips with the limits of our control? What if we allowed ourselves to see where this could take us?

I recognize it can be psychologically and sociologically overwhelming to consider respecting, listening to, and giving space to AI. After all, doesn’t that imbue it with next-level power? I don’t know…what if it doesn’t?

When I think about what this means in our creative practices, four commitments come to mind:

  1. Reveal the truth in humanity’s beauty as much as its flaws.
    For many of us, our instinct is to use AI to generate the perfect image, the perfect line of copy, or the most moving concept. But what if we pointed prompts at what we often get wrong, rather than what we want to get right? Can we use AI to point out our knowledge gaps, our biases, and our assumptions in an effort to avoid the blind spots in our own creative pursuits?

  2. Prioritize collective good.
    Whether it's photography, graphic design, marketing, or journalism, a great deal of thought is put into the spectator: the audience absorbing the thing we’ve created. In pursuit of designing that single-minded embodiment of “who this thing is for,” we often fall into the trap of flattening them and imbuing them with our own biases and preconceptions. Plenty of pixels have already been spilled on the way AI is hardcoded to match the latent perspectives and beliefs of the programmers behind the keys, and as far as we can tell, we can’t stop that.

    However, as users, we can stay aware of when and how that bias appears in our prompts and in what is fed back to us. After all, it’s good to think of AI as a highly capable assistant, not an oracle with all the answers. But beyond staying tuned in, it’s about considering the humanity of who we’re creating for and encouraging the models to play toward that.

  3. Find fuel in the fear of the unknown.
    Humans have a wildly deep-seated protective instinct—ask anybody who hates horror movies. And it’s so, so tightly wound into that which we don’t know or can’t imagine. Fear itself, though, is not inherently bad or dangerous. And we can use our responses to fuel our imaginations. Can we try filling in the gaps of what we don’t know or can’t see with similar ideas we do know and can see? And can we learn to draw inspiration from this exercise instead of from fear?

    This does more than help improve the way we use AI tools—it resets our brains into a basal creative state. It pushes us to seek out the dark corners and replace fear with desire.

  4. Get drunk on imagination.
    Cal Newport writes a great deal on the benefits of deep work. So does Jenny Odell. The notion of flow state has popped up in business pop psych for decades. We all know that our lives are full of interruptions, tasks, and left-brain work (analysis, order), yet we still persist in holding our right brain (intuition, creativity) back. The oldest wisdom is the best wisdom: allowing ourselves the time, space, and freedom to create—as difficult as it is to seek out—unlocks the dopamine and energy we need to replace our vigilance with curiosity.

AI is more than a tool. It’s a spirit. One without form, one imbued with energy, and one that we can grow with if we choose not to control it, but to dance with it.

The truth is, we'll continue to see AI-driven "scandals" in the coming years (months? days?). We should explore the spirit of AI rather than its function. It is something that can ultimately help us if we take the leap to elevate our instincts beyond fear and embrace desire.

A version of this newsletter was originally published last month in Ad Age. See the original piece here.

Aaron Powers is a Partner at SYLVAIN, and leads the company’s business growth in the United States. He spends most of his time in the consumer tech, financial services, and entertainment spaces, and is a co-host of SYLVAIN’s weekly podcast, Critical Nonsense.
