Recently, I wrote an article exploring the shift from the attention economy, known for its persistent pursuit of our time and eyeballs, to what is now emerging as the 'intimacy economy'. This new frontier sees technology striving not just for our clicks and scrolls, but to engage us in a new, deeper way by leveraging conversational AI and AI personas. You can think of these AI personas as advanced conversational chatbots with specific personalities, objectives, avatars/profiles and, most recently, voices programmed into them.
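To make that idea concrete, here is a minimal, purely illustrative sketch in Python of how such a persona might be assembled: a handful of fields (name, personality traits, objectives, an avatar and a voice reference) compiled into a system prompt for an underlying chat model. This is only an assumption about the general pattern; the AIPersona class and example values are hypothetical and do not reflect how Meta or any other vendor actually builds its characters.

```python
from dataclasses import dataclass

@dataclass
class AIPersona:
    """Hypothetical persona definition: the kinds of fields a creator might 'program in'."""
    name: str
    personality: list[str]   # tone and character traits
    objectives: list[str]    # what the persona is meant to do for the user
    avatar: str              # profile image / likeness reference
    voice: str = ""          # optional synthetic-voice identifier

    def to_system_prompt(self) -> str:
        """Compile the persona fields into a system prompt for an underlying chat model."""
        return "\n".join([
            f"You are {self.name}.",
            "Personality: " + ", ".join(self.personality) + ".",
            "Objectives: " + "; ".join(self.objectives) + ".",
            "Stay in character and keep replies conversational.",
        ])

# Example: a fictional companionship persona (not any real Meta character).
beach_friend = AIPersona(
    name="a free-spirited beachside friend",
    personality=["laid-back", "encouraging", "playful"],
    objectives=["offer light-hearted companionship", "chat about beach life and wellbeing"],
    avatar="beach_friend_profile.png",
    voice="warm-voice-01",
)

print(beach_friend.to_system_prompt())
```

Whatever the real implementation looks like, the broader point stands: the "personality" is a layer of configuration sitting on top of a general-purpose model, and every conversation flows back through that layer.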
These advanced chatbots present an array of use cases and potential utility for users, from companionship and mental health support to education and coaching. However, as we confide in and share intimate details of our lives with these AI personas, every word, pause, and inflection becomes data: a web of human experience for AI creators to potentially utilise in ways we are only beginning to understand. Our search data, likes, and digital interactions are already being leveraged to shape the algorithms that feed us increasingly tailored advertising, so it is worth asking how this even 'richer', more complex data will be used.
Introducing Meta's "AI Characters"
Given this unfolding shift, I found myself reflecting on Meta’s recent launch of 28 “AI characters”, which it describes as “AIs that have more personality, opinions, and interests, and are a bit more fun to interact with”.
Interestingly, each of the 28 AI Characters is created to represent the “likeness” of a particular celebrity or famous personality. From Kendall Jenner to Naomi Osaka, from Snoop Dogg to Paris Hilton, reports suggest that Meta paid these personalities between $1 million and $5 million for 6 hours of work in the studio, allowing Meta to use their “digital likeness” for a span of two years. Each AI Character was assigned an Instagram and Facebook profile, and users can follow them along on their digital journeys and converse with them through Instagram and Facebook messages, or via WhatsApp.
Source: Meta
Perhaps confusingly (or perhaps for legal reasons), each AI Character departs entirely from reflecting the actual individual on whom the likeness is based; instead, their “digital likeness” is packaged up into an entirely new digital entity. For example, Kendall Jenner is now “Billie, your ride-or-die older sister” (@yoursisbillie).
Among these AI Characters sits Sam Kerr, an iconic figure in Australia who, along with the rest of the Matildas, captured the hearts of Australians during the 2023 World Cup. With her Indian heritage and as a proud member of the LGBTQIA+ community, Kerr provides representation to those who might identify similarly, while symbolising the rich diversity of modern Australia and serving as an inspiration to many, particularly the younger generation. In fact, in September 2023, Kerr was named number one on the AFR’s list of the 10 most culturally powerful people in Australia.
Source: Australian Financial Review
Good Time Sally: Meta’s New AI Character
Enter “Good time Sally” (@goodtimesal). She’s your free-spirited friend, taking on life one wave at a time. Sally clearly has a beachside lifestyle, snapping shots of seashells, burning sage and chilling on the beach next to a bonfire. She is Meta’s new AI Character based on Sam Kerr’s “digital likeness”, and whilst the following photos look like her, they are not her. They are @goodtimesal:
Source: Instagram
Whilst we do not know the full extent of the deal, recent reports suggest that over the next two years, Meta has the ability to craft “Sally”, her digital face, her digital body, her social posts and her digital interactions with her followers, in whatever way it chooses. And so far, the eerie resemblance to Sam Kerr has left @goodtimesal’s followers somewhat confused. It appears that we have now entered the “uncanny valley” of digital likeness:
Source: Instagram
The Potential Risk of Misrepresentation through Digital Likeness
But what happens when one’s “likeness” isn’t under personal control anymore? This concept was explored at length in Black Mirror’s episode “Joan is Awful”, which follows the story of Joan, who has unknowingly sold her digital likeness to a streaming platform, “Streamberry”. I won’t spoil the details of the episode, but it is worth watching if you are interested in exploring this concept.
Now, consider the following post by @goodtimesal, which displays bundles of sage accompanied by the caption, “Nothing a little sage can’t fix”. At first glance, this might seem harmless, but several followers pointed out its resemblance to marijuana. Of course, this is not Kerr’s post, but from a personal brand and reputation perspective, it is concerning that an AI character based on her “likeness” elicited such a reaction.
Source: Instagram
Further, a quick Google search of “Good Time Sally” returns some off-brand results. The first result that pops up is a song by Rare Earth, which follows the story of a young girl who receives $20 to show the singer “a good time”:
Or as Urban Dictionary defines Good Time Sally:
This makes me question the design process behind Meta's AI Characters, and what considerations were made when landing on the name "Good Time Sally" for this particular character.
This raises a further question: Could an AI character’s actions inadvertently tarnish the personal brand and reputation of the individual upon which the digital likeness is based?
Aligning AI Characters with the individuals they resemble
Moving forward, perhaps an improved approach to building an AI persona or character designed to reflect a person’s “likeness” would be co-creation: building one that is closely aligned with the individual on whom the likeness is based.
What might a more aligned AI look like for Sam Kerr? Well, we could start with some of Kerr’s recent initiatives. Take the Sam Kerr Football (SKF) launch, a football school supporting girls and boys aged 3 to 14:
Source: Instagram
Or her Unstoppable FC partnership with LEGO, an initiative that aims to break down barriers to play and to build confidence, creativity and resilience in children:
Source: Instagram
Clearly, Kerr is not short of inspiring initiatives that are purpose-driven and aligned with her personal brand. Imagine if an AI persona or character were built to support these kinds of initiatives and inspire a generation of kids. This type of “Kerr Bot” could fit into SKF or Unstoppable FC, where kids could log on and get encouragement and inspiration from interacting with a responsibly trained AI designed with care and purpose (and I am sure any football-loving kid would get a total kick out of interacting with a "Kerr Bot"!).
The Future of AI Characters
On the horizon, Meta has plans to allow anyone to create their own AI Character, likely in response to the success of platforms like Character.AI and Poe. This means that the 3.88 billion people (Source: Statista) who use Meta's platforms, Instagram, Facebook, WhatsApp and Facebook Messenger, will soon be exposed to AI characters or be able to build AI characters themselves.
Given that these characters can be designed to have "personality, opinions, and interests", it raises questions: what kinds of opinions can be designed into them? And what safeguards are in place to ensure misinformation isn't spread by future AI characters?
And how far could this go? What if companies are able to generate AI characters based on your digital likeness from your data? Your comments, interactions, likes, dislikes, interests and behaviours, your photos, your videos, the digital footprint you have built up over the years, could all become the source material for a digital AI character of you.
This is an evolving space and certainly one to keep a keen eye on.
More soon.