Getting access to OpenAI’s powerful text-to-video generator, Sora, is like winning a golden ticket to meet Willy Wonka. OpenAI still hasn’t announced a general release date for Sora. As a result, the world has only seen demos of what it can do in the creative hands of those with red-team access to the model.
On the latest episode of Creativity Squared, though, we take you inside the chocolate factory with Nik Kleverov, who’s been experimenting with Sora through his relationship with OpenAI.
Nik is an Emmy-grade storyteller known for his clean and captivating commercial style paired with a distinctive design sensibility. As an immigrant and son of a nonconformist artist who was persecuted by the Soviet government, Nik’s approach to creative work is informed by his unique life story. Beyond A.I. films, his credits also include work on main title sequences for shows like “Narcos” and live-action commercials.
Nik has won numerous industry awards as a creative director and launched innovative businesses as a successful entrepreneur. He’s the Co-founder and Chief Creative Officer at Native Foreign, a woman-owned creative agency and production company all-in-one which is based in Los Angeles.
It’s been almost exactly one year since Creativity Squared’s interview with Chad Nelson, who collaborated with Nik on Critterz, the groundbreaking first-ever animated short film that was 100% designed using A.I. visuals produced by DALL·E.
In Episode 48, you’ll hear Nik’s experience working with Sora in addition to other text-to-video tools. Nik’s perspective is informed by working with clients on commercials, as a filmmaker, and on a team creating original IP.
Nik also shares an agency owner’s perspective, discussing why Native Foreign is going all-in as next gen creatives and establishing an A.I. labs division within the agency.
Tune in to hear about one of Nik’s creative projects he got out of his vault to bring to life and how it will evolve as genAI video does as well. He dives into the creative process, the ever-changing A.I. landscape, and the paradigm shift underway in the advertising industry.
Artists need to be part of the development of A.I. tools and this is your chance to hear what Nik is sharing with OpenAI related to Sora and how you can stay ahead of the curve with A.I. and storytelling.
Critterz (2022) is the first animated short film designed entirely with A.I.-generated visuals from DALL·E. The film stood out among other videos of that year for how it maintained character consistency, which was very difficult to achieve with DALL·E at the time. More than that, it paired A.I. with narrative storytelling, going beyond the tech demos typical of GenAI image-making and video tools.
The essence of “Critterz” is a fun new spin on the classic theme of not judging a book by its cover. It’s a two-minute film that’s set in a fantasy world populated by otherworldly forest critters. The narrative starts out like a David Attenborough nature documentary before veering into a Monty Python-style sketch.
“Critterz” marked the beginning of Nik and Chad’s relationship with OpenAI, which provided a grant for the project. OpenAI has continued to commission work with Native Foreign, including a video for the company’s first DevDay, highlighting the human side of artificial intelligence.
With “Critterz,” while they set out to showcase the beautiful visuals that are possible with DALL·E, Nik says that human creativity drove the project forward. Alongside DALL·E tools like in-painting and out-painting, they used traditional filmmaking techniques to bring the film to life.
The “Critterz” release made waves in film and tech, showing at film festivals around the world and contributing to discussions around A.I. in creative industries. To this day, Nik feels there still haven’t been many story-driven projects like “Critterz” created with GenAI. Looking forward, though, he says he’s excited to see how tools like Sora empower creators and new voices to produce unique stories that otherwise wouldn’t get made.
Nik is the son of Valery Klever, a distinguished nonconformist artist who suffered persecution for refusing to make Soviet propaganda art. His rebellion drew the KGB’s scrutiny, leading to his imprisonment. Eventually, Nik’s family emigrated to America, escaping his status as a quasi-enemy of the state. Today, Valery’s artworks are exhibited in prestigious museums worldwide.
Nik says he sadly didn’t inherit his dad’s fine art skills. Instead, the creative gene manifested as a knack for storytelling that he’s had since he was a kid. He compares himself and his camera compulsion to the Ricky Fitts character in “American Beauty.”
Nik describes how Sora seems to interpret space in three dimensions — not just horizontally and vertically but also in depth. He says it’s almost like Sora understands the storyboard in his head, turning conceptual scenes into visual sequences and quickly becoming his most effective A.I. tool as a creative director.
When asked about the kinds of prompts he uses in Sora, Nik emphasizes the messiness of text-to-video prompting. Sometimes, you get the right look on the first shot, and other times, it might take an hour. As a result, Sora is great for storyboarding and other parts of the ideation phase, so to speak, to get what you’re thinking on paper.
The quality of the output isn’t a problem for two reasons. First, in contrast to overpolished commercial work, Nik finds the imperfect output that Sora and other text-to-video models produce to be a breath of creative fresh air.
Second, in Nik’s line of work, every frame undergoes meticulous scrutiny and editing, whether it’s filmed, CG, or animated. So even an imperfect Sora video can be adjusted to meet the quality standard. The real bottleneck is getting the A.I. to reproduce what you envision: finding the right prompt to conjure the image in your mind’s eye.
That high bar for production quality has historically distinguished amateur from professional work. New A.I. applications are promising to close that gap, both in the world of professional video production and amateur filmmaking.
Nik sees a future where A.I. tools are the majority of the filmmaker’s toolkit alongside traditional methods. In the short term, he says A.I. tools will continue to be helpful mainly for fleshing out ideas and pitching concepts.
That’s no small feat though; enabling creatives to quickly produce nearly finished-quality work is driving a big shift in creative productivity. Nik recognized this quickly through the “Critterz” project. Now he’s building that ethos of A.I. experimentation into Native Foreign by launching an A.I. labs division with plans to take on client work and build a pipeline of proprietary content.
In the long term, Nik foresees A.I. evolving to handle not only more of the post-production work but also more of the production process itself. He hopes A.I. enables more amateur and indie filmmakers to bring their visions to life by reducing barriers. By removing the need for support from big production studios, A.I. could help empower unique and underrepresented stories.
He referenced writer and director Cord Jefferson’s recent Oscars acceptance speech, where he asked why Hollywood prefers to fund a single $200 million movie when the same investment could support 40 films at $5 million each.
Nik says that the true value of GenAI, however, is allowing designers and creatives to focus more on what they love: being creative. He sees the A.I. shift not in terms of replacing jobs but transforming them, enabling his team to spend less time on tedious side tasks. He’s witnessed this value himself as a creative at heart who also has to manage the needs of his business ahead of indulging his passion projects.
His work on one of those projects shows how much A.I. can truly streamline creative development. Inspired by Anton Chekhov’s short story, “The Kiss 1.0” is a film that Nik’s website says “will never be finished.” It’s an animated retelling of a fictional World War One-era soldier whose life devolves into madness after a brief and mysterious kiss from a stranger in the dark.
After years of thinking about it, Nik used ChatGPT to rewrite Chekhov’s original story into first-person prose so he could have it translated into Russian and generate a voiceover. He then brought the characters to life and recreated the opulent brutalism of the late Russian Imperial period using a combination of DALL·E and Runway Gen-2, overcoming the major obstacles that had kept “The Kiss” stuck in the idea stage, sitting on a creative shelf collecting dust.
Nik’s goal is to periodically remake new versions of the two-minute film by feeding the same inputs into increasingly advanced generative video models, showcasing the evolution of A.I. video tools. His vision is to display all the versions simultaneously in an exhibit documenting how A.I. affects creativity and vice versa.
Nik envisions human creativity collaborating with A.I. tools like a human composer orchestrating the instruments in a symphony, where each A.I. tool has its part to play in the overall creative vision.
There’s already a wide variety of powerful generative video tools on the market. Nik walks through some of them, explaining how he uses them and what trends he’s excited about. We also discuss the copyright issues around GenAI and how they affect creators.
He reiterates that Sora is remarkably powerful — just type in words, and it transforms them into video. Runway is similar, offering a tool for editing videos and another tool for adding motion to still images.
The tools have their limits, though. Currently, you can only create clips up to four seconds long, and typically only the first few seconds are good enough to use before the quality deteriorates. Despite the constraints, Nik says they’ve been invaluable in his experiments with GenAI.
Pika is another tool offering text-to-video, but it’s trained on different data from Runway, which results in a unique look and feel. Pika is also developing features like a sound effects generator. You upload a clip, and it designs sound around it. This feature is still text-based, but one day it might allow sound editors to simply drop in a clip of a sword fight and automatically get a synchronized audio clip of swords clashing.
Despite these powerful tools, we still have to figure out how to balance innovation with responsibilities like content ownership and respecting artists’ intentions for their work. Especially in commercial work, creators and clients alike can’t risk publishing content that infringes on protected work, even if it’s generated content, for obvious moral and liability reasons.
There’s still a lot of gray area as companies like Adobe work to normalize content authenticity credentials and governments move to regulate. Will existing A.I. content be subject to future copyright laws? How will commercially-safe data sets affect the diversity of content on the market?
Regarding copyright, Nik says it’s important for creators and clients to communicate openly about the risks and tread carefully. It helps to work with a professional creator like Nik, who has the experience to get the most out of A.I. tools while minimizing liability. After all, certain IP like Mickey Mouse is so widespread in our cultural imagery that a tool might inadvertently replicate something iconic. Ultimately, Nik says the responsibility is on creators to avoid infringing on someone else’s work.
As a well-known artist working closely with the world’s most prominent A.I. developer, Nik has a seat at the table where history is being made for artists. Several Creativity Squared guests like Gerfried Stocker, Marlies Wirth, and Domhnaill Hernon have discussed the critical role of artists in helping us interpret, interrogate, and steer technology to benefit society.
The same concept came up differently on a recent episode of Hard Fork with award-winning A.I.-collaborative filmmaker Paul Trillo. Host Kevin Roose asked Paul how he feels as an artist about helping an A.I. company develop and market a product that might result in fewer jobs for human artists. We posed the same question to Nik, as an artist in a similar position.
Paul told the Hard Fork hosts that he sees GenAI as an opportunity for those without skills to create new content, rather than as a money machine for creative industry executives. He also said that, as an artist with a stake in how A.I. evolves, he’d rather be involved in the process from the inside than looking in from the outside. Nik shared those thoughts as well.
For Nik, the diversity of the existing A.I. video projects out there testifies to A.I. being a net positive for creativity. He points to the collection of clips that OpenAI commissioned for Sora’s announcement to show how different artists like himself, Paul Trillo, and Josephine Miller can produce vastly different content with the same tool and access to the same data. He says that the initial buzz around the clips wasn’t just hype for the sake of A.I.; it was well-deserved admiration.
Nik believes that getting creatives involved was a smart move by OpenAI because artists bring a different perspective, pushing the boundaries of what engineers designed the tools to do. He compares it to the perfect marriage of both sides of the brain, creative and logical. At the end of the day, he says his partnership with OpenAI isn’t about being in a good position in case A.I. takes over; it’s simply about showcasing what humans can create with A.I. and seeing how the world reacts.
The main question for Nik is whether Sora will be a pro tool or not, as each answer carries different implications for when it ships publicly.
As the rest of us wait to get access to Sora, Nik is continuing to work with it on projects that he says he’s excited to come back and tell us about when they’re released.
Standing on the cusp of the next generation in generative video, Nik encourages reluctant or aspiring creators to stay ahead of the technology curve and dive in to curb their fears.
Thank you, Nik, for joining us on this special episode of Creativity Squared.
This show is produced and made possible by the team at PLAY Audio Agency: https://playaudioagency.com.
Creativity Squared is brought to you by Sociality Squared, a social media agency that understands the magic of bringing people together around what they value and love: http://socialitysquared.com.
Because it’s important to support artists, 10% of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized non-profit that supports over 150 arts organizations, projects, and independent artists.
Join Creativity Squared’s free weekly newsletter and become a premium supporter here.
TRANSCRIPT
Nik: And since I’ve gotten Sora, I will say that I feel this genuine childlike creativity kind of reawakened in me, where I’m getting to explore ideas that have just been sitting in a vault because who’s going to give me the money or, you know, resources to explore some stupid ideas that I had, but it’s so fun and invigorating to do it.
Helen: With the golden ticket to test out OpenAI’s Sora, Nik Kleverov joins Creativity Squared to discuss his experience with the text to video tool and the childlike creativity it unleashed in him. Nik is not only an Emmy grade storyteller, but is also an immigrant and the son of a non-conformist artist who the Soviet government persecuted.
Helen: Nik’s life story and varied experiences inform his diverse approach to creative work, from exploratory use of AI within storytelling to design-led title sequence work and live action commercials. He’s won numerous advertising and creative industry awards as a creative director.
Helen: As an entrepreneur and executive, he has started and operated successful and innovative business enterprises, including Native Foreign. He’s the co-founder and chief creative officer at the woman-owned creative agency/production company hybrid based in LA that’s worked with OpenAI on different projects.
Helen: I first connected with Nik through Chad Nelson, who’s been a guest on the show to discuss the first ever animated short film that was 100 percent designed using AI visuals with DALL·E. Nik and Chad collaborated on the groundbreaking short. Today you’ll hear Nik’s experience working with Sora, in addition to other text to video tools.
Helen: Nik’s perspective is informed by working with clients on commercials, as a filmmaker, and on a team creating original IP. Nik also shares an agency owner’s perspective discussing why Native Foreign is going all in as next gen creatives and establishing an AI labs division within the agency. Tune in to hear about one of Nik’s creative projects he got out of his vault to bring to life and how it will evolve as gen AI video does as well.
Helen: He dives into the creative process, the ever changing AI landscape, and the paradigm shift underway in the advertising industry. Artists need to be part of the development of AI tools. And this is your chance to hear what Nik is sharing with OpenAI related to Sora and how you can stay ahead of the curve with AI and storytelling.
Helen: Enjoy.
Helen: Welcome to Creativity Squared. Discover how creatives are collaborating with artificial intelligence in your inbox, on YouTube, and on your preferred podcast platform. Hi, I’m Helen Todd, your host, and I’m so excited to have you join the weekly conversations I’m having with amazing pioneers in the space.
Helen: The intention of these conversations is to ignite our collective imagination at the intersection of AI and creativity to envision a world where artists thrive.
Helen: Nik, welcome to Creativity Squared. It’s so good to have you on the show.
Nik: Hey, Helen, how are you?
Helen: I am good. We’ve actually got to meet in person twice, first at the Tribeca Film Festival and then South by Southwest this past March. So it’s so wonderful to have you on the show.
Nik: Yes. Likewise. Thank you.
Helen: Oh, and how we first got connected was through Chad Nelson. Cause you collaborated with him on Critterz and goodness, I think that was like episode three. It was right after Critterz had come out and I think I said congrats on Instagram to Chad and he replied. I was like, “Oh, would you be open to being on the podcast?”
Helen: And, you know, fast forward, here we are almost a year later. But for those who are meeting you for the first time, can you introduce yourself and a bit about your origin story? And then we’ll dive into the foray with AI and Critterz and go from there.
Nik: Yes. I’m Nik Kleverov. I’m the chief creative officer and co founder of Native Foreign.
Nik: We’re a creative agency/production company hybrid here in Los Angeles, California. I come from a line of filmmaking and all sorts of creative, including design. I worked on a lot of notable main title sequences, including Narcos and Bloodline and several others. So we have a bit of an ethos at our agency to bring a high level design and integrity into our work.
Nik: We are primarily brand work, but we are also in the entertainment side and we have a long term goal of creating IP.
Helen: Amazing. And I also saw, which I didn’t know when we first met, that you have a WBENC status because it’s women owned as well. So I didn’t know if you wanted to plug that as well.
Nik: It is, yes. We are officially a women owned business. Thank you. My partner, Rebecca, is at the helm of this. She started the company and I actually came on board after. So we have an initiative to, you know, employ and support the work of female and diverse artists and creatives all over the world. And it’s honestly just such an honor.
Helen: As a female founder, I always love to hear that and other businesses that support diverse voices in all forms.
Nik: There was a big conference that Rebecca just went to in Denver. I don’t know if you’ve heard of it, but you know, they do, I think like an annual conference and all sorts of different events that are really cool.
Nik: And I think some of them you can go to even if you aren’t WBENC, you know, if you’re aspiring or you just want to connect with like other female founders and female executives and stuff like that. So I haven’t been, but I heard that it’s great and you should check it out.
Helen: It’s on my list. The application process is very long, so maybe when I take a break from the podcast, I will actually get to filling out all the application work.
Helen: Well, since you are one of the very special few who have gotten access to Sora and the text to video, which is all the hot news in AI, I want to dive into that, but let’s start with how the relationship initially started and that was with Critterz. And for those who haven’t heard about Critterz before can you introduce that project and kind of your relationship with OpenAI as well to kind of set the stage?
Nik: Yeah. So Critterz is actually the first animated short film that is a hundred percent designed using AI generated visuals with DALL·E. At the time, nobody had done it and Chad, who’s been a friend now for quite a while, was kicking around some ideas. And, you know, he talked to me about this kind of world.
Nik: And he was one of the first DALL·E users. He brought me into that. I got a chance to play around with stuff, but you know, Chad’s a really creative and industrious guy. So he created this world of like forest critters and started kicking around the idea of like a story and, you know, who these kind of critters are.
Nik: And the point of the whole critter story is that everybody has assumptions about who someone is, but deep down inside, you know, we’re very deep and layered. And the critters are [a] representation of people. And so the kind of short, the way that it goes, and you can watch the short critters dot – critters with a Z dot TV (critterz.tv).
Nik: The way that the short is, and the way we kind of pitched it is, it’s a David Attenborough planet earth style film that clashes into a Monty Python sketch. So we wanted to highlight and show off the beautiful visuals that were possible with AI tools back in the stone ages of ‘22 with DALL·E. DALL·E 2 is when we were creating these things.
Nik: And, you know, you’re kind of going through this and we brought the images to life with very traditional means, meaning like we roto-ed everything. We separated all the layers. We used inpainting and outpainting to add stuff in, and then we created the Z depth and then we hooked up the characters to Unreal Engine and had their mouths moving and stuff.
Nik: But anyway, you’re kind of, you’re going through this beautiful world. You’re seeing these wonderful critters that, you know, were kind of created with AI. I put in quotes because just like anything, it was an iterative process that, you know, I think there might be misconceptions about AI that you press a button and it’s done.
Nik: But I mean, there was like hundreds of iterations of each character fixing limbs or eyes or things until like they were exactly the way that Chad envisioned them. And we’re going through this beautiful world. And then the critters all start talking back to this omniscient narrator. And so, you know, they’re like, that’s actually all wrong.
Nik: And you know, this like ferocious red spider is actually named Blue, the red spider. And, you know, everybody has their own stories and their own things. And so it’s… I think it picked up some steam. It got us a lot of press and went into like many film festivals. I think we’re up to like 16 or 18 now, but you know, I think the, it’s, it was a little different than a lot of the AI stuff that was coming out at that time, which was also very like, tech demo-y feeling, you know, it wasn’t storytelling in that traditional sense.
Nik: Funny enough, I still feel like there aren’t that many things like Critterz out there. A lot of the kind of AI films I see are definitely still more on like the artsy side. And [00:10:17] there are certainly the capabilities and tools to be doing more storytelling, you know, with each day that passes, but anyway. So after we pitched Critterz to Open[AI] and got basically a small grant to create it, that created a relationship with us and them. Open[AI] actually hired Native Foreign for a couple of things.
Nik: One of them, probably the most notable one, was this DevDay film. So they had their first annual DevDay, which is their keynote. So at the top of it, I believe right before Sam came out there was a film that played and the film was a bit of a ChatGPT retrospective. So I interviewed hundreds of people all over the world and just about their experience with ChatGPT in its first year, because the timing of it, ChatGPT had just been out for right about a year and all walks of life, all different types of people with this kind of common unifying thing that in a sense, it helps you enjoy more time to be human.
Nik: Which I think was the thesis and kind of what I discovered from just naturally talking to all these people. And so we put together a little poignant film that played at the top of it. But anyway, I think it was a mixture of things and, you know, our kind of relationship with them and doing some of those creative projects together that got me the invite to Sora, which I’m extremely grateful for every day, because it’s been a blast.
Helen: Yeah, that’s amazing. And congratulations on the, I mean, I know I follow you on LinkedIn and Instagram, and it seems like, yeah, you guys were in Cannes and we saw each other at Tribeca, but yeah, it seemed like a very amazing year just in getting to do the festival circuit with the film.
Helen: And it was really nice getting to watch it on a big screen at the Tribeca Film Festival because the quality of Critterz I mean, it looks so good on a big screen too. So congratulations.
Nik: Thank you.
Helen: And I tried to – I think I applied too late for that DevDay. So maybe next year I’ll apply early enough and get to go.
Nik: Hopefully we’ll get to do something again with them for that.
Nik: But yeah, I mean, you said you’re used to watching on a screen like this and you’re like, yeah, this is cool for that. And then you’re like, Oh, no way. The digital cinema print comes back and you watch it. There was another film festival that played it in IMAX and I was a little worried.
Helen: Oh wow.
Nik: How is this going to hold up in IMAX?
Nik: And I gotta tell you, it was just, incredible. You know, seeing it so big. It gives you hope that, you know what, these tools can actually create a real product that people can go watch and enjoy.
Helen: Well, I think this is a good segue because Critterz and you said it, all the imagery was created with DALL·E but it was actually animated and brought to film life with traditional filmmaking tools.
Helen: So it was kind of like a combination of DALL·E and traditional filmmaking. And now with text to video with Sora, we’re getting more into the gen AI tools actually making the video itself. So, I’d love to hear kind of your experience with Sora. Cause you’re one of the few, you have like the golden ticket, the Willy Wonka golden ticket, playing with it and kind of where it’s at, your experience and where you see it’s going to just cause you’ve already been so knee deep in the evolution of it. That was a lot of questions piled into one.
Nik: My father was a notable artist in Soviet Russia and in Soviet Russia, you were only allowed to paint socialist realism and he didn’t do that.
Nik: And so because of that, he was a bit of an enemy of the state. Certainly the KGB had their eye on him and his whole group of artists, friends, he ended up in jail, he got out. We basically left the Soviet Union and so I’m an immigrant, although I came to America when I was quite young.
Nik: But I say that to the point of, you know, his work now is hanging in famous museums all over the world. But if you walked up to me and gave me a piece of paper and asked me to draw a stick figure pony, it would look like a five year old’s. I am not that kind of artist. But I did inherit some sort of creativity from him and I think it’s come in the sense and sort of storytelling that I have always been into.
Nik: I’ve always walked around with a camera since I was a young boy filming everything. I mean, almost to like an “American Beauty” point where I’m just like constantly – I have like so much footage of just my life, which is kind of funny, but filming things, telling little stories, making short films where you’re editing in camera, literally.
Nik: “Okay, we got that shot. Now let’s get this. Now we need the closeup. But now the -” you know, so I’m the type of creative that’s always thought in motion and in sequences and in stories. And some of the, like the [generative] text to image tools are incredible, and I’ve certainly gotten a lot out of utilizing them. However, for me, the ability to think in motion and think in sequences and stories, that’s where Sora truly excels.
Nik: Sora is really incredible because it thinks in X, Y, and Z space and Z not just being depth, but also being time. And it’s almost like when you create your prompt and you create your scene, it can somehow keep track of, say the person or object doing the thing you asked it to do through some sort of space.
Nik: And obviously I’m not an engineer. I have no idea how it works, but it is for me, the most useful AI tool I’ve ever used, because obviously as a creative director, text to image is fantastic for communicating ideas and helping people see your vision and heck in some cases, being used as some degree in finished work or a portion of that, but to actually be able to say, okay, here’s a story in my head or a sequence or a scene and being able to see that happen, even if it’s not perfect, is kind of amazing.
Helen: That’s very cool. And I do that with photos. Like I have so many photos. And one thing that, a question that came to mind as you were saying that, have you either built your own model or have thought about it of loading your entire, you know, history of all the video and then working with that? Cause that, I mean, as far as like a personal or I forget what they call them, but the personal language models. That sounds like a very fun toy to play with all of your life’s work of footage that you’ve captured since you were a little kid.
Nik: That is a really good idea and I will look into that.
Helen: I think that aspect for creatives or even just anyone that has access to all of their IP and work and to play with it in real time I think is a really interesting aspect of AI that’s coming. But going back to Sora so you kind of said some of the aspects that you really find interesting.
Helen: So what are some of the text to video prompts that you’ve used and how you’ve played with it and what’s been some of the most surprising aspects of working with Sora?
Nik: Creating clips just based off of text prompts is not the easiest thing in the world always. And as incredible as it is, sometimes I feel like it kind of gets me instantly. Sometimes the idea or concept is a little bit more complicated and does take some drilling down and wordsmithing to kind of get it to that point. But again, it not being perfect is kind of okay. I saw a bunch of people post online after my clip, my Sora clip came out.
Nik: There was one bicycle repair shop in it. And I just saw people say like, “bicycle repike” because it was spelled wrong, you know, and they’re like, “this is ridiculous.” Like, but to me, I come from a world now where, you know, our bread and butter is very much brand storytelling and commercial work and every frame of every commercial that we do is processed.
Nik: Okay? Like, it doesn’t matter if it’s filmed or CG or animated. Every single frame, even if it’s a person sitting on a couch talking to someone, there’s a smudge on the wall, there’s a hair on the couch, you know, there’s like, we live in this world where everything has been so perfected and so kind of glossy in a sense.
Nik: And now also, I think that people kind of expect that. And when things aren’t pristine, it becomes a bit of a separator between amateur work and professional work. And honestly, with some of the new tools that Adobe has been announcing, it seems like that stuff is going to be extremely easy to do, the stuff that, you know, comp artists and VFX masters have been doing.
Nik: I mean, a lot of this stuff is definitely becoming more democratized in the sense that you’re going to be able to do it on your own in some senses. I mean, obviously I haven’t played with those tools from that video that they announced, but I’m curious, you know, how well they really work.
Nik: But again, even if we were to ultimately use Sora clips for finished client work, I don’t need it to be a hundred percent done out of the engine. It’s okay, because we can still finagle and, you know, adjust and add things and change and tweak as we would anyway, even if we went out and shot the actual shot with regular cameras and all that kind of stuff.
Nik: So I’m okay with it not being a perfect thing because it’s pretty damn close. And again, this is the first iteration of what it could be. So I’m excited to also see what the future of it’s going to be.
Helen: And I appreciate that you mentioned kind of all the other, I guess, tech stack involved with filmmaking and production of commercials with the VFX and all of the tools.
Helen: How do you see Sora and text to video fitting in the tech stack? Because in the Hard Fork interview, which I know we were talking about before we started recording, Paul Trillo mentioned that he kind of sees it as B-roll. How do you see it in your creative workflow, or in the tech stack, as you’re experimenting with it now?
Nik: I think in the future, the AI tools will be kind of our entire chest of things that we use along with our traditional NLEs and all that. Sure, it can be B-roll, [but] I think immediately, and in the very immediate future, it’s going to be used to iterate and to basically create animatics to sell through ideas. You know, brainstorm on things like that.
Nik: And I think in general, AI tools have been really, really instrumental in allowing people to iterate at near finished level work. And that’s been kind of the big shift that has happened over the last year or so, right? You’re able to really let people know what your vision is and I think that’s part of this democratization of creativity that we’re seeing.
Nik: But yeah, in the immediate term, I think it’s that: being able to create an almost shot-by-shot animatic. I mean, we already do that now for commercials, right? Not only do we storyboard, but we cut animatics before we shoot anything, so everybody’s on board. Everybody knows what they’re going to get. There’s no major surprises when you get on set for a commercial, like, “Oh my gosh, I never thought that was going to happen.”
Nik: It’s a little less exploratory in that way. But yeah, I mean, if you can basically cut your scene and say, Hey, this is basically [what] we’re going to make, I think that’s the initial use of it. But I think as the tools continue to evolve, I do think there will be finished work elements to Paul’s point. Yes. There will be B-roll applications.
Nik: I would be curious if there’s a world where, if I’m a filmmaker and I’ve made a film, let’s say it’s an indie film and I don’t have money for pickup shots. What if I uploaded a cut of my film that was graded and had all the kind of like tinkering to it, to a data set and all my raw footage. And then I was able to generate potentially some sort of, you know, pickup shots, B-roll.
Nik: Other, like, maybe special effects shots that we didn’t have the budget for. I think there was a guy at the Oscars this year who was saying, “Why are we greenlighting one $200 million movie? Let’s greenlight fifty $4 million movies instead.” And I think that is the direction we’re going to head. And the cool thing about that is we’re going to get to hear more stories.
Nik: More people are going to be able to tell more things. So there’s kind of this, “Oh, AI is going to take our jobs.” But in my world also, I’ve seen a lot of budgets coming down, jobs just going away. So in a sense, I’d rather have a client take a chance on a job that can be done for a little less, even if that’s more work.
Nik: It might not be at the same level of finish, but, you know, if we get more of those projects and we’re able to keep employing all of our designers and creatives, to me that’s actually kind of cool. Because then a lot of my designers and people that work with us have been able to spend less time doing menial tasks and more time doing creative tasks, which I think is what everybody ultimately wants to do.
Nik: Nobody gets into this saying, “Oh gosh, I can’t wait to rotoscope some stuff out today. It’s going to be awesome.” So the more we’re able to empower creatives to flex their voice, I think the better.
Helen: I agree. There was a gal I spoke with at SXSW two years ago, and we were talking about text to image. She’s at an ad creative agency.
Helen: And you know, a lot of the time, especially when you’re doing social media ads, you’re just creating hundreds of versions of the same thing, changing the colors. No one thinks that’s fun. And if you can outsource that to AI, or delegate it to AI... there’s a lot of grunt work in production and advertising and creative.
Nik: For sure.
Helen: But yeah. I don’t know if I told you, but when I first got a demo of ChatGPT, it was in October of 2022. I have a miniseries in my head that I’ve been playing with, and the first time I ever played with it was the opening scene, and it captured my imagination. I was like, “Oh, maybe I’ll have the first miniseries ever done with AI.” But I launched the podcast instead. Maybe as these tools evolve, I might actually get around to it.
Nik: There’s one thing you just said which I just want to speak to. I’ve taken the very traditional route: I started as a creative, and I’m still creative, obviously, but when you’re a partner at an agency and you have the day-in, day-out of work, your perspective and your focus change just naturally, right?
Nik: And you do have to think about the bottom line and other things more; it’s just part of what happens. And since I’ve gotten Sora, I will say that I feel this genuine childlike creativity kind of reawakened in me, where I’m getting to explore ideas that have just been sitting in a vault, because who’s going to give me the money or resources to explore some stupid ideas that I had? But it’s so fun and invigorating to do it.
Nik: And [it] kind of makes something that’s either real or real-ish. You can get your idea out onto paper, so to speak. And that’s part of what’s been really fun, and why I’ve fully reorganized my sleeping schedule around rendering Sora clips and, you know, putting together little projects.
Helen: I love that. And I saw an interesting interview with Android Jones and Sam Altman. It was actually at Burning Man 2022. We’re going to do a blog post on it. And since you mentioned the evolution of AI, what Sam Altman said in that discussion is that we’re soon just going to be able to talk to computers and, in real time, have them render what we’re thinking.
Helen: And it reminded me of Westworld season four. Did you happen to see that by any chance?
Nik: I did not even realize it went up to season four. Is it really? Is it still going?
Helen: You have to go through season four, it’s worth it. Season four was the last one.
Nik: I’m behind on a few shows. I got to really catch up.
Helen: Well, the main character is a game maker and she literally just speaks and like these holograms come up for the gameplay.
Helen: And that’s how I imagine it, as Sam Altman was saying: we’re going to speak and have things rendered in real time, maybe in 2D, but with all of this spatial computing, we might see that in 3D too. So I think it’s only going to amplify that play box and that childlike wonder. You know, these tools are just going to go crazy with their capabilities.
Helen: I think it’s only going to get wilder on what we can do in real time. Yeah.
Nik: It is wild. Fun fact, the original 1973 Westworld film is the first film ever in movie history that had a heads up display or a HUD of a robot’s perspective. So the robots in that movie, there’s some scenes where you can kind of see through their POV and that was the first time it had ever been done in cinema, which is pretty fun.
Helen: I did not know that, but I will have to go back and watch that film.
Nik: The movie is very cheesy. It is nothing like the HBO masterwork, but I mean, it’s a masterwork in its own way and obviously set the tone for, you know, that whole franchise and series to go. So.
Helen: Yeah, very cool. Well, one thing I wanted to make sure we talked about is where Sora is at as a research tool.
Helen: And something that you said that I think [is] really interesting is that Native Foreign has set up a whole AI labs division as an experimental playground, or sandbox. Your reasoning is that you really just want to understand the tools and where they’re going, so you guys are going all in. So I’d love to hear, you know, your thoughts on how you’re experimenting, researching, and playing with it at Native Foreign.
Nik: Oh, absolutely. Yes. So we are, you know, a creative agency and production company that deals mostly with brands, but also entertainment. I mean, we do Netflix and Amazon titles and work and all that kind of stuff. But as we’ve been playing with these AI tools, we see how incredible the future looks with integrating this stuff into our pipeline.
Nik: So we’ve started an entire AI labs division where we have a fund and we have projects that we’re doing, and some of them are, you know, client work, and we’ve had some clients that are pretty open and really cool about, hey, like we’re going to take a chance on something and we understand that it’s this kind of nascent space, but let’s see what happens if we try something out.
Nik: So, you know, the interesting thing is we’ve had this vision for creating IP and launching IP, and it’s taken a little longer than we wanted it to. However, right now with the AI tools, we see quite an opportunity to actually jumpstart that a bit and be able to create more. We saw with Critterz that we were actually able to create a true piece of IP, and, you know, we have other plans for that as well, and other projects as well.
Nik: So there’s kind of this very encouraging space we wanted to create within our shop that says, hey, we are next gen creatives, and we are embracing the tools and tech that are becoming available. And we’re trying out all sorts of stuff. Some of them might be huge wins and some of them might be flops, but it’s okay to fail, and it’s okay to create and see what’s new.
Nik: You know, another project I’d been wanting to do for a long time: I’m a big Anton Chekhov fan, and there’s a short story he wrote called The Kiss. The Kiss is about this kind of lowly infantry military man who goes to a party at a general’s mansion. And this is, you know, 19th century Russian literature.
Nik: And he’s a bit of a loner, and while the party is going on, he wanders off into some dark hallways and stumbles upon this woman. The woman comes up to him, grabs him, and gives him a kiss because she thinks he’s someone else. She realizes he’s not the person she thought, and she runs away.
Nik: He’s, like, in love. He couldn’t really make out her face in the darkness, so he didn’t know who she was. He goes down the hallways, tries to find her, can’t find her. And then, in very traditional Russian literature manner, this totally destroys his life. He’s just obsessed with this person that he kissed once, randomly, in a dark room at a party.
Nik: And he just dies this old, lonely man. So I had used ChatGPT to basically help me rewrite the short story into first-person prose, so it could be VO. And then of course I tweaked it and workshopped it. I’ll also say, you know, I am Russian Ukrainian, and I speak it, but my writing is not great.
Nik: I’ll just admit, you know, I grew up here, so I’m very conversational, but I’m not writing any Dostoyevsky anytime soon. And I find ChatGPT is much better at translating things than, like, a traditional Google method.
Nik: So it actually helped me translate the prose that I wrote and rewrote with ChatGPT into Russian, and then I got that recorded as VO. And for that one, I actually used a mixture of DALL·E and Runway. Runway Gen-2 basically brings motion to images, right? So I was creating the looks of the images that I wanted, and then I was using Runway to bring this story to life. And there are a lot of Runway trailers out right now.
Nik: You know, a lot of people are playing around with it and it’s cool to see people bringing all sorts of ideas to life, but that was another one where it’s just, there’s no aim. There’s no particular goal other than I’ve been wanting to make this short for a long time, but again, like, how am I going to shoot a period piece in Russia?
Nik: That’s just a short film that has no purpose other than just existing. And what I did was I created this film, it’s about two minutes long and I called it The Kiss 1.0 because I very decidedly made the plan to keep updating the same film over and over again.
Nik: And then I’ll make a Kiss 2.0 and a Kiss 3.0 with different tools as they evolve. And in the end, I’ll have them all playing at the same time on a big wall, and you can walk from one to the other. In some senses, it’ll be a test of time of how far the AI tools have come.
Nik: But also I think it’s interesting to see how AI influences creativity, how creatives work with AI, and how different creatives interpret the tools in different ways. I mean, just like artists: same paintbrushes, same paints, and wildly different results. And I think that’s kind of fascinating.
Helen: I love this and I can’t wait to see it. I don’t know if it’s something that we can embed with this interview or if it hasn’t been released yet.
Nik: It’s up on our AI labs. It’s there. Yeah. We finished it at the very end of last year. So I definitely think at some point soon there should be a Kiss 2.0 with some sort of, you know, other text to video tools integrated for sure.
Helen: Yeah. I love that. And I love that you’re gonna do different versions of it with a different tool.
Helen: So very, very cool. And since you did mention experimenting with different text to video tools, and you mentioned Runway, can you walk us through the differences between the text to video players right now, and maybe a strength of Sora over the other ones, or strengths the other ones have that Sora might not?
Nik: Well, I mean, Sora, right now anyway, is just text to video.
Nik: So it’s just that: you have to type in words, and it’s incredible and incredibly powerful and amazing, as I’ve already said. But Runway is awesome as well. Gen-1 and Gen-2 are really interesting tools. Gen-1 allows you to upload, you know, video and change the looks of it, which is kind of fun and exploratory, and Gen-2 brings motion to stills.
Nik: So that’s really useful if you have a very specific look in mind. I think the limitation, anyway, as of the time of this interview, is, you know, you get four seconds, and really only the first couple of seconds look kind of good, and then it kind of devolves. So it’s limiting in that way, but [it] certainly [has] been really helpful to experiment and get my hands, you know, involved in AI and stuff.
Nik: And Pika is really cool too. Similarly, they have text to video. It is obviously trained on different data than Runway; I mean, the look and vibe of Pika is definitely different. And I’ll say that Pika is developing some other niche things that I think are really interesting. There is, like, a sound effects generator where you upload a clip and it will create sound design around that. Right now it’s still text input and text based, but it’ll be really cool if that gets to a point where you just upload a clip, say a sword fight, and then it can actually look at your clip and be like, [sword SFX noises].
Nik: You know, like, so that, again, like I do think the future of creativity is us sitting and with our creative brains and thinking about things and be like, “Oh, well, I use this for this. And I love the way this sound designer works. And I love,” you know, so like, kind of creating your own little suite of new tools that are not just post tools, but also production tools that you’re using from wherever you are.
Helen: That’s very cool. I feel like I saw a demo on Instagram where it was an all-in-one tool where the sound effects were added. I probably saved it, so I’ll have to find it. And I’ll also mention all of the links and stuff that we’re talking about. We’ll be sure to link them on the dedicated blog post that accompanies this interview as well.
Helen: And since you mentioned training, I would be remiss if we didn’t talk about some of the concerns related to how some of these models have been trained: making sure that they’re commercially safe, especially from a client perspective, and just taking care of all stakeholders, especially artists who may or may not have consented to the models, depending on which ones we’re talking about.
Helen: So how do you think about that when you’re in conversation, either in your labs or with your clients, as you’re dipping your toes in and experimenting with AI in your AI labs?
Nik: Yeah. I mean, of course we’re in a very nascent stage of all of this. And I think there are just some things that any client that’s working with this stuff needs to understand, which is, hey, there are bigger things at play here than just us.
Nik: You know, we are not developing the AI technology on our end, but we have gotten really good at knowing how to use it, the workarounds, and kind of the best ways forward with AI storytelling. But certainly, in the same way that you could Photoshop Mickey Mouse going down Times Square on a skateboard or whatever, if the tools do accidentally make something, I do think it’s kind of your responsibility to be like, “Oh, well, let’s not do that, because that is a clear rip-off of something. How did that get through?”
Nik: You know, say if you made something in Midjourney or whatever. But as far as being utilized as final assets, I think we need to go through all those steps still. And I’m sure that every major player is doing the most they can, because I’m sure they all want to be able to use this as enterprise solutions.
Helen: Can you expand on what you mean by going through all of those steps? Because I could understand that from two perspectives: one, training their own models and creating the data, which I think would be really hard, or working with commercially safe entities like Adobe. Anything that comes from Adobe Firefly, the artists have consented and get compensation and that type of thing.
Helen: So, yeah. Can you just expand on what you meant by that?
Nik: I think we’re heading into that world. There’s kind of a funny thing with some of the stock libraries too. I think it’s going to be, “Well, who were you trained on? Where is your tech trained?” And, “We’re trained on, like, a really cool stock library.”
Nik: Like, you know, like a Filmsupply or something. It’s like, “Oh, well, we have Filmsupply, and so we have cool artists and filmmakers that are making amazing visuals, and so our stuff is going to look cooler because it’s trained on that type of data.” Obviously, I can’t speak to what’s being used to train.
Nik: I try to do my own due diligence of, hey, let’s make sure that we’re not creating something that looks like something else, and we’re trying to, you know, safeguard creativity. But yeah, it is still very new, so I’m sure things are going to happen. Like, if you make something now, is it grandfathered in somehow? As long as there’s nothing that’s provable, I mean, who knows?
Nik: So we’re definitely in this kind of interesting world of sheer creative power, kind of being able to flex. And then I think everything else is going to kind of have to catch up.
Helen: Oh, one thing that’s interesting, and I’m slightly obsessed with the Sam Altman and Android Jones interviews, because they’ve actually done two so far, 15 months apart.
Helen: And in the first interview, Android Jones actually told Sam Altman on stage at Burning Man that he had considered whether this was a Hitler moment and he should put his life at risk and kill Sam Altman on stage. And then afterwards it’s like, no, you’re safe. But he really was this voice for all of the concerns of artists.
Helen: And then 15 months later, they got back on stage. I think it was in Oakland; it was either Vegas or Oakland. And in that timeframe, Android Jones’ barn, with all of his life’s work, had burnt down. So he had a whole different energy about life and the world.
Helen: But the conversation shifted more to artists being part of the conversation, and again, Android Jones kind of representing what artists want. And at the end, Sam was like, “Give me 15 more months and we’ll get back on stage.” I say this because on the Sora website, OpenAI.com/Sora, they say they are working with creatives like you and policymakers before releasing it more widely, to get artists’ feedback about their concerns.
Helen: So I’m curious, what would be your wish list for these tools, to make them as mutually beneficial as possible for all stakeholders: you using them, the artists whose work trains the data, everyone?
Nik: For me, it’s all about: is it a pro tool or not? So a lot of the feedback I’ve given has been in that realm and regard: here’s what I need for me to use it as a pro tool in our various kinds of situations.
Nik: And I think as an agency owner, I would like to know that, hey, this is good to use. You know, I think that that is the big stuff. And certainly Adobe is making strides to plant that flag and say, hey, you know, the Firefly stuff is good to use.
Nik: And I think it’s in the best interest of everyone that’s building technology in this space that they kind of do that, if they want it to be a pro tool. Now, if they don’t want it to be a pro tool, then maybe it’s not as pertinent.
Helen: Yeah, that’s fair. And one thing I just have to plug every time, since you mentioned Adobe: anyone who’s listened to the show knows that I’m a big fan of the Content Authenticity Initiative, and I’m a member. They have the C2PA standard, which is metadata that goes into any media file for the creators.
Helen: If they want to show that, especially in journalism, it’s super important. And then consumers, or whoever’s consuming the content, have the option to dig in deeper to see if any media file has been touched or not touched by AI. OpenAI’s website says they’re going to integrate the C2PA standard in, I think, one of the upcoming versions of Sora.
Helen: So that’s really great, just in terms of the standard getting more widely adopted, so we all can understand what’s been touched by AI and what hasn’t. So I just wanted to give a plug. I don’t think it’s in the current one yet.
Nik: I think it’s important, because what are those stats that everyone throws around? That in the next 10 years, 90 percent of all images and video are going to be generated.
Nik: So I think there will be a real value to understanding and knowing when something is AI or not.
Helen: Yeah, and I’ve actually seen a few articles saying, especially with written text, that it’s almost already everywhere: with almost every text, you should almost assume it’s been touched by AI.
Helen: And I say that because even pre-ChatGPT and the gen AI tools, you already had all of the squiggly lines telling you when you’d misspelled words and stuff. It’s almost in everyone’s toolkit already.
Nik: Yeah, of course. I mean, AI in some sense has been around for a while. If you’ve used a clone stamp in Adobe Photoshop, you’ve been using AI for 20 years.
Nik: So you have to draw the line somewhere, I guess. But where is that drawn? You know, where is the kind of, “well, it is if it’s not, and it is if it is”? But yes, I’m also for that. And I think in the future it’ll be really valuable to know when something was really filmed, or really real, I guess.
Helen: Yeah. And it definitely has a lot of implications. I think in the creative space, there’s a lot more leeway in the tools and the toolkits. And I’ve heard arguments where some creatives are like, “I don’t need to tell you if I’ve used Photoshop or AI; it’s my prerogative to decide what tools I use.”
Helen: And I respect that opinion. But from a photo or journalistic perspective, we want to know, and, like, the Princess Kate issue raised concerns about manipulation and how it’s of public interest and whatnot. So I want to go back to the Hard Fork episode, because I thought they asked a pretty thought-provoking question to the filmmaker Paul Trillo.
Helen: So I’m going to read what they asked him from the transcript and get your feedback for this. So, and I don’t know if it was Casey or Kevin, but, “So I wonder if you could speak to this idea of OpenAI using artists and filmmakers to try to convince a skeptical public that all this stuff is just going to be good and it’s going to enhance creativity and that’s not going to replace anyone’s jobs while actually having a very different strategy behind the scenes?”
Helen: And then Paul’s answer is, “I’ll give you a second to think about it. Sure. I think it’s a very fascinating point and something that I grapple with all the time because I, again, I love to do it the traditional way. I still love to employ people, but then the other side of me is playing with all this new tech.
Helen: And I’m like, am I just some sort of pawn in this great master plan of AGI? But what’s the opposite of this? Is that you don’t want artists involved in the research process.” So I’m curious, since you’re one of the lucky artists who do get to play with Sora, kind of your reaction or your thoughts to this question and interview piece.
Nik: Well, first of all, shout out to Paul; the dude’s a beast. He’s incredible. I will say that I don’t think it’s that nefarious. I just know the people, and I don’t think anyone’s like [comically evil laugh]. I think that you can actually tell an interesting difference, and I think they curated the artists that they put out pretty nicely, because again, you can see the tool in different hands and how different creatives made stuff. You know, Paul versus shy kids versus myself and Josephine; everybody had pretty different-looking stuff.
Nik: And certainly the montage that I put together does maybe make you think about what future agency work looks like. I do have several more kinds of products in mind, and other storytelling things. I have, like, a film noir in mind and whatnot. But at the end of the day, I do think you can see the difference.
Nik: And I say this in the sense that the initial clips that they put out got a very big buzz and very big momentum, because they are incredible. But as great as those clips are, when they opened it up to these artists and got the artist and creative perspective on utilizing the same tool, some of the results were so outstandingly different.
Nik: And I think maybe that was an aha moment for a lot of people: “Oh, wow, here’s the tool, which is awesome. And here’s the tool placed in the hands of creatives.” And I think it’s smart of them to get creatives involved, because it’s different sides of the brain, right? Creatives might not know this, but they’re going to push the tools and the things that the brilliant engineers made in ways that maybe they didn’t think about.
Nik: So I think it’s like this perfect marriage of both sides of the brain by doing something like that. And so I don’t think it’s like a pawn thing. I just think it’s like, well, here’s what we’ve made and how are, how’s the world going to react to it? And this is like a sneak peek or a taste of that.
Nik: And I think it’s exciting.
Helen: Thank you for sharing your perspective. I go back to the Sam Altman and Android Jones interview, where he said he’s willing to work with creatives to make it mutually beneficial. And since they’re working with creatives like you, I think it will come down to the execution and making sure that they follow through.
Helen: So we’ll see as they roll it out more what they do on that front.
Nik: We’ll see.
Helen: I’m cautiously optimistic.
Nik: We’ll see. I’m along for the ride like everyone else.
Helen: Yeah. I’m very excited about what’s going to happen too. And since, you know, I actually have a social media agency too, and I’m consulting with some other agencies that want to start integrating AI, for creative agencies who might just be cautious, or haven’t started playing with it or dipping their toes in:
Helen: what would be some of your advice, since you’ve fully dived into the deep end and embraced all things AI? What would you say to other agency owners in the industry?
Nik: Just try it. Honestly, the people that I talk to that are the most afraid of the tools, more times than not, I feel, have not really used them.
Nik: I think if you start using the tools, you’re going to see, “Oh, okay. Actually, we still need all these creatives and all this creativity to do the things we do.” So just try it, and don’t panic.
Helen: That’s fair. Well, I feel like we could go on and on; I could hear you talk about ideas that you’ve had boxed up and ready to come out, and I’m excited to see what you do with them.
Helen: Is there anything that you want to plug about Native Foreign or other projects in the pipeline before we wrap up the interview?
Nik: Helen, it’s been really good talking to you. I’d love to come back on when some of these other Sora based clips and things are able to be released and, you know, talk through the thinking on some of those.
Nik: I will just say, if anyone out there that’s listening wants to engage with a very forward-thinking agency that is thinking about next gen creative and next gen filmmaking, check out Native Foreign: NativeForeign.tv, and NativeForeign.ai is our AI labs. And we’re in sunny Los Angeles.
Nik: Making things happen over here.
Helen: One thing that I like to ask all of my guests is if you want our viewers and listeners to remember one thing, what’s that one thing that you want them to walk away with from today’s conversation?
Nik: I think the paradigm shift is underway, and I think it is a wise choice to get involved with next gen tools, at least be aware of what’s happening with them, and start playing around with them, because this is the direction things are headed. I mean, at a certain point we could have just said, “Hey, you know what? Boat travel is good and takes two months to cross the Atlantic. That’s fine by me. Let’s just stop here.”
Nik: But technology keeps advancing. It is, you know, the way things go, and this is the newest, coolest stuff that’s happening, and I think there’s a real, real benefit to staying ahead of the curve on the AI storytelling tools that are available.
Helen: I play with all of them for the same reason.
Helen: Well, it has been an absolute pleasure having you on the show today. So thank you so much, Nik, for sharing your time and insights and all the fun that you’ve got to have playing with Sora. And I’m so excited [for] when [it’ll be] released for everyone to, to dip my toes into it too. So thank you again.
Nik: Thank you, Helen. Thank you, Creativity Squared. Talk soon.
Helen: Thank you for spending some time with us today. We’re just getting started and would love your support. Subscribe to Creativity Squared on your preferred podcast platform and leave a review. It really helps. And I’d love to hear your feedback. What topics are you thinking about and want to dive into more?
Helen: I invite you to visit CreativitySquared.com to let me know. And while you’re there, be sure to sign up for our free weekly newsletter so you can easily stay on top of all the latest news at the intersection of AI and creativity. Because it’s so important to support artists, 10 percent of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized nonprofit that supports over 100 arts organizations.
Helen: Become a premium newsletter subscriber or leave a tip on the website to support this project and ArtsWave. And premium newsletter subscribers will receive NFTs of episode cover art and more extras to say thank you for helping bring my dream to life. And a big, big thank you to everyone who’s offered their time, energy, and encouragement and support so far.
Helen: I really appreciate it from the bottom of my heart. This show is produced and made possible by the team at Play Audio Agency. Until next week, keep creating.