Martin Grödl and Moritz Resl founded Process Studio, an experimental design studio based in Vienna that specializes in generative and interactive design for branding, web, installation, and print. In addition to traditional graphic design work, Process designs and develops highly specialized software used as tools for and by clients including MIT, Design Museum Holon, MAK – Museum of Applied Arts Vienna, the Royal Museums of Fine Arts of Belgium, and more.
In 2021, Process designed the official Austrian contribution to the London Design Biennale. In 2022, Grödl and Resl were appointed guest professors in visual communication at FH Salzburg. Their work has been featured in design publications such as Dezeen, WEAVE magazine, The Type Directors Club Annual, Gizmodo, and The New York Times, among others.
Martin is a designer, programmer, and artist who works on the boundaries between design, art, and digital technology. He participated multiple times in Google’s Summer of Code, working on further development of the Processing programming language, and was a guest artist and researcher for Motion Bank in Frankfurt. He has taught courses including Introduction to Programming, Physical Computing with Arduino, and Generative Graphic Design for Printmaking at the University of Applied Arts Vienna, and has tutored courses including Data Modeling, Database Systems, Distributed Systems, and Semi-structured Data at the Vienna University of Technology.
Martin Grödl holds a Master’s degree from the University of Applied Arts Vienna and a Bachelor’s degree from Vienna University of Technology.
Moritz is a creative director and designer who works at the intersection of design, art, and digital technology. Before co-founding Process, Moritz worked for Sagmeister & Walsh in New York City, where he designed for clients like Adobe and The Jewish Museum in Manhattan.
Moritz holds Master’s degrees in Computer Science and Media Art from the Vienna University of Technology and from the University of Applied Arts Vienna. His work has won international awards including Type Directors Club, ARC, and LACP Vision awards.
When they finished their degrees, they realized there were few career opportunities in their shared passion, so they forged their own path, focusing on designing and developing highly specialized software tools for their clients.
They marry their technical backgrounds in statistics, mathematics, and programming with their eye for design to carve out a unique place in digital art and computer science.
GenAI’s explosion over the last year put it on everyone’s map. Martin and Moritz were ahead of the boom, not just dabbling but developing artificial intelligence tools years before anyone had heard of ChatGPT.
One of their most prominent projects, UNCANNY VALUES: Artificial Intelligence & You, was featured as part of the Vienna Biennale for Change 2019. MAK – Museum of Applied Arts Vienna commissioned the team to produce a key visual identity for its exhibition on artificial intelligence. Around the same time, Nvidia had launched a project recreating hyper-realistic human faces instantly, relying on top-of-the-line, expensive hardware to achieve its results. Martin and Moritz only had their individual laptops.
The primary challenge? Finding a way of using less data, without access to high-performance equipment, while getting a similar result. The solution? Emojis.
The duo set out to train a neural network that would generate low-resolution emojis, like distorted versions of the little yellow characters we know and love. They call them AImojis.
They built a dataset of diverse emojis scraped from web pages and trained a model that mixes the statistical patterns it learned to create new visual output. Moritz describes the result as “very weird little faces.”
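To make the scale of this concrete: the kind of model that can be trained on a consumer laptop for 64×64-pixel images is small. The sketch below is a hypothetical illustration in PyTorch of a DCGAN-style generator at that resolution; it is not Process Studio’s actual architecture or code (which they haven’t published), just an example of the class of network involved.

```python
# Hypothetical sketch of a small DCGAN-style generator for 64x64 emoji-like
# images; NOT Process Studio's actual model, just an illustration of the
# scale of network that can be trained on a laptop.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_dim: int = 100, channels: int = 3, features: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            # latent vector z -> 4x4 feature map
            nn.ConvTranspose2d(z_dim, features * 8, 4, 1, 0, bias=False),
            nn.BatchNorm2d(features * 8),
            nn.ReLU(True),
            # 4x4 -> 8x8
            nn.ConvTranspose2d(features * 8, features * 4, 4, 2, 1, bias=False),
            nn.BatchNorm2d(features * 4),
            nn.ReLU(True),
            # 8x8 -> 16x16
            nn.ConvTranspose2d(features * 4, features * 2, 4, 2, 1, bias=False),
            nn.BatchNorm2d(features * 2),
            nn.ReLU(True),
            # 16x16 -> 32x32
            nn.ConvTranspose2d(features * 2, features, 4, 2, 1, bias=False),
            nn.BatchNorm2d(features),
            nn.ReLU(True),
            # 32x32 -> 64x64 RGB image with values in [-1, 1]
            nn.ConvTranspose2d(features, channels, 4, 2, 1, bias=False),
            nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

if __name__ == "__main__":
    g = Generator()
    z = torch.randn(16, 100, 1, 1)   # a batch of random latent vectors
    fake_emojis = g(z)               # -> (16, 3, 64, 64) "AImoji" candidates
    print(fake_emojis.shape)
```

A generator like this would be trained against a discriminator on the scraped emoji dataset; the “weird little faces” come from sampling random latent vectors once training settles.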
A high-quality, streamlined result was not the goal. Instead, the AImojis look blurry, confusing, and grungy. Martin and Moritz celebrated this low-res approach and showcased it across the city on large banners and posters, blowing the images up enormously to highlight the poor resolution and create shock value beyond emojis’ usual domain within our screens.
Grungy, unrefined, weird, uncanny, and broken are some of the words they use to describe the AImojis. Generated on aging laptops, the low-res characters are what Martin fondly describes as the work of “kindergarten A.I.”
UNCANNY VALUES contemplates the emotional exchange between humans and technology. A.I. is undoubtedly becoming more capable every day, but for what purpose, if not humanity’s?
Martin and Moritz emphasize A.I.’s dependence on the power of humans. Even if an emoji has a clear representation, we as humans still have the power to interpret it as we please, from person to person. Different cultures, generations, friend groups, and families may all use the same emojis in different ways.
When it comes to these uncanny, slightly disturbing AImojis, the added layer of not quite being able to tell what emotion they’re trying to convey is probably more reflective of human emotion than the everyday emojis on our phones, which come with (mostly) assigned values.
So these funky AImojis not only challenge our normal dialogue and expand it with new options; they also resist being put into a box. At the end of the day, AImojis, like countless other forms of communication and language, are just different types of expression, powered and controlled by humans.
With the system successfully creating these evocative visual emojis, the duo decided that their next foray into building A.I. tools would be in the world of typefaces. They used images containing text set in a wide variety of fonts to generate new typefaces, which resulted in an endlessly oscillating visual stream of new fonts. In the process, they created one that captured the spirit of their exhibition, dubbing it AIfont.
AIfont was born quite naturally and was technically similar to AImojis. With the system already set up, they started feeding it typefaces instead of emoji faces. Especially for someone in a creative field, a tool that can generate new fonts is a valuable addition to the arsenal.
Currently, anyone working with A.I. tools knows that they can produce images with incredible depth and detail, but prominent image generation models like Midjourney still struggle to generate images with legible text. Meanwhile, Martin and Moritz’s model was reliably producing stylized text four years ago, long before the recent GenAI boom.
Essentially, they hijacked a GenAI system that is not specifically focused on fonts and once again trained it on a dataset of individual font characters scraped from the internet, arranged as sprite sheets with each glyph in a fixed position. The output was new sprite sheets of letterforms, so rendering a specific word or sentence meant stitching the right letters together into an image rather than typing with an installable font.
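For illustration, a fixed-layout sprite sheet like the one described above could be rendered with Pillow, always placing each character in the same grid cell so the model learns where every glyph lives. This is a hypothetical sketch, not the studio’s actual pipeline; the folder names, cell size, and character set are assumptions.

```python
# Hypothetical sketch: render fixed-layout glyph sprite sheets from TTF files
# so a generative model always sees each character in the same grid cell.
from pathlib import Path
from PIL import Image, ImageDraw, ImageFont

CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"  # extend with digits/punctuation as needed
COLS, CELL = 8, 64                    # 8 glyphs per row, 64x64 px per cell
ROWS = -(-len(CHARS) // COLS)         # ceiling division

def render_sprite_sheet(font_path: str, out_path: str) -> None:
    sheet = Image.new("L", (COLS * CELL, ROWS * CELL), color=255)
    draw = ImageDraw.Draw(sheet)
    font = ImageFont.truetype(font_path, size=48)
    for i, ch in enumerate(CHARS):
        x, y = (i % COLS) * CELL, (i // COLS) * CELL
        # Center each glyph roughly inside its fixed cell.
        draw.text((x + CELL // 2, y + CELL // 2), ch, fill=0, font=font, anchor="mm")
    sheet.save(out_path)

if __name__ == "__main__":
    Path("sheets").mkdir(exist_ok=True)
    for ttf in Path("scraped_fonts").glob("*.ttf"):  # assumed folder of scraped fonts
        render_sprite_sheet(str(ttf), f"sheets/{ttf.stem}.png")
```

Feeding the model thousands of such sheets, one per scraped font, gives it a consistent layout to reproduce, which is what makes the trick work without any annotation of which blob of pixels is an “A” or a “B.”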
The team was getting requests to use the font, but these were roadblocked by the cumbersome process of receiving text, generating images, and sending them back. So, just this year, they decided to turn it into a real font: AIfont, a usable font that anybody can download, install, and use in ordinary design and text-processing programs. A typical typeface family consists of around 50 fonts, whereas AIfont has 500, one for each step of the model’s training. Not all of them are finished fonts, so the duo calls it a “typographic experiment,” offering a variety of bold new designs rather than professional tools. Martin shared that their approach came much more from an experimental, technical, generative point of view than from an attempt to produce a perfect font.
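According to Martin in the transcript below, AIfont was assembled as a variable font with a single axis corresponding to the training step. As a rough, hypothetical sketch of how one of the 500 instances might be frozen out of such a font, fontTools can list the axes and pin one value; the file name and the axis tag "TRNG" here are placeholders, not AIfont’s actual names, so the real tag should be read from the font’s fvar table first.

```python
# Hypothetical sketch: list a variable font's axes and freeze one instance
# with fontTools. "AIfont-VF.ttf" and the axis tag "TRNG" are placeholders,
# not the actual file name or axis tag of AIfont.
from fontTools.ttLib import TTFont
from fontTools.varLib.instancer import instantiateVariableFont

font = TTFont("AIfont-VF.ttf")
for axis in font["fvar"].axes:  # inspect the variation axes the font defines
    print(axis.axisTag, axis.minValue, axis.defaultValue, axis.maxValue)

# Pin the (assumed) training-step axis at one value, yielding a static font
# roughly equivalent to "font number 250 of 500".
static = instantiateVariableFont(font, {"TRNG": 250})
static.save("AIfont-step250.ttf")
```

The same axis can also be driven from any design tool that supports variable fonts, which is what turns the 500 training steps into ordinary, installable fonts.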
Midjourney, ChatGPT, DALL-E, and dozens of other A.I. tools at our disposal have given everyone, even non-technical users, the opportunity to play in the artificial intelligence sandbox. Martin and Moritz prefer to work on and use their own models.
Moritz explains that, while they definitely dabble to stay up to date and in the know, that’s about as far as they go with widely used A.I. tools.
Moritz says the decision about which tool to use should depend on the project, acknowledging that tools like ChatGPT and Midjourney are powerful creative aids. Martin says he sees transformative power in advanced GenAI being accessible to everyone.
Despite their degrees, experience, and expertise in the art community, Martin and Moritz say they don’t consider themselves A.I. artists. They identify as designers by craft who take new technological developments and incorporate them into their workflow.
Their passions align with simple, bold, low-resolution designs rather than the high-fidelity, seemingly ‘perfect’ results that mainstream GenAI tools produce.
Generative design is sneaking its way slowly but steadily into design curricula everywhere, and it’s no surprise, given everything happening in generative A.I., that schools are likely to adopt it much faster now.
So, when current design students ask them, ‘How is my future job affected by this industry?’, they share their opinions.
One path is to embrace it fully: dive into learning these tools to get ahead of the curve and focus primarily on the skill set of prompt writing or data curation, which they foresee becoming a prominent job in the future.
The other, they suggest, is to take a step back and think of one’s role as a designer as more of a directorial role: a curatorial role of defining a concept rather than producing a specific image or a specific text.
Martin also says that younger generations have an inherent advantage in being able to adapt to new technology as it arises.
Before the rise of these tools, Martin’s answer to the question of whether machines can be creative came easily: no.
But he says his perspective has shifted. ChatGPT is gradually becoming a general-purpose assistant with computer vision and listening capabilities. LLMs and other models excel at reproducing and building on what came before, much like the creative geniuses of our past, who all interacted with, learned from, and built upon the knowledge that existed before them.
Ultimately, Martin says that machines are creative in some sense; their power to produce mass quantities of images, for example, would take a human incomparably longer. Past that one element of creativity, however, Martin argues that the human is the creative force.
It can’t be forgotten that these tools, regardless of their level of creativity, are mirrors of who we are individually and as a society, and that we all have to decide together what that means. Even when things are generated by A.I., people still need to use their own judgment and critical thinking skills to move through life.
Thank you, Martin and Moritz, for being our guest on Creativity Squared.
This show is produced and made possible by the team at PLAY Audio Agency: https://playaudioagency.com.
Creativity Squared is brought to you by Sociality Squared, a social media agency that understands the magic of bringing people together around what they value and love: http://socialitysquared.com.
Because it’s important to support artists, 10% of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized non-profit that supports over 150 arts organizations, projects, and independent artists.
Join Creativity Squared’s free weekly newsletter and become a premium supporter here.
TRANSCRIPT
[00:00:00] Martin Gertl: We’re not isolated and all the big geniuses that we think of the peak creative minds, they were not on their own, not at all. They interacted, they learned from all their precursors and they built upon this knowledge. And this is the point that these tools do so well, but that’s only one part of creativity.
[00:00:24] Martin Gertl: You know, all that personal style, all that personal experience, all values that you learn, you know, this comes into play as well. And so I think in the end, yeah, I can admit that these tools are creative in this sense of they can produce images that I would have to spend hours and hours of learning to draw or whatever.
[00:00:50] Martin Gertl: But that’s that’s one part of creativity and we can now focus on maybe the other part.
[00:00:58] Helen Todd: Pushing the boundaries between design, art, and digital technology, Martin Grödl and Moritz Resl founded Process Studio, an experimental design studio based in Vienna that specializes in generative and interactive design for branding, web, installation, and print. As well as traditional graphic design solutions, Process designs and develops highly specialized software that is used as tools for and by clients who include MIT, Design Museum Holon, MAK – Museum of Applied Arts Vienna, the Royal Museums of Fine Arts of Belgium, and many more. In 2021, Process designed the official Austrian contribution to the London Design Biennale.
[00:01:41] Helen Todd: In 2022, Martin and Moritz were appointed as guest professors in visual communication at the Salzburg University of Applied Sciences. Their work has been featured in design publications like Dezeen, Weave Magazine, the Type Directors Club Annual, Gizmodo, and the New York Times, among others.
[00:02:03] Helen Todd: Martin is a designer, programmer, and artist. He participated multiple times in Google’s Summer of Code, working on further development of the Processing programming language, and was a guest artist and researcher for Motion Bank in Frankfurt. He’s taught courses on programming, generative graphic design for printmaking, data science, and more at the Vienna University of Technology. Moritz is a creative director and designer.
[00:02:25] Helen Todd: Previous to co-founding Process, Moritz worked for Sagmeister & Walsh in New York City where he designed for clients like Adobe and the Jewish Museum in Manhattan. I met Martin and Moritz through Lotte Kristoferitsch, who’s a member of the Austrian Chamber of Commerce / Bold Community. I’ve thoroughly enjoyed every provocative conversation I’ve had with them, and I’m excited to share this one with you.
[00:02:49] Helen Todd: Are machines creative? Will artificial intelligence replace creatives? We dive into these questions on today’s episode, along with exploring the role of design, critical thinking, and data curation, and helping envision the future in which we want to live. Listen to learn more about the custom built AI tools that Martin and Moritz designed for their AI emojis and AI fonts projects and how the grungy punk aesthetic of their emojis reflect our emotions. Martin and Moritz also discuss how we’re not separate from AI as the creators of the tools and how crucial it is that we understand how they work and the values that are fueling and driving them. Enjoy.
[00:03:39] Helen Todd: Welcome to Creativity Squared. Discover how creatives are collaborating with artificial intelligence in your inbox, on YouTube, and on your preferred podcast platform. Hi, I’m Helen Todd, your host, and I’m so excited to have you join the weekly conversations I’m having with amazing pioneers in this space.
[00:03:58] Helen Todd: The intention of these conversations is to ignite our collective imagination at the intersection of AI and creativity to envision a world where artists thrive.
[00:04:13] Helen Todd: Martin and Moritz, welcome to Creativity Squared. It’s so good to have you on the show.
[00:04:19] Moritz Russell: Hi. Nice to be here.
[00:04:20] Helen Todd: This is the first time we have two guests in one interview, which I’m super excited about. And Martin and Moritz are based in Vienna and Lotte put us in touch, so she gets credit for two different interview guests for today’s show.
[00:04:34] Helen Todd: And I met Lotte through the Austrian Chamber of Commerce’s amazing Bold Unconference and was so excited to meet Martin and Moritz cause they’ve done some really interesting things with AI and creativity, which we’ll dive into. But for those who are meeting you for the first time can you introduce yourself and your studio and kind of share your origin story with us?
[00:04:58] Moritz Russell: Okay. Hi, my name is Moritz Resl. I’m co-founder and creative director of Process Studio that we have founded eight years ago together with Martin.
[00:05:08] Martin Gertl: Hi, I’m Martin. I’m co-founder of Process Studio as well. So we work together as a small team now for about eight years, I think. So we actually work together as a two people studio and we’re called Process Studio.
[00:05:25] Martin Gertl: And I guess our our origin story is kind of, we have a similar background in that we both studied computer science and later on media art. And we met at the, like at the end of the computer science study and both went on to study art as well. And after that, after the diploma in, in, in media art, we kind of were in this in between world of, you know, generative art using computers for art and for design. And we thought about where we could work and we actually didn’t have any real option. So we just said, okay, let’s try try it ourselves and found a company. And that’s when we started with Process.
[00:06:10] Martin Gertl: And now that’s eight years ago. So, as a really small team, two people freelancers from time to time, but we haven’t really grown since, except for, you know, connections with other people. And well, in terms, what we do is mainly like graphic design and nowadays it’s called generative design or creative coding.
[00:06:36] Martin Gertl: So, we do a lot of you know, design projects and use our own tools, you know, use programming, use data, use all those tools, not. Traditionally associated maybe with arts and design. I mean, it’s such a big world and such a big field and everybody uses the computer of course, but maybe not in, in the, on that level.
[00:07:00] Moritz Russell: Yeah. It seems weird at first glance, but actually totally made sense to us. At least we combined this rather technical background or technical approach that we, we learned that, you know, the technical university with all statistics and mathematics and programming, which can be, you know, a very bumpy ride to get into, but we immediately knew that we also wanted to do something creative, which programming sometimes is not regarded as being creative, at least not in a sense that is commonly acknowledged.
[00:07:32] Moritz Russell: But, we were also quite interested in doing little experiments. And as Martin already said, doing this small little tools for individual projects and that actually happened quite naturally to combine, you know, building our own tools for the projects that we did, because we basically wanted to do something in the beginning.
[00:07:51] Moritz Russell: And then we found out, Hey, there is…there is no tool that does exactly this or that. And so we were kind of, left alone with, you know, with the idea of, you know, we, we just had to make our own tools in order to realize what we have thought of in the first place for the projects. So this combination nowadays is I think not so uncommon anymore.
[00:08:16] Moritz Russell: And we also can see also at universities, design universities, we are teaching their workshops and so on. And generative design is sneaking its way slowly, but steadily into the design curricula as well. And a lot of designers, you know, come in contact with programming. On a very approachable level as well.
[00:08:35] Moritz Russell: So it totally makes sense as a designer to not only be limited to the tools that are given to you by big companies, but also as a small team, you can basically build your own tools and fool around and experiment and do the stuff that does exactly what you want it to do. And not being left with, you know, only the capabilities of the software that you can buy.
[00:08:59] Helen Todd: I love that. And I have a feeling with all that’s happened in generative AI this year, that the schools will be adopting it a whole lot faster than pre 2023. But you all actually got into AI building some of your tools. Before the Gen AI blew up with the onset of chatGPT. So can you tell us about your first project and yeah, how you kind of built your own tool for your AI application?
[00:09:24] Martin Gertl: So I will just say how it led up to us working with AI and through that you know, approach. Using creative coding and coding in general for design projects. We got more and more involved with, I’d say like museums and clients you know, clients like this as opposed to commercial clients advertising agencies and so on, because they have these complex, more complex topics, I would say they have, you know, whole exhibitions and so on. And they were kind of, they like this approach that we can incorporate data maybe and interactivity and stuff like that. So, in two, 2019, the museum for Applied Arts Vienna called the Mach approached us to do a you know, key visual and visual identity for their exhibition on artificial intelligence.
[00:10:20] Moritz Russell: Yeah. And the subtitle. For this exhibition was artificial intelligence and you. So it was this kind of dialogue set up that happened. And Marlies Wirth and Paul Feigelfeld, the guest curator, they basically brought us in and they told us the first draft, the first concept of the exhibition and they left us with, you know, with the idea, maybe you can come up with some key visual, some visual idea that is also generated by an AI and we’re talking 2018. So, this was way before the big Gen AI hype really took off and the most most famous or one of the most famous things was this person doesn’t exist at that time.
[00:11:02] Moritz Russell: The project done by NVIDIA. And they were using, you know, they were basically creating faces instantly that looked quite realistically looked like real human beings, but they were all generated and you could see it because they had this uncanny teeth. You know, there were some imperfections with the ears and the eyes and the, you know, where they were cross-eyed sometimes, but in general, at a first glance, they all looked very realistically. And then we had to look into the tech that was used back then, the GPUs. And you know, they had, of course, you know, extremely expensive hardware that was needed to develop and train something like this. And we had only our laptops basically at that time.
[00:11:43] Moritz Russell: So, we needed to find a way of using less data, but having a similar result in the end from a method perspective, right? And we came up with the idea of using emojis. So, these little low-res faces and we use them as a training set. So, we had a scraper that would scrape, you know, all of the emojis from web pages and then we built our data sets coming from this, and then we basically had it create new emojis. It was mixing these statistical patterns that it learned to create new visual output. And that was basically very weird faces that came out. So, basically “uncanny,” which was the title of the exhibition, was super fitting to that visual output. And when we did the project, this was I think our third line, our third visual line, so our third idea that we were pitching to the team. And we said if you guys don’t want to use it, we have to do something with it because we were so, so interested and it was so uncanny and so felt so, so weird and new to us that we just had to do something with it.
[00:12:55] Moritz Russell: And the team, to our surprise, really loved it as well. And it was actually a no brainer. And then we went from there and we really celebrated this low res approach as well. So we had huge banners all over the city with posters and banners and and really big imagery, but using only the super small resolution of, I don’t know, 64×64 pixels that, that were forming the faces.
[00:13:23] Moritz Russell: And we blew them up enormously. And then we also, you know, created the whole graphic… the whole graphic part for the exhibition. So all the labels, all the explanation texts that would go onto the walls and so on. And we also needed a typeface for that as well. And then we basically had our system set up and thought, okay, maybe we can do something similar with typefaces.
[00:13:49] Moritz Russell: And then we used images containing type to generate new typefaces basically. And then we had again, this weird shift of styles and weights of typefaces that resulted in, you know, basically an endless stream of new typefaces. So, it was this endless visual stream and each frame would be its own typeface basically.
[00:14:14] Moritz Russell: So years after this I think it’s now four years that we have done this project. We finally came to finalize it in some way and put it up as a real typeface that people can get, and it’s called the AI font. And yeah, people can have a look at it and you can really, you’re not buying only one typeface.
[00:14:33] Moritz Russell: But it’s basically process of generated typefaces.
[00:14:37] Helen Todd: I know like for the people who are listening that it’s hard to visualize the emojis that they did the uncanny values emojis. So I’ll be sure to include include links to their AI fonts and embed images and whatnot into the dedicated blog posts that accompanies this interview but actually have interviewed Marlies Wirth — who’s the curator that worked with Martin and Moritz on this project. So you’ll actually get to hear her interview after this where we talked about it, but I was trying to like figure out how to describe the emojis.
[00:15:10] Helen Todd: They have in my interview with her, they kind of have this like, I don’t know, grunge smeared, like look to them. I don’t know how you would describe them, but they, it seems more like punk emoji-esque type of feel to them.
[00:15:25] Martin Gertl: Yeah, totally. I mean, they are really… I don’t know, messed up.
[00:15:29] Martin Gertl: I would say they are really, you know, broken and it’s really, I would say this is like kindergarten AI, this is not refined. This is really low-res, rough, grungy on old laptops.. And, but we actually like that, right? We liked that approach that what comes out if you remove all the polish and all the guidance and all the filtering, so what is really the core, so to say, and yeah, it’s dirty, it’s messy it’s grungy, but we liked it and that’s our kind of, you know, style. We like this kind of rough things, but not everybody does, but to our surprise, nearly everybody picks out one or two of them and really likes them and builds a connection and it’s, I don’t know, it’s because these are faces, right?
[00:16:20] Martin Gertl: And we know these faces and these emoji, we have an immediate emotional connection. And that’s why I like this project now in, in retrospect, it was not really big. You know, big concept behind, so to say, was just trying to do something with AI tools as a really small team.
[00:16:43] Martin Gertl: But in, in retrospect, I really think this symbolizes our, you know, relationship to to technology in, in, in a very interesting way, because we are not separate from technology. We are the ones who produce these. AI’s and we make it happy and confused and angry and greedy and whatever. So this is a good reflection.
[00:17:09] Martin Gertl: All these emotions these AI emojis throw at us is a reflection of ourselves. And I really liked it. And I like that people can connect. And sometimes I think it’s like. You know, they we, at first we tried to categorize them in like happy, sad, neutral, but then we had to make this other category that was just like horror and like out of any category that really messed up.
[00:17:37] Martin Gertl: But I think it’s like, people, you know, like, like people like this dog with a missing leg, it’s kind of still, it’s cute, you know? So yeah, that’s, that, that was a nice starting point getting into AI.
[00:17:53] Moritz Russell: Yeah, and basically they are they are very hard to decipher because they don’t really fit into the emotions that we are used to, especially when we are you know, using emojis for communication and that’s another aspect why it was quite fitting because it was about this dialogue and emojis are used for expressing basically feelings through text using these little images.
[00:18:19] Moritz Russell: And with this AI, there emerged this kind of dialogue that was very hard for us to decipher because there were these emotions, as you can call them that were not easy to read because they were, as Martin already said, really weird. And it was not, okay, this one is sad. This one is happy. This one is disgusted.
[00:18:42] Moritz Russell: This one is, I don’t know, looks like this or that. They were all kinds of different things combined into one emoji. And that’s what also makes them very interesting to me as well on a personal level because they are not what they look like at a first glance, maybe they’re definitely provocative.
[00:19:01] Helen Todd: And I feel like the added layers of not being able to quite understand what emotion they’re trying to convey is probably more human, or reflective of human emotions, than the emojis in our phones that have like assigned emotions kind of to them. So in that regard, they, you guys might be ahead of the emojis in our phones.
[00:19:22] Martin Gertl: We use them differently. They’re actually, if you look at the Unicode standard, they are not assigned a meaning. They are just descriptive. It’s just, it’s. It doesn’t say, but it says image of a peach, right? It doesn’t say what you have to assign. And as you know, the peach is not used for peach, you know, we assign them ourselves, these meanings ourselves, which is quite interesting if you look at emoji.
[00:19:51] Martin Gertl: And also I’ve noticed that people use the same emoji differently. Right. I have some emoji that I use with my girlfriend, for example, that has a specific meaning, but nobody else uses it that way. So that’s quite interesting.
[00:20:06] Helen Todd: I just think all the different types are forms of expression that we have.
[00:20:11] Helen Todd: Like we’re just trying to express ourselves and we have text emojis to GIFs. I probably spend as much time looking at gifts as I do emojis of like, how do I want to compliment this text of what I’m trying to send to video? And now we’re getting, you know, generative. I mean, what was it just last week?
Facebook announced their text to image creation tool, which I think will be almost like text to gif type of thing with their video on demand more or less generative AI video. So, it’s just really interesting and how expression we get all these different tools that we’re trying to figure out how to express what we feel and whatnot.
[00:20:49] Helen Todd: So, I find it all very interesting too. So how many of these emojis did you all end up creating and how many ended up in the exhibit?
[00:20:56] Moritz Russell: I think we created 10,000 still images and they were also, you know, we printed them out and made little buttons that we would hand out at the opening of the exhibition.
[00:21:08] Moritz Russell: And then we had a few still images that we picked for the, you know, for the posters and for the banners, for the advertising material. And then we also had small little AI pods. That’s what we called them. These are Raspberry Pis, you know, attached to a projector. And there was…there was running a little AI model as well in real time.
[00:21:32] Moritz Russell: So when people went to the exhibition, they would actually see new AI emojis being created on the spot as well. And that’s actually, you know, an infinite number of AI emojis. So, this doesn’t end. So, there is no end to the stream. So, in that sense. There is no number to that, but I think really using them for a specific images for a specific image, we only ended up using 10, maybe or so.
[00:22:01] Helen Todd: Wow. 10 out of 10,000. That’s a lot to curate and boil down to. Well, and then you mentioned Moritz, as you were describing the project, you also kind of mentioned your AI fonts as like your second foray into building the tools and working with AI. So what got you, I guess, specifically into fonts from the Uncanny Valley or did that inspiration come somewhere else?
[00:22:28] Helen Todd: Like how did you decide to dive into fonts?
[00:22:31] Moritz Russell: Well, you know, as a graphic designer, typefaces come up at one point, especially when you’re designing the visual identity for an exhibition, of course, type is a vehicle for, you know, transporting information basically. But the AI font really happened quite naturally because we had this, the system already set up for creating these new AI emojis.
[00:22:54] Moritz Russell: And then we were basically just curious, hey, what happens if we are doing typefaces instead of faces. On a technical side, it’s basically the same, so we are only using images and the system doesn’t really care if, you know, if the image depicts I don’t know, a tree or a face or a font or anything basically.
[00:23:12] Moritz Russell: So it’s the same system and we just wanted to try it out as well and as I said already before, the result was also this endless stream of different styles, you know, oscillating from one style to another. So we had, you know, upright to italic, to really bold fonts, to super thin fonts, to script fonts, to monospace.
[00:23:35] Moritz Russell: So, anything and everything in between. And it was at that point quite new visually as well. So everybody was kind of, I dunno, disgusted or impressed or found it interesting. We liked it. If I can, you know, only say that, but it was at least something new and it was weird.
[00:23:55] Moritz Russell: And that’s why it was very fitting as well to the concept of the exhibition.
[00:23:59] Helen Todd: Oh, and I know when DALL-E and all the Gen AI tools came out this past year, like fonts are actually really hard for these tools. And I think that later models are able to produce them better. But that’s kind of one of the signs that it’s AI generated is that the words don’t actually come out as like words.
[00:24:18] Helen Todd: It’s just kind of like garbled letters and stuff. So, how did you all back in 2018, 2019 actually get I guess high fidelity fonts for, you could tell that they were at least fonts or letters. Cause I, I know that’s one of the challenges with some of these tools right now too.
[00:24:35] Martin Gertl: Well, we basically tricked the system or like, you know, if you have a generative AI system, like like mid journey or DALL-E or something, they’re not specifically focused on fonts, right?
[00:24:47] Martin Gertl: Fonts is just something that happens to be in a lot of images. A lot of photos and it doesn’t really recognize them. It doesn’t know this is an A or a B or something. Right? These images are not annotated that way. So, and we just hijacked the system by using, I mean, it’s quite an obvious hack just using a sheet of ABCs.
[00:25:15] Martin Gertl: Like a sprite sheet where the A is always in the top left corner and the B is next to it and so on. So we just put in a lot of a lot of glyphs in the end of all different kinds that we found that we scraped from the web again. Right. And what it ends up doing is having a lot of a big data set of different A’s and different B’s and C’s and all the letters and all the punctuation and so on.
[00:25:43] Martin Gertl: And that way it, it can spit out new fonts and new. A new sprite sheets at the first stage. So, it was a kind of a manual process because then you have just your ABC’s. But if you want to have a specific text, you had to pick out all the right letters for your text and stitch them together. So, this is a second step that we did.
[00:26:10] Martin Gertl: And then you end up not with a font, but with images of your word or your sentence, and we got a lot of requests from students and people that said, Hey how can I use this AI font? And it was really cumbersome because they had to send us their text and we generated an animation for them or images and we send it back.
[00:26:32] Martin Gertl: So it’s quite, you know, a lot of people are turned off. It’s too hard to send somebody something, get it back. And so we decided just this year to kind of make a real font out of that. And through another, you know, technical hack to kind of vectorize all of these sprite sheets of different fonts and yeah, clean it up a bit.
[00:26:58] Martin Gertl: As necessary and throw it into a a variable font in the end. And through that process, we ended up with a usable, you know, font that anybody could download and install and use in all your normal design programs and text processing programs. But it’s not really the AI system, you know, it’s limited to a few of those it’s not generated on the fly, it’s all pre generated.
[00:27:28] Martin Gertl: But it’s the next best thing to, to try it out and it’s much more convenient. And yeah, it’s kind of a little product that you can get and try.
[00:27:39] Moritz Russell: Normal traditional typefaces, maybe the family consists of, I don’t know, 50 fonts if it’s a lot, or maybe 70, if it’s really big. And here we have 500 and they basically span from the beginning of the training where it’s only noise.
[00:27:57] Moritz Russell: And we let it in there as well. So the first font, if you want to call it that it’s just noise, it’s not legible in any sense, and then it gets progressively better and then you can skip through the training steps basically. And we named them one, two, three until 500. And they get better, or at least they get they get better in a sense that you can read them better, but they are still it’s basically a typographic experiment.
[00:28:26] Moritz Russell: This is not a finished typeface. And it. It shouldn’t be actually, and it’s also not comparable to a normal, you know, all the craft that goes into really creating a typeface. This is a very different approach. We are coming from this experimental, technological generative point of view.
[00:28:45] Moritz Russell: And not not so much, we are not so much interested in creating a perfect font, basically. This wasn’t the goal for the whole project. Yeah.
[00:28:54] Martin Gertl: It’s still more an AI project than a font project. It’s again, shows this kind of low level baby AI without supervision, without reinforcement, without selection.
[00:29:05] Martin Gertl: It’s just reproducing what we put in this case, a lot of fonts, a lot of rendered glyphs, and then it tries to reproduce. It starts from white noise and gradually produces better in, in quotation marks results as it learns to mimic and it gets better in the sense that it mimics what we define as fonts.
[00:29:31] Martin Gertl: And it’s more like a, you know, more like a communication project about, about AI as it is trained. So it’s a variable font and you have just one axis as it’s called in variable fonts, and this is the training step. And as you go from 0 to 500, you see how it learns to reproduce and mimic fonts better, and you also see how it drops back into this kind of local minima, what it’s called in, in, in machine learning, where it finds a really good solution and then it can, when it’s getting overtrained, it can become bad again, right?
[00:30:10] Martin Gertl: Because we’re not really supervising it as any AI company doing a real product would do, they would try to optimize it. They would try to implement feedback to get a good result, but we’re just letting it run wild and thrash about. And that’s why it’s also grungy and a bit dirty.
[00:30:32] Helen Todd: I love it. Well, and I’ll be sure to link to this project as well, because it’s available to the fonts are available to be bought and downloaded.
[00:30:42] Helen Todd: If anyone listening or watching wants to play with it too. And I love that you guys build your own tools just to experiment now with all the Gen AI tools that have come out. Are you playing with the tools out there like Mid Journey or DALL-E? Are you. more interested in building your own tech. Tell me where you’re at with the state of Gen AI today in your experiments and projects.
[00:31:05] Moritz Russell: I mean, we are playing a little bit, at least I am, but only from a very curious standpoint. It’s not about, hey, what’s the next big thing or something like this? I’m just interested in what’s currently available basically from a user’s perspective and. But only very sparsely, we have just, I’ve just fooled around a little bit with, you know, ChatGPT and so on.
[00:31:29] Moritz Russell: And also in the early days of DALL-E and so on and Mid Journey, we’ve tried it out as well. But, it got quite stale quite fast or quite quickly. And it really depends on if you have something that you want to use this tool for, if you just want to fool around and do something with it, I think it’s not so much interesting.
[00:31:53] Moritz Russell: It’s not at least to me and it greatly depends on the project. If you have something that you want to do and use the tool for it, go ahead. But it’s not that we are, you know, constantly texting each other, hey, have you seen this new tool? And so on. So we’re not so much ahead of the curve in that regard.
[00:32:15] Martin Gertl: Yeah. I’d say we follow it as a kind of a high level perspective. You know, these tools do make news all the time now. And I mean, I also don’t like to try every version and, you know, it takes too much time, but. You know, when things make news and when my dad, who is over 70 years old, asks me about some of those tools that I haven’t heard about, then I will look into them right and see what it’s all about.
[00:32:44] Martin Gertl: So, and from that, you see what a big impact. Those tools have like everybody has heard about it especially with ChatGPT now, because it’s so general purpose, right? Much more than any generative image program. So, yeah we to follow the, this, these trajectories that, that come out on a bit of a higher level.
[00:33:09] Martin Gertl: And I’d say I’m mostly interested in what the, do these tools, how will they change society and, you know, jobs and things like that, which is kind of hard to anticipate, of course, but it’s also. It seems obvious that it will have a kind of big impact, right. Or an, a substantial impact at least. So this is kind of interesting to, to also look at the circumstances where these tools come out, you know, out of tech companies, but then there are also open source versions, then there are, you know, Also on the dark net, people are misusing the stuff, right?
[00:33:54] Martin Gertl: Because of course on, on ChatGPT on the big commercial sites, you can’t do porn images or, you know, some things are forbidden and they should be, but of course what people. Want is things that are forbidden and they do want violent stuff and porn stuff. And, it’s natural.
[00:34:16] Martin Gertl: So it’s also being done. So, this is kind of interesting as well.
[00:34:22] Moritz Russell: I just wanted to throw in something that I read yesterday about an insurance company that had an AI model that would decide whether some people get insurance or get, you know, their costs covered or not. It had a 90% false rate, so it was detecting stuff wrong, but it was actually by design.
[00:34:44] Moritz Russell: So, it makes sense for in an insurance company to decline more cases that would need support. My point is that it already has a big impact onto society right now already. And I think nobody, nobody denies that this will only get bigger and more broad as AI is, you know, finding its way into so many more applications than just, you know, Photoshop, for example.
[00:35:10] Moritz Russell: So it’s not only this, but if I just look at the design scene or the creative scene stuff, like, you know, being an illustrator will be incrementally difficult because you can basically train any AI on all. Individual styles, and you can replace a lot of general use cases with just using an AI or stock photography that is easily replaceable by an AI or also web design.
[00:35:42] Moritz Russell: I mean, there are already big companies that are investing a lot of money to create these tools to have automated web design as well. So I think a lot of the general applications. will be replaced by an AI quite soon, I think in the next five years for sure. But I was talking to my father the other day and he’s a musician and he’s also interested in this new technology things.
[00:36:09] Moritz Russell: And he told me about, he used ChatGPT for writing lyrics for a song, including all the, you know, all the chords and he said, yeah, you get a result, but it’s not, it’s quite stale basically. So the result is not really interesting from a musician standpoint. And he told me that it’s still impressive.
[00:36:31] Moritz Russell: If somebody, you know, goes up on stage, just bringing his or her saxophone, for example, and just being able to play, but then everybody else, of course, is also able to write a song. By just prompting some text into an AI generator and then you get a result as well. So, I think music is also quite a good example for this, because when this autotune was introduced, I don’t know, 10 years ago, 15 years ago, I can’t remember.
[00:36:58] Moritz Russell: It was a big deal by back then because everybody said, okay, nobody needs to sing anymore because you can also, you can change and alter the voice afterwards after it has been recorded. So, even if somebody’s singing a wrong note or something, you can change it. And maybe, maybe AI is doing something similar because you can, of course you can generate an image, but maybe if an image is really crafted by hand it might still be different or it might still be more appealing to people. It’s quite hard for me to put the finger on at the moment, but that’s what’s my hope that ideas are, you know, being really invested as a human being as well. Might still be valid also in, in 10 years ago.
[00:37:44] Moritz Russell: And then again, as Martin already said, it’s not machines that are building these machines or everywhere. And it’s people who are designing and maintaining and developing the tools as well. So it’s hard to make a differentiation here as well.
[00:37:58] Helen Todd: Well, I know like, I mean, new technology opens up new job titles and careers as well.
[00:38:06] Helen Todd: I mean, before graphic design, I guess, digital graphic design, it was all hand, but now with Photoshop, when you think graphic design, at least in my mind, it goes almost all to digital and in a lot of ways you know, photography needs cameras, videography needs the video equipment at least before Gen AI.
[00:38:27] Helen Todd: But I have a question because I know I’ve had other guests on the show who identify as AI artists. Do you all identify as like AI artists or AI graphic designers, or are you designers that experiment with AI? I’m curious since we talked about identity and expression earlier in the show where you lean on that right now.
[00:38:48] Moritz Russell: I don’t really identify as an AI artist. I think from the stuff that we are doing, we are basically designers. We get hired to fulfill a specific need and to, you know, create something specific for a client usually. So in that sense, we are very much designers by craft. I just think we are interested in this new technological developments and we try to incorporate them into our workflow. If it makes sense…
[00:39:17] Moritz Russell: But not in a way that we just take what is out there, but mostly we take a step back and really try to analyze what it is that this technology makes special. And then we, most of the times try to recreate it ourselves and make it somewhat our own because we are not so much interested in creating this most, you know, high fidelity, super high-res imagery. We are basically very much interested in this low-res, you know, general graphical shapes, very bold and simple in, in some sense. So, we are kind of in this safe spot as well that we are not, you know, always running after this new visual improvement because the basic concept behind it stays the same. And that’s also just coming out of our interest.
[00:40:15] Helen Todd: Yeah, that makes a lot of sense. And I know before we started recording, I think Martin, it was you who mentioned that you get a lot of like students in university coming and asking advice about, you know, what they should do if they’re. You know, designers are studying design, you know, what they should do about creative jobs..
[00:40:34] Helen Todd: So I’m curious what’s the feedback that you’re giving students right now related to that?
[00:40:39] Martin Gertl: Yeah, that’s true. Like, especially when we put out the AI font, a lot of design students like requested us to do an interview and they wanted, you know, they wanted to do some bachelor thesis or master’s thesis about AI and design and so on.
[00:40:57] Martin Gertl: And of course the question comes up, how is my future job affected by this, you know, am I being replaced? And it’s quite, of course it’s, I mean, I ask that myself too, right? If AI is doing my job in the future, what should you do? Right. The answer is my first answer was a few months ago.
[00:41:24] Martin Gertl: I said, okay, either. You embrace it fully, you know, try to learn these tools get ahead of the curve be good at writing prompts, right? It’s, I mean, it’s I think it’s going to be a job, right? To write good prompts for producing…images for producing anything, you know, even websites, all the stuff that’s will be coming with generative AI.
[00:41:55] Martin Gertl: So, you need to prompt engineers, you might need data curators, even. Further down the line, right? If you think about when we can tailor these systems even more to our needs, we might have an influence to the data. This is that is being used. It’s still further out because the training is kind of expensive, but still there will be a lot of jobs now going back to design.
[00:42:19] Martin Gertl: Yeah. If you become good at writing prompts, you for sure you will have a career ahead of you. Or on the other hand. If you don’t like, you know, chasing this path and maybe, you know, it’s kind of unclear you know, which tools will be come out on top and so on, and maybe you learn the wrong one and so on.
[00:42:44] Martin Gertl: Another option would be to kind of take a step back and think of your role as a designer more as, directorial role, right? As a curation role as defining a concept and not so much about producing a specific image or specific text and so on because that is still something that I think is not being covered so much by AI.
[00:43:13] Martin Gertl: Of course, in some cases you can do anything by AI. For example I saw a few weeks ago, you know, there’s all these tutorials now out there, how to make a quick. Buck by, you know, it was in this case, it was okay. Do a children’s book and sell it on Amazon. And you can basically generate the story.
[00:43:37] Martin Gertl: You use a, like a free story, a fairy tale story, a little red riding hood. You put it to ChatGPT. It writes it in a different style or whatever. You have your story and then you generate all the images by a mid journey. And then you. Well, you still have to lay out it, but maybe that will be quite easy as well.
[00:43:59] Martin Gertl: If you prompt some tool layout, these images and these texts, you get a PDF and you do it on, you can sell it on Amazon by print on demand. So this kind of mass produced really low effort products will be done as well. But as soon as you’re talking interesting design challenges, I think it’s so important to have a curatorial role, somebody who talks to a client and understands them and kind of translates what they want and what they need to some design product, right?
[00:44:36] Martin Gertl: To some images and some, brands, design, key visuals, whatever. So you can go in that direction as well, I think. And it’s up to you. And I think what I also just one more sentence, what I also saw that the young people are not afraid so much. Right. And. It’s just a, when you’re a certain age, you’re kind of ingrained in your way of doing things, and then you’re afraid of new stuff.
[00:45:04] Martin Gertl: And then I came across this quote of it’s from Douglas Adams, you know, the Hitchhiker’s Guide to the Galaxy. And in one of those books, he comes up with this three rules, how we react to technologies, to new developments in technologies. And he says, well… up to the age of 15, like anything is normal for you.
[00:45:31] Martin Gertl: It’s just natural part of the world. So if a new AI comes out, it’s just like it is, it’s nothing special. And between 15-25 years of age, it’s kind of exciting, you know, it’s revolutionary and you can probably get good at it and get a career. And then over about 35, every new invention is kind of. Against the natural order of things.
[00:45:56] Martin Gertl: And it’s, you know, you’re opposed to it. And that’s, so there is a lot of hope for, you know, young people. They can cope with this stuff much better than we can do probably.
[00:46:11] Helen Todd: Yeah, it kind of seems I like that framework. Although since we’re on the the latter end of that spectrum, I’d like to say that we’re, since you guys experiment and I’m diving into all things AI, that maybe we’re younger at heart when it comes to technology.
[00:46:27] Helen Todd: But it does seem like there’s kind of this. Commodification of content and what you were saying about the AI for books, you know, I think just because it’s gen AI doesn’t mean that it’s going to be good by any means. So you really do need that the creative eye to determine what’s good or what’s not good for sure.
[00:46:45] Martin Gertl: It’s good to kind of dive into how these systems work and, I think, kind of, it seems like most people know already that these systems reproduce patterns that they find in data and they get fed like huge amounts of data. It’s like for ChatGPT, it’s about like what 10,000 people can read in their lifetimes that ChatGPT has knowledge of and has read, and based on that huge amount of data, it kind of can produce texts that look good.
[00:47:20] Martin Gertl: And that’s look and sound good. But that’s, I mean, that’s a very simple explanation. Of course it’s being refined, but it’s just this statistical pattern. So it has been described as a kind of statistical parrot. So it’s a machine that parrots stuff, but it’s very sophisticated.
[00:47:45] Martin Gertl: It’s really a high tech, sophisticated parrot, right? And that’s enough for a lot of things. Yeah. It’s, that’s enough for doing a lot of jobs and a lot of good probably results, right? But it’s not the same kind of creativity. I think that humans kind of exhibit.
[00:48:09] Moritz Russell: I also think that in general, it makes sense to not really take the things for granted that you have.
[00:48:14] Moritz Russell: And especially in a world that is. Shifting towards more digital instead of less digital. It makes actually really makes sense to get some sense of understanding how things work in a technological sense and not only, you know, if you have a basic understanding about how an app works about how, since we’re talking about AI, how about image generation works, it makes sense to maybe on one hand, you know, identify it as being generated by an AI, maybe but also it always helps you to question things, basically.
[00:48:52] Moritz Russell: And this is something that cannot be taught by an AI. And if in worst case, if everything is generated by an AI, you always have your own system of judging things, or maybe not judging is the wrong word, but you know, having a critical look at things and question them. And This is also something that, that would be I guess is more and more important also in schools and universities to really teach people to dive deeper into the stuff and not only take it for granted and you know, always have a look behind the facade and how it’s produced. And also to get an understanding of about how it might have been altered as well. And also if we think about fake news and so on and deep fakes and. everything else that might possibly be harmful for society and the whole planet as well.
[00:49:50] Moritz Russell: So as I think this critical thinking and gaining an understanding about how these things are crafted and generated will play a much bigger role in the future. As well.
[00:50:03] Helen Todd: Yeah. And to a point earlier about, you know, the forbidden things that these AI tools don't allow right now. Well, there are some out there that do, and I know Elon Musk just recently announced his xAI tool, and I'm not sure if it's just text or text-to-image or video, but yeah, it'll get
[00:50:22] Helen Todd: really interesting with some of these, with the guardrails. And, you know, one of the things that I've said before on the show, and I think Martin, you might've said it too, is that these tools are just mirrors of who we are and of society and whatnot, and that we've all kind of got to decide together what that means.
[00:50:42] Helen Todd: But I know also, when we were talking before we started recording, one of the questions that came up was: can machines be creative? And I know you all have been in this space for a long time, so I'd love to hear your thoughts and have you expand on that for our listeners and viewers.
[00:51:01] Martin Grödl: Let's take ChatGPT. So even after doing our own generative emojis and fonts and so on, when I was asked this question I always said no, these machines are not creative, not at all. It's always, you know, we build a system, we as humans or as a society or as a company or whatever, we build a system that produces stuff.
[00:51:28] Martin Grödl: So in the end it's us who are the creatives, right? It's a mechanical, deterministic tool. But now with ChatGPT, the question has shifted for me a bit, because it's so far-reaching, so general-purpose in the way it's being used, that it exhibits certain characteristics, you know. And it remains to be seen how it can fulfill all of these roles, but then it's also being combined with images and so on.
[00:52:06] Martin Grödl: So there, I would say, we now reflect differently on the role of creativity, because there seems to be a certain part of human creativity, at least one that I feel we have valued as creative, that is being replaced, or at least being done as well as we can do it, by machines. And that is everything reproductive: reproducing things, learning by reproducing.
[00:52:40] Martin Grödl: And I think this is a big part of creativity. This is how we learn, this is how we grow up. We're not isolated, and all the big geniuses that we think of as peak creative minds, they were not on their own, not at all. They interacted, they learned from all their precursors, and they built upon this knowledge.
[00:53:03] Martin Grödl: And this is the part that these tools do so well, but that's only one part of creativity. You know, all that personal style, all that personal experience, all those values that you learn, this comes into play as well. And so I think in the end I can admit that these tools are creative in this sense, in the sense that they can produce images that I would have to spend hours and hours learning
[00:53:36] Martin Grödl: to draw, or whatever. But that's one part of creativity, and we can now maybe focus on the other part. I think it also greatly depends on your definition of what creativity actually is.
[00:53:51] Helen Todd: It's an interesting question: how do you define creativity? One guest asked me that, and different guests across the show have had different definitions of creativity.
[00:54:01] Helen Todd: I think one thing that's kind of interesting, and one reason why I love the show, is that we assign creativity as such an inherently human trait, and now these machines can pair millions and trillions of ideas together. But I think one of the interesting things is, you know, what's novel and new when these tools are more generative.
[00:54:23] Helen Todd: And like what Martin said, it's like a parrot, so how do you define a good idea or a good design that we haven't seen before? And I'm not sure that these tools, with their generative nature, will be able to arrive at that, but I'm curious what your response or thoughts are on that.
[00:54:41] Martin Grödl: Yeah. I stumbled upon a few papers recently that dealt with this topic of creativity, and it's quite interesting how they try to study this. I mean, in psychology there are measures, there are tests for creativity; one is called the Torrance test, something like that.
[00:55:02] Martin Grödl: And they kind of quantify creativity in a way, and AI is acing them, totally acing them, like a 99 percent success rate. So they are doing better than 99 percent of people. And then, I mean, you get into: okay, are these tests valid or not? And they check for different
[00:55:27] Martin Grödl: qualities: are you fluent, are you flexible, can you produce multiple ideas, can you elaborate, stuff like that. And all of these are aced, because as you know, when you talk to ChatGPT, for example, it just doesn't stop, right? It can produce variants endlessly.
[00:55:52] Martin Grödl: So these might not be the best tests. So in a different paper they tried another approach, which was to come up with ideas. They did one with business ideas, and then they had those evaluated by experts, so to say, people who invest in new businesses. So it's a really more realistic measure.
[00:56:18] Martin Grödl: And in these too, the generative AIs did quite well. And what I really liked in this paper is that they didn't pit human ideas against AI; they specifically compared human on the one side and human plus AI on the other, because it was still a person using the AI tool to come up with ideas. So this was quite interesting.
[00:56:50] Martin Grödl: And it seems like there could be a path to creating maybe also new ideas and solving problems. But it's always, I think, a collaboration, right? And then evaluation, and nobody can really decide: is this idea novel or good or what? You know, I mean, you can vote, maybe you can do a popular vote.
[00:57:20] Martin Grödl: You can put it out there and see; you know, there is no objective measure, but it could be a tool, working with AI. And just the last point: of course, being these kinds of high-tech, sophisticated parrots, you could say on a philosophical level that they could never come up with new ideas, because they are replicating their inputs. But in a more practical sense, I think these papers do show that a lot of ideas depend on combination, on recombining existing ideas and making them viable and so on.
[00:58:00] Martin Grödl: So there is some value to be had, for sure. But I think always in collaboration, right? I think there is no sense in automating AIs on that level, and nobody's thinking about that, right? I mean, it doesn't really make sense right now.
[00:58:18] Helen Todd: Yeah. On the collaboration point, there's a quote by Michelangelo that I often think about, and this is just kind of the gist of it: I saw the angel in the marble and set it free.
[00:58:33] Helen Todd: And like, if you think about the marble as the generative AI model and everything that it's been trained on, you know, scraped from what's online on the Internet, it's like, if you just have the idea and the prompts, you can really get whatever out of it that you want in that regard.
[00:58:51] Helen Todd: But you have to know what you want and ask the right question to set it free. And it's still kind of limited to what's available online, and there's still so much that, you know, isn't online as well. But I love that quote by Michelangelo when thinking about AI sometimes.
[00:59:07] Martin Grödl: Yeah, I think that was brought up by Aristotle as well, before Michelangelo: substance and form, right?
[00:59:17] Martin Grödl: You have this substance, the marble, but somebody has to bring the idea and the form to carve it out. So the substance has all of the ideas inside, and then you kind of curate it. And it's kind of interesting that it doesn't talk about technical skills at all. So, yeah. Well, I feel like we could keep talking for hours and hours.
[00:59:42] Helen Todd: I wish we were in a coffee shop in Vienna sharing this conversation over coffee, but I want to make sure to let you guys plug any projects or your website that you want our listeners and viewers to know about. And again, I'll be sure to link everything in the dedicated blog post.
[01:00:00] Moritz Resl: If people are interested, they can follow us on Instagram.
[01:00:04] Moritz Resl: Of course, we'd appreciate it. And they can also have a look at our website. Our studio has been running for almost eight years now, so we've done quite a few projects already, and we're always happy about people reaching out to us. So if you want to have a look, just go over to process.studio and have a look around.
[01:00:24] Helen Todd: And one question I like to ask all of my guests is: if you want our listeners and viewers to remember one thing from this conversation, or from the work that you do, what's the one thing that you want them to walk away with?
[01:00:37] Martin Grödl: I'd say, especially with technology, keep in mind that technology doesn't happen on its own.
[01:00:50] Martin Grödl: It's us as a whole, as a society, who build technology, and especially designers. Not only designers of the technology, but also, you know, visual designers and product designers like us; I feel that this is very similar to designing technology. We also build stuff that is being used, or at least consumed, by people.
[01:01:19] Martin Grödl: We have, I think, a big responsibility to think about the implications of our work, because we do influence people and we do build things that are being used by people. So we kind of always have to think: what kind of future do we want, and what is my work, my API, my tool, my website, or my project, what kind of future is it promoting?
[01:01:54] Moritz Resl: Well, my sentence would be to have a critical look behind the scenes: to take the results and look at the system in which they were created, and the values that are fueling and driving that system. Because there are big implications, you know; if you change the system, everything else coming out of that system changes as well.
[01:02:20] Moritz Resl: And especially thinking about the future of media in general, and media consumption and the internet and the connected world and so on, it's really more and more important to have, you know, maybe a slower look at things and question stuff, basically, and not just take everything for granted and assume that everything you see is real.
[01:02:45] Moritz Resl: I mean, this is of course also a very old idea; if you think about the invention of photography, this has always been there. But I think it gets more and more important as we go on as humanity, because things are altered, and altered in specific ways, in order to achieve something.
[01:03:08] Moritz Resl: And this is something that we need to be aware of as well, especially as somebody who designs these things.
[01:03:15] Helen Todd: That's so well stated, both of you. And I couldn't agree more about the intentionality going into the tools and the designs, for the ones creating them, and then for the ones on the consuming side, to get curious about every layer in the process.
[01:03:32] Helen Todd: What a wonderful note to end the show on. It's been an absolute pleasure having you both on the show, so thank you so much for sharing all of your thoughts and your amazing design work. And I can't wait to see what else you all produce out of Process Studio.
[01:03:47] Moritz Resl: Thank you very much.
[01:03:48] Martin Grödl: Thanks for having us.
[01:03:52] Helen Todd: Thank you for spending some time with us today. We’re just getting started and would love your support. Subscribe to Creativity Squared on your preferred podcast platform and leave a review. It really helps and I’d love to hear your feedback. What topics are you thinking about and want to dive into more?
[01:04:08] Helen Todd: I invite you to visit creativitysquared.com to let me know. And while you're there, be sure to sign up for our free weekly newsletter so you can easily stay on top of all the latest news at the intersection of AI and creativity. Because it's so important to support artists, 10 percent of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized nonprofit that supports over 100 arts organizations.
[01:04:33] Helen Todd: Become a premium newsletter subscriber or leave a tip on the website to support this project and ArtsWave. Premium newsletter subscribers will receive NFTs of episode cover art and more extras, to say thank you for helping bring my dream to life. And a big thank you to everyone who's offered their time, energy, encouragement, and support so far.
[01:04:54] Helen Todd: I really appreciate it from the bottom of my heart. This show is produced and made possible by the team at Play Audio Agency. Until next week, keep creating.