
The internet has done a commendable job of mocking NFTs to death, or at least into remission (big game publishers like Ubisoft, which initially expressed interest, have mercifully stopped making them), and now some hope the "make it so annoying that no one will touch it" tactic can thwart another trend: the fast-improving AI image generators spitting out flattering, false portraits of our friends and imaginary stills from David Lynch Warhammer movies.
I think they will be disappointed. AI "art" is not going anywhere.
In a sense, NFTs and AI art are opposites: NFTs promise that every piece of digital art can be a unique and valuable commodity, while AI art promises to eradicate the value of digital art by flooding the internet with an endless supply of it. If Jimmy Fallon wanted to hoard all those silly NFT monkey pictures, I don't think most people would mind, but fast and cheap AI imagery has made it hard not to see more and more of it. If you've used social media in the past year, you've seen AI-generated images.
And I highly doubt it's a passing fad. Where blockchain investment schemes are criticized as meaningless garbage generation, AI art is lamented as a threat to illustrators' jobs. Everyone can see the value of a machine that turns words into pictures. It's hard to resist trying one, even if you object on principle: if someone tells you they have a machine that can produce a picture of anything, how could you not want to test that claim at least once?
The way we interact with these machine learning algorithms reminds me of the way people tease babies, delighting in their every response to new stimuli and pointing to anything that might be taken as a sign that they've understood us. When an image generator seems to "get" what we ask for, there's a pleasantly odd feeling: it's hard to believe that a computer program could successfully translate a complex idea like "John Oliver looking lovingly at his cabbage as he realizes he's falling in love" into an image, but there it is, undeniably on the screen in front of us.
And that's what really makes AI art so offensive to many, I think. It's not just the automation of work, but the automation of creative work that seems so obscene. Something perceived as profoundly human has been turned into a party trick.
The good news and the bad news for humanity is that the trick is easily exposed: image generators can do nothing until they've been trained on piles of human-made artwork and photos, and in some cases that training happened without the consent of the artists whose work was used. The popular Lensa AI portrait maker, in fact, often reproduces distorted signatures: the mangled remains of the real artists it was fed.
An early attempt to rescue AI art from this criticism is easily dismissed, if you ask me. The claim is that by scraping artists' online portfolios for training material, AI art generators are "just doing what human artists do" by "learning" from existing artworks. Of course humans learn in part by imitating and building on the work of others, but casually anthropomorphizing algorithms that crawl millions of images, as if they were living beings who simply go through art school very quickly, is not a position I take seriously. It seems wildly premature to concede humanity to silicon chips just because they can now spit out pictures of cats on demand, even if those pictures occasionally look like they were made by humans.
"I'm cropping these for privacy reasons / because I'm not trying to call attention to any one individual. These are all Lensa portraits where the mangled remains of an artist's signature are still visible. That's the remains of the signature of one of the many artists it stole from." (December 6, 2022)
Beyond flattering portraits
What's interesting to me about AI-generated images is that they often don't look man-made. One way the inhumanity of machine learning manifests itself is in its lack of self-awareness. AI art generators don't tear up their failures, get bored, or grow frustrated by their inability to render hands that could exist in Euclidean space. They can't judge their own work, at least not in a way a human can relate to, and that fearlessness leads to startling images: pictures we've never seen before, which some artists are using as inspiration.
Rick and Morty creator Justin Roiland toyed with AI art generation in the making of High on Life, for example. He told Sky News it helped the development team "come up with weird and funny ideas" and "makes the world feel like a weird alternate universe of our world."
Image generation is just one of the ways machine learning is being used in games, which are already rife with procedural systems like level generators and dynamic animations. A young company called Anything World, for example, uses machine learning to animate 3D animals and other models in real time. What would a game like No Man's Sky, whose procedurally generated planets and wildlife stop looking novel after enough star system jumps, look like after another decade of machine learning research? What will it be like to play games where NPCs can behave in genuinely unpredictable ways, say, "writing" unique songs about our adventures? I think we'll probably find out. After all, our favorite RPG of 2021 was a "procedural storytelling" game.
"I don't want Epic to be a company that stifles innovation. We've been on the wrong side of that many times. Apple says 'you can't make a payment system' and 'you can't make a browser engine.' I don't want to be the 'you can't use AI' company or the 'you can't do AI' company." (December 25, 2022)
As valid as the ethical objections are, the expansion of machine learning into the arts, and everything else people do, currently looks a bit like the ship crashing into the island at the end of Speed 2: Cruise Control.
Users of the art portfolio host ArtStation, which Unreal Engine and Fortnite maker Epic Games recently acquired, have protested the unauthorized use of their work to train AI algorithms, and Epic has added a "NoAI" tag that artists can use to "explicitly disallow the use of the content by AI systems." But that doesn't mean Epic is opposed to AI art in general. According to Epic Games CEO Tim Sweeney, some of the company's own artists consider the technology as revolutionary as Photoshop was.
"I don't want to be the 'you can't use AI' company," Sweeney said on Twitter. "Lots of Epic artists are experimenting with AI tools in their hobby projects and see them as revolutionary in the same way as earlier tools like Photoshop, ZBrush, Substance, and Nanite."
Of course, it's possible to train these algorithms without ingesting other people's art without permission. Perhaps there's a world where artists are paid to train machine learning models, although I don't know how many artists would consider that much better. All sorts of other anxieties arise from the widespread use of AI. What biases might popular algorithms have, and how might they influence our perception of the world? How will schools and competitions adapt to the presence of AI-assisted plagiarism?
Machine learning is being used in all sorts of other fields, from graphics technology like Nvidia DLSS to self-driving cars to nuclear fusion research, and it will only get more powerful from here. Unlike the blockchain revolution we keep rolling our eyes at, machine learning represents a genuine shift in how we understand and interact with computers. This ethical, legal, philosophical swamp has only just begun to open up: it's going to get deeper and swampier from here. And our friends' profile pictures will keep getting more and more flattering.