Art “AI” tools, why they are not AI at all, and why they will never replace artists

  1. They won’t replace artists, not for any purpose, nor in any capacity

These “AI” tools are not AI at all

Defining what counts as “Artificial Intelligence” is never easy, but when it comes to just about any tool with a public profile on social media right now, it isn’t AI at all. It is something much more primitive, conceptually speaking: a form of data analysis rather than intelligence. To understand this better, I will explain what each of those things actually entails.

How do art “AI” tools work?

What tools like DALL·E and others do, in basic terms, is take huge amounts of input images provided by humans (such as researchers) and perform massive amounts of computation on them to find common features or attributes across those images; a rough sketch of this kind of counting follows the list below. For example, it might find that:

  • Swimming pools are more often accompanied by daylight than by dark lighting
  • Andy Warhol paintings and their derivatives are often rendered in many colours, distinctly separated into regions
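
To make that concrete, here is a minimal, hypothetical sketch of the kind of attribute statistic such a tool might gather. The tiny dataset and the tag names are made up purely for illustration; this is not how any particular tool is actually implemented.

```python
from collections import Counter

# Made-up metadata: each "image" is reduced to a set of descriptive tags.
images = [
    {"swimming pool", "daylight"},
    {"swimming pool", "daylight", "people"},
    {"swimming pool", "night"},
    {"andy warhol style", "bright colours", "flat regions"},
]

# Count which tags co-occur with "swimming pool" across the dataset.
co_occurring = Counter()
for tags in images:
    if "swimming pool" in tags:
        co_occurring.update(tags - {"swimming pool"})

print(co_occurring)
# Counter({'daylight': 2, 'people': 1, 'night': 1})
# i.e. "daylight" shows up alongside pools more often than "night" does.
```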

Training a machine — “machine learning”

Training, in the context of machine learning, means a human telling the machine what was good and what was bad about the output it produced. For example, if it generates two pictures of a horse, one with the horse’s head missing and one without that flaw, the human can tell the machine that one of these pictures is good and the other is bad. The machine, given this extra data, can use it to try to avoid making images with the “bad” attribute again, but it has learned nothing on its own. A human has added data to it, nothing more.
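
As a minimal sketch of that idea, assuming purely hypothetical attribute names, the “training” here is nothing more than storing the human’s verdicts as extra data and consulting them later; nothing in it resembles independent learning.

```python
# Hypothetical feedback: features of two generated horse images plus a
# human's verdict on each. All names are illustrative only.
feedback = [
    ({"has_head": False, "has_four_legs": True}, "bad"),
    ({"has_head": True, "has_four_legs": True}, "good"),
]

# The "learning" step is just recording which missing attributes were marked bad.
bad_when_missing = set()
for features, verdict in feedback:
    if verdict == "bad":
        bad_when_missing.update(name for name, present in features.items() if not present)

def looks_acceptable(features):
    """Reject a new sample if it repeats an attribute a human flagged as bad."""
    return all(features.get(name, True) for name in bad_when_missing)

print(looks_acceptable({"has_head": False, "has_four_legs": True}))  # False
print(looks_acceptable({"has_head": True, "has_four_legs": True}))   # True
```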

[Image: a CAPTCHA example. Okay, this one might be a problem for a machine no matter what you do, but still.]

They won’t replace artists, not for any purpose

First of all, current “AI” tools are nowhere near complex or capable enough to be useful for much of anything; the best they can currently do is act as a minor source of inspiration for human artists, in the sense that they work like a condensed form of Google Search. Google Search frequently throws up unhelpful images when you are looking for references, since it isn’t designed for that purpose, whereas these art tools are more streamlined and better equipped to handle specific requests for reference material.

The problem of context in AI and machine learning

Conceptually speaking, you could argue that humans are not so different from machine-learning systems. We, too, understand nothing unless we are taught: we can interpret the data around us (e.g. that bee stung me, therefore I now know it can do that and should avoid it), but we cannot actively “learn” except through the data we are given or taught. Human decisions can be compared to machine decisions in much the same way: we run algorithms over the data we have and arrive at a decision.


Without context, true “AI” art tools cannot exist

One day, if there is a world where robots walk the streets, or where humans have all of their senses recorded 24/7 for some giant machine to decipher (in itself a more or less impossible feat), it is *plausible* that machine learning and “AI” could become comparable enough to humans to be considered intelligent. We are nowhere near that point yet.

What about replacing artists for concept sketches and thumbnails?

This also isn’t going to happen.
