I went to the theater yesterday. A play based on a script from thirty years ago. Reserved tickets, the commute, the anticipation. Two hours in a chair. And from the first minutes, a growing feeling familiar to everyone who has ever invested an evening in something that turned out to be a waste of time.

Farce. Humor at a preschool level. Dialogues where every sharp thought drowns in a torrent of banality. There's supposedly a message — not hard to extract — but the form is so flat, so unpolished, that the message doesn't land. It gets buried under text that simply isn't well written.

And that text was 100% human. Manual. Organic. Hand-made.

And on the way home, I thought: if the author of that script had access today to a tool that could help sharpen the dialogue, build a better rhythm, eliminate repetitions, raise the level of language — would that be a betrayal of art? A sabotage of creativity?

Or would it simply be a better play?

The tool that has always existed

Open any text editor. Write a sentence with a spelling mistake. A red line appears under the word. Fix it. Nobody will say your text is "worse" because the spellchecker helped you.

Now go one step further. Use a grammar-checking tool — Grammarly, LanguageTool, a built-in corrector. It will scan your text, flag unnecessary passive voice, overly long sentences, unclear constructions. You fix them. Nobody will call that cheating.

Go further still. A thesaurus. A synonym dictionary. A tool that suggests a better word. It has existed for centuries — first on paper, then digitally. Writers have always used it. No demon jumped out of a box when someone swapped "big" for "monumental."

And now go even further. An editor. A living human whose sole job is to polish another human's text. They improve style, restructure sentences, cut unnecessary paragraphs, rebuild the architecture. Has anyone ever called a book "fake" because the author had an editor?

Hemingway had Maxwell Perkins. Without Perkins, "The Sun Also Rises" wouldn't exist in the form we know. Carver without Gordon Lish would have been a different writer. Tolkien without the feedback and commentary of C.S. Lewis might never have published The Lord of the Rings.

Tools that support creativity have existed since the beginning of creativity. AI is the next tool on the same axis — just much further along the scale.

The line that doesn't exist

Critics of AI-assisted creativity draw a line. Here — acceptable tools. Here — unacceptable tools. Autocorrect — yes. Grammar — yes. Thesaurus — yes. Editor — yes. AI — no.

But where exactly does that line fall? At what percentage of assistance does a text stop being "real"?

If AI fixes my spelling — that's fine?
If AI fixes my grammar — still fine?
If AI suggests a better phrasing — that's where it crosses?
If AI restructures a paragraph while preserving my thought — that's a scandal?
If AI helps me construct an entire text around my thesis, my arguments, my experiences — that's forgery?

Where does "tool" end and "fraud" begin? Nobody can answer that question. Because the line doesn't exist.

What exists instead is a spectrum. At one end — a pencil and paper. At the other — a fully autonomous text generated by a machine with no human involvement. Between them — thousands of gradations. And every one of those gradations is a tool. Every one sits on the same axis.

Someone here is afraid

When I watch the debate about AI in creativity — I don't see concern for quality. I see fear.

I see people who "did it by hand" for years, who wrote, coded, designed, edited, and who suddenly face a question they don't want to ask themselves: what if the tool-assisted result serves the end user better than what I made alone?

That's a painful question. Because it touches identity. If you were a "writer," and someone with AI writes texts that people enjoy reading more — then who are you? If you were a "programmer," and someone with AI builds apps that work better, faster, with fewer bugs — then what is your craft worth?

These are questions without easy answers. But instead of confronting them, many people choose the easier path: disparage the tool. Because if AI is "inherently worse," you don't need to compare results. You don't need to measure quality. You just say "AI wrote it" and feel relief.

Except the audience, the reader, the viewer, the user — they don't feel that relief. They feel the text. They read the words. They experience the emotions. And either they're moved — or they're not. And they don't care what tool made it.

Human as source. Machine as form.

There is a fundamental difference between two scenarios that public debate routinely ignores.

Scenario 1: A person types "write me an article about economics" and publishes whatever comes out. No thought. No thesis. No experience. No editing. That's fast food — and it deserves criticism. The machine has no experiences. No theses. No world. It produces something that looks like text but isn't thought.

Scenario 2: A person has a thesis. Has experience. Has a thought that won't let them rest. Has a question they want to pose. And they use AI to shape that thought — to give it precision, structure, rhythm. So every sentence is polished. So reading is a pleasure. So the audience walks away richer.

In the first scenario, there is no human. In the second — there is only a human with a better tool.

And that is the crucial distinction. The human as the source of the idea, the experience, the thesis, the question. The machine as the tool that gives it form. Just as a chisel is a sculptor's tool — but the sculpture is in the artist's mind, not in the chisel.

A "worse" app because AI helped build it?

Let's shift to another domain. Software.

When someone builds an AI-assisted application — writes code with Copilot, generates components, debugs with an assistant — and that application is fast, stable, polished, beautifully designed — is it a "worse" application?

Worse than what? Than an application where the programmer spent three times longer, made more mistakes, wrote less readable code — but did it "by hand"?

Compilers have existed since the 1950s. Instead of writing in machine code, programmers write in high-level languages — and the machine translates. Nobody calls Ruby developers "cheaters" because they don't write in assembly. Frameworks, libraries, engines — each of these tools distanced the programmer from "manual labor." And each of them raised the quality of the final product.

AI in programming is the next step on the same trajectory. It doesn't replace the programmer. It amplifies.

The theater test

Let's return to the theater. To the evening that inspired these reflections.

Let's imagine two scenarios.

Scenario A: A playwright writes the play alone. By hand. On a typewriter, if we need to emphasize the craft. Delivers the text to the theater. The director stages it. Opening night. The audience sits for two hours. The text is flat, the humor tacky, the dialogues predictable. But it's "authentic."

Scenario B: A playwright has a vision for the play. Has a subject, a message, scenes in their head. Works on the text in dialogue with AI — tests dialogues, sharpens punchlines, checks the rhythm of scenes, experiments with variants. The final text is theirs — every word passed through their judgment — but it is polished in every detail. The audience laughs, falls silent at the right moments, leaves with questions.

Which scenario is "more real"?

Which one does the audience want?

And whose perspective should matter — the creator defending their process, or the audience who came for the experience?

The fetish of process

Somewhere along the way, we started confusing two things: the value of the work and the value of the process.

"Hand-made" became a quality label. Like organic on eggs. Like craft on beer. It implies: someone worked hard, so it must be good.

But that's a fetish. Process doesn't guarantee quality. The creator's suffering doesn't guarantee artistry. A thousand hours at the keyboard doesn't guarantee a good book. Manual work is not a synonym for excellence. It is a synonym for one particular method. Not a result.

The result is what reaches the audience. What they read, what they watch, what they listen to, what they experience. And the audience doesn't see the process. They see the effect. And either that effect is outstanding — or it isn't.

Money, time, and risk

There's another dimension nobody talks about. The economic dimension.

Theater tickets cost money. The commute costs money. The evening costs money. Two hours of life cost money. And when you leave a mediocre play — that cost is gone forever. You invested — and got a low-quality product.

The same applies to a book. A podcast. An online course. An app. Every intellectual product we pay for — with money, time, attention.

If AI enables the creation of products polished in every detail — if it reduces the risk that the audience wastes their time and money — that's not a degradation of creativity. That's respect for the audience.

Because hand-made mediocrity isn't craftsmanship. It's laziness in elegant packaging.

The sound that changed everything

The works of Bach are genius. Beethoven's symphonies move us to tears. Nobody in their right mind disputes that. This is music written by hand, in notation on paper, for acoustic instruments, by people who devoted their lives to the craft.

And then, in 1976, Jean-Michel Jarre recorded "Oxygène." He had no orchestra. No choir. He had synthesizers, cables, and a vision. The album sold over 12 million copies and opened the door for an entire genre of electronic music. Critics said: this isn't music. These are machine sounds.

In 1983, Yamaha released the DX7 synthesizer. Suddenly, any musician could generate sounds that had previously required an orchestra. Brian Eno used synthesizers to create ambient music. Kraftwerk built an entire aesthetic around them.

And then, in 2001, Daft Punk released "Discovery." Music made almost entirely by machines — samplers, synthesizers, vocoders. An album now considered one of the most important musical works of the 21st century. "Digital Love," "Something About Us," "Harder, Better, Faster, Stronger" — compositions that move, touch, and stay in your head for years. Not a single "classical" note played on a "real" instrument was involved.

Is "Oxygène" inferior to the "Pastoral Symphony"? Is "Discovery" less valuable than Vivaldi's "Four Seasons"? The question sounds absurd — because it is absurd. These are different tools, different eras, different forms of expression. But the artistry is the same.

Every time a new musical tool appeared, the reaction was the same: "it's not art because it's not handmade." Every time, it turned out that the new tool didn't kill creativity. It opened a new dimension. People who couldn't play the violin could now make music. People who had no orchestra could compose symphonies. The barrier fell. Creativity exploded.

AI is the DX7 synthesizer of our era. And the reactions are identical.

The hidden assumption

Behind the disparagement of AI-assisted content hides an assumption that few people state outright: that the creator's suffering is part of the work's value.

That a text is better because the author sweated over it for three months. That a painting is more valuable because the artist starved. That software is "more real" because the programmer debugged it through the night.

This is a romantic myth. And like most romantic myths — it is beautiful and untrue.

The audience doesn't care how many hours you spent at your desk. They care whether the text changed their thinking. Whether the music moved something inside them. Whether the app solved their problem. The value of a work is in the work, not in the process of its creation.

Picasso painted "The Dove" in a few minutes. Nobody said it was worth less because it didn't take a month.

The honesty problem

There is one real question in this debate that deserves a serious answer: transparency.

If someone creates with AI's help — should they say so? Does the audience have a right to know?

Yes. Our site says so openly. In the "About" section we state it clearly: the ideas, theses, and directions are human. The writing, editing, and structure are a dialogue with AI. The final control is human again.

This is honest. And it should be the norm. Not because AI devalues the content, but because transparency is respect for the audience. Just as editors are listed on the title page. Just as ghostwriters, though often hidden, should be disclosed.

But transparency is not an apology. Saying "this text was created in dialogue with AI" is not a confession of guilt. It's information about a method. Just as "this bridge was designed with the help of a computer" doesn't mean the bridge is worse than one drawn with a pencil on tracing paper.

What if this is exactly the point?

Let's pause for a moment. Set aside fears, ideologies, anxieties.

What if a world where people with ideas have access to tools that help them express those ideas with the highest precision — is exactly what creativity is about?

For centuries, the barrier wasn't thinking. People have always thought deeply, provocatively, beautifully. The barrier was expression. Not everyone who had a brilliant thought could write it down in a way that moved millions. Not everyone who had a vision for an app could code it. Not everyone who felt music could play an instrument.

AI lowers the barrier of expression. Not the barrier of thought. The thought still must be human. The experience still must be human. The question still must be human. But the form — the form can be better than ever before.

And if that form means that more people read with pleasure, more people experience intellectual satisfaction, and more people leave a text, a play, or an app feeling that their time was well invested, isn't that exactly the point? In art, science, philosophy?

· · ·

We're not claiming that everything created with AI is good. Plenty of it is bad — just as plenty of "handmade" texts are bad. Quality is not a function of the tool. It is a function of intention, thought, taste, and intellectual effort — regardless of whether that effort is supported by a pencil, a keyboard, an editor, or an algorithm.

Next time you hear "AI wrote it" as an accusation — ask: "did you read it?" Ask: "did you like it?" Ask: "did it move you?"

Because if the answer is "yes" — the tool is irrelevant. And if the answer is "no" — it's not AI's fault. It's the fault of whoever had nothing to say.

"Hand-made" doesn't mean better. It just means: made differently. And "differently" is not a quality criterion. It is a description of a method.