Who the Machine Says You Are

Algorithmic character, population management, and the philosophers we forgot to invite

Dr. Kem-Laurin Lubin · 9 min read

On how AI inherited rhetoric’s most dangerous art, and what it means for all of us.

It would be convenient to treat all this as a matter of merely theoretical interest, a clever transposition of ancient categories onto modern technology, the sort of thing one publishes and then discusses at conferences. But the consequences of algorithmic ethopoeia are brutally concrete, and they fall disproportionately on those least equipped to contest them.

What computational rhetoric reveals

The ancient rhetoricians understood something that our technologists have largely forgotten: that the construction of character is a political act. When Lysias wrote speeches for Athenian litigants, he was not merely helping them communicate; he was constructing their ethos—their character as perceived by the jury. Aristotle codified this in the Rhetoric: persuasion depends not only on the logic of your argument or the emotions of your audience, but on who they believe you to be.

Algorithmic systems do the same thing, only at scale and without the courtesy of telling you. Every credit score, every risk assessment, every recommendation engine is engaged in ethopoeia—the construction of your character. The machine says who you are, and then it acts on that determination.

The consequences are not abstract

Consider the well-documented case of clinical risk algorithms in American hospitals. As researchers at institutions including Cedars-Sinai have shown, these systems consistently underestimate the severity of illness in Black patients. The algorithm constructs an ethos—a character profile—that says, in effect: this patient is less sick than they actually are. The consequences are not theoretical. They are measured in delayed treatments, missed diagnoses, and preventable deaths.

Or consider hiring algorithms, which construct the ethos of job applicants from patterns in historical data. When that data reflects decades of discriminatory hiring practices, the algorithm faithfully reproduces them—not out of malice, but out of mathematical fidelity to a biased past. The machine says who you are, and who you are is who people like you have been allowed to be.
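The mechanism is simple enough to sketch. The toy model below (all data hypothetical, and far cruder than any production system) does nothing but count hiring rates in a biased historical record; presented with two identical, equally qualified applicants, it scores them differently purely because of group membership. This is "mathematical fidelity to a biased past" in miniature.

```python
# Toy illustration with hypothetical data: a "model" that learns
# hiring rates from biased historical records and reproduces the bias.
from collections import defaultdict

# Historical records: (group, qualified, hired). Groups A and B are
# equally qualified here, but B was hired far less often -- that gap
# is the bias the model will faithfully inherit.
history = (
    [("A", True, True)] * 80 + [("A", True, False)] * 20
    + [("B", True, True)] * 30 + [("B", True, False)] * 70
)

def train(records):
    """Estimate P(hired | group) by simple frequency counting."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for group, _qualified, was_hired in records:
        total[group] += 1
        hired[group] += was_hired
    return {g: hired[g] / total[g] for g in total}

model = train(history)

# Two applicants identical in every respect except group membership
# receive very different scores.
print(model["A"])  # 0.8
print(model["B"])  # 0.3
```

Nothing in the arithmetic is malicious; the discrimination lives entirely in the training data, which is the essay's point.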

The poverty of “ethics” without philosophy

The technology industry’s response to these problems has been to create “ethics boards” and publish “principles.” The irony is rich. Ethics, in the philosophical tradition, is not a set of principles one posts on a website. It is the systematic investigation of how to live well—a practice that requires exactly the kind of sustained, dialectical inquiry that the Socratic tradition exemplifies.

When Google disbanded its AI ethics team, when Timnit Gebru was shown the door for asking inconvenient questions, the message was clear: ethics is welcome as decoration, not as practice. The philosophers were not invited to the table because they might actually do philosophy—which is to say, they might ask questions that cannot be answered by shipping a product update.

Values are architectural decisions

Every dataset encodes a set of assumptions about what matters, what counts, and who counts. Every model architecture embodies a theory of relevance. Every deployment decision is a judgment about who bears the costs of error. These are not technical decisions that happen to have ethical implications. They are ethical decisions that happen to be implemented in code.
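A worked example of that last claim: choosing a decision threshold on a risk score is exactly a judgment about who bears the costs of error. The sketch below uses hypothetical scores; nothing in it corresponds to any real system.

```python
# Toy sketch with hypothetical scores: the cutoff on a risk score
# decides who absorbs the system's mistakes.

# (score, needs_care): risk scores for people who do (1) or
# do not (0) actually need care.
cases = [(0.9, 1), (0.7, 1), (0.6, 1), (0.4, 1),
         (0.8, 0), (0.5, 0), (0.3, 0), (0.2, 0)]

def error_costs(threshold):
    """Count who is failed at a given cutoff."""
    false_neg = sum(1 for score, needs in cases
                    if needs and score < threshold)
    false_pos = sum(1 for score, needs in cases
                    if not needs and score >= threshold)
    return false_neg, false_pos

# A higher threshold denies care to people who need it; a lower one
# flags people who do not. The math supplies the trade-off curve,
# but not the choice of where to sit on it.
print(error_costs(0.65))  # (2, 1): two sick people missed
print(error_costs(0.35))  # (0, 2): no one missed, two over-flagged
```

The accuracy numbers are identical in kind at both cutoffs; what changes is which population pays for the errors. That allocation is a value, not a theorem.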

Aristotle would have recognized this immediately. In the Nicomachean Ethics, he argues that the virtues are not abstract principles but habits of action—dispositions that are built into the structure of a life through repeated practice. The same is true of algorithmic systems. Their “values” are not stated in a principles document; they are built into their architecture through repeated training.

Why philosophy belongs at the table

The construction of character—whether by a speechwriter in ancient Athens or an algorithm in modern Silicon Valley—is too important to be left to engineers alone. Not because engineers are incompetent, but because the questions at stake are not engineering questions. They are questions about justice, identity, power, and the kind of society we want to live in.

These are questions that philosophers have been thinking about for twenty-five centuries. It is long past time we invited them back to the table.

The ancient question in modern dress

Gorgias claimed that rhetoric was the art of persuasion. Plato responded that persuasion without knowledge is mere flattery—a knack, not an art. The same debate is playing out today, only the rhetoric is computational and the stakes are civilizational.

The machine says who you are. The question is whether we will let it have the last word.