Useless Knowledge

countess × (x²) ≤ a father’s love ÷ empty days in a stately home, where x = ∬ {the latest hat from Paris}


In her final work, “Dark Age Ahead,” the great Jane Jacobs lamented the rise of what she called “credentialing vs. education.” In this model a university degree is now considered necessary for decent, well-paid employment, but not because it is evidence of the highest levels of learning and expertise in a particular field of study.

Instead, it is simply a credential that proves to a prospective employer that the degree-holding graduate has jumped through the necessary hoops and shown that her values and skills are in alignment with those of the company. By graduating, she is by definition reliable, honest, able to turn up on time, work well with others and finish projects efficiently and on budget.

Paraphrased, it doesn’t matter so very much what you’ve studied, but it does matter that you’ve been mixed, rolled and stamped out with the cookie cutter of compliance. Universities are just a pre-filtering step in the assembly-line process of creating corporate workers. Go ahead and finish your thesis on Lesbian Poetry Cave Paintings of the Early Neanderthal—what’s really important is you can work until midnight without complaining. Go team, go!

We are in thrall to the concept of “efficiency”: getting the most end product from just the right amount of time and resources, which boils down to maximizing profit by minimizing or, even better, externalizing expenses (getting others to foot the bill).

What do you acquire at university? Useless knowledge. Knowledge without “practical” application.

Or, as another great Canadian writer, Alice Munro, has framed it, through one of her skeptical, small-town female characters: “Who do you think you are?”

In 1854, self-taught English mathematician George Boole had the inventive notion to apply the rigorous procedures of algebra to logical thought, and by doing so made possible a digital revolution that would begin some eighty years later.

Boole’s concept was a set of operations that, working with propositions instead of real numbers, and using only three operators, AND, OR and NOT, could determine their truth or otherwise (a proposition is a simple statement, for example “the window is open” or “all boys like hockey”).

This method of treating logical propositions with a scientific method and the stringency of mathematics became known eventually as Boolean algebra. And, brilliant an insight as it was recognized to be, no one knew what to do with it.
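Boole’s three operators map straight onto Python’s `and`, `or` and `not`. A minimal sketch, with the propositions invented for illustration (the window and the hockey-loving boys are borrowed from the examples above):

```python
# Boole's three operators over truth values, applied to simple propositions.
window_is_open = True
all_boys_like_hockey = False

# AND: true only when both propositions are true
print(window_is_open and all_boys_like_hockey)   # False

# OR: true when at least one proposition is true
print(window_is_open or all_boys_like_hockey)    # True

# NOT: flips the truth value of a proposition
print(not all_boys_like_hockey)                  # True

# Compound propositions can be built from the three operators alone:
print((window_is_open or all_boys_like_hockey) and not all_boys_like_hockey)  # True
```

That’s the whole toolkit: three operators, and every compound proposition you can dream up.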

So there it sat, interesting but effectively useless, a brilliant abstraction, and without any practical application, for eighty years.

Mary, George’s wife—also a self-taught mathematician; a species which was apparently raging through academia like crabgrass during the early nineteenth century—was not necessarily impressed.

Fun Fact: Mary was a believer in homeopathy, the cretinous quackery that takes as its gimmick the doctrine that a substance causing symptoms of a disease in healthy people would cure similar symptoms in sick people, or “similia similibus curantur”, like cures like. Which effectively means curing by placebo, as there is literally nothing in homeopathic medicines.

At any rate, George walked home in the rain one day after a hard day’s AND-, OR- and NOT-ing at the Uni, and subsequently caught a severe chill. That might have been the end of it had Mary not gone all homeopathic and covered him with sheets soaked in ice-cold water. He died a week or so later. Like kills like. Mary’s reaction was not recorded. Personally, I think she should have been publicly stoned for being a ninny.

At home with George and Mary Boole: An imaginary evening meal.

MARY: “Hello, dear, nice day at the University of Cork, Ireland, as its first mathematics professor, even though you didn’t even go to university yourself?”

GEORGE: “Pretty good. Any more biscuits or did you and your bridge club scarf the lot?”

M: “What did you do today? Any more ponderous speeches to the students about their loose morals? That’s rich coming from you, pumpkin!”

G: “If you must know, as a matter of fact I discovered Boolean operators, which I’ve named after me, obviously, and can be employed to determine the truth or falsehood of logi—”

M: “Tripe?”

G: “Well, really, Mary! At least hear me out before you start tearing everything—”

M: “No, no, darling, PASS the tripe, please. Thank you. And the potatoes, if you’d be so kind.”

G: “Oh, rather. At any rate, this afternoon Harry, Algernon and I chatted up that new prostitute who’s been hanging about at the Royal Society meeting rooms, then we took her for a spin, checked the tires, saw what’s under the bonnet, if you see my meaning. Then… why are you laughing, for heaven’s sake?”

M: “Well, you see—IF (George with prostitute NOT male) AND (tells me about it) THEN (hard to take) OR IF (George catches cold from walking in the rain) AND (given ice bath by me NOT hot cocoa) THEN (sudden death of George from pneumonia). AND (at least I’ll have your insurance policy!)”

G: “My word, you picked that up fast! Jolly impressive!”

George knew that his Boolean algebra had major significance, but was understandably shaky about its possible applications. Boole had simply come up with an interesting but random idea, like a Cockney barrow boy who’d dreamed something up in his spare time when he should have been shovelling coal or selling cherries from a cart in Covent Garden.

Fast forward: In the 1930s, Claude Shannon, an American mathematician, had a flash of insight. It was a world-changing moment. He realized that Boole’s two truth values could be used to represent the low- and high-voltage states of electric circuits, or OFF and ON.

Zero and one…

Boolean algebra applied to the switching of electric circuits! These switches, the zeros and ones, are the machine language of computers, and Boolean algebra would provide the conceptual and practical basis for our entire technological world. Unprepossessing George Boole became the architect of a digital age he could barely imagine, though he spoke of his most important work with a touching sense of purpose. Reading an excerpt from a letter to a colleague in 1851, you can sense his pride and barely contained excitement about the work he was about to commence:

I am now about to set seriously to work upon preparing for the press an account of my theory of Logic and Probabilities which in its present state I look upon as the most valuable if not the only valuable contribution that I have made or am likely to make to Science and the thing by which I would desire if at all to be remembered hereafter …

George Boole, letter of 1851, referring to his groundbreaking “An Investigation of the Laws of Thought, on Which are Founded the Mathematical Theories of Logic and Probabilities” (1854)

But the results of his work really started life as a succès d’estime. In other words, useless knowledge.
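Until Shannon, that is. His mapping is easy to sketch: treat 0 and 1 as the two circuit states, define Boole’s three operators over them, and compose them into arithmetic. (The half-adder below is the standard textbook example, not Shannon’s own circuit.)

```python
# Shannon's mapping: Boole's operators as functions of the circuit states 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Composing the three gates yields new ones. XOR from AND, OR and NOT:
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

# A half-adder: adds two one-bit numbers, producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum, carry)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```

Chain enough half-adders together and you can add any two numbers, which is to say: logic became arithmetic, and arithmetic became circuitry.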

Because I am shallow, and a male, I take huge, puerile delight in the fact that the first person to envisage the universal computer was named Ada, Countess of Lovelace. <snicker>.

Man up, David. This is the big time, OK?

Augusta Ada King (1815–1852) was the daughter of Lord Byron, a.k.a. Count Bisexual, and his only “legitimate” child. He adored the baby (“if my name were Ada, I’d be Ada; even backwards I’d be Ada”), but nonetheless abandoned her after four months to the care of her mother and continued gallivanting around the eastern Mediterranean, writing the occasional epic poem for posterity and gifting his magnificent body, lily-of-the-valley eau de Cologne and epicene facial features to all and sundry. He did manage to squeeze out the word “Ada” with his dying breath.

Jeepers. There’s nothing like the devotion of an obsessive helicopter dad, right?

Ada King was wealthy and grand and eccentric enough that she was able to devote herself to her preferred useless pastime, which was—no, Murgatroyd McGraw, not needlepoint or home production of quince jelly, but mathematics. Mathematics, the only perfect language humankind has discovered apart from music. I think she may also have dabbled in watercolors, invented ciphers and then written secret messages in her journal and even tinkered a bit at the old Joanna, as they say, but math was her “thing.”

Everyone who wasn’t of the aristocracy believed that the aristocracy were just a bunch of useless tits, so nobody got on Ada’s case about “doing something meaningful” or accused her of wanting “something for nothing”. They never expected anything meaningful from a countess to begin with, and her inherited wealth was the something-for-nothing that’s always okay to those who have it, because they got there first.

Of course, there was at the root of this disdain her unfortunate choice of being a woman in an era when women were believed to be barely above cattle in terms of intelligence and common sense. And so this particular useless tit had LOTS of “me” time.

Now, under normal circumstances, Ada’s wealth could have funded a typical, tastefully extravagant and useless aristocratic life. Parties, balls, more parties, dancing lessons (the quadrille was big at the time), evenings at the Royal Opera House, a fashionable jaunt to see the Coliseum and the Pyramids, then back to more parties and more balls.

Not Ada. Instead, Ada spent her time becoming a mathematical genius. This girl did equations like an Olympic gold medallist does press-ups and chin-ups (unlike our attempts at algebra, where we more resemble a Diane Arbus photograph of special education students with Down Syndrome being group-punished in an asylum).

Ada’s training proves my hypothesis that real freedom—have you heard, apparently there is a type that involves more than one option to choose from, who the hell knew?— is, after all, everything to do with money. Money buys you two extremely valuable commodities: leisure and privacy.

Leisure: Because you don’t have to do anything, you can outsource every single tiresome thing except eating, breathing, defecation and micturition.

Privacy: Because now that you are no longer forced to do anything distasteful or boring, you tend to talk endlessly about the one or two things you’re obsessed with: your new glass-plate photography kit, your experiments calling up the dead with a Ouija board, or building a full-scale replica of the Trianon in the backyard. This yields privacy because it is not easy to be around.

The lifelong process by which Ada took her above-average-intelligence burger and, with extra privacy and leisure on the side (and maybe a dill pickle), supersized it to genius also proves my theory that genius would probably arise more commonly if we all had Universal Basic Income and could sit around staring into space.

I do this already, so, like, hellooooo, genius, but if you could give it at least a try? We might want to start getting somewhere eventually. The rest of us would appreciate it.

So it came to pass that Ada, at the age of seventeen, met Charles Babbage (1791–1871), an English mathematician who, you guessed it, taught himself algebra and calculus while in his teens, went on to great achievements at Trinity College, Cambridge, and in 1828 was made Lucasian Professor of Mathematics. (Sir Isaac Newton and Stephen Hawking also held this professorship.)

Babbage was, it is fair to say, no slouch with his math abilities, which is understandable considering every person he knew was a self-taught mathematician. Unfortunately he had what I would delicately call, so as not to offend any touchy old people hanging around, “a shitty personality.” Sorry to be so blunt. Call it “shitty personality,” call it borderline; Chuck was a tad difficult to get along with and this would have what we call “far-reaching consequences.”

(When I make this into a podcast, this is the part that will get a little “oomph” from some underscoring, perhaps music with an ominous mood. I’m thinking second movement of Schubert’s “Death and the Maiden” string quartet, and if you have that on CD, what is wrong with you? Fire up Spotify, luddite, and play away before the twentieth century opens its clammy arms and reverse-engineers you!)

Enter Ada, whose mother had insisted on her rigorous education in mathematics so she wouldn’t turn out “crazy, like her father” (crazy was the current psychiatric term for bisexual). So by the age of seventeen, Ada was a true débutante: accomplished at just about everything, charming in that “I’ve got more brains than you” way, and discussing Bernoulli numbers as casually as you and I discuss what mattress cover we’re planning to order from Wayfair, or whether Kamala just forgot to be Black enough or if it’s a strategy and she’s in cahoots with Trump to blow the whole thing.

Bernoulli numbers. Just saying “Bernoulli numbers” makes me feel smarter! Even though I couldn’t tell a Bernoulli number from a pile of lukewarm fettuccine, with sauce.

Anyway. Babbage got funding for his Difference Engine, a mechanical, room-sized calculator that was not programmable. It got partially built but then Babbage’s shitty personality kicked in. He had a major fight with his head engineer, and suddenly his investors couldn’t see the point of all this “new-fangled stuff”—also because this was England, so, all together now: “Why would you ever want to change anything?”

With the Difference Engine still unfinished, and with his investors becoming less and less sympathetic to his project and refusing to fund him any further, he started designing his Analytical Engine. This was even grander and more complex than the first, and now its architecture included the feature of conditional branching, meaning that, in hindsight, it was a digital, programmable and Turing-complete computer.
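To see why conditional branching is the magic ingredient behind that “Turing-complete,” here is a toy sketch: a minimal machine whose only control instruction is “jump if zero.” The instruction set is invented for illustration and bears no resemblance to Babbage’s actual design.

```python
# A toy machine with one conditional branch instruction, "jz" (jump if zero).
# With it, the machine can loop and make decisions, which is what separates
# a programmable computer from a glorified adding machine.
def run(program, registers):
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "dec":            # decrement a register
            registers[args[0]] -= 1
        elif op == "add":          # add one register into another
            registers[args[1]] += registers[args[0]]
        elif op == "jz":           # conditional branch: jump if register is zero
            if registers[args[0]] == 0:
                pc = args[1]
                continue
        pc += 1
    return registers

# Multiply r0 by r1 into r2 using only the loop that "jz" makes possible.
prog = [
    ("jz", "r0", 4),     # counter hit zero? fall out of the loop
    ("add", "r1", "r2"), # r2 += r1
    ("dec", "r0"),       # r0 -= 1
    ("jz", "r3", 0),     # r3 is always 0, so this jumps back unconditionally
]
print(run(prog, {"r0": 3, "r1": 5, "r2": 0, "r3": 0}))  # r2 ends up as 15
```

No branch, no loop; no loop, no general computation. The Analytical Engine had the branch.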

But it was Ada, not Babbage, who had this flash of insight. In her words:

[The Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine…

[The Analytical Engine was suited for] developping [sic] and tabulating any function whatever. . . the engine [is] the material expression of any indefinite function of any degree of generality and complexity …

Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.

Not just numbers, but numbers that symbolically represented other values. Musical notes; letters of the alphabet… a universal computer.

But she had one final trick up her satin sleeves. Babbage had given a seminar on the Analytical Engine in Turin, and Ada translated a written account of it, adding her own notes—in typical smart-alecky smart-girl fashion, her notes ran longer than the paper itself, and they are the chief reason for her fame. Among them she included an algorithm for the Analytical Engine to compute, yes, Bernoulli numbers. She is cited as the first computer programmer for this reason.
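For the curious: this is not Ada’s Note G algorithm, which was written for the Engine’s own architecture, but a modern sketch of the same Bernoulli numbers, using the standard recurrence in which each number is determined by all the ones before it.

```python
# Bernoulli numbers via the standard recurrence
#   sum_{j=0}^{m} C(m+1, j) * B_j = 0, solved for B_m
# (the convention with B_1 = -1/2). Exact arithmetic via Fraction.
from fractions import Fraction
from math import comb

def bernoulli(n):
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(Fraction(-acc, m + 1))
    return B

print([str(b) for b in bernoulli(6)])  # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

The odd-indexed ones (past B_1) all vanish, which is the sort of thing that presumably delighted Ada and leaves the rest of us staring at our fettuccine.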

A Babbage engine was finally built to his specifications in 1991—though take my confident air with a grain of salt here: it was the Difference Engine No. 2, constructed at London’s Science Museum; the Analytical Engine itself has never been built. (I was pretty drunk throughout the nineties. Like, there are big patches of vodka and tonic.)

The engine, once built, worked. The first ENIAC computers from the forties, handy for home use if you had your own power plant and a garage the size of an airplane hangar, had a clumsier architecture. The design of the Analytical Engine would have come in mighty handy, and an engineer of the ENIAC is said to have wept when he saw it, but the Engine had been obscured by the misty mists of time.

Babbage’s remarkable work was rightly celebrated. But Ada’s partnership and contributions were not, until very recently. In two pages of search results for Babbage, and in one lengthy article from Stanford University discussing his life and influence, Ada King, the woman who had the genius to see his earthbound cogs and gears and imagine them creating an enthralling music of the future, is not mentioned once.

Ada King, Countess of Lovelace, died at the age of thirty-six, of uterine cancer, and was buried beside Lord Byron, the father she had never known, whose dying wish was to see his beloved Ada again.

We could have had computers in the 1850s. And as we build computers in a lackadaisical way and then build on top of the shoddy architecture forever, they would have retained their Victorian aesthetic. Giant oak computers, like church organs, dominating the parlor, completely crowding out the beloved upright piano as the new family hearth. Just Mummy and Pa and little Ada, and St.-John, gathered around “The One and Only Babbage Family Entertainment and Learning Engine! Patents Pending in all the Kingdoms of Europe!”

Holding hands, eyes sparkling with the magic of the Modern Age, they would sing Bernoulli songs and play Whack the Golliwog, which would be their racist, imperialist version of a game with aliens. There would be no space flight because the OS was built of—well, oak. Beams of oak.

Big oak computers in our homes and offices and stock markets and banks. You know that they would never have basically changed, right? With double opt-in, it would take a week to sign up for someone’s marketing campaign, which would be delivered to your door by an employee of the Babbage Company. We’d feed our oak computers punch cards to tip off the telegraph operator, Pippa, that we needed to access our bank accounts and then keep the punch cards locked up in case of identity theft.

Every Babbage Remarkable Machine would still have a built-in umbrella stand, a stereoscopic viewer, a make-up mirror for the ladies’ toilette, and a fold-out tray for the teapot.

When you shut the Babbage down, it would play “Rule, Britannia!”

