Men are Not Broken by IceMountainFire

A must read:

A wry smile for those over fifty or so

“Women’s magazines are typically associated with check-out lines in grocery stores, where they sport loud headlines that either promise a “beach body” in ten days, or describe exciting new ways to please a man in bed. (Back in the old days, all you had to do was show up.)” Erica Verrillo


The Problem with Content Filters

A while ago, I sent an email to libraries informing them of several new books I’d published, hoping to interest them in acquiring one or two.  The email was blocked by mail filters for “inappropriate content”.    Here are the book descriptions I’d used:

Fighting Words: notes for a future we won’t have.  Fact-driven fiction.  Speculative.  Provocative.  What could’ve been.  What should’ve been.  Contents: Damages, Home for Unwed Fathers, Fighting Words, Comedown, What Sane Man, Sweet Sixteen, Ballsy, Justified, It’s a Boy, Men Need Sex, How We Survived, The Knitting Group, The Mars Colonies, A PostTrans PostPandemic World, Unless, Alleviation, The Women’s Party, My Last Year

Jess.  Jess used to be a man.  Then he found himself in a female body.  It wasn’t funny.  (Why would anyone even think it was?)  A novel about male privilege.

CottageEscape.zyx: Satan Takes Over.  First there was the pandemic, and people rushed to the north to spread the virus like rats leaving a plague ship.  Then there were the rentals, because other people, eager to capitalize on the pandemic, rushed to develop every last bit of shoreline and turn it into five-star accommodations for the rats. Then, well, all hell broke loose.  The sequel to TurboJetslams: Proof #29 of the Non-Existence of God.

This is what happens.  “An incisive reflection on how social forces constrain women’s lives.  … Great for fans of Sylvia Plath, Doris Lessing’s The Golden Notebook.”  Booklife.  “This is what happens ranks in my top five of books ever read.”  Mesca Elin, Psychochromatic Redemption

This Will Not Look Good on My Resume.  Everyone gets fired at least once in their life.  And if not, well, they’re just not trying very hard.  And we all think of brilliant and immature ‘shoulda saids’ and ‘shoulda dones’ for weeks after.  (Okay, years.)  A quirky bit of fun that slaps you upside the head.  “Ya made me snort root beer out my nose!”  Moriah Jovan, The Proviso

Any guesses as to what word was inappropriate?  ‘Sex’?  ‘Female’?

Is this because the people who create the algorithms are typically young men with sex on the brain?

Or is it because an algorithm can’t determine context?  “Men Need Sex” was presented as the title of a story; I wasn’t saying that men need sex. (1)

And maybe that’s because the programmers can’t determine context. They’re probably Computer Science students.  Perhaps even C students.  Whose reading skills, let’s be honest, are probably not that good.  They aren’t, after all, Literature or Philosophy students.  They have no training in subtext, nuance …  They have trouble with complex sentences.  They have trouble with passages that are longer than a paragraph.  (I’m not being flippant; I have years of experience teaching such students.)  This is important because context determines meaning.

I am reminded of the principal, no doubt a C student who became a gym teacher before he became a principal, who reprimanded me for swearing in the classroom.  I had updated Shakespeare’s ‘Fie upon it!’ to ‘Fuck you!’  Even after half an hour of ‘I wasn’t swearing; I was quoting a character who was swearing’, he was unable to comprehend the use-mention distinction.  (And if you don’t understand the difference, you can’t begin to understand the harm of not understanding the difference.  The harm of censoring both.)

And about a week ago, I posted the following to a subreddit:

Weather websites and municipalities are issuing air quality alerts because of the smoke coming from several uncontrolled forest fires, and yet people in cottage country just a few hours north of Toronto have smokepits going all day [this was before the prohibition of ALL campfires], presumably so they won’t be bothered by mosquitoes.

Never mind that said smokepits fill the whole neighbourhood with toxic smoke, worse by far than that coming from the aforementioned forest fires.

Never mind that there are non-toxic (and non-trespassive) alternatives like zapper racquets and protective clothing.

A confrontation with a neighbour about this ended in a not-quite death threat: it took less than sixty seconds for the man to go from “Mind your own business!” (I am: when your smoke crosses over onto my property, it becomes my business) to “Fuck off, bitch! Why don’t you smash your head with rocks, then jump in the lake and drown?!” (Seriously. That’s what he said. He was practically foaming at the mouth.)

Surely, this is a metaphor. Our environment, our world—in this case, the very air that we breathe—is visibly overburdened, and people just carry on, doing whatever it is they’ve always done, or want to do, regardless of the consequences.

And I immediately received this response:

Sorry, this post has been removed by the moderators of r/canada.

Moderators remove posts from feeds for a variety of reasons, including keeping communities safe, civil, and true to their purpose.

So I sent a note to the moderators:

May I ask why my recent post was removed? I described uncivil behaviour; I myself was not uncivil.

Given the speed with which the decision to remove my post was made (almost instantaneous) and the slowness of the response to my query as to why (seven days and counting), I assume the first ‘decision’ was made by an algorithm and the second is being made by a human.

But here’s the thing: if you’re going to allow an algorithm to be the gatekeeper of free speech, you should at least code it to recognize the difference between using words and mentioning words.  (2)  And that difference depends on, surprise, context.
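To make the point concrete, here is a minimal sketch, in Python, of the kind of context-blind keyword filter I suspect was doing the gatekeeping (the blocklist is hypothetical; I obviously don’t have anyone’s actual code).  It flags a message whenever a blocked word appears, so it treats mentioning a word exactly the same as using it:

# Sketch of a context-blind keyword filter.  The blocklist is hypothetical;
# this is not any mail provider's actual code.
BLOCKLIST = {"sex"}

def is_inappropriate(text: str) -> bool:
    # Flag the text if any blocked word appears anywhere in it.
    words = {w.strip(".,!?:;'\"").lower() for w in text.split()}
    return not BLOCKLIST.isdisjoint(words)

print(is_inappropriate("Men need sex."))                             # True: the word is used
print(is_inappropriate('Contents: a story titled "Men Need Sex".'))  # True: the word is only mentioned

Both come back flagged.  The filter has no way of knowing that the second sentence merely names a title.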

More recently, I used an AI image generator to produce an image of a ‘red spaceship’.  All of the images were chunky, nothing but straight lines.  I wanted something sleek and curvy.  So I changed my input to ‘red spaceship with curves’ and immediately received this message:

This prompt has been blocked. Our system flagged this prompt because it may conflict with our content policy.  More policy violations may lead to automatic suspension of your access.

‘With curves’—my god, do these porn-soaked programmers see everything as sexual? (3)  So this censorship, this unjustified limiting of freedom of speech, is all because of FUCKING MEN?

A friend suggested to me that filters detect whether certain words are in the same sentence.  Okay, that’s a step toward recognizing context.  A baby step.  Literally.  Most babies start with single words, typically nouns or verbs.  ‘Mama’.  ‘Go’.  Eventually, they put a few words together, but we often have to guess at what they mean.  ‘Mama go’ could mean ‘Mama, go away!’ or ‘I don’t want Mama to go!’ or ‘Mama, I want to go!’  (Adult speech is typically less ambiguous because we use all sorts of additional words—conjunctions, prepositions, adjectives, adverbs, pronouns …)  So simply censoring anything with ‘Mama’ and ‘go’ is using inadequate context and, therefore, will be ‘hit and miss’ with respect to meaning.

Sure enough, when I changed my prompt to ‘red spaceship curves’, it was deemed by the bot to be acceptable.  That is, the algorithm let it pass through the gate.  ‘Curves’ in the context of ‘with’ was objectionable; ‘curves’ as a standalone word was acceptable.  But, still, the context, the complete context, was spaceships, not women. (4)
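Here, for what it’s worth, is a sketch of the two slightly-less-naive filters these experiments point to, again in Python and again with made-up lists: one blocks an exact phrase (‘with curves’), the other blocks a pair of words that land in the same sentence, which is what my friend guessed (the ‘Mama’ and ‘go’ case).  The behaviour matches what I saw: the phrase is flagged, the standalone word passes, and in neither version does the filter ever consider the complete context:

# Sketch only: hypothetical lists, not the image generator's actual code.
BLOCKED_PHRASES = {"with curves"}
BLOCKED_PAIRS = {("mama", "go")}

def blocked_by_phrase(prompt: str) -> bool:
    # Exact-phrase matching: what the 'with curves' result suggests.
    return any(phrase in prompt.lower() for phrase in BLOCKED_PHRASES)

def blocked_by_pair(sentence: str) -> bool:
    # Same-sentence co-occurrence: my friend's guess.
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    return any(a in words and b in words for a, b in BLOCKED_PAIRS)

print(blocked_by_phrase("red spaceship with curves"))  # True: blocked
print(blocked_by_phrase("red spaceship curves"))       # False: passes
print(blocked_by_pair("Mama go"))                      # True, whatever the toddler meant

Neither function knows, or can know, that the curves in question belong to a spaceship.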

So I’m thinking now that much of the censorship, and many of the account closures that have had people screaming, have been due not to the nefarious intentions of administrators but to bad code writing.

And then I came across Wendy Grossman’s article in The Philosophers’ Magazine, wherein she describes how ChatGPT works:

What we currently call ‘AI’ is basically data and statistics.  In providing a response, for example, ChatGPT looks for statistical correlations between the data in its corpus and the prompt you have written.  Its answer doesn’t focus on what’s statistically likely to be true, but on using words that are statistically likely to appear near the ones you’ve used.  It is profoundly, catastrophically stupid.

So I was right.  Algorithms are just identifying words.  They’re also using context, but it’s the context of correlation, not the context of meaning.

And so, like ChatGPT, the algorithms that limit our freedom of speech are catastrophically stupid.

As are, by extension, the programmers who create such algorithms.  And the administrators who decide to use them.

*

And that’s why we should be afraid, very afraid, of AI.  It’s not a fear of technology; it’s a fear of juvenile and mediocre minds having incredible power.

Power not only over what we say, but also, consequently, over what we read.

And, eventually, power over what we are: when algorithms control our feeds, decide on the basis of correlation rather than meaning what news items we see, what music we hear, what books we download, what things we buy, they thus determine what we become.  Because we are what we expose ourselves to. (5)

And they do so mindlessly.  (It would be something completely different if they were doing it mindfully with conscious intent.  Though honestly, I don’t know which would be worse.)

Of further concern is that similarly mediocre minds often refuse to even attempt to override AI.  Many people, as mindless as bots, limit their actions to the incomplete menu provided, when they chat with their friends, when they write their resumes and cover letters, when they compose music …  Worse, customer service agents (at your internet provider, your bank, your government office), also as mindless as bots, stick to the script, refusing to be human, refusing to consider context which might reveal clarification or even exception. (6)  We used to say ‘Think outside the box!’  Now we absolutely need to say ‘Think outside the algorithm!’

_____

  1. Though even if I were, what’s objectionable about that? (Other than that it’s false.)  (And algorithms can’t, will never be able to, distinguish between true and false.)
  2. And that may be impossible: you can’t use a binary language to express non-binary thinking.
  3. Though I recall, back in the pre-internet-porn 80s, hearing a man call cassette tapes housed in clear plastic casing, as opposed to colored plastic casing, ‘sexy’.
  4. Though even mention of a ‘woman with curves’ can be unobjectionable. Depends on the complete context. (‘The only important thing about a woman is whether she has curves’ is saying something quite different from  ‘When you identify a woman by whether or not she has curves, you are objectifying her’.)
  5. Keep in mind that the algorithms for search engines that power our feeds consider popularity to be more important than relevance, let alone quality by any (other) measure.  (And said popularity is often obtained by dumping irrelevant terms, often porn terms, into meta tags …)  They also assign a higher rank when there are pictures.  (Pictures which need not be related to the text in any genuine way …)
  6. I suspect that they’ve been told that if they do depart from the script, they’ll be fired.  In that case, the blame, the responsibility, is on the administrators.  And/or I suspect that departure from the script takes more time, and their activity is being monitored to the second.  Again, the blame, the responsibility, is on the administrators.  Let your employees be human!  (And pay them as humans!)



Excellent piece re PIV

Well worth the read. https://francoistremblay.wordpress.com/2014/11/21/why-is-there-no-counter-argument-from-the-pro-piv-side/

Wendy Grossman on AI and ChatGPT

“What we currently call ‘AI’ is basically data and statistics.  In providing a response, for example, ChatGPT looks for statistical correlations between the data in its corpus and the prompt you have written.  Its answer doesn’t focus on what’s statistically likely to be true, but on using words that are statistically likely to appear near the ones you’ve used.  It is profoundly, catastrophically stupid.”  Wendy Grossman, “The Skeptic: Beyond the Hype” in The Philosophers’ Magazine 99


It’s not poverty, stupid.

Poverty is not the cause of crime. Because even when men have enough, they steal more, kill for more. (Think of all those rich white CEOs running our planet, managing the banks, the oil companies, the logging companies …) (And given that well over 90% of crime is committed by men, well, the cause should be obvious: maleness and/or masculinity.)

Brilliant Sarkeesian

“In the game of patriarchy, women are not the opposing team. They’re the ball.” Anita Sarkeesian


The Pro-Life Position You Don’t Hear



Tim Dorsey, The Pope of Palm Beach

“The sun was going down behind the Big Burger when the alligator came flying in the drive-in window.”

How’s that for an opening line?

And it gets better.  (A feat worthy of great admiration, given that it’s his 22nd novel.)

“Do you have any idea who I am?”
“Yes, why?” said Serge.  “Forgot your own name?”  p36

“Several years ago the young man had been a wrestler in high school, and now he was the guy who talked about having wrestled in high school.”  p120

Oh, and the bit about ‘God helps those who help themselves’ on p294…


Says it all.



