
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil is another pick of the ACRL Science and Technology librarians. I started it as a library eBook, set it aside when Klara and the Sun came up in my eBook holds, and got a chance to finish it this morning. I wasn’t sure how interesting a book about algorithms would be, but it turns out the answer is: very.

O’Neil’s own story is also interesting. She started as a mathematician in academia, went to work at a hedge fund, and had an epiphany there that math in the wrong hands could be used to do harm. Since then, her bio on her blog notes, “She hopes to someday have a better answer to the question, ‘what can a non-academic mathematician do that makes the world a better place?’”

Weapons of Math Destruction is one answer; those who read it will be better informed and have the potential to advocate for a better world. O’Neil explains, “Models are opinions embedded in mathematics,” and then lays out how they can become WMDs: they “encoded human prejudice, misunderstanding, and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, were beyond dispute or appeal. And they tended to punish the poor and oppressed in our society, while making the rich richer.” In addition, they can’t make exceptions, and, O’Neil notes, “often punish individuals who happen to be the exception.”

Then she goes through any number of systems where WMDs are functioning: college admissions, predatory recruiting in the for-profit college industry, online advertising, the justice system, employment (hiring, scheduling, health monitoring), credit, insurance, social services, and of course, elections. She touches, along the way, on Facebook, Google, and Amazon and their use of algorithms to control what news and information we see and, through that, influence our decisions. It’s a tough book. Even if, like me, you enjoy a fair bit of privilege, you are probably being impacted in some way by these WMDs. And we are all impacted when the most vulnerable are made more so by these out-of-control tools. They touch nearly everyone in America.

O’Neil does make recommendations about how algorithms can be improved and how companies that deploy them can be held accountable. She recommends reversing much of what makes an algorithm a WMD: make the models and algorithms transparent; notify people and let them appeal or dispute decisions and information produced by algorithms; make things that are considered unethical or illegal in real life also unethical or illegal online, where so much WMD work happens; and test algorithms’ results for bias, prejudice, and unintended consequences. She also advocates for revealing “snake oil” math that isn’t really solving anything but just enables companies to spy on employees and/or make more money (see: workplace wellness programs, which the Computer Scientist has called bullshit on for years). Oh, and do away with the Electoral College.

Most of what worries me about Weapons of Math Destruction is that these WMDs exist because our society prioritizes profits over people, and those hurt most by them are neither powerful nor reaping the profits, so there is little incentive for change. And I wonder if O’Neil is (touchingly) optimistic about regulation. She writes, “the job of algorithmic accountability should start with the companies that develop and deploy the algorithms. They should accept responsibility for their influence and develop evidence that what they’re doing isn’t causing harm, just as chemical companies need to provide evidence that they are not destroying the rivers and watersheds around them.”

As the new Guardian and Consumer Reports study of drinking water just revealed, that’s not actually going so well. I live in a state where there is PFAS in drinking water because chemical companies did not provide evidence and the EPA allowed them to get away with not revealing the harm they were doing. The drinking water study illuminates how regulatory agencies themselves are designed to protect profit-making companies and the bottom lines of municipalities, not people’s well-being. So I can’t say I have a ton of confidence that regulating algorithmic accountability will work, at least not as long as we continue to allow our wealthy corporations/people (one and the same in the USA, don’t you know) to purchase political influence and continue to allow a revolving door between industry and government agencies.

As for O’Neil’s suggestion that context can solve many of the issues surrounding WMDs — I was fortunate to hear Ruha Benjamin give an online presentation in February about her work at the Ida B. Wells JUST Data Lab, which is focused on just that solution. O’Neil and Benjamin both advocate for including input from people (actual human people, not corporation people) and communities impacted by the use and misuse of data as part of the solution. That makes sense; however, see my previous concerns. What incentive is there for the Googles, Facebooks, banks and insurance companies, etc. to listen?

A very clear, challenging read. I’m going to have to think about what I learned and what to do with it.


I read about You May Also Like: Taste in an Age of Endless Choice in a Blog U post by Joshua Kim. Kim wrote that the book made him ponder the way we select books, which is an interesting question for librarians to consider. He also made the point that the book illuminates how bad we are at explaining our own tastes and at choosing what we’ll like, and I thought, “That’s me!”

I’m the person who can never declare definitively my “favorite” of anything: color, book, movie, ice-cream flavor, etc. So well developed was my ability to see the merits of more than one side of an argument, or more than one type of anything, that my father was convinced that when I was in college I was going to be brainwashed in an airport while listening politely to some cult member’s point of view.

I’ve had both good friends and my future husband shake their heads at my music collection (back when said collection was on cassette, and radio stations and the Columbia House music club were my only options for hearing about bands). A friend referred to me as a “musical slut”; the future husband said I was a musical disaster. He seemed frustrated that I appeared to like completely disparate stuff, to “have no taste in music,” when his own tastes were fairly well defined.

It turns out there’s a term for this in the age of the Internet. In You May Also Like, Tom Vanderbilt notes that sociologists Richard Peterson and Albert Simkus call it “omnivorousness,” and that it’s newfangled cultural elitism. One’s eclectic tastes signal status, as liking a particular class of things (for example, being an opera buff) once did. These days my strange CD collection would gain me points if I were trying to impress hipsters or highbrows. I didn’t find this very comforting. I’m not sure what’s worse, to have my taste in music described as weird or as elitist. I think I’ll stick with being a weirdo.

You May Also Like is full of social science studies, past and present (I really liked the historical perspectives), observations about modern shopping and listening patterns, and interesting facts about the psychology of choice. Some of it made me squirm: how many times have I said here on bookconscious that I tend to be skeptical of prize-winning books? Turns out that’s a documented phenomenon: ratings of books on Amazon drop after they win a prize. (One possible explanation is that people who wouldn’t normally read a book like the prizewinner are drawn to it because of the prize and its publicity, so those readers were never a good match for the book and are disappointed.)

Vanderbilt’s writing style made it hard for me to read this book before bed; his tone is a bit scholarly, not off-puttingly so, but not ideal for when I’m at my sleepiest. I finished it yesterday afternoon and found I took much more in. I admire someone who totally geeks out over his or her subject, and I think Vanderbilt does: with 63 pages of endnotes for 226 pages of text, there are often 5-6 references per page. Vanderbilt’s voice isn’t as familiar or conversational as AJ Jacobs’s or Bill Bryson’s, but he does relate some of what he learns to his own experience.

If you like your nonfiction well researched and well written, you’ll like this book. I learned about things I want to follow up on — like Forgotify, a site dedicated to the millions of songs never played on Spotify. I’ll try to notice the subtle clues that an online review may not be authentic and I’ll be more aware of Vanderbilt’s astute point that even if a review is “real” it may be “subject to distortion and biases.” And I’ll be paying closer attention to my own likes and dislikes and those of my friends and family, thinking more critically about how those form and change.

As Vanderbilt concludes, “Trying to explain, or understand, any one person’s particular tastes — including one’s own — is always going to be a maddeningly elusive and idiosyncratic enterprise. But the way we come to have the tastes we do can often be understood through a set of psychological and social dynamics that function much the same, from the grocery store to the art museum. The more interesting question is not what we like but why we like.” That could be an endlessly fascinating thing to explore, now that I’ve read You May Also Like.