When I sat down to write this article, I planned to write about my contribution to the responsible tech landscape. I would expand on the recent conversation that Dawn Walter and I had on algorithmic bias in recommender systems for the Response-ability Summit podcast.

That seemed fitting, given this piece was a follow-up to that podcast. But as I was writing and rewriting, something kept gnawing at me. My mind returned to the casual, “data-less” bookends of our conversation in which I talked about my early interest in technology, the work I am doing today as a creator, and my book suggestions at the episode’s closing.

What strikes me about those parts of the conversation is that, in many ways, they are the reason I am interested in studying responsible tech and why I feel a tension between my anthropological and technologist identities. They also underscore what I believe should be one of our primary goals as anthropologists, namely, public engagement.

Much like what Dawn is doing with the RAS conference, blog, and podcast, I, too, am passionate about public anthropology and try to reach a broad audience, including academics, industry practitioners, and the public. While some of my efforts will arguably be more effective than others, I have learned that I prefer dialogue over discourse and would argue the former is more effective at making a dent in society.

So with that all in mind, I will share a bit about my work on algorithmic bias, my professional identity struggles, and my perspective on public engagement.

Algorithmic Bias and Recommender Systems

On the RAS podcast, I discussed the research I presented at Why the World Needs Anthropologists this past September. That research is grounded in my work as Head of Product & UX for Artmatcher, a mobile app that, among other things, seeks to lessen issues of access and inclusion in the art market.

The focus was on recommender systems, the algorithms used to surface content on digital platforms. That content takes many forms: search engine results, music, movies, books, social media posts, dating partners, credit card offers, jobs, news, library books, and more than I can name.

The core premise of the conversation was that bias in recommender systems has tremendous economic consequences for the estimated 50 million people who make up the creator economy. Recommender systems are the most common method for suggesting content, and yet they are plagued by biases that favor some creators over others, often resulting in “winner-take-all” markets.
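To make that “winner-take-all” dynamic concrete, here is a minimal, hypothetical sketch in Python. It is not Artmatcher’s system or any real recommender; it simply assumes a naive recommender that surfaces items in proportion to the attention they have already received, so early random luck compounds and equally good items end up with very unequal exposure.

```python
# A toy simulation (purely illustrative) of popularity-ranked recommendation:
# five equally "good" items compete, and the recommender surfaces each item
# in proportion to the attention it has already received.
import random

random.seed(7)

views = [1, 1, 1, 1, 1]  # every hypothetical artwork starts on equal footing

for _ in range(10_000):
    # Surface one item, weighted by its current popularity.
    item = random.choices(range(len(views)), weights=views, k=1)[0]
    # Assume the viewer engages with whatever is surfaced, adding a view.
    views[item] += 1

total = sum(views)
for item, count in enumerate(views):
    print(f"item {item}: {count / total:.1%} of all attention")
# Because the loop amplifies early random luck, identical items typically end up
# with wildly unequal shares of attention, a small-scale winner-take-all market.
```

Real recommender systems are far more sophisticated, but this feedback loop (attention begets attention) is the mechanism that makes an advertising budget or an existing fanbase so decisive for creators.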

More broadly, we discussed how recommender systems mirror the structural inequalities of society, stratified along lines of capital, an externality I am well versed in.

As a creator of blogs, podcasts, music, and visual art, I am all too familiar with the cold reality of how poorly content performs without an advertising budget or an established fanbase. That experience, along with my role at Artmatcher, is one of the reasons I am studying the topic.

But truth be told, I am just one of many researchers interested in algorithmic bias, and in many ways, my perspective is not distinct from theirs. Many of us urge caution for reasons like my own, and because of the many now-infamous examples of algorithms gone wrong.

From Google Flu Trends to Microsoft Tay to the recent Zillow debacle that will see a quarter of its staff laid off, we “non-technologist” researchers know there are serious issues that impact the livelihoods of billions across the globe.

So why, then, do I still secretly consider myself a technologist?

The Tensions of My Professional Identity

Most of my childhood memories involve my first experiences with gaming systems and computers. In the podcast, I recounted how fortunate I was to grow up in a household with a personal computer, which my parents bought around 1986. While this was not the earliest days of the home computing revolution, it was still a time dominated by dark screens with green letters and the now arcane world of bulletin board systems. As terrible as it all looked and functioned, I loved it.

I’d spend hours fiddling with everything possible. Be it software or hardware, I was genuinely curious about how it worked. That curiosity has never left me. I admit it is a little tempered by age, which I like to think is down to a lack of time rather than waning interest, but it is there regardless. I am still an early adopter of many technological developments, including creative AI, which I use to inspire or extend my abilities in nearly every aspect of my creative process.

From one perspective, I take no issue with this, given that technology has always existed to enable humanity. From another, I recognize that my continued use – by which I mean the data I forfeit to train AI algorithms – contributes to the development of a potentially problematic branch of information technology.

This development frequently goes unchecked in the name of a never-ending quest for “growth,” and it happens out of sight inside private organizations chasing competitive advantage, a motive I am sympathetic to even though I know it is problematic. And so there is the tension.

I’m an anthropologist studying technology, with an appreciation for why Luddites feel the way they do. I often speak up against tech’s harmful aspects, such as in my TEDx talk on consumer DNA tests.

But I’m also a technologist. In fact, my entire career, and arguably my life since I was a young person, has been spent in bed with tech. From building custom computers as a teen to founding companies as a young adult, and now leading the product innovation process for other organizations, I have contributed to the development of the information technology industry, for good or bad.

I try my best to do the right thing. And with Artmatcher, I hope I’m designing for good. But with a 15-year history of consulting in the tech sector, I am sure there are at least a few examples where my decisions contributed, even if indirectly, to the kinds of inequalities we discuss at conferences like the Response-ability Summit.

So how do I remedy that?

Public Engagement and New Media

While I may not always be right in the moment, I try to learn and improve. To do that, I call upon the wise lessons of anthropology. I observe, listen, interact, and converse to understand the lived experience of others. But I do this not only to right my wrongs but because I believe we have a duty to engage the broader public constructively.

But what does public engagement even mean? Is it a book that costs $40 and only ships from the global north? Does an academic journal article count if no one outside of academia reads it? How about a keynote where the speaker is rushed off-stage afterward without even addressing pressing questions from the audience?

I am not suggesting those are not useful; in fact, I would encourage everyone to continue contributing in such ways. But I also want to engage in other activities because, for me, public engagement involves participation, not proclamations.

That is why I started the Anthropology in Business and Anthro to UX podcasts. I was no longer content with the one-way blogs and papers I was writing. I wanted to hear and share the perspectives of my colleagues, and learn from them, in a medium that was open and accessible. Those conversations, which involve some of the world’s leading anthropologists, are free of jargon and free of charge.

I’m not the first to do this, nor will I be the last. Truth be told, I wasn’t an early adopter of this trend. In fact, I was late to the party. But that didn’t stop me from jumping in and contributing, nor should it stop you.

Do as you will, but I would like to see us use all forms of media, however emergent or popular they may be, to reach the broadest audience possible. The days of conducting research, writing a few journal articles, and producing a monograph for our peers do not need to be over, but they can change.

Today there is no reason we can’t take a drip approach, distributing the knowledge we acquire throughout the entire research journey in an open, FAIR (findable, accessible, interoperable, reusable), and transparent manner. Not only does this help with trust and accountability within academia and industry, but it also allows us to engage the public along the way.

Have some data? Why not make it available on Zenodo before publishing? Mulling over some initial insights you need to think through a bit more? What about a podcast or informal presentation where you can kick ideas around outside your inner circle? Written a thought-provoking book? Why not “dumb it down” and give a TEDx Talk? Reached rock-star status and can land an invitation to present anywhere? What if you made time for “soft” TV appearances with incredible reach?

Though I may never be invited onto late-night TV to talk about the issues of our time, as Margaret Mead once was on the Johnny Carson show, that does not bother me, nor will it stop me from trying to help anthropology earn a seat at the table.

Will you join me?