Digital Divinity: Ancient traditions meet modern technology

Rest of World partners with the Henry Luce Foundation to tell stories about how technology is changing religious faith around the world:

This illustrated storybook represents a broad spectrum of themes and trends playing out across a number of religions and countries, ranging from Hindu temples made by 3D printers to priests who dance on TikTok. They speak to the unraveling tensions of our time as people turn to technology to simplify their lives, search for answers, or find platform-born fame.

Submitted by jboy (via)

Calculating Empires: A Genealogy of Technology and Power Since 1500

From Kate Crawford and Vladan Joler, who previously collaborated on Anatomy of an AI System, this visualization explores the mutual shaping of social structures and technological systems since 1500.

The aim is to view the contemporary period in a longer trajectory of ideas, devices, infrastructures, and systems of power. It traces technological patterns of colonialism, militarization, automation, and enclosure since 1500 to show how these forces still subjugate and how they might be unwound. By tracking these imperial pathways, Calculating Empires offers a means of seeing our technological present in a deeper historical context. And by investigating how past empires have calculated, we can see how they created the conditions of empire today.

Make sure to check out the five-minute audio tour.

Submitted by jboy

The Deadly Digital Frontiers at the Border

In Time Magazine, Petra Molnar discusses her research on border technologies:

We need stronger laws to prevent further human rights abuses at these deadly digital frontiers. To shift the conversation, we must focus on the profound human stakes as smart borders emerge around the globe. With bodies becoming passports and matters of life and death determined by algorithm, witnessing and sharing stories is a form of resistance against the hubris and cruelty of those seeking to use technology to turn human beings into problems to be solved.

Submitted by jboy (via)

What changes when connectivity is rooted in communities?

The Association for Progressive Communications newsletter covers community networks from Colombia to Nigeria:

By being rooted in their own communities and encouraging collective articulation, a community network can become a catalyst for rethinking digital spaces and building more inclusive practices, taking into account, say, inequalities of gender, race and those that impact people with disabilities – as the pieces collated for this issue show.

Submitted by jboy

Observers Observed: The Ethnographer in Silicon Valley

In a contribution to a series of essays on Silicon Valley for the venerable academic blog Crooked Timber, Tamara Kneese writes about being an ethnographer in the world of tech:

What do the stories of the many generations of ethnographic researchers who joined and sometimes left the tech industry have to tell us about how Silicon Valley ideologies are taken up, embedded, and contested in workflows and products? How do the collected personal stories, or oral histories, of UX researchers interface with those of tech campus janitors and engineers? And is there something valuable that can be learned from their varied experiences about the sometimes ambivalent relationships between research, work, and collective action?

The introductory post to the series (with links to all contributions), penned by Henry Farrell, can be found here.

Submitted by jboy

This new data poisoning tool lets artists fight back against generative AI

Melissa Heikkilä reports on a new tool for artists for MIT Technology Review:

A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. …

Nightshade exploits a security vulnerability in generative AI models, one arising from the fact that they are trained on vast amounts of data—in this case, images that have been hoovered from the internet. Nightshade messes with those images.

Submitted by jboy (via)

The enshittification of academic social media

The Thesis Whisperer on social media for academics – and why it may be a good idea to step away:

Telling academics they can achieve career success by using today’s algorithm-driven platforms is like telling Millennials they could afford to buy a house by eating less avocado on toast. It’s a cruel lie because social media is a shit way to share your work now.

Not a little bit shit either. Very shit.

Submitted by jboy (via)

I’m a Luddite (and So Can You!)

In this new comic in The Nib, Tom Humberstone explains what the Luddites can teach us about resisting an automated future:

In truth, the Luddites were skilled with machines. They were simply fighting for better workers’ rights.

Submitted by jboy

Confronting Tech Power

AI Now’s annual report diagnoses the challenge of concentrated power in tech – and seeks ways to bring change to the industry.

We intend this report to provide strategic guidance to inform the work ahead of us, taking a bird’s eye view of the many levers we can use to shape the future trajectory of AI – and the tech industry behind it – to ensure that it is the public, not industry, that this technology serves.

Submitted by jboy (via)

See how biased AI image models are

MIT Technology Review covers research by Alexandra Sasha Luccioni, Christopher Akiki, Margaret Mitchell, and Yacine Jernite about bias in generative text-to-image models like DALL-E 2 and Stable Diffusion:

After analyzing the images generated by DALL-E 2 and Stable Diffusion, they found that the models tended to produce images of people that look white and male, especially when asked to depict people in positions of authority. That was particularly true for DALL-E 2, which generated white men 97% of the time when given prompts like "CEO" or "director." That’s because these models are trained on enormous amounts of data and images scraped from the internet, a process that not only reflects but further amplifies stereotypes around race and gender.

Submitted by jboy (via)