What changes when connectivity is rooted in communities?

The Association for Progressive Communications newsletter covers community networks from Colombia to Nigeria:

By being rooted in their own communities and encouraging collective articulation, a community network can become a catalyst for rethinking digital spaces and building more inclusive practices, taking into account, say, inequalities of gender, race and those that impact people with disabilities – as the pieces collated for this issue show.

Submitted by jboy

Observers Observed: The Ethnographer in Silicon Valley

In a contribution to a series of essays on Silicon Valley for the venerable academic blog Crooked Timber, Tamara Kneese writes about being an ethnographer in the world of tech:

What do the stories of the many generations of ethnographic researchers who joined and sometimes left the tech industry have to tell us about how Silicon Valley ideologies are taken up, embedded, and contested in workflows and products? How do the collected personal stories, or oral histories, of UX researchers interface with those of tech campus janitors and engineers? And is there something valuable that can be learned from their varied experiences about the sometimes ambivalent relationships between research, work, and collective action?

The introductory post to the series (with links to all contributions), penned by Henry Farrell, can be found here.

Submitted by jboy

This new data poisoning tool lets artists fight back against generative AI

Melissa Heikkilä reports on a new tool for artists for MIT Technology Review:

A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. …

Nightshade exploits a security vulnerability in generative AI models, one arising from the fact that they are trained on vast amounts of data—in this case, images that have been hoovered from the internet. Nightshade messes with those images.
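Nightshade's actual poisoning algorithm is not public in this excerpt, but the general idea it builds on – perturbing pixels so slightly that the change is invisible to a viewer yet present in the data a model trains on – can be sketched in a few lines. The function below is a hypothetical illustration, not Nightshade itself:

```python
import numpy as np

def perturb_image(pixels: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a tiny bounded perturbation (at most +/- epsilon per channel)
    to an 8-bit RGB image array.

    This is NOT Nightshade's method -- just a minimal illustration of
    how pixel values can be altered imperceptibly while still differing
    from the original in ways a scraper-fed model would ingest.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    # Keep values in the valid 0-255 range for an 8-bit image.
    out = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return out.round().astype(np.uint8)

# A flat gray 4x4 RGB "image" for demonstration.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = perturb_image(img)
max_diff = int(np.abs(poisoned.astype(int) - img.astype(int)).max())
```

A per-channel change of 2 out of 255 is far below what the eye can notice, which is why such perturbations can ride along unnoticed into scraped training sets.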

Submitted by jboy (via)

The enshittification of academic social media

The Thesis Whisperer on social media for academics – and why it may be a good idea to step away:

Telling academics they can achieve career success by using today’s algorithmic-driven platforms is like telling Millennials they could afford to buy a house by eating less avocado on toast. It’s a cruel lie because social media is a shit way to share your work now.

Not a little bit shit either. Very shit.

Submitted by jboy (via)

I’m a Luddite (and So Can You!)

In this new comic in The Nib, Tom Humberstone explains what the Luddites can teach about resisting an automated future:

In truth, the Luddites were skilled with machines. They were simply fighting for better workers' rights.

Submitted by jboy

Confronting Tech Power

AI Now’s annual report diagnoses the challenge of concentrated power in tech – and seeks ways to bring change to the industry.

We intend this report to provide strategic guidance to inform the work ahead of us, taking a bird’s eye view of the many levers we can use to shape the future trajectory of AI – and the tech industry behind it – to ensure that it is the public, not industry, that this technology serves.

Submitted by jboy (via)

See how biased AI image models are

MIT Technology Review covers research by Alexandra Sasha Luccioni, Christopher Akiki, Margaret Mitchell, and Yacine Jernite about bias in generative text-to-image models like DALL-E 2 and Stable Diffusion:

After analyzing the images generated by DALL-E 2 and Stable Diffusion, they found that the models tended to produce images of people that look white and male, especially when asked to depict people in positions of authority. That was particularly true for DALL-E 2, which generated white men 97% of the time when given prompts like "CEO" or "director." That’s because these models are trained on enormous amounts of data and images scraped from the internet, a process that not only reflects but further amplifies stereotypes around race and gender.

Submitted by jboy (via)

The climate cost of the AI revolution

Wim Vanderbauwhede discusses what wide adoption of large language models (LLMs) could mean for global emissions of carbon dioxide:

[W]ith a hundred very popular AI-based services in the entire world, the electricity consumption resulting from the use of these services would lead to unsustainable increases in global CO₂ emissions.

Submitted by jboy

Navigating Jefes Fantasmas in New York City’s Urban Platform Economy

In Metropolitics, Jackson Todd examines delivery worker organizing in New York City to understand how today’s "phantom bosses" are shaping the future of labor rights in the United States:

The logistics of the so-called platform economy have reshaped our cities and communities. Urbanites can now get everything from groceries, toiletries and pet supplies to prescription medications, flowers and fast food delivered to their doors in minutes, disrupting the supply chains of a large swath of industries. For the multinational technology companies whose software powers food-delivery applications (Uber Eats, Grubhub, DoorDash), the primary goal is to create a seamless experience for the customer. But in this process, the logistics of on-demand delivery, including the exploitation of New York City’s delivery personnel, or deliveristas (as they have dubbed themselves), is rendered entirely invisible. Gig workers in New York City have become innovators in their own right, pioneering their own ways of utilizing technology in their fight for better working conditions.

Submitted by jboy

Counter Cloud Action Day

Today, March 8, 2023, is Counter Cloud Action Day:

On this day, we will try to withhold from using, feeding, or caring for The Big Tech Cloud. The strike calls for a hyperscaledown of extractive digital services, and for an abundance of collective organising. We join the long historical tail of international feminist strikes, because we understand this fight to be about labour, care, anti-racism, queer life and trans★feminist techno-politics.

Too many aspects of life depend on The Cloud. The expansionist, extractivist and financialized modes of Big Tech turn all lively and creative processes into profit. This deeply affects how we organise, and care for resources. Many public institutions such as hospitals, universities, archives and schools have moved to rented software-as-a-service for their core operations. The interests of Big Tech condition how we teach, make accessibility, learn, know, organise, work, love, sleep, communicate, administrate, care, and remember.

Submitted by jboy (via)