AGI Ethics News
@agiethicsnews.bsky.social
Your essential guide to AGI ethics. We provide expert analysis, curated content, and deep insights to help you understand and shape the future of artificial general intelligence. Subscribe to our free newsletter! #AGI #TechEthics
agiethicsnews.com
We would love for you to weigh in! You can find the poll here: us2.campaign-archive.com?e=fd5ad4c4cd...

Vote, read, and if you enjoyed the articles, please consider subscribing!

We'll share the results of the poll in next month's issue, which will be focused on public opinion.
Examining the ethical life cycle of an AGI
us2.campaign-archive.com
December 17, 2025 at 3:19 PM
8/8

Subscribe before Sunday to catch this month’s issue, focused on trust and reliability, where we’ll keep connecting the dots between AGI, ethics, and real‑world impacts.

agi.fightersteel.com
Home
The premier monthly newsletter dedicated to the ethics and impact of artificial general intelligence (AGI). Professionals and the public are faced with complex
agi.fightersteel.com
December 12, 2025 at 4:48 PM
7/8

Paul Ratner looked at how AGI is already challenging the film industry, from performance capture to synthetic actors, and why creators need new norms and safeguards to protect both workers and audiences.

agi.fightersteel.com/agi-ethics-i...
AGI ethics in the film industry
At the intersection of creativity and AGI stand some of the most pertinent questions in this field. If a future human-level superintelligence, an AGI (Artificia
agi.fightersteel.com
December 12, 2025 at 4:48 PM
6/8

Tristan Greene asked whether humans should always stay “in the loop” as AI systems begin to produce research at scale, and how automation might reshape scientific gatekeeping.

agi.fightersteel.com/should-human...
Should humans remain in the loop?
Preprint research platform arXiv recently announced that review articles and position papers submitted to its computer science category must be “accepted at a
agi.fightersteel.com
December 12, 2025 at 4:48 PM
5/8

In life sciences, Amy Lyall highlighted why FAIR, well‑governed data is an ethical backbone for safe AI and AGI in biology, not just a technical nice‑to‑have.

agi.fightersteel.com/the-ethical-...
The ethical importance of accurate data: it’s only FAIR
My background as a writer lies in the biological sciences. In this field, the advent of AI has had an impact which simply cannot be overstated. In the decade I
agi.fightersteel.com
December 12, 2025 at 4:48 PM
4/8

Practitioners from media, mental health, art, and national security shared how they want AGI to be transparent, emotionally aware, and truly “on team human” rather than a black box authority.

agi.fightersteel.com/perspectives...
Perspectives from the field
AGI Ethics News asked experts across a variety of fields to pen essays discussing how modern AI tools and the potential emergence of AGI might affect their prof
agi.fightersteel.com
December 12, 2025 at 4:48 PM
3/8

K.B. Miller examined how legal “rights of nature” movements can inspire new ways to protect ecosystems in an AGI future, centering non‑human interests in our governance thinking.

agi.fightersteel.com/lessons-from...
Lessons from Nature’s Rights
Mumta Ito, influential lawyer and founder of Nature’s Rights, says that “The law treats Nature as an object, so there is no legal relationship between Natu
agi.fightersteel.com
December 12, 2025 at 4:48 PM
2/8

Tristan Greene unpacked why today’s AI art is fundamentally mathematical recombination, raising tough questions about creativity, ownership, and what would truly count as “AGI‑level” art.

agi.fightersteel.com/the-letter-b...
Beauty is in the AI of the stockholder
There are countless legal challenges being fought around the globe between IP holders and AI development firms. At this point, it must feel like a rite of passa
agi.fightersteel.com
December 12, 2025 at 4:48 PM
"Just doing the same things more efficiently is too modest a contribution to our lives for this powerful technology." 👏 Great read!
December 12, 2025 at 3:28 PM