Politics Is Upstream of AI

post by iceman · 2016-09-28T21:47:40.988Z · LW · GW · Legacy · 5 comments

This is a link post for http://thefutureprimaeval.net/politics-is-upstream-of-ai/


Comments sorted by top scores.

comment by WalterL · 2016-09-29T14:42:11.037Z · LW(p) · GW(p)

This article is an example of looking at the world pragmatically, and acknowledging an actual truth. Kudos to the writers.

It reminds me of the scene at the start of Bad Boys II, where the drug kingpin has a giant pile of paper cash and rats are nesting in it.

Kingpin: "This is a STUPID problem to have." ... Kingpin: "But it IS a problem. Hire exterminators."

Similarly, politics getting in the way of transforming the world, with its own irksome interest in transforming the world, is exactly the sort of thing that clear-eyed futurists need to account for.

comment by iceman · 2016-09-28T21:49:14.417Z · LW(p) · GW(p)

I also enjoyed the linked Politics Is Upstream of Science, which goes into depth on the state interventions in science discussed at the beginning of this piece.

comment by scarcegreengrass · 2016-10-03T18:36:16.795Z · LW(p) · GW(p)

To add some specificity to this article, I can think of a few examples of cultural/philosophical perspectives that people in the LW Diaspora often take as assumptions (and that would not be shared by all historical humans). I like most of these assumptions, but it's always nice to specify your axioms, right?

• We are observers of an objective system of matter and energy that follows simple, particle-level rules.

• Most physical goals can be achieved given enough thought.

• Every running instance of a pattern that is close to a human brain is a moral peer. We want to promote the prosperity of peers. Mammals and other megafauna are peers of maybe 1/10 the moral weight.

• We want moral peers to have comfort, happiness, and (maybe instrumentally) control over their lives.

• We prefer to promote the prosperity of each individual human over the prosperity of an organization or the prosperity of each human cell.

• We would prefer to replace 'barren' regions (e.g., Mars) with ecosystems or industrial systems.

• A consensus of many diverse intelligences usually makes safer, more accurate decisions than a dictatorship of one intelligence.

• Where our current cultural perspective differs from past, contemporary, or future cultural perspectives, we are open to the idea that our perspective is not the best.

• Earth transitioned from an abiotic planet to a planet with a biosphere, and that is somewhat unusual.

comment by scarcegreengrass · 2016-10-03T18:19:42.467Z · LW(p) · GW(p)

An attempt at a synopsis of this article:

If humans build advanced AI systems, those systems will inherit the cultural, ideological, philosophical, and political perspectives of their designers. This is often bad from the perspective of future generations of humans.

comment by mishka · 2023-07-16T21:59:19.406Z · LW(p) · GW(p)

The linked post has disappeared, but copies exist on the Wayback Machine, e.g. https://web.archive.org/web/20160930125211/http://thefutureprimaeval.net/politics-is-upstream-of-ai/