Modelling Model Comparisons

post by Logan Riggs (elriggs) · 2019-04-04T17:26:45.565Z

Contents

  1. Similar Relationships [Type 1 comparison]
  2. (Object1, Relationship) = (Object2) [Type 2 comparison]
  Human Language and Communicating Models
  Future Work:

Define a model as a set of objects and their relationships. For example, when discussing a model of music, "notes" would be the objects and a possible relationship would be "harmony". [Technically, the objects would be (frequency, volume, time), and from there you could define the relationships "harmony", "tempo", "crescendo", etc.]
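To make that definition concrete, here is a minimal sketch in Python. Everything in it (the `Note` and `Model` names, and the 3:2 frequency-ratio test standing in for "harmony") is my own illustrative choice, not something from the post:

```python
from dataclasses import dataclass

# A model = a set of objects + named relationships over those objects.

@dataclass(frozen=True)
class Note:
    frequency: float  # Hz
    volume: float     # normalized 0..1
    time: float       # seconds from the start of the piece

def harmony(a: Note, b: Note) -> bool:
    # Toy relationship: two simultaneous notes whose frequencies are
    # roughly in the 3:2 ratio of a perfect fifth.
    simultaneous = abs(a.time - b.time) < 1e-3
    fifth = abs(a.frequency / b.frequency - 1.5) < 0.015
    return simultaneous and fifth

@dataclass
class Model:
    objects: set          # the model's objects
    relationships: dict   # relationship name -> predicate over objects

a4 = Note(440.0, 0.8, 0.0)
e5 = Note(660.0, 0.8, 0.0)
music = Model(objects={a4, e5}, relationships={"harmony": harmony})
assert music.relationships["harmony"](e5, a4)
```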

Using this definition, there are two different ways to compare models: similar relationships, and (Object1, Relationship) = (Object2).

1. Similar Relationships [Type 1 comparison]

Using the relationship "repetition": there is repetition in musical phrases (Für Elise), in poetic phrases (Annabel Lee), in song choruses, in movie themes, etc. Even though these four examples contain different objects, we're able to find the similar relationship and compare them.

I can think of two uses for this type of comparison. The first is in metaphors; the second is in generalization. Elaborating on the latter: I find a piece of music annoying if it's too repetitive, and the same is true for poems, songs, and movies; however, I very much enjoy a song if it strikes a good balance between repetition and novelty. Learning to strike that balance when writing music has generalized to doing the same in writing lyrics and dance choreography.

The same generalization can happen in non-art fields as well, such as realizing that two different functions are both convex or that two different problems can be solved recursively, and so on.
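To sketch that last point (my example, not the post's): computing a factorial and counting the leaves of a tree involve entirely different objects, yet share the relationship "reduces to a smaller instance of itself":

```python
# Two different kinds of objects (integers vs. nested lists) sharing
# one relationship: each problem reduces to a smaller instance of itself.

def factorial(n: int) -> int:
    return 1 if n <= 1 else n * factorial(n - 1)

def count_leaves(tree) -> int:
    # A tree here is either a leaf value or a list of subtrees.
    if not isinstance(tree, list):
        return 1
    return sum(count_leaves(subtree) for subtree in tree)

assert factorial(4) == 24
assert count_leaves([1, [2, 3], [[4]]]) == 4
```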

2. (Object1, Relationship) = (Object2) [Type 2 comparison]

A good example to start with: (carbon atoms, arranged in a crystal lattice) = (diamond).

But this is more than just connecting quarks/electrons to atoms, or atoms to objects in language: you can connect models of various levels together. Within the model of language, we can connect the low-level object "quarks" and the high-level object "diamond". This has helped my understanding of how Multi-level Models relate to transparency: if an AI can find the correct Type 2 comparison between what it's doing and our human language, then transparency is solved; however, human language is very complex.
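One way to render a Type 2 comparison in code. This is a toy sketch; the lookup table and the relationship names are mine rather than anything from the post:

```python
# Toy sketch of a Type 2 comparison: a set of lower-level objects plus a
# relationship over them is identified with one higher-level object.
# The entries below are illustrative placeholders, not a real ontology.

BRIDGES = {
    (frozenset({"quarks", "electrons"}), "bound together"): "atom",
    (frozenset({"carbon atoms"}), "covalent lattice"): "diamond",
}

def compose(objects: frozenset, relationship: str) -> str:
    # Map (Object1, Relationship) in one model to Object2 in another.
    return BRIDGES[(objects, relationship)]

assert compose(frozenset({"quarks", "electrons"}), "bound together") == "atom"
assert compose(frozenset({"carbon atoms"}), "covalent lattice") == "diamond"
```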

Human Language and Communicating Models

Let's say you're talking with Alice. Two failures in communicating models are when (1) you're not discussing the same objects (Alice's definition of "sound" vs. your definition), or (2) you don't know how Alice believes those objects relate (or vice versa).

One might naively say that the model of language is simply "words" and how they relate; however, it's more like "words, your models that relate to those words, and your model of Alice's models that relate to those words". The two failures above arise from thinking in the naive model of language, and can be alleviated by the more complex one. Two helpful thoughts when talking to Alice are:

1. "Are we discussing the same objects?"

2. "How does Alice think these objects relate?"

and of course asking Alice the relevant questions to figure that out.
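As a toy sketch of those two checks (the dictionaries standing in for whole models are a drastic simplification, and every name here is my own):

```python
# Failure (1): you and Alice aren't even discussing the same objects.
# Failure (2): you disagree on how a shared object relates to others.

def communication_gaps(mine: dict, alices: dict) -> dict:
    shared = mine.keys() & alices.keys()
    return {
        "objects_not_shared": mine.keys() ^ alices.keys(),
        "relations_disagree": {word for word in shared
                               if mine[word] != alices[word]},
    }

my_model     = {"sound": "pressure waves in air", "tree": "woody plant"}
alices_model = {"sound": "auditory experience", "bird": "feathered animal"}

print(communication_gaps(my_model, alices_model))
# e.g. {'objects_not_shared': {'tree', 'bird'}, 'relations_disagree': {'sound'}}
```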

Future Work:

1. Type 1 comparisons:

2. Type 2 comparisons with human language/transparency

3. Creating lots of examples of Type 1 and Type 2 comparisons to un-confuse myself (I predict that what I wrote in this post is confused but at least points in the right direction)

4. A better word for Type 1 and Type 2 comparisons

2 comments


comment by Gordon Seidoh Worley (gworley) · 2019-04-04T18:05:02.311Z

Type 1 seems to be describing what I'd call a "structure", which is another way of talking about a pattern in a certain abstract sense. For example, consider the classic mathematician joke about topologists not being able to distinguish a donut from a coffee cup because they have the same topological genus (at least, idealized donuts and coffee cups do): genus 1.

Type 2 seems to be describing what I'd call a "system", i.e. multiple objects in relation with each other coming together to form a new object at a different level of abstraction.

Although my thinking has certainly evolved a lot since then, I wrote about an issue that required addressing this topic a couple of years ago, so you might find that interesting even if you're not so interested in the topic I was addressing directly.

comment by Logan Riggs (elriggs) · 2019-04-04T19:19:34.356Z

I almost agree with your Type 2 = "system"; replace [at a different level of abstraction] with [in a different model]. Going from quarks to atoms to chairs would be different levels of abstraction, yes, but I'm trying to point at an even broader comparison, with "system" being just a subset.

For example, I could describe the object "apple" using physics, chemistry, language, or photography. Comparing atoms with words with pixels wouldn't be just varying abstractions, at least in my understanding of the word "abstraction".

I've read your article and thoroughly enjoyed the topic you were addressing directly. I believe you linked it for the similarity between "multiple objects in relation form a new object at a different level of abstraction" and progressing through each Kegan stage; is that correct?