Comments

Comment by Alexandros3 on The Nature of Logic · 2008-11-15T11:56:14.000Z · LW · GW

Would it be plausible to say that a first-order logic database is a special case of a Bayesian database where every probability is 100%?
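A minimal sketch of the idea, assuming a toy database where each stored fact carries a degree of belief and queries multiply probabilities under an independence assumption. All names here are hypothetical illustrations, not any real API:

```python
# Toy "probabilistic database": a dict mapping facts to probabilities.
# A fact not in the database is treated as having probability 0.

def query(db, facts):
    """Probability that all queried facts hold, assuming independence."""
    p = 1.0
    for fact in facts:
        p *= db.get(fact, 0.0)
    return p

# Bayesian-style database: facts carry degrees of belief.
bayesian_db = {"rain": 0.3, "wet_grass": 0.9}

# "First-order logic" database as the special case where every stored
# fact has probability exactly 1 -- queries then behave like boolean AND.
logic_db = {"rain": 1.0, "wet_grass": 1.0}

print(query(bayesian_db, ["rain", "wet_grass"]))  # 0.27
print(query(logic_db, ["rain", "wet_grass"]))     # 1.0 (i.e. true)
print(query(logic_db, ["rain", "snow"]))          # 0.0 (i.e. false)
```

With all probabilities pinned to 1, every query collapses to either 1.0 or 0.0, recovering ordinary boolean entailment, which is the sense in which the logic database looks like a degenerate case of the probabilistic one.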

Comment by Alexandros3 on Economic Definition of Intelligence? · 2008-10-29T23:09:44.000Z · LW · GW

I am not sure you are taking into account the possibility that an intelligence may yield optimal performance only within a specific resource range. Would a human mind given a 10x increase in memory (and memories) operate even marginally better? Or would it be overwhelmed by an amount of information it was not prepared for? Similarly, would a human mind even be able to operate given half the computational resources? Comparing mind A at 40 bits/1 trillion FPO with mind B at 80 bits/2 trillion FPO may come down to how many resources are available, since we don't have any data points about how much each would yield given the other's resources.

So perhaps the trendy term of scalability might be one dimension of the intelligence metric you seek. Can a mind take advantage of additional resources if they are made available? I suspect that an intelligence A that can scale up and down (to a specific minimum) linearly may be thought of as superior to an intelligence B that may yield a higher optimization output for a specific amount of resources but is unable to scale up or down.
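The trade-off described above can be made concrete with a toy model. The numbers and output functions below are invented purely for illustration: mind A scales roughly linearly with resources above some minimum, while mind B outperforms A at one tuned resource level but degrades sharply away from it:

```python
# Hypothetical optimization output as a function of available resources.
# All figures are illustrative, not measurements of anything real.

def output_a(resources):
    # Mind A: scales linearly once a minimum resource level is met.
    return 2.0 * resources if resources >= 10 else 0.0

def output_b(resources):
    # Mind B: tuned to one resource level; beats A there,
    # but collapses outside that narrow band.
    return 300.0 if 90 <= resources <= 110 else 5.0

for r in (10, 100, 1000):
    print(f"resources={r:4d}  A={output_a(r):7.1f}  B={output_b(r):6.1f}")
```

At r=100, B wins; at r=1000, A's linear scaling dominates, which is the sense in which a scalable intelligence might be judged superior overall despite losing the comparison at B's sweet spot.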

Comment by Alexandros3 on Dark Side Epistemology · 2008-10-18T10:51:55.000Z · LW · GW

How about the notion of an insult as a first-order offence? "Don't insult God/Our Nation/The People/etc." It is an explicit emotional fortress that reason, by definition, cannot scale. When reasoning goes near it, all the 'intelligence defeating itself' mechanisms come into play. We take the fortress as our starting point and think backwards until our agitated emotions are satisfied by our half-reasonable but beautiful explanation of why the fortress is safe, and why whatever caused us to doubt it is either not so or can be explained some other way. Ergo, one step deeper into dark epistemology.