Comments

Comment by systemsguy on Tell Your Rationalist Origin Story · 2014-11-27T03:23:27.897Z · score: 1 (1 votes) · LW · GW

Thank you for the welcome!

I will review CFAR, as at a glance it has some significant clients and at least some successes.

There are no meetups near me, it seems.

I appreciate the feedback.

Comment by systemsguy on Tell Your Rationalist Origin Story · 2014-11-27T03:18:25.830Z · score: 0 (0 votes) · LW · GW

Thank you for the welcome!

This site is unusually populated with internal links -- that must take some discipline for the posters (and either good search tools or good memories, or both!).

I will review your links, and I much appreciate your sharing.

Comment by systemsguy on Tell Your Rationalist Origin Story · 2014-11-26T02:50:45.630Z · score: 2 (2 votes) · LW · GW

Hi all. I'm a seasoned engineer, BSEE plus MS in Systems Engineering, with a couple of decades in electronics systems architecture, team management, and now organization management. I'm a big picture guy who can still somewhat do the math, but not really much anymore (ahhh, back in the day.......). Myers-Briggs says I'm an INTJ.

I've had some classes and additional practical experience in decision theory, statistics, communications theory, motivation, common biases and fallacies, utility, and such basics. I am beset with an interest in almost everything technical (I'm a T-shaped person, with depth in electronics systems and breadth in general engineering and technical topics), but heavily skewed to applied technology, not research. The observable world seems to me horridly sub-optimized, largely due to human short-sightedness and an apparent inability to plan ahead or see the bigger picture of our actions. I much like games and what-ifs. Favorite quotes include Einstein's "you can't solve problems with the same level of thinking that created them", an unattributed "people are not rational creatures, but rationalizing ones", and one I use to limit analysis-paralysis: "I can afford to be wrong, but not indecisive".

I am individualistic and introverted by nature, but I've become more socially conscious and communicative as I've progressed in my career and in life with a wife and kids. I'm here because I'd like the world to be a more rational place, especially for my children, but honestly my expectations for success are low. I like the moderated format and technical leaning of this site, though to be honest my readings over the last few days suggest the discussions are more like a debate room than a crowd-sourced problem-solving machine. I'm not saying that is bad, but I can't help but wonder where the "action verbs" come into the game.

Comment by systemsguy on Humans are not automatically strategic · 2014-11-25T19:18:43.023Z · score: 1 (1 votes) · LW · GW

Once I held passing interest in Mensa, thinking that an org of super-smart people would surely self-organize to impact the world (positively perhaps, but taking it over as a gameboard for the new uberkind would work too). I was disappointed to learn that mostly Mensa does little, and when they get together in meatspace it is for social mixers and such. I also looked at Technocracy, which seemed like a reasonable idea, and that was different but no better.

Now I'm a few decades into my tech career, and I have learned that most technical problems are really people problems in disguise; solving the organizational and motivational aspects is critical to every endeavor, and is essentially my full-time job. What smoker or obese person or spendthrift isn't a Type 3, above? Who doesn't get absorbed into their life with some tunnel vision and make Type 2 mistakes? Who, as a manager, hasn't had to knowingly make a decision without sufficient information? I know I have audibly said, "We can't afford to be indecisive, but we can afford to be wrong", after making such decisions, and I mean it.

Reading some of these key posts, though, points out part of the problem faced in this thread: we're trying to operate at higher levels of action without clear connections and action at lower levels. http://lesswrong.com/lw/58g/levels_of_action/

We have a forum for level 3+ thinking, without clear connections to action at levels 1-3. The most natural, if not easiest, step would be to align as a group so as to influence other policy-making organizations. To me, we are perfecting a box of tools that few are using; we should find ways to try them out and hone their cutting edges, and then go perform. A dojo approach helps by making it personal, but I'm not sure it is sufficient or necessary, and it is small-scale and, from my newbie perspective, lacking a shared direction.

Take dieting, for a counter-example: I can apply rationality and Bayesian thinking to my dietary choices. I recall listening to 4-4-3-2 on Saturday morning cartoons, and I believed every word. I read about the perils of meats and fat, and the benefits of vegetable oils and margarine. I heard from the American Heart Association to consume much less fat and trade it for carbs. I learned from the Diabetes Association to avoid simple carbs and use artificial sweeteners. Now I've learned not to blindly trust governments and industries, and have combined personal experience, reading, and internet searching to gain a broader viewpoint that does not agree with any of the above! Much of this research is a sifting and sorting exercise at levels 2-4, but with readily available empirical Level 1 options, as I can try out promising hypotheses on myself. As I see what works and what doesn't, I can adapt my thinking and research. Anybody else can too.
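The Bayesian side of that self-experiment can be made concrete. A toy sketch (the probabilities below are made up purely for illustration, not drawn from any study):

```python
# Toy Bayesian update for a single diet hypothesis (illustrative numbers only).
# H = "this diet works for me"; E = "I lost weight during a 4-week trial".

prior = 0.3            # initial credence that the diet works
p_e_given_h = 0.8      # chance of weight loss if the diet works
p_e_given_not_h = 0.2  # chance of weight loss anyway (noise, other factors)

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # one positive trial roughly doubles the credence
```

Each trial on myself is another update of this kind; run enough of them and the hypotheses that actually work float to the top.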

Would a self-help group assist my progress? Well, an accountability group helps, but it isn't necessary. Does it help to "work harder" at level 1 alone? No -- key improvements for me have come from improving my habits and managing desire, and then improving how I go about improving those. Does it help to have others assisting at level 3 and up? To an extent -- it is good to share personal experiences, books, and thoughts via e-mail and anecdote.

The easy part is the vision, though -- I want to be healthier, lighter, stronger, and live longer. Seems pretty clear and measurable -- weight, blood pressure, cholesterol, 1-mile run time, bench-press pounds.

So what is the vision here? What are our relevant and empirically measurable goals?

Comment by systemsguy on Professing and Cheering · 2014-11-24T02:52:42.337Z · score: 0 (0 votes) · LW · GW

Some individuals (and I presume more here than most venues) struggle with any internal inconsistency, while others readily compartmentalize and move on. I am an engineer by training and of course most of my workmates are engineers, yet they represent a variety of religions as well. Most have some questions and doubts about their own, and plenty more about others, and yet that doesn't make a huge difference for day-to-day life.

Some would quickly conclude that such an engineer's judgement is questionable, and discount their work, but most seem to be adequately logical in other spheres.

Perhaps the better question is one of utility -- what value does the individual get from their beliefs? I graduated with many electrical engineers; let's presume one went to work on microprocessor design (driven by quantum theory) and another does correction math for GPS satellites (driven by relativity). It is well understood that the two theories have been objectively demonstrated to work well in their respective domains, and yet they are mathematically incompatible (at best, each may be a simplification of a more universal rule). Both cannot be 'true', and while both could be false, and likely are to some degree, they are both incredibly useful.
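The GPS case can even be run as a back-of-envelope check (my own rounded constants, not the official correction the GPS spec mandates):

```python
import math

# Rough estimate of relativistic clock drift on a GPS satellite.
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_orbit = 2.656e7     # GPS orbit radius (~20,200 km altitude), m

# General relativity: the satellite clock sits higher in Earth's
# potential well, so it runs fast relative to a ground clock.
gr_rate = GM / c**2 * (1 / R_earth - 1 / r_orbit)

# Special relativity: orbital speed makes the satellite clock run slow.
v = math.sqrt(GM / r_orbit)          # circular-orbit speed, ~3.9 km/s
sr_rate = v**2 / (2 * c**2)

seconds_per_day = 86400
net_us = (gr_rate - sr_rate) * seconds_per_day * 1e6
print(f"net satellite clock gain: {net_us:.1f} microseconds/day")
```

The net comes out to roughly 38 microseconds per day -- uncorrected, position errors would accumulate at kilometers per day, so the "merely useful" theory earns its keep daily.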

From a systems perspective I tend to fall back on the Systems rules-of-thumb, like "all models are wrong; some are useful", and "draw a box around what is working together to do what you're interested in, and analyze within". Compartmentalization allows one to get down to the work at hand, in support of a utilitarian view.

I am here to learn, though. Must inconsistency be driven out, or simply embraced as part of the imperfect human machine?

Comment by systemsguy on Mandatory Secret Identities · 2014-11-23T18:32:49.025Z · score: 0 (0 votes) · LW · GW

First post, so I'll be brief on my opinion. I would say "it depends". To communicate between people and even to clarify one's own thoughts, a formal language, with an appropriate lexicon and symbols, is a key facilitator.

As for desirability of audience, the About page says "Less Wrong is an online community for discussion of rationality", with nothing about exclusivity. I would suggest that if a topic is of the sort that newbies and lay people would read, then English is better; if more for the theorists, then math is fine.