Internet Literacy Atrophy

post by Elizabeth (pktechgirl) · 2021-12-26T12:30:01.540Z · LW · GW · 49 comments

It’s the holidays, which means it’s also “teach technology to your elderly relatives” season. Most of my elderly relatives are pretty smart, and were technically advanced in their day. Some were engineers or coders back when that was rare. When I was a kid they were often early adopters of tech. Nonetheless, they are now noticeably worse at technology than my friends’ 3-year-old. That kid figured out how to take selfie videos on my phone after watching me do it once, and I wasn’t even deliberately demonstrating.

Meanwhile, my aunt (who was the first girl in her high school to be allowed into technical classes) got confused when attempting to use an HBO Max account I’d mostly already configured for her (I think she got confused by the new profile taste poll but I wasn’t there so I’ll never be sure). She pays a huge fee to use Go Go Grandparent instead of getting a smartphone and using Uber directly. I got excited when an uncle seemed to understand YouTube, until it was revealed that he didn’t know about channels and viewed the subscribe button as a probable trap. And of course, there was my time teaching my PhD statistician father how to use Google Sheets, which required learning a bunch of prerequisite skills he’d never needed before and I wouldn’t have had the patience to teach if it hadn’t benefited me directly.

[A friend at a party claimed Apple did a poll on this and found the subscribe button to be a common area of confusion for boomers, to the point they were thinking of changing the “subscribe” button to “follow”. And honestly, given how coy Substack is around what exactly I’m subscribing to and how much it costs, this isn’t unreasonable.]

The problem isn’t that my relatives were never competent with technology, because some of them very much were at one point. I don’t think it’s a general loss of intelligence either, because they’re still very smart in other ways. Also they all seem to have kept up with shopping websites just fine. But actions I view as atomic clearly aren’t for them.

Meanwhile, I’m aging out of being the cool young demographic marketers crave. New apps appeal to me less and less often. Sometimes something does look fun, like video editing, but the learning curve is so steep and I don’t need to make an Eye of the Tiger-style training montage of my friends’ baby learning to buckle his car seat that badly, so I pass it by and focus on the millions of things I want to do that don’t require learning a new technical skill.

Then I started complaining about YouTube voice, and could hear echoes of my dad in 2002 complaining about the fast cuts in the movie Chicago.

Bonus points: I watched this just now and found it painfully slow.

I have a hypothesis that I’m staring down the path my boomer relatives took. New technology kept not being worth it to them, so they never put in the work to learn it, and every time they fell a little further behind in the language of the internet – UI conventions, but also things like the interpersonal grammar of social media – which made the next new thing that much harder to learn. Eventually, learning new tech felt insurmountable to them no matter how big the potential payoff. 

I have two lessons from this. One is that I should be more willing to put in the time to learn new tech on the margin than I currently am, even if the use case doesn’t justify the time. Continued exposure to new conventions is worth it. I have several Millennial friends who are on TikTok specifically to keep up with the youths; alas, this does not fit in with my current quest for Quiet.

I’ve already made substantial concessions to the shift from text to voice, consuming many more podcasts and videos than I used to and even appearing on a few, but I think I need to get over my dislike of recordings of my own voice to the point I can listen to them. I made that toddler training montage video even though iMovie is a piece of shit and its UI should die in a fire. This was both an opportunity to learn new skills and a way to manufacture future inspiration for when things are hard.

Second: there’s a YouTube channel called “Dad, How Do I?” that teaches basic household skills like changing a tire, tying a tie, or making macaroni and cheese. We desperately need the equivalent for boomers, in a form that’s accessible to them (maybe a simplified app? Or even start with a static website). “Child, how do I…?” could cover watching individual videos on YouTube, the concept of channels, not ending every text message with “…”, Audible, etc. Things younger people take for granted. Advanced lessons could cover Bluetooth headphones and choosing your own electronics. I did some quick math and this is easily a $500,000/year business.

[To answer the obvious question: $500k/year is more than I make doing freelance research, but not enough more to cover the difference in impact and enjoyment. But if you love teaching or even just want to defray the cost of video equipment for your true passion, I think this is promising.]

My hope is that if we all work together to learn things, fewer people will be left stranded without access to technical tools, and also that YouTube voice will die out before it reaches something I care about.

49 comments

Comments sorted by top scores.

comment by georget · 2021-12-26T15:57:09.535Z · LW(p) · GW(p)

Nice article. As a late-bloomer boomer (68 years old) I find myself frustrated with those within my age bracket (65-70) who resist the most basic skills needed to navigate the world as it is, whether they like it or not. Ex: a brother-in-law who uses a flip phone and expects me to time picking him up at the airport because he is too cheap and stubborn to learn how to text on a reasonable cell phone. I would note that your observation of being marginalized by marketers is true for us as well: compare the ads on TV at different times of day: old-people ads in the morning, ads for bored, jobless, staying-home-sick folks midday, and the young, exciting people in prime time.
I try to keep up with younger generations and their views of the world through things like this forum, some discord channels, and EA stuff. However, I try not to play the old-guy card, and personally I find that having a sense of age-appropriateness is worthwhile. Age is not, in my opinion, just a number. Our bodies get old, and it takes more and more attention to deal with that process. Throw in signing into some streaming service just because you want to relax for a minute, or having to navigate passwords to get to banking, doctors, or the library (for crying out loud in a bucket, as my father used to say), and it can make a person seem inept and grumpy.

comment by arunto · 2021-12-26T17:35:45.998Z · LW(p) · GW(p)

For me (52 yrs old) it would actually be quite helpful to know what I should know/look into to keep up with current technologies. What is the current "internet canon" of tools, sites, and programs? And more generally: how can I - at any given time - best find out which new things on the internet I should at least superficially learn in order not to be left behind?

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2021-12-27T21:19:29.254Z · LW(p) · GW(p)

It helps if you have kids. I have frequent discussions with my sons about why a certain new tech is worthwhile. And I challenge them to find solutions that I'd like to have but that don't exist - or that, as it turns out in some cases, I just don't know about. I discovered notion.com this way, many Google Suite features (they convinced me to use GChat), and the transcript panel of YouTube.

comment by jbash · 2021-12-26T16:32:19.377Z · LW(p) · GW(p)

The subscribe button on Youtube is a trap. And I say this as somebody who knows exactly what a channel is, why channels exist, why that subscribe button is there, and many of the reasons for fine little details in how they work; who could, if necessary, code Youtube from scratch, from the bare metal right up to all of the UI bloat; and who participated in building the technology base that Youtube relies on.

For that matter, spreadsheets are kind of a cognitive trap, and it's not necessarily a good idea to invest a lot in learning to use them... let alone to invest time and effort in learning a cloud version.

Replies from: GWS, Pattern, adamzerner
comment by Stephen Bennett (GWS) · 2021-12-26T16:56:35.634Z · LW(p) · GW(p)

Sure, I'll bite, why is the youtube subscribe button a trap? I anticipate that we will agree about what the subscribe button is and what it does, which means that this is fundamentally going to be a disagreement about what the definition of a trap is. I'm not interested in litigating that, so mostly I am curious about any information you have about how subscribing works that you expect I don't already know.

Replies from: jbash
comment by jbash · 2021-12-26T18:30:34.218Z · LW(p) · GW(p)

The subscribe button is there to take advantage of your cognitive and motivational structure and keep you "engaged" with YouTube. Having subscriptions gives you a "reason" to return to Youtube on a regular basis, and gives Youtube an excuse to send you "reminders" about content in your subscribed channels.

Your subscriptions may also help Youtube to feed you content that keeps you there once you show up, although Youtube has access to other, often more effective ways of doing that, and having to "honor" subscriptions may actually interfere with those, so I don't think it really counts.

Anyway, the bottom line is that, if you are like most people, subscriptions will contribute to you spending more time on Youtube than you "should", in the sense that your Youtube time will interfere with goals that you would, if asked, say were more important. The intent is to have "watching Youtube" be an activity in itself, rather than having Youtube be a tool that you use to get information relevant to some outside purpose.

The subscription system is also used to motivate people to give content to Youtube. Although some mega-channels make economic sense, the "gamification" of subscription numbers helps to motivate marginal creators to spend more time and effort than they can really afford.

Subscriptions may occasionally help to meet a "user goal" like learning or staying informed about a specific topic... but their design and usual effect is to advance the "Youtube goal" of keeping the user staring at, or possibly producing, Youtube content and advertising, more than the user otherwise would and regardless of the user's own interests (in any sense of the word "interests"...).

Some people will say that the trap has to do with tracking your activities, but that's basically not true. Subscriptions don't track you any more than just visiting any major Web site will track you. It's more about controlling your activities. Your subscriptions do help a little with analyzing you as an advertising target, but I don't think that's a really major purpose or effect.

Replies from: GWS
comment by Stephen Bennett (GWS) · 2021-12-26T19:12:59.690Z · LW(p) · GW(p)

I appreciate that you took the time to explain your position. I think this is indeed a difference in the definition of "trap", so I'll leave it here.

comment by Pattern · 2021-12-26T18:32:48.895Z · LW(p) · GW(p)

Why are spreadsheets a trap, and what do you use instead? (What do you mean by a 'cloud version', Google's spreadsheets?)

Replies from: jbash
comment by jbash · 2021-12-26T18:49:15.852Z · LW(p) · GW(p)

Spreadsheets make it really easy to set up a simple "mathematical model". It really doesn't take more than about 5 minutes to learn enough about spreadsheets to get something useful going, and actually starting a spreadsheet has very low overhead, both in terms of what you have to do and of how much thought you have to put into it.

The problem with that is that it's easy to start using them for everything, including things that are really too complicated to be safely done in a spreadsheet. It's also possible to have something that started out as a reasonable spreadsheet application grow from that into an unreliable, unmaintainable monstrosity.

If you view a spreadsheet as a program, it's written in a "write-only language". It's really hard to come into a big spreadsheet and understand how everything works, or know how to safely make a change, or meaningfully review it for correctness, or even apply revision control to it. There's no global view; you have to interact with the whole thing one cell at a time. And you're not exactly encouraged to give things meaningful names, either.

... but it's SO EASY to start with a spreadsheet that people are often lulled into making one, and then adding more and more to it. When your spreadsheet reaches a certain complexity level, you may then find yourself investing time into learning more and more arcane features so you can extend what it does... which means that, for you, a spreadsheet will become even more the default tool the next time you want to do something. But it'll still be a bad programming language.

You can end up with "spreadsheet experts" who use them for everything. If I had a nickel for every spreadsheet I've seen that should have been a database, for example...

It's sort of like writing shell scripts; it's trivial to write a script to automate a few commands you do all the time, but if you keep adding features, then a year later you have a monstrosity that you wish you'd written in a regular, maintainable language.

And by a "cloud version", I mean Google Sheets (or Office 365, for that matter). The data are controlled entirely by the host; it may or may not be feasible to extract all of the information you put in, and it's definitely not going to be trivial if your spreadsheet has any complexity. The program that does the calculations is controlled entirely by the host, and may be changed at any time, including in ways that alter the results. The feature set is controlled entirely by the host; features you rely on may be changed or completely removed at any moment. Not really attractive as a long-term investment.

On edit: what I use instead is usually a real programming language. I won't say which ones I favor, because it would be impolite to start even more of a language war. :-)
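
To make the contrast concrete, here is a minimal sketch (in Python, one arbitrary choice among many, with an invented budgeting example): the same kind of simple model a spreadsheet scatters across anonymous cells, written with named quantities instead.

```python
# A spreadsheet-style "simple mathematical model" written as code with named
# quantities instead of anonymous cells (B2, C7, ...). The budgeting scenario
# and all numbers here are invented for illustration.

def monthly_budget(income: float, rent: float, groceries: float,
                   utilities: float) -> dict:
    """Return the derived figures that would otherwise live in formula cells."""
    expenses = rent + groceries + utilities   # would be =SUM(B2:B4)
    surplus = income - expenses               # would be =B5-B6
    savings_rate = surplus / income           # would be =B7/B5
    return {"expenses": expenses, "surplus": surplus,
            "savings_rate": savings_rate}

print(monthly_budget(income=3000, rent=1500, groceries=400, utilities=150))
# {'expenses': 2050, 'surplus': 950, 'savings_rate': 0.31666666666666665}
```

Unlike a grid of cells, this version can be diffed, reviewed, and put under revision control.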

Replies from: ciphergoth, Pattern
comment by Paul Crowley (ciphergoth) · 2021-12-26T20:16:50.508Z · LW(p) · GW(p)

I would love a web-based tool that allowed me to enter data in a spreadsheet-like way and present it in a spreadsheet-like way, but use code to bridge the two.

Replies from: JenniferRM, Gunnar_Zarncke
comment by JenniferRM · 2022-01-12T18:46:03.888Z · LW(p) · GW(p)

Subtracting out the "web-based" part as a first class requirement, while focusing on the bridge made of code as a "middle" from which to work "outwards" towards raw inputs and final results...

...I tend to do the first ~20 data entry actions as variable constants in my code that I tweak by hand, then switch to the CSV format for the next 10^2 to 10^5 data entry tasks that my data labelers work on, based on how I think it might work best (while giving them space for positive creativity).

A semi-common transitional pattern during the CSV stage involves using cloud spreadsheets (with multiple people logged in who can edit together and watch each other edit (which makes it sorta web-based, and also lets you use data labelers anywhere on the planet)) and ends with a copypasta out of the cloud and into a CSV that can be checked into git. Data entry... leads to crashes... which leads to validation code... which leads to automated tooling to correct common human errors <3
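
A minimal sketch of what that validation-code stage might look like, in Python, with an invented label schema (the real columns and rules depend on the project):

```python
import csv

# Hypothetical label schema; the real columns and rules depend on the project.
ALLOWED_LABELS = {"cat", "dog", "other"}

def load_labels(path: str) -> list[dict]:
    """Load a labeler-produced CSV, auto-correcting common human errors
    (stray whitespace, inconsistent casing) and failing loudly otherwise."""
    rows = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.DictReader(f), start=2):  # line 1 = header
            label = row["label"].strip().lower()
            if label not in ALLOWED_LABELS:
                raise ValueError(f"line {lineno}: unknown label {row['label']!r}")
            rows.append({"item_id": row["item_id"], "label": label})
    return rows
```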

If the label team does more than ~10^4 data entry actions, and the team is still using CSV, then I feel guilty about having failed to upgrade a step in the full pipeline (including the human parts) whose path of desire calls out for an infrastructure upgrade if it is being used that much. If they get to 10^5 labeling actions with that system and those resources then upper management is confused somehow (maybe headcount maxxing instead of result maxxing?) and fixing that confusion is... complicated.

This CSV growth stage is not perfect, but it is highly re-usable during exploratory sketch work on blue water projects because most of the components can be accomplished with a variety of non-trivial [LW · GW] tools.

If you know of something better for these growth stages, I'd love to hear about your workflows; my own standard methods are mostly self-constructed.

comment by Gunnar_Zarncke · 2021-12-27T21:13:39.076Z · LW(p) · GW(p)

There are tools that let you do that. There is a whole unit testing paradigm called fixtures for it. A prominent example is Fitnesse: http://fitnesse.org/FitNesse.UserGuide.WritingAcceptanceTests
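
The underlying pattern is table-driven testing: rows of inputs and expected outputs feed a function under test. A rough sketch of the idea in Python using pytest (this shows the general paradigm, not Fitnesse's own wiki-table syntax; the function and values are invented):

```python
import pytest

def shipping_cost(weight_kg: float, express: bool) -> float:
    """Toy function under test; it stands in for whatever the table describes."""
    return weight_kg * (8.0 if express else 3.0)

# Each row reads like a spreadsheet row: inputs on the left, expected output on the right.
@pytest.mark.parametrize("weight_kg, express, expected", [
    (1.0, False, 3.0),
    (1.0, True, 8.0),
    (2.5, False, 7.5),
])
def test_shipping_cost(weight_kg, express, expected):
    assert shipping_cost(weight_kg, express) == expected
```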

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2021-12-27T23:17:30.620Z · LW(p) · GW(p)

I'm not sure I see how this resembles what I described?

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2021-12-28T00:52:15.676Z · LW(p) · GW(p)

Maybe I misunderstand what you have in mind? The idea is to

  • enter data in a spreadsheet,
  • that is interpreted as row-wise input to a function in a program (typically a unit test), and
  • the result of the function is added back into additional columns in the spreadsheet (a minimal sketch of this loop follows below).
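
Concretely, assuming CSV stands in for the spreadsheet, a minimal Python sketch of that round trip (file and column names are placeholders):

```python
import csv

def derive(row: dict) -> dict:
    """The 'function in a program' step: compute new columns from one row.
    The column names are placeholders, not from the discussion above."""
    price = float(row["price"])
    discount = float(row["discount"])
    return {"net_price": price * (1 - discount)}

# Read rows, apply the function row-wise, write the results back as new columns.
# (Assumes input.csv has a header plus at least one data row.)
with open("input.csv", newline="") as f:
    rows = list(csv.DictReader(f))
for row in rows:
    row.update(derive(row))
with open("output.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```
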
Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2021-12-28T01:00:50.517Z · LW(p) · GW(p)

The idea is that I can do all this from my browser, including writing the code.

Replies from: Gunnar_Zarncke, ESRogs
comment by Gunnar_Zarncke · 2021-12-28T01:29:05.702Z · LW(p) · GW(p)

That would be cool. I think it should be relatively easy to set up with Replit (an online IDE).

comment by ESRogs · 2021-12-31T02:45:35.399Z · LW(p) · GW(p)

Sounds a bit like AlphaSheets (RIP).

comment by Pattern · 2022-01-03T21:35:12.515Z · LW(p) · GW(p)

On edit: what I use instead is usually a real programming language. I won't say which ones I favor, because it would be impolite to start even more of a language war. :-)

Different programming languages are for different things.

'I use this instead of a spreadsheet' - that's a use case I haven't heard a war over. ('I use this note taking app' - that I have read a lot of different sides on.)

comment by Adam Zerner (adamzerner) · 2021-12-27T00:25:48.316Z · LW(p) · GW(p)

Downvoted for being purely an argument from authority.

Replies from: pjeby, tomcatfish, IrenicTruth
comment by pjeby · 2021-12-27T05:53:51.694Z · LW(p) · GW(p)

I believe the reason the author mentioned their credentials was not to establish themselves as an authority, but to indicate that it's possible to see the subscribe button as a trap even if one is tech savvy and knows it has nothing to do with e.g. subscription billing. (In contrast to where the article implied people avoided the subscribe button due to not understanding it.)

Replies from: adamzerner
comment by Adam Zerner (adamzerner) · 2021-12-27T19:50:40.110Z · LW(p) · GW(p)

This is a good example of a situation where I believe the principle of charity is being applied too strongly. The author's claim was that it is a trap, not that it is possible to see it as a trap. The structure of that first paragraph is "Claim that it is a trap. Points about being an authority figure on the topic." (FWIW I don't mean any of this contentiously, just constructive criticism.)

comment by Alex Vermillion (tomcatfish) · 2021-12-27T22:43:52.253Z · LW(p) · GW(p)

In agreement: It is literally an argument from authority because there is no other proof given. Readers of the original comment are asked to assume the commenter is correct based on their authority and reputation.

comment by IrenicTruth · 2021-12-31T00:59:52.019Z · LW(p) · GW(p)

Like pjeby, I think you missed his point [LW(p) · GW(p)]. He was not arguing from authority, he was presenting himself as evidence that someone tech-savvy could still see it as a trap. His actual reason for believing it is a trap is in his reply to GWS [LW(p) · GW(p)].

comment by Dagon · 2021-12-27T16:23:15.345Z · LW(p) · GW(p)

I despise videos when text and photos would do - I'm far too often in a noisy (or shared quiet) space, and I read so much faster than people talk.  I'm even more annoyed at videos that pad their runtime to hit ad minima or something - I can't take a quick scroll to the end to see if it's worthwhile, then go back and absorb what I need at my own pace.

I recognized that videos take less time from the creator, and pay better.  So that's the way of the world, but I don't have to like it.  I mention this mostly as an explanation that I know I'm in the "old man yells at cloud" phase of my life, and a reason that I'm OK with some aspects of it.

Replies from: Gunnar_Zarncke
comment by Gunnar_Zarncke · 2021-12-27T21:24:21.201Z · LW(p) · GW(p)

I think video has a potentially higher bandwidth of information than text. The downside is that it is more difficult to skim, especially for people who can speed-read. I was very happy when my son pointed out the transcript panel in YouTube, which partly solves that. I think there are quite a few valuable features left in that solution space.

Replies from: Dagon
comment by Dagon · 2021-12-27T23:16:25.615Z · LW(p) · GW(p)

Transcripts and playback at 1.5-2.5x speed (depending on the content) definitely help a lot, as does a ToC with timestamps. You're right that it's higher bandwidth (in terms of information per second of participation), but I think my objection is that not all of that information is equally valuable, and I often prefer lower-bandwidth, more-heavily-curated information.

Hmm, I wonder if I can generalize this to "communication bandwidth is a cost, not a benefit".  Spending lots more attention-effort to get a small amount more useful information isn't a tradeoff I'll make most of the time.

Replies from: Viliam
comment by Viliam · 2021-12-28T11:31:16.684Z · LW(p) · GW(p)

Spending lots more attention-effort to get a small amount more useful information isn't a tradeoff I'll make most of the time.

This makes it generally a worse medium for a rational debate. Few people are willing to spend dozens of hours to become familiar with the arguments of their opponents. So instead the vlog debate will degenerate into "each side produces hours of convincing videos, everyone watches the videos of their side and throws the links to the opponents, but no one bothers watching the opponents' videos".

comment by MondSemmel · 2021-12-27T15:11:49.666Z · LW(p) · GW(p)

I have a hypothesis that I’m staring down the path my boomer relatives took. New technology kept not being worth it to them, so they never put in the work to learn it, and every time they fell a little further behind in the language of the internet – UI conventions, but also things like the interpersonal grammar of social media – which made the next new thing that much harder to learn. Eventually, learning new tech felt insurmountable to them no matter how big the potential payoff. 

There's also the explore-exploit tradeoff: the younger you are, the more you should explore and accumulate new knowledge; whereas the older you are, the more knowledge you've already accumulated. Insofar as you expect additional information to only have marginal value, you should mostly exploit your existing knowledge.
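
That tradeoff is the classic bandit setup; a toy Python sketch of the idea, with an invented decay schedule and invented activities, purely for illustration:

```python
import random

def choose_activity(known_values: dict[str, float], age: float,
                    max_age: float = 80.0) -> str:
    """Epsilon-greedy with an exploration rate that decays over a lifetime:
    mostly explore while young, mostly exploit known favorites when older."""
    epsilon = max(0.0, 1.0 - age / max_age)  # toy decay schedule
    if random.random() < epsilon:
        return random.choice(list(known_values))    # explore: try something
    return max(known_values, key=known_values.get)  # exploit: pick best-known

activities = {"TV": 0.6, "YouTube": 0.5, "TikTok": 0.2}
print(choose_activity(activities, age=30))  # often explores
print(choose_activity(activities, age=70))  # almost always picks "TV"
```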

So from that perspective, I'd say what these older relatives need is not so much better instruction, but a genuinely excellent & strongly motivating reason to learn some specific new thing. For instance, is learning how to use Youtube really worth their time and energy, when they have a perfectly functional TV in their living room?

I feel that way about lots of technology which only seems like a shiny new thing without enduring value. (And lots of it is even profoundly negative, e.g. I would be much better off if so much of the Internet wasn't so incredibly addictive.)

In contrast, I do consider a small subset of technology and related skills as total game-changers, e.g. I'm sooooo much faster at touch-typing than at hand-writing that it affects the ways I think and communicate. Similarly, I tried voice commands on smartphones a few years ago, and was just thoroughly unimpressed by the quality back then; but it's very obvious that this tool will eventually become good enough (or has already?) that it will become another game-changer in my ability to take notes, and to think, when I'm not at my PC.

From the outside, it does sound admittedly hard to tell the difference between shiny vs. game-changing technology.

On a more personal level, however, that part is easier. For instance, our family's WhatsApp chat group would be a powerful incentive for my older relatives to learn how to use smartphones and this app, if they weren't already fluent with technology; and similarly, there was talk among my relatives of uploading photos of their babies (/ grandchildren) to a privately shared Google Drive, which is again the kind of thing that would strongly motivate the grandparents to learn about that technology if they didn't already know it.

comment by cata · 2021-12-26T21:29:02.423Z · LW(p) · GW(p)

I am 34 years old and I sense a very similar progression as you do, where I have mastered "early Internet" ways of doing things and I am less and less inclined to adopt new trends. Your remark about learning to be comfortable with hearing one's own voice on recordings is very interesting to me.

By the way, your video has a suffix of 40 seconds of black silence :-)

Replies from: pktechgirl
comment by Elizabeth (pktechgirl) · 2021-12-27T04:22:08.561Z · LW(p) · GW(p)

I always stop watching once the belt clicks and missed that. Thanks!

comment by Raemon · 2021-12-26T21:18:42.331Z · LW(p) · GW(p)

I made that toddler training montage video even though iMovie is a piece of shit and its UI should die in a fire.

I think this was actually a pretty interesting example that is worth going into more detail about. (I was there at the time Elizabeth was learning iMovie, and personally thought of this as the key insight behind the post)

iMovie does a particular thing where it resizes things when you squeeze the timeline with your fingers on the trackpad. This is part of a general trend towards having screens respond in (what is attempting to be) an organic way. This makes tradeoffs against being predictable in some ways. (It always resizes the teeniest sliver of footage-time to be large enough that you can see a thumbnail of the clip, even if it's only 1% as long as the other nearby clips)

And while my naive reaction is "this is bullshit", I also see how the endgame for this evolution of UI-style is the Iron Man interface:

[embedded clip of the Iron Man hologram interface]

...which is probably going to depend on you having a bunch of familiarity with "finger sliding" UI, which may evolve over time.

Replies from: pktechgirl
comment by Elizabeth (pktechgirl) · 2021-12-28T00:49:36.278Z · LW(p) · GW(p)

I think there's a shift. When I was learning tech, the goal was to build a model of what was going on under the hood so you could control it, on its terms. Modern tech is much more about guessing what you want and delivering it, which seems like it should be better and maybe eventually will be, but right now is frustrating. It's similar to when I took physics-for-biologists despite having the math for physics-for-physics-majors. Most people must find for-biologists easier or they wouldn't offer it, but it was obvious to me I would have gained more predictive power with less effort if I'd taken a more math-based class. 

Replies from: ESRogs
comment by ESRogs · 2021-12-31T03:05:19.503Z · LW(p) · GW(p)

Modern tech is much more about guessing what you want and delivering it, which seems like it should be better and maybe eventually will be, but right now is frustrating.

Reminds me of this:

[Wolfram Alpha] is not a full-text search engine. It is a database query and visualization tool. More precisely, it is a large (indeed, almost exhaustive) set of such tools. These things may seem similar, but they are as different as popes and partridges.

Google is not a control interface; WA is. When you use WA, you know which of these tools you wish to select. You know that when you type “two cups of flour and two eggs” (which now works) you are looking for a Nutrition Facts label. It is only Stephen Wolfram’s giant electronic brain which has to run ten million lines of code to figure this out. Inside your own brain, it is written on glowing letters across your forehead.

So the giant electronic brain is doing an enormous amount of work to discern information which the user knows and can enter easily: which tool she wants to use.

When the giant electronic brain succeeds in this task, it has saved the user from having to manually select and indicate her actual data-visualization application of choice. This has perhaps saved her some time. How much? Um, not very much.

When the giant electronic brain fails in this task, you type in Grandma’s fried-chicken recipe and get a beautiful 3-D animation of a bird-flu epidemic. (Or, more likely, “Wolfram Alpha wasn’t sure what to do with your input.” Thanks, Wolfram Alpha!) How do you get from this to your Nutrition Facts? Rearrange some words, try again, bang your head on the desk, give up. What we’re looking at here is a classic, old-school, big steaming lump of UI catastrophe.

And does the giant electronic brain fail? Gosh, apparently it does. After many years of research, WA is nowhere near achieving routine accuracy in guessing the tool you want to use from your unstructured natural-language input. No surprise. Not only is the Turing test kinda hard, even an actual human intelligence would have a tough time achieving reliability on this task.

The task of “guess the application I want to use” is actually not even in the domain of artificial intelligence. AI is normally defined by the human standard. To work properly as a control interface, Wolfram’s guessing algorithm actually requires divine intelligence. It is not sufficient for it to just think. It must actually read the user’s mind. God can do this, but software can’t.

Of course, the giant electronic brain is an algorithm, and algorithms can be remembered. For instance, you can be pretty sure that the example queries on the right side of your screen (“June 23, 1988”) will always send you to the same application. If you memorize these formats and avoid inappropriate variations, you may not end up in the atomic physics of the proton.

This is exactly what people do when circumstances force them to use this type of bad UI. They create an incomplete model of the giant electronic brain in their own, non-giant, non-electronic brains. Of course, since the giant electronic brain is a million lines of code which is constantly changing, this is a painful, inadequate and error-prone task. But if you are one of those people for whom one of Wolfram’s data-visualization tools is useful, you have no choice.

[...]

Thus, the “flexible” and “convenient” natural-language interface becomes one which even Technology Review, not exactly famous for its skepticism, describes as “inflexible.” The giant electronic brain has become a giant silicon portcullis, standing between you and your application of choice. You can visualize all sorts of queries with Wolfram Alpha—but first you have to trick, cajole, or otherwise hack a million lines of code into reading your mind.

https://www.unqualified-reservations.org/2009/07/wolfram-alpha-and-hubristic-user/

comment by NicholasKross · 2021-12-27T05:03:03.562Z · LW(p) · GW(p)

"Please subscribe. It's free, and you can always change your mind later." - a successful YouTube call-to-action, from someone who understands the confusion about "subscribe" when many new folk come in from newspaper subscriptions. (Emphasis added)

comment by Rana Dexsin · 2021-12-27T20:20:43.907Z · LW(p) · GW(p)

Aside from the natural (to the human) effects surrounding learning and motivation—in this particular domain, in the current era, I suspect there are important sub-questions revolving around the effects of the “constant rippling and trembling” of an implicit norm of any-time all-the-time often-predatory UI changes pushed from afar, with the primary motivational hook being the service owner's. In fact, you specifically mention

I think she got confused by the new profile taste poll but I wasn’t there so I’ll never be sure

which puts me in mind of any number of interesting new dialog boxes or other widgets with unpredictable consequences. Maybe that all dissolves under “the key is to learn the UI grammar and a rough consensus of what shouldn't break things”, but the notion of a unified platform grammar also gets eroded by the fashion cycles (I wonder how this differs by sub-medium, in particular mobile vs desktop vs Web).

Replies from: ben-lang
comment by Ben (ben-lang) · 2022-10-25T12:08:45.124Z · LW(p) · GW(p)

It wasn't until I was teaching my grandma to check her emails on a desktop that I realised quite how many pointless pop-ups there actually are. Once a "this software needs to update" pop-up is the difference between her getting to her emails or stopping in confusion, you suddenly realise that the chances of a typical desktop computer letting you get as far as an email login screen without a pop-up are low.

comment by Adam Zerner (adamzerner) · 2021-12-26T19:51:36.682Z · LW(p) · GW(p)

but I think I need to get over my dislike of recordings of my own voice to the point I can listen to them

Apparently this is extremely common and there is a scientific explanation for it. And as an additional data point, I experienced it myself.

I have a hypothesis that I’m staring down the path my boomer relatives took. New technology kept not being worth it to them, so they never put in the work to learn it, and every time they fell a little further behind in the language of the internet – UI conventions, but also things like the interpersonal grammar of social media – which made the next new thing that much harder to learn. Eventually, learning new tech felt insurmountable to them no matter how big the potential payoff.

This doesn't explain why young people with a similar lack of experience, eg. the three year old mentioned in the post, have a vastly easier time learning new tech-related things.

Replies from: Ericf, pktechgirl
comment by Ericf · 2021-12-26T20:16:39.278Z · LW(p) · GW(p)
  1. 3-year-olds have an easier time learning anything than an adult (e.g. languages)

  2. 3-year-olds don't have any well-formed "ruts" in their neural pathways. New UI or workflows often cut across the existing ruts

Replies from: Viliam, MondSemmel, adamzerner
comment by Viliam · 2021-12-26T22:53:37.025Z · LW(p) · GW(p)

Also, 3-year-olds do not worry whether they might break something -- their parents would fix it.

Older people know that things can go wrong in various ways, but they are not sure how exactly. New scams are being invented every day. If you spend most of your time playing with the technology, you have a good idea about what is dangerous and what is not. If you only use it once in a while, it's a minefield.

For example, if you notice that you missed a phone call and you call the person back... it can cost you lots of money (if the person is a scammer, setting up a paid service, then automatically calling up thousands of people and hanging up, expecting some of them to call back). When you see the missed call, is this something you consider before calling back? Most old people do not have a sufficiently good model; they are aware that some seemingly innocent things are dangerous, but they do not know which ones exactly.

If you teach a 70-year-old person how to use a smartphone, do you also explain to them all possible things that can go wrong? Heck, I am not sure I could even list all the dangers. I rely on being an active online reader, so when a new scam is invented, I will probably read about it before someone tries it on me; hopefully. But that old person is just thrown into a pool with sharks. Same if you teach someone browsing the web. Same if you teach someone shopping online. All the tech is full of scams, and if you get scammed, well, it sucks to be you, you should have been more tech savvy.

(Recently, someone impersonated my 70-year-old mother on Whatsapp and a few other online messengers. I don't even know how it is possible to create a Whatsapp account using someone else's phone number; when I try to create an account, it checks the number by sending me a verification SMS. But apparently it is possible to do somehow, because someone did exactly this; created accounts with my mother's phone number, and some young woman's photo; then used them to sell some cars. We have no idea how; my mother only uses her smartphone for calling, and sending/receiving SMS. We just reported the whole thing to the police, and my mother changed her phone number.)

The tech is hostile, but if you keep using it every day, you get used to it, and learn to navigate it. You recognize the most frequent scams, and with luck the rarer ones pass you by.

Replies from: gbear605, ChristianKl
comment by gbear605 · 2021-12-26T23:03:11.984Z · LW(p) · GW(p)

I suspect that in your WhatsApp case, someone spoofed her phone number so that they received the verification SMS instead. SMS verification has recently come to be considered unsafe, which is why there's been a move towards two-factor authentication apps.

I’m not confident though, which only proves your point! I’m a professional software developer who reads about things like this all the time and I only have guesses at what went wrong.

comment by ChristianKl · 2021-12-27T11:02:29.055Z · LW(p) · GW(p)

(Recently, someone impersonated my 70-years old mother on Whatsapp and a few other online messengers. I don't even know how it is possible to create a Whatsapp account using someone else's phone number; when I try to create an account, it checks the number by sending me a verification SMS.

One way this happens is to use the social graph: One of your relatives or friends writes you: "I made a mistake and now it sent a verification code to your phone, can you please give me the verification code?"

When your 70-year-old mother gets such a message from another 70-year-old friend, she wants to help her friend and thus passes the verification code along. That verification code can then be used to take over the account and attack further targets.

If each old person has more than 10 similar contacts, only 10% need to fall for this for the scam to take over more and more accounts: 10 contacts × a 10% success rate yields about one new compromised account per account, so the chain sustains itself.

Replies from: Viliam
comment by Viliam · 2021-12-27T17:21:01.308Z · LW(p) · GW(p)

Thanks, I learned yet another way to scam people. But no such thing happened. My mother understands the concept of SMS, she says this did not happen, and she keeps the old messages on her phone, I checked them. Someone simply made a Whatsapp account with her phone number without her receiving any SMS message. I have no idea how that is possible -- but that is exactly my point. (And, as usual, Whatsapp does not have any customer service that we could contact and ask.)

She already changed her number, so unless the same thing happens again, we consider this problem solved. It was just an illustration of how difficult these things are to understand (even for an IT guy such as me).

Replies from: Bojadła
comment by Bojadła · 2021-12-27T23:20:47.422Z · LW(p) · GW(p)

Alternative explanation: Your mother did participate in the scam in some way and is too embarrassed to admit it. (You know your mother better than I do. I'm just saying this might have happened and you might not have considered it.)

comment by MondSemmel · 2021-12-27T17:05:24.651Z · LW(p) · GW(p)

Re: 1, I recall this being in dispute or at least oversimplified. If you put children and adults on equal footing (in particular, by giving adults the same amount of time to learn languages that children do - potentially several hours per day!), I would be astonished if children came out ahead.

(Notwithstanding some minor aspects of language which children indeed seem advantaged in learning, like speaking a language without an accent.)

comment by Adam Zerner (adamzerner) · 2021-12-26T22:09:08.356Z · LW(p) · GW(p)

The OP was hypothesizing that a lack of keeping up with tech trends leads to you "falling behind" and eventually reaching a point where it feels insurmountable to learn new tech things. It is possible that this hypothesis is true, and that young people have such a huge advantage in learning new things that this advantage outweighs their similar lack of background knowledge.

I don't get that sense though. There are some places where 40 year olds have an advantage over 5 year olds in learning new things. There are other places where 5 year olds have the advantage. Then there's the question of how wide the gap is between 5 year olds and 40 year olds. Language comes to mind as a place where the gap would be massive, but new tech doesn't feel like it should have a massive gap. My epistemic status is just musing though.

comment by Elizabeth (pktechgirl) · 2021-12-27T04:29:27.657Z · LW(p) · GW(p)

Worth noting this was an extremely brilliant and online three-year-old who had a bunch of experience with multiple devices. She might not have seen my particular phone before, but I expect she had a good grounding in UI grammar.

comment by Ben (ben-lang) · 2022-10-25T11:53:10.365Z · LW(p) · GW(p)

I had my first experience with TikTok recently. Someone was showing me some funny videos. It wasn't until the third or fourth video that I finally realised: "Oh, these are dialogues, and each time there is a jump cut to the same person talking in the same voice but with a different backdrop, that symbolises a change in speaker". The person showing me the videos could not believe this had not been obvious the first time.

comment by ViktoriaMalyasova · 2021-12-27T07:24:57.959Z · LW(p) · GW(p)

We desperately need 

Wait, didn't this post just make a case that older people don't keep up with new technology because they don't feel they need it?

New apps appeal to me less and less often. Sometimes something does look fun, like video editing, but the learning curve is so steep and I don’t need to make an Eye of the Tiger-style training montage of my friends’ baby learning to buckle his car seat that badly, so I pass it by and focus on the millions of things I want to do that don’t require learning a new technical skill.

Doesn't sound to me like you desperately need that app :)

Replies from: arunto
comment by arunto · 2021-12-27T08:28:20.624Z · LW(p) · GW(p)

That is true, but actually it is a main part of the problem. You don't need many new apps, but by not using them you can cumulatively lose (or not gain) crucial competencies you do need later on.