The Computer Delusion

floozy

Traveler
Joined
Apr 29, 2022
Messages
31
Reaction score
126
Awards
17
Website
floozy.neocities.org
I found this article while searching for research on how thought varies across speech, writing, and the internet (if you have any information on this, please send!) and I was wondering what you all had to say about it. This is focused mostly on educational programs on computers, rather than CS or anything like that. I remember being introduced to these programs meant to teach spelling, touch-typing, and mental math. Even through high school, most of our homework was assigned online. It seems natural that this development is going to teach kids to think very uniformly and algorithmically. If it's an unsophisticated machine, which it usually is, it can only grade Correct/Incorrect. I've always found this really frustrating; it's a very binary approach to learning. It might be more acceptable with hard math, but science and literature both need ambiguity and creativity to be fully appreciated. As for computer science programs in school, that's a whole different topic. Personally, I think children should learn to use computers, and one of the main flaws with "computer literacy programs" in school is that they didn't teach that. Boomers don't know how to use technology. Zoomers don't know how to use technology. We poured so much money and time into "teach everyone to code" and then just didn't do that. The programming knowledge I learned in school was laughable - what use does anyone have for Scratch if they don't understand a file directory? Why learn the syntax if you don't know the applications?

Anyway, here are some excerpts from the article if you don't feel like reading the whole thing:


The issue, perhaps, is the magnitude of the errors. Alan Lesgold, a professor of psychology and the associate director of the Learning Research and Development Center at the University of Pittsburgh, calls the computer an "amplifier," because it encourages both enlightened study practices and thoughtless ones. There's a real risk, though, that the thoughtless practices will dominate, slowly dumbing down huge numbers of tomorrow's adults. As Sherry Turkle, a professor of the sociology of science at the Massachusetts Institute of Technology and a longtime observer of children's use of computers, told me, "The possibilities of using this thing poorly so outweigh the chance of using it well, it makes people like us, who are fundamentally optimistic about computers, very reticent."

...

In any event, what is fun and what is educational may frequently be at odds. "Computers in classrooms are the filmstrips of the 1990s," Clifford Stoll, the author of Silicon Snake Oil: Second Thoughts on the Information Highway (1995), told The New York Times last year, recalling his own school days in the 1960s. "We loved them because we didn't have to think for an hour, teachers loved them because they didn't have to teach, and parents loved them because it showed their schools were high-tech. But no learning happened."

...

"It's highly motivating for them," Ortiz said as she rushed from machine to machine, attending not to math questions but to computer glitches. Those she couldn't fix she simply abandoned. "I don't know how practical it is. You see," she said, pointing to a girl counting on her fingers, "these kids still need the hands-on" -- meaning the opportunity to manipulate physical objects such as beans or colored blocks. The value of hands-on learning, child-development experts believe, is that it deeply imprints knowledge into a young child's brain, by transmitting the lessons of experience through a variety of sensory pathways. "Curiously enough," the educational psychologist Jane Healy wrote in Endangered Minds: Why Children Don't Think and What We Can Do About It (1990), "visual stimulation is probably not the main access route to nonverbal reasoning. Body movements, the ability to touch, feel, manipulate, and build sensory awareness of relationships in the physical world, are its main foundations." The problem, Healy wrote, is that "in schools, traditionally, the senses have had little status after kindergarten."

...

Reading programs get particularly bad reviews. One small but carefully controlled study went so far as to claim that Reader Rabbit, a reading program now used in more than 100,000 schools, caused students to suffer a 50 percent drop in creativity. (Apparently, after forty-nine students used the program for seven months, they were no longer able to answer open-ended questions and showed a markedly diminished ability to brainstorm with fluency and originality.)

...

Kris Meisling, a senior geological-research adviser for Mobil Oil, told me that "people who use computers a lot slowly grow rusty in their ability to think." Meisling's group creates charts and maps -- some computerized, some not -- to plot where to drill for oil. In large one-dimensional analyses, such as sorting volumes of seismic data, the computer saves vast amounts of time, sometimes making previously impossible tasks easy. This lures people in his field, Meisling believes, into using computers as much as possible. But when geologists turn to computers for "interpretive" projects, he finds, they often miss information, and their oversights are further obscured by the computer's captivating automatic design functions. This is why Meisling still works regularly with a pencil and paper -- tools that, ironically, he considers more interactive than the computer, because they force him to think implications through.

...

Worth noting that this was written in 1997. Some aspects are dated, but others have only become more relevant with online learning and Zoom University. Let me know what you think.
 
Virtual Cafe Awards

Collision

Green Tea Ice Cream
Joined
Jun 5, 2022
Messages
381
Reaction score
1,424
Awards
126
I think some of what you're saying has merit but I'm not super impressed with the article. Todd Oppenheimer is basically doing exactly what he's complaining about. He says that many of the studies in support of the use of computers in education are "...more anecdotal than conclusive" (pg. 4). However, just before this he tells us that he is examining these claims based on "...the evidence in the academic literature and in the everyday experiences [he has] observed or heard about in a variety of classrooms" (pg. 3). Since Oppenheimer provides no real citations to academic literature, it's hard to determine whether any of his claims are based on good data. The primary source of his claims seems to be interviews with teachers, which is anecdotal rather than scientific evidence anyway. On that basis alone I think it's fair to say that his claims are at least as baseless as the ones they are allegedly refuting. Frankly, I've lived through enough of this kind of "<NEWEST_THING> is dumbing down the youth!" mania to know it's not worth engaging with seriously. There are a lot of dullards in the world and the reasons for that aren't as simple as blaming television, computers, or rap music.

Personally, I think children should learn to use computers, and one of the main flaws with "computer literacy programs" in school is that they didn't. Boomers don't know how to use technology. Zoomers don't know how to use technology. We poured so much money and time into "teach everyone to code" and then just didn't do that. The programming knowledge I learned in school was laughable - what use does anyone have for Scratch if they don't understand a file directory? Why learn the syntax if you don't know the applications?
In my opinion, this is the real issue at play. Computers might not be making people dumber, but the computerization of everything is making computer illiteracy a much bigger deal. As you say, there's very little point in teaching everyone to code if that means simplifying the subject matter to a point where all the computer illiterates can do it. One of the points Oppenheimer makes that I can support, despite it being anecdotal, is just how great a difference engaged teachers can make. He mentions Dennis Frezzo, a UC Berkeley graduate, who along with a few other teachers put together a robotics lab using Lego (maybe Mindstorms) and 386 processors. Obviously, students who have an environment like that are going to perform very differently from those who have a burned-out 60-year-old as a teacher. Unfortunately, based on my experience of American public education, the latter is much more common.

I've been in and out of university-level computer education for the last decade and this problem is pervasive there too. A good instructor can make all the difference, but even a good instructor has to compete with the extremely rudimentary computer skills of most students. In my experience it's not uncommon, in such environments, to find instructors who are simply overwhelmed by students who can't use a computer without training wheels. I find there is also an attitude among educators, at least among university computer professors, that the computer engineers of the future simply aren't going to need low-level architectural knowledge anymore. If you ask, you might even be told directly that really understanding a computer, as either a user or an engineer, is a waste of time.

It's not related to neurology or psychology but if you found Oppenheimer's article interesting you might find Hard, Soft, and Wet interesting as well. I was able to pick up a used copy very cheap about a year ago. It's not what I would call a great book but I think it does capture some of the same delusionally aspirational 90s views on computers.
 

Orlando Smooth

Well-Known Traveler
Joined
Aug 12, 2019
Messages
453
Reaction score
1,702
Awards
142
"Computers in classrooms are the filmstrips of the 1990s,"
This really hit home. In the 90's, my parents loved the fact that my elementary school had a "computer room" with a couple dozen computers where we'd learn how to use Windows 95 or whatever for basic purposes. This was because they had been in the professional workforce for a while at that point, and had the sense to know that computers were going to play a massive role in the lives of their children, and that their professional success was going to depend in part on knowing how to use them. I don't fault them for this; their instincts were in many ways correct. I challenge you to find any job or career that doesn't necessitate interacting with digital technology every day - the few that still exist would be on the very low end of the income spectrum. The only thing of value I actually took away from those early computer classes was my typing ability; I can type very quickly, even blindfolded, making very few mistakes. This is extremely valuable in boosting productivity/efficiency in any kind of modern white-collar job, as I can spend substantially less time writing emails, documents, etc., than someone who cannot touch type with all ten fingers. But typing classes existed LONG before the invention of the personal computer.

I suppose you could make the point that I gained some sort of abstract familiarity and comfort with using personal computers because of this, but that was going to happen anyways. My parents had a computer at home from the time I was born because their jobs required it, and my dad taught me how to load DOS games when I was like 4 years old. You might then say that it was good for those students who did not have computers at home, but I don't know if I buy that line of argument in this specific context based on the evidence; try finding a Millennial or Gen Xer who never learned how to use a computer's basic functions. The filmstrips analogy is apt here again, as it's like saying they were good for introducing students to tape media, ignoring that virtually everyone became very familiar eventually anyways because of the VCR.

Having zoomer siblings that are substantially younger than me who never learned how to use filesystems, how to install programs without an app store, etc., because they were the iPad generation is proof to me that we need to teach basic computer literacy in schools. However, using computers to try and teach things unrelated to computers themselves does seem like a generally bad idea. At best, they can serve as supplemental material and the gateway to a wealth of background knowledge.
 

Collision

Green Tea Ice Cream
Joined
Jun 5, 2022
Messages
381
Reaction score
1,424
Awards
126
This really hit home. In the 90's, my parents loved the fact that my elementary school had a "computer room" with a couple dozen computers where we'd learn how to use Windows 95 or whatever for basic purposes. This was because they had been in the professional workforce for a while at that point, and had the sense to know that computers were going to play a massive role in the lives of their children, and that their professional success was going to depend in part on knowing how to use them. I don't fault them for this; their instincts were in many ways correct. I challenge you to find any job or career that doesn't necessitate interacting with digital technology every day - the few that still exist would be on the very low end of the income spectrum. The only thing of value I actually took away from those early computer classes was my typing ability; I can type very quickly, even blindfolded, making very few mistakes. This is extremely valuable in boosting productivity/efficiency in any kind of modern white-collar job, as I can spend substantially less time writing emails, documents, etc., than someone who cannot touch type with all ten fingers. But typing classes existed LONG before the invention of the personal computer.
I had a very similar childhood experience with school computer classes. Arguably, my elementary school could have been any number of the ones mentioned in Oppenheimer's article. We had a lab of fancy (for the time) new iMac G3's that I assume had been donated by Apple. We would go as a class to the computer lab about once a week and mostly just fool around on the machines. Eventually, probably in 4th grade, they started to teach us touch typing, which was the only big takeaway from computer time (I still remember being incredulous when they brought out a piece of cut-up cardboard box to cover each keyboard). I changed schools for 5th grade and my new school had the same iMac G3's, but had enough that each classroom had about 20 machines (besides also having a full lab with an attendant). They were much more lax about computer time than my previous school, and so a lot of students would play on the computers during lunch or recess (Neopets and Runescape were both very popular, but Flash games would get good traction too). The lab attendant had even written a few "educational" Flash games and published them on the school web page (towards the end of the year he also showed some of us how to install software cracks).
I suppose you could make the point that I gained some sort of abstract familiarity and comfort with using personal computers because of this, but that was going to happen anyways. My parents had a computer at home from the time I was born because their jobs required it, and my dad taught me how to load DOS games when I was like 4 years old. You might then say that it was good for those students who did not have computers at home, but I don't know if I buy that line of argument in this specific context based on the evidence; try finding a Millennial or Gen Xer who never learned how to use a computer's basic functions. The filmstrips analogy is apt here again, as it's like saying they were good for introducing students to tape media, ignoring that virtually everyone became very familiar eventually anyways because of the VCR.
You're right to suppose this because that's exactly what I will do. Whether you like it or not, you're a digital native, and while our experiences are similar I think that most Millennial and Gen X kids are probably not like us. I posted this in another thread as well but I think it bears repeating. As you say, your parents were both at least a little bit familiar with computers from their work. You had one in your house your whole life. They were available at your school when you were young. The same was the case for me. However, is this really the case for most Millennials or Gen Xers? Since I have no more data than Oppenheimer, you're welcome to be incredulous of this claim, but I don't think most people grew up in an environment like we did. I suppose, to follow your VCR analogy, it's really a question of what you think the basic functions of a computer are. Surely, many people in the VHS era were familiar with the operation of a VCR at a basic level (pause, play, stop, fast forward, rewind, record), but could they have been expected to do much more than pop a tape in and perform one of these elemental actions? There's a big difference between being, let's call it, film or tape literate and simply popping a VHS or Beta tape into a player. How many VCR owners could have described any of the parts of a VCR, or explained in even a rudimentary way how a VHS tape works (i.e., that the tape stores data magnetically)? Luckily, you don't need this information to use a VCR, but I think that's the rub, isn't it? We can apply these questions to computers too. I would encourage you to try asking some less nerdy Millennials to describe, for example, what a hard drive is or how WiFi works. I think that Millennial computer literacy is largely a myth.
Having zoomer siblings that are substantially younger than me who never learned how to use filesystems, how to install programs without an app store, etc., because they were the iPad generation is proof to me that we need to teach basic computer literacy in schools. However, using computers to try and teach things unrelated to computers themselves does seem like a generally bad idea. At best, they can serve as supplemental material and the gateway to a wealth of background knowledge.
I can basically agree with you on this. The only exception is that I think computers could be used for far more in an educational environment. There's a lot of room to leverage computers to supplement a single teacher. When you have 30+ students it's hard to give them each individualized attention, but I think, perhaps naively, that carefully constructed software could make a big difference. A teacher may not have time to drill you on your spelling if you're lagging behind the class, but a computer could do this just as well on its own. As you say, even without specialized software, computers can be a gateway to a wealth of supplemental material for students, if only they know how to find it.
 
Joined
Jul 5, 2022
Messages
70
Reaction score
429
Awards
36
As for computer science programs in school, that's a whole different topic. Personally, I think children should learn to use computers, and one of the main flaws with "computer literacy programs" in school is that they didn't. Boomers don't know how to use technology. Zoomers don't know how to use technology. We poured so much money and time into "teach everyone to code" and then just didn't do that. The programming knowledge I learned in school was laughable - what use does anyone have for Scratch if they don't understand a file directory? Why learn the syntax if you don't know the applications?
take a look at this paper if you have time - you won't have to read far to get the gist. it would appear that in fact it is simply not possible to teach most people how to code, regardless of how much time is spent attempting to do so, a fact that is well-known amongst CS educators. people without the natural aptitude can never form a mental model of how basic aspects of programming such as assignment work, even after several weeks' guidance on the subject. the reason schools teach Scratch is not because it's useful in any real capacity, but because it allows them to easily determine which of their students have the ability to program, and which will never be able to do so.
 

Collision

Green Tea Ice Cream
Joined
Jun 5, 2022
Messages
381
Reaction score
1,424
Awards
126
take a look at this paper if you have time - you won't have to read far to get the gist. it would appear that in fact it is simply not possible to teach most people how to code, regardless of how much time is spent attempting to do so, a fact that is well-known amongst CS educators. people without the natural aptitude can never form a mental model of how basic aspects of programming such as assignment work, even after several weeks' guidance on the subject. the reason schools teach Scratch is not because it's useful in any real capacity, but because it allows them to easily determine which of their students have the ability to program, and which will never be able to do so.
The author of this paper has retracted most of his conclusions. From the retraction:
In autumn 2005 I became clinically depressed. My physician put me on the then-standard treatment for depression, an SSRI. But she wasn't aware that for some people an SSRI doesn't gently treat depression, it puts them on the ceiling. I took the SSRI for three months, by which time I was grandiose, extremely self-righteous and very combative – myself turned up to one hundred and eleven. I did a number of very silly things whilst on the SSRI and some more in the immediate aftermath, amongst them writing "The camel has two humps". I'm fairly sure that I believed, at the time, that there were people who couldn't learn to program and that Dehnadi had proved it. The paper doesn't exactly make that claim, but it comes pretty close. Perhaps I wanted to believe it because it would explain why I'd so often failed to teach them. It was an absurd claim because I didn't have the extraordinary evidence needed to support it. I no longer believe it's true.

Because of some of the other silly things I did, I was suspended from my job at Middlesex. The university was wise enough, once the dust had settled, to recognise that there had been a mental health issue and to take me back, and it was kind enough to support me whilst I recovered from my depression. It was during that later period that Dehnadi and I asked some statistician colleagues if they could help us recover more information from his data. Was there, for example, evidence that his test predicted the performance of novice students: beyond the pass/fail distinction which he had shown he could predict to an extent, could we tell who would do well and who would do better? Were there age or sex differences? (The statisticians asked about those: we weren't looking.) After a lot of work, the answers were, by and large, that we couldn't see any such differences in our data.

Just to be clear: I do not believe that Dehnadi discovered an aptitude test for programming, as I claimed in 2006. Nor do I believe in programming sheep and non-programming goats. On the other hand, neither do I believe that further investigation showed that he'd found nothing of substance, as I implied in 2008.
 

Orlando Smooth

Well-Known Traveler
Joined
Aug 12, 2019
Messages
453
Reaction score
1,702
Awards
142
Whether you like it or not you're a digital native and while our experiences are similar I think that most Millennials and Gen X kids are probably not like us. I posted this in another thread as well but I think it bears repeating. As you say your parents were both at least a little bit familiar with computers from their work. You had one in your house your whole life. They were available at your school when you were young. The same was the case for me. However, is this really the case for most Millennials or Gen Xers?
Oh hey, I was in that thread too. Anyways, I have no qualms being labeled a digital native because that very clearly is what I am. When you ask if this is true for most Millennials or Gen Xers though: yes. I have never, and I truly mean never, met anyone in these age cohorts who didn't know absolute basic things like drag and drop, right clicking, how to operate MS Word, email basics, how to use a desktop web browser, etc. But I have seen a lot of both Boomers and Zoomers have their minds blown by these concepts, or think that they're "overly complicated nerd stuff."

I grew up true middle class (rust belt, parents occasionally struggled to pay bills, but we also had no fear of getting food on the table and were able to occasionally go on vacation) and had exposure to people who were richer, poorer, smarter, dumber, more white collar, more blue collar, generally techy, or whose families were morally opposed to technology (yes really), and so on. I don't think any of those were the dividing lines amongst people in the age cohort we're talking about. If there is a commonality amongst people born between ~1965 and ~2000 who don't know how to use desktop computers at the most basic level, I'd really love to know what it is. Any commonality other than not being American, that is. Kids I grew up with who lived in trailer parks with derelict parents still knew how to log in and play games online. Kids whose parents were paranoid about videogames making them violent and therefore sheltered them from all digital media were right at home when they sat at the family computer at a friend's house. Kids whose parents didn't own a computer because they were free-spirited hippies still knew how to bypass the school's network safeguards in order to access Flash game sites. I could go on; the point is that I find it difficult to imagine people in this age group who, by 2005 or so, still did not know how to operate a desktop at a basic level.
 

Collision

Green Tea Ice Cream
Joined
Jun 5, 2022
Messages
381
Reaction score
1,424
Awards
126
Oh hey, I was in that thread too. Anyways, I have no qualms being labeled a digital native because that very clearly is what I am. When you ask if this is true for most Millennials or Gen Xers though: yes. I have never, and I truly mean never, met anyone in these age cohorts who didn't know absolute basic things like drag and drop, right clicking, how to operate MS Word, email basics, how to use a desktop web browser, etc. But I have seen a lot of both Boomers and Zoomers have their minds blown by these concepts, or think that they're "overly complicated nerd stuff."
After having given it a little thought, I think the question is, "can most personal computer users generalize?" It's one thing to be familiar with a workflow but another thing entirely to be able to apply a general understanding of personal computers. If you take away the familiar trappings of Windows or whatever Apple device someone prefers, can they make meaningful predictions about how personal computers will behave? To use reading (i.e., regular literacy) as an example, someone who is literate is not just someone who can read a specific book or a specific style of book. Someone who is literate can generalize their knowledge of written language to read and understand most documents, even if the style is initially unfamiliar to them. I think you can also take this a step further and expect, for example, people literate in English to be able to generalize this skill to other languages. A literate English-speaking student of Chinese should not need to be taught how to read again, only the grammar and vocabulary required to understand the new language. Is the same true of Millennials (or Gen X if you prefer) with computers? If you were to take away the Windows or Apple GUI and drop such a person into the Linux framebuffer terminal, would they still be able to apply a basic understanding that it is a personal computer system with files, directories, and programs, or would they be stuck? If they cannot, then it seems to me all they've done is memorize a workflow without any real understanding of the organization of the device itself.
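To make the generalization point concrete, here's a minimal sketch in Python (the path is made up, purely illustrative): on Windows, macOS, or Linux alike, storage is one tree of directories with files at the leaves, and someone who has internalized that model can predict how an unfamiliar system is laid out, whatever the GUI looks like.

```python
from pathlib import PurePosixPath

# The "generalizable" mental model: one tree of directories, files at
# the leaves. The surface GUI differs between systems; the structure
# does not. (Hypothetical path, used only for illustration.)
doc = PurePosixPath("/home/user/notes/todo.txt")

print(doc.parts)   # ('/', 'home', 'user', 'notes', 'todo.txt')
print(doc.parent)  # /home/user/notes
print(doc.suffix)  # .txt
```

Anyone who can read that decomposition can navigate an unfamiliar filesystem; someone who has only memorized a Windows workflow cannot.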

wow fair enough then... i bookmarked the original paper a while ago and wasn't aware of the retraction, thanks for correcting me
No worries, I read the original paper and thought the style seemed unprofessional, so I had to check if it was for real. :pugPls: I'm curious what other "silly things" the author did to get himself suspended.
 

Fairykang

Cybernetic Esotericist
Joined
Dec 26, 2021
Messages
212
Reaction score
488
Awards
71
The only thing of value I actually took away from those early computer classes was my typing abilities
That and getting around porn filters for me.

There's a big difference between being, let's call it, film or tape literate and simply popping a VHS or Beta tape into a player.
The mystery of the blinking clock on the VCR is still part of popular culture to this day.

My contribution to this thread is an observation in this article that zoomers don't have the computer literacy to know what a folder on a computer is. Search functions in newer versions of Windows and macOS lead to the same problems calculators do with basic arithmetic.

 
Joined
Jul 5, 2022
Messages
70
Reaction score
429
Awards
36
That and getting around porn filters for me.


The mystery of the blinking clock on the VCR is still within popular culture to this day.

My contribution to this thread is an observation in this article that zoomers don't have the computer literacy to know what a folder on a computer is. Search functions in newer versions of Windows and macOS lead to the same problems calculators do with basic arithmetic.

nice article and something that i've observed too. probably caused by smartphones saving your files wherever and only showing you the relevant ones by context. e.g. if you try to send a photo it will open your photos folder with no indication of the file's specific location in storage. i think for a layman it's actually not a bad way to handle it though.
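to sketch the difference (python, made-up file names, just an illustration): the hierarchical model means knowing *where* a file lives, while the smartphone model keeps a flat index of files by metadata, so the user only ever queries ("show me my photos"), never navigates.

```python
from pathlib import PurePosixPath

# hierarchical model: the user has to know where the file lives
photo = PurePosixPath("DCIM/2022/vacation/IMG_0042.jpg")

# flat "search" model: the OS indexes files by metadata and surfaces
# them by context, hiding the actual storage location entirely
index = [
    {"name": "IMG_0042.jpg", "kind": "photo", "path": photo},
    {"name": "notes.txt", "kind": "document",
     "path": PurePosixPath("Documents/notes.txt")},
]

def query(kind):
    """Return every indexed file of the given kind, wherever it's stored."""
    return [entry["name"] for entry in index if entry["kind"] == kind]

print(query("photo"))  # ['IMG_0042.jpg']
```

fine for a layman like you say, but a user raised only on `query()` never learns that the tree underneath exists.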
 

Collision

Green Tea Ice Cream
Joined
Jun 5, 2022
Messages
381
Reaction score
1,424
Awards
126
I'm not sure what I find most terrifying about this article. The Verge outright suggesting that using Instagram is basically the same as knowing how to navigate a hierarchical filesystem is pretty crazy. Sure, kids today don't understand how the device they use to access Instagram works, but their teachers don't use Instagram at all, so maybe it's the teachers who are stupid? :?Puzzled?:
 

240p

Internet Refugee
Joined
Jul 21, 2022
Messages
2
Reaction score
3
Awards
1
The ever-increasing dependence on technology combined with a decrease in computer literacy is concerning.

One doesn't need to know the inner workings of a car in order to use it and benefit from it. However, there has always been a steady supply of people who DO understand how cars work and are even capable of running production lines to make them.

The problem I see is that the number of "programmers" may be increasing, but the number of programmers who have even a basic understanding of how a computer works all the way down is decreasing. Additionally, the more abstracted computing becomes, the more difficult it becomes to maintain an understanding of all the abstractions.

Good talk on the subject:
View: https://m.youtube.com/watch?v=ZSRHeXYDLko
 
