AI will make our children stupid

(thecritic.co.uk)

53 points | by binning 2 hours ago

21 comments

  • niceguy1827 2 hours ago
    Aren't people already stupid enough? The author wrote this article without even checking whether the existing trend in children's IQ already shows some level of stupidity.

    And please excuse my language. I probably watch George Carlin videos a bit too much.

    > For example, a 2018 analysis by researchers at Northwestern University and the University of Oregon found that average IQ scores in the U.S. began declining slightly after 1995, particularly in younger generations. This reversal mirrors findings in several European countries, including Norway, Denmark, and the UK.

    https://nchstats.com/average-iq-by-state-in-us/

    • hn_throwaway_99 1 hour ago
      The human body is famously a "use it or lose it" system. For example, the US (and most of the developed world) has seen a large reduction in grip strength over just the last 40 years as Americans get ever more sedentary. I think most people of a certain age can relate to how they've gotten a lot worse at remembering and following directions now that "the Google lady" just tells you right where to turn.

      The same thing is happening/will happen with AI. If you don't go through the hard brain work of thinking things up for yourself, especially writing, your writing skills will deteriorate. We'll see that on a giant scale as more and more kids lean on ChatGPT to "check their homework".

    • cons0le 2 hours ago
      Isn't AI too new to study its effects on kids?
    • ares623 1 hour ago
      “Bad thing X is already happening. If that’s not being solved then making X exponentially worse is therefore okay.”

      What are 20 PRs per day worth?

      Engineers will literally burn the world if it means looking good for their employers.

    • bgwalter 2 hours ago
      The authors' (there are two) position is that yes, people are already stupid enough, but it will get much worse:

      "We may soon look back on this era of TikTok, Love Island and Zack Polanski as an age of dignity and restraint."

    • FpUser 2 hours ago
      >"I probably watch George Carlin videos a bit too much."

      My favorite. Love the guy. Too bad he is dead.

  • xnx 2 hours ago
    AI will be a super-tutor for the curious and a tool to outsource all thinking for the incurious.
    • WhyOhWhyQ 2 hours ago
      The job doesn't pay you to be curious. It pays you to get stuff done. Curiosity makes you jobless. Most of the Silicon Valley people who frequent this website larp as curious people, but are basically incurious status seekers.
      • fn-mote 2 hours ago
        > The job doesn't pay you to be curious.

        YOUR job doesn’t pay you to be curious.

        Well, you could say mine doesn’t either, literally, but the only reason I am in this role, and the driving force behind my major accomplishments in the last 10 years, has been my curiosity. It led me to do things nobody in my area had the (ability|foolishness) to do, and then it led me to develop enough improvements that things work really well now.

        • agumonkey 1 hour ago
          I'd be curious whether jobs like yours aren't on the tail of the distribution. It's very common that in work groups, curiosity / creativity gets ignored if not punished. I've seen this even in small groups of techies: boundaries naturally emerged that people weren't supposed to think beyond (you're overstepping, that's not your role, you're doing too much). It seems like a Pavlovian reflex when leadership doesn't know how to operate without assigning roles.
        • wrs 1 hour ago
          I mean, think of all the people getting paid eight-digit compensation right now because they were curious about this dead-end deep learning stuff 15 years ago for no good reason!
          • WhyOhWhyQ 1 hour ago
            I couldn't resist... Like the kid at Facebook who's buddies with Altman so gets to be a billionaire? Like Altman himself (when did he enter the field again? Oh yeah, he was a crypto huckster). Like everyone I've ever met in the machine learning department? 95% of the people in that field are just following trends and good at winning that game. Call it sour grapes, but I'm just observing reality here. And everyone who thinks following fads = being curious is just doing the larp I described earlier. Moreover, everyone who thinks following fads keeps them safe from AI is deluding themselves. The AI of 2026 can do it better than you can.
      • armchairhacker 2 hours ago
        We need some curious people. Otherwise nothing gets discovered, including solutions to future problems.
        • Spooky23 2 hours ago
          We do. But the would-be modern nobility are quite happy with being a sort of feudal lord.
        • WhyOhWhyQ 2 hours ago
          Fully expecting to get banned for my comment, but I'll just go on. Look at the Silicon Valley heroes and they're all business types. There are a few rare exceptions.
          • eklavya 1 hour ago
            Calm down. Hardly any drama except yours.
            • WhyOhWhyQ 1 hour ago
              You're right. Signing off for the day.
      • AnimalMuppet 2 hours ago
        Curiosity as your only trait makes you jobless. Curiosity enough to learn something new can help you remain employed.
    • turtletontine 2 hours ago
      I don’t necessarily think you’re wrong, but I’m skeptical that the curious will really meaningfully learn from LLMs. There’s a huge gap between reading something and thinking “gee that’s interesting, I’m glad I know that now,” and really doing the work and deeply understanding things.

      This is part of what good teaching is about! The most brilliant, engaged students will watch a lecture and think "wow nice I understand it now!" and as soon as they try to do the homework they realize there's all kinds of subtleties they didn't consider. That's why pedagogically well-crafted assignments are so important: they force students to really learn and guide them along the way.

      But of course, all this is difficult and time consuming, while having a “conversation” with a language model is quick and easy. It will even write you flowery compliments about how smart you are every time you ask a follow up question!

      • tnias23 2 hours ago
        I find LLMs useful for quickly building mental models for unfamiliar topics. This means that instead of beating my head against the wall trying to figure out the mental model, I can beat my head against the wall trying to do next steps, like learning the lower level details or the higher level implications. Whatever is lost not having to struggle through figuring out the mental model is easily outweighed by being able to spend that time applying myself elsewhere.
        • wrs 1 hour ago
          I have some success by trying to explain something to an LLM, having it correct me with its own explanation that isn’t quite right either, correcting it with a revised explanation, round and round until I think I get it.

          Sort of the Feynman method but with an LLM rubber duck.
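
          A rough sketch of that loop in Python, just to make the idea concrete (this assumes the OpenAI client; the model name, prompts, and the manual revision step are placeholders, not a recommendation of any particular setup):

            # Toy "LLM rubber duck" loop: explain, get critiqued, revise, repeat.
            # Assumes the openai Python package and an OPENAI_API_KEY in the env.
            from openai import OpenAI

            client = OpenAI()

            def critique(explanation: str) -> str:
                """Ask the model to point out errors or gaps, then give its own take."""
                resp = client.chat.completions.create(
                    model="gpt-4o-mini",  # placeholder model name
                    messages=[
                        {"role": "system",
                         "content": "Point out errors or gaps in the user's explanation, "
                                    "then give your own explanation."},
                        {"role": "user", "content": explanation},
                    ],
                )
                return resp.choices[0].message.content

            explanation = "My current understanding of topic X is ..."
            for _ in range(3):  # a few rounds is usually enough
                print(critique(explanation))
                explanation = input("Revised explanation: ")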

    • agumonkey 1 hour ago
      Yes, and it will mostly depend on the culture / economy. If you create incentives for kids to explore ideas through LLMs, they'll become very knowledgeable (and maybe somewhat confident). Otherwise it will be the TikTok of cognition.

      Ten bucks says there will be a law enforcing exponential backoff, so that you need to get good after a few questions before the LLM starts delaying things by an hour.
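
      Just to spell out the joke, a toy sketch of what per-question exponential backoff could look like (all numbers and the cap are made up):

        # Hypothetical per-session backoff: the first few questions are free,
        # then the delay doubles with each extra question, capped at one hour.
        FREE_QUESTIONS = 3
        BASE_DELAY = 5.0      # seconds for the first delayed question
        MAX_DELAY = 3600.0    # one-hour cap

        def delay_for(question_count: int) -> float:
            """Delay (seconds) before answering the Nth question in a session."""
            if question_count <= FREE_QUESTIONS:
                return 0.0
            return min(BASE_DELAY * 2 ** (question_count - FREE_QUESTIONS - 1), MAX_DELAY)

        for n in range(1, 15):
            print(n, delay_for(n))  # 0, 0, 0, 5, 10, 20, ... up to the 3600s cap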

    • nineteen999 2 hours ago
      I mean, it's totally possible to be curious about some things and less curious about others.

      There's few things more annoying than a human that thinks it has the most accurate and up-to-date AI-level knowledge about everything.

    • AlexandrB 2 hours ago
      This is assuming the current AI business model (losing lots of money). As with the internet as a whole, AI companies will probably be incentivized to waste your time and increase "engagement" as they seek revenue. At that point, AI will only be a good tutor if you're extremely diligent at avoiding the engagement bait.
    • aeon_ai 2 hours ago
      Amen.
  • ChrisMarshallNY 2 hours ago
    Yeah, when they allowed calculators in the classroom, we all started getting dumber.

    If that sounded silly, it's exactly what they said would happen when calculators came to pass (I grew up in the last generation where they weren't allowed, and I know lots of folks younger than me who I think are smarter than I am).

    • barapa 2 hours ago
      I don't find this all that compelling. Different technologies can have different effects. And why would future effects be influenced by the accuracy of random people's predictions of other events in the past?
    • femiagbabiaka 2 hours ago
      The calculator analogy doesn’t really work, speaking as someone who is more of an AI booster than skeptic. The addition of calculators to the classroom necessitated a change in pedagogy. So now kids learn how to do math without them, and then add them once the fundamentals are there. Learning how to think is even more foundational.
      • PaulDavisThe1st 1 hour ago
        Quite debatable. Learning how to think frequently involves basic math skills like "hmm, they claim a two-order-of-magnitude effect, but is that even feasible?" When you can't do "math" like that in your head, your ability to think is significantly impaired, as we are currently seeing.
        • femiagbabiaka 1 hour ago
          I think we agree? If LLMs will be included in classroom learning at all, it has to be done with an understanding of how it will affect learning outcomes, and it’s not clear that the effect will be the same as introducing calculators was, at all.
          • ChrisMarshallNY 1 hour ago
            I don't think it will be the same, but I also don't think we have any idea how it will actually affect us. Human development is a truly chaotic system. Psychohistory is fiction. Maybe AI will eventually enable it, but it sure ain't here now.

            If we are looking just at IQ scores, then there are literally billions of factors involved. It could be chemistry, nuclear radiation, malnutrition, stress, etc.

            Most of that stuff is totally unpredictable, and we can only tell, after the fact.

            Alea jacta est. There's nothing we can do about it. The candy ain't going back into the piñata. We'll just have to see what happens.

  • aqula 1 hour ago
    People said the same thing about the internet: that you should be getting your information from actual books and that the internet would make you lazy and complacent. There was a time when you were encouraged to write code on paper first instead of typing it directly, because it made you think clearly. Some time before that, calculators apparently dulled your mental faculties, so you should have been hand-rolling all your calculations. Go back in time far enough and you'll find Socrates disparaging writing because it weakens your memory and destroys your mind.

    And yet humanity is here and seems to be doing all right. Every generation has managed to produce smart people who have been able to push the boundaries of scientific and technological progress. If anything we may be getting smarter. What history has repeatedly shown is that when you reduce friction for the human brain, it goes and finds more complex things to do. Such a periodic removal of friction may very much be a necessity for progress, because it allows the paradigm of thought to shift to a higher level. The same should happen with AI as well.
  • m4ck_ 2 hours ago
    yeah but it's totally gonna usher us into a workless utopia where everyone has everything they ever wanted, because everything will be free! Or at least it will if we allow AI companies to operate completely unregulated and unimpeded.
  • jerome-jh 1 hour ago
    AI can be a great tool. It can make our children (and us) lazier, but not necessarily stupider. Short video platforms OTOH certainly make our children stupider (and depressed).
  • mo_42 1 hour ago
    Did the invention of the steam engine and all other heavy machines make us physically weaker? I guess so. People working on (literal) heavy stuff don't need the strength they used to.

    But now they move around even more heavy stuff with machines.

    I think something similar might happen to our brains. Maybe we won't be able to work ourselves through every detail of a mathematical proof, of a software program, or a treatise on philosophy. But we'll be able to accomplish intellectual work that only really smart people could accomplish. I think this is what counts: outcome.

  • pluc 2 hours ago
    Don't worry; by the time your children are effectively stupid, you will be stupid enough not to realize it and instead will praise them for how well they can verbalize what they want. You will call it cognitive progress and you will thank AI for it.
  • spwa4 2 hours ago
    Hasn't Tiktok already done that?

    Oh, and there's the extreme brain drain the West imposed on everyone else, from South Africa to China. It left hardly any available "brains", let's say, in those countries, while in the rich countries the only brains available aren't invested in making Westerners smart. Add to that a disdain, among existing populations, for professions that require brains.

    • FrankyHollywood 2 hours ago
      Don't know if TikTok is the problem; a generation ago (some) kids mindlessly watched cartoons for hours a day.

      I think this is mostly about learning to think and develop grit.

      As a kid, when I wanted to play a game I had to learn DOS commands, know how to troubleshoot a non-functioning Sound Blaster, etc. It sometimes took me days to fix.

      Doing this develops understanding of a domain, problem-solving skills and grit.

      My kid just opens steam and everything works. Any question he has he asks AI. I am really curious what effect this will have on this generation. It is tempting to quickly say "they will be brain dead zombies" but that seems too simplistic.

      In 20yrs we'll know!

      • Fire-Dragon-DoL 1 hour ago
        I keep seeing that people ask questions to AI.

        That sounds like a dedicated teacher though, not that bad?

        Like asking questions and learning how and what questions to ask is an amazing skill

      • spwa4 1 hour ago
        > My kid just opens steam and everything works. Any question he has he asks AI

        (... and then presumably he applies what the AI tells him, occasionally asking why)

        Frankly, this is a much better and targeted way to learn. If this is what happens, great!

        I mean, I'd give him an intro on how to pirate games, because:

        1) it's a technical challenge with a built-in reward

        2) AIs (especially Gemini, but more and more ChatGPT too) refuse to help with it

        So a truly excellent pursuit for learning!

        But I do feel it's very different from what happens with smartphones, and that is desperately bad.

    • johnfn 2 hours ago
      Yes, yes, it's certainly not social media, or the plethora of apps that cater to and in some ways create an ever-shortening attention span (Reddit, TikTok, Facebook, Instagram, ...). It's definitely that thing you can use to research and learn anything you could ever want -- that is the thing which will unquestionably make our children stupid.
      • forgetfreeman 2 hours ago
        If it takes a few thousand pages of textbooks or other reference material to gain competence with a given topic, how is consuming superficial summaries provided by AI expected to produce comparable results?
        • NeutralCrane 2 hours ago
          > If it takes a few thousand pages of textbooks or other reference material to gain competence

          This is a huge assumption and not one I’m sure holds up. In my experience gaining competence is often more a matter of hands on experimentation and experience, and the thousands of pages of reference material are there to get you to the point where you can start getting hands on experience, and debug your experiments when they don’t work. If AI can meaningfully cut back on that by more efficiently getting people to the experimentation stage, it absolutely will be more effective. And so far in my limited experience, it seems extremely promising.

    • grugagag 2 hours ago
      Stupid is a continuum. The tiktok stupid may pale in comparison if AI is blindly implemented at all levels of education.
  • mwkaufma 1 hour ago
    I like a scathing critique of overly-hyped chatbots as much as the next guy, but leading with the pseudoscience of IQ scores has the persuasive impact of a farting noise.
  • chneu 2 hours ago
    I think blaming AI isn't quite right.

    I think the current mentality of "Make every process in life as easy and time-efficient as possible" is the problem.

    AI is just a tool. What someone does with it is up to them. The current desire to not do anything, however, means people will abuse AI to make their lives more segregated from the work that enables them.

    As technology progresses, people are less connected to the how and why of life. This leads to people not understanding how to do basic things. Nobody can do anything on their own and they have to pay money to someone for really basic stuff. People can hardly go grocery shopping anymore as it takes too much time. Peak capitalism?

    Really, just watch Idiocracy. AI isn't the problem; people's desire to do as little as possible is the problem.

    • baal80spam 1 hour ago
      In theory, "making every process in life as easy and time-efficient as possible" was supposed to "enable humans to perform more creative and complex tasks requiring TRUE intelligence!"

      In practice, they just spend all that saved time scrolling tiktok.

  • grouchomarx 2 hours ago
    Maybe, did shoes make us worse at walking barefoot?
    • satvikpendem 2 hours ago
      Yes it did, and it can nearly permanently alter the shape of your foot such that it's worse for walking in general.
    • marcosdumay 1 hour ago
      Yes, but they made us much better at walking.
    • singpolyma3 2 hours ago
      Yes
  • FpUser 2 hours ago
    >"AI will make our children stupid"

    AI is too late for the party. Mission already accomplished

  • luxuryballs 2 hours ago
    It’s like a new form of the old rule that doing the average thing will keep you average; it’s always been the case that you have to go above and beyond to do better. So in that way it could be argued that the lazy/average crowd might be better off with more of the computer doing the work for them.
  • brazukadev 2 hours ago
    Kids were stupid enough when I was a kid. I'm 100% sure it's not possible to be more stupid than me and my childhood friends were, and it turns out most of us are doing OK. The internet helped some of us become super smart and some of us super dumb; the same will happen with AI.
  • xvector 2 hours ago
    I would have benefited so much from AI when learning concepts, needing things explained to me in just a slightly different way, or confirming my understanding of a phenomenon.

    As usual, it comes down to parenting. Bad parents will blame AI for their kids being stupid, just as they blame TikTok or whatever today.

  • knowitnone3 2 hours ago
    Our children are already stupid and have been for a while. Watch Jay Leno asking questions on the street or Veritasium asking about the scale of the universe[1]. Looking at the setting, this was asked on a college campus! Then there are colleges having to teach remedial math because college students can't even do basic math[2].

    1. https://www.youtube.com/watch?v=fG8SwAFQFuU

    2. https://nypost.com/2025/04/05/opinion/harvard-univ-the-ivy-l...
  • the_real_cher 1 hour ago
    I hate that people think this.

    AI is a superfast internet search.

    Imagine if you had that growing up: instant access to any information, with a professorial level of teaching, and the ability to ask any question to clear up any confusion.

    Our kids are going to be smarter than we could even imagine, because they get instant access to any information they can imagine, taught by a perfect tutor.

  • A4ET8a8uTh0_v2 1 hour ago
    I am taking a contrary stance, and not even as a contrarian voice, but based on basic patterns over the past few centuries. The whole article is geared towards a specific audience. The interesting thing is that it is not exactly wrong, but the way it presents facts is intended for a specific type of consumption: in this case -- generating anti-AI sentiment.

    << Fundamental skills like mental arithmetic, memorising text, or reading a map could soon be obsolete as cognitive offloading becomes a normal way of working.

    Calculators, books, GPS -- the three have been trotted out each time, and some ( what passed for books in ancient days ) were decried by otherwise smart people who simply could not fathom a different way of solving an issue. Worse, they offered no reason for:

    1. Why do I need to calculate everything in my head?

    2. Why do I need to memorize every passage?

    3. Why do I need to remember every step?

    So the kids, who saw an improvement, simply ignored the old men... and good thing too. Otherwise, I might not even have been able to read Beowulf ( literally ).

    << it’s also the desire among people in positions of authority and influence

    Is it? Recent news suggested that execs of various tech corps limit their kids' passive screen time ( so no doom scrolling, no social media ).

    << able to retain concentration so that we can learn and distinguish between what is real and what is AI slop

    True, but in a sense that has always been true. If so, what is the real reason for this 'collection of words'?

    << The danger here is the separation of process from “product”. In the eyes of the utilitarian tech-evangelist, the essay is simply a product, a sequence of words to be generated as quickly as possible.

    And here is the issue. The author is concerned that their words are no longer going to be special; note, not completely unlike certain monks upon learning about the printing press. How quaint.

    << But the process of writing is itself constitutive of understanding. Writing is thinking. It is the act of retrieving knowledge, wrestling with syntax, and organising logic that forges understanding.

    Have you read some of the articles out there ( including this one )? There is no wrestling there. There might ( I am being charitable ) be some thinking, but if there is logic OR understanding, it is not beyond what is required for serving the owner of the writer. That is all there is to it.

    << When AI produces the final text, the student is the ventriloquist’s dummy, mouthing words that originated elsewhere.

    Well, I will be darned. This individual is just taking words out of my mouth, because I was about to say all those talking ( sorry, writing ) heads are just parroting one another with the origin of the sound ( sorry again, word ) clearly not coming from them..

    << They possess the answer but lack the understanding of how it was derived

    So.. we ban encyclopedias?

    << We are also witnessing a kind of cognitive laziness which some of our institutions are actively encouraging.

    I can give him that. It does take effort not to rely on it.

    << It requires the uncomfortable sensation of not knowing

    But... but.. the author knows.. he just told us all what to think...

    << float on a sea of algorithmic slop they have neither the will nor the wit to navigate.

    And this is different from now how exactly? Scale? Kids who want to read will read. Kids who want to learn, will learn.

    ***

    Honestly, I am hard pressed not to say this article is slop. Not even proper AI slop like we would expect today ( edit: because at least that is entertaining ). This is lazy human slop. High and mighty, but based on 'old man yells at the cloud' vibes.

  • dmarchand90 1 hour ago
    [flagged]