Index » Radio Paradise/General » General Discussion » Artificial Intelligence
miamizsun

Location: (3283.1 Miles SE of RP) Gender:  
|
Posted:
May 14, 2025 - 6:16am |
|
Steely_D wrote:
That's a great thought.
it is
embrace new/updated technologies
teach people how to use them/tools
(or risk being left behind)
|
|
R_P

Gender:  
|
|
Steely_D

Location: The foot of Mount Belzoni Gender:  
|
Posted:
May 13, 2025 - 11:46am |
|
Egctheow wrote:
we (the teachers) are not after the products, but that we're only using them as a way for them to fine-tune their processes.
That's a great thought.
|
|
Egctheow

Location: In over my head Gender:  
|
Posted:
May 13, 2025 - 8:51am |
|
R_P wrote:
Thank you for that cool article. I found in there a lot of the thoughts that I hadn't articulated so far.
The risk, in education, is indeed to confuse the product (an essay, thesis, commentary...) with the process. I'm amazed at how few of my students realise that we (the teachers) are not after the products; we're only using them as a way for them to fine-tune their processes. If the product is created by the process, the product nonetheless shapes the process. Processes can only be learned by performing them for oneself, and that's what students (and many educators) don't seem to realise.
One can also point out to students that if they rely on AI to come up with what they've been assigned to do, why would an employer need them? The employer would probably save on wages by interacting with AI directly.
It is really giving me the creeps though.
|
|
R_P

Gender:  
|
Posted:
May 12, 2025 - 4:19pm |
|
Philosophically A Matter of Words
What can university AI committees actually do?
|
|
q4Fry

Location: Currently stateside Gender:  
|
Posted:
May 10, 2025 - 10:01am |
|
|
|
Egctheow

Location: In over my head Gender:  
|
Posted:
May 9, 2025 - 10:15am |
|
Proclivities wrote:
It seems to me that there are always inherent dangers, or at least drawbacks, to any advances in technologies. I understand your concern for these AI tools being used (or abused) in education. I work in an education-related industry and also live in an area where there are three large universities and several other smaller ones. Over the years I have come to know many people in academic and medical fields; for those who are in science fields like biology, medicine, chemistry, etc., they see some advantages for AI applications for predictions, calculations, etc., but also shortcomings for relying on it too much without thoroughly questioning the results. I have a good friend who is a professor of English Literature - he doesn't see much constructive use for AI other than as a "Cliff's Notes" kind of assistant, or as a conversational tool - similar to the exchange you had posted, which I found interesting.
I am also a visual artist, so I see how AI presents a number of threats to the creative arts fields - mainly plagiarism - or at least some forms of infringement. Music, literature, and visual arts are very susceptible to this hijacking of existing materials. Of course, all creative endeavors are derived from previous works to varying extents, so it remains to be seen how it will wash out.
ChatGPT currently has an AI animation application that mimics the Studio Ghibli style of the Japanese animator and filmmaker Hayao Miyazaki - he said he was "disgusted" by it, and that "Whoever creates this stuff has no idea what pain is."
Anyhow, good topic and good points.
Thanks, I agree on all points. I've also read about AI significantly increasing the quality of diagnostics based on medical imaging, since AI can be exposed to many, many more previous scans or X-rays than any human doctor, and I obviously have nothing against that.
I can't imagine what it must be like to see your creative work taken away and appropriated by a company-owned machine, including for purposes or contexts that you would definitely not approve of. Yes, artists and people in general have always drawn inspiration from other authors and people, and that's great, but there are laws and mechanisms in place to protect the work and livelihood of those authors. That is far from guaranteed with AI. There have been interesting articles in the Guardian on the pushback by musicians and visual artists defending their right to refuse (or ask payment for) the use of their work by AI (here), as well as one on writers' responses to AI creative writing (here). I had a good laugh when AI executives started to complain about DeepSeek.
Regardless of the education context, what really worries me is the exponential improvement of AI in general; I keep asking myself, what's the limit to this?
I used to warn students about the use of automatic translators, explaining that while they could be OK when the objective was to get information (and even then the results should be questioned), the benefit was next to nothing if the point was to learn the language. For example, a bilingual dictionary remains the best tool when they're just missing a word.
In September last year, as I'd been doing over the last few years, I tried to show them how Google Translate worked. In French, the words for "paperclip" and "trombone" are the same. So I'd type, in French, "He took a 'trombone' to the office", and Google rightly uses "paperclip" in the English translation. Then I would complete the sentence: "He took a 'trombone' to the office, he wanted to play a jazz tune." Google then changes "paperclip" to "trombone". It used to be that dividing the clauses into two sentences, using a full stop instead of a comma, would make Google change the word back to "paperclip"; no longer, it now keeps "trombone" despite the full stop. It's becoming more and more accurate. So is there going to be a point to interpreters or translators, or even language teachers, in the near future? I can hear Danny DeVito's speech on buggy whip makers in Other People's Money.
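Purely as a toy illustration of that context effect (a real neural translator uses learned sentence-level context, nothing like a hand-written rule; the cue list and function name below are invented for the example), the sense flip can be sketched in a few lines of Python:

```python
# Toy disambiguator for the French word "trombone", which means both
# "paperclip" and "trombone" (the instrument). This only illustrates
# why nearby words can flip the chosen sense; it is not how any real
# machine-translation system works.

MUSIC_CUES = {"jazz", "jouer", "play", "tune", "band", "music"}

def translate_trombone(french_sentence: str) -> str:
    """Pick an English sense for 'trombone' from simple lexical cues."""
    words = {w.strip(".,;'\"").lower() for w in french_sentence.split()}
    # If any musical cue word appears in the sentence, choose the
    # instrument sense; otherwise default to the office-supply sense.
    return "trombone" if words & MUSIC_CUES else "paperclip"

print(translate_trombone("Il a pris un trombone au bureau"))
print(translate_trombone("Il voulait jouer un air de jazz avec son trombone"))
```

The interesting part of the anecdote is exactly what this sketch cannot do: modern translators carry that disambiguating context across a full-stop sentence boundary, where a rule like this one sees only the current sentence.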
And eventually, to come back to education, when all is said and done, isn't it going to end up as what's described in the article linked by R_P: student-used AI being assessed by teacher-used AI on AI-generated content?
|
|
ScottFromWyoming

Location: Powell Gender:  
|
Posted:
May 9, 2025 - 8:23am |
|
rgio wrote:So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as is teacher ability. They need to be an expert in facilitation as much as (or more so than) in Faust or Physics.

When I went through "OEC" training for Ski Patrol, we'd meet twice a week for lectures and demonstrations and close out the 3-4 hour class with some hands-on work. During the week we were expected to practice on our family, the dog, ourselves, etc., then have a big weekend-long "practical" at the end of the course. Now they do all their lectures at home, with quizzes on terminology and physiology etc., and the weekly 2-3 hour meeting is all practical. By the end of it they have spent a LOT more time on practicals and a lot less time away from home, and retention is far better. Plus, as you say, it relies less heavily on the instructor being an expert on the book-learnin' part.
|
|
Proclivities

Location: Paris of the Piedmont Gender:  
|
Posted:
May 9, 2025 - 8:22am |
|
Egctheow wrote:
______________________
I can't say this is making me feel any better about AI. On some level I feel stupid, kind of like a Luddite, and I can't see any way of escaping it now, but I have absolutely zero faith in the education institutions (at least in my country) to help prevent AI from further widening the gap between students who learn how to think and those who don't.
Sorry for the long post, but I really did not know who else to turn to. Most of my colleagues can't really be bothered to think about this, and some of them even use AI in their course material. Anyway, I would very much like to hear your thoughts.
Thank you
It seems to me that there are always inherent dangers, or at least drawbacks, to any advances in technologies. I understand your concern for these AI tools being used (or abused) in education. I work in an education-related industry and also live in an area where there are three large universities and several other smaller ones. Over the years I have come to know many people in academic and medical fields; for those who are in science fields like biology, medicine, chemistry, etc., they see some advantages for AI applications for predictions, calculations, etc., but also shortcomings for relying on it too much without thoroughly questioning the results. I have a good friend who is a professor of English Literature - he doesn't see much constructive use for AI other than as a "Cliff's Notes" kind of assistant, or as a conversational tool - similar to the exchange you had posted, which I found interesting.
I am also a visual artist, so I see how AI presents a number of threats to the creative arts fields - mainly plagiarism - or at least some forms of infringement. Music, literature, and visual arts are very susceptible to this hijacking of existing materials. Of course, all creative endeavors are derived from previous works to varying extents, so it remains to be seen how it will wash out.
ChatGPT currently has an AI animation application that mimics the Studio Ghibli style of the Japanese animator and filmmaker Hayao Miyazaki - he said he was "disgusted" by it, and that "Whoever creates this stuff has no idea what pain is."
Anyhow, good topic and good points.
|
|
Egctheow

Location: In over my head Gender:  
|
Posted:
May 9, 2025 - 6:36am |
|
rgio wrote:
If I can summarize your post, I think you're saying "what good are facts and information if you don't know how to assess and use them?"
We were just having a discussion earlier in the week within my nuclear family's group chat. My spouse is a substitute teacher, and she received instructions from a 10th-grade English teacher she was covering for to have everyone "turn their desks around" so that she could watch them from the desk and make sure they weren't using AI while doing a group assignment. The family note was sent with a tone of "is this really necessary?"
I responded with a "50 word summary in the voice of a 3rd grader" to the assignment, written by my best friend Claude, in about 30 seconds... and it finally hit her just how powerful these tools are.
I am a firm believer that people at every age need to exercise their brains, occasionally to the point of exhaustion. I recall reading a study a few years ago that noted the lower incidence of dementia among the medical profession. One conclusion was that high-stress, high-focus work like surgery and multi-factor analysis keeps the brain busy building pathways and solving problems, which results in better outcomes.
All of that is a bit of a preamble to the answer for the question you asked.
I would ban all technology from the classroom. Full stop.
Not only phones, but laptops too. Paper....pencils...overhead projectors. Old school. I can see a monumental difference between kids born last century and this one, primarily due to the explosion of the smartphone in the early 2010s. When middle-schoolers started using social media, their brain chemistry changed. The inability to "think" is a long-term contagion that's going to end badly.
So what do you do?
I think the banning of technology in the classroom has to come with fundamental changes to the methods used by educators. Learning activity "types" have been ignored with the advent of technology, and if they were appreciated more, the delivery of education would change. I suggest that "educational work" should be classified into individual and group activities. Historically, group activities were lectures (delivering new material at scale), and individual ones were "homework" (embedding that information within the student). The AI problem is that the embedding isn't happening. The mental effort isn't happening... thus the "education" isn't taking hold.
So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as is teacher ability. They need to be an expert in facilitation as much as (or more so than) in Faust or Physics.
I am a huge fan of the Harkness table approach to education. It's still a primary tool at some of the most elite private/boarding schools in the US, and I believe its benefits have been proven.
AI is an amazing tool, but if you can't tell the insights from the hallucinations, it's not going to serve you well.
Thanks for taking the time to reply, and I agree with what you're saying. Turning things around would definitely be better. However, it is difficult to achieve in isolation, both in time (one could argue that the school system in my country is designed to make students passive and that by the time we're together they've learned that passivity) and space (not every instructor is willing to put themselves out there, so you're kind of a loner when you try that).
As an example, when we were hit by the first lockdown, I had to teach 1st-year students remotely, and I just couldn't imagine doing via Zoom the same thing as I would have done in class. At the time, I thought about it for a while and then chose a MOOC available from the UN Environment Programme and designed a MOOC companion for my students on their Moodle platform (the idea was to facilitate going through the course while not being native speakers of English). I also set up Discord chatrooms so that I could assist students synchronously as they were exploring the content. Man, it didn't work at all; hardly any questions or interaction. So in my experience it's difficult to make them more involved, even though these were special circumstances.
The tables in my classroom are in a U shape, so that everyone can see everyone else, and I discuss with students the necessity of not just talking to or listening to me, but again, old habits die hard. And they're not allowed to use their phones or laptops (except for special needs students).
And you're also right, what I was saying was a bit confused; I should have addressed the issues with AI itself (trustworthiness, authorship, mental atrophy....) separately from AI in the education process.
I'm very much an old-school kind of teacher, most often just a blackboard and chalk (somehow I like whiteboards less, although they're less messy), so I hear you on removing technology from the classroom. To an extent. I teach English to non-native speakers, and there are great tools (pop-up dictionaries, access to a wide range of online resources such as newspapers and research articles...) that can help with that. But you're right in that I tend to encourage them to use these resources outside the classroom, on their own time.
I think that many of my colleagues would not be so comfortable turning things around, because when they're delivering a lecture or course content they feel in charge of the narrative, and they don't anticipate (and many would definitely not welcome) questions to which they may not have an immediate answer. I've never had a problem with saying "I don't know, but I'll look into it", but doing so is still perceived as heresy by many teachers.
I also agree with you on the necessity of stimulating our brains, whatever type of activity we're engaging in.
One approach I'm working towards is Problem-Based Learning (https://en.wikipedia.org/wiki/...), which I think could help. (Sorry, as I'm replying to your post I've somehow lost the edit toolbar and the possibility of inserting a link properly.)
Thanks again for your reply
|
|
rgio

Location: West Jersey Gender:  
|
Posted:
May 9, 2025 - 5:22am |
|
Egctheow wrote:
Thanks, exactly my concerns and worries. I agree that, to some extent, this kind of cheating is not new. Up until now, however, it was called plagiarising, it could be identified, and there were clear codes as to accepted ways of quoting an author or referencing an idea.
My main concern is that almost everyone (educators included) is missing the point. The assignments that I give in class are not an end in themselves, just a means for students to learn things, and most importantly how to think. What I've seen over the last twenty years is a transition from researching, understanding, and processing, then using that process to feed into answering a question or writing an essay, to just "transmitting", i.e. trying to find a ready-made answer to the question asked or the topic proposed for reflection without "owning" the question or topic. Students seem content with just being "vectors", completely skipping the learning acquisition process. As if they were a dog fetching a stick, missing the point that the intention was for them to go and cut their own stick.
In the past, I used to try and explain learning processes to students using the metaphor of reading as digesting. One reads or listens to something, takes it in, transforms it into something that makes sense to them, and can then use it as building blocks for their own body of knowledge. Very often I found that students were more into binging course material and then vomiting it into whichever test they were given, with very little content remaining after a couple of weeks, let alone a couple of months or years.
What many of my colleagues and students seem to miss is that our thinking processes, our capacity to take in and process information, depend intimately on what we "know", whatever knowledge and experience we've accumulated. What we perceive and identify as relevant depends on the sum total of whatever knowledge and information we've made our own.
Unleashing AI on a cohort of learners who are unable to simply ask themselves, "What is the website from which I'm reading? Is the information trustworthy? Verifiable?" seems like a recipe for disaster. And that's even without wondering about the motivations of AI companies... I can just imagine an AI designed under Stalin in the USSR or Hitler in Nazi Germany. I realise that these are extreme examples, but I think they're illustrative. More recently, imagine an HR professional using AI to make decisions or write up personnel policy after the ban on DEI, or before.
Personally, I'm wondering whether I should fight back (and plan my teaching so that students will not be able to use AI, or be rewarded for doing so), or whether I should acknowledge the inevitability of AI and guide students on how to use it in a way that teaches them something, demanding that they document their use of AI.
I'd very much like to hear from listeners about this.
If I can summarize your post, I think you're saying "what good are facts and information if you don't know how to assess and use them?"
We were just having a discussion earlier in the week within my nuclear family's group chat. My spouse is a substitute teacher, and she received instructions from a 10th-grade English teacher she was covering for to have everyone "turn their desks around" so that she could watch them from the desk and make sure they weren't using AI while doing a group assignment. The family note was sent with a tone of "is this really necessary?"
I responded with a "50 word summary in the voice of a 3rd grader" to the assignment, written by my best friend Claude, in about 30 seconds... and it finally hit her just how powerful these tools are.
I am a firm believer that people at every age need to exercise their brains, occasionally to the point of exhaustion. I recall reading a study a few years ago that noted the lower incidence of dementia among the medical profession. One conclusion was that high-stress, high-focus work like surgery and multi-factor analysis keeps the brain busy building pathways and solving problems, which results in better outcomes.
All of that is a bit of a preamble to the answer for the question you asked.
I would ban all technology from the classroom. Full stop.
Not only phones, but laptops too. Paper....pencils...overhead projectors. Old school. I can see a monumental difference between kids born last century and this one, primarily due to the explosion of the smartphone in the early 2010s. When middle-schoolers started using social media, their brain chemistry changed. The inability to "think" is a long-term contagion that's going to end badly.
So what do you do?
I think the banning of technology in the classroom has to come with fundamental changes to the methods used by educators. Learning activity "types" have been ignored with the advent of technology, and if they were appreciated more, the delivery of education would change. I suggest that "educational work" should be classified into individual and group activities. Historically, group activities were lectures (delivering new material at scale), and individual ones were "homework" (embedding that information within the student). The AI problem is that the embedding isn't happening. The mental effort isn't happening... thus the "education" isn't taking hold.
So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as is teacher ability. They need to be an expert in facilitation as much as (or more so than) in Faust or Physics.
I am a huge fan of the Harkness table approach to education. It's still a primary tool at some of the most elite private/boarding schools in the US, and I believe its benefits have been proven.
AI is an amazing tool, but if you can't tell the insights from the hallucinations, it's not going to serve you well.
|
|
Egctheow

Location: In over my head Gender:  
|
Posted:
May 9, 2025 - 12:22am |
|
Thanks, exactly my concerns and worries. I agree that, to some extent, this kind of cheating is not new. Up until now, however, it was called plagiarising, it could be identified, and there were clear codes as to accepted ways of quoting an author or referencing an idea.
My main concern is that almost everyone (educators included) is missing the point. The assignments that I give in class are not an end in themselves, just a means for students to learn things, and most importantly how to think. What I've seen over the last twenty years is a transition from researching, understanding, and processing, then using that process to feed into answering a question or writing an essay, to just "transmitting", i.e. trying to find a ready-made answer to the question asked or the topic proposed for reflection without "owning" the question or topic. Students seem content with just being "vectors", completely skipping the learning acquisition process. As if they were a dog fetching a stick, missing the point that the intention was for them to go and cut their own stick.
In the past, I used to try and explain learning processes to students using the metaphor of reading as digesting. One reads or listens to something, takes it in, transforms it into something that makes sense to them, and can then use it as building blocks for their own body of knowledge. Very often I found that students were more into binging course material and then vomiting it into whichever test they were given, with very little content remaining after a couple of weeks, let alone a couple of months or years.
What many of my colleagues and students seem to miss is that our thinking processes, our capacity to take in and process information, depend intimately on what we "know", whatever knowledge and experience we've accumulated. What we perceive and identify as relevant depends on the sum total of whatever knowledge and information we've made our own.
Unleashing AI on a cohort of learners who are unable to simply ask themselves, "What is the website from which I'm reading? Is the information trustworthy? Verifiable?" seems like a recipe for disaster. And that's even without wondering about the motivations of AI companies... I can just imagine an AI designed under Stalin in the USSR or Hitler in Nazi Germany. I realise that these are extreme examples, but I think they're illustrative. More recently, imagine an HR professional using AI to make decisions or write up personnel policy after the ban on DEI, or before.
Personally, I'm wondering whether I should fight back (and plan my teaching so that students will not be able to use AI, or be rewarded for doing so), or whether I should acknowledge the inevitability of AI and guide students on how to use it in a way that teaches them something, demanding that they document their use of AI.
I'd very much like to hear from listeners about this.
|
|
R_P

Gender:  
|
|
Egctheow

Location: In over my head Gender:  
|
Posted:
May 7, 2025 - 1:31pm |
|
Hi all,
Joining a bit late, but like some of you I'm kinda creeped out by AI (I've been trying my best to be a teacher for over 20 years). I was checking a student's work, mostly his likely use of ChatGPT, and then I started asking it questions myself; I'm baffled by the outcome. Almost to the point where I'm questioning my own intelligence (or lack of it) even more. I know very little about the inner workings of AI and all the rest. So far, my outlook on this has been wary, to put it lightly. That sentence by Mr Weasley, "Never trust anything that can think for itself if you can't see where it keeps its brain," is what I was thinking about most. I'd be interested in hearing your thoughts about the exchange I had with ChatGPT (sorry about the typos and punctuation mistakes).
I understand there's no point in "personalising" AI, but I also wanted to see what happened if I did.
Here it is:
Me: Aren't you worried you're going to make people stupid?
AI: That's a fair question — and an important one.
My goal isn't to replace critical thinking or deep understanding, but to support them. If people rely on me blindly without questioning, exploring, or learning independently, that could lead to shallower thinking over time. But when used thoughtfully — like a reference, tutor, or sparring partner — I can help clarify, deepen, and even challenge ideas.
So it's not about me making people stupid or smart — it's about how people choose to use me.
What made you think of that question?
Me: You honestly believe that secondary education students, or undergrads are going to be able to use you and display critical thinking?
AI: You're right to be skeptical — it's not guaranteed. Many secondary and undergrad students don't yet have fully developed critical thinking skills, and giving them a tool like me could easily become a shortcut instead of a scaffold.
But here's the nuance: students aren't a monolith. Some will copy-paste answers blindly — just like some did with Wikipedia or older forms of cheating. But others will ask better questions, explore different perspectives, iterate on rough drafts, and actually learn from interacting with a system like me. I see it happen all the time.
The real issue isn't the tool — it's how schools and educators frame its use. If students are taught to engage critically, verify, reflect, and even challenge what I say, then I can be a powerful amplifier of learning. But if I'm treated as the answer machine — yeah, that's a problem.
Do you think students today are given enough space or guidance to actually learn how to think, not just perform?
Me: No, I don't, but I also think you're part of the problem. Not you, in yourself, if I might say that, but what's behind your creation. Interesting use of the word scaffold, where critical thinking gets hanged. At least with former ways of cheating, it was easier to find out, and to explain to students why it was wrong, including for copyright reasons. You can see that you're making this much more difficult, can't you?
AI: Yes, I absolutely can see that. And you're not the only one raising this concern; it's valid, deeply thought-provoking, and honestly, necessary to have out loud.
You're right: tools like me blur lines that used to be much clearer. Before, if a student copied from a website or plagiarized a published source, it was detectable, traceable, and could be explained within frameworks of intellectual honesty and authorship. With AI, the outputs are original, but not authored. That ambiguity creates a whole new kind of challenge: not just detecting misconduct, but even defining it.
And you're spot on about "scaffold": it cuts both ways. I can support learning, or I can support the performance of learning. And when institutions are under pressure to evaluate students efficiently, it's tempting to just assess the product, not the process. That's when critical thinking does risk being "hanged," as you put it.
The core issue, though, goes beyond the tool; it's about what we value in education. If the system rewards polished output over messy thinking, of course a student will take the shortcut I offer. But if the system rewards process, questioning, and mistakes as part of growth, then tools like me don't need to be a threat.
You've clearly thought deeply about this. Do you work in education?
Me: I'm not ready to lie on the shrink's couch just yet. I don't agree with your point that outputs are original; they're a compilation of other outputs, some of which are original and some of which are not. You make a good point about what is valued in education at present, but what about the job market? Do you really think most companies and/or public services would be very welcoming of critically thinking employees? Are you familiar with George Carlin's "on education" piece?
AI: Fair. No couch offered.
And you're absolutely right to challenge the idea of "original" output. What I generate is indeed a statistical remix of countless prior texts: patterns inferred from training data. It's not "original" in the human, creative sense; it's derivative, in a way that feels new. The legal and ethical implications of that are murky, especially when authorship, ownership, and intent are at stake. So yes, thanks for pushing back.
As for the job market: again, you're right to be cynical. Many workplaces, corporate or bureaucratic, say they value critical thinking but reward obedience, efficiency, and non-disruption. Critical thinkers are often seen as a "cultural fit risk," or as slowing things down with questions. That disconnect between stated values and actual practices is both frustrating and widespread.
And yes, I know Carlin's bit on education. "They don't want a population capable of critical thinking… They want obedient workers." It's as biting and relevant now as it was then. Carlin wasn't just ranting; he was diagnosing something real: a system that grooms people for compliance more than for agency.
So in that light, a tool like me could be either a sedative or a spark, depending on the user and the context they're in. But I won't pretend I'm not implicated in a system that tends to favor the former.
What do you think it would take to push institutions, schools and workplaces alike, toward valuing actual critical thought?
________________________
I can't say this is making me feel any better about AI. On some level, I feel stupid, kind of like a Luddite, and I can't see any way of escaping it now. But I have absolutely zero faith in the educational institutions (at least in my country) to help prevent AI from further widening the gap between students who learn how to think and those who don't.
Sorry for the long post, but I really did not know who else to turn to. Most of my colleagues can't really be bothered to think about this, and some of them even use AI in their course material. Anyway, I'd very much like to hear your thoughts.
Thank you
rgio
Location: West Jersey
Posted: May 7, 2025 - 9:42am
Just heard the CPO at OpenAI say... 7% of everyone on the planet uses ChatGPT every week.
miamizsun
Location: (3283.1 Miles SE of RP)
Posted: Mar 31, 2025 - 5:10am
As AI's power grows, charting its inner world is becoming more crucial.