Artificial Intelligence
miamizsun | Location: (3283.1 Miles SE of RP) | Posted: May 15, 2025 - 5:01am
the irony here is that you can use ai to illustrate ways to ai-proof your lessons
(prompting is key)
some of the solutions have already been mentioned here
see results/examples below:
Teaching Framework: Fostering Authentic Learning in the Age of AI
This framework, called LEARN (Leverage Engagement, Assess Process, Redesign Tasks, Nurture Critical Thinking), provides actionable strategies for teachers to ensure students learn rather than rely on AI for assignments. Each component includes specific techniques, examples, and rationales, drawing on educational principles and current challenges with AI use in classrooms.
1. Leverage Engagement Through Active Learning
Goal: Create classroom environments where students are actively involved in learning, reducing the temptation to outsource tasks to AI.
- Techniques:
- In-Class Activities: Use interactive methods like group discussions, debates, or Socratic seminars to process concepts in real-time. For example, instead of assigning a written essay on a novel, host a class debate on the novel's themes, requiring students to articulate ideas on the spot.
- Flipped Classroom: Assign readings or videos as homework, then use class time for hands-on activities like problem-solving or peer teaching. This shifts focus from AI-generated summaries to applying knowledge.
- Real-Time Feedback: Use tools like polling apps (e.g., Kahoot) or whiteboards for students to demonstrate understanding during lessons, making AI use less relevant.
- Example: In a history class, instead of a take-home report on the French Revolution, students participate in a role-play activity as historical figures, defending their actions in a mock trial. This requires understanding context and cannot be easily outsourced to AI.
- Rationale: Active learning fosters engagement and immediate application of knowledge, reducing opportunities for AI reliance. Research (e.g., Freeman et al., 2014) shows active learning improves retention and critical thinking compared to passive methods.
2. Assess Process Over Product
Goal: Focus evaluations on the learning process rather than final outputs, which AI can easily generate.
- Techniques:
- Drafting and Revision Stages: Require students to submit outlines, drafts, or reflections alongside final assignments. For example, ask for an annotated bibliography or a "thinking log" detailing how students arrived at their conclusions.
- In-Class Writing or Problem-Solving: Conduct assessments during class under supervised conditions, such as timed essays or math problem sets, to ensure originality.
- Metacognitive Reflections: Ask students to write about their learning process, e.g., "What challenges did you face in this project, and how did you overcome them?" AI struggles to replicate personal, context-specific reflections.
- Example: For an English essay, students submit a thesis statement, outline, and first draft for feedback before the final version. The teacher grades the process (e.g., quality of revisions) as heavily as the final essay.
- Rationale: Emphasizing process makes it harder for students to submit AI-generated work, as teachers can track development. Studies (e.g., Nicol & Macfarlane-Dick, 2006) highlight that formative feedback on process enhances learning outcomes.
3. Redesign Tasks to Be AI-Resistant
Goal: Create assignments that are difficult for AI to complete effectively or that require unique, human-centric skills.
- Techniques:
- Personalized and Contextual Tasks: Assign projects tied to students' experiences or local contexts, e.g., "Interview a family member about a historical event and compare it to a primary source." AI cannot replicate personal narratives or local knowledge.
- Complex, Open-Ended Problems: Design tasks requiring synthesis and judgment, such as case studies or ethical dilemmas, where AI's generic responses fall short. For example, "Propose a solution to a local environmental issue, justifying your approach with evidence."
- Multimodal Outputs: Require presentations, videos, or live demonstrations instead of text-based submissions. For instance, a science project might involve creating a physical model and explaining it orally.
- Example: In a social studies class, students analyze a recent local news event, incorporating a class discussion and a personal opinion piece that references their community. AI struggles with hyper-local or subjective tasks.
- Rationale: AI excels at generic, text-based tasks but struggles with personalized, multimodal, or highly contextual assignments. This aligns with Bloom's Taxonomy, encouraging higher-order skills like evaluation and creation.
4. Nurture Critical Thinking and AI Literacy
Goal: Teach students to think critically and use AI as a tool, not a crutch, while understanding its limitations.
- Techniques:
- Teach AI Evaluation: Have students compare AI-generated outputs to human work, identifying flaws like factual inaccuracies or lack of nuance. For example, ask students to fact-check a ChatGPT response on a scientific topic.
- Integrate AI Ethically: Allow limited AI use for specific tasks (e.g., brainstorming ideas) but require students to document how they used it and improve upon it. This promotes transparency and skill development.
- Critical Inquiry Projects: Assign tasks that require questioning sources or assumptions, such as analyzing biases in AI-generated texts or evaluating primary vs. secondary sources.
- Example: In a media studies class, students use an AI tool to generate a news article, then critique its accuracy, tone, and bias compared to a journalistâs article, submitting a written analysis.
- Rationale: Teaching AI literacy empowers students to use technology responsibly while honing critical thinking. Research (e.g., UNESCO, 2023) emphasizes AI literacy as essential for 21st-century education.
5. Reinforce Accountability and Culture of Integrity
Goal: Build a classroom culture that values learning and discourages academic dishonesty, including AI misuse.
- Techniques:
- Honor Codes and Discussions: Discuss academic integrity openly, explaining why learning matters over grades. Have students sign an honor code tailored to AI use.
- Detection Tools with Transparency: Use AI detection tools (e.g., Turnitin, GPTZero) to identify AI-generated text, but pair this with conversations about why originality matters, not just punishment.
- Positive Reinforcement: Reward effort, creativity, and improvement through rubrics that prioritize these qualities, reducing pressure to cheat.
- Example: A teacher starts the semester with a workshop on academic integrity, where students brainstorm the long-term benefits of learning (e.g., career skills) vs. the short-term gains of cheating. Rubrics award points for originality and effort.
- Rationale: A culture of integrity, supported by clear expectations and positive incentives, reduces cheating. Studies (e.g., McCabe et al., 2012) show honor codes and proactive discussions lower academic misconduct.
Implementation Tips
- Start Small: Begin with one or two strategies (e.g., in-class assessments and process-based grading) to test effectiveness without overwhelming yourself or students.
- Adapt to Subject and Age: Tailor tasks to the subject (e.g., math may focus on in-class problem-solving, while humanities emphasize personal reflections) and student age (e.g., younger students may need simpler tasks).
- Leverage Technology: Use learning management systems (e.g., Canvas, Google Classroom) to track drafts and provide feedback, streamlining process-based assessment.
- Professional Development: Teachers can seek training on AI in education through resources like ISTE or EdTech conferences to stay updated on tools and strategies.
Addressing Challenges
- Time Constraints: Redesigning assignments and grading processes can be time-intensive. Solution: Use templates for rubrics and streamline tasks (e.g., peer reviews for drafts).
- Student Resistance: Some students may prefer familiar, AI-friendly tasks. Solution: Explain the purpose of new methods and highlight benefits (e.g., better skills, fairer grades).
- Access to AI: Students will still have access to AI tools. Solution: Focus on tasks where AI's output is less effective or easily detectable, and emphasize learning over punishment.
- Equity Concerns: Not all students have equal access to technology or support. Solution: Ensure assignments can be completed with minimal tech (e.g., in-class work) and provide resources for struggling students.
Supporting Evidence
- Educational Research: Active learning and formative assessment improve engagement and retention (Freeman et al., 2014; Nicol & Macfarlane-Dick, 2006). Process-oriented grading aligns with constructivist theories, emphasizing learning as a journey.
- AI in Education: Sources like EdWeek (2023) and UNESCO (2023) note that AI misuse is common but can be countered with task redesign and critical thinking focus. Detection tools, while imperfect, support integrity when used transparently.
- Teacher Insights: Discussions on X and education blogs (e.g., TeachThought, 2024) highlight successful strategies like personalized tasks and in-class assessments, with teachers reporting reduced AI reliance when students are engaged.
Example Lesson Plan (High School English)
Objective: Analyze themes in To Kill a Mockingbird without AI reliance.
- Day 1 (Engage): Class discussion on justice themes, using a Socratic seminar. Students jot down ideas in real-time, graded for participation.
- Day 2 (Redesign Task): Students write a short in-class response: "How does Scout's perspective on justice evolve, and how does it relate to your own views?" Teacher provides immediate feedback.
- Day 3 (Assess Process): Students submit an outline for a longer essay, including a thesis and two quotes from the text. Homework is to revise the outline based on peer feedback.
- Day 4 (Nurture Critical Thinking): Students compare an AI-generated summary of the novel (provided by the teacher) to the text, noting inaccuracies or oversimplifications.
- Day 5 (Reinforce Integrity): Final essay is written in class, with a rubric rewarding original analysis and effort. Students sign an honor pledge before starting.
Outcome: Students demonstrate understanding through discussion, process, and original work, with minimal opportunity to use AI.
Egctheow | Location: In over my head | Posted: May 15, 2025 - 12:58am
Thanks, yet again, R_P for this video. Great, great examples of thinking, teaching, and learning.
Funny, because it very much echoes what I'm trying to teach, and why I'm teaching it, or doing it this way.
So I teach English to non-native speakers. One of the first things I go through is speaking and listening skills. I try to explain the automation of pronunciation (and of any kind of skill) by using a driving analogy. When you first learn how to drive a car with a clutch and a gear stick, your brain actually has to focus a lot on the messages it's sending your left and right foot, dosing the pressure and release so as not to stall or stagger; that would be System 2 in the video. Then, once you've done it enough (and it can take a few or many iterations), you don't have to "think" about it, you just do it and off you go; that would be System 1. (And you can still mess it up if there's a lot of extraneous cognitive load.)
And this is exactly why I think AI in education is so dangerous. As Derek Muller explains, what's going to happen when people don't "do" new things, tasks, essays etc. and ask the machine to do it for them? They'll just never learn how to do it for themselves, and that is very dangerous for all of us, I think.
In learning languages, there's definitely that question of cognitive overload, which means that you cannot achieve at least some degree of fluency without having achieved mastery of some tasks (if a student has to figure out vocabulary, pronunciation, syntax and what tense to use, they're not about to be saying anything in a reactive manner). And mastery won't be achieved without repetition, practice, feedback multiplied and multiplied over time.
My concern is also that education departments and some teachers are perhaps not so interested in teaching as a means for people to learn how to think, how to build up their knowledge and their cognitive systems, but rather as giving the 'appearance' of learning. As long as it looks like the system is working, churning out students and degrees, it's all good... When I first became a teacher, I had a small depression; among the factors that caused it were my feeling of powerlessness at helping students learn, as well as the perception that my job was not much more than giving the appearance of equal opportunity and social fairness to my society. In essence, I felt like a stamping machine, stamping poorly performing students 'poor', average students 'average', and good students 'good'. I was of course more interested in helping them get better, regardless of where they were starting from.
In a way AI is likely to make it easier to pretend that the education system works (just like using CTRL+C and CTRL+V in essays was before), as I don't think a lot of teachers are going to bother checking how different AIs answer the questions they have asked, to compare that with the students' answers.
Finally, if one takes the GPS example in the video (you won't build up your orientation skills if you always use a GPS), it can lead to people driving into water... It happened in my country, and it happened elsewhere too (here and here, and there are other examples). Now, if you apply that example to the use of AI in learning, you can see where that would lead.
However, I also agree (reluctantly) that AI can be great in the learning process (as Derek says, by providing feedback and so on), but so far I've mostly seen examples of AI used stupidly, in a way that reduces, rather than enhances, what the student is going to learn; and that's without even considering Grok's "white genocide".
Anyway, sorry for ranting, but I really enjoyed this video, and I might actually use it in class.
R_P | Posted: May 14, 2025 - 6:35pm
ScottFromWyoming | Location: Powell | Posted: May 14, 2025 - 3:05pm
miamizsun wrote:
if you haven't tried experimenting with these or others you're missing out
in my world i pick something i know very well and then ask about it
results are good, very good
More and more meetings come with a follow-up AI transcription, and it's a lifesaver: I like to read whatever was just said, because that's how I process. It can even separate the channels when people are talking over each other.
R_P | Posted: May 14, 2025 - 2:47pm
NoEnzLefttoSplit wrote:
Embrace the dark side.
miamizsun | Location: (3283.1 Miles SE of RP) | Posted: May 14, 2025 - 2:45pm
NoEnzLefttoSplit wrote:
i've used perplexity, gemini and grok (in that order)
they are all pretty impressive (can't use for work)
if you haven't tried experimenting with these or others you're missing out
in my world i pick something i know very well and then ask about it
results are good, very good
NoEnzLefttoSplit | Posted: May 14, 2025 - 2:25pm
miamizsun | Location: (3283.1 Miles SE of RP) | Posted: May 14, 2025 - 6:16am
Steely_D wrote:
That's a great thought.
it is
embrace new/updated technologies
teach people how to use them/tools
(or risk being left behind)
Steely_D | Location: The foot of Mount Belzoni | Posted: May 13, 2025 - 11:46am
Egctheow wrote:
we (the teachers) are not after the products, but that we're only using them as a way for them to fine-tune their processes.
That's a great thought.
Egctheow | Location: In over my head | Posted: May 13, 2025 - 8:51am
R_P wrote:
Thank you for that cool article. I found in there a lot of the thoughts that I hadn't articulated so far.
The risk, in education is indeed to confuse the product (an essay, thesis, commentary...) with the process. I'm amazed at how few of my students realise that we (the teachers) are not after the products, but that we're only using them as a way for them to fine-tune their processes. If the product is created by the process, the product nonetheless shapes the process. Processes can only be learned through performing them for oneself, and that's what students (and many educators) don't seem to realise.
One can also try to point out to students that if they rely on AI to come up with what they've been assigned to do, why would an employer need them? Employers will probably save on wages by interacting with AI themselves.
It is really giving me the creeps though.
R_P | Posted: May 12, 2025 - 4:19pm
Philosophically A Matter of Words
What can university AI committees actually do?
q4Fry | Location: Currently stateside | Posted: May 10, 2025 - 10:01am
Egctheow | Location: In over my head | Posted: May 9, 2025 - 10:15am
Proclivities wrote:
It seems to me that there are always inherent dangers, or at least drawbacks, to any advances in technologies. I understand your concern for these AI tools being used (or abused) in education. I work in an education-related industry and also live in an area where there are three large universities and several other smaller ones. Over the years I have come to know many people in academic and medical fields; for those who are in science fields like biology, medicine, chemistry, etc., they see some advantages for AI applications for predictions, calculations, etc., but also shortcomings for relying on it too much without thoroughly questioning the results. I have a good friend who is a professor of English Literature - he doesn't see much constructive use for AI other than as a "Cliff's Notes" kind of assistant, or as a conversational tool - similar to the exchange you had posted, which I found interesting.
I am also a visual artist, so I see how AI presents a number of threats to the creative arts fields - mainly plagiarism - or at least some forms of infringement. Music, literature, and visual arts are very susceptible to this hijacking of existing materials. Of course, all creative endeavors are derived from previous works to varying extents, so it remains to be seen how it will wash out.
ChatGPT currently has an AI animation application that mimics the Studio Ghibli style of the Japanese animator and filmmaker Hayao Miyazaki - he said he was "disgusted" by it, and that "Whoever creates this stuff has no idea what pain is."
Anyhow, good topic and good points.
Thanks, I agree on all points. I've also read about AI significantly increasing the quality of diagnostics based on medical imaging, as AI can be exposed to many, many more previous scans or X-rays or whatever than any human doctor, and I obviously have nothing against that.
I can't imagine what it must be like to see your creative work taken away and appropriated by a company-owned machine, including for purposes or contexts that you would definitely not approve of. Yes, artists and people in general always take inspiration from other authors and people, and that's great, but there are laws and mechanisms in place to protect the work and livelihood of those authors. This is far from guaranteed with AI. There have been interesting articles in the Guardian on the pushback by musicians and visual artists to defend their right to refuse (or ask payment for) the use of their work by AI (here), as well as one on writers' response to AI creative writing (here). I had a good laugh when AI executives started to complain about DeepSeek.
Regardless of the education context, what really worries me is the exponential improvement of AI in general; I keep asking myself, what's the limit to this?
I used to warn students about the use of automatic translators, explaining that while they could be OK when your objective was to get information (and even then the results should be questioned), the benefit was next to nothing if the point was to learn the language. For example, a bilingual dictionary remains the best tool when they're just missing a word.
In September last year, as I'd been doing over the last few years, I tried to show them how Google Translate worked. In French, the word for paperclip and the word for trombone are the same. So I'd type, in French, "He took a 'trombone' to the office", and Google rightly uses "paperclip" in the English translation. Then I would complete the sentence: "He took a 'trombone' to the office; he wanted to play a jazz tune", and Google changes "paperclip" to "trombone". It used to be that dividing the clauses into two sentences, using a full stop instead of a comma, would make Google change the word back to "paperclip"; no longer, it now keeps "trombone" despite the full stop. It's becoming more and more accurate. So is there going to be a point to interpreters or translators, or even language teachers, in the near future? I can hear Danny DeVito's speech on buggy-whip makers in Other People's Money.
And eventually, to come back to education, when all is said and done, isn't it going to be as described in the article linked by R_P: student-used AI being assessed by teacher-used AI on AI-generated content?
ScottFromWyoming | Location: Powell | Posted: May 9, 2025 - 8:23am
rgio wrote:So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as is teacher ability. They need to be experts in facilitation as much as (or more so than) in Faust or Physics.

When I went through "OEC" training for Ski Patrol, we'd meet twice a week for lectures and demonstrations and close out the 3-4 hour class with some hands-on. During the week we were expected to practice on our family, the dog, ourselves, etc., then have a big weekend-long "practical" at the end of the course. Now they do all their lectures at home, with quizzes on terminology and physiology etc., and the weekly 2-3 hour meeting is all practical. By the end of it they have spent a LOT more time on practicals and a lot less time away from home, and retention is far better. Plus, as you say, it relies less heavily on the instructor being an expert on the book-learnin' part.
Proclivities | Location: Paris of the Piedmont | Posted: May 9, 2025 - 8:22am
Egctheow wrote:
______________________
I can't say this is making me feel any better about AI. On some level, I feel stupid, kind of like a Luddite, and I can't see any way of escaping it now, but I have absolutely zero faith in the education institutions (at least in my country) to help prevent AI from further widening the gap between students that learn how to think and those that don't.
Sorry for the long post, but I really did not know who else to turn to. Most of my colleagues can't really be bothered to think about this, and some of them even use AI in their course material. Anyway, I would very much like to hear your thoughts.
Thank you
It seems to me that there are always inherent dangers, or at least drawbacks, to any advances in technologies. I understand your concern for these AI tools being used (or abused) in education. I work in an education-related industry and also live in an area where there are three large universities and several other smaller ones. Over the years I have come to know many people in academic and medical fields; for those who are in science fields like biology, medicine, chemistry, etc., they see some advantages for AI applications for predictions, calculations, etc., but also shortcomings for relying on it too much without thoroughly questioning the results. I have a good friend who is a professor of English Literature - he doesn't see much constructive use for AI other than as a "Cliff's Notes" kind of assistant, or as a conversational tool - similar to the exchange you had posted, which I found interesting.
I am also a visual artist, so I see how AI presents a number of threats to the creative arts fields - mainly plagiarism - or at least some forms of infringement. Music, literature, and visual arts are very susceptible to this hijacking of existing materials. Of course, all creative endeavors are derived from previous works to varying extents, so it remains to be seen how it will wash out.
ChatGPT currently has an AI animation application that mimics the Studio Ghibli style of the Japanese animator and filmmaker Hayao Miyazaki - he said he was "disgusted" by it, and that "Whoever creates this stuff has no idea what pain is."
Anyhow, good topic and good points.
Egctheow | Location: In over my head | Posted: May 9, 2025 - 6:36am
rgio wrote:
If I can summarize your post, I think you're saying "what good are facts and information if you don't know how to assess and use them?"
We were just having a discussion earlier in the week within my nuclear family's group chat. My spouse is a substitute teacher, and received instructions from a 10th-grade English teacher she was covering for to have everyone "turn their desks around" so that she could watch them from the desk and make sure they weren't using AI in doing a group assignment. The family note was sent with a tone of "is this really necessary?"
I responded with a "50 word summary in the voice of a 3rd grader" to the assignment, written by my best friend Claude, in about 30 seconds... and it finally hit her just how powerful these tools are.
I am a firm believer that people at every age need to exercise their brains, occasionally to the point of exhaustion. I recall reading a study a few years ago that noted the lower incidence of dementia among the medical profession. One conclusion was that high-stress, high-focus work like surgery and multi-factor analysis keeps the brain busy building pathways and solving problems, which results in better outcomes.
All of that is a bit of a preamble to the answer for the question you asked.
I would ban all technology from the classroom. Full stop.
Not only phones, but laptops too. Paper....pencils...overhead projectors. Old school. I can see a monumental difference in kids born last century and this one, primarily due to the explosion of the smart phone in the early 2010's. When middle-schoolers started using social media, their brain chemistry changed. The inability to "think" is a long-term contagion that's going to end badly.
So what do you do?
I think the banning of technology in the classroom has to come with fundamental changes to the methods used by educators. Learning activity "types" have been ignored with the advent of technology, and if they were appreciated more, the delivery of education would change. I suggest that "educational work" should be classified into individual and group activities. Historically, group activities were lectures (delivering new material at scale), and individual ones were "homework" (embedding that information within the student). The AI problem is that the embedding isn't happening. The mental effort isn't happening... thus the "education" isn't taking hold.
So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as are teacher abilities. Teachers need to be experts in facilitation as much as (or more so than) in Faust or physics.
I am a huge fan of the Harkness table approach to education. It's still a primary tool used at some of the most elite private/boarding schools in the US, and I believe its benefits have been proven.
AI is an amazing tool, but if you can't tell the insights from the hallucinations, it's not going to serve you well.
Thanks for taking the time to reply, and I agree with what you're saying. Turning things around would definitely be better. However, it is difficult to achieve in isolation, both in time (one could argue that the school system in my country is designed to make students passive and that by the time we're together they've learned that passivity) and space (not every instructor is willing to put themselves out there, so you're kind of a loner when you try that).
As an example, when we were hit by the first lockdown, I had to teach 1st-year students remotely, and I just couldn't imagine doing the same thing as I would have done in class via Zoom. At the time, I thought about it for a while and then chose a MOOC available from the UN Environment Program and designed a MOOC companion for my students on their Moodle platform (the idea was to facilitate going through the course while not being native speakers of English). I also set up Discord chatrooms so that I would be able to assist students synchronously as they were exploring the content. Man, it didn't work at all; hardly any questions or interaction. So in my experience it's difficult to make them more involved, even though these were special circumstances.
The tables in my classroom are in a U shape, so that everyone can see everyone else, and I discuss with students the necessity of not just talking to or listening to me, but again, old habits die hard. And they're not allowed to use their phones or laptops (except for special needs students).
And you're also right, what I was saying was a bit confused; I should have addressed the issues with AI itself (trustworthiness, authorship, mental atrophy...) separately from AI in the education process.
I'm very much an old-school kind of teacher, most often just a blackboard and chalk (somehow, I like whiteboards less, although they're less messy), so I hear you on removing technology from the classroom. To an extent. I teach English to non-native speakers, and there are great tools (pop-up dictionaries, access to a wide number of resources online, such as newspapers and research articles...) that can help with that. But you're right in that I tend to encourage them to use these resources outside the classroom, on their own time.
I think that many of my colleagues would not be so comfortable about turning things around, because when they're delivering a lecture or course content, they feel in charge of the narrative, and they don't anticipate (and many would definitely not welcome) questions to which they may not have an immediate answer. I've never had a problem with saying "I don't know, but I'll look into it," but doing that is still perceived as heresy by many teachers.
I also agree with you on the necessity to stimulate our brains, whatever the type of activity we're engaging in.
One approach towards which I'm working is Problem-Based Learning ( https://en.wikipedia.org/wiki/... ), which I think could help. (Sorry, as I'm replying to your post, I've somehow lost the edit toolbar and the possibility of inserting a link properly.)
Thanks again for your reply
rgio
Location: West Jersey
Posted: May 9, 2025 - 5:22am
Egctheow wrote:
Thanks, exactly my concerns and worries. I agree that to some extent, this kind of cheating is not new. Up until now, however, it was called plagiarising; it could be identified, and there were clear codes as to accepted ways of quoting an author or referencing an idea.
My main concern is that almost everyone (educators included) is missing the point. The assignments that I give in class are not an end in themselves, just a means for students to learn things, and most importantly how to think. What I've seen over the last twenty years is a transition from researching, understanding, processing, and then using that process to feed into answering a question or writing an essay, to just "transmitting", i.e. trying to find a ready-made answer to the question asked or the topic proposed for reflection, without "owning" the question or topic. Students seem content with just being "vectors", completely skipping the learning acquisition process. As if they were a dog fetching a stick, missing the point that the intention was for them to go and cut their own stick.
In the past, I used to try to explain learning processes to students using the metaphor of reading as digesting. One reads or listens to something, takes it in, transforms it into something that makes sense to them, and can then use it as building blocks for their own body of knowledge. Very often I found that students were more into binging course material and then vomiting it into whichever test they were given, with very little content remaining after a couple of weeks, let alone a couple of months or years.
What many of my colleagues and students seem to miss is that our thinking processes, our capacity to take in and process information depends intimately on what we "know", whatever knowledge and experience we've accumulated. What we perceive and identify as relevant depends on the sum total of whatever knowledge and information we've made our own.
Unleashing AI on a cohort of learners who are unable to simply ask themselves, "What is the website I'm reading from? Is the information trustworthy? Verifiable?" seems like a recipe for disaster. And that's even without wondering about the motivations of AI companies... I can just imagine an AI designed under Stalin in the USSR or Hitler in Nazi Germany... I realise that these are extreme examples, but I think they're illustrative. More recently, imagine an HR professional using AI to make decisions or write up personnel policy after the ban of DEI, or before.
Personally, I'm wondering whether I should fight back (and plan my teaching so that students will not be able to use AI, or be rewarded for using it), or whether I should acknowledge the inevitability of AI, guide students on how to use it in a way that teaches them something, and demand that they document their use of it.
I'd very much like to hear from listeners about this.
If I can summarize your post, I think you're saying "what good are facts and information if you don't know how to assess and use them?"
We were just having a discussion earlier in the week in my nuclear family's group chat. My spouse is a substitute teacher, and she received instructions from a 10th-grade English teacher she was covering for to have everyone "turn their desks around" so she could watch them from the teacher's desk and make sure they weren't using AI on a group assignment. The family note was sent with a tone of "is this really necessary?"
I responded with a "50-word summary in the voice of a 3rd grader" of the assignment, written by my best friend Claude in about 30 seconds... and it finally hit her just how powerful these tools are.
I am a firm believer that people at every age need to exercise their brains, occasionally to the point of exhaustion. I recall reading a study a few years ago that noted the lower incidence of dementia among the medical profession. One conclusion was that high-stress, high-focus work like surgery and multi-factor analysis keeps the brain busy building pathways and solving problems, which results in better outcomes.
All of that is a bit of a preamble to the answer for the question you asked.
I would ban all technology from the classroom. Full stop.
Not only phones, but laptops too. Paper... pencils... overhead projectors. Old school. I can see a monumental difference between kids born last century and this one, primarily due to the explosion of the smartphone in the early 2010s. When middle-schoolers started using social media, their brain chemistry changed. The inability to "think" is a long-term contagion that's going to end badly.
So what do you do?
I think the banning of technology in the classroom has to come with fundamental changes to the methods used by educators. Learning activity "types" have been ignored with the advent of technology, and if they were appreciated more, the delivery of education would change. I suggest that "educational work" should be classified into individual and group activities. Historically, group activities were lectures (delivering new material at scale), and individual ones were "homework" (embedding that information within the student). The AI problem is that the embedding isn't happening. The mental effort isn't happening... thus the "education" isn't taking hold.
So I'm a firm believer in "flipping" the classroom. Do the lectures at home, and then do the homework in school. No technology....discussion. Active...engaged....discussion. The lectures can be delivered via laptop at home. The next day, make them read silently, together, for 30 mins before discussing what was just read. No AI. No YouTube. The materials, your physical abilities, and your brain. Focus.
Class size is an issue, as are teacher abilities. Teachers need to be experts in facilitation as much as (or more so than) in Faust or physics.
I am a huge fan of the Harkness table approach to education. It's still a primary tool used at some of the most elite private/boarding schools in the US, and I believe its benefits have been proven.
AI is an amazing tool, but if you can't tell the insights from the hallucinations, it's not going to serve you well.
Egctheow
Location: In over my head
Posted: May 9, 2025 - 12:22am
Thanks, exactly my concerns and worries. I agree that to some extent, this kind of cheating is not new. Up until now, however, it was called plagiarising; it could be identified, and there were clear codes as to accepted ways of quoting an author or referencing an idea.
My main concern is that almost everyone (educators included) is missing the point. The assignments that I give in class are not an end in themselves, just a means for students to learn things, and most importantly how to think. What I've seen over the last twenty years is a transition from researching, understanding, processing, and then using that process to feed into answering a question or writing an essay, to just "transmitting", i.e. trying to find a ready-made answer to the question asked or the topic proposed for reflection, without "owning" the question or topic. Students seem content with just being "vectors", completely skipping the learning acquisition process. As if they were a dog fetching a stick, missing the point that the intention was for them to go and cut their own stick.
In the past, I used to try to explain learning processes to students using the metaphor of reading as digesting. One reads or listens to something, takes it in, transforms it into something that makes sense to them, and can then use it as building blocks for their own body of knowledge. Very often I found that students were more into binging course material and then vomiting it into whichever test they were given, with very little content remaining after a couple of weeks, let alone a couple of months or years.
What many of my colleagues and students seem to miss is that our thinking processes, our capacity to take in and process information depends intimately on what we "know", whatever knowledge and experience we've accumulated. What we perceive and identify as relevant depends on the sum total of whatever knowledge and information we've made our own.
Unleashing AI on a cohort of learners who are unable to simply ask themselves, "What is the website I'm reading from? Is the information trustworthy? Verifiable?" seems like a recipe for disaster. And that's even without wondering about the motivations of AI companies... I can just imagine an AI designed under Stalin in the USSR or Hitler in Nazi Germany... I realise that these are extreme examples, but I think they're illustrative. More recently, imagine an HR professional using AI to make decisions or write up personnel policy after the ban of DEI, or before.
Personally, I'm wondering whether I should fight back (and plan my teaching so that students will not be able to use AI, or be rewarded for using it), or whether I should acknowledge the inevitability of AI, guide students on how to use it in a way that teaches them something, and demand that they document their use of it.
I'd very much like to hear from listeners about this.
R_P