Brexit and the waning days of the United Kingdom

Started by Josquius, February 20, 2016, 07:46:34 AM


How would you vote on Britain remaining in the EU?

British- Remain
12 (11.8%)
British - Leave
7 (6.9%)
Other European - Remain
21 (20.6%)
Other European - Leave
6 (5.9%)
ROTW - Remain
36 (35.3%)
ROTW - Leave
20 (19.6%)

Total Members Voted: 100

Josquius

Quote from: Valmy on August 14, 2025, 12:11:33 PM
Quote from: Josquius on August 14, 2025, 12:10:08 PM
Quote from: Oexmelin on August 14, 2025, 11:57:21 AMTake home exams are getting useless, because AI.

That's the reality in the world of work.
If your teacher receives some effortless AI-generated slop and gives it an A then more fool them.

Yes. Successful cheating takes effort.

Er...so I have heard anyway.

What is cheating and what is not?
Is it cheating that I memorise the text book where someone else doesn't?

With a decent take home exam you still have to prompt the AI properly with the right input and do something useful with its output.

With skilled work, relying solely on AI can often take more effort than using conscious thought - a lot of the vibe-coding critique centres on exactly this.

All very topic dependent of course. I suppose in some topics it works better than in others. But in my field AI is a help, not an instant cheat.

Oexmelin

Except I don't teach for the job market. I don't set assignments because I need an output: the world doesn't need another book report on this or that. In fact, I actively teach against this moronic output-based worldview. I teach because I want to help shape critical thinking skills, to help people think ethically, to put as much distance as possible between humans and this fucking dystopia we are creating. AI assessment doesn't help me help that person. Take home exams are over.
Que le grand cric me croque !

Josquius

Quote from: Oexmelin on August 14, 2025, 12:21:08 PMExcept I don't teach for the job market. I don't set assignments because I need an output: the world doesn't need another book report on this or that. In fact, I actively teach against this moronic output-based worldview. I teach because I want to help shape critical thinking skills, to help people think ethically, to put as much distance as possible between humans and this fucking dystopia we are creating. AI assessment doesn't help me help that person. Take home exams are over.
I'm not sure how your final sentence follows here.
You don't want people to rely on AI. You want them to think critically. It's about the journey and the process, not the output...
Which is exactly why AI hasn't killed take home exams.
If they're properly designed and a competent person is marking them, then just asking the AI something and copy-pasting the answer won't pass.

crazy canuck

Quote from: Josquius on August 14, 2025, 12:19:54 PM
Quote from: Valmy on August 14, 2025, 12:11:33 PM
Quote from: Josquius on August 14, 2025, 12:10:08 PM
Quote from: Oexmelin on August 14, 2025, 11:57:21 AMTake home exams are getting useless, because AI.

That's the reality in the world of work.
If your teacher receives some effortless AI-generated slop and gives it an A then more fool them.

Yes. Successful cheating takes effort.

Er...so I have heard anyway.

What is cheating and what is not?
Is it cheating that I memorise the text book where someone else doesn't?

With a decent take home exam you still have to prompt the AI properly with the right input and do something useful with its output.

With skilled work, relying solely on AI can often take more effort than using conscious thought - a lot of the vibe-coding critique centres on exactly this.

All very topic dependent of course. I suppose in some topics it works better than in others. But in my field AI is a help, not an instant cheat.

Please tell me you used an AI tool to draft this, and you don't actually think drafting an AI prompt takes "more effort than using conscious thought."
Quote from: Josquius on August 14, 2025, 12:23:28 PM
Quote from: Oexmelin on August 14, 2025, 12:21:08 PMExcept I don't teach for the job market. I don't set assignments because I need an output: the world doesn't need another book report on this or that. In fact, I actively teach against this moronic output-based worldview. I teach because I want to help shape critical thinking skills, to help people think ethically, to put as much distance as possible between humans and this fucking dystopia we are creating. AI assessment doesn't help me help that person. Take home exams are over.
I'm not sure how your final sentence follows here.
You don't want people to rely on AI. You want them to think critically. It's about the journey and the process, not the output...
Which is exactly why AI hasn't killed take home exams.
If they're properly designed and a competent person is marking them, then just asking the AI something and copy-pasting the answer won't pass.


 :frusty:
Awarded 17 Zoupa points

In several surveys, the overwhelming first choice for what makes Canada unique is multiculturalism. This, in a world collapsing into stupid, impoverishing hatreds, is the distinctly Canadian national project.

Josquius

#31324
Quote from: crazy canuck on August 14, 2025, 12:28:41 PM
Quote from: Josquius on August 14, 2025, 12:19:54 PM
Quote from: Valmy on August 14, 2025, 12:11:33 PM
Quote from: Josquius on August 14, 2025, 12:10:08 PM
Quote from: Oexmelin on August 14, 2025, 11:57:21 AMTake home exams are getting useless, because AI.

That's the reality in the world of work.
If your teacher receives some effortless AI-generated slop and gives it an A then more fool them.

Yes. Successful cheating takes effort.

Er...so I have heard anyway.

What is cheating and what is not?
Is it cheating that I memorise the text book where someone else doesn't?

With a decent take home exam you still have to prompt the AI properly with the right input and do something useful with its output.

With skilled work, relying solely on AI can often take more effort than using conscious thought - a lot of the vibe-coding critique centres on exactly this.

All very topic dependent of course. I suppose in some topics it works better than in others. But in my field AI is a help, not an instant cheat.

Please tell me you used an AI tool to draft this, and you don't actually think drafting an AI prompt takes "more effort than using conscious thought."

 :frusty:

:huh:
This doesn't make sense to me. Why would I use AI to write this? And what's frustrating?

And yes, AI can be more effort than doing the work yourself.
I learned this first-hand with a recent vibe coding experiment. Getting the AI to do roughly what I wanted - great, a big time saver. But getting it to do basic stuff like shrink margins? So much annoying back and forth when I could just go and change the number myself.
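To put the margin gripe in code - a minimal sketch using a hypothetical layout config (the names and structure are illustrative, not from any real project), showing why "just change the number" is a one-line edit rather than a round of prompting:

```python
# Hypothetical layout config for a vibe-coded page (names illustrative).
PAGE_STYLE = {
    "margin_px": 40,  # the number you would just change yourself
    "font_px": 16,
}

def shrink_margin(style: dict, factor: float = 0.5) -> dict:
    """Return a copy of the style with the margin scaled down."""
    updated = dict(style)
    updated["margin_px"] = int(updated["margin_px"] * factor)
    return updated

print(shrink_margin(PAGE_STYLE)["margin_px"])  # 20
```

The whole "fix" is one multiplication on one key; coaxing a chatbot into the same edit typically takes several turns of describing which margin, on which element, by how much.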

Then there was the other day, looking for historic quotes about London. It gave some general ones anyone would know and hinted at a guy who sounded interesting. A bit of poking and it provided a direct quote from him which was just what I wanted... but it couldn't provide a source. It was made up.
It helped here - it pointed me to an author I didn't know and some old books to look up - but I would still have had to find them myself. Continuing to rely solely on the AI for this would have doomed me.

Oh, and let's not get started on anything with a graphical element. It can't do the most basic diagramming.

Oexmelin

Quote from: Josquius on August 14, 2025, 12:23:28 PMIf they're properly designed and a competent person is marking them, then just asking the AI something and copy-pasting the answer won't pass.

Please design for me a history take-home exam that cannot be simply regurgitated by AI.

I think you underestimate the capacity of AI to produce essays, and even to imitate the prose of mediocre students.

I also think you severely underestimate the time it takes to grade exams.

Josquius

Quote from: Oexmelin on August 14, 2025, 12:45:38 PM
Quote from: Josquius on August 14, 2025, 12:23:28 PMIf they're properly designed and a competent person is marking them, then just asking the AI something and copy-pasting the answer won't pass.

Please design for me a history take-home exam that cannot be simply regurgitated by AI.

I think you underestimate the capacity of AI to produce essays, and even to imitate the prose of mediocre students.

I also think you severely underestimate the time it takes to grade exams.

As I said, it depends on the topic. Some will be easier than others. If it's simple essay writing rather than producing some output reflective of a real work environment, then yes, I imagine AI does a bit better.

Still, with history I've definitely run into stuff in my free time where the AI just can't help. Anything slightly beneath the mainstream knowledge level, where the relevant books haven't been published online, and it doesn't get much further than "try this book maybe?" - and even then I imagine the books it suggests are at the mainstream end of these niche topics.

Oexmelin

Quote from: Josquius on August 14, 2025, 12:53:38 PMStill, with history I've definitely run into stuff in my free time where the AI just can't help. Anything slightly beneath the mainstream knowledge level, where the relevant books haven't been published online, and it doesn't get much further than "try this book maybe?" - and even then I imagine the books it suggests are at the mainstream end of these niche topics.

But that's not history. It's antiquarianism. It's erudition. I am never going to ask students to produce work on some super niche topic. Because it's not about producing outcomes. I am not handing out essays so that the world can learn how much a shipping expedition to Luanda cost in 1752 (FYI, ChatGPT had good preliminary insights into that question). I am handing out essays so that students can think through what a question entails, when it comes to past societies.

I am not even opposed to using AI in the classroom. But the evaluation, in this case, becomes me evaluating how a student uses AI to sharpen their thinking, and I can't really do that if I am not privy to their process, but am only presented with their output. And if I am required to evaluate their process, you can be sure that I will refuse to teach classes with more than 20-ish students, because it's just not feasible.

Jacob

Josq, in your version of using AI for take home exam, how does that create anything other than a [student uses AI to do the exam] --> [prof. uses AI to grade exam] process?

And if that's the process, what use is that process to anyone?

Josquius

#31329
QuoteBut that's not history. It's antiquarianism. It's erudition. I am never going to ask students to produce work on some super niche topic. Because it's not about producing outcomes. I am not handing out essays so that the world can learn how much a shipping expedition to Luanda cost in 1752 (FYI, ChatGPT had good preliminary insights into that question). I am handing out essays so that students can think through what a question entails, when it comes to past societies.

I am not even opposed to using AI in the classroom. But the evaluation, in this case, becomes me evaluating how a student uses AI to sharpen their thinking, and I can't really do that if I am not privy to their process, but am only presented with their output. And if I am required to evaluate their process, you can be sure that I will refuse to teach classes with more than 20-ish students, because it's just not feasible.
Honestly I'm surprised you were doing take-home exams in history even before AI.
It seems a bit of an odd halfway house between doing proper research and relying on what you can cram.
As I said, in my experience take-home exams always came up in topics where there was more of a solid "do the thing" to apply what you'd learned, rather than just "write about the thing you've learned".

I have no idea what exactly you're teaching and I didn't study history at uni so you're asking the wrong person for an insider answer there.

Sad the niche topic suggestion doesn't work. I thought a key part of studying history was learning how to research and analyse sources and all that.

In topics I've taught at this level the process is all.

Quote from: Jacob on August 14, 2025, 01:20:01 PMJosq, in your version of using AI for take home exam, how does that create anything other than a [student uses AI to do the exam] --> [prof. uses AI to grade exam] process?

And if that's the process, what use is that process to anyone?

Teachers I know do use AI. They know the students will use it, so it's common sense to run anything they're thinking of asking through it first, to see whether it's worthwhile or whether the AI will just give a perfect answer.
The plagiarism software they use does this with submitted work as standard, too.

As I said, in my experience the goal of take home exams was never, beyond maybe a few warm-up points at the start, to recite the textbook.
Take home exams were used specifically in topics where they were meant to be practically applicable - where you could give a challenge to be solved using what you had learned, with it being perfectly fine to check your books and the Internet in doing so, as that's how the real world works.

In the real world you do get some chancers who just run things through AI and phone it in.
That output is usually obvious for what it is, doesn't address the challenge properly, and completely misses the important part: how you even got to the end result.
With a properly designed challenge these people don't fail because they cheated with AI. They fail because their submission is shit.

Oexmelin

Quote from: Josquius on August 14, 2025, 02:28:25 PMHonestly I'm surprised you were doing take-home exams in history even before AI.

Why?

A take-home exam is indeed a hybrid: part research essay, part check that students did the reading each week and have thus built a framework for the varying types of historical explanation, and for historiography.

Many students in history are aiming to teach in secondary education. I don't do cramming, because no one's life is on the line in history, and teachers - one hopes - can prepare their sessions in advance. But I do need to make sure they have a basic framework in chronology and some common references - i.e., that they will not simply be utterly clueless the moment they get a question outside the narrow band of history programs.

QuoteIn my experience take-home exams always came up in topics where there was more of a solid "do the thing" to apply what you'd learned, rather than just "write about the thing you've learned".

A lot of the humanities rely on the written word as an integral part of their epistemology. It's not simply incidental, a neutral tool: it's part of what you learn. Hence the problem with AI.

QuoteI have no idea what exactly you're teaching and I didn't study history at uni so you're asking the wrong person for an insider answer there.

I didn't mean to get an actual exam - rather, I wanted both to learn what you thought an exam in history is, and to figure out how difficult it is to create one. I was also open to the possibility that you might have some good insight from the outside.

QuoteSad the niche topic suggestion doesn't work. I thought a key part of studying history was learning how to research and analyse sources and all that.

Realistically, that would mean that all history conducted at the undergraduate level would concern the 19th-21st centuries, in English (or French, or German, etc.). Most students don't have the linguistic capacity to properly interrogate sources in Ancient Greek, or Arabic, or even Old English. It's standard practice in US universities to hand out translated sources, but I was always ambivalent toward it, because it often gave the illusion that learning about ancient / other cultures is easy, and it tended to flatten diversity into contemporary categories.

I still assign research in advanced level classes, but mostly because these are usually more involved projects (and these can be niche), the classes are smaller so I know the students better (and thus can recognize their written style), and I try to instill a trust relationship, where I fully lean into my "disapproving dad / Tommy-Lee-Jones judging stare" mode to discourage them from using AI to write, create, research meaningfully.

One thing I found was that students use AI as a writing partner. Which isn't a bad use of AI. I just find it a sad reflection of the atomization of students (they no longer use their peers as writing partners) - and they trust it much more than they would trust their peers...

QuoteTake home exams were used specifically in topics where they were meant to be practically applicable - where you could give a challenge to be solved using what you had learned, with it being perfectly fine to check your books and the Internet in doing so, as that's how the real world works.

It's not a bad outlook. Think of it this way: the challenge, in history, isn't to "problem-solve", i.e., to "figure out the causes of WWI", much like in English it isn't about "finding the hidden meaning in 'The Sound and the Fury'". The challenge is to construct, creatively, with fallible words, some element of historical, or literary, reality.

The Minsky Moment

Quote from: Jacob on August 14, 2025, 01:20:01 PMJosq, in your version of using AI for take home exam, how does that create anything other than a [student uses AI to do the exam] --> [prof. uses AI to grade exam] process?

And if that's the process, what use is that process to anyone?

It's great for Nvidia, Microsoft, OpenAI.
We have, accordingly, always had plenty of excellent lawyers, though we often had to do without even tolerable administrators, and seem destined to endure the inconvenience of hereafter doing without any constructive statesmen at all.
--Woodrow Wilson

The Minsky Moment

Interesting thing I learned recently from Ted Gioia's Substack.  One technique Hunter S Thompson used to teach himself prose style was to retype (using a mechanical typewriter of course) the entire contents of The Great Gatsby and A Farewell to Arms.  It was a way to get him to directly experience the process of writing great prose. 

The mentality of our culture has gone 180 degrees in the opposite direction; it is all about short cuts and accepting the least common denominator as "good enough". AI makes drafting easy but it can't produce great work or even particularly good work.

The siren song is the idea that one can produce good work faster by using AI to do the basics, get to mediocre, and then touch it up. The problem is that this process does not actually replicate the process of generating great work. It might train someone to be an editor, but not a writer.

Oexmelin

Quote from: The Minsky Moment on August 14, 2025, 05:52:18 PMInteresting thing I learned recently from Ted Gioia's Substack.  One technique Hunter S Thompson used to teach himself prose style was to retype (using a mechanical typewriter of course) the entire contents of The Great Gatsby and A Farewell to Arms.  It was a way to get him to directly experience the process of writing great prose. 

Ha. That's similar to Borges' "Pierre Menard, Author of the Quixote".

Sheilbh

Quote from: The Minsky Moment on August 14, 2025, 05:52:18 PMInteresting thing I learned recently from Ted Gioia's Substack.  One technique Hunter S Thompson used to teach himself prose style was to retype (using a mechanical typewriter of course) the entire contents of The Great Gatsby and A Farewell to Arms.  It was a way to get him to directly experience the process of writing great prose. 

The mentality of our culture has gone 180 degrees in the opposite direction; it is all about short cuts and accepting the least common denominator as "good enough". AI makes drafting easy but it can't produce great work or even particularly good work.
Or perhaps enjoy the process of work. It varies by job, and ultimately I'm a boring lawyer so I don't get it often, but there is a bit of satisfaction in craft sometimes. And I think a lot of education - at least in the humanities - is in the craft (and in enjoying the getting there in some way).

(I do look at my vibe shift thread and where I've changed and slightly worry that I've just become very Modernist in my outlook :ph34r: :lol:)

QuoteThe siren song is the idea that one can produce good work faster by using AI to do the basics, get to mediocre, and then touch it up. The problem is that this process does not actually replicate the process of generating great work. It might train someone to be an editor, but not a writer.
Yes. I've experimented, and been encouraged to experiment, with it at work. Some people have found it useful for providing an outline or structure, or even a first cut of something. I think this very much depends on how you work, as some people swear by that. But actually working on the structure or outline is a really important step for me in organising my thinking and how I'll work through something.

On the other hand I find some of the summarisation tools (especially Notebook LM because of its citations) useful.

Obviously I'm in media so we've got fairly strict rules from an editorial perspective, but there are absolutely uses there - summarisation of liveblogs when there's a big event is helpful, because that basically needs doing every couple of hours and takes an annoying amount of time. AIs are good at summarisation. Similarly, it's quite good at suggesting visual description captions for pictures, or at taking an article and coming up with, say, 10 different suggested headlines (which you can tweak tonally). All of these are useful. But they're not transformative.
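As a rough sketch of the headline-suggestion workflow, here is what the request to a chat-style model might look like - this only assembles the chat messages; no model is called, and the function name and prompt wording are illustrative, not any real newsroom's tooling:

```python
# Hypothetical helper: build chat messages asking a model for n alternative
# headlines for an article. Assembling the prompt is separate from sending
# it, so an editor can review or tweak the instruction first.
def headline_prompt(article_text: str, n: int = 10) -> list[dict]:
    """Return chat messages requesting n headline suggestions."""
    system = (
        "You are a sub-editor. Suggest short, factual headlines; "
        "an editor will tweak the tone afterwards."
    )
    user = f"Suggest {n} different headlines for this article:\n\n{article_text}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

msgs = headline_prompt("A short article body...", n=10)
print(len(msgs), msgs[1]["role"])  # 2 user
```

The point of the pattern is that the model only drafts options; the human keeps the final say on wording and tone, which matches the "useful but not transformative" assessment above.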

It really is used by engineers, but even there I've heard mixed experiences, and I think it depends how bespoke your code base is.
Let's bomb Russia!