College professors are going back to paper exams and handwritten essays to fight students using ChatGPT

The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.

  • Mugmoor@lemmy.dbzer0.com · 1 year ago

    When I was in college for computer programming (about 6 years ago), I had to write all my exams on paper, including code. This isn’t exactly a new development.

    • whatisallthis@lemm.ee · 1 year ago

      So what you’re telling me is that written tests have, in fact, existed before?

      What are you some kind of education historian?

      • Eager Eagle@lemmy.world · 1 year ago

        He’s not just pointing out that handwritten tests are nothing new; he’s pointing out that using handwritten tests instead of typed ones to reflect a student’s actual abilities isn’t new either.

    • lunarul@lemmy.world · 1 year ago

      I had some teachers ask for handwritten programming exams too (that was more like 20 years ago for me) and it was just as dumb then as it is today. What exactly are they preparing students for? No job will ever require the skill of writing code on paper.

      • Dark Arc@social.packetloss.gg · edited · 1 year ago

        What exactly are they preparing students for? No job will ever require the skill of writing code on paper.

        Maybe something like a whiteboard interview…? They’re still incredibly common, especially for new grads.

        • lunarul@lemmy.world · 1 year ago

          A company that still does whiteboard interviews is one I have no interest in working for. When I interview candidates, I want to see how they will perform in their job. Their job will not involve writing code on whiteboards, solving weird logic problems, or knowing how to solve the traveling salesman problem off the top of their heads.

          • Dark Arc@social.packetloss.gg · 1 year ago

            That’s a valid opinion, and I largely share it. But all these students need to work somewhere, and this is something the industry needs to change before schools change it.

            Also, I’ve definitely done whiteboard coding discussions in practice, e.g., going into a room and writing up ideas on the whiteboard (including small snippets of code or pseudocode).

          • pinkdrunkenelephants@sopuli.xyz · 1 year ago

            And what happens when you run into the company that wants people who can prove they conceptually understand what the hell it is they’re doing on their own, which requires a whiteboard?

            I program as a hobby and I’ll jot down code and plans for programs on paper when I am out and about during the day. The fuck kind of dystopian hellhole mindset do you have where you think all that matters is doing the bare minimum to survive? You know that life means more than that, don’t you?

            • lunarul@lemmy.world · 1 year ago

              The ability to conceptually understand what they’re doing is exactly what I’m testing for when interviewing. Writing a full program on a whiteboard is definitely not required for that. I can get that from asking them questions, observing how they approach the problem, what kind of questions they ask me, etc.

              I definitely don’t want them to do just the bare minimum to survive or to need to ask me for advice at every step (had people who ended up taking more of my time than it would’ve taken me to do their job myself).

              I’ve never needed to write more than a short snippet of code at a time on a whiteboard, in a Slack channel, in a code review, etc. in my almost 20 years in the industry, and definitely not to solve a whole problem blindly. In fact, I see it as a red flag when a candidate writes a lot of code without ever stopping to execute and test each piece individually. Code simply becomes progressively more difficult to debug the more you add to it; that’s common sense.

        • Eager Eagle@lemmy.world · edited · 1 year ago

          Which is equally useless. In the end you’re developing a skill that will only be used in tests. You’re training to be evaluated instead of to do a job well.

        • lunarul@lemmy.world · 1 year ago

          I personally never had a problem performing well on those tests. I happen to have the skill of compiling code in my head, and it’s a helpful skill in my job (I’ve been a software engineer for 19 years now), but it’s definitely not a required skill and shouldn’t be treated as one.

    • Eager Eagle@lemmy.world · 1 year ago

      Same. All my algorithms and data structures courses in undergrad and grad school had paper exams. I have a mixed view of them, but the bottom line is that I’m not convinced they’re any better.

      Sure, they might reflect some of a student’s abilities better, but if you’re an evaluator interested in assessing a student’s knowledge, a more effective way is to ask directed questions.

      What ends up happening a lot of the time is implementation questions that ask too much of the student at once: interpretation of the problem; knowledge of helpful data structures and algorithms; abstract reasoning; edge case analysis; syntax; time and space complexities; and a good sense of planning, since you’re supposed to answer in a few minutes without the luxury and conveniences of a text editor.

      This last one is my biggest problem with it. It adds a great deal of difficulty and stress without adding any value to the evaluator.