• r00ty@kbin.life · 2 months ago

    I used to write Z80 asm without an assembler back when I was a LOT younger. The ZX Spectrum manual I had included the full instruction list with the byte values.

    I think it was oddly easier than some higher level languages for some tasks.

    But, making changes was an utter nightmare.
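
    For illustration, a hedged Z80 sketch of that kind of hand-assembly (setting the Spectrum border colour via the ULA port, picked purely as an example):

        LD   A, 2       ; 3E 02   colour 2 = red
        OUT  (254), A   ; D3 FE   write it to the ULA port
        RET             ; C9      back to BASIC

    You'd look the opcode bytes up in the manual, POKE 62, 2, 211, 254, 201 into spare memory, and jump to it with RANDOMIZE USR.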

  • leo85811nardo@lemmy.world · 2 months ago

    From my understanding, one of the actual use cases of assembly is for cybersecurity engineers to dump the assembly instructions from a compiled program so they can check for potential vulnerabilities. I’ve also seen assembly included in an embedded codebase (the overall project is in C), which I assume is for more optimized performance and deterministic behavior.
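
    As an illustration of what that auditing looks like (nothing here is from a real binary; the routine name and layout are made up), one classic red flag in a disassembly is a fixed-size stack buffer handed straight to strcpy:

    check_input:                      ; hypothetical routine recovered from a dump (x86-64, Intel syntax)
        sub     rsp, 40               ; 32-byte local buffer on the stack, plus alignment
        mov     rsi, rdi              ; source = caller-controlled string
        lea     rdi, [rsp]            ; destination = the stack buffer
        call    strcpy                ; no length check anywhere -> overflow candidate
        add     rsp, 40
        ret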

  • jaybone@lemmy.world · 2 months ago

    Assembly used to be a required course for CS undergrads in the 90s. Is that no longer the case?

    Also we had to take something called Computer Architecture, which was like an EE class designing circuits with gates and shit.

    • CanadaPlus@lemmy.sdf.org · 2 months ago

      Which target did you use? Having to learn even a fraction of modern x86 would be ridiculous, but SPARC or something could be good to know, just to reduce the “magic box” effect.

        • trolololol@lemmy.world · 2 months ago

          I learned MIPS as a graduate student. In undergrad I had to build circuits from logic gates for things like a 2-digit decimal counter, and my architecture classes were block diagrams for a simple CPU. But by that time we knew how to do moderate-complexity circuits in VHDL simulation, and we had to make a simple VHDL circuit run for real on an FPGA.

    • Cethin@lemmy.zip · 2 months ago

      I think the university I went to phased out the EE requirements the year after me. Honestly, I think it should be required. Understanding how the computer “thinks” is such an important skill.

      • trolololol@lemmy.world · 2 months ago

        I had to learn assembly, but it was one topic of many we handled in architecture. Like one question on one exam. That professor was one of the toughest we had; the class was around 2001.

    • luciferofastora@lemmy.zip · 2 months ago

      I attended two different Bachelor’s courses, one with a very technical focus (2016-2018) and one with a more high-level focus (2018-2023). The first did have a class where we learned how to go from logic gates to a full ALU, as well as some actual EE classes, but I didn’t go far enough (or memorise the list of classes) to remember whether Assembly would have become a thing. We learned programming first with Processing, then C and C++.

      The second had C as an elective course, and that was as technical and low-level as it ever got.

  • darklamer@lemmy.dbzer0.com · 2 months ago

    It’s now been 18 years since the last time an employer paid me to write assembly, but it’s only been a year or so since the last time I had to read assembly at work (in order to verify what the compiler really was doing).
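
    For anyone who hasn't done that kind of reading: the listing you end up checking is often tiny. For instance, an optimising compiler will typically turn int add(int a, int b) { return a + b; } into something like this on x86-64 (Intel syntax; the exact output varies by compiler and flags):

    add:
        lea     eax, [rdi + rsi]   ; a arrives in edi, b in esi; the sum leaves in eax
        ret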

  • JoYo@lemmy.ml · 2 months ago

    I get the feeling that all of these assembly jokes are justifications to avoid learning assembly.

    You can still make syscalls in assembly. Assembly isn’t magic. It isn’t starting from the creation of matter and energy, it’s just very specific code. ¯_(ツ)_/¯
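
    To show how specific that code gets, here is a minimal sketch of a raw syscall, assuming x86-64 Linux and NASM syntax, with no libc anywhere:

            global  _start

            section .rodata
    msg:    db      "written by a bare write() syscall", 10
    msglen  equ     $ - msg

            section .text
    _start:
            mov     rax, 1          ; syscall 1 = write
            mov     rdi, 1          ; fd 1 = stdout
            lea     rsi, [rel msg]  ; buffer address
            mov     rdx, msglen     ; byte count
            syscall

            mov     rax, 60         ; syscall 60 = exit
            xor     edi, edi        ; exit status 0
            syscall

    Assemble with nasm -felf64, link with ld, and it runs with no runtime at all.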

      • Fonzie!@ttrpg.network · 2 months ago

        You dropped this \

        Short explanation: Type ¯\\\_(ツ)\_/¯ to see ¯\_(ツ)_/¯.

        Long explanation: Lemmy supports formatting, like _italic_ becomes italic. To stop this from happening, you can put a \ before it like \_; the \ isn’t shown. This is why ¯\_(ツ)_/¯ becomes ¯_(ツ)_/¯. To show a \ you need an additional \ like so: \\, and to make sure _ is shown and not turned into italic, it too needs \. This is why ¯\\\_(ツ)\_/¯ becomes ¯\_(ツ)_/¯.

        • AnUnusualRelic@lemmy.world · 2 months ago

          The backslash is known as an escape character in this context, because it removes (escapes) the special meaning of the following character.

          It’s also used that way in most Unix shells.

        • ulterno@lemmy.kde.social · 2 months ago

          Alternatively, you can just use the `` enclosure used for single-line code.
          That is a “grave accent” or a “backtick”, the key you’ll find to the left of the ‘1’ key and under the ‘Esc’ key on a standard (ISO, maybe) 104/105-key QWERTY keyboard.

          ¯\_(ツ)_/¯

          • Fonzie!@ttrpg.network · 2 months ago

            global _main
                extern  _GetStdHandle@4
                extern  _WriteFile@20
                extern  _ExitProcess@4
            
                section .text
            _main:
                ; DWORD  bytes;    
                mov     ebp, esp
                sub     esp, 4
            
                ; hStdOut = GetstdHandle( STD_OUTPUT_HANDLE)
                push    -11
                call    _GetStdHandle@4
                mov     ebx, eax    
            
                ; WriteFile( hstdOut, message, length(message), &bytes, 0);
                push    0
                lea     eax, [ebp-4]
                push    eax
                push    (message_end - message)
                push    message
                push    ebx
                call    _WriteFile@20
            
                ; ExitProcess(0)
                push    0
                call    _ExitProcess@4
            
                ; never here
                hlt
            message:
                db      '¯\\\_(ツ)\_/¯', 10
            message_end:
            
  • LavenderDay3544@lemmy.world · 2 months ago

    OS and embedded dev here. I use assembly all the time. I’ve even worked on firmware that was entirely in assembly because of strict requirements that couldn’t be met in C.

    Also, even machine code hides a lot about how the underlying machine works, so if you really want to do computing from scratch you really do have to invent the universe, because there are abstractions all the way up the hardware stack just like there are in software.
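
    For a picture of the assembly that typically survives in otherwise-C firmware: the very first instructions after reset, before any C runtime exists. A hedged sketch assuming an ARM Cortex-M part and GNU assembler syntax (names like _estack and .isr_vector are placeholders from typical linker scripts; real startup code also copies .data and zeroes .bss, omitted here):

            .syntax unified
            .thumb

            .section .isr_vector, "a"   @ vector table, placed at the start of flash
            .word   _estack             @ word 0: initial stack pointer (linker-provided)
            .word   Reset_Handler       @ word 1: where the core jumps on reset

            .text
            .thumb_func
            .global Reset_Handler
    Reset_Handler:
            bl      main                @ hand over to the C side of the project
    hang:
            b       hang                @ if main ever returns, just spin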

  • fuy@lemmy.ml · 2 months ago

    For a university assignment, I built a compiler for x86; I cheated a bit by relying on LLVM, but it gave me a better understanding of the architecture. I also developed emulators for the NES (Ricoh 2A03) and RISC-V (RV32I) as a hobby. For the latter, I also implemented it on an FPGA.

  • Jo Miran@lemmy.ml · 2 months ago

    In college back in 1991. Also had to do PASCAL and FORTRAN but thankfully those two were in a single course.

    • expatriado@lemmy.world · 2 months ago

      I also took PASCAL in the 90s, but it is considered a high-level language and reads similarly to other high-level languages; assembly has a very different syntax.
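
      To make the contrast concrete, here is roughly what a Pascal loop like for i := 1 to 10 do s := s + i ends up as at the assembly level (x86, Intel syntax, hand-written for illustration rather than real compiler output):

              xor     eax, eax        ; s := 0
              mov     ecx, 1          ; i := 1
      next:
              add     eax, ecx        ; s := s + i
              inc     ecx             ; i := i + 1
              cmp     ecx, 10
              jle     next            ; repeat while i <= 10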

      • thejml@lemm.ee · 2 months ago

        We used Turbo Pascal in school in the early 90s. And it had assembly blocks… which I used copious amounts of, because it was the only way to make the IBM PS/1s do useful graphics.
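
        For anyone curious, those asm … end; blocks held roughly this sort of thing: a generic mode 13h sketch (real-mode x86, BIOS video services), illustrative rather than anything from a real program:

                mov     ax, 0x0013      ; int 10h, ah = 0: set video mode 13h (320x200, 256 colours)
                int     0x10
                mov     ax, 0xA000      ; the VGA framebuffer lives at segment A000h
                mov     es, ax
                mov     di, 100*320+160 ; offset of the pixel at (160, 100)
                mov     byte [es:di], 4 ; palette index 4 = red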

      • Jo Miran@lemmy.ml · 2 months ago

        Oh, I know. I meant that we had to take courses on older languages as part of the curriculum. That was a funky little college program. The oddest experience for me was taking Python back in the day as the “new thing” then not seeing it again until it absolutely exploded ~10 years ago. That program is also why I ended up playing with Linux so early on. The professors truly seemed to have a passion for emerging technologies while not wanting anyone to forget what came before. Thankfully, no punch cards.

  • MyNameIsRichard@lemmy.ml · 2 months ago

    Only on the VIC-20 and Atari STe. On the VIC-20 you had to write the assembly by hand, manually convert it to machine code, and enter that into the computer. There was a cartridge with an assembler, a debugger, and an extra 3.5 KB of memory for it, but I never got one.
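
    What that manual conversion looked like in practice, roughly, as a 6502 sketch (the $900F screen/border colour register is quoted from memory, so treat the details as approximate):

        LDA #$0A        ; A9 0A      load 10 into the accumulator
        STA $900F       ; 8D 0F 90   write it to the VIC's screen/border colour register
        RTS             ; 60         back to BASIC

    Those are the bytes (169, 10, 141, 15, 144, 96) you would then type in, usually via POKEs from BASIC, and run with SYS.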

    • stanka@lemmy.ml · 2 months ago

      The VIC-20 was my first. I watched my dad struggle with and eventually give up on assembly. Something-something and the microbots. I was fearful of it until I took Assembly at Uni. That 2nd/3rd-year class was where the final puzzle piece of how computers work fell into place for me.

      My first job was writing assembly tests for a DSP hardware design team. Fell in love. Never looked back.

  • Cethin@lemmy.zip · 2 months ago

    Anyone asking about Assembly the way OP is with this meme should play the game Turing Complete. It’s great. You have to design a computer all the way up from the most basic logic gates (I think you only get a NAND gate to start): designing an ALU and CPU, creating your own machine language, and writing your own programs in the language you designed, and it’s all simulated the whole time. Machine language is pretty advanced as far as things go.

    • Cavemanfreak@lemm.ee · 2 months ago

      We got to do something similar in uni. We modelled the CPU in VHDL and had to define our own language, then we had to program a game for it. One of the most fun and interesting courses we took!