A skill that probably only a few VeteranProgrammers still possess.
I have actually tried to puzzle out HexDumps of M68K assembly on the AtariSt, though not fluently and only for small routines (the bootloader etc.), but I was born in 1973, so I'm not required to :-) -- GunnarZarncke
The skill of reading hex dumps is vanishing for many reasons:
- Far better tool support. Computers are far better than humans at the rote job of disassembling binaries, and the tools to do it are now ubiquitous. If you think your hex dump tool is broken, chances are you can download another one off the net within minutes.
- Code portability. Generations ago, programmers had to worry about exactly one architecture (whatever the mainframe in their workplace was). Nowadays, programmers write code which is frequently targeted to numerous different architectures, some of them virtual.
- Increased complexity in binary file formats and program loaders. The bits in your average ELF file are, most assuredly, NOT the bits that will eventually be executed by the processor. The OS or program loader has to relocate your code (especially on the x86, where position-independent code is expensive due to that architecture's register poverty) and link in goodness knows how many DLLs/shared libraries. Many non-trivial programs also invoke the loader again at runtime to pull in libraries that aren't even known at program load time (there is a small sketch of this after the list).
- The proliferation of libraries and OS calls turns the addresses of thousands of functions into the "opcodes" that must be understood to know what a program does.
- Quite a few architectures, especially in the embedded world, have compressed instruction sets (such as ThumbCompression?) -- thus turning an essentially stateless activity (mapping opcodes onto mnemonics) into a highly stateful one.
- The increased use of higher-level languages, where the mapping of programming constructs to machine constructs is often unspecified and/or complex.
- The increased use of optimizing compilers for systems-level languages like C/C++ makes analysis of generated assembly code (let alone raw binaries) increasingly dicey; optimizing compilers use rare and complex instructions in unorthodox ways (e.g. LEA to do arithmetic on the x86; see the LEA sketch after this list).
- The large increase in instruction-set size and complexity with wave after wave of SIMD extensions.
- Figuring out what data goes where is reasonably easy; locating the stalls and interlocks of a superscalar pipeline is better left to tools.
- LifesTooShort.
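To make the runtime-loader point above concrete, here is roughly what "invoking the loader at runtime" looks like on a POSIX system. A minimal sketch, not anyone's production code; libm.so.6 and cos are just convenient stand-ins for whatever a real program pulls in.

 /* loadit.c -- load a shared library and call a function from it at runtime.
  * Build with something like: cc loadit.c -ldl
  */
 #include <stdio.h>
 #include <dlfcn.h>

 int main(void)
 {
     void *handle = dlopen("libm.so.6", RTLD_NOW);   /* ask the loader, at runtime */
     if (!handle) {
         fprintf(stderr, "dlopen: %s\n", dlerror());
         return 1;
     }

     double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
     if (cosine)
         printf("cos(0.0) = %f\n", cosine(0.0));

     dlclose(handle);
     return 0;
 }

None of those library bits are in the executable's hex dump at all; they only show up once the loader has done its work.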
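And here is the LEA trick mentioned above. Again only a sketch: the assembly in the comment is what a typical x86-64 compiler (GCC or Clang at -O2) tends to emit, but the exact output depends on compiler, version, and flags.

 /* scale.c -- trivial arithmetic that tempts the compiler to use LEA */
 int scale(int a, int b)
 {
     return a + 4 * b + 7;
 }

 /* A typical x86-64 compiler at -O2 emits no ADD or IMUL at all, e.g.:
  *
  *     lea eax, [rdi + rsi*4 + 7]
  *     ret
  *
  * LEA is nominally an address-calculation instruction, but here it does
  * three arithmetic operations in one opcode -- exactly the kind of
  * unorthodox use that makes hand-reading optimized code harder.
  */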
Huh? I was born in 1981 and have spent a good amount of time looking at sniffer dumps. Remember that networks are much more important to today's young hackers, and most network protocols are a.) binary and b.) proprietary. -- JonathanTang
I agree. Seems to depend more on one's special field. -- .gz
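For the sniffer-dump case JonathanTang mentions, a minimal sketch (the sample bytes are made up for illustration) of picking an IPv4 header apart by hand, the way you would when reading a capture without a protocol decoder:

 /* ipv4peek.c -- pull a few fields out of a captured IPv4 header by hand. */
 #include <stdio.h>

 int main(void)
 {
     /* First bytes of a hypothetical IPv4 packet, as seen in a hex dump. */
     unsigned char pkt[] = {
         0x45, 0x00, 0x00, 0x3c,  0x1c, 0x46, 0x40, 0x00,
         0x40, 0x06, 0xb1, 0xe6,  0xc0, 0xa8, 0x00, 0x68,
         0xc0, 0xa8, 0x00, 0x01
     };

     unsigned version = pkt[0] >> 4;                  /* high nibble     */
     unsigned ihl     = (pkt[0] & 0x0f) * 4;          /* header length   */
     unsigned total   = (pkt[2] << 8) | pkt[3];       /* big-endian word */
     unsigned ttl     = pkt[8];
     unsigned proto   = pkt[9];

     printf("IPv%u, header %u bytes, total length %u, ttl %u, protocol %u\n",
            version, ihl, total, ttl, proto);
     printf("src %d.%d.%d.%d  dst %d.%d.%d.%d\n",
            pkt[12], pkt[13], pkt[14], pkt[15],
            pkt[16], pkt[17], pkt[18], pkt[19]);
     return 0;
 }

The mechanics are the same as reading a hex dump of code: know the layout, mind the byte order, and do the arithmetic -- which is probably why the skill survives better among people who stare at packets than among people who stare at executables.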