Well Defined Objects are CRUNCHY CRISP SOLID: they typically have interfaces free from change, no matter what the client says they want this week.
Compilers have Tokenisers in them.
Every version of C and C++ has had a grammar implementable by a standard parser generator.
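For instance, a toy tokeniser feeding a toy recursive-descent parser might look like this in C++ (the token set and grammar here are invented for illustration, not taken from any real compiler):

 #include <cctype>
 #include <iostream>
 #include <string>
 #include <vector>

 // A toy tokeniser: splits "1 + 2 * 3" into number and operator tokens.
 struct Token { char kind; long value; };  // kind: 'n' for number, else the operator character

 std::vector<Token> tokenise(const std::string& src) {
     std::vector<Token> tokens;
     for (size_t i = 0; i < src.size(); ) {
         if (std::isspace(static_cast<unsigned char>(src[i]))) { ++i; continue; }
         if (std::isdigit(static_cast<unsigned char>(src[i]))) {
             long v = 0;
             while (i < src.size() && std::isdigit(static_cast<unsigned char>(src[i])))
                 v = v * 10 + (src[i++] - '0');
             tokens.push_back({'n', v});
         } else {
             tokens.push_back({src[i], 0});
             ++i;
         }
     }
     return tokens;
 }

 // A toy recursive-descent parser for the grammar:
 //   expr ::= term (('+' | '-') term)*
 //   term ::= number (('*' | '/') number)*
 struct Parser {
     const std::vector<Token>& toks;
     size_t pos = 0;
     long number() { return toks[pos++].value; }  // assumes a number is next (no error handling)
     long term() {
         long v = number();
         while (pos < toks.size() && (toks[pos].kind == '*' || toks[pos].kind == '/')) {
             char op = toks[pos++].kind;
             long rhs = number();
             v = (op == '*') ? v * rhs : v / rhs;
         }
         return v;
     }
     long expr() {
         long v = term();
         while (pos < toks.size() && (toks[pos].kind == '+' || toks[pos].kind == '-')) {
             char op = toks[pos++].kind;
             long rhs = term();
             v = (op == '+') ? v + rhs : v - rhs;
         }
         return v;
     }
 };

 int main() {
     std::vector<Token> toks = tokenise("1 + 2 * 3");
     Parser p{toks};
     std::cout << p.expr() << "\n";  // prints 7
 }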
Large classes of machine learning algorithms are naturally expressible in terms of vectors and vectors of vectors.
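A sketch of the idea in C++, treating a matrix as a vector of row vectors (the layer sizes and weights below are made up for illustration):

 #include <iostream>
 #include <vector>

 using Vec = std::vector<double>;
 using Mat = std::vector<Vec>;  // a matrix as a vector of row vectors

 // One layer of a neural network is just a matrix-vector product:
 // output[i] = sum over j of weights[i][j] * input[j]
 Vec apply_layer(const Mat& weights, const Vec& input) {
     Vec output(weights.size(), 0.0);
     for (size_t i = 0; i < weights.size(); ++i)
         for (size_t j = 0; j < input.size(); ++j)
             output[i] += weights[i][j] * input[j];
     return output;
 }

 int main() {
     Mat weights = {{0.5, -1.0, 2.0},   // made-up weights
                    {1.5,  0.0, 0.5}};
     Vec input   = {1.0, 2.0, 3.0};
     for (double v : apply_layer(weights, input))
         std::cout << v << " ";         // prints 4.5 3
 }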
When playing with large piles of data we often need to look things up quickly. HashTables are nice, stable objects that are simple on the outside and nasty on the inside when they have been tweaked for speed.
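A minimal sketch of that outside/inside split, assuming string keys, int values, linear probing, and a fixed table size (it never resizes, so this is a toy, not production code):

 #include <iostream>
 #include <optional>
 #include <string>
 #include <vector>

 // Simple on the outside: just put() and get(). The hash function and
 // probing strategy inside are where the speed tweaks (and the nastiness) live.
 class StringIntTable {
     struct Slot { std::string key; int value = 0; bool used = false; };
     std::vector<Slot> slots;

     size_t hash(const std::string& key) const {
         size_t h = 5381;                         // djb2-style string hash, for illustration
         for (char c : key) h = h * 33 + static_cast<unsigned char>(c);
         return h % slots.size();
     }

 public:
     StringIntTable() : slots(256) {}             // fixed size: a real table would resize

     void put(const std::string& key, int value) {
         size_t i = hash(key);
         while (slots[i].used && slots[i].key != key)
             i = (i + 1) % slots.size();          // linear probing on collision
         slots[i] = {key, value, true};
     }

     std::optional<int> get(const std::string& key) const {
         for (size_t i = hash(key); slots[i].used; i = (i + 1) % slots.size())
             if (slots[i].key == key) return slots[i].value;
         return std::nullopt;
     }
 };

 int main() {
     StringIntTable t;
     t.put("answer", 42);
     if (auto v = t.get("answer")) std::cout << *v << "\n";  // prints 42
 }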
Configuration files get read in every version of the product. The contents vary, but if we build the simplest possible file reader (for this version), it is still very easily extendable. (I suspect I am wrong here, as OnceAndOnlyOnce would argue for a more generic solution.)
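A sketch of that simplest possible reader, assuming a key=value format with one entry per line (the file name and key below are hypothetical):

 #include <fstream>
 #include <iostream>
 #include <map>
 #include <string>

 // The simplest possible reader for a "key=value" file, one entry per line.
 // Comments, quoting, sections, etc. can be added when a version needs them.
 std::map<std::string, std::string> read_config(const std::string& path) {
     std::map<std::string, std::string> config;
     std::ifstream in(path);
     std::string line;
     while (std::getline(in, line)) {
         size_t eq = line.find('=');
         if (eq == std::string::npos) continue;   // skip malformed lines
         config[line.substr(0, eq)] = line.substr(eq + 1);
     }
     return config;
 }

 int main() {
     auto config = read_config("product.cfg");    // hypothetical file name
     std::cout << config["loglevel"] << "\n";     // hypothetical key
 }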
Building them is ReinventingTheWheel.
Yes, building the examples above would usually be reinventing the wheel; they were examples that everyone would know. The tokeniser I wrote, however, did things I have never seen a tokeniser do. The same goes for the hash table and the hash function it is based on. Perhaps robust pre-invented ones do exist, but I expect it would have taken longer to find them than to build them.