This is the work that motivated the OP.  I'm creating my own triple schema: OWL works for machine cognition, not for human cognition.  I should add that #CharlesPeirce is in this mix, as is #BarrySmith (#BFO).  I've also used emoji for a directly readable form:

0🔸6▪️⚗️🔸2🔹📒🔹This is really the most fabulous article.
0🔸6▪️⚗️🔸2🔹⬅️🔹⚗️🔸9
0🔸6▪️⚗️🔸2🔹⬅️🔹👤🔸CentA
0🔸6▪️⚗️🔸2🔹➡️🔹⚗️🔸7
0🔸6▪️⚗️🔸3🔹🏷🔹Mail\nRoom\nReceiving
0🔸6▪️⚗️🔸3🔹⬅️🔹⚗️🔸4
0🔸6▪️⚗️🔸3🔹➡️🔹💽🔸5
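
A minimal sketch of how these rows could be split back into triples, assuming 🔹 separates the subject, predicate, and object fields and 🔸 separates a type glyph from its local id — that's my reading of the example above, not a fixed spec:

```python
SEP_FIELD = "🔹"  # assumed separator between subject / predicate / object
SEP_ID = "🔸"     # assumed separator between a type glyph and its id

def parse_triple(line: str):
    """Return (subject, predicate, object) from one emoji-delimited line."""
    line = line.strip()
    if not line:
        return None
    parts = line.split(SEP_FIELD)
    if len(parts) != 3:
        raise ValueError(f"expected 3 fields, got {len(parts)}: {line!r}")
    return tuple(parts)

# Example row from above: subject points (➡️) at node ⚗️🔸7.
s, p, o = parse_triple("0🔸6▪️⚗️🔸2🔹➡️🔹⚗️🔸7")
print(s, p, o.split(SEP_ID))  # object splits into its type glyph and local id
```

Splitting on the whole 🔹 codepoint keeps the variation selectors inside ▪️, ⚗️, ⬅️, and ➡️ attached to their glyphs, so the fields come back intact.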

I am a hack, an amateur, doing outsider art.

https://cdn.fosstodon.org/media_attachments/files/111/058/221/056/369/386/original/b3c683850ecc1b14.png 
 Simply posting this made me commit to a human-readable translation.  Fully qualified emoji are difficult to code with.  You can do it, but most editors and operating systems get confused.  Ubuntu 22.04 copes, particularly if you pin the Vim font to a limited one like Bitstream Vera Sans, so the fallback is predictable.  My latest version also uses a short UUID for the node ID, which isn't really human readable, but any human-centric scheme, like incremental integers, has other problems — collisions as soon as two sources mint IDs independently, for one.
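
If it helps, here is a minimal sketch of one way a short UUID node ID can be minted; the 8-character base32 form is an assumption for illustration, not necessarily what the graph above uses:

```python
import base64
import uuid

def short_node_id(length: int = 8) -> str:
    """Random UUID, base32-encoded and truncated: compact, merge-safe, not human readable."""
    raw = uuid.uuid4().bytes
    return base64.b32encode(raw).decode("ascii").rstrip("=").lower()[:length]

print(short_node_id())  # e.g. something like 'q3hxe4bz'
```

The point of the randomness is that two people (or two machines) can create nodes independently without coordinating a counter, which is exactly where incremental integers fall down.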