Context Is King

Justin Nearing

Notion, like every other company chasing investor capital, is in the “jam chatbots into every possible orifice” phase of development.

I don’t like it, because I don’t need it.

I write to get what is inside of me out.

I don’t need an AI button to do it for me.

AI cannot do that.

The point is for me to process whatever thing I’m thinking about.

Art is weird therapy: I can’t let go of something until I’ve expressed it.

They do have a new Q&A chatbot, which can search through all the content I’ve written in Notion and provide feedback.

I asked “what themes consistently come through throughout all my articles on Notion.”

It provided a fairly shallow answer on the 8 articles it appeared to search through.


I get the feeling it dumped the first 10 pages it found into the chat session, then listed each page it found.

Like, it’s kind of cool in an “almost useful” way.

Unfortunately “almost useful” in this case is “actually useless”.

The big thing LLMs are missing is context.

The value it can provide is only as deep as its knowledge of my content.

I want an expert on everything I’ve written.

My biggest fan, the most knowledgeable stan.

Themes Expressed as Vectors

Really what I’m asking for is a running vector database constantly pulling data from my content.
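A minimal sketch of what that “running” index might look like, in plain Python. The `embed()` function here is a toy stand-in I made up for illustration; a real system would call an actual embedding model, and the re-indexing would hook into Notion’s page-edit events:

```python
# Hypothetical sketch: an index that re-embeds a page whenever it changes.
import hashlib

def embed(text: str) -> list[float]:
    # Toy stand-in: hash the text into a fixed-size vector.
    # A real system would call an embedding model here instead.
    vec = [0.0] * 8
    for i, byte in enumerate(hashlib.sha256(text.encode()).digest()):
        vec[i % 8] += byte / 255
    return vec

class RunningIndex:
    def __init__(self):
        # One vector per page, keyed by page id.
        self.vectors: dict[str, list[float]] = {}

    def upsert(self, page_id: str, text: str) -> None:
        # Called whenever a page is created or edited.
        self.vectors[page_id] = embed(text)

index = RunningIndex()
index.upsert("page-1", "Art is weird therapy")
index.upsert("page-1", "Art is weird therapy, revised")  # re-embeds in place
print(len(index.vectors))  # one entry per page, always current
```

The point of the sketch is the `upsert`: the index never goes stale, because editing a page replaces its vector rather than appending a new one.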

This is a vector database: it stores data as a “vector”, representing it almost as a location within the database.

LLMs use vector search, finding which data points are close together, to construct replies.

Each data point is a word, and models are trained to learn which words sit close to each other.
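“Close together” usually means cosine similarity between vectors. A tiny sketch, using made-up 2-D vectors (real embeddings have hundreds of dimensions, and real databases use approximate nearest-neighbor indexes rather than brute force):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: 1.0 means pointing the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 2-D "embeddings": related words get nearby vectors.
points = {
    "king":  [0.90, 0.80],
    "queen": [0.88, 0.82],
    "pizza": [0.10, 0.95],
}

query = [0.90, 0.79]
# Brute-force nearest neighbor: the word whose vector is closest.
best = max(points, key=lambda word: cosine(points[word], query))
print(best)  # "king" sits closest to the query vector
```

Swap the toy vectors for real embeddings of whole paragraphs and you have the retrieval step that a Q&A bot runs before writing its answer.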


It’s the connections between points of data.

The more connections that are established, the more value a