Just Give Me The Meat

Tags
AI, LLM, machine learning, C
Owner
Justin Nearing

Anthropomorphizing consumer LLMs is a bad idea.


Bing Copilot starts the majority of its responses with “Certainly!”

This is a bad idea.

When I ask a question built on a flat-out wrong assumption, it still starts the response with “Certainly!”

This makes me think my assumption is right.

And it confuses me when the rest of the answer explains how I’m wrong.

The difference between those C allocators is how they allocate memory.

calloc is not the “CPU memory allocator.”

It allocates a contiguous block and initializes every byte to zero.

Useful if you need it, but not fundamentally different from malloc.
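
For anyone who just wants the meat, here’s a minimal sketch of that difference in plain C (the variable names are mine, purely for illustration):

```c
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = 8;

    /* malloc: one contiguous block, contents left indeterminate */
    int *a = malloc(n * sizeof *a);

    /* calloc: the same kind of contiguous block, but every byte set to zero */
    int *b = calloc(n, sizeof *b);

    if (!a || !b) {
        free(a);
        free(b);
        return 1;
    }

    printf("b[0] == %d\n", b[0]); /* always 0: calloc zeroed it */
    /* reading a[0] before writing it would be a bug: malloc doesn't initialize */

    free(a);
    free(b);
    return 0;
}
```

Same memory either way; calloc just hands it back zeroed, and most implementations also guard the count-times-size multiplication against overflow.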

The meat of the answer tells me exactly that.

But the reinforcement training meant to make Copilot feel friendlier has only succeeded in confusing me.

Confusing the user is literally the worst thing a web search can do.

I would 1000% rather an LLM chatbot be a cold, impersonal, fact-regurgitating machine than an over-helpful puppy containing the sum of all human knowledge.

Because that puppy’s eyes reflect the thousand-yard stare of an entity that can provide step-by-step instructions for plutonium enrichment, if you can but slip past the puppy mask.

LLMs should talk to humans the way humans should talk to the police: just the facts.

No opinion, no color, just give the exact answer to the question to the best of your ability.

Leave it to the humans to have the opinions.

Just give me the meat.