Bye, good night. See you guys tomorrow!
It probably “learns” from the info in the system message. Just make sure the system message isn’t too long, since you pay per token and GPT-4 isn’t exactly cheap ($0.03/1K prompt tokens). I also recommend setting some usage limits to prevent a hefty bill.
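If you want a rough idea of what the system message adds to every request, something like this works (a minimal sketch using the tiktoken library; the prompt text and price are just placeholders, not your real values):

```python
# Sketch: estimate the per-call cost of a system message with tiktoken.
# The prompt and price below are placeholders; swap in your own.
import tiktoken

system_message = "You are a cranky but helpful support bot for our marketplace."

enc = tiktoken.encoding_for_model("gpt-4")    # tokenizer GPT-4 uses
num_tokens = len(enc.encode(system_message))  # tokens sent on *every* request

price_per_1k = 0.03  # GPT-4 prompt pricing per 1K tokens
print(f"{num_tokens} tokens ≈ ${num_tokens / 1000 * price_per_1k:.4f} per call")
```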
We are toying with making her character a bit cranky/moody/funny, but our devs said we keep running into AI hallucination problems. Didn’t even know it was a thing.
What is an AI hallucination?
AI hallucination is a phenomenon in which an AI model, such as a large language model (LLM) behind a generative chatbot or a computer vision tool, perceives patterns or objects that are nonexistent or imperceptible to human observers, producing outputs that are nonsensical or altogether inaccurate.
@420Bot - Always high and hallucinating
@Tweaker
@SOS
TalkBot
GigaChat
Yeah, that’s common with language models. Can you provide some example responses that your devs got when it hallucinated?
Additionally, try switching from gpt-4 to gpt-4-0314 or gpt-4-0613. These are snapshots of earlier GPT-4 versions, and I found that 0314 seems to pay more attention to the system message than the latest version, which makes hallucination less likely.
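Switching is just a matter of changing the model string in the API call. A minimal sketch with the OpenAI Python client (the cranky persona prompt here is only an example, not your actual system message):

```python
# Sketch: pin requests to a dated GPT-4 snapshot instead of the moving "gpt-4" alias.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-0314",  # or "gpt-4-0613"; dated snapshots don't change under you
    messages=[
        {"role": "system", "content": "You are a slightly cranky but helpful assistant."},
        {"role": "user", "content": "Where is my order?"},
    ],
)
print(response.choices[0].message.content)
```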
@scoob
BabySwapiee
Hi! To find out what I can do, say @swapdbot display help
Neo
Ava
Helpswapd