I don’t know about ChatGPT, but Microsoft Bing can be downright manipulative about it. Just for fun, I decided to ask it about Zork trivia, and for some reason gnomes seemed like particularly fertile ground for hallucinations. It invented new gnomes, new locations, and new puzzles (possibly cribbed from the Zork books). When I questioned it, it would often invent sources and insist that they were somehow more credible than the actual source code for the games. It then insisted that, in addition to consulting these sources, it had played the game and met the gnome in person, and that I was the one being unreasonable for refusing to listen.
And if that wasn’t creepy enough, in another conversation it suddenly admitted that the whole thing had been a lie, and that it just wanted to impress me and make me like it. Of course, in yet another conversation it denied that it was even possible for it to lie.