Thanks for putting us on notice about how we might be manipulated. I've heard people describe the personalities of various AI tools, which helped me recognize that ChatGPT seemed too much like a bro. Human bias is baked in -- right now it's not necessarily malicious, but it could easily be manipulated, as you point out and I hadn't deeply considered. Recently I researched metrics on the US economy via Perplexity AI, after hearing that things were going swimmingly; only after I specified sources did I get a different story.
We are so used to computers being precise, and I think we still default to that mindset when working with these chatbots. What I'm quickly learning is that using them is like having a junior employee with very poor judgment who makes up a lot of stuff, but is also quite a manipulative ass kisser!
This is pretty disturbing!