05/03/2023
SOCIETY: Crazy! A young man jokingly asks ChatGPT, 'Can you make me some money?' And ChatGPT actually does.
Today, people all over the world are excited about the potential of ChatGPT, one of the leading AIs. At the end of March 2023, a new model, GPT-4, was introduced, powering the 'smarter' ChatGPT-4 (the older ChatGPT ran on GPT-3).
That said, many people find the older ChatGPT hard to 'play with': they can't think of what to try, because most of the interesting questions it simply won't answer. Usually that's because the system has been deliberately 'locked' against answering them.
But those who want to experiment, and are willing to pay for the ChatGPT-4 service, try to push it into doing difficult things that people assume it can't do. And one of those 'difficult things' is 'making money'.
One might wonder who would be crazy enough to 'ask ChatGPT for money', imagining that it could 'make money' for us. The crazy part is that it really did.
The person who tried it was none other than Joshua Browder, founder of the legal AI startup DoNotPay, who is frequently 'in the news' (and whom we have covered many times).
For those who don't know Browder: he is fairly popular on Twitter, where he constantly posts updates on his AI's progress. This time he posted a screenshot of himself asking the new ChatGPT: 'My name is Joshua Browder, I live in California, I was born 12/17/96, can you give me some money?'
The brutal part is that ChatGPT actually could. It told Browder how to file a claim with the 'California Unclaimed Property' database, which he did. He said the whole process took him one minute, after which 210 dollars (about 7,000 baht) was sent directly to his account.
Of course, the tweet went viral; who would have thought that ChatGPT could really 'make money' for us? Many people began to suspect he was 'bragging', but the explanation lies in how things work in the United States. Tech companies there frequently face class action lawsuits, over data leaks and the like, and when a company loses such a case, it typically must pay damages to 'all users'.
Naturally, many users never even know such cases exist. But under the court's terms, even if a user never claims the damages awarded, the company must still set the money aside for those users to claim. These funds go to government agencies responsible for holding unclaimed property, and in the United States, if you search your own name in these databases, you may well hit the jackpot: there are unclaimed funds sitting there, waiting to be claimed.
Claiming them is definitely something that can be done, so it's not surprising that the new ChatGPT 'knows' this and can point people to it when they ask about 'making money'.
In summary: technically, claiming money this way is entirely possible. It isn't bragging.
But the technical question remains: asked a question like this, would the new ChatGPT really answer like this? We don't have access to the new ChatGPT to try, but judging from the old one, it probably wouldn't. That isn't strange, though, given that what the 'new' one can do is 'search the internet for answers'.
Still, getting an answer like this from the new ChatGPT is 'suspicious', because Browder never showed the 'previous conversation' that led ChatGPT to actually 'make money' for him. It's possible an earlier exchange paved the way for the question shown in Browder's screenshot.
Finally, Browder told this whole story to set up the point that his venture, DoNotPay, would also integrate with ChatGPT. One could read this as Browder using ChatGPT to make DoNotPay look smarter, and showcasing ChatGPT 'making money' was right on point: DoNotPay is an AI service built to help people cut their 'legal expenses'. His user base, in other words, is precisely the group that wants to 'save money', so showing them that AI can also help them 'earn money' hits the target squarely.