A lot of investors are now jumping on the artificial intelligence (AI) bandwagon, especially with the ever-increasing popularity of the AI chatbot ChatGPT. Of course, when there’s a lot of money involved, scammers are quick to pick up the scent like hungry vultures.
Hundreds of ChatGPT Tokens Issued
Scheming market participants have been quick to capitalize on the craze around ChatGPT by creating and issuing fake tokens tied to it. It should be noted that none of them has any official affiliation with the chatbot.
According to a report, hundreds of these fake tokens have been issued in the past couple of weeks on the back of ChatGPT’s highly publicized potential. The hype was further fueled when Microsoft announced that it would integrate a version of the chatbot into its Bing search engine.
Many of these tokens are named BingChatGPT, with symbols like BINGCHATGPT and BCGPT, to lure in unsuspecting investors hoping to make a quick buck. Blockchain security firm PeckShield traced most of the issuances to major crypto platforms such as BNB Chain, Ethereum, Solana, Cronos, OKChain, and Arbitrum. Breaking them down, the numbers as of Monday are the following:
- 132 tokens on the BNB Chain
- 25 tokens on the Ethereum blockchain
- 10 tokens across Solana, Cronos, OKChain, and Arbitrum
Most of these have recorded trading volumes in the thousands of dollars despite the red flags associated with them. One was traced back to a serial rug puller, while others exhibited sales taxes as high as 99%, slippage ranging from 65% to 100%, and honeypot features preventing new buyers from selling their acquired tokens.
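To see why those figures are ruinous for buyers, here is a rough back-of-the-envelope sketch using the tax and slippage numbers cited above. It is a deliberate simplification, not a model of any specific token: real AMM slippage depends on trade size relative to pool liquidity, and a full honeypot blocks the sale outright.

```python
def sale_proceeds(amount: float, sales_tax: float, slippage: float) -> float:
    """Rough estimate of what a seller actually receives once the
    token's transfer tax and pool slippage are both applied.
    Simplified: assumes the two cuts compound multiplicatively."""
    return amount * (1 - sales_tax) * (1 - slippage)

# Selling $1,000 worth of a token with a 99% sales tax and 65% slippage
# leaves only about $3.50 -- and at 100% slippage, nothing at all.
print(sale_proceeds(1_000, 0.99, 0.65))
```

Even at the low end of the reported ranges, a seller recovers well under 1% of the position, which is why such parameters are treated as scam signatures rather than design choices.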
As of this writing on Tuesday, 170 ChatGPT-related tokens are floating around decentralized exchanges such as PancakeSwap and Uniswap, based on data released by price analytics provider DEXTools.
BingChat Throws Tantrums
Bing Chat is a bespoke chatbot touted by Microsoft as a new and improved, next-generation version of OpenAI’s ChatGPT. It is designed to give users a more seamless way to search for information online. The company said the product builds on lessons learned from the GPT-3.5 series of language models to offer users “more accurate and more capable” results than its predecessor.
Although it was promoted as an enhanced version of ChatGPT, Bing Chat appears to be far from a superior product. In threads shared on Twitter and Reddit, the chatbot at times seemingly struggled to find words and even lost its cool when asked certain questions.
As displayed in the screenshots above, Bing Chat appeared to have a hard time answering a question about its sentience. At a loss for words, the chatbot simply kept repeating the sentence “I am not” to the user. Adversarial questions raised by testers also prompted emotional responses from the AI, which eventually referred to itself as “Sydney”.
While this isn’t the first time an AI chatbot has gone off the rails and produced unexpected results — Microsoft’s Tay posted offensive, racist, and sexist remarks, and Amazon’s Alexa has been reported spewing anti-Semitic rants — it shows that AI language tools still need plenty of fine-tuning.
Final Thoughts
Projects combining AI and crypto are seen as the next logical step for both industries. However, investors must be wary of bad actors looking to capitalize on the hype. As always, due diligence must be exercised before entering into a transaction with any venture.
Giancarlo is an economist and researcher by profession. Prior to joining Blockzeit’s dynamic team, he handled several crypto projects for both the government and private sectors as a project manager at a consultancy firm.