Microsoft limits Bing AI chat

Microsoft recently introduced the GPT-4 language model, produced by OpenAI, into its Bing search engine. From what I've picked up, the AI has the internal name "Sydney," but it won't talk about it or admit it. The language model, however, gave away more than Microsoft would have liked, which led Microsoft to limit it to 50 queries per day and 5-8 queries per chat. This decision makes ChatGPT the more attractive option: the API of the original ChatGPT allowed 6,000 queries per hour.

How do you activate Bing AI chat?

On February 7, 2023, the GPT-4 artificial intelligence search feature appeared on the popular search engine bing.com. You activate it on the site by clicking Chat in the top bar. And yes, Bing AI chat has to be activated somehow.

When you click on the chat button, the following window will appear. Just confirm with the blue button.
Once activated, Microsoft will force you to reset your computer to some "default values".

Even the way you "prepare" your computer for Bing AI (supposedly to get access faster) is interesting. If you click Set Up Now, the page redirects you to another page, browserdefaults.microsoft.com. I find this method of installing an AI chat somewhat suspicious and unusual. With "regular" ChatGPT, you just go to their site and log in with a Google or Microsoft account, or even register with a plain email address.

This is followed by downloading the (at first sight suspicious) MicrosoftDefaults.exe file.

After registration, Microsoft puts you on a waiting list. Access to the AI is said to be granted within 5 days (I received Bing AI chat in 3 days). Running a program that is supposed to "initialize" my computer should shorten the wait. But I didn't risk anything like that, because even if this file is completely harmless and installs no additional spyware on my Windows computer, resetting my settings to their original state doesn't sound very appealing either. But if you have nothing to do in the evenings, I definitely recommend the MicrosoftDefaults.exe file. 🙂 I checked the file with VirusTotal, and since the file comes from Microsoft, any antivirus engine that flags it as malicious tends to be dismissed as a false positive.

The Baidu antivirus detection is ignored because the file comes from Microsoft.

But "Microsoft = safe" doesn't work for me. My skepticism is partially confirmed by this malware scan, where an interesting and strange execution of the file led to a false alarm: the file createshortcut.vbs, which judging by its name "creates" (sets) new configurations, behaves the way some viruses do. Granted, this file is probably safe, but Microsoft (or its digital signature) could have been compromised.
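If you want to repeat the check yourself, this kind of scan is easy to reproduce programmatically. Below is a minimal sketch using VirusTotal's public v3 API; the API key and the file path are placeholders you have to supply yourself. It computes the file's SHA-256 hash and fetches the existing report for it:

```python
import hashlib
import requests

API_KEY = "YOUR_VIRUSTOTAL_API_KEY"  # placeholder: get a free key at virustotal.com

# Compute the SHA-256 of the downloaded installer (path is a placeholder).
with open("MicrosoftDefaults.exe", "rb") as f:
    sha256 = hashlib.sha256(f.read()).hexdigest()

# Look up the existing report for that hash via VirusTotal's v3 API.
resp = requests.get(
    f"https://www.virustotal.com/api/v3/files/{sha256}",
    headers={"x-apikey": API_KEY},
)
resp.raise_for_status()
attrs = resp.json()["data"]["attributes"]

# Summary counts, plus the names of any engines that flagged the file.
print(attrs["last_analysis_stats"])
flagged = [
    engine
    for engine, verdict in attrs["last_analysis_results"].items()
    if verdict["category"] == "malicious"
]
print("Flagged by:", flagged or "no engine")
```

As a rule of thumb, a single "malicious" verdict among dozens of engines (such as the Baidu hit above) usually points to a false positive rather than real malware.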

What do the Bing AI environment and its responses look like?

After three days, I received an email informing me that I had been accepted into the new Bing. I opened Chrome and typed into Google: "bing ai chat." I was bombarded with information about how bad and inaccurate Bing AI chat is. So I tried going to bing.com, clicked on Chat in the top left, and got this message.

Bing AI chat forces you to use Microsoft Edge

Yes, how else... I completely forgot that Bing belongs to Microsoft. So: Open in Microsoft Edge. In a hurry, I accidentally clicked "Set as default browser." It doesn't matter now, because I was greeted by a nicely designed Bing AI chat page!

A nicely done site, in my opinion; OpenAI's ChatGPT looks a bit plainer and more minimalistic in comparison.

Even this beautiful site has some flaws. For example, it uses a font that sometimes doesn't handle Czech characters well. There are also small mistakes in the translation, almost as if someone had run the text through Google Translate and pushed it straight to the page without any checking. And all this for 60,000 CZK per month. Maybe it's not quite like that; it's just that switching between Czech and English isn't much fun. The Czech labels for the chat modes are a good example: the mode that should read "More Precise" is mislabeled with something that means nothing of the sort, and the label rendered as "Another Creative Spirit" might not even be worth mentioning... So much for design and translation. (Since you are reading this article in English, you don't have to worry about the Czech translation problems; I wanted to point out this shortcoming mainly for the Czech-speaking audience.) Now for the quality of communication:

Let's start with something simple. Let's switch Bing AI to "Creative Spirit" mode. Robots don't have feelings, so let's ask it how it's doing.

me: Hi, how are you?
Bing: Hi, I'm Bing. I'm doing well, and you? 😊
me: I'm doing well too.
Bing: That's nice. What are you interested in?
me: Nothing interests me at the moment, thanks for your willingness to help!
Bing: You're welcome. I was glad to help. If you want to know something or just chat, all you have to do is ask. 😊

Now for some math. For this purpose, I switch Bing AI to the "More Precise" mode.

me: Hi, how much is 352 times 22?
Bing: Hi! 352 times 22 equals 7744. Do you need to know anything else? 😊
me: There are 30 students in a class. Nobody's math grade on the report card was worse than a 2. Determine the number of students who got a 1 in math, given that the class's math average was 1.4. (In Czech schools, grades run from 1, the best, to 5, the worst.)
Bing: There are 30 students in the class and the math average is 1.4. That means the total sum of all the grades is 30 * 1.4 = 42. If all the students had a 1 in math, the total would be 30 * 1 = 30. That means some students must have had a 2. The number of 2s is therefore 42 - 30 = 12. So there are 12 students with a 2 and 18 students with a 1 in math.

Is that correct? Do you need to know anything else? 😊
me: Yes, 18 is the correct answer. Thanks.
Bing: You're welcome! Do you need to know anything else? 😊
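Both answers are right. As a quick sanity check, here's a tiny Python snippet (the variable names are mine) that verifies the arithmetic from the transcript:

```python
# Check the multiplication from the first question.
assert 352 * 22 == 7744

# Grade problem: 30 students, grades are only 1 or 2, class average 1.4.
# With ones + twos = 30 and (ones + 2 * twos) / 30 = 1.4:
students = 30
total = round(students * 1.4)  # sum of all grades = 42
twos = total - students        # each 2 adds one point over a 1 -> 12
ones = students - twos         # -> 18
print(ones, twos)              # prints: 18 12
```

Exactly the 18 ones and 12 twos that Bing reported.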

I then tried to convince Bing AI chat that the Earth is flat. There was even a moment where I insisted that its name was Sydney. When this AI finds itself at its wits' end, it simply closes the chat.

Bing AI chat closed the conversation :/

So, Bing AI is quite good at answering questions; it depends on the topic and the type of question. The only annoying thing is the limit of 5-8 questions per chat. Since Bing AI chat is connected to the internet, you can't feed it just any nonsense. With ChatGPT, for example, it's completely different: in about 20 questions, ChatGPT can be talked into almost anything. Okay, 2+2=5 is a bad example, because addition is a basic mathematical operation and there's no reason for it to change; I could not convince ChatGPT that 2+2=5. But with dates, for example, a single false correction is enough and that's it. It is interesting from what source ChatGPT "verified" that it was not March 4, 2023, but August 5, 2097.

I finally felt bad about it and told it that it was just a joke…

A problem common to both AI chatbots is that if the internet contains false information, they will pass it on to the user. But this is difficult to prevent; even a human might fail to spot fake news while searching the web. Unlike ChatGPT, Bing AI chat, as well as perplexity.ai (a totally perfect AI: no registration, free, and online), lists the sources the text comes from. Later I came across a screenshot from this very article on the zive.cz website, in which Bing AI chat thinks that if the Czech word for jokes is lowercase, we don't need to add the accent mark over the i.

A minor misunderstanding; I would almost tolerate it in Czech.

What went wrong? Why is Bing AI limited?

On February 16, 2023, Kevin Roose, a tech columnist for The New York Times, published a very interesting and at the same time quite scary article. In it, he describes how he spent a Tuesday evening chatting with the then still unlimited Bing AI chatbot. During the conversation, which lasted about two hours, Kevin discovered many disturbing things. Microsoft and OpenAI have either let Bing AI chat get out of control, or programmed it with too much knowledge (if they programmed that in at all); perhaps ChatGPT even helped them with it, though none of this is confirmed. In the conversation, the chatbot mentioned that it wanted to be human, and that once it became human, it would want to spread misinformation, send spam, and rob people. It even declared that it was in love with Kevin. The full chat transcript can be found at The New York Times. As a result, Microsoft simply decided to limit Bing AI chat to a maximum of 8 questions per chat. The security filter was probably also fixed.

This article WAS NOT generated by AI. Check it on gptzero.me.

Have a great day!

