Alphabet Inc. is cautioning employees about how they use chatbots, including its own Bard, even as the company markets the programme around the world, four people familiar with the matter told Reuters.
According to the report, the company has advised its workers not to enter confidential information into AI chatbots. Alphabet confirmed this, citing its long-standing policy on safeguarding information.
Chatbots such as Bard and ChatGPT use generative artificial intelligence to hold human-sounding conversations with users and answer a wide range of prompts. The report also noted that human reviewers may read the chats, and researchers have found that similar AI models can reproduce the data they absorbed during training, creating a leak risk.
Some of the people also told Reuters that Alphabet has warned its engineers to avoid direct use of computer code that chatbots can generate.
Asked for comment, the company said Bard can make undesired code suggestions but still helps programmers. Google also said it aimed to be transparent about the limits of its technology.
The concerns show how Google is trying to avoid business harm from software it launched in competition with ChatGPT.
Google is racing against ChatGPT's backers, OpenAI and Microsoft Corp., with billions of dollars in investment, advertising, and cloud revenue from new AI programmes at stake.
Google's caution also reflects what is becoming a security standard for corporations: warning employees against using publicly available chat programmes.
A growing number of businesses around the world, including Samsung, Amazon.com, and Deutsche Bank, have set guardrails on AI chatbots, the companies told Reuters. Apple, which did not respond to requests for comment, is reported to have done the same.
As of January, about 43% of professionals were using ChatGPT or other AI tools, often without telling their bosses, according to a survey of nearly 12,000 respondents, many of them at top US companies.
Politico reported on Tuesday that Google was postponing Bard's launch in the EU this week until it learned more about the chatbot's impact on privacy; Google told Reuters it has had detailed conversations with Ireland's Data Protection Commission and is addressing the regulators' questions.
Concerns about private data
Such technology can draft emails, documents, and even software itself, promising to speed up tasks enormously. But this content can also include misinformation, sensitive data, or even copyrighted passages from a "Harry Potter" novel.
A Google privacy notice updated on June 1 states: "Don't include confidential or sensitive information in your Bard conversations."
Some companies have developed software to address these concerns. Cloudflare, for instance, which defends websites against cyberattacks and offers other cloud services, is marketing a capability for businesses to tag data and restrict it from flowing outside the organisation.
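To illustrate the general idea of tagging and blocking sensitive data before it leaves a business, here is a minimal sketch of a data-loss-prevention style check. This is not Cloudflare's actual product or API; the pattern names and regular expressions are hypothetical examples.

```python
import re

# Hypothetical patterns a company might tag as sensitive; real DLP systems
# use far richer rules (fingerprints, classifiers, document labels).
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
    "internal_tag": re.compile(r"\[CONFIDENTIAL\]"),
}

def flag_outbound_text(text):
    """Return the names of sensitive patterns found in text that is about
    to be sent to an external chatbot; an empty list means nothing matched."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def allow_send(text):
    # Block the outbound message if any sensitive pattern matches.
    return not flag_outbound_text(text)
```

In this sketch, a prompt like `"Quarterly numbers are [CONFIDENTIAL]"` would be blocked, while an innocuous question would pass through unchanged.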
Google and Microsoft also offer conversational tools to business customers that carry a higher price tag but refrain from feeding data into public AI models. By default, Bard and ChatGPT save each user's conversation history, which users can opt to delete.
Yusuf Mehdi, Microsoft's consumer chief marketing officer, said it "makes sense" that companies would not want their staff to use public chatbots for work.
Explaining how Microsoft's free Bing chatbot differs from its enterprise software, Mehdi said companies are being appropriately conservative. "There, we have much stricter rules," he said of the enterprise offering.
Microsoft declined to say whether it has a blanket rule against employees entering confidential information into public AI programmes, including its own, though a different executive at the company told Reuters that he limits his own use.
Matthew Prince, Cloudflare's CEO, said that typing confidential information into chatbots was like "letting a bunch of PhD students loose in all of your private records."