Microsoft has introduced a new ChatGPT-style artificial intelligence (AI) assistant that will become part of its Office apps from 1 November, following a period of testing.
The Microsoft 365 ‘Copilot’ can summarize every meeting in Teams, which will be helpful for those who don’t want to attend a meeting themselves.
In just a few seconds it can be used to write emails, create Word documents, generate spreadsheet graphs and even produce PowerPoint presentations.
Microsoft hopes that Copilot will reduce workloads, but some technology experts fear that it could replace humans.
There have also been concerns that these apps will increase our reliance on artificial intelligence.
The app will also have to comply with new rules on AI, as it currently does not indicate when a piece of content was not created by humans.
Artificial intelligence legislation in Europe and China stipulates that people must know when they are interacting with AI and when they are interacting with humans.
Whether to disclose that is up to the person using Copilot, said Colette Stallbommer, head of Microsoft 365. “It’s a tool that people will be responsible for using.”
The content may not make it clear that it was created with the help of an AI assistant, but throughout the process a human remains in control.
However, European lawmakers have said that this obligation will fall on the companies making AI tools, to ensure their responsible use.
Before the launch, I was given the opportunity to try Copilot. It is based on ChatGPT technology developed by OpenAI, a company in which Microsoft has invested billions of dollars.
I tested it on the laptop of Microsoft staff member Derek Snyder. Copilot is linked to an individual user’s account, giving it access to that account and to company data.
Microsoft says this data is kept secure and is not used to further train the technology. “You only have access to the data you’re allowed to see anyway,” Stallbommer said. “It [the AI] respects the data policy.”
At first glance, Copilot seems a useful tool. It could be a good companion for office workers, especially at companies looking to cut costs.
I found that Copilot easily summarized mock product launch emails in a matter of seconds.
It then suggested a short reply. Using one of its options, we made the reply longer and softer in tone. The chatbot wrote a warm response that appreciated all the ideas while expressing interest in the project. At no point during this had we read anything about the project ourselves.
We were then given the option to edit the email before sending it. Nowhere in the email did it say that the content had been created with Copilot’s help.
I also watched the tool create a PowerPoint presentation in just 43 seconds. It was given the content in a Word document and used the same images contained in that document, though it can also find royalty-free images on its own.
The result was a simple, effective presentation, complete with suggested notes to be read alongside it.
When I asked it to make the presentation a little more ‘colorful’, it did not understand my request and told me to use the PowerPoint tools myself.
Finally, we looked back over a Microsoft Teams meeting.
Copilot was able to identify the different topics and summarize the discussion. It can pick out points of agreement and disagreement and list the pros and cons of a proposal, all in only a few seconds.
It is programmed not to answer questions about a person’s performance during a meeting, such as who was the best (or worst) speaker.
I asked Snyder whether anyone would even bother attending meetings when Copilot could do the hard part for them. “A lot of meetings will just become webinars,” he joked.
The technology cannot tell how many people are joining a Teams meeting from a single device; attendees have to announce this verbally.
Copilot will cost $30 per user per month. It requires an internet connection and does not work offline.
Critics warn that such technology could cut administrative jobs.
Chrissa Wells, an AI expert at the University of Oxford, said she worries that people will become too dependent on such tools.
What happens, she asked, if the technology fails or is hacked? It may also leave things out, or produce content you do not agree with.
And what if you become so used to it that you lose the ability to work without it?