Copilot Chat and Chat for Edge at Texas A&M University

Overview 

Microsoft Copilot (Copilot Chat and Chat for Edge) gives Texas A&M employees and students access to artificial intelligence (AI)-powered chat with data protection. Employees can use Copilot to get work done faster, be more creative, or support customers. Students can use Copilot to personalize learning, get help with brainstorming, or receive feedback.

  • Quickly generate content, analyze data, summarize documents and more.
  • Chat data is not saved and data will not leak outside the university.
  • Microsoft has no access to chat records and data isn’t used to train AI models.

Microsoft Copilot is powered by GPT-4. 

Access Copilot Chat at bing.com/chat and sign in with your Texas A&M NetID and password. Signing in with your NetID is what protects you and the university's data when using Copilot. You will know your data is protected when you see a green ‘Protected’ badge. Additionally, above the chat input box and on top of every chat answer, you should see a message confirming 'Your personal and company data are protected in this chat.'

[Image: Microsoft Edge browser interface showing the green Protected badge icon, indicating a successful sign-in to a Bing Chat Enterprise account.]

Copilot does not use any Texas A&M data or queries to train its models. All queries are encrypted in transit and are not stored by Microsoft. 

The consumer-based Bing Chat service should not be used with confidential or sensitive university data because that data may be used to train the AI model. There is also the potential to leak sensitive information when that data is fed back into the AI platform. 

There are many other widely available AI and Machine Learning (ML) tools on the market that should also be used with caution. Confidential or sensitive university data should only be used with approved university cloud resources and approved university storage. If you have questions about potential use of an AI/ML platform, please contact security@tamu.edu.

Copilot functionality

Copilot Chat differs from the consumer-based Bing Chat service in the following ways:

  • You must log in with your Texas A&M NetID account (@tamu.edu email address and password).
  • You will see the word 'Protected' at the top of the Microsoft Edge sidebar.
  • It is officially supported only in Microsoft Edge browser.
  • It supports up to 4,000 characters per conversation input.
  • It supports up to 2,000 characters per input in the Microsoft Edge sidebar.
  • It does not keep the chat history feature found in the consumer-based Bing Chat service.
  • Your chat data is not used to train the underlying models.
  • Your chat prompts and responses are not retained.
  • Your data put into Copilot Chat is protected and will not leak outside the organization.
  • Copilot Chat is not able to see your data in Microsoft 365 unless you specifically copy Microsoft 365 data and enter it into a prompt.

Important: Copilot Chat is supported officially in Microsoft Edge; however, you can use the browser-based version on Chrome, Firefox, Safari, Android, and iOS. Only Microsoft Edge supports the Copilot sidebar.

Copilot responses are not always accurate, so be aware of the possibility of incorrect information. Because Copilot is grounded in web search results, it is generally more accurate on factual topics than the free version of ChatGPT; however, the web itself contains incorrect information. Be prepared to verify and fact-check answers on factual topics.

Frequently Asked Questions

What kinds of things can students do with Copilot?

Students can use Copilot to personalize learning, get help with brainstorming, or receive feedback. For example, Copilot can help students take better notes, sharpen their writing skills, refine their study habits, and much more, all while their data is protected. You can ask Copilot to help you:

  • Improve the way you study: “Give me a quiz to see what I remember about enzyme kinetics for biochemistry.”
  • Take better and more efficient notes: “If I paste in my lecture notes, can you identify the key ideas and give me some mnemonics to help me remember each topic?”
  • Improve your writing: “If I paste in a draft of my essay, can you help me revise it?”
  • Manage stress: “How can I track my progress this semester and adjust my plans as needed?”
  • Focus on your work: “Ask me questions to help generate a plan to help me focus this semester.”

What kinds of things should students be aware of with Copilot and other generative AI tools?

While Copilot and other generative AI tools are powerful resources and can assist students in many ways, there are certain things students should avoid or be cautious about when using them. Specific examples of what to be aware of include:

  • Plagiarism
  • Cheating on assignments or exams
  • Not providing proper attributions or citations
  • Not understanding generated information
  • Overreliance on AI

How do I know if I’m using Copilot instead of Bing Chat?

Employees and students using Copilot will be signed in to bing.com/chat using their Texas A&M account. Once you sign in, you will see that the Copilot design looks different from the consumer version of Copilot. Above the chat input box and on top of every chat answer, you will see a message that says, “Your personal and company data are protected in this chat.” 

I am signed in with my TAMU account, but I do not see a Protected badge.

If you do not see the Protected badge, sign out and sign back in. The Protected badge should be present when you go back to bing.com/chat.

What kinds of things can employees do with Copilot?

Employees can use Copilot to get work done faster, be more creative, or support customers better. For example, Copilot can help employees quickly generate content, analyze or compare data, summarize documents, learn new skills, write code, and much more, all while their organizational data is protected. You can ask Copilot to help you:

  • Understand the implications of a decision: “What are the pros and cons of offline marketing strategies?”
  • Learn new skills: “What are the top 5 things I should know when managing a large project?”
  • Analyze data: “If we’re forecasting 7% adoption growth this coming semester, how does our internal forecast compare with resource availability?”
  • Summarize work-related PDFs open in Microsoft Edge Browser: “Recap the findings of this report and the top 3 concerns.” 
  • Write better code faster: “Write a regular expression in Python that matches email addresses.”
  • Generate social media content: “Use this messaging framework to generate 5 social media posts describing its value to university workers.”
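As an illustration of the code-writing prompt above, the kind of answer such a request might produce is a short Python snippet like the following. This is a hypothetical example written for this article, not Copilot's actual output, and the simple pattern shown matches common address formats rather than the full email specification:

```python
import re

# Illustrative pattern: matches typical addresses such as user.name+tag@example.edu.
# It is intentionally simple and is not a complete RFC 5322 validator.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_emails(text: str) -> list[str]:
    """Return all email-like substrings found in text."""
    return EMAIL_RE.findall(text)

# Example usage:
# find_emails("Contact helpdesk@tamu.edu with questions.")
# returns ["helpdesk@tamu.edu"]
```

As the warning later in this article notes, always verify and test generated code before using it, since Copilot responses are not always accurate.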

What information should employees not use in Copilot?

You should NOT use information classified as Confidential or Critical with Copilot. As part of the university's enterprise agreement with Microsoft, Copilot is approved to interact with data classified up to and including University-Internal data. You can use the Technology Services Data Classification Tool to determine if the data you want to use is appropriate. Microsoft Copilot is the recommended way to use generative AI within the university environment.

What information can I use with generative AI tools other than Copilot?

Generally speaking, only public information should be used with free or non-enterprise versions of generative AI tools. Specific examples of information NOT appropriate for free or non-enterprise versions of generative AI tools include:

  • Do NOT use data classified as “university-internal”, “confidential”, or “critical” data.
  • Do NOT use data considered student, faculty, and/or staff intellectual property (unless the individual submitting the intellectual property created it).
  • Do NOT share names and information about a real student, employee, research participant, or patient.
  • Do NOT ask an AI service to summarize and grade student papers or assignments.
  • Do NOT share employee-related data such as performance or benefit information.
  • Do NOT share grant proposals still under review.
  • Do NOT ask an AI service to generate code for Texas A&M systems that protect university data, or share Texas A&M source code for editing.

Do I have access to other Microsoft AI tools like Azure OpenAI and Azure AI?

Employees in a researcher or developer role who want access to other Microsoft AI tools, such as Azure OpenAI and Azure AI, can request a subscription to Texas A&M Azure in AIP. However, note that using these tools incurs a cost, and you will be asked for a billing number as part of the request.

For more information about Microsoft AI tools, visit: https://it.tamu.edu/cloud/providers/azure.php.

Am I limited on the number of queries I can ask in a day or week?

You can give Microsoft Copilot up to 30 prompts per session; that is, you can chat back and forth up to 30 times in one session. You are limited to 300 sessions per day.

When Copilot creates content or images, does anyone have a copyright?

No, content and images are not copyrighted.


Details

Article ID: 405
Created
Thu 5/2/24 9:57 AM
Modified
Thu 6/20/24 2:15 PM

Related Services / Offerings (1)

The "Instant Messaging Support" Service Offering allows for incidents regarding Microsoft Teams, Slack, or Google Chat.