
Regulations for the use of Microsoft Copilot (Bing Chat Enterprise)

Bing Chat Enterprise – Regulations

This page contains information about the generative artificial intelligence service and language model Microsoft Copilot, formerly known as Bing Chat Enterprise; the tool is referred to here as Bing Chat Enterprise. It is available to employees at the University of Oslo.

What can I use Bing Chat Enterprise for?

You can use the tool to process:

  • Information that can or should be available to everyone without specific access rights (green data)

The majority of the information that the university manages is open, either as a result of the goals and purposes of the university's activities or as a result of requirements for openness in laws, regulations, and other rules governing public administration and business. Other parts of the information have no protection requirements, even though they are not openly accessible.

What can I not use Bing Chat Enterprise for?

You cannot use the service to process information at a higher classification level than open or freely accessible (green), such as limited (restricted) data, and you are not allowed to include personal data, neither your own nor that of others.

What should I consider when using Bing Chat Enterprise?

When using Bing Chat Enterprise, you should consider the following:

  • Consider what you will use the service for. Remember that artificial intelligence is just a tool and does not replace critical thinking and evaluation skills. It is important to be aware of the limitations of artificial intelligence and not rely solely on it for decision-making and problem-solving.
  • The laws and regulations still apply, even if they are not specifically adapted to artificial intelligence.
    • Example: If the service is used as a support tool in public administration, make sure that you can meet the requirements for transparency and justification of a decision, as stated in, among other provisions, Section 25 of the Public Administration Act. An important condition in the provision is that the justification must refer to the factual circumstances on which the decision is based. To identify the factual circumstances emphasized by the AI system, you must understand how the model works and be able to explain the factors that led to a given outcome.

  • Always ensure the quality of the content generated by the service.
  • Remember that any infringement on individuals' rights requires a legal assessment.
  • Be critical of the answers provided by the service. Expect that the language model may generate inaccurate information presented in a credible manner.
  • Be aware that AI can be discriminatory and biased. AI is only as good as the data it is trained on.

  • Always assess the risk against the benefit of using the service:
    • Identify - what unwanted events can occur?
    • Analyze - what are the consequences if these events happen?
    • Evaluate - can you accept this risk or do you need to implement risk-reducing measures?
  • Assume that the references given by the service may be incorrect. 
  • If the AI service has performed a task for you, you must still be able to explain the process. 
  • Manually verify the answer provided by the language model before forwarding or using it in another context. 
  • Be transparent and open about the use of AI. When using and disseminating answers, consider whether text generated by Bing Chat Enterprise should be labelled as produced with the help of AI.
  • Do not use the language model to process personal information, either about yourself or others. 
  • Do not use the language model to handle cases involving students or employees. 
  • Be aware of the ownership of the answers generated by the service and ensure that you do not commit plagiarism or infringe on others' rights.
  • Keep in mind that you may not get the same answer twice - do not use the language model for tasks that require integrity. 
  • Never share information with the service that it is not approved to handle.
    • Do not input information that is confidential, such as internal documents or copyrighted works.
  • Test and familiarize yourself with the tool before using it in critical processes.
Published Jan. 2, 2024 10:55 AM - Last modified Jan. 11, 2024 1:27 PM