AI Tools & Data Guidance for FOA Units
AI tools can streamline business tasks and make technology more accessible. To realize these benefits securely, robust risk management and data security measures are needed to protect your information and systems from misuse, abuse, and unauthorized access.
The University of California remains vigilant about information security, privacy, and governance to ensure responsible and safe use of these powerful tools. When using an AI tool at work, you should only use it with publicly available (P1) information unless campus leadership has authorized otherwise. For more information about permissible uses of AI and data security, visit "Sensitive Data and AI: Guidance from UC Davis Tech Leaders."
Guidance for Users
The vendors of the AI tools listed in the table below claim that no customer data is used to train their Large Language Models (LLMs). However, such claims should not be taken at face value: every third-party AI tool should be scrutinized in this regard so that users have a clear understanding of what happens to their data after it is processed by an AI, LLM, or machine learning (ML) model.
We encourage you to monitor an AI tool's performance regularly, checking its output for accuracy, reliability, and relevance. For best results, define the goals and intended use cases of the AI tool up front, then apply human review to confirm the output meets those goals; one way to make that review routine is sketched below.
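As a hypothetical illustration (not an FOA or UC Davis tool), a unit could log each human spot check of a tool's output and compute a simple accuracy rate over time. All names here, including SpotCheck, ToolReview, and the verdict labels, are assumptions made for this example.

```python
from dataclasses import dataclass, field

@dataclass
class SpotCheck:
    """One human review of a single AI output (hypothetical structure)."""
    prompt: str
    ai_output: str
    verdict: str  # "accurate", "inaccurate", or "irrelevant"

@dataclass
class ToolReview:
    """Running record of spot checks for one AI tool against its intended goal."""
    tool_name: str
    intended_goal: str
    checks: list[SpotCheck] = field(default_factory=list)

    def accuracy_rate(self) -> float:
        """Share of reviewed outputs a human judged accurate."""
        if not self.checks:
            return 0.0
        accurate = sum(1 for c in self.checks if c.verdict == "accurate")
        return accurate / len(self.checks)

# Example usage with made-up entries:
review = ToolReview("Microsoft Copilot", "summarize public (P1) meeting notes")
review.checks.append(SpotCheck("Summarize the agenda.", "Agenda covers budget items.", "accurate"))
review.checks.append(SpotCheck("List the action items.", "No action items found.", "inaccurate"))
print(f"{review.tool_name}: {review.accuracy_rate():.0%} accurate over {len(review.checks)} checks")
```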
Resources
- LMS Training: AI Essentials at UC Davis
- IET Gen AI Guidance and Resources
- PolicyWonk: UC Davis policy resource
- Rocky Aggie AI (beta): GenAI chat tool you can use to navigate your life at UC Davis

Vendor Risk Assessment
A Vendor Risk Assessment (VRA) is required for AI tools, just as it is for other software and associated services. For insight into any VRAs currently underway and for further guidance, consult the FOA VRA Team. If needed, submit a VRA request to the Information Security Office (ISO), which will determine whether a full assessment is required.
The AI tools specified in the table below have already been evaluated through a campus-wide VRA. Therefore, FOA users are not required to conduct a separate VRA for these tools.
Other Tools Available for Use
- ChatGPT (VRA required)
AI Tools & Product Overview
The tools listed in the table below have undergone a Vendor Risk Assessment (VRA) by the UC Davis Information Security Office (ISO). The Data Security row defines the approved use of each tool as advised by the FOA Technical Unit Information Security Lead (UISL).
| Feature | Microsoft Copilot | Otter.ai |
| --- | --- | --- |
| Description | A generative AI chatbot designed to assist users by providing intelligent suggestions and automating repetitive tasks. | Transcription software that provides live captions for speakers and generates written transcripts of speech. |
| Data Security (approved use cases) | Up to P3 | Up to P4 |
| Cost | Included for UC Davis staff | Inquire with Disability Management Services |
| AI Models | GPT-4, GPT-4 Turbo, DALL·E 3 | GPT-3 |
| App Integrations | No | Zoom, Teams, Slack, and more |
| Extended Memory | No | No |
| Image Analysis | Yes | No |
| Image Generation | Yes (using DALL·E 3) | No |
| Internet Responses | Yes (using Bing) | N/A |
| Saved Chats | No | Yes |
| Text Extraction | Yes (from images only) | Built-in tools allow you to interact with and edit the transcribed text |
| UCD Authentication | CAS / SSO | TBD |
| Voice Chat | Yes | No |
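For units that build their own tooling around these rules, the protection-level ceilings in the Data Security row can be encoded as a simple pre-flight check before data is sent to a tool. The sketch below is a hypothetical illustration, not an ISO-provided API; the function name, the dictionary, and the numeric encoding of P-levels are assumptions made for this example.

```python
# Per-tool ceilings from the table above, expressed as UC protection levels
# (P1 = publicly available through P4 = high risk).
MAX_PROTECTION_LEVEL = {
    "Microsoft Copilot": 3,  # approved for use with data up to P3
    "Otter.ai": 4,           # approved for use with data up to P4
}
DEFAULT_CEILING = 1  # tools without a completed VRA: P1 (public) data only

def data_allowed(tool: str, data_level: int) -> bool:
    """Return True if data classified at `data_level` may be used with `tool`."""
    return data_level <= MAX_PROTECTION_LEVEL.get(tool, DEFAULT_CEILING)

print(data_allowed("Microsoft Copilot", 3))  # True: P3 is within Copilot's ceiling
print(data_allowed("ChatGPT", 2))            # False: no completed VRA, so P1 only
```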