Getting started with AI for students

Note: Students must refer to the University’s Guidance on safe and responsible use of GenAI tools for study. You should always follow the advice and guidance of your tutors and/or supervisors, and consult your department or faculty in relation to ethical use.

The role of generative AI in learning

Generative AI can be used in a variety of ways to support learning. These include:

  • Supporting cognition: For example, by helping to break down complex concepts or materials, or by creating visual or audio versions of them.
  • Supporting organisation: For example, by organising notes or creating actionable learning plans.
  • Supporting skills practice: For example, by creating supporting tools such as custom bots to assist with practice activities.

What you should know about generative AI before starting

Generative AI is a term that covers tools powered by Large Language Models (LLMs). LLMs give these tools the ability to work with natural language and images. There are three broad types of tools that can be powered by LLMs:

  1. Chatbots such as ChatGPT, Claude, Microsoft Copilot or Google Gemini;
  2. AI products such as NotebookLM (organising and exploring notes), Elicit (research), Gamma (presentations), Lovable (application builder)
  3. AI features in existing products such as organisation features in Notion, document summaries in Google Docs or Gmail, etc.

Before you start, make sure you:

  • Understand the general strengths and limitations of generative AI tools
  • Get to know the specific tool you are using
  • Consider whether using generative AI is causing you to bypass rather than augment your learning

Important: Generative AI tools are unlike traditional software. You are not just learning the features and interface but also a whole new mode of interaction. Make sure you spend enough time learning about how they work before you start relying on them.

The outputs they produce may contain critical errors, so build in processes for reviewing them critically and checking their accuracy.

Policy and guidance

The University of Oxford supports the responsible use of generative AI to enhance learning while maintaining academic standards. Clear policies ensure AI tools support rather than undermine educational goals.

Students must understand and follow the guidance on safe and responsible use of GenAI tools. Each course and assessment may have different requirements. Always check the specific guidelines provided by your tutors, supervisors, department or faculty, and in your course handbook.

Key principles include:

  • Transparency about any AI use in academic work
  • Following policies on AI use in summative assessments without exception
  • Maintaining academic integrity and personal accountability
  • Developing authentic understanding alongside AI assistance

When using AI tools with University data, always follow information security guidelines and use only approved platforms when working with confidential or sensitive information.

Capabilities and limitations

What generative AI is designed for

Generative AI is powered by Large Language Model (LLM) technology. This allows it to perform tasks such as:

  • Generating text in almost any language and style based on a prompt
  • Translating between languages, genres and styles
  • Following examples and instructions
  • Generating computer code and translating between programming languages
  • Generating structured lists and tables from text
  • Translating lists and tables into prose
  • Describing images and transcribing audio

Critical limitations

Large Language Models (LLMs) have limitations that should always be kept in mind if you are using them to support learning. For example, LLMs do not:

  • Give exactly the same response every time
  • Always provide reliable facts, figures or quotations
  • Perform reliable calculations on numbers
  • Report accurately on their internal processes

Remember, anything generated by AI must be treated as a first draft to be checked or a hypothesis to be tested. Never assume that because AI performs well on one task, it will work equally well on a task that seems very similar.

The context window:

Every AI tool has a limit on how much text it can consider at once. This includes your prompts, the AI's responses, and all previous interactions in a chat. Different models have different limits. For example:

  • ChatGPT Edu: Approximately 10,000 words for most models
  • Claude: Up to 100,000 words
  • Gemini: Up to 1 million tokens (approximately 750,000 words) for certain models

When working with long texts, aim to use no more than two-thirds of the stated limit to ensure quality responses.
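
To make the two-thirds guideline concrete, here is a minimal Python sketch. The word limits are the approximate figures quoted above (not exact token counts), and they change as models are updated, so treat the numbers as illustrative only.

  # A rough check of the two-thirds guideline, using the approximate
  # word limits quoted in this guide (illustrative figures only).
  APPROX_LIMIT_WORDS = {
      "ChatGPT Edu": 10_000,
      "Claude": 100_000,
      "Gemini": 750_000,
  }

  def fits_comfortably(word_count: int, tool: str) -> bool:
      """Return True if the text uses no more than two-thirds of the stated limit."""
      return word_count <= (2 / 3) * APPROX_LIMIT_WORDS[tool]

  # Example: a 60,000-word set of notes against Claude's approximate limit.
  print(fits_comfortably(60_000, "Claude"))   # True: within the two-thirds guideline
  print(fits_comfortably(80_000, "Claude"))   # False: consider splitting the text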

Tools and ‘thinking’:

Advanced models (often called ‘reasoning’ models) can break down problems into smaller steps and call on external tools. For example, they may draw on:

  • Code interpreters for calculations and data analysis
  • Web search for current information
  • Image generation for visual representations

Some models show their ‘thinking’ process, but remember this is generated text, not actual reasoning.

Hallucination risk:

AI tools generate plausible text that often includes accurate information but can also contain complete fabrications. This is particularly common with:

  • Numbers, dates, and statistics
  • References and quotations
  • Specific facts about people
  • Claims about what the AI has generated itself
  • Academic citations and page numbers

Privacy considerations:

Be cautious about information shared with AI tools. Always:

  • Check privacy policies and opt-out options
  • Use University-approved tools for work- or study-related activity
  • Avoid sharing any personal or confidential research data with non-approved AI platforms
  • Remember that free versions you access without your University SSO (abcd1234@dept.ox.ac.uk) may use your data for training AI models.

Verification and iteration:

  • Don't accept the first result; iterate and ask for improvements.
  • Don't assume that one experience is enough to understand how AI works.
  • Don't expect to be able to predict the quality of a response based on the last time you tried something similar.
  • If a conversation goes off the rails, abandon it and start over in a new one.
  • Treat everything you get from AI as a first draft or a hypothesis to be tested.
  • Never trust an AI tool when it tells you how it achieved a result.
  • Always check any number, quote, name or date generated by AI.
  • Many AI tools such as ChatGPT now include ‘search’ functions. Don’t assume that the search results are comprehensive, or that everything on the retrieved pages is accurately quoted or includes the necessary context.
  • Do not assume that internet search results are the best sources of information on a particular subject.

Understanding hallucination

The only thing generative AI does is produce plausible text. This often includes accurate information but it can just as easily be completely fictitious. This is called hallucination.

AI tools can hallucinate anything at any time in an extremely plausible way. Hallucinated text is often mixed with perfectly accurate information. Common hallucinations include:

  • Numbers, facts and figures
  • References and exact quotes
  • Suggestions about what the AI itself did

Tips for getting started

  1. Begin with low-stakes practice: Use AI for note organisation or study planning before applying it to assessed work.
  2. Experiment with different tools: Each platform has strengths - ChatGPT for general tasks, Claude for long texts, Gemini for multimedia.
  3. Document your process: Keep records of prompts and interactions, especially when AI contributes to assessed work.
  4. Build gradually: Start with simple tasks like flashcard generation before moving to complex applications like research synthesis.
  5. Stay informed: AI tools update frequently - join our communities to learn about new features and best practices.

The key to successful AI use in learning lies not in replacing your own understanding, but in enhancing your ability to engage with complex materials while demonstrating academic integrity and developing critical thinking skills.

Exploring different AI tools

Types of GenAI

General-purpose chatbots: Multi-functional AI assistants for diverse learning tasks - ChatGPT, Claude, Gemini, and Microsoft Copilot offer varying capabilities for text generation, analysis, and conversation.

Academic research tools: Specialised platforms for literature review and research - Elicit, Consensus, and Scholarcy provide targeted support for academic paper analysis and synthesis.

Study companion tools: Tools designed specifically for learning workflows - NotebookLM for note-taking, Quizlet with AI features for flashcards, and various subject-specific applications.

Product-specific tips

Here are some useful tips to bear in mind when using different GenAI platforms:

  • ChatGPT: The ‘Refresh’ button drafts a new response to the same prompt, but you can still view previous versions
  • Gemini: The ‘Redo’ button drafts a new response to the same prompt, but you can still view previous versions
  • Claude: Copy responses before regenerating, as previous versions disappear.

University-supported platforms

These tools can be accessed securely by logging in with your SSO email (abcd1234@ox.ac.uk). They are the only tools that have been approved for use with University data.

ChatGPT Edu: The University's version of ChatGPT with data protection. Features include voice interaction, canvas for live text collaboration, and Study Mode for exploring academic concepts. You can create custom GPTs for specific learning tasks or subjects.

Google Gemini: Google’s chatbot, powered by Google’s Gemini language models. The University-supported account you can access with your SSO (abcd1234@ox.ac.uk) has equivalent features to the free/public version but offers enhanced data security. You may want to explore using this for summarising videos or creating applets.

NotebookLM: Google’s notebook, powered by Gemini models. You can log into this with your SSO (abcd1234@ox.ac.uk) to collect multiple documents, make your own notes about them, or combine those with AI-generated insights. Where appropriate, you could also explore using them to create study aids such as mind maps, flashcards, quizzes, and even audio podcasts or video summaries.

Microsoft Copilot Chat: Microsoft’s standalone chatbot, powered by GPT models from OpenAI, is a limited alternative to ChatGPT, offered for free by Microsoft. You should log in with your SSO (abcd1234@ox.ac.uk) for enhanced data security. Note: The free version of Copilot Chat does not offer integration with Microsoft Office apps.

Specialised academic applications

Elicit: Analyses and compares research papers in tabular format. Searches academic literature using natural language questions and extracts key information systematically.

Consensus: Searches academic literature to provide overviews of agreement or disagreement on research questions. Synthesises findings across multiple papers.

Scholarcy: Provides reading assistance for academic papers including summarisation, key points extraction, and question-answering capabilities. It has a study mode and research mode.

Perplexity: AI-powered search engine which can restrict searches to academic sources.

Using generative AI tools to support learning

Anyone can take advantage of generative AI across various aspects of their learning process.

Note: Make sure you are familiar with the possible limitations of Large Language Models (LLMs) before considering using them in your studies. You should always follow the advice and guidance of your tutors and/or supervisors, and consult your department or faculty in relation to ethical use.

You may find it useful to try a few different AI tools, and you should be aware that different tools will give different outputs from the same prompt. Within a single tool, you may also get different outputs in response to the same prompt.

Advanced tools and techniques

Working with multiple documents and media:

NotebookLM: Upload multiple documents to explore connections between them. Particularly effective for synthesising information across different sources and identifying thematic links.

Google AI Studio: Accepts various file types including audio and video. Can generate summaries, lecture notes, or descriptions from multimedia content. For longer videos, speed them up before uploading to work within time limitations.

Transforming content across modalities:

Visualisations: Request mind maps, concept diagrams, or flowcharts to represent complex textual information visually. Tools like Gemini and Claude can generate code for interactive visualisations. There are also specialised tools for visualisation such as Napkin or Gamma.

Audio generation: Use tools like ElevenReader or Google AI Studio to convert text to speech for auditory learning. Suno can transform academic content into songs in various styles for memorable learning.

Visual to text: Upload diagrams, charts, or handwritten notes to tools like Gemini or ChatGPT to extract and structure information.

Deep Research and application building:

Deep research features: Platforms like Google Gemini have features that conduct multiple internal searches and analyses before compiling comprehensive reports with references. This can be useful for literature reviews and topic exploration.

Agentic coding: Platforms like Google AI Studio, Bolt, and Lovable allow creation of simple applications without extensive coding knowledge. You can build interactive study tools, physics simulations, or data visualisation apps by describing what you want in natural language.

Effective prompting strategies

Provide clear structure: "Write an explanation with bullet points, in three sections: 1. Key points to know, 2. Common misunderstandings, 3. Table with key terms."

Use personas: "You are an expert in linguistics famous for providing clear explanations..."

Request step-by-step reasoning: Add "think step by step" or "Let's work this out in a step-by-step way to be sure we have the right answer" to problem-solving prompts.

Specify academic level: Always indicate your level of study - "I'm a second-year undergraduate" - to ensure appropriate responses.

Example prompts:

For writing development

Getting feedback:

  • "Review this paragraph and suggest improvements to academic style"
  • "Identify any gaps in my argument"
  • "Suggest alternative phrasings for clarity"

Important caveats:

  • AI may subtly change meaning when rephrasing
  • Not all suggestions will be appropriate for your context
  • Always maintain your own voice and understanding
  • Never present AI-generated text as your own work

For organisation and study planning

Converting notes:

  • “Transform my notes into tables, mind maps, or timelines”
  • “Generate flashcards from these notes with clear questions and answers”
  • “Create mnemonic devices for memorisation”
  • “Extract key concepts organised by theme”

Study scheduling:

  • “Create a study timetable based on my available study time”
  • “Generate study priority lists from my course materials”
  • “Extract action items from these lecture notes”
  • “Suggest effective revision strategies”

For language learning applications

Conversation practice tips:

  • Specify your proficiency level explicitly
  • Define conversation topics and contexts
  • Request feedback on grammar and vocabulary
  • Practice at incrementally increasing difficulty

Limitations to consider:

  • Grammar may be imperfect in morphologically complex languages
  • Consistency at beginner levels can be poor
  • Context windows are shorter for non-Latin scripts
  • Less common languages may have limited support

Responsible use principles

Core academic integrity

Never present AI work as your own: Any use of AI must be properly acknowledged according to your course requirements. This includes paraphrased content, structural suggestions, and generated examples.

Follow assessment guidelines: Always check and strictly follow the AI use policy for each assignment. When in doubt, ask your tutor or lecturer for clarification.

Maintain accountability: You remain fully responsible for any work submitted, including any errors or inaccuracies from AI assistance.

Support from the University’s AI Competency Centre

Training and Workshops: Access our workshops and webinars which are designed specifically for members of the University. Popular topics include responsible AI use, effective prompting, and subject-specific applications.

Community Resources: The Oxford AI Student Society provides a platform to educate, build, and connect an interdisciplinary AI community.

Additional resources

Getting started guides: Use our onboarding guides for ChatGPT Edu, Google Gemini, and Microsoft Copilot, to make sure you have got the optimal configuration for your use of generative AI.

Foundation materials: If you are new to AI, we recommend starting with our Generative AI for Beginners guide, before exploring subject-specific applications.

Subject-specific resources: Many departments offer discipline-specific guidance on using AI. Appropriate use of AI varies between departments and courses. Check your department's website for advice, and if you are an Oxford student, ask your tutor or supervisor about available resources.