Frequently Asked Questions

General

What is "AI" and what is "generative AI"?

Artificial intelligence (AI) is a broad term for computer systems that perform tasks that typically require human intelligence. Generative AI is a type of AI that can create new content—like text, images, audio, video, code or summaries—based on prompts you provide. NIU’s AI Quick Start Guide and Student Guide to AI include resources that introduce tools like ChatGPT and explain what they can (and can’t) do well.

Is AI allowed at NIU?

AI use is often allowed, but the rules depend on context (course expectations, research requirements, data sensitivity, workplace policy, etc.). In courses, expectations are set by your professor and rules can vary not only by department but even across different courses within the same department.

Always review your syllabus and assignment instructions and ask your professor if you are unsure whether or how AI tools may be used for a particular assignment. NIU provides guidance and learning resources through the AI at NIU website and related university policies to help students, faculty and staff navigate these decisions responsibly.

What’s the most important rule to remember about AI?

You are responsible for what you submit or implement. AI can be helpful, but it can also produce errors, bias or fabricated details. Always verify outputs and follow the rules that apply to your situation (course, unit, policy, law) and consider disclosing what you used AI for. NIU’s AI guidance explicitly emphasizes critical use, ethics and accountability.

In addition, be mindful of the environmental impact of AI use and weigh the costs and benefits of each query: consider whether AI is necessary for the task, and use it thoughtfully when the benefits justify the resources and energy involved.

Can I trust AI outputs?

Treat AI output as a draft or starting point, not an authority. Verify facts with reliable sources, check calculations and confirm citations. Every AI output carries trustworthiness concerns and should be validated before you rely on it.

What about privacy and sensitive data?

Use caution before entering personal, confidential or sensitive information into AI tools. NIU’s AI Ethical Guidelines call for careful consideration of data sensitivity and "extreme caution" with personally identifiable information and vulnerable populations. In addition, NIU’s Intellectual Property Policy reminds faculty, staff and students to protect university-owned, faculty-created and student-created intellectual property when using third-party tools, including AI platforms.

Examples of personally identifiable or sensitive information include, but are not limited to:

  • Names, student ID numbers, employee IDs, usernames or passwords
  • Social Security numbers, driver’s license numbers, passport numbers
  • Contact information such as home addresses, personal email addresses or phone numbers
  • Academic records, grades, advising notes, disciplinary records or evaluations
  • Employment records, performance reviews or personnel files
  • Health, disability or counseling-related information
  • Financial information, such as bank details, tax information or aid records

Examples of intellectual property or proprietary content include, but are not limited to:

  • Faculty-created course materials (such as lectures, slides, assignments, exams, rubrics and instructional content)
  • Student assignment submissions, projects, papers, creative works or portfolios
  • Research data, manuscripts, grant proposals or unpublished scholarly work
  • University-owned materials, internal documents or non-public reports

Examples of vulnerable populations may include:

  • Students
  • Individuals with disabilities or documented accommodations
  • Patients or research participants
  • Individuals receiving academic, financial or mental-health support services
  • Employees or students involved in grievance, disciplinary or legal processes

When working with information in these categories, avoid entering it into AI tools unless the tool is institutionally approved and you have clear authorization to do so.

Why should I prefer NIU-licensed tools when possible?

NIU-licensed tools are selected and supported through university channels and are described as aligning with NIU’s AI ethical guidance. For example, NIU’s Microsoft 365 Copilot access includes commercial data protection within the platform, which means that:

  • Content you enter (such as prompts, documents or files) is not used to train public AI models
  • Your data is handled under enterprise-grade security and compliance standards, rather than consumer terms of service
  • Access is tied to your NIU account, helping ensure appropriate authentication, auditing and oversight
  • Data is subject to contractual protections that are consistent with institutional privacy, security and risk-management expectations

What about using AI for grading, feedback, or evaluations?

If you use AI to support feedback or evaluation tasks, do so carefully:

  • Maintain human judgment and final responsibility
  • Avoid sharing sensitive student information with non-approved tools
  • Consider bias, transparency and fairness

NIU’s ethical guidance highlights accountability, transparency considerations and human oversight.

What are the environmental impacts I should consider when weighing whether and how to use AI?

Using AI has environmental costs that are often invisible to users. Training and operating large AI systems require substantial amounts of electricity, computing hardware and cooling, which contribute to energy use, carbon emissions and electronic waste.

Key environmental considerations include:

  • Energy consumption: Large AI models run on data centers that use significant electricity, often continuously.
  • Carbon footprint: The energy used to train and run AI systems can contribute to greenhouse gas emissions, depending on how the electricity is generated.
  • Water use: Many data centers rely on water-intensive cooling systems, which can affect local water resources.
  • Hardware impacts: AI systems depend on specialized chips and servers that require raw materials to manufacture and eventually become electronic waste.

Because of these impacts, thoughtful use matters. When deciding whether and how to use AI, consider:

  • Whether AI is necessary for the task or whether a simpler tool would suffice
  • Using AI efficiently (for example, fewer prompts, clearer questions and avoiding unnecessary re-runs)
  • Favoring institutionally supported tools and platforms that align with sustainability, security and compliance goals
  • Being mindful that small individual choices, multiplied across many users, can have meaningful environmental effects

Using AI responsibly includes not only ethical and academic considerations, but also awareness of its broader environmental footprint.

Students

Can I use AI (like ChatGPT) for homework or assignments?

It depends on your course. At NIU, expectations can vary by professor and assignment. If you’re not sure, check your syllabus and/or ask before using AI. Your professor determines if and how AI technologies may be used in coursework.

If AI is allowed, do I have to disclose or cite it?

Many faculty will require acknowledgement and/or citation when AI meaningfully contributed to what you submit (ideas, text, code, images, etc.). NIU provides examples of course AI policies (ranging from prohibited, to permitted with permission, to permitted with acknowledgement).

Because expectations can vary by course and by assignment, it is important to check with your professor if you are unsure whether AI use must be disclosed or how it should be cited. When in doubt, ask before submitting your work—clear communication helps you avoid misunderstandings and potential academic integrity concerns.

Can AI help me study without violating course rules?

Often yes—if it doesn’t cross into doing graded work you’re expected to do yourself and if your professor permits it. Common "study support" uses include: summarizing your notes, generating practice questions, explaining concepts or helping you plan how to tackle a project. See the NIU Student Guide to AI to learn more about AI's capabilities/limitations and using it responsibly.

Can I use AI for research and sources?

Use AI carefully for research. AI tools can help you brainstorm search keywords, narrow down your topic, generate search strategies to copy/paste, summarize material you provide or clarify complex concepts. However, they are not reliable authorities and should never be treated as primary sources.

In particular:

  • AI tools may invent citations, misattribute ideas or misrepresent what a real source actually says.
  • AI tools may reproduce or closely paraphrase passages from copyrighted texts without clearly identifying or citing the original source.
  • AI tools may provide references that look real but do not exist or do not support the claims being made.

For these reasons, you should always:

  • Locate and read the original source that AI cites or summarizes
  • Confirm that the source exists and that it actually supports the point being made
  • Cite the real source, not the AI tool, unless your professor explicitly tells you otherwise

In addition, be careful about what materials you upload into AI tools. Even when AI is allowed for study support, you should not upload faculty-created course materials—such as lecture notes, slides, readings, assignments, exams or recorded lectures—into third-party AI tools unless your professor has explicitly given permission to do so. These materials may be protected by copyright, intellectual property or course-use restrictions, and uploading them may violate course policies or university guidance.

Similarly, avoid uploading:

  • Other students' work
  • Exam or quiz questions
  • Restricted course content
  • Any materials labeled as confidential, proprietary or for class use only

AI can be a useful starting point for discovery, but all final source selection, interpretation and citation decisions should be made by you using reliable academic resources such as library databases, course readings and peer-reviewed literature—and in ways that respect privacy, intellectual property and your professor's expectations.

How do I avoid academic misconduct with AI?

Don't submit AI-generated work as if it were solely your own when your professor doesn't allow AI use—or when disclosure is required. Remember that academic dishonesty rules apply broadly in the academic environment, and that research misconduct includes serious violations such as claiming another person's or entity's work as your own.

When using AI, be especially careful about plagiarism and misattribution, including:

  • Submitting AI-generated text that closely mirrors existing books, articles or websites
  • Copying or lightly editing AI output that reproduces ideas or language from a real author without a citation
  • Relying on AI-generated citations without checking the original sources
  • Citing sources you have not personally read or verified

Because AI tools can sometimes reproduce or closely paraphrase copyrighted material without citation, you should always:

  • Read the original sources yourself
  • Write in your own words and academic voice
  • Cite real authors and publications accurately
  • Disclose permitted AI use when required by your professor

Become familiar with NIU’s Student Code of Conduct and course-specific AI policies and ask your professor if you are unsure whether a particular use of AI is allowed or how it should be acknowledged. Also, the Northern Pact outlines the values of the NIU community and describes how you can help support them. How you decide to engage with AI should be guided by these values.

If AI can just give me the right answer, why not use it?

Learning is not the same thing as getting the "right" answer. College is about exploring new ideas, thinking about things in a different way, challenging what you know and building your own knowledge. If you rely on AI for answers, you miss out on the learning that comes from your own thinking. Offloading that work to AI removes the productive friction that learning requires.

In addition, using AI simply to obtain answers also carries a real environmental cost. Each AI query requires energy, computing resources and data-center infrastructure. When AI use is not necessary for learning, it's worth weighing whether the educational benefit justifies the environmental impact involved.

Why do misunderstandings about AI use sometimes happen?

AI tools are widely discussed, but expectations about their use can vary by course, assignment and professor. In addition, some AI detection tools are unreliable and may incorrectly flag student work. Taking a few proactive steps can help protect you from misunderstandings or accusations made in error.

What is the most important thing I can do to protect myself?

Check with your professor and follow the course policy. Always review the syllabus and assignment instructions for guidance on AI use. If expectations are unclear, ask your professor before submitting your work. Clear communication up front is your best protection.

Should I disclose if I used AI?

Yes—disclosure is always preferable to silence. If you used AI in any meaningful way, be transparent about how you used it (for example, brainstorming ideas, outlining, checking clarity or revising language) and follow your professor’s specific instructions for acknowledgement or citation.

Because expectations vary by course and by assignment, always review your syllabus and assignment guidelines and ask your professor if you are unsure how AI use should be disclosed. Clear, proactive disclosure helps protect you from misunderstandings and potential academic integrity concerns.

How can I document my work process?

Keeping evidence of your writing or problem-solving process can be helpful if questions arise later. Examples include:

  • Drafts or revision history in Word or Google Docs
  • Notes, outlines or brainstorming documents
  • Saved versions of code or calculations
  • Citations or sources you consulted

This kind of documentation shows how your work developed over time.

Can AI detection tools make mistakes?

Yes. AI detection tools are known to produce false positives, particularly for:

  • Students whose first language is not English
  • Neurodivergent students
  • Students with developing or distinctive writing styles

Because of this, your professor should not rely on detection tools alone. Still, maintaining good documentation and communication protects you.

What kinds of AI use are more likely to cause problems?

Problems are more likely if you:

  • Use AI when the assignment or course prohibits it
  • Submit AI-generated work as if it were entirely your own
  • Fail to disclose AI use when disclosure is required
  • Rely on AI for final answers without understanding or verifying them

Avoiding these situations reduces risk.

Faculty

Are there NIU-supported AI tools for instruction?

Yes, NIU provides several AI-powered tools licensed for the NIU community (availability may vary by role) that have broad instructional use. The list includes tools like Adobe Creative Cloud, AI Assistant for Adobe Acrobat (faculty and staff), Blackboard AI Design Assistant (faculty), Quinncia career readiness tools (students) and Microsoft 365 Copilot.

When possible, NIU faculty are encouraged to use Microsoft 365 Copilot for instructional and academic work because it operates within NIU’s licensed Microsoft environment and includes commercial data protection, offering stronger privacy, security and compliance standards than many publicly available AI tools.

Should I use AI detection tools to determine whether students used AI in their coursework?

Use of AI detection tools is discouraged for evaluating student work. Research and practitioner experience show that these tools frequently produce false positives and are not reliable indicators of whether AI was used.

In particular, AI detection tools have been shown to disproportionately flag work written by:

  • Neurodivergent students
  • Students whose first language is not English
  • Students with distinctive writing styles or evolving academic voice

Instead of AI detectors, faculty are encouraged to:

  • Clearly communicate expectations for AI use in the syllabus and assignment instructions
  • Design assignments that emphasize process, reflection or course-specific context
  • Engage students in conversations about how (or whether) AI tools were used
  • Use existing academic integrity processes grounded in evidence, documentation and professional judgment

Staff (and Supervisors)

What information should I avoid entering into AI tools?

As a baseline, avoid entering sensitive or personally identifiable information (PII) unless you are using an approved tool and you understand how data is stored/used. NIU's AI Ethical Guidelines call for careful handling of sensitive data and "extreme caution" with PII and vulnerable populations.

This includes information such as:

  • Student, employee or applicant identifiers (such as names combined with ID numbers, student IDs, employee IDs or usernames)
  • Contact details (such as home addresses, personal email addresses or phone numbers)
  • Academic or personnel records (such as grades, advising notes, evaluations, disciplinary records or performance reviews)
  • Health, disability, counseling or accommodation-related information
  • Financial information (such as bank numbers, tax information or financial aid data)
  • Research data involving human subjects, confidential datasets or protected populations

When working with these types of information, do not enter them into AI tools unless their use is explicitly approved and appropriate safeguards are in place.

Research, Scholarship and Integrity

Can AI be used in NIU research and scholarship?

AI can support many research tasks (e.g., brainstorming, coding support, summarizing literature you provide), but it must be used responsibly and transparently, consistent with disciplinary norms, sponsor requirements and university policies.

Should AI be listed as an “author” on a paper or report?

Typically, no—because authorship generally requires human responsibility and accountability. However, you may need to disclose AI assistance depending on publisher, discipline or sponsor rules. When in doubt, follow the relevant journal or conference guidance and consult your department or research office.

More Questions?

Not finding the answer to your question about AI use at NIU? Email your question to ai@niu.edu.