Generative AI is transforming teaching and learning experiences for both students and faculty. This webpage is designed to familiarize Nova Southeastern University faculty with Generative AI and to support and guide them in matters that affect both their practice and the student experience. Here, you will find resources that provide comprehensive insights into the principles of Generative AI, its applications, and its ethical considerations. All faculty and staff have access to Microsoft Copilot, your AI assistant for education.
Visit NSU's AI website to read about the University's AI policy and gain access to other support resources, tools, and services related to AI.
Empower your teaching and scholarship with Generative AI as your dedicated "Faculty Assistant." Think of it as an intelligent partner that can help you explore new ideas, refine existing materials, and ultimately free up your time to focus on what matters most: connecting with students and advancing your field. Explore the resources below to discover how GenAI can assist you in areas such as identifying AI-generated content, boosting your productivity, crafting effective prompts, and more.
From the file cabinets of old tests in fraternity house basements to online “homework help” sites like CourseHero, students have long sought shortcuts to academic success. Generative AI tools are the latest front in this battle, allowing students to mimic higher-order thinking skills by turning in AI work as their own. Students recognize that AI tools can often produce better output than they can, faster than they can. The short-term goal of producing work that earns a high grade often matters more to students than the long-term goal of developing their own thinking and writing skills.
GenAI detection tools raise faculty hopes by claiming to spot the use of AI in student work, with some claiming accuracy rates as high as 99%. However, tests conducted by educators have shown that AI detection tools often produce both false positives and false negatives. Even more troubling is the higher frequency of false positives for students writing in a second language, students with weak writing skills, and students on the autism spectrum. AI detection tools assume that a student’s writing style and voice are consistent throughout their work, which is not the case for many students. To avoid false positives, or to conceal actual AI use, students often run their work through online “humanizing” tools before turning it in.
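To see why even a detector with a seemingly high accuracy figure can still wrongly implicate students, it helps to work through a quick base-rate calculation. Every number in the sketch below is an illustrative assumption, not a measurement of any particular detection tool.

```python
# Illustrative base-rate sketch. All figures are hypothetical assumptions,
# not measured properties of any real AI-detection tool.

class_size = 200             # essays submitted in a course
ai_use_rate = 0.10           # assume 10% of submissions actually involve AI
true_positive_rate = 0.90    # assume the detector catches 90% of real AI use
false_positive_rate = 0.01   # assume it wrongly flags 1% of honest work

ai_submissions = class_size * ai_use_rate
honest_submissions = class_size - ai_submissions

flagged_ai = ai_submissions * true_positive_rate             # 18 correctly flagged
flagged_honest = honest_submissions * false_positive_rate    # 1.8 honest students flagged

share_wrongly_flagged = flagged_honest / (flagged_ai + flagged_honest)
print(f"Of {flagged_ai + flagged_honest:.1f} flagged submissions, "
      f"about {share_wrongly_flagged:.0%} come from students who did nothing wrong.")
```

Under these assumptions, roughly one in ten flags points at an innocent student, and if the false positive rate is higher for second-language writers or students on the autism spectrum, those students absorb a disproportionate share of the wrongful accusations.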
Inside Higher Ed, the Chronicle of Higher Education, and the American Association of Colleges and Universities all caution faculty against putting their faith in AI detection tools. These organizations and others instead recommend that faculty restructure their courses to reduce the temptation to use AI tools. This might mean explaining the rationale behind major assignments, incorporating more in-class activities, breaking large assignments into multi-stage projects, or requiring students to articulate their workflow. In both undergraduate and graduate-level courses, faculty may need to provide opportunities to practice higher-order thinking skills on low-stakes assignments throughout the course. Colleges and departments can coordinate these efforts, but their success will depend on the involvement of individual faculty members.
Faculty can use Generative AI for a variety of administrative tasks, saving time and effort so they can focus more on teaching and engaging students. GenAI can also help reduce errors, structure information logically, and offer diverse, creative ideas. Examples include writing email messages, summarizing information, creating presentations, and managing projects. While generative AI does help streamline administrative responsibilities, such assistance should be implemented with care. Outputs from generative AI should always be reviewed for accuracy, accessibility, and inclusivity, among other considerations.
Other Considerations for Generative AI Output
Personalization: Some output may need to be personalized so that a person’s voice or persona is reflected. For example, emails and other personal communication may need to be edited so the voice of the writer is apparent. Tone, format, and length are other factors that should be customized in generative AI output.
Privacy: Private data, such as names, ID numbers, and any other information that identifies a person, should not be included in any generative AI prompt or chat. Compliance with University policies is always required.
Responsibility and Ethics: Output may also need to be reviewed with responsible and ethical use in mind.
Security: Make sure that the generative AI tools you are using comply with University policy and appropriate security classifications.
Artificial Intelligence can be a useful tool to support your research process. It is not a replacement for the researcher, but rather a tool that can help save time, perform analysis and summarization, and connect resources. If you have used a web-based repository that collects and organizes scholarly content, you have likely been benefiting from AI, perhaps without realizing it.
Some of the most beneficial uses of AI tools in research involve summarizing or paraphrasing content. This can save considerable time, especially when preparing a literature review. These tools can also help you discover research gaps and see how research on a topic has evolved over time. They can respond to specific questions, extract key information and keywords, explain implications, summarize methods, highlight important phrases, paraphrase, and summarize entire documents or specific sections within them. They can also be used to gather and organize research related to your topic or by a particular author.
AI research tools have several advantages over traditional databases. They are usually intuitive to use, they can present information in graphic or visual formats, and they can act as a single place to perform multiple tasks: finding literature, summarizing and analyzing sources, managing references, and assisting with writing. They can also analyze papers based on their actual content rather than just titles, keywords, and citations.
There are also limitations. The tools may have limited access to full-text versions of papers, so results can be based on abstracts alone. Some tools require you to create an account, and while some are free, others require paid subscriptions. They can also mislead you into thinking you don't need to do your own reading, summarizing, analysis, and critical assessment of sources. Finally, there is a lack of transparency in how these tools produce their results and recommendations. Remember that these are tools to support your own research; always take a critical view of any AI results and do not become overly dependent on them.
These tools can be most helpful in the early stages of research: gathering sources, getting a sense of existing work on your topic or related topics, and preparing a literature review. They can be used hand in hand with traditional databases, where full-text versions of research papers are available, and with specialized journals in your topic area.
It helps to have some relevant sources already in hand that you can upload or input into these tools so they can begin finding related work for you. When doing so, include more than one or two examples; the more you add, the more precise your results will be.
Many tools for using AI in research are available, with new ones released regularly. Each has different features, strengths, and weaknesses, and the one(s) best suited to you will depend on your particular needs. The list below is a starting place to test some of the features and see which tools may be right for you.
The information on this page draws from a presentation by Mihaela Micu from NSU's Alvin Sherman Library. To view the entire presentation and learn more about AI for research, including more detail on AI research tools, click the link below. You will be asked to fill out a short registration form to gain access to the video and resources.
Effectively integrating Generative AI requires both proactive conversations with students and a clear syllabus policy. Engage students early and consistently to discuss appropriate AI use, building upon the framework provided by your syllabus. Explain the reasoning behind your policy and its implications.
Proactively guide your students in the appropriate use of Generative AI by implementing a formal course policy. Clearly define what these tools are, what they can do, and how students can ethically integrate them into their learning. Incorporating this policy into your syllabus and relevant assignment information will provide students with the necessary framework for success.
Generative AI has the potential to significantly improve accessibility for people with disabilities. For example, AI can generate accessible text formats like captions, descriptions, or summaries for people with visual impairments. It can also adapt to individual needs by providing tailored support based on user preferences and disability types. AI can also facilitate communication by translating languages or providing assistive features for individuals with speech difficulties, and it can be used to identify potential accessibility issues in digital content.
While Generative AI has many benefits for accessibility, it also raises concerns about accessible content. One way AI can generate inaccessible content is by creating visual material that lacks alt text or descriptions. Automatic speech recognition is another area of concern: these systems sometimes fail to accurately interpret the speech patterns of individuals with speech impairments. Furthermore, a lack of diverse training data means AI systems may not be adequately trained on a broad spectrum of disability types and communication styles, limiting their ability to provide effective support for all users. Even as data collection becomes more inclusive over time, AI users can be intentional with their inputs to help ensure the output is accessible.
Generative AI tools present various ethical dilemmas and considerations. One of the most obvious is the potential for academic or professional dishonesty. Bias is also a major concern, since these models are trained on a wide range of online information that itself contains biases. There are concerns about intellectual property as well, because the models are often trained on material whose authors never approved that use, and because the tools can generate content in the style of writers, artists, and other creators without their permission.
As you consider these ethical challenges, it is also valuable to educate students about them so that they can be aware and develop their genAI literacy skills. Let students know that while there are appropriate uses for genAI, there are also inappropriate and unethical uses, and share with them concerns about bias and intellectual property rights. Express the value of what you are teaching them so they can be motivated to use the tools appropriately when allowed, and not use them in a way that hinders their learning. Be clear with students, both verbally and in written policies, as to when it is appropriate to use these tools for your courses and whether and how they should disclose that they used them.
Recent trends show a significant increase in job postings seeking people with experience in artificial intelligence, and particularly Generative AI (LinkedIn, 2023). A high percentage of jobs worldwide continues to be changed by GenAI (World Economic Forum, 2023), and by 2024, over 60% of workers were already using GenAI in their jobs (McKinsey & Company, 2024). At the same time, more than half of recent college graduates say they feel unprepared for their careers when it comes to GenAI, with a supermajority saying GenAI training should be incorporated into their courses (Cengage Group, 2024).
Giving students guided exposure to genAI in their coursework builds the GenAI literacy skills that will help them know how to use these tools ethically, appropriately, and effectively. Career-specific training will help students to be better prepared for their professional future. It is useful to consider this when designing courses and making curricular decisions that will affect students’ career readiness. Students graduating with genAI skills will be more sought after, and more successful, in the job market.
References:
Cengage Group. (2024). 2024 employability survey report. https://cengage.widen.net/s/bmjxxjx9mm/cg-2024-employability-survey-report
LinkedIn. (2023). Future of work report: AI at work. https://economicgraph.linkedin.com/content/dam/me/economicgraph/en-us/PDF/future-of-work-report-ai-august-2023.pdf
McKinsey & Company. (2024). The state of AI in early 2024: Gen AI adoption spikes and starts to generate value | McKinsey. Www.mckinsey.com. https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-state-of-ai
World Economic Forum. (2023). Future of jobs report 2023.
Generative AI tools use information from prompts and results to further train their models. For this reason, be aware that anything you put into a tool has the potential to be shared or found by others in the future. Refrain from putting any personal data or proprietary information into these tools to protect data privacy for you, our students, and NSU as a whole. All FERPA guidelines should be followed when it comes to data privacy and GenAI: identifiable information, such as names and personal data, should be removed from any input, and proprietary content should not be put into GenAI tools.
An exception to this standard comes with NSU-approved tools. For these, you may consider including course content or other NSU content, but steps should still be taken to remove identifiable information where possible, and information such as financial data and Social Security numbers should never be put into any tool, NSU-approved or not. Please check NSU guidance and policy regularly to keep track of which GenAI tools have been officially approved for use.
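As a minimal illustration of the kind of pre-processing described above, the sketch below strips a few common identifier patterns from text before it is pasted into a GenAI tool. The patterns (a Social Security number, an email address, and a hypothetical 8-digit student ID format) are simplified assumptions for demonstration only; an automated pass like this does not replace manual review or compliance with FERPA and University policy.

```python
import re

def redact(text: str) -> str:
    """Remove a few common identifier patterns before sharing text with a GenAI tool.

    Simplified illustration only; not a complete or policy-compliant redaction solution.
    """
    # U.S. Social Security numbers, e.g., 123-45-6789
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[REDACTED-SSN]", text)
    # Email addresses
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[REDACTED-EMAIL]", text)
    # Student ID numbers (a hypothetical 8-digit format, used here for illustration)
    text = re.sub(r"\b\d{8}\b", "[REDACTED-ID]", text)
    return text

sample = "Student Jane Doe (ID 01234567, jdoe@example.edu) asked about the midterm."
print(redact(sample))
# Output: Student Jane Doe (ID [REDACTED-ID], [REDACTED-EMAIL]) asked about the midterm.
```

Note that the student's name still slips through, which is exactly why a quick automated pass is a supplement to, not a substitute for, reading over a prompt yourself and removing identifying details before submitting it.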
Use the F.A.S.T.E.R principle to help ensure Generative AI is being used responsibly.
Examine these Responsible Use Scenarios for GenAI. These scenarios can be used with students as an activity or to generate discussions on the appropriate use of GenAI.
Generative AI Professional Development - Dive into this SharkMedia playlist for insights on strategically integrating AI to enhance your pedagogy.
Wharton Interactive Crash Course: Practical AI for Instructors and Students - Check out this highly-rated, five-part series on YouTube for an overview of AI large language models.
Mako Commons - Learn more about this virtual community of practice (vCOP) and connect with NSU faculty to share ideas, information, and resources.
Last updated February 2025