The Conversation
Terri L. Griffith, Keith Beedie Chair in Innovation and Entrepreneurship, Simon Fraser University

Why using AI tools like ChatGPT in my MBA innovation course is expected and not cheating

But if students misrepresent or omit sources, including generative AI, that's a problem. (Shutterstock)

I teach managing technological innovation in Simon Fraser University’s Management of Technology MBA program. Thanks to the explosion of generative artificial intelligence, I’m rewriting my 2023 syllabus and assignments.

No matter our industry or field, we should regularly review our tools and workflows. New tools, like AI, are excellent triggers for this assessment. Sorting out how best to adjust our work, as per the values and existing norms of different fields, takes a systematic approach.

My research examines how companies can adjust how they use talent, technology and technique to hit work targets and stay aligned with the times — what I’ve called thinking in 5T.

Educators in MBA programs, who are concerned with building students’ professional capacities, can also use this lens to support the critical thinking that students need. We can help students consider how and when to use AI in their academic and professional lives.

Abrupt availability of AI tools

ChatGPT, DALL-E and Writesonic are examples of publicly available generative AI. These are “generative” in that humans provide a prompt and the AI outputs text or images based on machine learning.

I didn’t think to mention generative AI in my September 2022 syllabus. Class discussions included my expectation that students would use Grammarly or other proofreading tools to support their professional writing.

It’s important for innovation students to learn how and when to use AI in academic and professional life. (Kampus Productions)

We discussed different citation styles for business writing and how incorrectly citing sources can negatively affect one’s career.

Some students asked whether Grammarly’s more sophisticated ability to rewrite sentences was a problem. I said no: it’s an innovation course, and we should use the tools we have.

Considering social and technical aspects

The 5T framework is my modernized presentation of sociotechnical systems theory — a theory describing how workers and leaders must manage social and technical aspects of work to achieve performance and well-being.

Thinking in 5T means you set a target and then consider:

  • the times (context) in which you make a decision;
  • available talent (knowledge, skills, abilities, human reactions, limitations);
  • technology (from AI and smart watches to shovels and conference room furniture);
  • technique (practices, workflows and so on), as you look for the right balance of all these elements.

Research suggests that people who are more systems savvy have a greater ability to see the connections across these different domains and construct synergies appropriate for their work.

No silver bullet

Thinking in 5T means you never expect a “silver bullet” change to work: for example, just blocking ChatGPT on an organization’s network with no other adjustments. Instead, you look to manage all aspects of your human and technological variables.

My target is for my students to improve their ability to identify and evaluate existing innovations and create valuable new ones.

To date, my syllabus has said “your final submission must be your individual work and words,” but now I will need to clarify what this means.

Learning how to use AI

I agree with Kevin Kelly of Wired that asking ChatGPT how to do things (in technical terms, writing AI prompts) requires work and expertise. We also need to be careful consumers of what generative AI tools produce.

Generative AI tools are often wrong. Both innovation students and business professionals will need to understand how these tools generate responses in order to check that answers are factual and references are correct.


Read more: Unlike with academics and reporters, you can't check when ChatGPT's telling the truth


Beyond fact-checking, my students must use critical thinking and show they can apply course concepts. As I teach innovation skills, we can cover how to engage with ChatGPT and other generative AI effectively.

My innovation students create personalized templates that allow them to take course concepts, apply them in the real world and improve the application of these concepts throughout their careers. How might students write an AI prompt for ChatGPT to help them use design thinking in their work?

An effective ChatGPT prompt would be: “Create a playbook to support design thinking. Include alternatives for expert versus novice team members and teams working virtually versus face to face.”

Such a prompt guides ChatGPT to return a response drawing on both the social and technical aspects of work — the thinking in 5T approach from my course.
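
For students who want to go beyond the chat interface, the same prompt can also be sent to a model programmatically. The sketch below is illustrative rather than part of the course materials: it assumes the OpenAI Python SDK is installed, that an API key is set in the environment, and the model name is only an example.

```python
# Minimal sketch (illustrative, not from the original article): sending the
# design-thinking prompt above to a chat model via the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Create a playbook to support design thinking. "
    "Include alternatives for expert versus novice team members "
    "and teams working virtually versus face to face."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # model name is an example; any chat model works
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```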

Academic integrity

While academic discussions are ongoing about the ethical and knowledge implications of using generative AI, academic integrity does provide some firm boundaries.

For example, at Simon Fraser University, students must demonstrate “a commitment not to engage in or tolerate acts of falsification, misrepresentation or deception.”

In my course, the notion of “individual work” must change.

I’ll be adjusting the assignments and requiring an appendix describing the toolkit and practices students use. Using AI is not cheating in my course, but misrepresenting your sources is.

Work doesn’t exist in a vacuum

AI tools will get better, and there will be more of them. Guidelines at work and in education need to keep pace and be thoughtfully aligned with how knowledge is constructed in different fields.

We’re learning that some journals won’t accept AI as credited authors. Other publishers have announced that while you can’t list ChatGPT as an author, AI tools can be used in some stages of preparation, as long as you disclose this in the manuscript.

We need the various manuals of style to update their rules to cover work generated by AI. Given the pace of AI change, writers may need to note the specific versions of the AI tools they use (much as APA Style requests retrieval dates for Wikipedia articles).

I like an approach some photographers use: share your tools and critical settings.

Terri L. Griffith receives funding in support of her research at Simon Fraser University from the Social Sciences and Humanities Research Council of Canada, the Natural Sciences and Engineering Research Council of Canada, and the Negotiation and Team Resources Institute. Prof. Griffith is a member of the Academy of Management, INFORMS, and the International Society of Service Innovation Professionals. She does not work for, consult, own shares in, or receive funding from any company or organisation that would benefit from this article and has disclosed no other relevant affiliations.

This article was originally published on The Conversation. Read the original article.
