
Quality Assurance

Quality assurance plays an integral part in my practice: I am committed to fostering a culture of continuous improvement and accountability through evidence-informed QA that enhances learning, trust, and institutional excellence.

Philosophical Alignment

Quality assurance can be a contentious topic for educators, academic development practitioners, and leaders. While some see QA as an integral part of their work, others view it as an obstacle to innovation. My approach belongs firmly to the former camp, as QA is ingrained in my practice. I use QA frameworks to guide my processes and to inspire me to think differently about my work. This does not mean that I always agree with specific frameworks, but that tension is important: these mechanisms can also be used to challenge unconscious bias.


My work in QA has spanned multiple levels: courses, programs, services, and institutions. I have collated a few examples below to showcase my ongoing work in this space as I continue to explore the intersections of QA and professional practice.

Examples of Practice

Course Level

Description

My work in digital course design has required me to create various QA tools to ensure that online learning experiences are built in the interest of students. I have developed proprietary QA frameworks for multiple institutions, including QA checklists, QA review frameworks for online course development projects, and associated QA training for subject-matter experts. While I cannot share these examples here, they are inspired by existing frameworks such as Quality Matters and the Online Learning Consortium. These rubrics informed the literature review for the new QA tools, as I sought to create something more agile than existing frameworks.


What worked?

A QA checklist works best when it uses simplified language for subject-matter experts (SMEs) and is an integral part of the onboarding process for new projects. Many SMEs have little curriculum development experience, so the QA metrics and language need to be accessible to all parties.


What didn’t work?

Implementing the same QA tools as a universal guide for all future projects. Each course design initiative has its own challenges, so it became evident that replicating certain QA metrics across all courses was never going to work, mainly because of the nature of the course or its target audience. This experience convinced me to create a core set of standards shared across all courses, complemented by a section of sub-standards specific to a discipline or audience.


How would I continue to use this practice?

I continue to use QA tools in all my course development processes, whether I am the primary instructor, instructional designer, or faculty development facilitator. These tools ground me in my work and help me focus on key elements, while providing the latitude to add more down the road as the standards evolve with the needs of learners.



Program Level

Description

Within the Ontario College sector, I have created many mechanisms that govern program quality, including Curriculum Mapping processes, Program Reviews, and larger Program Development cycles that embed various elements of QA. The competitive landscape of HigherEd prevents me from openly sharing the resources I have created for other institutions, but I can use this space to share a few stories about my practice.


What worked?

Collaboration. The program development process is a prime example of where QA standards must be at the forefront of every decision. These decisions must be made in collaboration with multiple departments and representatives so that the initiative moves as one entity. For example, the marketing department needs to understand that a program title must include specific elements, or an industry partner may need to approve the direction of a curriculum change if the experience cannot be replicated in a learning setting.


What didn’t work?

Unrealistic Timelines.

Academic leaders who have little involvement in program development are ill-equipped to set production targets. In my experience, these leaders mistakenly assume that the process can be expedited internally with more resources, failing to account for the potential bottlenecks with external bodies that impact the process.


How would I continue to use this practice?

Collaboration has been a key element of my quality assurance practice, as it engages all parties in a shared commitment to quality. It also helps avoid the common pitfall of unrealistic timelines: I have used open collaboration to bridge the expectations of senior executives with external guidelines. This negotiation has served me well, as it provides an opportunity to promote the impact of QA on the larger program development process and enlists senior executives, with their newfound knowledge of the process, in supporting this vision.

Service Level

Description

As the leader of a Centre for Teaching and Learning, I created an integrated quality assurance framework that assessed our collective performance as a team of academic development practitioners. This was done using a single survey tool implemented at the end of each workshop, one-on-one consultation, or faculty development program. This data was also synchronized with other data points, such as enrollment, completion records, and other trends, within a larger Power BI dashboard.

The data collection was complemented by a service standard created collectively by the team as part of a larger strategic planning session. This service standard outlined our commitment to operational excellence by establishing clear guidelines and practices to be used by all members.


What worked?

The integrated data approach was well received by the college community, as participants had a single point of access for providing various forms of feedback. I used this data to assess our service standards at any given time and was able to identify specific trends ahead of larger changes.


What didn’t work?

Our initial effort in data collection was flawed: we tracked too many data points and the process became grueling. This was corrected later through an exploration of Power BI and a revamped enrollment process that was centralized to meet the needs of our quality assurance framework.


How would I continue to use this practice?

The creation of a service standard and an associated data collection process for feedback provided instrumental support to my practice as a leader. I continue to use this approach across my other projects, as it contributes to my larger efforts to promote transparency in my work. This is important because QA frameworks for services can lead to uncomfortable discussions with underperforming team members if the initiative is not embraced by all parties.



Institutional Level

Description

My work as a HigherEd consultant has enabled me to complete larger institutional audits. This experience was paramount to my QA practice, as I utilized my skills as a practitioner to contextualize the expectations set forth by an external QA framework.


What worked?

Active listening was a key element of this process, as auditors were required to engage in various discussions with members of the institution to validate the evidence provided in a self-study report. The adoption of an existing framework was also valuable, as it created consistency across all auditors.


What didn’t work?

While I was fortunate to experience positive interactions during my time as an auditor, I have learned that other teams encountered many challenges in reaching consensus. These challenges stemmed from conflicting philosophies and an inability to bridge expectations using the corresponding QA framework. These stories shaped my own practice: I deepened my investment in active listening so that I could better understand the positions of other members before engaging in a full debate.


How would I continue to use this practice?

My QA practice has been shaped by this experience in various ways, as I now have a full appreciation for both sides of the QA process (i.e., applicant and auditor). This has made me rethink how I present evidence in my own QA practice and has highlighted the importance of narratives. In the end, I believe I am a stronger practitioner for having experienced both sides, as I have a fuller understanding of QA processes from distinct perspectives.


References

·      QA Report (French) – LaCite (2025) // in-draft

·      QA Report (French) – Boreal (2025) // in-draft


